Facebook is building a new tool to put an end to revenge porn and other postings of intimate photos aimed at shaming an individual.
According to a 2016 study from the Data & Society Research Institute, one in 25 Americans has been a victim of non-consensual image sharing, and the consequences are often devastating. As a leader among online sharing platforms, Facebook has started working on a new tool to help stop the posting of photos that aim to shame an individual on the platform.
The project was announced by Antigone Davis, Facebook’s global head of safety. Since April, users have been able to report an inappropriate photo even if they are not tagged in it. Once reported, the image is reviewed by a trained professional who determines whether it violates Facebook’s community standards. If it does, the photo is removed and the account that shared it may be disabled.
Davis explained that Facebook will do everything it can to fight “the use of those images to coerce people into behaving in ways they don’t want to behave.”
The new tool goes even further, using image-matching technology combined with AI to stop a reported photo from being shared again before it ever goes live on the platform. It will also work across other Facebook-owned platforms, such as Messenger and Instagram.
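Facebook has not published the details of its matching system, but tools of this kind typically rely on perceptual hashing: a reported image is reduced to a compact fingerprint, and any new upload whose fingerprint is close enough to a known one is blocked, even if the photo has been resized or recompressed. The sketch below is a minimal, illustrative difference hash (“dHash”) in Python using the Pillow library; the file names, the blocklist, and the distance threshold are hypothetical assumptions for illustration, not Facebook’s actual implementation.

```python
from PIL import Image  # pip install Pillow


def dhash(path, hash_size=8):
    """Difference hash: shrink to grayscale, compare adjacent pixels.

    Returns an integer fingerprint that survives resizing and
    recompression of the image.
    """
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(left > right)  # one bit per adjacent-pixel comparison
    return sum(1 << i for i, bit in enumerate(bits) if bit)


def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")


# Hypothetical blocklist of fingerprints from previously reported images.
BLOCKLIST = {dhash("reported_photo.jpg")}

# An upload is flagged when its fingerprint is near any blocklisted one.
upload_hash = dhash("new_upload.jpg")
if any(hamming_distance(upload_hash, h) <= 5 for h in BLOCKLIST):
    print("Upload blocked: matches a previously reported image.")
```

A near-match threshold (here an assumed 5 differing bits out of 64) is what lets a system of this kind catch slightly altered copies rather than only byte-identical re-uploads.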