Facebook is building a new tool to put an end to revenge porn and other postings of intimate photos aimed at shaming an individual.
According to a 2016 study from the Data & Society Research Institute, one in 23 Americans is a victim of non-consensual image sharing, and the consequences are often devastating. As a leader among online sharing platforms, Facebook has started working on a new tool to help stop the posting of photos that aim to shame an individual on the platform.
The project was announced by Antigone Davis, Facebook's global head of safety policy. Since April, users have been able to report an inappropriate photo, even if they are not tagged in it. Once reported, the image is reviewed by a trained professional who determines whether it violates Facebook's community standards. If so, the photo is removed and the account that shared it risks being disabled.
Davis explained that Facebook will do everything to fight “the use of those images to coerce people into behaving in ways they don’t want to behave.”
The new tool will go even further, using image-matching technology combined with AI to stop a photo from being shared before it ever goes live on the platform. The tool will also work across other Facebook-owned platforms, such as Messenger and Instagram.