Facebook is building a new tool to put an end to revenge porn and other postings of intimate photos aimed at shaming individuals.
According to a 2016 study from the Data & Society Research Institute, one in 23 Americans is a victim of non-consensual image sharing, and the consequences are often devastating. As a leader among online sharing platforms, Facebook has started working on a new tool to help stop the posting of photos intended to shame an individual on its platform.
The project was announced by Antigone Davis, Facebook's global head of safety policy. Since April, users have been able to report an inappropriate photo even if they are not tagged in it. Once reported, the image is reviewed by a trained reviewer who determines whether it violates Facebook's community standards. If so, the photo is removed and the account that shared it risks being disabled.
Davis explained that Facebook will do everything it can to fight "the use of those images to coerce people into behaving in ways they don't want to behave."
The new tool will go even further, using image-matching technology combined with AI to stop a photo from being shared before it ever goes live on the platform. The tool will also work across other Facebook-owned platforms, such as Messenger and Instagram.
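Facebook has not published the details of its matching system, but one common family of image-matching techniques is perceptual hashing, where visually similar images produce nearly identical fingerprints. The sketch below is purely illustrative: a toy "difference hash" over a tiny grid of grayscale values, with a Hamming distance to compare fingerprints (real systems work on full images and use far more robust hashes).

```python
# Illustrative sketch of perceptual ("difference") hashing, one common
# image-matching approach. This is NOT Facebook's actual method, which
# has not been disclosed; it only demonstrates the general idea.

def dhash(pixels):
    """Compute a difference hash: each bit records whether a pixel is
    brighter than its right-hand neighbour."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return int("".join(map(str, bits)), 2)

def hamming(a, b):
    """Number of differing bits between two hashes: small distance
    means the images are likely near-duplicates."""
    return bin(a ^ b).count("1")

# A reported image and a slightly altered re-upload (toy 3x3 grayscale grids).
original = [[10, 20, 30], [40, 30, 20], [5, 50, 5]]
near_copy = [[11, 21, 29], [41, 29, 21], [6, 49, 6]]

print(hamming(dhash(original), dhash(near_copy)))  # prints 0: a match
```

Because the hash depends only on relative brightness between neighbouring pixels, minor edits (recompression, small brightness shifts) leave the fingerprint unchanged, which is what lets a platform block a known image at upload time without storing the image itself.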