Facebook announced new AI-powered detection technology that will help find and remove non-consensual intimate images (aka “revenge porn”) on its platform.
The proliferation of digital recording devices and the ease with which users can post anything on social networks have led to an increase in intimate images and videos being uploaded without consent. And despite Facebook’s efforts to detect and remove such material and prevent it from being shared any further, there is a lot more work to do.
Now, in addition to its photo-matching technology, Facebook is announcing that it will also be using machine learning and AI to “proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.” This means that Facebook will be able to find offending content before anyone even reports it. This matters because victims are often afraid of retribution and don’t always report the content themselves; in many cases, they aren’t even aware the content exists in the first place.
As Antigone Davis, Facebook’s Global Head of Safety, explains in a recent newsroom post, when a piece of content is found by the new technology, “a specially-trained member of our Community Operations team will review the content found by our technology. If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission.” An appeals process also exists, should someone believe there’s been a mistake.
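Facebook hasn’t published any implementation details, but the flow the newsroom post outlines (a model scores each upload, and a trained reviewer confirms any violation before removal) can be sketched roughly as follows. This is a hedged illustration only: the classifier, threshold, and review queue below are hypothetical placeholders, not Facebook’s actual system.

```python
# Sketch of the detect-then-review flow described in the post.
# Everything here is a hypothetical placeholder: REVIEW_THRESHOLD,
# classify_upload, and ReviewQueue are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

REVIEW_THRESHOLD = 0.9  # hypothetical confidence cutoff


@dataclass
class ReviewQueue:
    """Stand-in for the queue feeding trained human reviewers."""
    items: List[str] = field(default_factory=list)

    def enqueue(self, upload_id: str) -> None:
        # In a real system this would route the item to Community
        # Operations for human review; nothing is removed automatically.
        self.items.append(upload_id)


def classify_upload(upload_id: str) -> float:
    """Placeholder for the real model: returns a probability that the
    upload is a non-consensually shared intimate image."""
    return 0.0  # stub; a production system would run an image classifier


def handle_upload(upload_id: str, queue: ReviewQueue) -> None:
    score = classify_upload(upload_id)
    if score >= REVIEW_THRESHOLD:
        queue.enqueue(upload_id)  # flagged for human review, not removal


queue = ReviewQueue()
handle_upload("upload-123", queue)  # "upload-123" is a made-up ID
```

The key design point the post emphasizes is that the model only flags content; a human reviewer makes the actual removal decision.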
The new detection technology also complements Facebook’s pilot program, run jointly with victim advocate organizations, which provides people with “an emergency option to securely and proactively submit a photo to Facebook.” The image is then turned into a “digital fingerprint,” so that copies of it cannot be shared on Facebook again. Facebook plans to expand the pilot over the coming months so that more people can benefit from it.
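Facebook hasn’t disclosed which fingerprinting algorithm it uses, but the general idea behind such “digital fingerprints” is perceptual hashing: visually similar images map to similar compact hashes, so a re-upload can be matched without the platform retaining the original image. Below is a minimal average-hash sketch in Python using the Pillow imaging library; the file paths and matching threshold are illustrative assumptions, not Facebook’s method.

```python
# A minimal perceptual-hash ("average hash") sketch using Pillow.
# Facebook's actual fingerprinting is proprietary; this only illustrates
# how near-duplicate images can be matched via compact hashes.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a tiny grayscale grid, then set a bit to 1
    for each pixel brighter than the grid's mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests a near-duplicate."""
    return bin(a ^ b).count("1")


# Hypothetical usage: "reported.jpg" and "upload.jpg" are placeholder paths.
# fingerprint = average_hash("reported.jpg")
# candidate = average_hash("upload.jpg")
# if hamming_distance(fingerprint, candidate) <= 5:  # assumed threshold
#     print("Likely a re-upload of the reported image")
```

Because only the hash is stored and compared, the submitted photo itself doesn’t need to be kept around to block future uploads.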
In addition to the above, Facebook is launching “Not Without My Consent,” a victim-support hub within its Safety Center. Developed together with experts, the hub gives victims a place to find organizations and resources to support them, including steps they can take to remove content and prevent it from being shared further. Finally, Facebook is also making it easier and more intuitive for victims to report when their intimate images are shared on its platform.
In the next few months, the social network will also be creating “a victim support toolkit to give people around the world more information with locally and culturally relevant support” – in partnership with the Revenge Porn Helpline (UK), Cyber Civil Rights Initiative (US), Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-Yeon (South Korea).