Face recognition is not a new concept for Facebook, and the latest updates to the technology behind it could potentially give users more control over their pictures.
Back in October, we reported that Facebook was working on a face recognition feature that would allow for an extra layer of security in user accounts. With most device manufacturers already having such technology available, it looked like a natural development for the social media network.
Now, Facebook is putting that knowledge toward protecting users by testing a new feature that uses face recognition to notify them when another person or party publishes their image on the platform. As the announcement reads: “We want people to feel confident when they post pictures of themselves on Facebook […] We’re doing this to prevent people from impersonating others on Facebook.”
Facebook has kept it simple, too: users can switch face recognition on or off straight from within their account. The company also reminds users that if their tag suggestions setting is currently set to “none,” their face recognition setting will default to “off” and remain that way until they decide to make the switch.
With face recognition turned on, Facebook will alert users when a photo that appears to contain them is uploaded. From there, they can confirm whether it really is them, or report the picture and the user who published it.
Facebook is also making sure that users with visual impairments can make the most of the new feature: its face recognition software will populate the picture’s alt text field, which can then be read aloud by a screen reader.