Last night we learned that Facebook has already introduced this feature in its iOS app. The new functionality, called Automatic Alternative Text, generates image descriptions based on advanced object recognition algorithms.
Facebook users operating screen readers on iOS devices will now hear descriptions of photos as they swipe through them, e.g. “Image may contain three people, smiling, outdoors.” Until now, blind or visually impaired users could only hear the name of the user who shared a photo, along with the word “photo” indicating the presence of a visual element.
The algorithm was developed with the help of neural network technologies, which are “used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown.”
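In broad strokes, a system like this runs an image through a trained recognition model, keeps only the tags the model is confident about, and stitches them into a sentence. The sketch below illustrates that last step with a made-up tag list and threshold; the tags, scores, and cutoff are illustrative assumptions, not Facebook's actual model output.

```python
# Hypothetical sketch of turning object-recognition confidences into an
# alt-text string, in the spirit of "Image may contain: three people,
# smiling, outdoors." Tags, scores, and the threshold are assumptions.

CONFIDENCE_THRESHOLD = 0.8  # only report tags the model is confident about

def compose_alt_text(tag_scores):
    """Build an alt-text sentence from {tag: confidence} pairs."""
    # Keep confident tags, most confident first.
    confident = [
        tag
        for tag, score in sorted(
            tag_scores.items(), key=lambda item: item[1], reverse=True
        )
        if score >= CONFIDENCE_THRESHOLD
    ]
    if not confident:
        # Fall back to the old behaviour: just announce "Photo".
        return "Photo"
    return "Image may contain: " + ", ".join(confident) + "."

scores = {"three people": 0.97, "smiling": 0.91, "outdoors": 0.88, "dog": 0.35}
print(compose_alt_text(scores))
# "dog" falls below the threshold, so it is left out of the description
```

Thresholding matters here: a screen-reader description that confidently announces a wrong object is worse than one that omits an uncertain one, which is presumably why the real feature hedges with “may contain.”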
According to the official announcement, more than 39 million people are blind and over 246 million have a severe visual impairment, so Facebook aims to advance the technology it currently uses to offer everyone the same experience on the platform, without exceptions.
Automatic Alternative Text is already available in English, and there are plans to extend it to other languages and platforms soon.