Google Now Gives You More Context About What You See On Google Images

Google is rolling out a new feature that makes it easy to find quick facts about the images you see in your Google Images searches.

Google Images has become a lot more useful in the last few years, with features like captions on thumbnail images in search results, the ability to search within images using Lens, and the Related Images feature for discovering more. Google says these features were built to “make it easier to find visual inspiration, learn new things, and get more done.”

Now, Google is launching a new feature that lets you easily find “quick facts about what you see on Google Images.” The feature, available in the U.S. for now, lets you see information related to a result from Google’s Knowledge Graph, including people, places, or things related to the image.



Google’s Knowledge Graph database contains billions of facts that help users explore a topic further, and can help drive traffic to the website featuring a specific image.

As Angela Wu, a Software Engineer at Google Search explains:

“For example, let’s say you’re searching for beautiful state parks to visit nearby. You want to swim during your visit, so you tap on a picture of a park with a river. Beneath the photo you might see related topics, such as the name of the river, or which city the park is in. If you tap a specific topic, it will expand and show you a short description of the person, place or thing it references, along with a link to learn more and other related topics for you to explore. With this information, you can better understand the image you’re viewing and whether the web page is relevant to your search.”

The idea is that you can start somewhere and find a wealth of information related to what you originally searched for.

To generate this information, and its links to relevant Knowledge Graph entities, Google uses deep learning to evaluate an image’s visual and text signals, and combines them with its own understanding of the text on the web page the image belongs to.

It then uses this information to determine “the most likely people, places, or things, that are relevant to a specific image,” matches those against topics in the Knowledge Graph, and surfaces any matches in Google Images.
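Google hasn’t published the details of this pipeline, but the flow described above can be sketched in miniature: combine per-candidate scores from hypothetical visual and page-text signals, then surface only candidates that clear a confidence threshold and match a known Knowledge Graph entity. Everything here (the entity names, scores, and threshold) is illustrative, not Google’s actual system.

```python
# Illustrative sketch only -- not Google's implementation.
# A tiny stand-in for the Knowledge Graph: entity name -> short description.
KNOWLEDGE_GRAPH = {
    "Merced River": "River in California that flows through Yosemite Valley.",
    "Yosemite National Park": "National park in California's Sierra Nevada.",
}

def surface_topics(visual_scores, text_scores, threshold=0.5):
    """Average the visual and text signal scores for each candidate,
    then return (topic, description) pairs for candidates that clear
    the threshold and exist in the knowledge graph."""
    topics = []
    for name in set(visual_scores) | set(text_scores):
        score = (visual_scores.get(name, 0.0) + text_scores.get(name, 0.0)) / 2
        if score >= threshold and name in KNOWLEDGE_GRAPH:
            topics.append((name, KNOWLEDGE_GRAPH[name]))
    return sorted(topics)

# Hypothetical model outputs for a photo of a park with a river:
visual = {"Merced River": 0.8, "Yosemite National Park": 0.9, "picnic table": 0.7}
text = {"Merced River": 0.6, "Yosemite National Park": 0.7}

for topic, description in surface_topics(visual, text):
    print(f"{topic}: {description}")
```

In this toy version, “picnic table” is dropped both because its averaged score falls below the threshold and because it has no Knowledge Graph entry, while the river and the park are surfaced with their descriptions, mirroring the expand-for-a-short-description behavior Wu describes.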

For the moment, the new feature will start appearing on some images of people, places, and things in Google Images. In the future, Google says it will expand to more images, languages, and surfaces.

Image: Google
