Training computers to understand what they're looking at has been a major part of Google's AI strategy for years. Google first introduced Lens, its visual AI tool, in 2017, and according to the company, it's now used more than 10 billion times per month.

Google is still developing new use cases for Lens, including a feature for Android phones that the company announced on Wednesday. Called "search your screen," it lets people search for things they see in images and videos on their phone screens without leaving the messaging app or web page they're in.

If someone sends you a picture of an intriguing mystery location, you'll be able to long-press the home or power button on your phone, and Google Assistant will offer to search your screen and surface information about that location. It will do the same for clothing or other items it recognizes, pointing you toward where you might be able to buy what you're looking at.

Unlike previous Lens features that were US-only at first, "search your screen" will come to all languages where Lens is currently available when it rolls out in the coming months.
