Google has been busy updating Lens, its AI-powered image recognition tool, over the past year. It can now recognize dog and cat breeds and is available on iOS and non-Pixel Android phones. At I/O, Google announced the latest Lens update: it will be integrated directly into the stock camera app. The integration starts with the Pixel but will also come to other Android phones, like the recently announced LG G7.
The company is also rolling out several new features to Lens: smart text selection, style match and real-time results. With smart text selection, you can copy and paste text from the real world directly into your phone, and even quickly look up the meaning of words in a document. In a demo onstage, for example, Google showed how you could tap individual words on a menu, and Lens would tell you exactly what that dish is, complete with a visual guide and an ingredient list, which is especially useful if the menu is in a foreign language you don't quite understand.
The next feature is called style match, which is very similar to Pinterest's own Lens tool. Just point the camera at a designer lamp or someone's outfit, and it'll help you find similar-looking items on the web. You can also preview the items, browse a catalog and even make a purchase right from the phone.
The other big update is that Lens now works in real time. Point the camera at anything and it'll start processing the information in a split second. For example, hover it over a concert poster and it'll start playing a related music video from the artist. According to Google, this is thanks to state-of-the-art machine learning, on-device intelligence and cloud TPUs.