Google Lens, the AI-driven visual search tool, rolls out in Assistant on Pixel phones
Google Lens delivers contextual information using visual analysis.
Google has started rolling out its visual search feature, Google Lens, in Assistant to the first batch of Pixel and Pixel 2 smartphones.
"The first users have spotted the visual search feature up and running on their Pixel and Pixel 2 phones," 9to5Google reported late on Friday.
Google Lens was unveiled at the company's annual developer conference, I/O 2017. Built on machine learning and artificial intelligence, the app delivers contextual information using visual analysis. The technology was also demoed at the Pixel 2 and Pixel 2 XL launch event last month.
Built into the Photos app, Google Lens can recognise things like addresses and books. In Photos, the feature can be activated while viewing any image or screenshot. In Google Assistant, however, it is integrated directly into the sheet that pops up after long-pressing the home button.
"Lens was always intended for both Pixel 1 and 2 phones," Google had earlier said in a statement.
Apart from Lens, Google also introduced 'Clips', a camera that uses AI and machine learning to automatically capture the best pictures of its users. Tesla CEO Elon Musk, a vocal critic of AI, slammed the device. "This doesn't even 'seem' innocent," he tweeted, referring to the Clips camera.