Google wowed a lot of people at Google I/O earlier this year when it announced and then demonstrated Google Lens. In recent weeks we saw it roll out to Google Photos for some users, and now more lucky users are seeing it appear as an option within their Google Assistant interface.
Google Lens uses deep learning and neural networks to detect and identify objects and landmarks, and to improve OCR accuracy, within photos and images. A few weeks ago a Googler tweeted that Lens was being added to Google Assistant “in a few weeks”. True to their word, it is beginning to arrive on some Pixel and Pixel 2 phones.
According to screenshots captured by those fortunate enough to have received the rollout already, Lens is integrated into the Google Assistant interface: activating Google Assistant shows a Lens icon, which can be tapped to open the camera. Once an image in the viewfinder is captured by tapping on the display, Lens automatically begins searching for a match, and the results are shown at the bottom of the screen.
At this stage none of Ausdroid’s Pixel and Pixel 2 phones has been lucky enough to receive this update, but we assume (and hope) it will arrive on our devices soon. Has it arrived on yours yet? Let us know below.