Since announcing Lens at I/O last year, Google has slowly been adding functionality and expanding availability to more devices. Today they're announcing more functions, as well as a new, hopefully easier way to use it: building it into the camera app.
In terms of availability, Google will be making Lens available in the camera on a range of new handsets including, of course, the Google Pixel phones, the new LG G7, and also phones from Motorola, Xiaomi, Sony Mobile, Nokia, Transsion, TCL, OnePlus, BQ and Asus.
There are three new features being added to Google Lens: Smart Text Selection, Style Match and Realtime Results.
Smart Text Selection is scary awesome. It can recognise text in a document in the camera's field of view, then let you select, copy and paste it – just as you would in a digital document. Google sees this being useful for things like restaurant menus, business cards and more. Lens isn't just recognising the shapes of characters and letters; it's trying to understand the meaning and context behind the words, which lets you use them more intelligently.
The next feature being added is "Style Match", which lets you point the camera at things like clothes, furniture and home decor and find items like them – basically it matches the style of the thing and finds similar stuff.
Finally, Google is adding Realtime Results to Lens, letting you see information about an item you're looking at with that related information anchored to the real-world object. For example, if you look at a poster for a concert through Google Lens, you may be given the option to overlay a music video from that artist.
Google has utilised both on-device intelligence and their TPUs to power these new features, allowing Lens to identify 'billions of words, phrases, places, and things in a split second.'
The good news is we’ll be seeing these features coming to Lens in ‘the next few weeks’.