Over the last couple of years, Google Lens has garnered a lot of attention from the tech community during Google's annual developer conference. One of the highlights of Google I/O 2018 was the announcement that Google Lens would soon work in real time and would receive a couple of new features, including Style Match and Smart Text Selection, though we had yet to learn exactly when those features would arrive. We're now seeing reports that these new real-time features are rolling out to users as we speak.
In case you missed it, Google showed off a number of new Google Lens features on stage at Google I/O earlier this year. One of them, called Style Match, lets you search for similar products online simply by pointing your smartphone camera at something. This may surface the exact product you're looking at, or it may just show you some similarly designed results it was able to find. Meanwhile, the Smart Text Selection feature lets you point the camera at text displayed on any object.
This could be a menu, a street sign, a card, anything. The camera software detects that you're looking at text and lets you interact with it as if it were a digital object. This is especially useful because you can then run a Google web search on that text in real time. We're now seeing a report from Engadget saying these new Google Lens features are making their way to devices right now. As always, Google prefers a slow, gradual rollout for new features, so if you don't have access yet, it may just take a bit more time before you do.
Google Lens is being built directly into the native camera applications of smartphones from OEMs including Google, LG, Motorola, Xiaomi, Sony Mobile, Nokia, Transsion, TCL, OnePlus, BQ and ASUS.
Source: Engadget
from xda-developers https://ift.tt/2J27HBa
via IFTTT