"Google is bringing together a collection of efforts and teams to bring the benefits of AI to everyone." – The Economic Times
On May 17, 2017, at the Google I/O developer conference, CEO Sundar Pichai introduced a new product, "Google Lens," a powerful AI-driven visual search technology.
Google Lens is an artificial intelligence technology in which the Google Search engine and your smartphone's camera are combined with deep machine-learning algorithms, so that your camera acts like a search engine with vision-based computing ability. Your smartphone camera won't just see the world as you see it; it will also understand what it sees and surface information to help you take action. In short, it delivers information in a more useful and meaningful way.
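The see-understand-act idea above can be sketched in miniature. This is a toy illustration, not Google's actual implementation: the "model" is a stand-in nearest-neighbour lookup over made-up feature vectors, where a real system would use a deep convolutional network, and all labels, embeddings, and actions here are invented for the example.

```python
import math

# Hypothetical reference "embeddings" for a few known objects.
KNOWN_OBJECTS = {
    "rose":       [0.9, 0.1, 0.2],
    "restaurant": [0.1, 0.8, 0.3],
    "router":     [0.2, 0.2, 0.9],
}

# Each recognized label maps to a suggested follow-up action,
# mirroring how Lens pairs recognition with something actionable.
ACTIONS = {
    "rose":       "show species info and nearby florists",
    "restaurant": "show menu, hours and seating availability",
    "router":     "offer to join the Wi-Fi network",
}

def classify(embedding):
    """Return the known label whose embedding is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(KNOWN_OBJECTS, key=lambda label: dist(KNOWN_OBJECTS[label], embedding))

def lens(embedding):
    """Recognize the object, then attach a useful action to the result."""
    label = classify(embedding)
    return label, ACTIONS[label]

label, action = lens([0.85, 0.15, 0.25])  # an embedding close to "rose"
print(label, "->", action)
```

The point of the sketch is the shape of the pipeline, not the math: perception produces a label, and the label is immediately translated into something you can do.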
If you take a picture of a flower, Google Lens will tell you its name and species, along with the optimal conditions for its growth and blooming. Beyond that general information, it will also point you to the nearest florist where you can buy that flower.
If you take a picture of a restaurant, Google Lens will tell you not only the restaurant's name but will also be able to read out its menu, show its operating hours, and even tell you whether it currently has a seat free for you. Google Lens will recognize clubs, cafés, and bars in the same way.
Another great use of Google Lens will be easing day-to-day chores. If you are at a friend's place, you just need to take a picture of the SSID sticker on the back of the router, and your phone will automatically connect to that Wi-Fi network. Gone are the days of straining your eyes to read that tiny password and typing it into your phone.
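A hedged sketch of what the text-reading half of that trick might look like: assume an OCR step has already turned the sticker photo into plain text (the sticker format and field names below are my assumptions, not a documented standard), and then simple pattern matching pulls out the credentials.

```python
import re

def parse_wifi_sticker(ocr_text):
    """Extract SSID and password from OCR'd router-sticker text (assumed format)."""
    ssid = re.search(r"SSID[:\s]+(\S+)", ocr_text, re.IGNORECASE)
    pwd = re.search(r"(?:Password|Key|PIN)[:\s]+(\S+)", ocr_text, re.IGNORECASE)
    if not (ssid and pwd):
        raise ValueError("could not find SSID/password in sticker text")
    return {"ssid": ssid.group(1), "password": pwd.group(1)}

# Example OCR output from a hypothetical router sticker.
sticker = "Model XR500\nSSID: HomeNet-5G\nPassword: h7$Qk2zP"
print(parse_wifi_sticker(sticker))  # {'ssid': 'HomeNet-5G', 'password': 'h7$Qk2zP'}
```

The hard part Lens solves, of course, is the OCR on a blurry photo of a sticker; once the text exists, handing it to the phone's Wi-Fi settings is the easy bit.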
Google has been working for years to develop AI technologies that ease our lives, and Google Lens is one small product of that larger vision. It almost certainly relies on AI-driven image-processing techniques and, of course, machine-learning algorithms that let computers understand images and videos more efficiently. There is much more to come, and I won't be surprised if one day Google understands not only the context and semantics of every query with far greater precision but also the emotion behind it.
“When we started working on search, we wanted to do it at scale,” CEO Sundar Pichai said at Google’s 2017 I/O Conference. “That’s why we designed our data centers from the ground up and put a lot of effort into them. Now that we’re evolving for this machine-learning and AI world, we’re building what we think of as AI-first data centers.”
According to Sundar Pichai, Google Lens won't be a stand-alone application but will be fused into other Google products. Google will first integrate Lens with its Photos and Assistant apps, and gradually it will reach other Google products. Combined with Photos and Assistant, Google Lens will let users interact more naturally with visual artifacts. For instance, if you land in a country whose language you don't speak, Assistant with Google Lens will translate the text on signboards and in other pictures into your own language.
Fused with Google Lens, the Photos app will tell you much more about the photos you have already taken. This upcoming technology promises to be both fun and educational. You might be able to call a number directly from a screenshot a friend sent you, or from any number visible in a picture in your Photos app. Isn't that cool? And it will be even cooler once this artificial intelligence technology lands on your phone.
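The call-from-a-screenshot scenario breaks down the same way: OCR the image, then spot the dialable number in the text. Here is a minimal sketch of the second step, assuming hypothetical OCR output; the pattern below is deliberately simple and would need hardening for real-world number formats.

```python
import re

def find_phone_numbers(ocr_text):
    """Return phone-number-looking substrings from OCR'd text, stripped to digits."""
    # Match an optional "+", then 9+ characters of digits/spaces/dashes/parens.
    pattern = re.compile(r"\+?\d[\d\s\-()]{7,}\d")
    return [re.sub(r"[\s\-()]", "", m) for m in pattern.findall(ocr_text)]

text = "Hey, call the plumber at +1 415-555-0134 before 6 pm"
print(find_phone_numbers(text))  # ['+14155550134']
```

Once a candidate number is extracted, the app only has to hand it to the phone's dialer intent, which is the easy part.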
Will this technology be available to iPhone users? Since Google Assistant is already available for iPhone, it seems Google Lens will also reach iOS users through Assistant.
Though Google has not announced an exact release date, Google Lens is due to launch later this year. I am eagerly waiting to interact with this amazing AI-loaded technology.