Google Lens being integrated directly into the camera app
Google Lens, Google's camera-powered visual search tool, is being built directly into the camera app. Until now, it was available only through Google Assistant and the Google Photos app. Over the next few weeks, Google Lens will roll out natively to Google Pixel phones and select smartphones from LG, Motorola, Xiaomi, Sony, HMD Global, Transsion, TCL, OnePlus, BQ, and Asus. It is also getting three major updates.
Through Lens, search for information with a visual reference
Google Lens, first unveiled at I/O 2017, recognizes what is in an image and then searches the web for more information about it. It can also scan business cards and save the details as a contact. Essentially, it is an image recognition tool that combines search, artificial intelligence, augmented reality, and computer vision to offer contextual suggestions about whatever the camera sees.
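For a rough sense of what the underlying recognition step involves, here is a minimal Python sketch that classifies a photo with an off-the-shelf torchvision model; the file name photo.jpg is a placeholder, and Google's production models and pipeline are proprietary and far more capable.

```python
# Toy image recognition: classify a photo with a pretrained model.
# "photo.jpg" is a placeholder path, not anything from Google's stack.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.DEFAULT
preprocess = weights.transforms()          # resize, crop, normalize
model = models.resnet50(weights=weights)
model.eval()

image = Image.open("photo.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)     # add a batch dimension

with torch.no_grad():
    logits = model(batch)

label = weights.meta["categories"][logits.argmax(dim=1).item()]
print(label)                               # e.g. "golden retriever" for a dog photo
```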
Identify landmarks, a dog's breed, or a flower type
For example, take a picture of a monument and Google Lens will show you useful information about it, including its history and opening hours. Likewise, point the camera app at a restaurant and Lens will pull up its menu, pricing, reservation options, and hours.
Transfer text from the real world to your phone
Google Lens could already identify images, and now it can recognize text as well. If you don't recognize a dish on a menu, for example, Lens can show you a picture of it along with general information about it. That means Lens isn't just reading words; it also understands their meaning and context, which it does by drawing on Google Search's language-understanding algorithms.
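To make that text-recognition step concrete, here is a minimal sketch that assumes the open-source Tesseract engine via pytesseract; the file name menu.jpg and the plain search URL are stand-ins for Google's on-device OCR and language-understanding stack.

```python
# Read text from a photo of a menu and build a simple web search per dish.
# "menu.jpg" is a placeholder; Lens pairs OCR with Google Search's language
# models rather than a raw search URL like this.
from urllib.parse import quote_plus
from PIL import Image
import pytesseract

menu_text = pytesseract.image_to_string(Image.open("menu.jpg"))

for line in menu_text.splitlines():
    dish = line.strip()
    if dish:
        print(f"https://www.google.com/search?q={quote_plus(dish)}")
```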
A Pinterest-like fashion search feature
With the new Style Match feature, you can point your phone's camera at an item of clothing and Google Lens will find similar items in the same style. It does this by searching through millions of items while accounting for variations in texture, shape, angle, and lighting.
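As an illustration of how this kind of visual similarity search can work in principle, the sketch below embeds images with a pretrained torchvision network and ranks a tiny catalogue by cosine similarity to a query photo; the file names and the ResNet backbone are assumptions for the example, not Google's actual system.

```python
# Rank clothing photos by visual similarity to a query image.
# File names are placeholders; a generic ResNet embedding stands in for
# whatever representation Style Match really uses.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.DEFAULT
preprocess = weights.transforms()
backbone = models.resnet50(weights=weights)
backbone.fc = torch.nn.Identity()          # keep the 2048-d feature vector
backbone.eval()

def embed(path: str) -> torch.Tensor:
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vec = backbone(img).squeeze(0)
    return vec / vec.norm()                # unit length, so dot product = cosine

query = embed("my_jacket.jpg")
catalogue = {name: embed(name) for name in ("item_a.jpg", "item_b.jpg", "item_c.jpg")}

ranked = sorted(catalogue, key=lambda name: float(query @ catalogue[name]), reverse=True)
print(ranked)                              # most similar-looking items first
```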
Browse the world around you in real time with Google Lens
Google Lens now works in real time, parsing text the moment it appears in your camera's viewfinder. It can proactively identify and surface information about text on storefronts, street signs, and concert posters as you look at them. Advances in machine learning allow Google Lens to identify billions of words, phrases, places, and objects in a split second.
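For a feel of what a real-time text-parsing loop involves, here is a minimal sketch that grabs webcam frames with OpenCV and reads them with pytesseract; Lens uses much faster on-device models, so this only illustrates the shape of the loop.

```python
# Continuously read text from live camera frames.
# Frame-by-frame OCR like this is slow; it only sketches the idea.
import cv2
import pytesseract

cap = cv2.VideoCapture(0)                  # default camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray).strip()
    if text:
        print(text)                        # e.g. a storefront name or street sign
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```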
Google Lens coming to Google Maps
Bringing its visual assistance to Google Maps, Google Lens can now power an augmented reality version of Street View, letting users identify locations by landmarks rather than street names alone.