Google shows off AR for Search and big upgrades for Lens

Search will soon surface AR models that you can pull out into the real world.

Google is beefing up its search engine with some nifty computer vision and augmented reality features. At its I/O developer conference today, the company showed how it is going to integrate visual and interactive results directly into search by letting you take advantage of your smartphone camera. 

This is great for those of you who are visual learners, but everyone can arguably benefit from being able to visualize something more clearly. Google cited the example of a medical student searching for information about muscle flexion. With the new search capabilities, the student would be able to access a 3D model of the muscle and even visualize it on their own desk using AR. 

Shoppers could find good use for the feature as well. Google showed off a New Balance shoe on stage, demonstrating how you can look at it from different angles and even put it next to your clothes to see if they match.

There are even more playful uses for AR in Search. For instance, typing “Great White Shark” into the search box will let you access the shark in 3D and pull it out into the real world to better visualize it at scale. After all, Wikipedia may tell you that the shark can grow up to 4.9m in length, but you won’t really understand how large it is until you see that it’s bigger than your dining table.


Google Lens is getting a significant upgrade as well. While you can already point Lens at a monument or building to find out what it is, Lens can now work with restaurant menus and even receipts. If you point your camera at a menu, it’ll highlight the most popular dishes and let you tap on these items to see what they look like and check out customer reviews. When it comes time to foot the bill, Lens can also make sense of your receipt by calculating the tip and even splitting the total.

Lens can highlight popular items on restaurant menus.

In the kitchen, if you point the camera at a recipe, Lens can also pull up a video of the dish being made.

That aside, Lens is also getting some new capabilities to improve accessibility. If you point it at a sign in a language you don’t understand, Lens can give you a text or audio translation in real time. Google Go, the company’s search app for entry-level devices, will also have Lens integration that lets users take advantage of this feature. As Google was careful to point out, this could come in handy in developing countries where many still struggle with illiteracy.

On top of that, Google says it will be improving search context in relation to news and podcasts. Given all the talk these days about echo chambers and liberal biases on the part of technology firms, Google is adding a new Full Coverage tab that supposedly offers a more complete picture of how a story is being reported from different sources. This will presumably offer a greater diversity of perspectives and a more balanced picture. Google first announced Full Coverage for Google News last year, and it will arrive on Google Search later this year.

According to CEO Sundar Pichai, Google relies on machine learning to identify different types of stories from various sources. But while it aims to surface a wide range of content, it will also allow you to focus on what really interests you. 

In addition, Google Search will begin indexing podcasts, a popular source of news and information today. The search engine will let you listen to a podcast in the search results without leaving Search and even save an episode for listening later.

