People have always gathered information in a variety of ways — from talking to others, to observing the world around them, to, of course, searching online. Though typing words into a search box has become second nature for many of us, it’s far from the most natural way to express what we need. For example, if I’m walking down the street and see an interesting tree, I might point to it and ask a friend what species it is and if they know of any nearby nurseries that might sell seeds. If I had tried to express that question to a search engine just a few years ago… well, it would have taken a lot of queries.

But we’ve been working hard to change that. We’ve already started on a journey to make searching more natural. Whether you’re humming the tune that’s been stuck in your head, or using Google Lens to search visually (which now happens more than 8 billion times per month!), there are more ways to search and explore information than ever before.

Today, we’re redefining Google Search yet again, combining our understanding of all types of information — text, voice, visual and more — so you can find helpful information about whatever you see, hear and experience, in whichever ways are most intuitive to you. We envision a future where you can search your whole world, any way and anywhere.

Find local information with multisearch

The recent launch of multisearch, one of our most significant updates to Search in several years, is a milestone on this path. In the Google app, you can search with images and text at the same time — similar to how you might point at something and ask a friend about it.


Now we’re adding a way to find local information with multisearch, so you can uncover what you need from the millions of local businesses on Google. You’ll be able to use a picture or screenshot and add “near me” to see options for local restaurants or retailers that have the apparel, home goods and food you’re looking for.

Later this year, you’ll be able to find local information with multisearch.

For example, say you see a colorful dish online you’d like to try, but you don’t know what’s in it or what it’s called. When you use multisearch to find it near you, Google scans millions of images and reviews posted on web pages and by our community of Maps contributors, to find results about nearby spots that offer the dish so you can go enjoy it for yourself.
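For readers curious how this kind of photo-to-place matching can work in principle, here is a minimal, purely illustrative sketch. It assumes pre-computed image embeddings (from any off-the-shelf vision model) for the query photo and for dish photos attached to nearby listings; the Listing class, dishes_near_me function and all data below are hypothetical stand-ins, not Google’s actual system.

```python
# Toy sketch: rank nearby listings by how closely their dish photos match a query photo.
from dataclasses import dataclass
import numpy as np

@dataclass
class Listing:
    name: str                    # business name (hypothetical)
    distance_km: float           # distance from the user
    dish_embedding: np.ndarray   # embedding of a dish photo from the listing

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def dishes_near_me(query_embedding: np.ndarray,
                   listings: list[Listing],
                   max_km: float = 5.0,
                   top_k: int = 3) -> list[tuple[str, float]]:
    """Filter listings to those nearby, then rank by visual similarity to the query photo."""
    nearby = [l for l in listings if l.distance_km <= max_km]
    scored = [(l.name, cosine_similarity(query_embedding, l.dish_embedding)) for l in nearby]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Usage with random vectors standing in for real model outputs.
rng = np.random.default_rng(0)
query = rng.normal(size=64)
listings = [Listing(f"Restaurant {i}", rng.uniform(0.5, 10.0), rng.normal(size=64)) for i in range(20)]
for name, score in dishes_near_me(query, listings):
    print(f"{name}: similarity {score:.2f}")
```

The key design idea the sketch captures is simply “near me” as a distance filter applied before a visual-similarity ranking; a production system would of course use far richer signals, such as reviews and menu text.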

Local information in multisearch will be available globally later this year in English, and will expand to more languages over time.

Get a more complete picture with scene exploration

Today, when you search visually with Google, we’re able to recognize objects captured in a single frame. But sometimes, you might want information about a whole scene in front of you.

In the future, with an advancement called “scene exploration,” you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.


In the future, “scene exploration” will help you uncover insights across multiple objects in a scene at the same time.

Imagine you’re trying to pick out the perfect candy bar for your friend who’s a bit of a chocolate connoisseur. You know they love dark chocolate but dislike nuts, and you want to get them something of quality. With scene exploration, you’ll be able to scan the entire shelf with your phone’s camera and see helpful insights overlaid in front of you. Scene exploration is a powerful breakthrough in our devices’ ability to understand the world the way we do, so you can easily find what you’re looking for, and we look forward to bringing it to multisearch in the future.
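To make the idea concrete, here is a small, hypothetical sketch of the filtering-and-overlay step, assuming an object detector has already returned labeled products for a camera frame. The DetectedProduct class, overlay_labels function and sample data are illustrative only and do not reflect Google’s actual pipeline.

```python
# Toy sketch: pick out shelf items matching a shopper's criteria and label them for an AR overlay.
from dataclasses import dataclass

@dataclass
class DetectedProduct:
    name: str
    box: tuple[int, int, int, int]   # (x, y, width, height) in the camera frame
    is_dark_chocolate: bool
    contains_nuts: bool
    rating: float                    # aggregate review score out of 5

def overlay_labels(detections: list[DetectedProduct],
                   min_rating: float = 4.0) -> list[tuple[tuple[int, int, int, int], str]]:
    """Return (bounding box, label) pairs for dark-chocolate, nut-free, well-reviewed products."""
    overlays = []
    for p in detections:
        if p.is_dark_chocolate and not p.contains_nuts and p.rating >= min_rating:
            overlays.append((p.box, f"{p.name} · {p.rating:.1f}/5 · dark, nut-free"))
    return overlays

# One simulated camera frame of a candy shelf.
frame = [
    DetectedProduct("Midnight 85%", (10, 40, 80, 120), True, False, 4.6),
    DetectedProduct("Hazelnut Crunch", (100, 40, 80, 120), True, True, 4.4),
    DetectedProduct("Milk Classic", (190, 40, 80, 120), False, False, 4.1),
]
for box, label in overlay_labels(frame):
    print(box, label)  # an AR layer would draw each label at its box position
```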


These are some of the latest steps we’re taking to help you search any way and anywhere. But there’s more we’re doing, beyond Search. AI advancements are helping bridge the physical and digital worlds in Google Maps, and making it possible to interact with the Google Assistant more naturally and intuitively. To ensure information is truly useful for people from all communities, it’s also critical for people to see themselves represented in the results they find. Underpinning all these efforts is our commitment to helping you search safely, with new ways to control your online presence and information.
