Google I/O 2022: Multisearch Expanded With Local Results, Scene Exploration Feature Teased

Google announced at its I/O 2022 consumer keynote on Wednesday that it is updating its search engine with improvements to visual search. The company revealed that it will expand its multisearch feature to let users view local results. Google also said it is working on a new scene exploration feature that captures information about multiple objects as users pan their camera, showing information and insights on their screen. The feature builds on multisearch, and the company has not yet disclosed when it will be available to users.

At Google I/O 2022, the company revealed that it will further expand its multisearch feature on Google Lens, which allows users to search with images and text at the same time. Users will be able to search for "near me" results to find options for a local retailer or restaurant based on the picture they snapped and their search term. Google says localized information in multisearch will be available in English to all users globally later this year, while support for other languages will be added in the future.

Google says localized results for multisearch will be available in English to all users globally later this year
Photo Credit: Google

Introduced last month, multisearch is a feature that Google calls one of its "most significant upgrades" to Search in many years. Multisearch allows users to tap on an image of an item or product, such as an outfit or home decor, then swipe up to add text for a 'combined' search query. Users can tap on an image of an orange dress, then add a 'green' or 'blue' query to search for similar products in another colour, or tap on an image of a houseplant and add a query for 'care instructions'.

To find local results using multisearch, the company says it scans millions of images and reviews posted on web pages, along with contributions from its community of Maps contributors, to surface results from nearby places. The feature, which relies on machine learning, can be used to find where a particular dish is served at a restaurant near you, or to locate a product at a local retailer, according to Google.

The company is also working on expanding the multisearch feature on Google Lens with a new capability called 'scene exploration'. Google says users will be able to use multisearch to pan their cameras and see information about "multiple objects in a wider scene." While shopping, for example, this would let users scan an entire shelf of products and see insights overlaid on their screens.

The company teased its under-development scene exploration feature at Google I/O
Photo Credit: Screenshot/ Google

Google says it plans to bring scene exploration to multisearch in the future, but hasn't disclosed which regions will have access to the feature or what languages will be supported. "Scene exploration is a powerful breakthrough in our devices' ability to understand the world, so that you can easily find what you're looking for," said Prabhakar Raghavan, senior vice president at Google.

In another announcement related to its Search product, Google said on Wednesday that it will add the ability to request the removal of phone numbers, home addresses, and email addresses through the Google app in the coming months. The company announced last month that it was expanding its removal policies related to personal information in search results to all users.

