New Google AI-powered visual tool to enhance search results

Google users will be able to combine images and text in search queries in the coming months as the tech giant is poised to introduce a new visual search tool with AI-powered features.

Update: 2021-10-11 07:54 GMT

This new visual feature was announced by Google at its Search On livestream event held on September 29. At the event, Google shared details about how it is bringing the latest in AI to its products to let users search in new ways and explore information more naturally, media reports said.

In short, the update will let users further narrow image searches using text. The new feature will arrive within months through the Google Lens search tool, the company said at its livestreamed event.

The updates to Lens are powered by a machine learning model the company unveiled at I/O earlier this year, called MUM, or Multitask Unified Model, said a Deccan Herald report. According to Google, MUM will unlock more insights in the future and connect users with information on the web that they may not have found otherwise.

Media reports quoted Google as saying that with this new capability, users could tap the Lens icon while looking at a picture of a shirt and ask Google to find the same pattern on another article of clothing, such as socks. For example, if you snap a photo of a paisley shirt to find similar items online using Google Lens, you can add the query “socks with this pattern” to specify the garments you are looking for.

This helps when you are looking for something that might be difficult to describe accurately with words alone. By combining images and text into a single query, Google is making it easier to search visually and express questions in a more natural way, media reports said.

Google is also encouraging visual inspiration with a newly designed, browsable results page aimed at users who are hunting for ideas. In addition, it is using advanced AI systems to pin down key moments in videos, such as the winning shot in a basketball game or the different steps in a dance routine. The first version of this feature will roll out in the coming weeks.

Users will also be able to run reverse-image searches when browsing in the Google iOS app or the Chrome desktop browser. Selecting an image will pull up similar visuals online, which can help shoppers find where to buy items seen in photos and will eventually direct them to Google Shopping, said an Economic Times report.

Further, according to the DH report, Google said it was making it easier for people to search for merchants big and small, and helping people assess the credibility of information online.

Making more items searchable is another priority. Google said it was licensing its free Address Maker app to governments and organisations to map routes and assign addresses to businesses and homes not listed on Google Maps, cutting the time it takes to assign addresses across an entire town.

All this work helps creators, businesses and publishers as well, Google said, adding that it will continue to build more useful products. Its aim is to push the boundaries of what it means to search so that people can find the answers they are looking for, and, in doing so, be inspired to ask more questions.
