At its annual Search On event, Google announced a series of new features for Google Lens and Google Search built on its powerful MUM algorithm, which brings a “multimodal” understanding of queries.
At its last Google I/O conference, the Californian search giant unveiled MUM (Multitask Unified Model), a machine learning model designed to help users find answers to their queries faster by reducing the number of steps required. To do this, MUM can analyze data in 75 different languages and cross-reference information across text, images, video and audio. This is what Google calls a “multimodal” understanding. At its Search On event, which has just ended, the company announced the first concrete uses of MUM in its services.
Visual search with Google Lens will become more relevant thanks to the power of MUM. Soon it will be possible to search from a photo while adding a text question about what you are seeing. For example, Google explains, you could take a picture of an animal and ask how to feed it, or use a photo of a t-shirt to ask Google Lens to find a pair of socks with the same pattern. This feature will roll out for the English version of the service between the end of the year and the first quarter of 2022.
Google Lens is coming to Chrome
In a few months, the desktop version of the Chrome browser will include Google Lens. It will then be possible to select images, videos and text on a web page and get search results without leaving it.
MUM will also help refine and deepen Google’s search results with new tools: “Things to know”, which lists key information related to a query; “Refine this search” and “Broaden this search”, to narrow or widen certain aspects of a topic; and a more visually browsable results page to make navigation easier, especially on mobile.
In another major development, within a few weeks MUM will power search within videos themselves. The algorithm will be able to identify related topics, even if they are not explicitly mentioned in the video. This option will initially be reserved for queries in English.
Google is also strengthening the shopping side of its mobile search engine, again with more visual results and better localized information. The main new addition is the “in stock” filter, which lets you see which nearby shops have the item you are looking for on their shelves. This feature is now available in France.
Wildfires in Google Maps
After introducing satellite data on forest fires last year, Google Maps will gain a new layer that gives users emergency websites and phone numbers, evacuation instructions when available, and the size and progression of a fire. This feature will be available worldwide on Android, iOS and desktop next month.