The North American giant promises to change its search engine (for the better)! The goal is, in its own words, to make the entire search experience more natural and intuitive. Soon we will have a “new” Google search. Even better!
The news was shared by the technology company on its official blog, which opens with a short tour of its last two decades of existence. There, the company recalls its mission to “organize the world’s information and make it universally accessible and useful”.
There is a “new” search engine on the way with a more natural and intuitive search
More recently, during the Search On event, Google demonstrated how advances in artificial intelligence are once again enabling it to transform its information products.
According to the company, these experiences go beyond the search box and work more like our (human) minds. That is, they are as multidimensional as we are as people.
According to Google, the company envisions a world in which it is possible to find exactly what a user is looking for through the combination of images, sounds, text and voice.
In fact, just like people do naturally.
Here’s the first look at Google’s multisearch
The user will be able to ask questions with fewer words, or even no question at all, and Google search will still understand exactly what they mean. In addition, the user will be able to explore information organized in a way that makes more sense. This is the new “multisearch”, and it promises to change the way we use the search engine.
“We call this making search more natural and intuitive, and we’re on a long-term journey to bring this vision to people around the world,” the company writes. “To give you an idea of how we’re evolving our information products, here are three highlights of what we’re showing today at Search On.”
Visual search will also get new features from Google
Again, the goal is to make visual search work more naturally, and the company plans to do so through multisearch. According to Google, this is a completely new way of searching, using images and text simultaneously, much like pointing at something and asking a friend a question about what you’re seeing.
“We introduced multisearch earlier this year as a beta in the US, and during Search On we announced that we are extending this functionality to more than 70 languages in the coming months,” says Google.
“We’re taking this feature even further with ‘multisearch near me’, which allows the user to take a photo of an unknown item, such as a dish or a plant, and find it at a nearby place, such as a restaurant or a gardening store,” the company adds.
The best of all? The search giant promises to start rolling out “multisearch near me” in English in the US later this fall of 2022.
Google Lens will come to life with (more) dynamic translation
Advances in artificial intelligence (AI) have enhanced the capabilities of Google’s services, moving beyond text translation to image translation.
In fact, it is now possible to blend translated text into the background image thanks to a machine learning technique called Generative Adversarial Networks (GAN).
That is, if the user points their camera at a magazine in another language, for example, the translated text will be overlaid realistically on the underlying image.
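Google does not detail its model, but the adversarial idea behind GANs can be sketched in a few lines: a generator learns to produce samples that look like real data, while a discriminator learns to tell real from generated. The toy below is purely illustrative (a one-parameter generator against a logistic-regression discriminator on 1-D data; every name and number is an assumption, not Google’s implementation):

```python
# Toy sketch of adversarial training: generator vs. discriminator.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples centred at 3.0 (stand-in for real background pixels).
real = rng.normal(3.0, 0.5, size=(256, 1))

# Generator g(z) = w_g*z + b_g ; Discriminator d(x) = sigmoid(w_d*x + b_d).
w_g, b_g = 1.0, 0.0
w_d, b_d = 0.1, 0.0
lr = 0.05

for step in range(200):
    z = rng.normal(size=(256, 1))
    fake = w_g * z + b_g

    # Discriminator ascent on log d(real) + log(1 - d(fake)).
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    w_d += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b_d += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent on log d(fake) (non-saturating GAN loss).
    d_fake = sigmoid(w_d * fake + b_d)
    grad_fake = (1 - d_fake) * w_d  # derivative of log d(fake) w.r.t. fake
    w_g += lr * np.mean(grad_fake * z)
    b_g += lr * np.mean(grad_fake)
```

After a few hundred steps the generator’s offset `b_g` drifts upward from 0 toward the real data. At scale, the same tug-of-war is what lets a GAN synthesize replacement pixels that blend convincingly with their surroundings.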
The best navigation app also has new features: here’s the “new” Google Maps
Finally, Google Maps is also gradually gaining new features on its platform. Soon, exploring the world with immersive view will be a reality!
Features like navigation and real-time traffic have already made Google Maps much more useful. Now, Google points out, it is making another significant advance in mapping by surfacing helpful information, such as the weather and how crowded a place is, through immersive view in Google Maps. With this new experience, the user can get a feel for a place even before setting foot in it and can confidently decide when and where to go.
Suppose, for example, the user wants to meet a friend at a certain restaurant. They can zoom in on the neighborhood and the restaurant to get an idea of what the place might be like on the date and time of the planned meeting, visualizing things like the weather and how crowded the space might be.
In short, by fusing its advanced imagery of the world with predictive models, Google can give you an idea of what a location will be like tomorrow, next week, or even next month.
Finally, this information was revealed by Prabhakar Raghavan, Google’s Senior Vice President, on the company’s official blog.