Have you heard about Google’s new Maps, Search, and Shopping features? If not, we are here to tell you all about them!
At its Search On event in September 2022, Google announced new features for Google Search, Google Lens, Shopping, and Maps. All of the latest features focus on improving the search experience for users.
In this article, we go through the event's key takeaways and the new local search features. Let's take a deep dive!
Finding Restaurants by Their Dishes
Google introduced two new ways to look for restaurants in your area. Users can now find restaurants through Multisearch or by searching for a specific dish.
Users can now do a visual search with Google Lens by taking a picture of any tasty dish they see and then adding "near me" in Google Multisearch. The results will show nearby restaurants that serve that dish.
You can also type the dish’s name into Google’s search bar. The results will show restaurants in your area offering the specific dish, with information about their pricing, ingredients, and other menu details.
Cindy Huynh, product manager at Google Lens, explained, "This new way of searching will help me and local businesses in my community, so I can more easily support neighborhood shops during the holidays."
Source: Google
Maps Live View
Google Maps also offers a new feature called Live View, which will be available in New York, Tokyo, London, Los Angeles, San Francisco, and Paris.
Live View allows you to utilize the camera on your phone to search for information on the areas around you. For example, you can lift your phone and tap the camera symbol in the search bar to find local stores and locations such as coffee shops, banks, grocery stores, and ATMs.
With arrows and directions overlaid in augmented reality (AR), you can quickly see where stores are and how far away they are, and even spot places that aren't immediately visible, such as a clothing store down the street.
Source: Google
AR Shopping
Google is launching a new feature powered by augmented reality to take your shopping to another level. Users can now shop for shoes and beauty products in AR.
For shoes, the AR shopping experience is currently limited to the Vans, Saucony, and Merrell brands. However, more brands will be supported in the future.
Type in the specific sneaker you want, such as "Shop blue Vans sneakers," and then select the option "View in my space." You can then view the shoes in your own room, spinning and zooming to decide whether you like their color, laces, thread, style, and other details.
Google’s augmented reality (AR) technology makes it easier to find your perfect foundation shade when searching for beauty products online. To accomplish this, Google improved its photo library by including 150 models who reflect a diversity of skin tones, genders, ages, face shapes, ethnicities, and skin types.
This new feature will help users correctly evaluate more than 2,000 foundation, lipstick, and eyeshadow shades from various brands, including MAC Cosmetics, Charlotte Tilbury, Black Opal, and L'Oréal.
New Filters for Google Maps
Google Maps also added new filters to identify nearby electric vehicle (EV) charging stations and wheelchair-accessible locations.
The EV filter helps you locate fast-charging stations and plug types compatible with your vehicle, and it even shows station availability. It will roll out worldwide for Android and iOS users.
Additionally, after initially releasing its “accessible locations” function in the United States, Australia, Japan, and the United Kingdom in 2020, Google announced this feature will now be available worldwide.
Google now allows users all over the world to search for locations that are wheelchair accessible and have no stairs.
To activate this function, turn on the new "Accessible Places" setting in Google Maps, and a wheelchair symbol will appear next to accessible locations.
Source: Google
Google Lens Translate Update
A new update is coming to Google Lens that brings new artificial intelligence (AI) translation capabilities. The update allows users to translate text on more complex backgrounds.
Instead of hiding the original text under the translated content, Google will remove the original text and rebuild the pixels beneath with an artificial intelligence (AI)-generated background.
Google now uses GAN models (generative adversarial networks) to improve how the translated text is displayed. The well-known "Magic Eraser" feature on Pixel smartphones uses the same underlying technology.
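Google hasn't published the code behind this feature, but the basic idea of "erase the text, then rebuild the background" can be sketched in a few lines. The example below is a conceptual illustration only: it uses OpenCV's classical inpainting as a stand-in for Google's GAN-based model, and the file names and text coordinates are hypothetical.

```python
# Conceptual sketch: mask out the original text, rebuild the background
# behind it, then draw the translated text on top. Google Lens uses a
# GAN-based model for the reconstruction step; classical OpenCV
# inpainting stands in for that model here.
import cv2
import numpy as np

image = cv2.imread("sign_photo.jpg")              # hypothetical input photo
mask = np.zeros(image.shape[:2], dtype=np.uint8)

# Hypothetical bounding box around the detected source-language text.
x, y, w, h = 120, 80, 240, 60
mask[y:y + h, x:x + w] = 255

# Fill the masked region from the surrounding pixels.
background = cv2.inpaint(image, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)

# Overlay the translated text on the reconstructed background.
cv2.putText(background, "Translated text", (x, y + h - 10),
            cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 0, 0), 2)
cv2.imwrite("translated_photo.jpg", background)
```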
Final Words
User experience is essential in today’s digital world, and Google’s new features are here to change the game! Users can now shop, navigate, and search online in a way they never imagined! Most of Google’s new features are already available for Android and iOS devices in the United States, and a few are available globally. So feel free to try out these new updates yourself today!