Later this year, Google is going to roll out an update to its Lens AR Translate capabilities so users can more seamlessly translate text on complex backgrounds. Instead of covering up the original text as it currently does, Google will erase the text, re-create the pixels underneath with an AI-generated background, and then overlay the translated text on top of the image.
“Say you’re visiting New York with plans to knock out your holiday shopping and catch up with friends,” Google says in a blog post. “Lift your phone and tap on the camera icon in the search bar to see nearby stores and other places like coffee shops, banks and ATMs. With AR-powered directions and arrows, you can see what direction they’re in and how far away they are — and even spot places that aren’t in your immediate view (like a clothing store around the block) to get a true sense of the neighborhood at a glance.”
If you want to find other places, you can tap on categories to explore what restaurants, bars, dessert shops, parks and transit stations are nearby. In addition to seeing where places are, users will be able to see key information about each spot overlaid, such as whether the location is busy, if it's open and what the price range is.
Google also announced that it’s expanding its “accessible places” feature globally after initially launching it in the U.S., Australia, Japan and the U.K. in 2020. The feature is designed to help people determine whether a place is wheelchair accessible. You can turn on the “accessible places” setting in the app, after which you will start to see a wheelchair icon on places that are wheelchair accessible. You’ll be able to see if a place has accessible seating, restrooms and parking. Google notes that the feature can also be helpful if you want to avoid stairs because you have a stroller or are using a cart.
Google has announced a new AR shopping feature that is designed to make it easier to find your exact foundation match. The company says its new photo library features 148 models representing a diverse spectrum of skin tones, ages, genders, face shapes, ethnicities and skin types. As a result, it should be easier for shoppers to better visualize what different products will look like on them.
“Here’s how it works: Search for a foundation shade on Google across a range of prices and brands, like ‘Armani Luminous Silk Foundation’,” Google explained in a blog post. “You’ll see what that foundation looks like on models with a similar skin tone, including before and after shots, to help you decide which one works best for you. Once you’ve found one you like, just select a retailer to buy.”
Google rolls out new features across Maps, Search and Shopping by Aisha Malik originally published on TechCrunch