Google’s engineers must’ve been hungry when designing these latest updates, as Google Lens is getting a feast of new features that make it easier to find delicious-looking food.
Google Lens is Google’s picture-based search tool that lets you search with images instead of text; you can snap a picture or screenshot something and search for similar-looking images on the web (it's Ctrl + F for the real world). Beyond that, Google Lens can also scan text in an image, letting you search what a written phrase means, look up help with your homework, or translate some text.
Being able to look up what something is using a picture is certainly helpful, but if you’re like us, you don’t just want to know what a delicious-looking meal is called – you need to know where you can go to munch on it right away. Well, the Google Lens multisearch 'near me' feature lets you search using a snapped picture of a food item, then add a 'near me' tag to get a list of nearby locations that serve that dish.
In Google's example, rather than trying to deduce what the treats are by googling 'jam-filled cookies,' the person was able to snap a pic to find out that they’re Linzer Augen. Then, by tapping the search bar at the top, they added 'near me' to discover nearby bakeries and restaurants that say they serve the treat.
But knowing what Linzer Augen are and where to find them doesn't mean you can eat them – if they're made with almonds and you have a nut allergy, the cookies could cause you serious trouble. So, when you search for food near you (like the soup dumplings in Google's second example) in Lens or regular Google Search, you can see additional information about the dish you want to eat.
This includes information about ingredients, how spicy the food is, and whether it’s vegetarian or vegan – you can even filter the results so you only see nearby options that meet your requirements.
These new tools should be available in Google Lens – both the Android and iOS versions of the app – and Google Search right now (they launched on November 17). However, for now, they’re restricted to users and restaurants in the US, which is often the case for new Google features. So if you live somewhere else, or are traveling abroad, you won’t be able to rely on these new tools right away; we expect they’ll be rolled out to other regions in the not-too-distant future.
Translations that blend in
Another new feature, which is now available worldwide, is Lens’s AR translation update.
AR translation does what it sounds like – you snap a photo of some written text, and your smartphone translates it into the language of your choice. It’s probably the Google Lens feature we rely on most, as it works pretty well – it’s not always perfect, but the translation is almost always good enough for you to get the gist of what a sign or menu is saying.
This latest improvement won’t make the translations more accurate; instead, it’ll improve how they look on the image you take. Normally, Google Lens blocks out the original writing and superimposes the translation on top, but this can look fairly ugly – especially if there’s a beautiful background underneath.
The new change allows Lens to erase the original text rather than covering it up, reconstruct the background using AI, and then place the translation on top, making it look more like it belongs there. If you’re a restaurant owner who wants a quick translation of your lovingly designed menu for foreign guests, or a foodie who wants to share a translated menu online so other people know what to expect, this tool should make your translations look a lot nicer.
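To make the erase-and-rebuild idea concrete, here's a minimal Python sketch of the same general pipeline, using OpenCV's classical inpainting as a stand-in for Google's learned AI inpainter – the file names, bounding box, and replacement string are all hypothetical, and this illustrates the technique rather than Google's actual implementation:

```python
# Sketch of "erase, inpaint, then overlay" – the general idea behind
# Lens's new translation rendering. Google uses a learned AI inpainter;
# cv2.inpaint is a simpler classical substitute. File names, the text
# bounding box, and the translated string are hypothetical.
import cv2
import numpy as np
from PIL import Image, ImageDraw, ImageFont

# Load the photo and mask off the region containing the original text.
img = cv2.imread("menu_photo.jpg")             # hypothetical input photo
mask = np.zeros(img.shape[:2], dtype=np.uint8)
x, y, w, h = 120, 80, 300, 40                  # hypothetical text bounding box
mask[y:y + h, x:x + w] = 255

# Erase the text and reconstruct the background behind it.
clean = cv2.inpaint(img, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)

# Draw the translated text over the reconstructed background.
pil = Image.fromarray(cv2.cvtColor(clean, cv2.COLOR_BGR2RGB))
draw = ImageDraw.Draw(pil)
font = ImageFont.load_default()                # a real app would match the original font
draw.text((x, y), "Jam-filled cookies", font=font, fill=(40, 40, 40))
pil.save("menu_translated.jpg")
```

The end result is the effect the update is going for: the replacement text sits on a plausible reconstruction of the background instead of an opaque block.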