There's a certain allure to smart glasses that bulky mixed-reality headsets lack. Meta's Ray-Ban Smart Glasses (formerly Stories), for instance, are a perfect illustration of how you can build smarts into a wearable without making the wearer look ridiculous. The question is, can you still end up looking ridiculous while wearing them?
Ray-Ban Meta Smart Glasses' big upcoming Meta AI update will let you talk to your stylish frames, querying them about the food you're consuming, the buildings you're facing, and the animals you encounter. The update is set to transform the wearable from just another pair of voice-enabled glasses into an always-on-your-face assistant.
The update isn't public yet, and it will only apply to the Ray-Ban Meta Smart Glasses, not their Ray-Ban Stories predecessors, which lack Qualcomm's Snapdragon AR1 Gen 1 chip. This week, however, Meta gave a couple of tech reporters at The New York Times early access to the Meta AI integration, and they came away somewhat impressed.
I must admit, I found the walkthrough more intriguing than I expected.
Even though they didn't tear the glasses apart or get into the nitty-gritty technical details I crave, their real-world experience depicts Meta AI as a fascinating and possibly useful work in progress.
Answers and questions
In the story, the authors use the Ray-Ban smart glasses to ask Meta AI to identify a variety of animals, objects, and landmarks, with varying success. In the confines of their homes, they spoke at full volume, asking Meta AI, "What am I looking at?" They also enabled transcription so readers could see what they asked and the responses Meta AI provided.
It was, in their experience, quite good at identifying their dogs' breed. However, when they took the smart glasses to the zoo, Meta AI struggled to identify far-away animals. In fact, Meta AI got a lot wrong. To be fair, this is a beta, and I wouldn't expect the large language model (Llama 2) to get everything right. At least it isn't hallucinating ("that's a unicorn!"); it's just getting things wrong.
The story features a lot of photos taken with the Ray-Ban Meta Smart Glasses, along with the queries and Meta AI's responses. Of course, that's not really how the experience plays out. As the authors note, they were speaking to Meta AI wherever they went and then heard its responses spoken back to them. That's all well and good when you're at home, but just weird when you're alone at a zoo, talking to yourself.
The creep factor
This, for me, remains the fundamental flaw in many of these wearables. Whether you're wearing Ray-Ban Meta Smart Glasses or Amazon Echo Frames, you'll still look as if you're talking to yourself. To get the information you need, you may have to engage in a lengthy "conversation" with Meta AI. Again, if you're doing this at home, letting Meta AI walk you through a detailed recipe, that's fine. Using Meta AI as a tour guide in the middle of, say, your local Whole Foods might label you as a bit of an oddball.
We do talk to our phones and even our smartwatches, but when people see you holding your phone or smartwatch near your face, they understand what's going on.
The New York Times' authors noted that they found themselves whispering to their smart glasses, but they still got looks.
I don't see a way around this issue, and I wonder if it will be the primary reason people swear off what is arguably a very good-looking pair of glasses (or sunglasses), even if they offer the passive smart technology we need.
So, I'm of two minds. I don't want to be seen as a weirdo talking to my glasses, but I can appreciate having intelligence there and ready to go; no need to pull out my phone, raise my wrist, or even tap a smart lapel pin. I just say, "Hey Meta," and the smart glasses wake up, ready to help.
Perhaps the tipping point will come when Meta can integrate subtle AR screens into the frames to add some much-needed visual guidance. Plus, access to visuals might cut down on the conversation, and I would appreciate that.