Meta Ray-Bans’ New Live AI and Translation, Hands-On: Signs of AR Glasses to Come


I activated Meta Ray-Bans’ new live AI feature and took a morning walk across Manhattan. It was a strange experience. A white LED in the corner of my eye stayed on as my glasses kept a live feed of everything around me. I awkwardly asked questions: about the pigeons, about the construction workers, about whether it knew what car was nearby or who owned those trucks across the street. I got mixed answers, sometimes no answer at all. And then my connection dropped because of spotty Bluetooth in the city.

My first steps with an always-aware AI companion have been weird, and even more science-fictiony than anything I’d experienced over the last year. Much like a recent demo with Google’s always-on Gemini-powered glasses, Meta’s Ray-Bans — which are already very much available — are taking the next steps toward becoming something like an always-aware assistant. Or agent, as the AI landscape is calling it now. Live AI and live translation, once turned on, stay on. The assumption is that the AI can see what you see. And maybe it’ll help you do something you don’t know how to do.


But these features also look like previews of what could be a whole new set of Meta glasses coming next year, ones that could have their own display and maybe even a gesture-controlling wristband, based on hints Mark Zuckerberg gave on Threads last week after a story by The Wall Street Journal’s Joanna Stern.

At the moment, Live AI feels like an odd glimpse of a more always-on, more intrusive AI future, one that, in my very early attempts, is more of a companion than a helper. And yet translation, when it works, feels surprisingly helpful… even if it operates at a bit of a delay.

Meta’s Ray-Bans next to a phone showing the Meta AI settings page

Live AI mode is part of an Early Access set of features. It’s separately toggled on and off.

Scott Stein/CNET

Live AI: A constantly listening and watching state of mind

Turning on Live AI means starting live video recording. Although the video isn’t saved for you to watch later, it’s processed by Meta’s AI via your phone and relayed to the glasses. The LED stays on to notify people that it’s recording, but in my experience, people don’t notice the LED all that much or don’t seem to care. Anything you say can be interpreted by Meta AI, so forget about having conversations with others. Around the office, I just seemed like a weird guy talking to myself, or seemingly talking to my colleagues (only to have people try to answer and realize I wasn’t talking to them). But Live AI can be paused by tapping the side of the glasses.

You can end Live AI by saying, “Stop Live AI,” but sometimes Meta AI thought I was asking whether it was a live AI — a “Who’s on first?” moment. I had to yell it out several times before it stopped.

A self-portrait of CNET’s Scott Stein wearing Meta’s Ray-Ban smart sunglasses

With Meta Ray-Bans on, it’s hard for anyone to know you’re wearing smart tech… or having a conversation with AI.

Scott Stein/CNET

The challenge with Live AI is figuring out what to do with it. I walked around the office asking about the furniture placement and was told everything seemed fine: “the room appears to be well-designed and functional, with no obvious changes needed.” I asked about a story I was writing on my laptop, and it said: “The text appears to be a cohesive and well-structured piece, with no parts that feel unnecessary.” I kept trying to get constructive feedback, and it was hard to get anything that wasn’t generic, although it did point out some notable lines and summarized my points.

As I walked outside, it told me what street I was on, but it was wrong — I corrected it, and it simply acknowledged the correction and moved on. It knew the Chase bank I was looking at and told me the bank’s hours, and it knew Joe’s Pub when I stood at the entrance to the Public Theater, but it couldn’t tell me what was playing that night. It recognized common pigeons, misidentified a car at the curb as a Mercedes (it was a Lincoln) and recommended, for some reason, a bar down the street that was now, according to Meta AI, “defunct.”

Live AI is very much an early access beta right now, and I still need to figure out what I’d actually use it for. The early-beta feel and unclear purpose can combine to make it feel ridiculous. Or unexpectedly profound. Either way, keeping it running takes a toll on battery life: 30 minutes of use, instead of the hours the Ray-Bans normally last.

Live Translation mode on a phone, next to Meta Ray-Ban glasses

Live Translation needs to download individual language packs in order to work.

Scott Stein/CNET

Translation: Useful, for a few languages

Live translation works the same way, starting on request. But language packs need to be downloaded for the specific languages you want to translate: Spanish to English, for instance. Only Spanish, French, Italian and English are supported right now, which is a letdown.

I chatted with CNET colleague Danni Santana out in noisy Astor Place, near our New York office. He spoke in Dominican Spanish, and I spoke in English. The translated responses played in my ears a few seconds later, and over the course of our chat, I felt like I was picking up enough to understand. It wasn’t perfect: the translation AI didn’t seem to get some phrases or idioms. The delay made it hard to know when a translation had ended or whether more was still coming in, and I had trouble judging the timing of my replies as Danni patiently waited across the table for me to talk.

Meta also shows a live transcript of the conversation in the Meta View phone app, which you can refer to while wearing the glasses, show to the person you’re talking with or use to clarify what was said.

The translation feature on the Ray-Bans seems a lot more instantly helpful than Live AI, but that’s also because Live AI doesn’t yet make it clear what I should be using it for. Maybe I could turn it on while cooking, constructing IKEA furniture or playing a board game? I don’t know. Help me figure this out, Meta. Also, without any heads-up display, Live AI leaves me guessing at what the glasses are actually looking at.

You could, of course, just use Google Translate on your phone instead. Meta is using its glasses for translation much the way you’d use a pair of earbuds. The glasses can also see and translate written text, but that isn’t part of the conversational live translation mode.

Wearing Meta Orion AR glasses and a wristband

Meta’s AR glasses moonshot, Orion, has its own neural input wristband and heads-up 3D displays. When will those features slowly make their way to Ray-Bans?

Celso Bulgatti/CNET

What next: Display or gestures? Or both?

Meta’s year-old Ray-Bans have now gotten multiple major AI features, each one changing the equation in surprising ways. The newest Live AI additions feel like they’re pushing the limits of the hardware, though, shaving down battery life. I wish I had better ways of knowing what the AI could see, or that I could point with my hand to indicate what I wanted to ask about.

Future glasses could move in this direction, with both heads-up displays and gesture recognition. Meta’s CTO, Andrew Bosworth, in a conversation I had with him at the end of the year, acknowledges these are the next steps — but the timeframe is unclear. Meta’s Orion glasses — an ambitious pair of future glasses with 3D displays and a wrist-worn gesture tracker I demoed earlier this year, which can recognize finger taps and pinches — are still years away from being a real product. But Meta’s wrist-worn neural band could emerge sooner, or maybe camera-equipped glasses could learn to recognize hand gestures on their own. And as for displays in smart glasses, Meta could explore a smaller heads-up display for showing information before it moves into larger, more immersive AR displays. Bosworth points to next-gen AR glasses in a recent blog post, but will some of this be possible in next year’s generation of Ray-Ban-like glasses?

“Gesture-based controls require downward-facing cameras and probably some illumination,” Bosworth says of future Meta glasses. “You could do it in the current Ray-Ban Metas — in the Live AI, we played with it — but you just have to do it [from] the field of view of the camera.” However, he acknowledges the possibility of adding an EMG band to the glasses sooner rather than later. “Now you’re adding a device that has to be charged, it’s extra cost, it’s extra weight, but it’s so convenient.” But Bosworth sees the EMG band as being useful only when there’s a display on the glasses — something the Ray-Bans don’t have… yet. It’s likely that when the Ray-Bans do get some sort of heads-up display, an input band could debut alongside it. I’ve seen attempts at ideas like this in other products.

And then there’s the battery life question: how will these more always-on glasses last for more than a few hours at a time? And how much would all of this add to the cost of a next-gen pair of glasses?

In the meantime, Meta’s AI might also carry into areas like fitness, as something that also bridges over to VR, where Meta has another version of Meta AI. “It would be very unusual if, a year from now, the AI you’re using to track your steps in the world and give you advice isn’t also aware that you’re also doing these workouts [in VR],” Bosworth says. 

As Live AI keeps evolving, better ways to gesture might become absolutely necessary. Bosworth sees pointing at things as a key way to train the AI to get better over time. “As the AIs get better, the need for these much simpler, intuitive gestures actually goes up significantly.”

Meta’s Ray-Bans don’t let me point at things right now, which makes Live AI a bit confusing to use at times. It may take newer hardware, with added gestures and displays, to make that next leap.



