Meta’s Ray-Bans Can Now Do Real-Time Live AI And Translation
Meta’s camera-equipped Ray-Bans already have plenty of AI features onboard, but they’re getting two more big ones. Always-on continuous AI assistance is arriving on the glasses starting today for owners enrolled in early access to Meta’s features, along with onboard translation. Both features were demoed by Mark Zuckerberg at Meta’s Connect developer conference earlier this year.

Both features rely on continuous audio and camera recording rather than specific individual prompts, allowing the Ray-Bans to be used for an extended period with AI turned on. Saying “Meta, start Live AI” begins the always-on session. The glasses’ LED light stays on while live AI is active, and Meta keeps a recording of the conversation that can be referred back to throughout the session. Translation works automatically as you talk, with the translated audio coming through the glasses after a slight delay.

The live AI assistance is similar to what Google demonstrated this month on its own prototype glasses, powered by Gemini and Android XR, which are expected to arrive next year.

The always-on AI takes a toll on battery life: expect about 30 minutes of use before needing a recharge, according to Meta. Still, this kind of always-on, camera-assisted AI is exactly what more tech companies will be exploring over the next year. I’ll follow up with impressions once I get a chance to test it for myself.
