Meta is rolling out two long-awaited features to its popular Ray-Ban Meta Smart Glasses: real-time visual AI and live translation. The rollout is limited to testing for now, but the plan is that, eventually, anyone who owns Ray-Ban Meta Smart Glasses will get a live assistant that can see, hear, and translate Spanish, French, and Italian.
It’s part of the v11 update that covers the upgrades Meta described at its Connect 2024 event, which also includes Shazam integration for music recognition. Everything runs through the camera, speakers, and microphones built into the Ray-Ban Meta glasses, so you don’t need to hold up your phone.
In my review of the Ray-Ban Meta Smart Glasses, I was pleasantly surprised by the excellent photo quality, and the ease of taking photos and recording videos hands-free remains a highlight.
Initially, the AI assistant had to be summoned with a wake word, which slowed down every interaction. To ask about something I was seeing through the glasses, I had to preface questions with “look and,” which always felt awkward.
The new live AI adds an assistant that’s always available, streamlining interactions so answers arrive faster. Meta’s live AI feature relies on video, giving the assistant a continuous visual feed from the glasses’ camera.
This needn’t be a privacy concern: the live AI and live translation features must be enabled in settings and explicitly activated before a continuous listening and watching session begins, and they can be turned off in settings or with a voice command.
The latest features are coming first to Early Access users for testing. Within weeks or months, depending on feedback, Meta will begin rolling out the v11 update to more Ray-Ban Meta Smart Glasses owners.
Ray-Ban Meta Smart Glasses offer great value at $299 and keep getting better with each over-the-air update. Check out our guide to the best smart glasses if you want to see how the competition compares. For now, the Ray-Ban Meta Smart Glasses remain the first product of their kind to attract real mainstream interest.