Ray-Ban Meta Smart Glasses are more of an AI device than ever with new updates

Mark Zuckerberg mostly touted new AI features — as opposed to new styles.
Here's what's coming soon to Ray-Ban Meta Smart Glasses: A lot of AI features. Credit: Meta

If you do a quick online search for Ray-Ban Meta Smart Glasses right now, you'll find that the wearable is mostly marketed for its quick photo capturing and livestreaming capabilities.

However, at the Meta Connect 2024 event on Wednesday, Meta founder and CEO Mark Zuckerberg didn't have too much to say about photos and videos during the Ray-Ban Meta Smart Glasses section of the presentation.

In fact, Zuckerberg introduced the Ray-Ban Meta Smart Glasses primarily as an AI device.



"Glasses are a new AI device category," Zuckerberg said, noting that his company has just caught up with the consumer demand for Meta smart glasses after sales took off faster than he said he expected.

Aside from a new limited edition Ray-Ban Meta Smart Glasses device with transparent frames, there weren't any new smart glasses hardware announcements from Meta.

Clear, transparent Ray-Ban Meta Smart Glasses
Credit: Meta

However, Zuckerberg did share several new features coming to the Meta smart glasses in updates rolling out over the next couple of months, all of them AI related.

Meta AI is already integrated into Ray-Ban Meta Smart Glasses in much the same way other companies' voice assistants are integrated into their devices. But, according to Zuckerberg, new updates will make these interactions "more natural and conversational."

"Hey Meta" instead of "Look and tell me."

For example, users currently have to prompt their Ray-Ban Meta Smart Glasses with the phrase "look and tell me" when they have a question. Zuckerberg's demo showcased how users will no longer have to do that. Users will just need to activate the feature with the "Hey Meta" prompt and then ask their question. Meta AI will automatically understand that the question is about whatever the user is looking at through the glasses.

Furthermore, after the initial "Hey Meta," users won't need to repeat the wake phrase before each follow-up prompt; Meta AI will simply continue the conversation.

Live Translation on Ray-Ban Meta Smart Glasses 

Live translation is similar to what's been seen in other smart glasses: a user can hear real-time audio translations of another language through the glasses while conversing with another person. The demo at Meta Connect worked nearly perfectly, translating from Spanish to English and English to Spanish.

Meta multimodal video AI
Credit: Meta

Multimodal AI prompts

Zuckerberg explained the multimodal video AI feature through a demo showing a user trying on outfits. Through this feature, Meta AI was able to offer fashion advice and suggestions based on the user's outfit and their specific question about it.

Ray-Ban Meta Smart Glasses will also soon be able to automatically remember things for users. The example showcased at Meta Connect involved Meta AI recalling the parking space number where the user had parked their car. The user did not have to prompt Meta AI to do that; it appeared to remember the number simply because the user had viewed it through the glasses.

Meta AI remembers
Credit: Meta

Building on that feature, Ray-Ban Meta Smart Glasses users will soon be able to look at a flier or advertisement and ask the smart glasses to call the phone number on it or scan the relevant QR code. The glasses can also remember those items automatically if a user wants to return to something they previously viewed through the glasses at a later time.

Other updates coming to Ray-Ban Meta Smart Glasses include the ability to voice control Spotify and Amazon Music through the device as well as new integrations with apps like Audible and iHeartRadio.

Partnership with Be My Eyes for blind and low vision users

Meta + Be My Eyes
Credit: Meta

Meta also announced a partnership with Be My Eyes, a mobile app that connects blind and low-vision people with volunteers via live video to talk through what's in front of them. The app will work directly through Ray-Ban Meta Smart Glasses, and volunteers will be able to see through the user's glasses in order to provide assistance.
