Meta is rolling out three new features for its Ray-Ban smart glasses, enhancing their capabilities with live AI, live translations, and Shazam integration. These updates are designed to make the smart glasses not just stylish, but also increasingly functional, pushing the boundaries of wearable technology.
What’s New?
- Live AI (Early Access)
Meta’s live AI feature enables users to interact seamlessly with Meta’s AI assistant while the glasses observe the environment. Imagine strolling through a grocery store and asking the AI to suggest recipes based on what’s in your basket. The assistant responds naturally, delivering real-time insights.
- Battery Life: The AI assistant can run for approximately 30 minutes per session on a full charge.
- Availability: Exclusive to members of Meta’s Early Access Program for now.
- Live Translation (Early Access)
Breaking language barriers, the live translation feature supports real-time speech translation between English and Spanish, French, or Italian. You can either:
- Hear translations through the glasses’ speakers.
- View transcripts on your connected phone.
- Pre-Download Required: Users must download the language pairs in advance and set both their own language and their conversation partner’s.
- Shazam Support (Available Now)
Identifying songs on the go just got easier. Simply ask Meta AI to recognize a song you’re hearing, and it will provide the details. This feature is already available to all Ray-Ban smart glasses users in the US and Canada. You can see Meta CEO Mark Zuckerberg demonstrate it in this Instagram reel.
How to Get These Features
- Check Your Software: Ensure your glasses are running software v11 and your Meta View app is updated to v196.
- Early Access Program: Live AI and live translation require Early Access. Interested users can apply here.
- For Shazam: No extra steps are needed—this feature is live for all users in supported regions.
Why It Matters
With these updates, Meta continues to integrate AI into everyday life, blending utility and convenience into its Ray-Ban smart glasses. Features like live translation and Shazam provide immediate value, while live AI hints at a future where your eyewear becomes a true smart assistant. For those in Meta’s Early Access Program, this is an exciting glimpse into what wearable AI might look like tomorrow.
Whether you’re exploring recipes, breaking language barriers, or identifying a song, these glasses are stepping closer to becoming an indispensable tech companion.