
Meta Updates Smart Glasses with Real-Time AI-Powered Video Capabilities

Enhanced AI Capabilities for Wearers

Meta’s Ray-Ban Meta smart glasses are getting several new AI-powered upgrades, including the ability to hold an ongoing conversation with Meta AI and to translate speech between languages. The updates aim to make the user experience more seamless and intuitive.

Live AI: Ongoing Conversations Made Easy

The latest firmware update, v11, introduces ‘live AI’ for Ray-Ban Meta wearers in Meta’s early access program in the U.S. and Canada. The feature lets users converse continuously with Meta’s AI assistant, Meta AI, without repeating the ‘Hey, Meta’ wake word, and wearers can interrupt Meta AI to ask follow-up questions or change topics at any time.

How Live AI Works

Live AI lets wearers reference things they discussed earlier in the conversation, making it easier to pick up where they left off. That is especially handy for multi-step tasks: a wearer can ask Meta AI a question, act on the answer, and then ask a follow-up that refers back to the earlier exchange without restating the context.
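Meta hasn’t said how live AI keeps track of context under the hood, but the behavior it describes matches a standard multi-turn assistant loop in which every exchange is appended to a running history that conditions the next reply. The sketch below is purely illustrative; query_assistant is a hypothetical stand-in, not a Meta API.

    # Illustrative only: a multi-turn loop where each exchange is appended to
    # a running history, so follow-up questions can refer back to earlier turns.
    # query_assistant() is a hypothetical stand-in for the real model call.
    from typing import Dict, List

    def query_assistant(history: List[Dict[str, str]]) -> str:
        # A real assistant would generate a reply conditioned on the full
        # history; this stub just shows that the context is available.
        last_user = history[-1]["content"]
        return f"(reply to {last_user!r}, with {len(history)} turns in context)"

    def live_session(turns: List[str]) -> List[Dict[str, str]]:
        history: List[Dict[str, str]] = []
        for user_text in turns:
            history.append({"role": "user", "content": user_text})
            reply = query_assistant(history)  # context carries across turns
            history.append({"role": "assistant", "content": reply})
            print("Assistant:", reply)
        return history

    if __name__ == "__main__":
        live_session([
            "What pasta dishes could I make with what's in my fridge?",
            "Which of those is the quickest?",  # follow-up relies on earlier context
        ])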

Real-Time Video Capabilities

The live AI update also works with real-time video: wearers can ask Meta AI questions about what the glasses’ camera is seeing in the moment, for instance, what’s around them in their neighborhood. This goes well beyond what earlier smart glasses could do and helps distinguish Meta from competitors like Google and OpenAI.
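Meta hasn’t detailed how the video pipeline works, but conceptually real-time visual Q&A comes down to pairing a fresh camera frame with the wearer’s question and handing both to a vision-language model. In the illustrative sketch below, capture_frame and ask_multimodal_model are hypothetical placeholders, not anything Meta exposes.

    # Illustrative only: pair the latest camera frame with the wearer's question
    # and pass both to a vision-language model. capture_frame() and
    # ask_multimodal_model() are hypothetical placeholders, not Meta APIs.

    def capture_frame() -> bytes:
        # Stand-in for grabbing the most recent JPEG frame from the camera.
        return b"\xff\xd8 fake-jpeg-bytes \xff\xd9"

    def ask_multimodal_model(image: bytes, question: str) -> str:
        # Stand-in for a model that answers questions about the image.
        return f"(answer about a {len(image)}-byte frame for {question!r})"

    def ask_about_surroundings(question: str) -> str:
        # "Real-time" behavior comes from always using a fresh frame at ask time.
        frame = capture_frame()
        return ask_multimodal_model(frame, question)

    if __name__ == "__main__":
        print(ask_about_surroundings("What restaurants are on this street?"))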

A Breakthrough in Real-Time AI Video

Meta’s focus on real-time AI video was on display at its Connect developer conference this fall, where the capability was positioned as the company’s answer to OpenAI’s Advanced Voice Mode with Vision and Google’s Project Astra, both of which also feature real-time AI. With this update, Meta becomes one of the first major tech companies to bring real-time AI video to market on smart glasses.

Live Translation: Bridging Language Barriers

The firmware update v11 introduces live translation, enabling Ray-Ban Meta wearers to translate real-time speech between English and Spanish, French, or Italian. When a wearer is talking to someone speaking one of those languages, they’ll hear what the speaker says in English through the glasses’ open-ear speakers and receive a transcript on their paired phone.

How Live Translation Works

Live translation uses AI to convert spoken language on the fly: the speaker’s words are picked up, translated into English, and delivered both as audio through the glasses and as a transcript on the paired phone. The feature is particularly useful for travelers or anyone who regularly communicates across language barriers.
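As a rough mental model (not Meta’s actual implementation), a speech-to-speech translation feature like this can be thought of as transcription, translation, and delivery stages chained together. Every function in the sketch below is a hypothetical placeholder.

    # Illustrative only: a transcribe -> translate -> deliver pipeline. Every
    # function here is a hypothetical placeholder, not what Meta ships.

    def transcribe(audio: bytes, language: str) -> str:
        # Stand-in for speech recognition on the incoming audio.
        return "¿Dónde está la estación de tren?"

    def translate(text: str, source: str, target: str = "en") -> str:
        # Stand-in for machine translation into English.
        return "Where is the train station?"

    def play_on_glasses(text: str) -> None:
        # Stand-in for text-to-speech through the open-ear speakers.
        print("[glasses speakers]", text)

    def send_transcript_to_phone(text: str) -> None:
        # Stand-in for pushing the text transcript to the companion app.
        print("[companion app]", text)

    def live_translate(audio_chunk: bytes, source_language: str) -> None:
        original = transcribe(audio_chunk, source_language)
        english = translate(original, source_language)
        play_on_glasses(english)           # wearer hears English audio
        send_transcript_to_phone(english)  # paired phone shows the transcript

    if __name__ == "__main__":
        live_translate(b"raw-audio-bytes", "es")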

Shazam Support: Music Identification Made Easy

Firmware v11 also adds Shazam support: wearers can say, ‘Hey, Meta, Shazam this song’ and the glasses will try to identify the track that’s playing. It’s a small but convenient addition to the Ray-Ban Meta experience.

Challenges Ahead

Meta cautions that the new features won’t always get things right: live AI and live translation may not perform as expected in every situation. The company says it is continuing to learn what works best and to improve the experience over time.

Future Developments

Further out, Meta says live AI will eventually offer ‘useful suggestions’ before a wearer even asks. The company hasn’t said what those suggestions might look like, but it’s a promising prospect for people who want to get the most out of their Ray-Ban Meta glasses.

A Competitive Advantage

While Google and OpenAI are still developing similar real-time AI features, Meta is already delivering them to users through its early access program. That head start puts the company in a strong position in the smart glasses market.

Conclusion

Firmware update v11 brings meaningful improvements to the Ray-Ban Meta experience. The addition of live AI with real-time video, live translation, and Shazam support makes it easier for wearers to interact with their surroundings and communicate with others, and Meta says more capabilities are on the way.

Related Topics

  • AI
  • Generative AI
  • Machine Learning
  • Robotics
  • Smart Glasses
  • Augmented Reality

Sources

  • Meta’s official blog post announcing the firmware update v11
  • Meta’s Connect dev conference presentation on real-time AI video capabilities
  • OpenAI’s Advanced Voice Mode with Vision documentation
  • Google’s Project Astra documentation