Business

Meta Enhances Ray-Ban Smart Glasses with AI Video and Live Translation

Meta Introduces Advanced Features for Ray-Ban Smart Glasses

Meta Platforms has announced an update to its Ray-Ban Meta smart glasses, introducing AI video capabilities and real-time language translation. The features, first unveiled at Meta's annual Connect conference in September, are now available to members of the Early Access Program.

The v11 software update, which began rolling out on Monday, adds video processing to the glasses' AI chatbot assistant, allowing the device to analyze what the user sees and respond to queries in real time.

Additionally, the smart glasses can now translate speech in real time between English and Spanish, French, or Italian. Meta explained in a blog post that users will hear translations through the glasses' open-ear speakers or view them as transcripts on their phones.

Meta has also integrated Shazam, the song-identification app, into the smart glasses; the feature is available in the U.S. and Canada. Meta previously announced several other AI features for the Ray-Ban smart glasses, including tools for setting reminders and for scanning QR codes and phone numbers by voice command.