Meta’s Ray-Ban Smart Glasses Get a High-Tech Upgrade with AI and Translation Features
Meta has unveiled a set of new features for its Ray-Ban smart glasses that pushes the wearable further upmarket. The standout additions are live AI capabilities and real-time translation, which let users fold advanced assistance into their daily routines.
With the live AI feature, the glasses process visual information from the built-in camera and respond in real time. Meta pitches them as a hands-free assistant for activities such as cooking, gardening, or navigating unfamiliar surroundings. Notably, users can ask follow-up questions without repeating a wake word, since the AI retains context and builds on previous queries. Meta hints that future updates will let the AI anticipate user needs and offer proactive suggestions.
Another major addition is live translation. The glasses can now translate conversations between English and Spanish, French, or Italian, delivering the translation either as audio through the built-in speakers or as text on a connected smartphone.
Meta has also integrated Shazam into the glasses, letting users identify songs simply by asking, “What is this song?” The feature draws on Shazam’s music recognition technology, which Apple owns and has embedded deeply in iOS.
These features are rolling out through Meta’s Early Access Program, open in limited numbers to customers in the U.S. and Canada. As Meta continues to push into the category, rumors suggest Apple may enter the smart glasses arena with a competing device.
Meta’s latest updates position the Ray-Ban smart glasses as a compelling option for tech enthusiasts seeking convenience, functionality, and a glimpse into the future of wearable technology.