Meta, led by Mark Zuckerberg, has taken a major step in merging artificial intelligence with wearable technology by unveiling the Meta Ray-Ban Display, a new generation of smart glasses designed to transform how users interact with the digital world. Featuring instant subtitles, real-time translation, visual navigation, and gesture control via a neural wristband, these glasses aim to be more than just an accessory: they seek to redefine how we use our phones.

Available in the United States starting September 30 at a price of $799, the Ray-Ban Display integrates a monocular screen in the right lens to show notifications, messages, photos, and video calls without the need to pull out a smartphone. The glasses also support interaction with Meta AI through both on-screen visuals and voice commands. In addition, the design includes visual privacy (only the wearer can see the screen), roughly six hours of battery life per charge, and a neural wristband capable of detecting subtle wrist movements for silent commands.

How disruptive is this technology for a market currently dominated by smartphones?

It already shows signs of being a real alternative: Meta's new glasses could reduce dependence on phones, becoming a daily-use device for messaging, multimedia, and navigation. However, they still face hurdles, including comfort, price, and broader public acceptance of this new way of interacting with technology.