Meta reveals new smart glasses with advanced AI capabilities
- Meta introduced the Meta Ray-Ban Display smart glasses with advanced AI capabilities during the Meta Connect conference.
- The glasses pair a see-through display in the right lens with a gesture-detecting wristband used to control them.
- Mark Zuckerberg envisions these glasses as a significant evolution in personal computing technology, potentially surpassing smartphones.
On September 18, 2025, in Menlo Park, California, Meta unveiled its new AI-powered smart glasses, the Meta Ray-Ban Display, at its Meta Connect developer conference. The glasses feature a see-through display on the right lens that lets users read messages, take video calls, and follow map directions. They also carry a 12-megapixel camera for capturing photos and video and are controlled via a wristband that detects hand gestures.

Meta executives, including CEO Mark Zuckerberg, presented the glasses as a pivotal step in bringing artificial intelligence into personal computing devices, arguing that they will enhance human-computer interaction and could redefine how people communicate and access information in their daily lives. Zuckerberg described the technology as a shift toward personal superintelligence, which he defined as a tool that provides real-time assistance by integrating seamlessly into users' lives.

Analysts expect the glasses to serve mainly as an experimental device for now, but see the potential for them to become mainstream. Enthusiasm has been bolstered by the glasses' popularity among social media creators and early adopters, suggesting that Meta could move its products beyond niche markets. The company is also expected to keep expanding AI capabilities across its platforms, continuing its commitment to AI development and the transformation of its user experience.