Building the metaverse has been Mark Zuckerberg’s dream, reflected in his renaming of Facebook to Meta. However, this dream has remained out of reach, with the company recently announcing layoffs in its Reality Labs division. The move confirms Meta’s transition from virtual reality (VR) to artificial intelligence, but the organization may have finally found the sweet spot between these two disciplines: AI glasses.
Meta Glasses, Explained
Meta’s lineup of smart glasses supports hands-free interactions with smartphones, tablets and other devices through features that border on augmented reality. More recently, Meta has been integrating AI-powered features into its glasses, including its voice assistant, translated captions and a wristband that lets users interact with their phone using hand gestures.
Meta glasses are AI-powered smart glasses that incorporate aspects of augmented reality (AR), allowing users to interact with their devices via a screen displayed at eye level. New AI features continue to enhance Meta’s lineup, while also raising concerns around personal privacy and data security. Here’s everything you need to know about Meta’s AI glasses, including how they work, what they can do and what the future holds.
What Are Meta Glasses?
Meta glasses collectively refer to Meta’s smart glasses that facilitate hands-free interactions with smartphones, smartwatches and other devices through AI features. The following product lines make up this category:
- Ray-Ban Meta: Meta’s original smart glasses, which are styled to fit a variety of occasions and have improved in areas like battery life and video quality.
- Oakley Meta: Meta’s smart glasses intended for athletes and those with active lifestyles who want to use AI to record and analyze their performance.
- Meta Ray-Ban Display: Meta’s first pair of smart glasses that offers a heads-up display and allows users to take actions using a wristband.
Together, these glasses enable users to perform actions on smart devices across various settings without ever physically interacting with them, promoting a user experience increasingly rooted in mixed reality — the practice of immersing users in an experience that blends the real world with elements of AR and VR.
How Do Meta Glasses Work?
Like other wearable technologies, Meta’s glasses use Bluetooth to connect to a device, most commonly smartphones. These phones provide an internet connection via Wi-Fi or cellular data, allowing Meta glasses to function. The glasses also need to be online to receive software updates, including the v21 software update that makes it easier to hear someone in a loud setting and quickly summon personalized Spotify playlists.
In addition, users can interact with a heads-up display visible directly in the lens through wrist and hand movements, particularly in the Meta Ray-Ban Display model. An electromyographic (EMG) wristband uses sensors to detect electrical signals in the wrist, so wearers can write out messages, scroll down a page or zoom in on a photo through simple hand gestures — all without taking out their phone.
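The wristband’s general approach — sampling electrical activity in the wrist, slicing the signal into short windows and matching each window to a known gesture — can be sketched in code. This is a hypothetical illustration of the idea, not Meta’s actual implementation; the feature values, gesture names and nearest-profile matching are all stand-ins for what would be learned neural models in a real device.

```python
import math

def extract_features(window):
    """Reduce a window of raw EMG samples to (mean absolute value, RMS)."""
    mav = sum(abs(s) for s in window) / len(window)
    rms = math.sqrt(sum(s * s for s in window) / len(window))
    return (mav, rms)

# Toy gesture profiles: feature centroids that a real system would learn
# during a per-user calibration step. Values here are illustrative only.
GESTURE_PROFILES = {
    "pinch": (0.8, 1.0),
    "swipe": (0.3, 0.4),
    "rest": (0.05, 0.07),
}

def classify_gesture(window):
    """Return the gesture whose profile is nearest to the window's features."""
    mav, rms = extract_features(window)
    return min(
        GESTURE_PROFILES,
        key=lambda g: (GESTURE_PROFILES[g][0] - mav) ** 2
                      + (GESTURE_PROFILES[g][1] - rms) ** 2,
    )

# A burst of strong activity should match the high-energy "pinch" profile.
print(classify_gesture([0.9, -0.7, 0.8, -1.1, 0.95]))  # pinch
```

The key design point this sketch captures is that the glasses never see raw muscle signals; the wristband reduces them to compact features and a discrete gesture label before any action is triggered.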
What Features Do Meta Glasses Include?
While the heads-up display and EMG wristband are limited to Meta’s latest Ray-Ban model for now, there is a range of other features available on other models, including:
- A 12-megapixel camera that takes pictures and videos.
- Open-ear speakers that support audio for calls, podcasts and music.
- The Meta AI voice assistant, which can be activated with voice commands.
- Integration with the Meta AI app to better manage features and devices.
- Bluetooth connectivity to transfer media and files to another device.
- Compatibility with prescription lenses.
These features enable Meta glasses to fulfill a variety of use cases, regardless of the specific model users choose.
What Can You Do With Meta Smart Glasses?
From capturing celebratory moments to assessing athletic performance, Meta glasses are designed to support users with diverse lifestyles and backgrounds. For example, Meta’s AI-powered smart glasses can:
- Sync with Garmin devices to monitor workouts, deliver alerts as needed and provide post-workout assessments.
- Generate captions in a heads-up display to aid those who have hearing impairments or are in a loud environment.
- Translate generated captions to help users navigate language barriers.
- Take videos or photos from a first-person perspective at a birthday party and store them for later.
- Use voice commands to have Meta’s assistant write a message to a coworker or add events to a calendar, boosting productivity.
With distinct product lines set to receive future software updates, there’s no single right way to use Meta glasses. Users can experiment with different AI features to determine how these glasses can best meet their needs.
What Technology Powers Meta Glasses?
Behind the capabilities of Meta’s glasses are large language models — AI models trained on massive volumes of data that can understand and generate human language. For Meta’s Ray-Ban Display model, researchers have been building on the progress made by a model called AnyMal to develop even more advanced multimodal models, or models that can process various types of data, including text, images, audio, video and even sensor readings.
To accomplish multimodality, these models must combine natural language processing (NLP) and natural language understanding (NLU) with computer vision. NLP and NLU enable models to grasp human language, while computer vision focuses on helping AI analyze visual data and glean meaningful insights. Working in tandem, these techniques can handle the constant stream of real-world data collected by AI glasses in real time.
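To make the multimodal data flow concrete, here is a minimal, hypothetical sketch of the fusion pattern described above: a vision encoder and a text encoder each map their input into a fixed-size embedding, and the embeddings are combined into a single vector for a downstream language model. The encoders below are deliberately trivial stand-ins; real systems use large learned neural networks, and nothing here reflects Meta’s actual architecture.

```python
def encode_image(pixels, dim=4):
    """Toy vision encoder: average brightness repeated into a fixed-size vector."""
    avg = sum(pixels) / len(pixels)
    return [avg] * dim

def encode_text(text, dim=4):
    """Toy text encoder: normalized character codes averaged into a vector."""
    codes = [ord(c) / 255.0 for c in text] or [0.0]
    avg = sum(codes) / len(codes)
    return [avg] * dim

def fuse(image_embedding, text_embedding):
    """Concatenate modality embeddings into one input for the language model."""
    return image_embedding + text_embedding

# A camera frame and a spoken query end up in one shared representation.
fused = fuse(encode_image([0.2, 0.6, 0.4]), encode_text("what am I looking at?"))
print(len(fused))  # 8: both modalities packed into a single vector
```

The point of the pattern is that once every modality is an embedding, the language model can reason over camera frames, audio and text with the same machinery — which is what lets AI glasses answer questions about what the wearer is currently seeing.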
How Are Meta Glasses Different From Traditional Smart Glasses?
Smart glasses and AI glasses are closely related concepts that Meta has explored, but they remain distinct. Smart glasses are defined by their internet connection, which they access by integrating with other devices. For instance, a user can connect their smart glasses to their phone to take pictures, read messages and listen to music. In this sense, the glasses act as an extension of the phone, according to Meta’s website.
AI glasses take this idea a step further: Instead of simply integrating with a device, they proactively complete actions on behalf of the user. Smart glasses display information, but AI glasses also interpret this information and deliver useful insights. As a result, smart glasses can be viewed as a wider category of devices, while AI glasses are smart glasses equipped with artificial intelligence features to assist users in real time.
What Are the Limitations of Meta Glasses?
Despite being touted for their AI features, Meta glasses have a few downsides to consider. A review of Meta’s Ray-Ban glasses notes that they feel heavier than a traditional pair, and adding prescription lenses further increases their weight. The glasses may also need recharging more often than their advertised four-hour battery life suggests.
More problematic, though, are the questions around user privacy. At least some of the visual data users record is compiled by Meta to train its AI models, and users have no clear way of knowing exactly when their data is collected or where it goes. And violations of personal privacy extend beyond individual users, posing potentially greater consequences for those being recorded.
What Are the Privacy and Security Concerns With Meta Glasses?
The privacy risks of Meta’s glasses drew national attention when a California judge chastised Zuckerberg and his team for wearing AI glasses in the courtroom, threatening to hold them in contempt if they didn’t remove the glasses and delete any recorded footage. Two Swedish newspapers then intensified these fears when they revealed that Meta contractors in Kenya were pressured to review videos collected from Meta glasses that depicted people in intimate or compromising situations.
It’s reached the point where multiple plaintiffs are suing Meta, specifically targeting ad copy describing the glasses as “designed for privacy, controlled by you.” The plaintiffs accuse Meta of false advertising and violating consumer protection laws, presenting an early test for the company as it makes its AI glasses a core part of its strategy moving forward.
How Are Meta Glasses Evolving?
Meta seeks to stay the course it charted with the release of its Ray-Ban Display model. In a 2026 blog post, the company announced plans to equip its heads-up display with a teleprompter feature, expand its pedestrian navigation offering to more cities and make its EMG wristband compatible with Garmin in-vehicle infotainment systems, among other updates.
A more controversial update could see facial recognition introduced into Meta’s glasses, according to leaked documents. Known as “Name Tag,” the feature would let the glasses identify people in public spaces and retrieve additional online information about them via Meta’s AI assistant. This idea is still in the early stages, but it already casts doubt on the future of Meta’s AI glasses as privacy becomes a top priority in the age of AI.
Frequently Asked Questions
Can Meta glasses take photos and videos?
Yes, all Meta smart glasses models feature a built-in 12-megapixel camera that takes photos and videos. In the case of Meta’s Ray-Ban Display model, users can activate the camera using hand gestures, thanks to the company’s EMG wristband.
Do Meta glasses have speakers or audio?
Yes, Meta glasses are equipped with open-ear speakers and Bluetooth connectivity to provide high-quality audio for taking calls, recording videos, streaming music and more. A software update also amplifies the voice of someone a user is listening to, making it easier to focus on a conversation in a noisy environment.
Are Meta glasses the same as Ray-Ban Meta smart glasses?
The term “Meta glasses” refers broadly to Meta’s entire lineup of smart glasses, including its Ray-Ban, Oakley and Ray-Ban Display product lines. Ray-Ban Meta glasses are one specific product line within that lineup, and in fact the company’s original smart glasses.
Do Meta glasses require a phone to work?
Yes, Meta glasses require a smartphone since users must download the Meta AI mobile app to set up their glasses. An internet connection is also needed to access the full suite of AI features on Meta’s smart glasses.
