Meta Glasses: What They Are and How They Work

Meta’s glasses combine elements of augmented reality and AI to support hands-free interactions with other smart devices. These specs could spearhead the next wave of AI-first devices — as well as a host of privacy risks.

Written by Matthew Urwin
Image: JLStock / Shutterstock
UPDATED BY
Abel Rodriguez | Apr 01, 2026
REVIEWED BY
Sara B.T. Thiel | Apr 01, 2026
Summary: Meta glasses are smart glasses that facilitate hands-free interactions with other smart devices using elements of augmented reality. Meta has increasingly updated its glasses with AI features to provide a more immersive experience, although this shift has sparked concerns around personal privacy.

Building the metaverse has been Mark Zuckerberg’s dream, reflected in his renaming of Facebook to Meta. However, this dream has remained out of reach, with the company recently announcing layoffs in its Reality Labs division. The move confirms Meta’s transition from virtual reality (VR) to artificial intelligence, but the organization may have finally found the sweet spot between these two disciplines: AI glasses. 

Meta Glasses, Explained

Meta’s lineup of smart glasses supports hands-free interactions with smartphones, tablets and other devices through features that border on augmented reality. More recently, Meta has been integrating AI-powered features into its glasses, including its voice assistant, translated captions and a wristband that lets users interact with their phone using hand gestures.

Meta glasses are AI-powered smart glasses that incorporate aspects of augmented reality (AR), allowing users to interact with their devices via a screen displayed at eye level. New AI features continue to enhance Meta’s lineup, while also raising concerns around personal privacy and data security. Here’s everything you need to know about Meta’s AI glasses, including how they work, what they can do and what the future holds in store.


 

What Are Meta Glasses?

Meta glasses collectively refer to Meta’s smart glasses that facilitate hands-free interactions with smartphones, smartwatches and other devices through AI features. The following product lines make up this category:  

  • Ray-Ban Meta: Meta’s original smart glasses, which are styled to fit a variety of occasions and have improved in areas like battery life and video quality.  
  • Oakley Meta: Meta’s smart glasses intended for athletes and those with active lifestyles who want to use AI to record and analyze their performance.     
  • Meta Ray-Ban Display: Meta’s first pair of smart glasses that offers a heads-up display and allows users to take actions using a wristband. 

Together, these glasses enable users to perform actions on smart devices across various settings without ever physically interacting with them, promoting a user experience increasingly rooted in mixed reality — the practice of immersing users in an experience that blends the real world with elements of AR and VR.

 

How Do Meta Glasses Work?

Like other wearable technologies, Meta’s glasses use Bluetooth to connect to a device, most commonly a smartphone. The phone provides an internet connection via Wi-Fi or cellular data, allowing Meta glasses to function. The glasses also need to be online to receive software updates, including the v21 software update that makes it easier to hear someone in a loud setting and quickly summon personalized Spotify playlists. 

In addition, users can interact with a heads-up display visible directly in the lens through wrist and hand movements, particularly in the Meta Ray-Ban Display model. An electromyographic (EMG) wristband uses sensors to detect electrical signals in the wrist, so wearers can write out messages, scroll down a page or zoom in on a photo through simple hand gestures — all without taking out their phone. 
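To make the EMG idea concrete, here is a minimal, hypothetical sketch of how wrist signals could be mapped to gestures. Real EMG pipelines use trained models over multi-channel sensor data; the function names, thresholds and gesture labels below are illustrative assumptions, not Meta’s actual implementation.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def classify_gesture(window, pinch_threshold=0.2, swipe_threshold=0.6):
    """Map muscle-activity intensity to a coarse gesture label.

    Hypothetical rule-based stand-in for a trained classifier:
    low energy means the hand is at rest, moderate energy a pinch,
    high energy a swipe.
    """
    energy = rms(window)
    if energy < pinch_threshold:
        return "rest"   # no intentional movement detected
    if energy < swipe_threshold:
        return "pinch"  # light activation, e.g. select an item
    return "swipe"      # strong activation, e.g. scroll a page

# Simulated 8-sample windows of normalized EMG readings
print(classify_gesture([0.01, -0.02, 0.03, 0.0, -0.01, 0.02, 0.01, -0.03]))  # rest
print(classify_gesture([0.3, -0.4, 0.35, -0.3, 0.4, -0.35, 0.3, -0.4]))      # pinch
```

In practice the wristband would stream many such windows per second, and a learned model rather than fixed thresholds would distinguish handwriting, scrolling and zooming.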

 

What Features Do Meta Glasses Include?

While the heads-up display and EMG wristband are limited to Meta’s latest Ray-Ban model for now, a range of features is available across the other models, including: 

  • A 12-megapixel camera that takes pictures and videos. 
  • Open-ear speakers that support audio for calls, podcasts and music. 
  • The Meta AI voice assistant, which can be activated with voice commands. 
  • Integration with the Meta AI app to better manage features and devices. 
  • Bluetooth connectivity to transfer media and files to another device. 
  • Compatibility with prescription lenses. 

These features enable Meta glasses to fulfill a variety of use cases, regardless of the specific model users choose.  

 

What Can You Do With Meta Smart Glasses?

From capturing celebratory moments to assessing athletic performance, Meta glasses are designed to support users with diverse lifestyles and backgrounds. For example, Meta’s AI-powered smart glasses can: 

  • Sync with Garmin devices to monitor workouts, deliver alerts as needed and provide post-workout assessments. 
  • Generate captions in a heads-up display to aid those who have hearing impairments or are in a loud environment. 
  • Translate generated captions to help users navigate language barriers. 
  • Take videos or photos from a first-person perspective at a birthday party and store them for later. 
  • Use voice commands to have Meta’s assistant write a message to a coworker or add events to a calendar, boosting productivity. 

With distinct product lines set to receive future software updates, there is no single prescribed way to use Meta glasses. Users can experiment with different AI features to determine how these glasses can best meet their needs.


 

What Technology Powers Meta Glasses?

Behind the capabilities of Meta’s glasses are large language models (LLMs), AI models trained on massive volumes of data that can understand and generate human language. For Meta’s Ray-Ban Display model, researchers have been building on the progress made by a model called AnyMAL to develop even more advanced multimodal models, or models that can process various data types, including written, visual, audio, video and even sensor data. 

To accomplish multimodality, these models must combine natural language processing (NLP) and natural language understanding (NLU) with computer vision. NLP and NLU enable models to grasp human language, while computer vision focuses on helping AI analyze visual data and glean meaningful insights. Working in tandem, these techniques can handle the constant stream of real-world data collected by AI glasses in real time.   
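One common way to combine modalities is to encode each input into a shared vector space and then fuse the results. The sketch below illustrates that late-fusion pattern with toy encoders; the functions and feature choices are stand-ins for illustration, not the models Meta actually uses.

```python
def encode_text(text):
    """Toy text encoder: simple character statistics as a 3-dim vector."""
    n = max(len(text), 1)
    return [len(text) / 100, text.count(" ") / n, sum(map(ord, text)) / (n * 128)]

def encode_image(pixels):
    """Toy vision encoder: brightness statistics as a 3-dim vector."""
    n = max(len(pixels), 1)
    mean = sum(pixels) / n
    return [mean, max(pixels) - min(pixels), sum(1 for p in pixels if p > mean) / n]

def fuse(text_vec, image_vec):
    """Late fusion: average the per-modality embeddings element-wise."""
    return [(t + i) / 2 for t, i in zip(text_vec, image_vec)]

# A spoken query plus a camera frame, reduced to one joint representation
caption = encode_text("what am I looking at?")
frame = encode_image([0.1, 0.8, 0.4, 0.9, 0.2])  # normalized pixel intensities
joint = fuse(caption, frame)
print(len(joint))  # a single 3-dim fused vector
```

Production multimodal models replace these toy encoders with large neural networks and learn the fusion step, but the overall shape of the pipeline is the same: per-modality encoding, then a joint representation that downstream components can reason over.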

 

How Are Meta Glasses Different From Traditional Smart Glasses?

Smart glasses and AI glasses are closely related concepts that Meta has explored, but they remain distinct. Smart glasses are defined by their internet connection, which they access by integrating with other devices. For instance, a user can connect their smart glasses to their phone to take pictures, read messages and listen to music. In this sense, the glasses act as an extension of the phone, according to Meta’s website.

AI glasses take this idea a step further: Instead of simply integrating with a device, they proactively complete actions on behalf of the user. Smart glasses display information, but AI glasses also interpret this information and deliver useful insights. As a result, smart glasses can be viewed as a wider category of devices, while AI glasses are smart glasses equipped with artificial intelligence features to assist users in real time.  

 

What Are the Limitations of Meta Glasses?

Despite being touted for their AI features, Meta glasses have a few downsides to consider. A review of Meta’s Ray-Ban glasses notes that they feel heavier than a traditional pair, and adding prescription lenses further increases their weight. The glasses may also fall short of their official four-hour battery life, requiring more frequent charging. 

More problematic, though, are the questions around user privacy. At least some of the visual data users record is compiled by Meta for training its AI models, and users are unaware of when exactly their data is collected or where it goes. And violations of personal privacy extend beyond individual users, posing potentially greater consequences for those being recorded.  

 

What Are the Privacy and Security Concerns With Meta Glasses?

The privacy risks of Meta’s glasses drew national attention when a California judge chastised Zuckerberg and his team for wearing AI glasses in the courtroom, threatening to hold them in contempt if they didn’t remove the glasses and delete any recorded footage. Two Swedish newspapers then fanned the flames when they revealed that Meta contractors in Kenya were pressured to review videos collected from Meta glasses that depicted people in intimate or compromising situations. 

It’s reached the point where multiple plaintiffs are suing Meta, specifically targeting ad copy describing the glasses as “designed for privacy, controlled by you.” The plaintiffs accuse Meta of false advertising and violating consumer protection laws, presenting an early test for the company as it makes its AI glasses a core part of its strategy moving forward.


 

How Are Meta Glasses Evolving?

Meta seeks to stay the course it charted with the release of its Ray-Ban Display model. In a 2026 blog post, the company announced plans to equip its heads-up display with a teleprompter feature, expand its pedestrian navigation offering to more cities and make its EMG wristband compatible with Garmin in-vehicle infotainment systems, among other updates. 

A more controversial update could see facial recognition introduced into Meta’s glasses, according to leaked documents. Known as “Name Tag,” the feature would let the glasses identify people in public spaces and retrieve additional online information about them via Meta’s AI assistant. This idea is still in the early stages, but it already casts doubt on the future of Meta’s AI glasses as privacy becomes a top priority in the age of AI.

 

Notable Meta Glasses Developments

What began as an internal experiment to map the physical world has evolved into a sophisticated ecosystem of neural interfaces, holographic displays and AI features. From the initial partnership with a heritage eyewear brand to the quest to replace smartphones, these milestones trace Meta’s journey to advance wearable computing.

Meta Releases Smart Glasses With Prescriptions (March 2026)

Meta released new models of its Meta glasses optimized for individuals needing eye prescriptions. The new models also feature overextension hinges, interchangeable nose pads and temple tips to adapt to unique face shapes. In addition to the new models, Meta introduced a new nutrition-tracking feature for its smart glasses and said it would roll out neural handwriting to all users in the coming weeks.

Release of Meta Smart Glasses With Display (September 2025)

Meta released a new version of its glasses featuring an in-lens display. The in-lens display enables individuals to view their text messages, follow turn-by-turn directions and read live subtitles during conversations. Meta also equipped these glasses with a neural band for simpler user interaction. 

Meta Announces Project Orion (September 2024)

With the initial success of its smart glasses, Meta doubled down on smart wearables, announcing Project Orion. Orion aims to develop a lightweight AR device that can theoretically replace smartphones. Meta designed new glasses that are thinner than traditional VR headsets but can overlay hologram displays in an individual’s field of view. Project Orion also features a neural wristband that tracks hand and muscle movement to interact with the displayed content. Meta currently limits the project to its employees.

Release of Meta Smart Glasses (September 2023)

Meta released its second-generation smart glasses boasting several new features. The updated Meta smart glasses retain the built-in camera, microphone and speakers but now integrate Meta’s AI models. With these AI capabilities, the smart glasses provide voice-activated assistance like real-time object identification and contextual information based on an individual's surroundings.

Meta Releases Ray-Ban Stories (September 2021)

Within a year of working with EssilorLuxottica, Meta released its first generation of smart glasses — Ray-Ban Stories. Ray-Ban Stories featured a five-megapixel camera, open-ear speakers and a microphone that captures 30-second video clips. Although they had no AR or AI capabilities, the Ray-Ban Stories served as a building block for future wearable devices.

Meta Partners With EssilorLuxottica (September 2020)

Following initial AR wearable work, Meta entered into a multiyear partnership with EssilorLuxottica. EssilorLuxottica is an eyewear company that develops lens technology and manages several eyewear brands like Ray-Ban and Oakley. Through the partnership, both companies collaborated to develop smart glasses.

Meta Launches Project Aria (September 2020)

With its commitment to building the Metaverse, Meta launched Project Aria. Project Aria was an internal research initiative to develop wearable augmented reality hardware. The project successfully built a wearable prototype that contained no displays but instead captured real-world egocentric data for computer vision and spatial mapping to build out its virtual and augmented reality capabilities.

Meta Forms Facebook Reality Labs (August 2020)

Before its rebrand from Facebook to Meta — a move that would later reflect the company’s new focus on novel technologies — the company launched Facebook Reality Labs. The division unified AR and VR research and development, aiming to build a new computing platform for individuals to seamlessly interact with. The move also solidified the company’s commitment to developing the metaverse and the AR and VR advancements required for that undertaking.

Frequently Asked Questions

Do Meta glasses have a camera?

Yes, all Meta smart glasses models feature a built-in 12-megapixel camera that takes photos and videos. In the case of Meta’s Ray-Ban Display model, users can activate the camera using hand gestures, thanks to the company’s EMG wristband.

Can you listen to audio with Meta glasses?

Yes, Meta glasses are equipped with open-ear speakers and Bluetooth connectivity to provide high-quality audio for taking calls, recording videos, streaming music and more. A software update also amplifies the voice of someone a user is listening to, making it easier to focus on a conversation in a noisy environment.

What is the difference between Meta glasses and Ray-Ban Meta glasses?

The term “Meta glasses” refers broadly to Meta’s entire lineup of smart glasses, including its Ray-Ban, Oakley and Ray-Ban Display product lines. Ray-Ban Meta glasses are a specific product line within that lineup, and the company’s original smart glasses.

Do Meta glasses require a smartphone?

Yes, Meta glasses require a smartphone since users must download the Meta AI mobile app to set up their glasses. An internet connection is also needed to access the full suite of AI features on Meta’s smart glasses.
