Mark Zuckerberg mostly uses the new Meta Ray-Ban Display glasses to send text messages. Lots of them.
He has been wearing the glasses around the office lately, firing off WhatsApp pings to his execs throughout the day. “I run the company through text messages,” he tells me.
“Mark is our number one heaviest user,” Alex Himel, the company’s head of wearables, confirms. Zuckerberg is known for sending lengthy, multi-paragraph missives via text. But when he’s typing on the glasses, Himel can tell because the messages arrive faster and are much shorter.
Zuckerberg claims he’s already at about 30 words per minute. That’s impressive considering how the glasses work. The heads-up display isn’t new; Google Glass tried it more than a decade ago. What’s new is the neural wristband Meta built to control the interface and type via subtle gestures. Instead of tracking your hands visually or forcing you to type in midair, the band picks up the electrical signals from the muscles in your arm. “You can have your hand by your side, behind your back, in your jacket pocket; it still works,” Zuckerberg says. After trying it, I can confirm he’s right. It feels like science fiction come to life.
“Glasses, I think, are going to be the next computing platform device,” says Zuckerberg during our recent conversation, which airs in full on the Access podcast Thursday, September 18th. He argues they’re also the best hardware for AI: “It’s the only device where you can basically let an AI see what you see, hear what you hear, talk to you throughout the day, and then once you get the display, it can just generate a UI in the display for you.”
While Zuckerberg has been advocating for this idea about the next major platform for a while, numbers — not just hype and flashy demos — are now beginning to support his theory. Sales of Meta’s existing Ray-Bans have reached the single-digit millions and increased by triple digits from last year. The broader market for tech-enabled eyewear is projected to reach tens of millions soon. Google is releasing AI glasses next year, and Snap has a consumer pair of AR glasses shipping then as well. I expect Apple to release its own glasses as soon as 2027 — the same year Meta is targeting for its much pricier, full-fledged AR glasses.
For Zuckerberg, the prize is enormous. “There are between 1 to 2 billion people who wear glasses on a daily basis today for vision correction,” he says. “Is there a world where, in five or seven years, the vast majority of those glasses are AI glasses in some capacity? I think that it’s kind of like when the iPhone came out and everyone had flip phones. It’s just a matter of time before they all become smartphones.”
Meta’s CTO Andrew Bosworth recalls how EssilorLuxottica initially thought the display glasses would be too big to sell as Ray-Bans. “Then, last year, we showed it to them. They were like, ‘Oh, you did it. Let’s put Ray-Ban on it.’” They’re still chunky, but less noticeable than the Orion AR prototype Meta showed off last year. With transition lenses, the new display Ray-Bans start at $800 before a prescription. Bosworth says the target customers are “optimizers” and “productivity-focused people.”
Meta isn’t making many of them — reportedly a couple hundred thousand — but Bosworth predicts “we’ll sell all of the ones that we build.” When I ask Zuckerberg about the business potential, he hints that the real margin will come later: “Our profit margin isn’t going to come from a large device profit margin. It’s going to come from people using AI and the other services over time.”
The hardware feels surprisingly refined for a first version. The geometric waveguide display sits to the side of the right lens, clear enough to use in sunlight, with a 20-degree field of view and a crisp 42 pixels per degree. The neural band lets you pinch to bring up the display and dismiss it again. You can’t see the display from the front at all, even when it’s turned on. The glasses last up to six hours on a charge, with multiple recharges from the case.
The core software still relies on your phone, but it’s more than a notification mirror. You can send texts, take audio or video calls, show what you’re listening to via the speakers in the frame, get turn-by-turn walking directions, see what your camera is capturing, and run Meta AI to recognize what’s in front of you. Bosworth calls crisp text rendering the key to making AI useful. “If the AI has to read it back to you verbally, you’re not getting the most information,” he says. “Whereas you can just ask a question and it shows you the answer. It’s much better. It’s more private, too.”
While the AI features took more of a backseat in my demo, I did use them to recognize a painting on a wall and to generate a table setting out of thin air. I made sure to ask things that were off the script I was given, and the assistant still performed as expected. The display also shows AI-suggested follow-up prompts you can easily select via the neural band.
The most striking demo was live captions. In a noisy room, I could look at someone several feet away, and what they were saying appeared in real time in front of me. It feels like super hearing. Language translation is next, with Spanish and a few others supported at launch. Meta is also working on a teleprompter feature.
Still, Meta thinks this is just the start. “If you look at the top 10 reasons you take your phone out of your pocket, I think we knocked out five or six of them,” Bosworth says. The long-term bet is that the glasses eventually let you leave your phone behind.
The neural band may be the bigger unlock in the near term. Bosworth admits Meta only committed to it after the Orion AR glasses prototype proved its usefulness last summer. Now, it’s advancing faster than expected. A handwriting mode initially thought to be years away is already working. Zuckerberg envisions it going even further. “It’s basically just an AI machine learning problem. The future version of this is that the motions get really subtle, and you’re effectively just firing muscles in opposition to each other and making no visible movement at all.”
In addition to enabling personalized autocomplete via thought, the neural band may also become a way to control other devices or even a smart home, according to Zuckerberg. “We invented the neural band to work with the glasses, but I actually think that the neural band could end up being a platform on its own.”
For now, the first generation of these Ray-Ban Display glasses is clearly a device for early adopters. The AI features are limited, the neural band takes practice, and the software needs polishing. But Zuckerberg seems more convinced than ever that glasses are the future, and after trying his new glasses, it’s hard to disagree. “It’s 2025, we have this incredibly rich digital world, and you access it through this, like, five-inch screen in your pocket,” he says. “I just think it’s a little crazy that we’re here.”
If history repeats, Meta may finally be on the cusp of the new platform Zuckerberg has been dreaming about for years. “The first version of the Ray-Bans, the Ray-Ban Stories, we thought was good,” he says. “Then, when we did the second version, it sold five times more, and it was just refined. I think there’s going to be a similar dynamic here. The first version, you learn from it. The second version is a lot more polished. And that just compounds and gets better and better.”
This is Sources by Alex Heath, a newsletter about AI and the tech industry, syndicated just for The Verge subscribers once a week.