Meta CEO Mark Zuckerberg unveiled a new product he called "the first AI-powered, high-resolution glasses" — the Meta Ray-Ban Display. The glasses go on sale on September 30 at a price of $799. But the presentation did not go quite as Zuckerberg had hoped, UNN reports, citing Mashable.
Details
It was expected that at his keynote at Meta Connect 2025, held at Meta's headquarters in California at the unusually late hour of 5:00 PM PT (8:00 PM ET), Zuckerberg would unveil a groundbreaking pair of smart glasses codenamed Hypernova. Instead, he presented an update to the existing Ray-Ban Meta frames, a new Oakley sports model, the Meta Vanguard, and a new model, the Meta Ray-Ban Display.
"This is one of those special moments where we get to show you something we've poured our lives into," Zuckerberg said to a packed room and a live stream with four thousand viewers.
According to him, the Meta Ray-Ban Display glasses have a bright, clear display with a peak brightness of 5,000 nits.
Zuckerberg then showed not only the Meta Ray-Ban Display glasses he walked in with (and quickly hid), but also a companion device called the Meta Neural Band – a lightweight fabric wristband that captures small wrist movements. This allows the wearer to type words on the smart glasses' display by mimicking handwriting.
"I type about 30 words per minute," Zuckerberg said.
And then the CEO stood by helplessly as repeated WhatsApp video calls from Meta CTO Andrew "Boz" Bosworth appeared on his glasses. Zuckerberg's Neural Band interface apparently failed to answer them, and Boz had to join him live on stage instead.
Zuckerberg's demo had started strong. The keynote began with a live view through his Meta Ray-Bans, in which Zuckerberg launched a hype song (the Neural Band also allows for volume control) and responded to incoming emoji messages.
But then the live demonstration of the new Ray-Ban Metas ran into a problem with the glasses' Live AI feature, which was supposed to walk one of the presenters through making a sauce from the ingredients in front of him.
"Now that you've created your base..." the glasses began several times, ignoring the presenter's repeated requests for instructions on how to create that base: "What do I do first?"
Zuckerberg later blamed the demo glitch on Wi-Fi, but couldn't explain why his Meta Ray-Bans couldn't answer Boz's call. In the end, a prerecorded video, rather than a live demonstration, showed the Meta Ray-Ban Display being used to design a surfboard and order parts.
Zuckerberg explained that this is how the glasses would work with agentic AI, brushing aside any question of whether agentic AI works at all – in live demos or otherwise.
Addition
Mark Zuckerberg has previously expanded on his optimistic view that glasses will become the primary way users interact with artificial intelligence in the coming years.
"I continue to think that glasses are going to be, by far, the ideal form factor for AI because you can let the AI see what you see throughout the day, hear what you hear, and talk to you," Zuckerberg said during an earnings call.
