At Meta Connect 2025, Zuck doubled down on a bet he’s been making for years: that glasses – not phones, not headsets – are the ideal form factor for personal superintelligence. Why? Because they let AI “see what you see, hear what you hear, talk to you throughout the day.”
You have to give him credit for one thing: this guy ships. While Apple shaves millimeters off its phones and OpenAI’s hardware ambitions remain somewhere between a teaser and a blog post, Zuck keeps delivering – on a steady drumbeat, with real devices. You can take issue with plenty, but not with his velocity. He runs a trillion-dollar company like he’s still in a hoodie at Harvard.
The crown jewel of the launch yesterday: the Meta Ray-Ban Display – $799 glasses with a high-res, full-color screen tucked discreetly into the right lens. Invisible when idle, present when needed.
Controlling these glasses is Meta’s other big swing: a wristband powered by EMG (electromyography) that reads your muscle signals to interpret gestures like swipes, clicks, and yes – eventually, typing. “Every new computing platform has a new way to interact with it,” Zuck said, describing the wristband as “replacing the keyboard, mouse, touchscreen, buttons, dials” with muscle signals.
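The core idea of the wristband (turning patterns in muscle signals into discrete inputs) can be sketched in a few lines. To be clear, everything below is a hypothetical illustration: the channel names, thresholds, and the simple RMS-energy feature are assumptions for the sake of the sketch, not Meta's actual models, which are not public.

```python
# Illustrative sketch: mapping EMG muscle-signal features to gestures.
# All channel names and thresholds are hypothetical, chosen for clarity.

def rms(samples):
    """Root-mean-square amplitude of one EMG channel window."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def classify_gesture(channels, click_thresh=0.6, swipe_thresh=0.3):
    """Map per-channel RMS energy in one time window to a coarse gesture.

    channels: dict of channel name -> list of samples for the window.
    A strong burst on the (hypothetical) thumb channel reads as a click;
    a sustained imbalance between flexor and extensor channels reads as
    a directional swipe.
    """
    energy = {name: rms(s) for name, s in channels.items()}
    if energy.get("thumb", 0.0) > click_thresh:
        return "click"
    delta = energy.get("flexor", 0.0) - energy.get("extensor", 0.0)
    if delta > swipe_thresh:
        return "swipe_left"
    if delta < -swipe_thresh:
        return "swipe_right"
    return "idle"

# Synthetic example window: a burst on the thumb channel.
burst = {"thumb": [0.8, 0.9, 0.7], "flexor": [0.1] * 3, "extensor": [0.1] * 3}
print(classify_gesture(burst))  # click
```

A real pipeline would use learned models over many channels rather than hand-set thresholds, but the shape of the problem is the same: continuous bio-signal in, discrete UI events out.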
There’s also a $499 Oakley Meta Vanguard, aimed at athletes – rugged, water-resistant, integrated with Garmin and Strava, 3K video, and wind-cutting speakers. And yes, the classic Ray-Bans get double the battery and a party trick: voice amplification in noisy rooms.
Unlike the “one device to rule them all” hype, Meta is playing small ball: navigation, translation, comms, recording. Narrow, useful, frequent use cases that compound into habit. Glasses that run passively in the background, surfacing just enough intelligence to help, not distract. Less immersion, more ambience.
Of course, strapping a camera and mic to your face is never going to be frictionless. The privacy concerns are real. Meta is attempting to address this – visible LEDs for recording, local processing, tighter permissions – but trust is earned over time, not engineered.
Are glasses the final form? Maybe. Maybe not. But they’re a strong contender. Here’s why:
– Socially acceptable form factor
– Eye-level display = natural UX
– Already part of daily life
– Integrates camera, mic, audio, and display in one frame
– And crucially: they don’t demand your full attention
Zuck assures us that “AI glasses are taking off. The sales trajectory we’ve seen is similar to some of the most popular consumer electronics of all time.”
Hyperbole or not, Meta is building a new interface for AI: One you wear, not touch. One that’s ambient, not immersive. One that may not change your life overnight – but might change how you live in small, cumulative, compounding ways. Which, come to think of it, is how the smartphone started too.