On an earnings call this summer, Meta CEO Mark Zuckerberg made an ambitious claim about the future of smart glasses, saying he believes that someday people who don’t wear AI-enabled smart spectacles (ideally his) will find themselves at a “pretty significant cognitive disadvantage” compared to their smart-glasses-clad kin.
Meta’s most recent attempt to demonstrate the humanity-enhancing capabilities of its face computing platform didn’t do a very good job of bolstering that argument.
In a live keynote address at the company’s Connect developer conference on Wednesday, Zuckerberg tossed to a product demo of the new smart glasses he had just announced. That demo immediately went awry. When a chef was brought onstage to ask the Meta glasses’ voice assistant to walk him through a recipe, he spoke the “Hey Meta” wake word, and every pair of Meta glasses in the room—hundreds, since the glasses had just been distributed to the crowd of attendees—sprang to life and started chattering.
In an Instagram Reel posted after the event, Meta CTO Andrew Bosworth (whose own bit onstage had run into technical problems) said the hiccup happened because so many instances of Meta’s AI were running in the same place that they had inadvertently DDoS’d themselves. But a video call demo failed too, and the demos that did work were filled with lags and interruptions.
This isn’t meant to be just a dunk on the kludgy Connect keynote. (We love a live demo, truly!) But the weirdness, the timid exchanges, the repeated commands, and the wooden conversations inadvertently reflect just how graceless this technology can be when used in the real world.
“The main problem for me is the raw amount of times where you do engage with an AI assistant and ask it to do something and it doesn’t actually understand,” says Leo Gebbie, a director and analyst at CCS Insight. “The failure risk just is high, and the gap is still pretty big between what’s being shown and what we’re actually going to get.”
Eyes of the World
Live captions seen on the Meta Ray-Ban Display. Courtesy of Meta
Clearly, we are a long way from Zuckerberg’s vision of smart glasses being the computing platform that elevates humanity to some higher-thinking, higher-functioning state. Sure, wearing internet-connected hardware on your face can make it easier and faster to access information, and that may help you become—or at least appear to become—smarter or more capable. But as the clumsiness of the Connect demo very publicly demonstrated, the act of simply wearing a chatbot and a screen on your face might cancel out any cognitive advantage. Smart glasses put the wearer at a significant social disadvantage.
Meta’s spectacles are the best smart glasses you can buy right now. They’re much more fashionable than earlier attempts—like the famously dorky Google Glass—and Meta’s partnership with Ray-Ban and Oakley owner EssilorLuxottica has clearly paid off aesthetically. The new Gen 2 Ray-Ban Meta models look very much like normal glasses. But once you start adding more of those cognition-enhancing capabilities, you start packing on the pounds. Look at the sheer heft of the Meta Ray-Ban Display: you can watch Instagram Reels on it, but the glasses are big, chunky, and dorky-looking.
Maybe you’ll be able to pull off that look. (Old man from Up but make it fashion.) Even then, the process of using them in the wild is bound to be uncanny.
I tried out the Meta Ray-Ban Display glasses at Meta Connect. The screen was definitely visible to me, though it was slightly blurry and took a moment to focus on. Trying to read something or pick out an icon meant staring down and to the right at the lens, looking downright cross-eyed to anyone who might be standing on the other side of me.
The design also puts diversions directly into your field of vision. “I cannot see how, for me as a wearer, it would not be invasive if I was speaking to someone one-on-one and suddenly I get a notification pop-up saying someone’s messaged me on WhatsApp,” Gebbie says. “That is so distracting.”
Tanner Higgin, a senior researcher at the education nonprofit WestEd, has watched people use technology like smart glasses and heads-up displays, and he says it quickly becomes clear when a wearer is focused on the interface rather than on their surroundings.
“There is a striking physical change that happens when I watch someone using them,” Higgin says. “Their attention shifts to the display. There’s a kind of vacancy—a thousand-yard stare—that then gets reinforced by gestures as they move their thumb or turn the volume knob. There’s this second reality that seems to be, for some people, more significant to serve in any given moment than their immediate physical reality.”
That’s bound to be off-putting when you’re talking to someone on the street, akin to trying to have a conversation with someone who’s distracted by their phone screen the whole time. I guess it’s hard to be cognitively enhanced to the max when you can’t pay attention to what the person in front of you is saying.
Social Distortion
Gebbie, who wears glasses regularly, says that theoretically he should be the target audience for Meta’s smart glasses. “I could put these on and wear them all day long, but I absolutely do not,” he says. “And it’s because I just worry about the social contract and the kind of weird behavior that’s always potentially there.”
Not that the potential for cringe will stop anyone from buying them. Meta has sold more than 2 million pairs of its Ray-Ban glasses. The company is also bound to focus its user experience research on those weird edges. Refining the gestures, repositioning the displays, and offering features that could sense a one-on-one conversation and mute notifications or automatically turn off the screen all seem like obvious ways to make the devices feel more natural.
Like all tech, these things will get better and become easier to incorporate into a daily routine. Some features on the glasses already go a long way toward justifying their existence. Live captioning of spoken conversations, with the text displayed on the tiny screen, is poised to be a helpful tool for everyone from Deaf and hard-of-hearing people to clumsy tourists asking for directions.
“Do these benefits outweigh the concerns that people have?” Gebbie says. “I think pretty quickly they will. We are already down the looking glass at this point.”
Still, the issue with Zuckerberg pitching smart glasses as a way to boost one’s brain function is that it implies you can use technology to win, to be the “better” person in any situation. That’s a rather cynical view of human interaction.
“There’s this feeling that we have to optimize and compete, and anytime we engage with another person we’re seeking some advantage or some way to leverage that relationship,” Higgin says. “It’s just such a strange way to operate in the world.”