Glass vs Screens: Why Smart Glasses Won’t Kill Our Phones – Yet

As Snap, Meta, Apple, and Google race toward a screenless future, early demos hint at real utility and new social norms. The question isn’t if we’ll switch, but how we’ll coexist.


Look around any café or subway carriage and you’ll see the same hunched-over heads staring into a rectangle. For nearly two decades, the smartphone has been our portal to the digital world – with very few credible alternatives. But Qi Pan, Snapchat’s director of computer vision, thinks that time is nearly over. “Until now, the phone has been pretty limiting,” he says. “You’re still trapped behind this tiny screen. You have to look at everything through a rectangle.”

Snap’s long-term vision is simple and radical: glasses replace smartphones. “This is a shift in the future paradigm. This is not like a certain subset of the population will start to use these. Consumers will just use them instead of looking down on their phone,” Qi says.

Snap isn’t alone. Big tech is placing its bets on glass, not screens, as the future of consumer tech. Meta, Apple and Google have all laid out their visions for a screenless era.

When I caught up with Qi, I was given a sneak peek of developer‑ready Snapchat lenses – soon to be tested and iterated on before consumers get their hands on a pair. In my demo, I tried the real‑time translation feature: a Snap engineer spoke to me in Spanish while English captions appeared in my lens. I wandered the room, asking about the author of a book on a shelf and for care instructions for the cheese plant in the corner.

My favourite feature – less educational, more play – was painting the room in 3D, sharing a canvas with the engineer. This kind of social activity is what Qi sees as a game‑changer compared with phones and immersive tech like VR headsets. “If you look at virtual content today, it’s very isolating. Until now, you haven’t been able to do anything like that.”

A smartphone killer?

Rosh Singh, chief executive of immersive agency Astral City, agrees “the direction of travel definitely is towards that future,” but he’s less convinced we’ll bin our screens overnight. “There will be a long time where they coexist,” he says.

“There is a huge benefit of having an always‑on layer that augments your world, delivering information and utility, and people also want to disconnect from their devices,” Rosh adds. But there will be a significant adjustment period before people ditch their smartphones in favour of something that sits on their face all day. He sees the first wave of smart glasses not as smartphone killers, but as the new smartwatch.

The signals from big tech are worth watching, though. At Astral City, Rosh and his team are currently spending all their time on how glass will change the game. Momentum kicked off with the unexpected popularity of Meta’s Ray‑Ban glasses – 2 million sold since their 2023 debut. “What the Meta Ray‑Bans have done is really opened people’s eyes that an AI assistant embedded into a wearable piece of technology is useful. People see the use instantly, and there’s been an uptick in adoption.”

Google has announced XR glasses slated for 2026. Apple is reportedly pivoting from its mixed reality headset, Vision Pro, to glasses. OpenAI sparked rumours of its own AI glasses hardware when it teamed up with Jony Ive. And Snapchat, after a decade of Spectacles, plans to bring its new Specs device to the public next year.

Roadblocks

Coexistence isn’t just cultural; for now, it’s a technical necessity. Qi says hardware limitations remain a significant blocker. When I was testing, the device felt weighty and made my ears ache after five minutes. Then again, you’re wearing the equivalent of two mid‑range phones, cameras, sensors, and a battery on your face. “They are still too big to wear all the time, so we need to wait for the hardware side of things to catch up before the consumer product is ready,” he says.

To get there, Snap is calling on developers to test and iterate ahead of launch. Anyone can apply to the developer scheme, but they’ll need to use Snap’s proprietary tools. “Any performance gains, efficiency benefits the developers are using and testing – they can also use these glasses to create whatever comes to mind in terms of building AR experiences.” Snap then mines an active Reddit channel for feedback.

The goal is for the tech to disappear into the background. “At some point, you shouldn’t have to think ‘oh this is AR, this is a tech device’,” Qi says.

Spatial computing sits at the heart of this: devices that understand context and the world around them, serving relevant responses at the right moment. You open your front door at the usual hour and your device quietly tells you whether to take the bus or the tube – without you opening Google Maps or typing a destination.

“Today, the way we interact with phones is that the phone doesn’t understand anything at all. You need to press an app in order for it to understand what you’re trying to achieve,” Qi says. “If we’re going to put a piece of hardware on your face, I expect this to add value almost every minute you are wearing them.”

An alternative to doom‑scrolling?

Our glass futures raise more questions than answers. When a friend picks up their phone, you see the cue. If they’re wearing glasses and start reading an email, you might have no idea. “It’s going to create a load of interesting questions about how we actually interact with each other, what those social cues are, and how it changes,” Rosh says.

There are two sides to the glass‑versus‑screen debate. One side, Rosh tells me, says that the last thing humans need is to spend more time in their digital worlds. But the other side says that having the right information served to you at the right time is better than sitting doom‑scrolling on your phone all day. “In a beautiful world we’re untethered from our phones, and we’re able to live out there in the world, still have access to all the information we need. It’s just slightly secondary to holding a giant rectangle in our face every 10 seconds.”

At this point, glass looks like it may not kill the phone, but it could quietly demote it.

