I tried Snap’s new Spectacles – and it felt like having my own personal Jarvis AI butler overlay with me everywhere I looked

Smartphones aren't that smart

The Snap Spectacles being worn
(Image credit: Future / Snap)

Most days, I wake up and reach for my phone before doing anything else. Check the weather, have a scan of the news, check some emails, and if I’m heading to the office, make sure there are no issues with my route.

It’s muscle memory at this point; it’s kind of mundane, but chances are we all have a similar routine.

Yet, last week I got a glimpse into how this routine can be changed and made more useful in the process.

Instead of twiddling my thumbs, I slipped on a pair of Snap’s next-generation Spectacles. They were running the newly launched Snap OS 2.0, the wearable operating system quietly powering Snap’s vision for spatial, AI-native computing. It took a minute to get used to, but once I did, it felt like walking around with my own personal Jarvis, Iron Man’s AI butler, sat on my face.

What are Snap’s Spectacles?

These aren’t the photo-focussed, social-sharing specs of the early Snapchat days. The latest Spectacles are AI-powered glasses that understand your environment, respond to your voice and gestures, and augment your world with contextual, digital information right in front of your eyes — the next hardware step in the race to own the 'augmented reality' future. They’re Snap’s answer to the question: What comes after the smartphone?

Running on Snap OS 2.0, the glasses deliver real-time interactions through Lenses (Snap’s version of spatial apps), letting you translate languages, play games, pull up directions, and even ask questions about objects in your surroundings, all with your hands free and your eyes up, with information displayed over the real world around you.

Hands-on: first impressions

A man using Snapchat's Spectacles

(Image credit: Future)

Wearing the Spectacles feels surprisingly natural. After about 15 minutes of acclimating to the interface and gestures, I found myself growing in confidence, pulling up websites, searching the web and exploring what was on offer with ease. Much like a fresh haircut, I’ll need some real time with the device to truly figure out its potential and how best to use it, but even from a short first impression, it’s easy to see how this could make day-to-day life simpler. In the past, this kind of tech has often felt like a novelty, housed in wearable hardware that was too large or impractical, but for the first time I’m seeing how seamless it can be in the right form factor.

Things like Spatial Tips, where you can ask questions about what you’re seeing (“How do I change the oil in this car?”), felt like magic. That’s where the Jarvis comparison comes in. AI becomes ambient, present, and useful.

The browser is snappy and the visuals are clean, and the consumer version coming next year is set to ship in a lighter frame, so it should be more comfortable than the model we tested.

What makes it tick? Enter Qi Pan

To dig deeper, I spoke with Qi Pan, Director of Computer Vision Engineering at Snap, and someone who’s spent nearly two decades building augmented reality systems. His passion is clear, and his vision for where this is all going is incredibly exciting.

“In the past, we could recognise the world,” Qi told me of how previous glasses saw your surroundings, “but we couldn’t fill it with enough content. Now, with AI, you can open the bonnet of any car and ask, ‘How do I add oil?’ and get a useful answer. That kind of scale is new. That’s what makes this finally useful.”

He explained that Snap has quietly mastered the art of making machine learning real-time and lightweight, first on mobile, and now on Spectacles. That means precise hand tracking, face tracking, and scene recognition that runs natively — no cables, no lag.

“Smartphones are powerful, but they’re dumb,” he said. “They have no idea what you’re trying to achieve. Glasses change that. They can see and hear your world, and respond in it. You don’t need to open an app. It just happens.”

There’s something delightfully human about that. Qi described a morning where he’d like to eat breakfast while the BBC headlines float gently in front of him. When he walks out the door, the glasses should already know his usual Tube route, alerting him if there’s an issue. Today, we’re fumbling on our phones, looking down, ignoring the world, whereas soon we could be seeing it just through a different lens.

The vision and the reality

Evan Spiegel at AWE 2025: Bringing AI and AR Into the Real World Through Spectacles (YouTube)

Of course, any attempt to replace the smartphone faces an uphill battle. Snap’s strategy here is to embed advanced tech in approachable, delightful experiences, much like they did with face Lenses. As Qi puts it:

“No one wakes up thinking, ‘I’m going to use AR today.’ They just want to send a funny Snap to a friend. That’s what we want for Spectacles, too. You shouldn’t think, ‘I’m using AI.’ You should just be… living your life.”

That philosophy is already visible in educational lenses like the Human Body and Solar System demos, letting you explore anatomy and physics in full 3D, anchored in your space. It’s easy to imagine this expanding into classrooms, museums, labs, and even your kitchen.

That said, Qi was honest about the remaining challenges, especially hardware.

“These things have to be wearable all day. For mass adoption, they need to be thinner, lighter, and comfortable enough to forget you’re wearing them. That’s why next year’s consumer Specs release is so important.”

Will this kill the smartphone?

Not tomorrow. But if Snap has its way? Maybe sooner than you think.

Snap OS 2.0 doesn’t just deliver an incremental upgrade; it represents a fundamental rethinking of what computing could be: not app-based, but contextual; not screen-first, but human-first. The smartphone forced us to look down. This tech lets us look up, potentially allowing us to become more sociable again.

Snap’s Spectacles are currently available to select developers ahead of their consumer launch in 2026.

Morgan Truder
Staff Writer

Morgan got his start in writing by talking about his passion for gaming. He worked for sites like VideoGamer and GGRecon, knocking out guides, writing news, and conducting interviews before a brief stint as RealSport101's Managing Editor. He then went on to freelance for Radio Times before joining Shortlist as a staff writer. Morgan is still passionate about gaming and keeping up with the latest trends, but he also loves exploring his other interests, including grimy bars, soppy films, and wavey garms. All of which will undoubtedly come up at some point over a pint.
