Last week, I asked a straightforward question during a Google Project Aura demo: what do you call these things? To my surprise, it sparked a lively discussion about the taxonomy of glasses-shaped face computers.
It turns out "smart glasses" is no longer the term of choice. "AI glasses" is already in vogue. In fact, it seems like no one is quite on the same page.
Let me back up and explain. Earlier this year, a Meta representative asked if I would call Meta's Ray-Ban glasses "AI glasses." Meta CEO Mark Zuckerberg and CTO Andrew Bosworth have described the glasses as the perfect vehicle for artificial intelligence. Framing these devices as AI glasses does two things: first, it tells consumers these glasses are not like Google Glass; second, it positions artificial intelligence, not augmented reality, as the reason to buy them.
I declined. As journalists, we're not in the business of advancing marketing narratives, and it didn't appear that anyone else in the industry used this terminology. Cambridge Dictionary defines smart glasses as "a pair of glasses equipped with computer technology" so they can be used like smartphones or relay information about what you're looking at. That certainly sounds like the Ray-Ban Meta glasses to me.
During the Project Aura demo, I was surprised to learn that Google is also adopting the term "AI glasses"… but only for certain products. Juston Payne, director of XR product management at Google, defined AI glasses as stylish, lightweight glasses that may or may not have a display, where AI is an integral part of the user experience.
Project Aura, he said, does not fall into that category. In fact, Google sees it as closer to a headset, just in glasses form. The term Google officially adopted in its press release was "wired XR glasses," because the Aura has a tethered battery/trackpad.
There is a certain logic to this. Project Aura is the result of a collaboration with Xreal, which also positions its products somewhere between so-called AI glasses and headsets. After the Project Aura demo, I hopped on a call with Xreal CEO Chi Xu and asked how he classifies Aura.
Xu simply laughed and said, “We will call all our glasses and previous products AR glasses.”
Then I thought back to the conversations I had while investigating who exactly is buying VR and XR headsets. I spoke to three experts from Gartner, Counterpoint Research, and IDC. None of them defined "smart glasses" the same way. Gartner defined them as glasses without a camera or display, equipped with a Bluetooth connection and AI. Glorified headphones, if you will. Counterpoint noted that "smart glasses without transparent displays" are the "major volume driver in the smart glasses category." IDC had a much broader definition that included anything shaped like glasses. There's also the older connotation that smart glasses must have AR displays and overlays. Think of what Tony Stark wears as Iron Man, or what Eggsy has in the Kingsman films.
Regardless of the definition, one thing becomes clear: a fundamental change is taking place here.
We tend to divide this category into virtual reality (VR) and augmented reality (AR). VR has always meant being largely cut off from the real world, immersed in a digital environment. AR traditionally meant heads-up displays that overlaid digital information on top of the real world. Somewhere along the way, mixed reality (MR) and extended reality (XR) entered the chat. The former refers to gadgets that blend the virtual and real worlds; the latter is an umbrella term for all of this technology.
Form factor used to be a reliable indicator of whether a device was VR or AR. VR was the domain of headsets, while AR made the most sense in glasses. Not anymore. The newest headsets are increasingly MR. The problem is that there's no consistent naming strategy. After all, it's the Samsung Galaxy XR, not the Galaxy MR. Xu may call Project Aura AR glasses, but it also falls firmly into MR territory. You could argue that true AR is effectively dead. That's why we find ourselves in this taxonomic quagmire.
I'm of the opinion that none of these terms is the endgame. Instead, I'm inclined to think companies will start categorizing these gadgets based on when you'll use them.
A recurring theme: AI glasses are stylish devices you wear all day, even after their batteries die. But while you can wear them 24/7, each interaction is relatively brief. Some companies have lofty ambitions of eventually replacing the phone, but for now they're content to be a smartwatch alternative for triaging notifications.
Headsets, conversely, are episodic devices. You're not meant to wear them 24/7. They're tied to specific use cases, like entertainment and multi-screen tasks. You wear them for an hour or two and then take them off. Technically you could take them with you, but generally speaking, companies don't imagine you'll do much more than sit while wearing them. Episodic devices are primarily headsets, but as Project Aura shows, they can also shrink down to a glasses form factor.
I still don't like the term "AI glasses," though I'll admit that "smart glasses" on its own may no longer be enough. Perhaps in time we'll have specific terms for whether smart glasses have a display, a camera, AI, Bluetooth, all of the above, or some combination of the four. In the meantime, I'm asking for a friend: what do we call this new generation of AI-powered face computers?