Project Aura, announced at Google I/O, is a collaboration between Xreal and Google. It's the second Android XR device (the first being the Samsung Galaxy XR headset) and is scheduled for release in 2026. Putting them on, I can see why the term "smart glasses" doesn't quite fit.
Is this a headset? Smart glasses? Both? Those were the questions running through my mind as I held Project Aura in my hands during a recent demo. It looked like a pair of hefty sunglasses, save for a wire hanging off the left side leading to the battery, which also doubles as a trackpad. When I asked, Google representatives told me they think of it as a headset disguised as glasses. They even have a term for it: wired XR glasses.
I can wirelessly connect to my laptop and pull up a giant virtual desktop in my space, with a field of view of up to 70 degrees. My first task is to launch Lightroom on a virtual desktop while opening YouTube in another window. I play a 3D board game where I can pinch and pull the board to zoom in and out. I look at a painting on the wall and summon Circle to Search; Gemini tells me the name of the artwork and the artist.
I've done all of this before on the Vision Pro and the Galaxy XR. The difference is that this time my head isn't stuffed inside a bulky headset; if I wore this in public, most people wouldn't notice. That said, this isn't augmented reality, where digital information is anchored to objects in the real world. It's much more like using the Galaxy XR, where you see apps floating in front of you and around you.
A Google representative told me that everything I tried on Project Aura was originally developed for the Galaxy XR. None of the apps, features, or experiences had to be reworked for Project Aura's hardware. That's huge.
XR has a serious app problem. Take the Meta Ray-Ban Display and the Vision Pro. Both launched with only a handful of third-party apps, giving consumers little reason to pick them up. Developers, meanwhile, have to choose which of these gadgets is worth building apps for. That leaves little room for smaller companies with big ideas to compete or experiment.
This is what makes Android XR fascinating. Smaller players like Xreal get access to apps developed for Samsung's headset. Android apps will also run on the AI glasses launching next year from Warby Parker and Gentle Monster.
“I think this is probably the best thing for all developers. You just don’t see any fragmentation anymore. And I really believe that more and more devices will connect with each other. That’s the whole point of Android XR,” says Xreal CEO Chi Xu.
Putting on Google's latest prototype AI glasses, I watch an Uber demo in which a fictional version of me gets picked up from JFK airport. The representative orders the Uber from a phone. An Uber widget appears on the glasses' display, showing the estimated pickup time along with the driver's license plate and car model. If I look down, a map of the airport appears with live directions to the pickup area.
Everything here is handled by the regular Uber app for Android, meaning Uber didn't have to build an Android XR app from scratch. In theory, users could simply pair the glasses and start using the apps they already have.
When I ask Gemini to play music, a YouTube Music widget appears with the title of a funky jazz mix and media controls. It, too, is just using the YouTube Music app on your Android phone.
I'm told to ask Gemini to take a photo with the glasses. A preview appears on the display and on a paired Pixel Watch. The idea is that smartwatch integration gives users more options: say someone wants audio-only glasses with a camera; they can still take a photo and see how it looks on their wrist. It works with any compatible Wear OS watch.
I also try live translation, where the glasses detect the language being spoken. I make a video call on Google Meet. I have Nano Banana Pro add K-pop elements to another photo I took. I test a second prototype with a display in both lenses, which allows for a larger field of view. (These won't be released next year.) I watch a 3D movie on YouTube.
It's all impressive. I've heard plenty of talk about Gemini being the real killer app here. But my jaw dropped when I learned that next year's Android XR glasses will support iOS.
“The goal is to make having multimodal Gemini on glasses available to as many people as possible. If you’re an iPhone user and you have the Gemini app on your phone, that’s great news. This will give you the full Gemini experience,” says Juston Payne, Google’s director of product management for XR.
Payne notes that this will broadly extend to Google's own iOS apps, like Google Maps and YouTube Music. The restrictions on iOS will mainly affect third-party apps, but even there, Payne says the Android XR team is looking into workarounds. At a time when ecosystem lock-in in wearables is at an all-time high, this is a breath of fresh air.
Leaning on the existing Android ecosystem is a shrewd move that could give Android XR an edge over Meta, which currently leads on hardware but has only just made its APIs available to developers. It also puts pressure on Apple, which is lagging in both AI and glasses. Ensuring interoperability between different devices? Honestly, that's the only way an in-between device like Project Aura stands a chance.
"I know we can make these glasses smaller in the future, but we don't have that ecosystem," Xu adds. "Right now there are only two companies in the world that can really have an ecosystem: Apple and Google. Apple, they won't work with others. Google is the only option for us."
Google is trying to avoid its past mistakes. It's deliberately partnering with other companies on the hardware, sidestepping the conspicuous design of the original Google Glass. It has apps lined up ahead of launch. And its prototypes explore different form factors: audio-only, a display in one lens, and displays in both.
Payne doesn't shy away when I raise the big cultural question: how do you discourage Glasshole behavior?
"If something is being recorded, there's a very bright, pulsating light. So if the sensor is on and it wants to record something, it will notify everyone in the area," Payne says. That includes Gemini queries that involve any camera task. The on/off switch will have clear red and green markings, so wearers can show bystanders that the glasses aren't recording. Payne says Android's and Gemini's existing permissions frameworks, privacy policies, encryption, and data storage and security guarantees will also apply.
"There's going to be a whole process of getting access to certain sensors so that we can avoid certain situations that might happen if someone decides to use the camera inappropriately," Payne says, noting that Google is taking a conservative approach to granting third parties access to the cameras.
On paper, Google is making shrewd moves that address many of the challenges inherent in this space. It all sounds good, but that's easy to say before these glasses actually hit the market. A lot can change between now and then.

