
Google Reveals the Future of Android XR: AI Glasses to Rival Meta's Ray-Bans and PC Productivity

Plus, you can download the Likeness app to make your own 3D avatar.

The future of Android XR is in these glasses. Image: Google

I tried my hardest to avoid this cheesy opening line, but it's too fitting: Move over, Meta! Google is coming for your AI glasses. Today's Android Show announcement focused on Android XR, Google's extended reality platform, and its plans to push directly into the territory Meta has aggressively claimed. Google is betting that its "secret weapon" as it enters this newly established product category will be Gemini, an AI that isn't a total chore to use.

Today's announcement lays out the future: New Android XR-equipped AI glasses are on the horizon for next year, featuring both monocular (one-screen) and binocular (two-screen) models. Google also announced a partnership with XREAL to add wired XR glasses to its lineup, alongside an SDK update that officially opens development for these additional form factors. For Samsung Galaxy XR users, there are a few new feature updates, including PC Connect, which lets you control a Windows PC directly from the headset.  

What's coming: AI glasses and wired XR

Your author is wearing a prototype pair of Google's forthcoming AI glasses. Image: Florence Ion / Android Faithful

Google announced earlier this year that it had partnered with Warby Parker and Gentle Monster to produce Gemini-infused smart glasses. With this latest announcement, the company confirmed that the first pair is being fast-tracked for a commercial release in 2026. There will be a pair offering screen-free assistance, though it still includes built-in cameras, microphones, and speakers for interacting with Gemini. There will also be a pair with an optional in-lens display for features like visual navigation in Maps and live translations. Both types of glasses have physical controls, such as a trackpad and launcher buttons, that mirror the Android navigation scheme.

For higher-performance spatial computing, XREAL will offer its Project Aura glasses with optical see-through capabilities. These glasses have a wider 70-degree field of view and run Android XR on Qualcomm Snapdragon chips. Those internals enable "headset-like immersion": a massive, private screen you can use as a workspace or for watching 3D-rendered YouTube videos.

Connect Galaxy XR to a PC

Those of you who had a minimum of $1,800 to drop on the Samsung Galaxy XR headset can enjoy a few new features rolling out beginning today. The most significant is PC Connect, which links your Windows PC to the headset so you can pull in your desktop, or a single window from it, and view it in the headset. The feature is only available in beta right now. I got to try it for a few minutes inside the PC game Stray, where you control a cat through the harsh human world, using a game controller connected to a Lenovo Legion laptop that was streaming the game. It was definitely neat to play from inside the headset, but, as with my experience with Meta headsets, the movement was too much for my sensitive little stomach.

One thing you can try, even if you don't have a Galaxy XR headset: the new Likeness app starts rolling out today in beta to help you create a realistic digital representation of your face, which is then mirrored inside the headset during a compatible video call. It's basically a 3D avatar creator that mimics much of the experience from other Google-backed ventures like Project Starline (now known as Google Beam).

Developers, developers, developers!

A new product category is nothing without developers to make apps for it. Google announced Developer Preview 3 of the Android XR SDK, which opens up development for AI glasses, headsets, and wired glasses. The good news for consumers is that this means more Android apps ready for these devices, if and when you decide to buy into the hardware category.

I think I want a pair

I was fortunate to go hands-on with the new glasses Google is working on with its hardware partners. The hardware I wore was not finalized, but it offered a rough idea of what the experience will be like when it arrives on store shelves. I had an initial glasses demo at Google I/O earlier this year, and my experiences querying Gemini for a song or for help with navigation in Google Maps weren't too different from that first try-on. This time, Google had me use Nano Banana to generate an image within a photo I'd snapped with the prototype glasses. I ended up constructing a fashionable Mad Men-style conversation pit, complete with whiskey stones and fragile masculinity. It's amazing what AI can do these days.

Behold, a manly pit generated by AI.

I already see myself getting more use out of a pair of Google AI glasses than I have out of the Ray-Ban Metas that sit collecting dust on my shelf. But that has nothing to do with image generation, which I'd never think to use in a situation like this anyway. I don't bother with the Meta glasses because I don't like the AI behind them: when I first read up on it, Meta was publishing activity to a public feed where folks could see what you were querying behind the scenes.

Gemini doesn't have that social network connection, so I feel safer diving in. I already use it in some capacity on my Pixel smartphone and accessories like the Pixel Buds 2a. Most of what I use Gemini for is utility: issuing a smart home command, navigating with Google Maps, or getting basic search information in my ear when I can't touch the screen. And for the most part, it works the way I hope it will.

I am less enthused about the hardware meant to boost productivity. The XREAL wired glasses are neat in that they exist and do what they're supposed to, but I still have to contend with the fact that they make me feel woozy and slightly disoriented. I could feel my head wanting to hurt in that familiar, almost-migraine way. It's the same for me with Galaxy XR. I don't think I'm the target customer for full headsets. But the AI glasses feel like they could enhance my experience as an all-in Android user.

I'm still not sure how the public will take it, though, and that's another reason I'm not out and about in AI glasses. Google's AI glasses will have the same privacy safeguards smart glasses are expected to have today, like an indicator light that shows when the camera is recording (though bad actors have shown us how easy that light is to cover). But even with Gemini being the most personable AI, Google still has to figure out how to bridge the gap between the glasses being a helpful tool and being a social faux pas.

