A headset disguised as glasses
First previewed at Google I/O, Project Aura is Google and Xreal’s upcoming entry in the Android XR ecosystem. It follows Samsung’s Galaxy XR headset as the second official Android XR device and is slated for launch in 2026. But after trying it on, I came away with one clear takeaway: the term “smart glasses” doesn’t quite capture what Aura is.
The device resembles a pair of oversized sunglasses at first glance, though a cable on the left side connects to a compact battery pack that doubles as a trackpad. Google representatives describe it as a headset masquerading as glasses, and they’ve coined a new category name for it: wired XR glasses.
Virtual desktops, multitasking and gesture controls
Project Aura wirelessly connects to a laptop to create a floating virtual workspace, giving users a 70-degree field of view in which to arrange apps and windows within their physical environment. In my demo, I launched Lightroom on a virtual desktop while running YouTube in a separate floating window, a setup that felt intuitive and surprisingly immersive.
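Xreal hasn’t detailed Aura’s developer stack, but apps for Android XR devices are generally built with Google’s Jetpack XR SDK. As a rough illustration of the floating-window model, here is a minimal Compose sketch assuming the alpha androidx.xr.compose APIs; FloatingWorkspace and the panel sizes are illustrative, not anything Google or Xreal has published for Aura.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

// Hypothetical example, not Aura's actual implementation.
@Composable
fun FloatingWorkspace() {
    // Subspace opens a 3D volume in which panels can float freely.
    Subspace {
        // A primary "desktop" panel the wearer can grab, move, and resize.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)
                .height(640.dp)
                .movable()
                .resizable()
        ) {
            Text("Main app window")
        }
        // A second, independent floating window alongside it.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(480.dp)
                .height(270.dp)
                .movable()
        ) {
            Text("Video player window")
        }
    }
}
```

Each SpatialPanel behaves as an independent window the wearer can reposition in space, which matches the multi-window arrangement in the demo.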
Aura supports spatial interactions as well. In a 3D tabletop game, I could pinch and pull the digital board to zoom in and out while the content stayed anchored to a real table surface. Compared with earlier AR devices I’ve tried, the gestures felt natural and precise.
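Android XR maps basic hand pinches onto standard Android input events by default, so one plausible way an app could implement the pinch-to-zoom I tried is an ordinary Compose transform-gesture handler. This is a hedged sketch, not Xreal’s implementation; PinchZoomBoard is a hypothetical name, and true surface anchoring would involve additional ARCore and scene APIs not shown here.

```kotlin
import androidx.compose.foundation.gestures.detectTransformGestures
import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableFloatStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.graphicsLayer
import androidx.compose.ui.input.pointer.pointerInput

// Hypothetical example: scales its content in response to pinch gestures.
@Composable
fun PinchZoomBoard(content: @Composable () -> Unit) {
    var scale by remember { mutableFloatStateOf(1f) }
    Box(
        modifier = Modifier
            .fillMaxSize()
            .pointerInput(Unit) {
                // zoom is the incremental pinch factor for each gesture event;
                // accumulate it into the current scale, clamped to a sane range.
                detectTransformGestures { _, _, zoom, _ ->
                    scale = (scale * zoom).coerceIn(0.5f, 4f)
                }
            }
            // Apply the accumulated scale to whatever the board renders.
            .graphicsLayer(scaleX = scale, scaleY = scale)
    ) {
        content()
    }
}
```

Because the pinch arrives as a regular pointer event, the same handler would work whether the input comes from a touchscreen, the battery-pack trackpad, or hand tracking.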
Circle to Search and AI built directly into your view
Google is positioning Aura as an AI-first XR device. Looking at a painting on the wall, I summoned Circle to Search directly through the glasses. Gemini instantly identified the artwork and its creator, overlaying the answer in my field of view.
It’s a glimpse of how Project Aura blends spatial computing with Google’s growing AI ecosystem, turning everyday real-world objects into searchable, interactive references without ever picking up a phone.