Google hosted a special event dedicated to Android XR, focusing on the latest developments in extended reality (XR), including smart glasses and VR headsets.
During the event, Google announced it is developing two types of AI glasses: one equipped with an electronic display and another centered primarily on audio. The company’s first AI glasses will launch in 2026. Compared with the much-criticized Google Glass from a decade ago, this new effort appears far more mature.
Google and Xreal jointly built an AI glasses prototype called Project Aura, which requires connection to an external battery pack to operate. Testers said the glasses connect wirelessly to a smartphone and rely on the phone to perform computing tasks and process requests—such as asking the Gemini AI assistant to play music or analyze ingredients to suggest recipes. Offloading heavy computation to the phone is what keeps the glasses thin and lightweight.
Technology giants are now racing into AI glasses: one company currently leads the category, while several rivals, including major Chinese tech firms, are catching up quickly.
Testers also revealed two Google smart glasses prototypes featuring built-in displays: a monocular version with a screen in the right lens, and a binocular version with a display in each lens. Both can run apps such as Google Maps and Google Meet, though the binocular design provides a larger virtual display.
Google aims to ensure that as many software experiences as possible run smoothly on both monocular and binocular models. Testers tried real-time translation, which can display subtitles on the screen—or users can turn off the display and rely solely on audio translation through the speakers.
Google Maps is expected to become a key use case for the AI glasses. When using Maps, testers could look down to view a larger top-down map of their current location, along with a compass showing which direction they were facing.

Users can also take photos with the AI glasses and ask the Gemini model to enhance them with Nano Banana Pro, viewing the processed image directly on the glasses without taking out a phone.
Google additionally outlined improvements to the Galaxy XR headset it is co-developing with Samsung, including a new travel mode that makes the headset easier to use while in motion. Previously, fast-moving scenery in cars or airplanes could disrupt the viewing experience, for example when watching a movie on a flight.
Google also introduced a PC Connect app that allows Windows computers (Mac version in development) to connect to the Galaxy XR headset and project a mirrored laptop screen into the virtual environment, including support for gaming—potentially boosting the headset’s appeal.
Finally, Google unveiled a new feature called “Likeness,” enabling Galaxy XR users to scan their faces, apply beautification, and use the virtual avatar in video calls, complete with simulated facial expressions and hand gestures.
Senior Research Analyst at Ainvest, formerly with Tiger Brokers for two years. Over 10 years of U.S. stock trading experience and 8 years in Futures and Forex. Graduate of University of South Wales.
