Android XR: Google demonstrates AR glasses at TED conference
Google seems to be making progress in the development of smart glasses with a display.
Google has been showing prototypes of AR glasses for several years. Their functions were demonstrated in front of an audience at a TED conference.
(Image: Google)
At a TED conference, Google demonstrated smart glasses with an integrated display and Gemini functions. The company appears to be making progress with their development.
Google appears to be gradually preparing for the launch of its first AR glasses running the new Android XR operating system. At a TED conference in Vancouver, Canada, Google executives demonstrated a prototype of AI glasses with an integrated display live on stage. Google's smart glasses are almost indistinguishable from conventional glasses.
Smart glasses from Google: Gemini helps you find lost objects
As the tech portals Axios and Goodgoodgood write, Android XR boss Shahram Izadi and Nishta Bathia, Product Manager for Glasses and AI, demonstrated various functions and scenarios on stage.
During the event, Izadi performed a live translation from Farsi to English and demonstrated image recognition by scanning a book. The translation appeared as subtitles on the glasses' display.
Izadi explained that real-time features such as seeing, hearing and reacting are “quite rudimentary”. A function called “memory” represents the next stage: According to the Android XR boss, the AI glasses use a “rolling contextual window where the AI remembers what you see without you having to tell it what to keep track of”. A camera is integrated into the glasses to capture the surroundings.
To find a lost hotel card, for example, Bathia asked Gemini: “Do you know where I last put the card?” “The hotel card is to the left of the record,” Gemini replied, referring to items on the shelf behind her that the glasses had captured.
The “memory” function was first demonstrated in a video at Google I/O 2024 and is part of Google's Project Astra. Just a few days ago, Google gave the AI Gemini Live “eyes” to enable the described function.
Lightweight Gemini glasses are paired with the smartphone
Izadi also explained that the glasses do not work independently but must be paired with a smartphone. Data is streamed back and forth between the devices, with the smartphone handling the heavy processing. The advantage of this approach is that the glasses can be kept light and compact.
The concept is very similar to Meta's “Project Orion”, a prototype of which the company showed in September 2024. Instead of a smartphone, Meta's smart glasses use an external computing unit in the form of a puck.
Demos such as those by Google and Meta indicate that the technology has been miniaturized to the point that augmented reality in a glasses form factor is now technically feasible.
Smart glasses from Samsung and Meta expected in 2025
The first potentially mass-market smart glasses are reportedly set to launch this year. Despite the current demo, the first smart glasses based on Android XR may well come not from Google itself, although that is not entirely ruled out, but from hardware partner Samsung.
“Project Haean” is said to be a pair of smart glasses with a display based on Android XR, due to be presented by the end of 2025. Samsung already confirmed at the beginning of 2025, when it unveiled the Galaxy S25, that it has smart glasses in the works. However, the XR headset “Project Moohan” is expected to be presented first.
Meta is also expected to launch similar glasses this year. Said to be a further development of its Ray-Ban glasses, the device is being developed under the code name Hypernova and features an integrated display. According to Bloomberg, it will cost more than 1,000 US dollars and will therefore be considerably pricier than the previous Ray-Ban models, which are available in Germany from 329 euros.
(olb)