Snap announces real AR glasses for consumers
Snap has presented its new Specs. The lightweight AR glasses with AI functions will be available to consumers from 2026.
Play billiards better thanks to the glasses.
(Image: Snap)
AI assistance from glasses is something all major tech companies are currently working on. Snap already has the Spectacles 5 augmented reality glasses for developers on the market and is now announcing completely new Specs for 2026. The glasses are to be lightweight, immersive and equipped with more AI. They are significantly less bulky than the developer model and, above all, will be available to end consumers. According to the presentation, Specs are the "most advanced personal computer".
Evan Spiegel, co-founder and CEO of Snap Inc., said at the Augmented World Expo 2025: "We believe the time is right for a revolution in computing that naturally connects our digital experiences to the physical world, and we can't wait to publicly launch our new Specs next year." AI, he argued, needs to move from the screen into the real world.
(Image: Snap)
AI help with traveling, translating, cooking and more
With the Specs, AI assistance moves into three-dimensional space: you look through the see-through lenses and see digital content overlaid on the objects in your surroundings. The glasses do not need to be tethered to another device such as a cell phone and are controlled with your hands, for example with a pinch gesture. For certain functions such as phone calls, however, the Specs do still require a smartphone.
Meta's Ray-Ban glasses, by contrast, always require a connection to the Meta AI app. Cameras and microphones in the frame let you talk to the AI about what you see, but AR functions and a display are not yet available. Those will only arrive with Meta's announced Orion glasses, which so far exist only as a prototype.
The first companies have already developed applications for the Specs. Gowaaa's Super Travel service is designed to help translate signs, menus and more when abroad. With Paradiddle, you can learn to play the drums, and Cookmate helps you find a recipe that matches the contents of your fridge.
(Image: Snap)
OpenAI, Google and other providers such as Deepseek are bringing their AI models to the Specs and the Snap OS operating system. Developers can build lenses on the models' multimodal capabilities, such as the automatic translation in Super Travel. Snap also explains that all applications run through its proprietary Remote Service Gateway, which is intended to ensure privacy-compliant camera access.
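Snap has not published the exact developer interfaces in this announcement, but the flow it describes (a lens sends camera input through a gateway to a multimodal model and displays the answer) can be illustrated with a purely hypothetical TypeScript sketch. The names GatewayClient, describeFrame and the endpoint URL are illustrative assumptions, not Snap's actual Remote Service Gateway API.

```typescript
// Hypothetical sketch only: class, endpoint and payload shape are assumptions,
// not Snap's actual Remote Service Gateway API.
interface ModelReply {
  text: string; // e.g. a translation or scene description returned by the model
}

class GatewayClient {
  constructor(private endpoint: string, private apiKey: string) {}

  // Send one camera frame plus a prompt to a multimodal model via the gateway.
  async describeFrame(frameJpegBase64: string, prompt: string): Promise<ModelReply> {
    const response = await fetch(this.endpoint, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ prompt, image: frameJpegBase64 }),
    });
    if (!response.ok) {
      throw new Error(`Gateway request failed: ${response.status}`);
    }
    return (await response.json()) as ModelReply;
  }
}

// Example: ask the model to translate whatever sign the wearer is looking at.
async function translateSign(frameJpegBase64: string): Promise<string> {
  const client = new GatewayClient("https://gateway.example/v1/multimodal", "DEMO_KEY");
  const reply = await client.describeFrame(
    frameJpegBase64,
    "Translate the text in this image into English."
  );
  return reply.text;
}
```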
Specs with real-time translation, 3D lenses and guided tours
Using the Depth Module API, 2D information from large language models (LLMs) is converted into spatial AR content. Snap says this "opens up a new paradigm for spatial intelligence". The Automated Speech Recognition API enables real-time translation in more than 40 languages, and with the Snap3D API developers can generate 3D objects for their lenses using GenAI.
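The announcement does not explain how the Depth Module API works internally, but the general idea behind turning a 2D model answer into AR content is unprojection: combine a pixel position with a depth estimate to get a 3D anchor point for a label or object. A minimal sketch of that geometry, using a simplified pinhole camera model with made-up intrinsics:

```typescript
// Minimal unprojection sketch: maps a 2D pixel plus a depth value to a 3D point
// in camera space. The camera intrinsics below are illustrative, not Specs values.
interface Intrinsics {
  fx: number; // focal length in pixels (x)
  fy: number; // focal length in pixels (y)
  cx: number; // principal point x
  cy: number; // principal point y
}

interface Vec3 {
  x: number;
  y: number;
  z: number;
}

// Standard pinhole unprojection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
function unproject(u: number, v: number, depthMeters: number, k: Intrinsics): Vec3 {
  return {
    x: ((u - k.cx) * depthMeters) / k.fx,
    y: ((v - k.cy) * depthMeters) / k.fy,
    z: depthMeters,
  };
}

// Example: the model refers to something at pixel (640, 360); a depth estimate of
// 1.2 m turns that into a 3D point where an AR label could be anchored.
const intrinsics: Intrinsics = { fx: 900, fy: 900, cx: 640, cy: 360 };
const anchor = unproject(640, 360, 1.2, intrinsics);
console.log(anchor); // { x: 0, y: 0, z: 1.2 }
```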
Developer tools for location-based experiences are also new. They make it possible to offer AR-guided tours through museums, cities or events, and there is a guided mode for one or more people at the same time. Developers can use the fleet management app to access several Specs remotely at once.
WebXR support in the integrated browser has initially only been announced as "coming soon". The standards grouped under that name make VR and AR experiences available on the web, which means web content can also run on the glasses. Together with Niantic Spatial, Snap also wants to build an AI-supported map of the world.
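WebXR itself is an existing W3C standard, so once the promised browser support arrives, starting an AR session should look roughly like the following standard WebXR code. The requested feature list and the empty render loop are illustrative; what a real lens or page renders per frame is up to the developer.

```typescript
// Starts an immersive AR session via the standard WebXR Device API.
// Loosely typed because the default TypeScript DOM library lacks WebXR types.
async function startArSession(): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported in this browser");
    return;
  }

  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["local"],    // basic local reference space
    optionalFeatures: ["hit-test"], // e.g. for placing objects on surfaces
  });

  // Hook up a WebGL context as the session's render target.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl", { xrCompatible: true });
  await session.updateRenderState({
    baseLayer: new (window as any).XRWebGLLayer(session, gl),
  });

  const refSpace = await session.requestReferenceSpace("local");

  // Per-frame callback: the viewer pose would normally drive rendering here.
  const onFrame = (_time: number, frame: any) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // pose.views contains one view per eye on a stereoscopic device.
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```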
Spiegel said during his presentation that Snapchat is the world's largest AR platform: its AR lenses are used eight billion times a day. For comparison, Google receives 14 billion requests a day. In addition, the Camera Kit SDK is used in numerous cases to offer AR in other apps. Snap is also behind the AR experiences in numerous US stadiums, such as the Kiss Cam, as well as AR mirrors found in malls and at events.
(emw)