Shopping in AR Glasses: Snap Shows New Payment Options and AI Tools

At Lens Fest 2025, Snap presents new AI tools, payment functions, and developer features for its AR platform and the upcoming AR glasses "Specs".

A first-person view image: a person on the beach holds a smartphone in their left hand and a payment card in their right.

With the upcoming AR glasses "Specs", Snap wants to enable payments within AR applications.

(Image: Snap Inc.)


At this year's Lens Fest in Santa Monica, Snap co-founder and CTO Bobby Murphy once again presented numerous innovations for developers. The focus was on AI-powered tools, simple payment functions, and preparations for the company's first consumer AR glasses, which are scheduled for release in 2026.

Since autumn last year, developers have been able to rent the fifth generation of Snap Spectacles to program their applications, so-called "Lenses", for the AR glasses. Now it is clear that they will not have to adapt their Lenses for future hardware: according to Snap, compatibility with the previous format will be fully maintained, with the recently introduced operating system Snap OS 2.0 serving as a common basis. However, Snap has yet to provide images, technical details, or even a concrete release date for the next AR glasses, "Specs".

With "EyeConnect", there is at least a new feature for shared AR experiences: a brief glance at another person is supposed to be enough to automatically start a shared Lens.

Monetization options for developers are an important building block for a functioning AR ecosystem. The new "Commerce Kit" is intended to help here, enabling the sale of digital goods or premium features directly within Specs Lenses. The technical basis is the new "Snap Cloud", an infrastructure based on Supabase. It offers APIs, real-time functions, and storage for computationally intensive AR projects and, according to Snap, was developed with a special focus on data protection.
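Snap has not yet documented Snap Cloud's interface in detail. Since the service is built on Supabase, however, a minimal sketch using the standard supabase-js client can illustrate the kind of storage and real-time functions described; the project URL, the key, and the "scores" table here are purely hypothetical examples, not part of Snap's announcement.

```typescript
import { createClient } from "@supabase/supabase-js";

// Hypothetical credentials; real values would come from a Snap Cloud project.
const supabase = createClient(
  "https://example-project.supabase.co",
  "public-anon-key"
);

// Persist a result in a hypothetical "scores" table.
async function saveScore(player: string, score: number): Promise<void> {
  const { error } = await supabase.from("scores").insert({ player, score });
  if (error) {
    console.error("Insert failed:", error.message);
  }
}

// Subscribe to new rows so every participant in a shared Lens
// sees incoming scores as they arrive.
supabase
  .channel("scores-feed")
  .on(
    "postgres_changes",
    { event: "INSERT", schema: "public", table: "scores" },
    (payload) => console.log("New score:", payload.new)
  )
  .subscribe();
```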

AI is also an important topic for Snap, even though the Snapchat operator is not developing its own model. Instead, the company is integrating a generative text interface into its developer environment. With “Lens Studio AI,” developers should be able to describe in natural language what they want to implement; the AI will then generate code, suggest assets, or help with troubleshooting.

According to Snap, the system uses large language models such as GPT-5 High. More than 50 modular components, including stickers and animated Bitmojis, are available via the new Blocks framework. In the future, Lens Studio will also run directly in the browser and on mobile devices, and a collaborative interface with chat and visual editing is intended to facilitate teamwork.
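What such generated code might look like in practice: the following sketch, assuming Lens Studio's TypeScript component API (a BaseScriptComponent subclass with @component and @input decorators), shows a plausible result for a prompt like "rotate the target object by 45 degrees whenever the user taps". Both the prompt and the component are hypothetical illustrations, not output demonstrated by Snap.

```typescript
// Hypothetical result for the prompt:
// "rotate the target object by 45 degrees whenever the user taps"
@component
export class RotateOnTap extends BaseScriptComponent {
  // Scene object to rotate, assigned in the Lens Studio inspector.
  @input
  target!: SceneObject;

  onAwake() {
    // TapEvent fires whenever the user taps while the Lens is active.
    this.createEvent("TapEvent").bind(() => {
      const transform = this.target.getTransform();
      // 45 degrees (pi/4 radians) around the vertical axis.
      const step = quat.fromEulerAngles(0, Math.PI / 4, 0);
      transform.setLocalRotation(transform.getLocalRotation().multiply(step));
    });
  }
}
```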


In addition, two new modules are intended to enable the creation of more realistic content: "Realistic StyleGen" improves lighting, textures, and materials, while "Enhanced FaceGen" allows more precise facial adjustments, for example for avatars or digital characters. Both tools work in real time and are intended to accelerate development. Also new is "AI Clips", a tool that automatically creates short videos from images; these clips can then be edited or combined with existing content.

(joe)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.