"Vibe Coding XR": Gemini generates VR apps in under a minute

With Vibe Coding XR, Google presents an AI workflow that allows VR apps for Android XR to be generated and tested directly in the browser from prompts.


A collage of AI-generated VR prototypes.

(Image: Google)


Google's internal research unit has presented an AI workflow intended to help with prototyping immersive applications. Users describe, by text prompt or spoken input, what they want to create, and Gemini generates a functional, physics-based WebXR application for Android XR from it in under a minute. With a compatible headset, input, generation, and subsequent testing and iteration all take place in the browser.

The workflow relies on the AI tool Gemini Canvas and the open-source framework XR Blocks, developed by Google Research. While Gemini, according to Google, takes on the role of an XR designer and follows best practices for XR development, XR Blocks provides the building blocks and infrastructure that turn generated code into runnable XR applications. The workflow thus enables developers and laypeople to quickly and easily experiment with new user interfaces, 3D interactions, and spatial visualizations.


Google Research illustrates the approach with several examples: a math tutor generated via prompt visualizes Euler's polyhedron formula using three-dimensional bodies, a physics lab allows balancing weights on a scale, and a chemistry scenario simulates various combustion processes. The examples combine spatial interaction, physical behavior, and visual representation, showing the types of XR prototypes that can be implemented with the workflow.
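Google has not published the generated code, but the balance-scale example ultimately comes down to simple torque arithmetic. The following is a minimal sketch, in plain JavaScript, of the kind of physics logic such a prototype would encode; all names are illustrative and not taken from XR Blocks:

```javascript
// Hypothetical sketch of a balance-scale prototype's core physics.
// Each weight is { mass: kilograms, position: signed meters from the pivot }.

const G = 9.81; // gravitational acceleration, m/s^2

// Net torque about the pivot: positive tips the beam to the right.
function netTorque(weights) {
  return weights.reduce((sum, w) => sum + w.mass * G * w.position, 0);
}

// Which way the beam tips, with a tolerance for floating-point noise.
function tilt(weights, epsilon = 1e-6) {
  const t = netTorque(weights);
  if (Math.abs(t) < epsilon) return "balanced";
  return t > 0 ? "right" : "left";
}

// 2 kg at 0.3 m right balances 3 kg at 0.2 m left (equal torques).
console.log(tilt([{ mass: 2, position: 0.3 }, { mass: 3, position: -0.2 }]));
// -> "balanced"
```

In a real WebXR scene, a function like `tilt` would drive the beam's rotation each animation frame; the point here is only that the simulated behavior is ordinary lever physics.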


However, there is a major drawback: the workflow is apparently limited to the Android XR operating system and compatible headsets. Currently, there is only one device of this kind: the Samsung Galaxy XR, which is only available in the US and South Korea so far. However, it is expected that more devices based on Android XR will appear in the future, including Project Aura. Until then, users can also create applications on the desktop and test them using an integrated simulator, says Google.

While vibe coding, i.e., prompt-based programming, now covers many areas of software development, the specialized field of XR development is still largely unexplored and presents particular challenges for such approaches, for example in integrating spatial interaction and device sensors. Besides Google, Meta is the main company working on AI tools for XR development; at GDC 2026, it focused on agentic AI workflows for Unity and AI-assisted iteration for Horizon OS, among other things.

(mki)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.