Vibe Coding on the Nose: Developer Controls OpenClaw via Smart Glasses
A developer gives the AI agent OpenClaw programming instructions via his smart glasses and, thanks to the built-in display, always keeps an eye on its progress.
Meta Ray-Ban Display: Meta's first smart glasses with integrated display.
(Image: Meta)
Vibe Coding means having a large language model generate the source code, with the human contribution largely limited to formulating instructions. Accordingly, Vibe Coding does not necessarily require any programming knowledge.
Programmer and smart glasses enthusiast Jake Ledner demonstrates in a video what Vibe Coding on smart glasses could look like. In it, he walks through New York while giving the AI agent OpenClaw voice instructions via his Meta Ray-Ban Display and the WhatsApp app installed on it. The chat history appears directly in his field of vision, allowing him to track progress via screenshots that OpenClaw sends back. The AI agent runs on a Mac Studio in Ledner's apartment and uses OpenAI's coding tool Codex to write the program code.
In the video, Ledner has OpenClaw add a function for saving frequent meals to his AI calorie-counter app "TrackGPT". After his approval, the agent implements the code automatically, and the new "Save" button goes live in the app.
"Basically, anyone can develop apps from anywhere today with OpenClaw, OpenAI Codex, and Meta Ray-Ban Smart Glasses," says Ledner in the video, which he initially published on X, but is now also available on LinkedIn.
Security risks and platform limitations slow down adoption
Of course, it's not quite as easy as Ledner suggests: OpenClaw currently carries significant security risks and ideally requires a dedicated machine, which is unlikely to be practical for many users. Moreover, the Meta Ray-Ban Display is currently only available in the USA, where it launched in September 2025 for around 800 US dollars. Whether and when an international launch will follow remains open. Meta may be waiting for the second generation of the wearable, which, according to a recent report, could be released later this year.
Another limitation of the Meta Ray-Ban Display is that third-party developers currently have no access to the display. For this reason, Ledner uses WhatsApp to communicate with OpenClaw: Meta's messenger is one of the few apps currently supported on the smart glasses.
However, the lack of display access could change soon: Oscar Falmer, who is responsible for developer relations for wearables at Meta, reacted to the video and promised display access for third-party developers this year. In the future, this could enable apps that directly tap into OpenClaw or a comparable future AI feature from OpenAI; OpenClaw's creator Peter Steinberger was recently hired by the company.
Independently of Meta, more smart glasses with displays and features similar to the Meta Ray-Ban Display are set to be released. Google, for example, plans to introduce a corresponding product later this year.
(nie)