“Doom” now also runs on Meta Ray-Ban Smart Glasses
Meta has opened up the Meta Ray-Ban Display for developers. One of the first apps tested is a port of the classic game "Doom".
Meta Ray-Ban Display
(Image: Meta)
Seven months after the release of the Meta Ray-Ban Display, developers can now build their first apps for Meta's first smart glasses with an integrated display. Developer Timur Abdrakhimov (LinkedIn) demonstrates the new possibilities with a port of the first-person shooter “Doom”.
The game appears on the smart glasses' waveguide display, which has a diagonal field of view of 20 degrees. It is controlled via finger movements, which the Meta Neural Band translates into computer commands. The Meta Ray-Ban Display is visible only to the right eye, which makes the glasses poorly suited to long gaming sessions. Nevertheless, the experiment is interesting as a proof of concept, especially since “Doom” has previously been made to run on calculators, lawnmowers, and even electric toothbrushes.
The Meta Ray-Ban Display is currently only available in the USA due to supply bottlenecks. In the EU, regulations on batteries and AI also complicate market entry. According to a report, Meta plans a second generation of smart glasses for this year. Meta might reserve the worldwide launch for this successor model.
Two paths to apps for the Meta Ray-Ban Display
Developers have two approaches for developing apps for the Meta Ray-Ban Display. One is the Meta Wearables Device Access Toolkit, an SDK for iOS and Android that Meta has been offering for its display-less glasses since the end of last year and has now expanded with display functions. For the first time, developers can extend existing smartphone apps to the glasses' display and show elements such as text, images, or video playback there. Development is done with Swift for iOS and Kotlin for Android.
A second option is the new “Web Apps”. Developers can build and test these standalone applications with HTML, CSS, and JavaScript in the browser, then launch them on the glasses via a URL. They have access to motion and orientation data, GPS data from the smartphone, input from the Meta Neural Band, and local storage. Meta sees Web Apps primarily as a vehicle for rapid prototyping and lean applications; Abdrakhimov's “Doom” port is also based on this approach. More early experiments with the developer tools can be found in a subreddit set up by Meta for this purpose.
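The article does not document the Web App API surface, but since these apps are plain HTML/CSS/JavaScript, it is plausible that the glasses' runtime exposes the standard web APIs for the capabilities listed above (orientation data, local storage). A minimal sketch under that assumption, with illustrative event handling and thresholds:

```javascript
// Hypothetical Web App sketch for the Meta Ray-Ban Display developer preview.
// Assumption: the glasses' browser runtime supports the standard
// DeviceOrientationEvent and Web Storage APIs mentioned in the text.

// Pure helper: map a head-yaw angle (degrees) onto one of `itemCount`
// equally sized menu sectors around the full circle.
function yawToMenuIndex(yawDegrees, itemCount) {
  // Normalize the angle to [0, 360), handling negative values.
  const normalized = ((yawDegrees % 360) + 360) % 360;
  return Math.floor(normalized / (360 / itemCount)) % itemCount;
}

// Browser wiring (skipped when no window object exists, e.g. under Node).
if (typeof window !== "undefined") {
  window.addEventListener("deviceorientation", (event) => {
    // event.alpha is the device's yaw in degrees (may be null).
    const index = yawToMenuIndex(event.alpha ?? 0, 4);
    // Persist the last selection locally between sessions.
    localStorage.setItem("lastMenuIndex", String(index));
  });
}
```

The pure mapping function keeps the input logic testable outside the glasses, which fits Meta's framing of Web Apps as a prototyping path: iterate in a desktop browser first, then load the same URL on the device.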
Both development paths are initially only available as a Developer Preview, meaning that developers can build and test their applications but cannot yet distribute them to end-users. More information can be found on Meta's developer page for wearables.
Meta positions itself ahead of the new smart glasses competition
Meta also announced that the gesture typing introduced in January for testers is now available to all users. It works in Instagram, WhatsApp, and Messenger, as well as in native messaging apps on Android and iOS, among others. The recording function announced in March, which combines the display and camera image in a video, will also become more widely available. Visual pedestrian navigation will be expanded across the entire USA and is expected to work in major international cities such as London, Paris, and Rome. Live captions that transcribe spoken language during conversations or phone calls will also be available for WhatsApp, Facebook Messenger, and Instagram Direct. Meta's new AI model Muse Spark is scheduled to be released for the Meta Ray-Ban Display this summer.
The timing of the developer announcements is likely no coincidence. Google could announce its first smart glasses based on Android XR at the Google I/O developer conference as early as next week. Apple is also pursuing smart glasses plans and could provide initial hints at WWDC in June. Furthermore, Snap is expected to present its first AR glasses for consumers this year, with the announcement likely not far off.
As smart glasses become more widespread, so do the controversies surrounding them. Meta has recently faced criticism several times, for example over alleged facial recognition plans and over intimate videos from the glasses that ended up with clickworkers for data annotation. In addition, there is concern that smart glasses facilitate covert recording in public spaces.
(nie)