Meta turns the wrist into a computer interface
Meta wants to enhance human-computer interaction with an EMG wristband. In a research paper, Meta provides interesting new insights into the technology.
Meta's sEMG research bracelet.
(Image: Meta)
Meta published the research in the journal Nature, where it describes what the wristband can do: it recognizes pinch, tap, and swipe gestures performed freely in the air, lets a one-dimensional cursor be controlled by wrist movements, and enables words to be written by tracing letters with the index finger on a solid surface, such as a table or the user's leg.
The inside of the wristband is fitted with 48 dry electrodes that record electrical muscle signals at the wrist. An AI model trained by Meta analyses these signals, recognizes individual patterns, and translates them into the computer commands described above. Unlike needle-based electromyography (EMG), in which electrodes are inserted through the skin, the wristband picks up the signals non-invasively from the surface of the skin, a technique known as surface electromyography (sEMG). Despite the weaker signals this entails, the system achieves a remarkably high bandwidth when decoding muscle activity.
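To illustrate the principle, here is a minimal Python sketch of how windowed sEMG signals could be turned into gesture commands. Only the 48-channel electrode count is taken from the article; the sampling rate, window length, RMS features, synthetic data, and nearest-centroid classifier are illustrative assumptions, not Meta's actual model.

```python
# Minimal sketch of an sEMG gesture decoder on synthetic data.
# Only the 48-channel count comes from the article; everything else is assumed.
import numpy as np

N_CHANNELS = 48          # dry electrodes on the wristband (per the paper)
WINDOW = 200             # samples per decoding window, assumed
GESTURES = ["pinch", "tap", "swipe"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel: a classic sEMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=1))

def make_synthetic_gesture(label_idx: int, rng: np.random.Generator) -> np.ndarray:
    """Fake a (channels x samples) sEMG burst whose spatial pattern depends on the gesture."""
    base = rng.normal(0.0, 0.05, size=(N_CHANNELS, WINDOW))   # background noise
    active = slice(label_idx * 16, (label_idx + 1) * 16)      # different muscle group per gesture
    base[active] += rng.normal(0.0, 0.5, size=(16, WINDOW))   # stronger activity there
    return base

rng = np.random.default_rng(0)

# "Train": average feature vector (centroid) per gesture.
centroids = {
    g: np.mean([rms_features(make_synthetic_gesture(i, rng)) for _ in range(50)], axis=0)
    for i, g in enumerate(GESTURES)
}

def decode(window: np.ndarray) -> str:
    """Map a raw sEMG window to the nearest gesture centroid."""
    feats = rms_features(window)
    return min(centroids, key=lambda g: np.linalg.norm(feats - centroids[g]))

# Decode a fresh, unseen "swipe" burst.
print(decode(make_synthetic_gesture(GESTURES.index("swipe"), rng)))
```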
(Image: Meta)
Meta trained the AI model with data from over 6,500 test subjects and thus developed what it claims is the first neuromotor interface that works reliably for a broad section of the population without individual training or calibration.
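The claim of working without individual calibration implies that the model is evaluated on people it has never seen during training. The sketch below shows such a user-level split; the dataset layout and the 80/20 ratio are assumptions for illustration, not details from the paper.

```python
# Sketch of a cross-user evaluation split, assuming a dataset keyed by user id.
# Test users are never seen in training, so accuracy measures calibration-free
# generalization. The split ratio is an assumption.
import numpy as np

rng = np.random.default_rng(1)
n_users = 6500                              # order of magnitude from the article
user_ids = np.arange(n_users)
rng.shuffle(user_ids)

split = int(0.8 * n_users)
train_users = set(user_ids[:split])         # the model is fit on these people only
test_users = set(user_ids[split:])          # evaluation: entirely unseen people

def partition(recording_user: int) -> str:
    """Route a recording by the person it came from, so no test user leaks into training."""
    return "train" if recording_user in train_users else "test"

print(partition(int(user_ids[0])), partition(int(user_ids[-1])))
```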
Meta's bracelet: market launch in sight
Test subjects used the system to write around 21 words per minute (median) by tracing letters on a surface. According to the research paper, this rate could be increased further with personalized models. Nevertheless, the typing speed still lags well behind smartphone on-screen keyboards (approx. 30–40 WPM) and physical computer keyboards (approx. 40–50 WPM).
Meta developed the wristband with the aim of operating AR glasses without additional input devices such as touch displays, keyboards, and controllers. The integrated handwriting function would be suitable for composing text messages, for example. By recognizing subtle pinch, tap, and swipe gestures, the wristband also offers an inconspicuous alternative to voice control or gesture recognition via optical hand tracking, both of which are likely to see little use in public. With the wristband, micro gestures suffice, and the hand can stay in a trouser pocket.
(Image: Meta)
The history of the wristband goes back a long way and builds on technology from the start-up CTRL Labs. Almost six years after Meta acquired the start-up, commercialization of the wristband is within reach: in 2024, the company demonstrated a highly advanced product prototype in combination with its Orion AR glasses, and, if a Bloomberg report is to be believed, the wristband could appear as early as 2025 to control new smart glasses with a display.
An input method that learns from humans
Irrespective of this, the combination of sEMG technology and artificial intelligence has great future potential for human-computer interaction. The hope is that input methods will adapt to humans in the future and no longer the other way around. AI models capable of learning could recognize and adopt individual control patterns. For example, the way a microwave is operated could be automatically tailored to an individual's personal operating logic without the appliance having to be specially programmed.
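One simple way to picture such adaptation: start from a population-level decoder and nudge its internal reference patterns toward a user's own examples. The sketch below does this with a plain exponential moving average over toy centroids like those in the earlier example; it is an illustration under assumed parameters, not Meta's personalization method.

```python
# Hedged sketch of per-user adaptation: a generic decoder's class centroid is
# nudged toward one person's own examples via an exponential moving average.
# All data here is synthetic; the adaptation rate is an assumption.
import numpy as np

rng = np.random.default_rng(2)
n_features = 48                                   # one feature per electrode, as above

# Generic (population-level) centroid for one gesture, e.g. "pinch".
generic_pinch = rng.normal(0.2, 0.05, size=n_features)

# This user's pinch looks systematically different from the population average.
user_samples = rng.normal(0.35, 0.05, size=(20, n_features))

alpha = 0.1                                       # adaptation rate, assumed
centroid = generic_pinch.copy()
for sample in user_samples:                       # each confirmed example pulls the centroid over
    centroid = (1 - alpha) * centroid + alpha * sample

print("average shift toward the user:", float(np.mean(centroid - generic_pinch)))
```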
The technology could also prove useful in therapy. In a study supported by Meta, a stroke patient was able to use an sEMG bracelet to move all the fingers of a virtual hand, even though he could no longer extend his physical fingers. The bracelet picked up the faintest residual muscle activity at the wrist and converted it into computer commands. In another study, people with severe movement restrictions due to muscle atrophy and spinal cord injuries developed individual gestures for typical computer interactions.
The research work is freely accessible on the Internet.
(mho)