Haply Robotics: haptic interface for robotics
At CES 2026, Haply Robotics presents a haptic input system with force and position feedback that is intended to replace game controllers in laboratories and industry.
With input devices like Haply Robotics' Inverse3, robot arms can be controlled precisely and intuitively. Direct force feedback makes what happens to a robot perceptible.
(Image: heise medien / André Kramer)
The Canadian company Haply Robotics develops haptic interfaces for device control that transfer position, movement, and forces in multiple degrees of freedom from the hand to electronics. In practice, Haply input devices allow for intuitive control of robot arms through natural movements.
The Inverse3 input device, suspended on two articulated arms, resolves up to 0.01 mm at a refresh rate of 4 kHz. Three motors for the X, Y, and Z axes provide direct force feedback to the user's hand. Thus, in addition to visual information, it conveys tactile data.
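The principle behind such force feedback is a fast render loop: at each tick the device reports the handle position and the controller writes back a force, for example from a virtual spring. The following sketch illustrates this, assuming a hypothetical device object with `read_position()` and `write_force()` methods; the article does not describe Haply's actual API, and the stiffness value is illustrative.

```python
import time

RATE_HZ = 4000           # the Inverse3's stated refresh rate
DT = 1.0 / RATE_HZ
STIFFNESS = 200.0        # N/m, illustrative virtual-spring stiffness

def spring_force(position_m, anchor_m):
    """Force of a virtual spring pulling the handle toward an anchor point."""
    return [STIFFNESS * (a - p) for p, a in zip(position_m, anchor_m)]

def haptic_loop(device, anchor_m=(0.0, 0.0, 0.0), steps=RATE_HZ):
    """One second of a minimal render loop: read position, write force.

    `device` stands in for a hypothetical driver object; the real
    Inverse3 interface may differ.
    """
    for _ in range(steps):
        pos = device.read_position()
        device.write_force(spring_force(pos, anchor_m))
        time.sleep(DT)
```

At 4 kHz the loop has 250 microseconds per tick, which is why haptic rendering is usually done in a dedicated real-time thread rather than alongside graphics updates.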
Feeling What the Robot Feels
In a technology demo, CES visitors control a ball on a flexible surface. When pressing on the virtual rubber mat, the Inverse3 responds with palpable counter-pressure; press hard enough, and the ball pops through the mat. Haply also demonstrates force feedback with various 3D surfaces: the device makes textures such as wood, stone, or sandpaper perceptible.
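The mat demo can be modeled as a simple piecewise force law: the surface pushes back like a spring until the press depth exceeds a break-through threshold, after which the resistance vanishes. A minimal sketch, with illustrative parameter values not taken from the article:

```python
def mat_force(depth_mm: float,
              stiffness: float = 0.5,
              pop_depth_mm: float = 8.0) -> float:
    """Resistance force (N) of a virtual rubber mat.

    The mat resists like a linear spring (force = stiffness * depth)
    until the press depth reaches pop_depth_mm; beyond that, the ball
    has popped through and the mat no longer pushes back.
    """
    if depth_mm < pop_depth_mm:
        return stiffness * depth_mm
    return 0.0
```

The sudden drop from maximum resistance to zero is exactly what the user feels as the "pop".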
The Haply device is useful in training surgeons and dentists, as well as in 3D design. In laboratories, industry, and medical training, it renders makeshift solutions such as game controllers obsolete. In robot teleoperation, a user can feel what the machine feels.
(Image: heise medien / André Kramer)
Training Platform for AI Models
In addition, data captured from sensitive human hands can be used to train far less dexterous robot grippers. Haply Robotics, in cooperation with Nvidia, has developed a system for physical AI training with haptic feedback. The project aims to help train robot systems with realistic data by incorporating tactile information into the learning process alongside visual and kinematic data.
At CES, the Canadian company is showing how it has integrated its Inverse3 input device into the Nvidia Isaac Sim simulation environment. The robot arm serves as a training platform. During data acquisition, the operator can feel the forces and resistances of virtual objects.
Haptic feedback is used as an additional channel for imitation learning. The acquired data is intended to enable AI models to generate interactions such as grasping, contact, or force exertion that are closer to real-world conditions than purely visual or position-based training data.
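For imitation learning, each recorded timestep then carries force data alongside the usual pose and camera information. A minimal sketch of such a multimodal record; the field names and the contact filter are illustrative assumptions, not Haply's or Nvidia's actual data format:

```python
from dataclasses import dataclass

@dataclass
class DemoSample:
    """One timestep of a haptically enriched demonstration (illustrative)."""
    timestamp_s: float
    position_mm: tuple   # end-effector position (x, y, z)
    force_n: tuple       # contact force on the operator's hand (x, y, z)
    image: bytes = b""   # synchronized camera frame, if any

def contact_steps(samples, threshold_n=0.1):
    """Keep only timesteps where a non-negligible contact force was felt.

    Such filtering could let a learner weight grasp and contact phases,
    which purely visual or position-based data cannot distinguish.
    """
    return [s for s in samples
            if max(abs(f) for f in s.force_n) > threshold_n]
```

The force channel is what distinguishes this from conventional teleoperation logs: two visually identical trajectories can differ in whether the gripper actually made contact.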
(Image: Haply Robotics)
GPU-Accelerated Simulation
The simulations run on Nvidia RTX 6000 Blackwell GPUs, which process the multimodal input data. Additionally, the Nvidia Cosmos platform is used to augment recorded training data with synthetic scenarios featuring different environments.
This allows Nvidia to expand its capabilities in the field of "physical AI", the training of robots to act in the physical world. Haptically enriched training is intended to enable robots to react more robustly to real-world environments.
heise online is a media partner of CES 2026.
(akr)