SonicSense: robotic hand recognizes materials and object shapes acoustically

To identify a material, people tap on an object and infer from the sound what it might be made of. A robotic hand can do this and more.


The artificial hand attached to a robotic arm recognizes object shapes and materials acoustically.

(Image: Duke University)


Scientists at Duke University have developed a four-fingered robotic hand that can identify the material and shape of an object from the vibrations produced when it touches it. The system, called SonicSense, could significantly expand the perceptual capabilities of robots.

When people are unsure what material an object is made of, they tap on it and deduce from the acoustic feedback what it could be. In this way, materials such as wood and plastic can be told apart. Humans interpret these acoustic vibrations without even thinking about it.

A research team at Duke University has replicated this human perceptual ability with the SonicSense system. The scientists describe the system and their results in the study "SonicSense: Object Perception from In-Hand Acoustic Vibration", which has been published as a preprint on arXiv. They plan to present their research to a wider audience at the Conference on Robot Learning (CoRL 2024), which will take place in Munich from November 6 to 9.

The system essentially consists of a four-fingered robotic hand attached to a robotic arm. Contact microphones built into the fingertips pick up the vibrations that occur when the hand touches an object. A simple touch is enough; interactions such as shaking the object can further improve the accuracy of the recording. Contact microphones were chosen because they suppress ambient noise better than conventional microphones.
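The paper leaves the capture details to the hardware. As a minimal sketch of what the recording step could look like, the following Python snippet reads one multi-channel clip from the fingertip microphones via the sounddevice library; channel count, sample rate, and clip length are illustrative assumptions, not values from the study.

    # Minimal sketch: record one tap/shake interaction from four contact
    # microphones exposed as a single 4-channel audio interface.
    # FS, CHANNELS, and DURATION are assumptions, not values from the paper.
    import sounddevice as sd

    FS = 48_000      # sample rate in Hz (assumed)
    CHANNELS = 4     # one contact microphone per fingertip
    DURATION = 2.0   # seconds of vibration captured per interaction

    def record_interaction():
        """Record one interaction; returns a (samples, 4) float32 array."""
        clip = sd.rec(int(DURATION * FS), samplerate=FS,
                      channels=CHANNELS, dtype="float32")
        sd.wait()    # block until the recording buffer is full
        return clip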


The recorded acoustic vibrations are interpreted by the artificial intelligence (AI) of the SonicSense system. It first extracts characteristic frequency features from the signals; a trained AI model then evaluates these features and assigns a material to them. According to the scientists, the shape of an object can also be recognized: at least four interactions with an object are required to match it to a shape already stored in an internal database, while determining the shape of an unknown object takes 20 interactions, the researchers write.
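The study describes this pipeline only at a high level. As an illustration of the general idea, rather than the authors' actual model, the following sketch turns a recorded clip into a log-magnitude frequency feature vector and trains a stock scikit-learn classifier on it; the feature design, classifier choice, and training data are all placeholder assumptions.

    # Illustrative two-stage pipeline: frequency features -> trained
    # classifier -> material label. NOT the authors' model; features,
    # classifier, and data are assumptions for illustration only.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    FS = 48_000

    def frequency_features(clip, n_bins=128):
        """Pool the average log-magnitude spectrum of a (samples, 4) clip
        into a fixed number of coarse frequency bins."""
        spectrum = np.abs(np.fft.rfft(clip, axis=0))  # per-channel FFT
        log_mag = np.log1p(spectrum).mean(axis=1)     # average the channels
        edges = np.linspace(0, len(log_mag), n_bins + 1, dtype=int)
        return np.array([log_mag[a:b].mean()
                         for a, b in zip(edges[:-1], edges[1:])])

    # Placeholder training set: in practice, labeled recordings of taps
    # on known materials, one (samples, 4) clip per interaction.
    rng = np.random.default_rng(0)
    clips = [rng.standard_normal((2 * FS, 4)) for _ in range(40)]
    labels = ["wood", "plastic"] * 20

    X = np.stack([frequency_features(c) for c in clips])
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)

    # Each new interaction then yields one material prediction.
    new_clip = rng.standard_normal((2 * FS, 4))
    print(clf.predict(frequency_features(new_clip)[None, :])[0])

The random forest here merely stands in for the trained network mentioned in the article; any classifier that maps fixed-length feature vectors to labels would fit the same slot.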

"SonicSense gives robots a new way to hear and feel, similar to humans, which may change the way today's robots perceive and interact with objects," says Boyuan Chen, professor of mechanical engineering and materials science at Duke University. "Sight is essential, but sound provides additional information that can escape the eye."

The scientists tested SonicSense in the laboratory by having the robotic hand shake a box containing several dice. The system was able to recognize how many dice were in the box. The researchers achieved a similar result with a bottle of liquid: by shaking it, SonicSense could determine how much liquid the bottle contained. The system was also able to recognize objects and their materials.
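The article does not say how the counting works internally. One plausible reading, sketched here purely as an assumption, is that the number of dice is treated as just another class label over shake recordings, reusing the frequency_features() helper and FS constant from the sketch above.

    # Hedged sketch: "number of dice" as a class label over shake clips.
    # Reuses frequency_features(), FS, and numpy from the previous sketch;
    # the recordings and counts below are synthetic placeholders.
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)
    shake_clips = [rng.standard_normal((2 * FS, 4)) for _ in range(30)]
    dice_counts = [1, 2, 3] * 10   # label = how many dice are in the box

    Xs = np.stack([frequency_features(c) for c in shake_clips])
    counter = KNeighborsClassifier(n_neighbors=3).fit(Xs, dice_counts)
    print(counter.predict(frequency_features(shake_clips[0])[None, :])[0])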

The scientists kept the cost of SonicSense low by using commercially available components. The contact microphones in the fingertips, for example, are similar to those musicians use to record guitars. They also relied on off-the-shelf parts and 3D-printed components. Overall, this brought the cost down to just over 200 US dollars.

The researchers want to improve SonicSense further, for instance by integrating additional sensors for pressure and temperature. The goal is a robotic hand with dexterous manipulation capabilities that allows robots to perform tasks requiring a precise sense of touch.

(olb)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.