Realbotix: AI vision system makes humanoid robots more realistic

There is something uncanny about humanoid robots that look almost, but not quite, human. To counteract this, Realbotix builds an AI vision system into its robots.


A Realbotix robot looks very similar to the human original.

(Image: Realbotix)


Realbotix, developer of hyperrealistic humanoid robots, has built an AI-based visual recognition system for humanoid robots that can recognize faces and objects, track faces, and interpret scenes in real time. This is intended to let the robots react more accurately to their environment, interact more naturally with humans, and be perceived as more human-like.

The Robotic AI Vision System from Realbotix can detect the presence of humans. Using facial recognition, the system infers the presumed emotional state of the person it is facing and adapts its reactions accordingly in a natural way. The developers hope this will reduce the “uncanny valley” effect that artificial humans can trigger in real people. The effect, in which human-looking robots lose acceptance because they come across as creepy, was described by the Japanese roboticist Masahiro Mori in 1970.
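The emotion-adaptive behavior described above can be sketched in a few lines. This is a minimal illustration only: the `Perception` type, the emotion labels, and the style mapping are assumptions for the sake of the example, since Realbotix has not published its software interface.

```python
# Minimal sketch of emotion-adaptive response selection. The classifier
# itself is out of scope here; we assume it yields a label per frame.
from dataclasses import dataclass

@dataclass
class Perception:
    face_present: bool
    emotion: str  # hypothetical labels, e.g. "happy", "sad", "neutral"

# Hypothetical mapping from a perceived emotion to a conversational style.
RESPONSE_STYLE = {
    "happy": "upbeat",
    "sad": "empathetic",
    "neutral": "informative",
}

def choose_style(p: Perception) -> str:
    """Pick a conversational style from the perceived emotional state."""
    if not p.face_present:
        return "idle"  # no one to talk to
    return RESPONSE_STYLE.get(p.emotion, "informative")

print(choose_style(Perception(face_present=True, emotion="sad")))  # empathetic
```

The point of the lookup-table structure is that perception and response policy stay decoupled: a better emotion classifier can be swapped in without touching the dialogue side.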

The more natural and human-like a robot appears to a person, the more likely it is to be perceived and accepted as a social being. This opens the door to applications beyond industrial use, for example in healthcare, customer service, or as a social companion robot – that is, wherever interaction with people is the focus.


However, the visual recognition system developed by Realbotix can not only recognize and interpret human facial expressions. It can also detect objects and scenes in real time – that is, understand entire situations. The system uses cloud-based multimodal AI to keep its understanding of events up to date, can identify objects, people, and behaviors, and thus adaptively generates context-aware responses. The system is also designed to let the robot remember previous interactions with people.
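A rolling memory of detections and interactions, as described above, could look roughly like the following. The class and method names are illustrative assumptions, not Realbotix's actual API; the cloud analysis call is represented only by the event dictionaries fed into it.

```python
# Sketch of interaction memory over scene-analysis results. Each event dict
# stands in for the output of a (hypothetical) cloud multimodal-AI call.
from collections import deque

class SceneMemory:
    """Rolling memory of recent detections and interactions."""

    def __init__(self, maxlen: int = 100) -> None:
        # A bounded deque drops the oldest events automatically.
        self.events: deque = deque(maxlen=maxlen)

    def remember(self, event: dict) -> None:
        """Store one detection result (objects, people, behaviors)."""
        self.events.append(event)

    def recall(self, person: str) -> list:
        """Return all remembered events involving a given person."""
        return [e for e in self.events if e.get("person") == person]

memory = SceneMemory()
memory.remember({"person": "Alice", "behavior": "waving", "objects": ["cup"]})
memory.remember({"person": "Bob", "behavior": "reading"})
print(len(memory.recall("Alice")))  # 1
```

The bounded buffer is one plausible design choice: it keeps recent context available for conversation without letting the memory grow without limit.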

The system achieves natural-language conversation with a humanoid robot by combining real-time vision processing with large language models (LLMs). This should enable the robot to hold more intelligent, contextually nuanced conversations.
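One common way to wire vision output into a language model is to fold the current scene description into the model's prompt context. The sketch below illustrates that general pattern under stated assumptions; it is not Realbotix's implementation, and the scene-dictionary format is invented for the example.

```python
# Hedged sketch: inject the vision system's scene description into the
# context that a large language model sees before answering.
def build_prompt(scene: dict, user_utterance: str) -> str:
    """Combine the current visual context with the user's words."""
    objects = ", ".join(scene.get("objects", []))
    context = f"You can currently see: {objects or 'nothing notable'}."
    return f"{context}\nUser says: {user_utterance}\nRespond naturally:"

prompt = build_prompt({"objects": ["cup", "book"]}, "What's on the table?")
print(prompt)
```

Because the scene description is refreshed on every turn, the model can answer questions such as "What's on the table?" from what the robot currently sees rather than from training data alone.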


Another component of the Robotic AI Vision System is a realistic eyeball control system that moves the robot's lifelike eyes to match speech and scene. The Realbotix Robotic AI Vision System is already available in new Realbotix robots, and existing models can be retrofitted with it. The first robots with the AI vision system are due to be delivered in 2025.
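Pointing the eyes at a detected face reduces to simple geometry once the vision system provides a 3D face position. The following is a generic sketch of that calculation, not the Realbotix control system; the coordinate convention (x right, y up, z forward, in metres) is an assumption.

```python
# Generic gaze-targeting geometry: yaw/pitch angles (in degrees) that
# point the eyes at a face position relative to the eye midpoint.
import math

def gaze_angles(face_x: float, face_y: float, face_z: float) -> tuple:
    """Yaw and pitch to look at (face_x, face_y, face_z), z = forward."""
    yaw = math.degrees(math.atan2(face_x, face_z))    # left/right
    pitch = math.degrees(math.atan2(face_y, face_z))  # up/down
    return yaw, pitch

# Face half a metre up and to the left, at one metre distance:
yaw, pitch = gaze_angles(-0.5, 0.5, 1.0)
```

A real controller would additionally smooth these targets over time and add small saccades and blinks so the motion reads as natural rather than servo-like.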

(olb)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.