Seeing like a human: event camera system simulates the human eye

Micro-movements of the human eye ensure sharp vision and help identify moving objects. Researchers have now transferred this mechanism to a camera.

A team of scientists at the University of Maryland has developed a camera, the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), that enables robots to perceive their environment in a way similar to humans. The camera replicates the natural, involuntary micro-movements of the human eye, which keep vision sharp and blur-free and allow moving objects to be tracked over longer periods.

"Event cameras are a relatively new technology that tracks moving objects better than conventional cameras, but today's event cameras struggle to capture sharp, blur-free images when there is a lot of motion involved," says Botao He, lead author of the study and a PhD student at the University of Maryland.

The problem: robots and autonomous vehicles, for example, rely on sharp, blur-free images in real time in order to react promptly to changes. The researchers therefore developed a camera system that improves on existing event-camera technology. They have summarized their results in the study "Microsaccade-inspired event camera for robotics", published in Science Robotics.

The scientists first investigated which mechanisms in the human eye allow moving objects to be tracked in focus. They identified microsaccades: small, very fast, involuntary eye movements that occur whenever a person fixes their gaze. These tiny, continuous movements make it possible to follow a moving object while accurately perceiving features such as color, depth, shadow and shading.

From this, the scientists concluded that a camera system must imitate such movements to minimize motion-related blur. The researchers took an existing event camera and equipped it with a rotating prism that redirects the light captured by the lens, simulating the eye's micro-movements. To turn the shifting light back into stable images, they developed accompanying compensation software.
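The compensation principle can be illustrated with a small numerical sketch. The following Python snippet is not the authors' software; it simply assumes a rotating wedge prism that shifts the image along a circular path of known radius and rotation rate, and shows how subtracting that known offset from each event's pixel coordinates realigns events from a static scene. All names and parameter values are illustrative.

```python
import numpy as np

# Assumed, illustrative parameters (not from the paper):
PRISM_RADIUS_PX = 5.0   # image shift caused by the prism, in pixels
ROTATION_HZ = 100.0     # prism rotation rate

def compensate_events(events):
    """events: array of rows (x, y, t, polarity).
    Returns prism-compensated (x, y) coordinates."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    phase = 2.0 * np.pi * ROTATION_HZ * t        # prism angle at each timestamp
    dx = PRISM_RADIUS_PX * np.cos(phase)         # circular offset induced by prism
    dy = PRISM_RADIUS_PX * np.sin(phase)
    return np.stack([x - dx, y - dy], axis=1)    # undo the known offset

# Example: events from a static edge, seen through the rotating prism,
# collapse back onto the same pixel after compensation.
t = np.linspace(0.0, 0.02, 5)
raw = np.stack([
    100 + PRISM_RADIUS_PX * np.cos(2 * np.pi * ROTATION_HZ * t),
    50 + PRISM_RADIUS_PX * np.sin(2 * np.pi * ROTATION_HZ * t),
    t,
    np.ones_like(t),
], axis=1)
print(compensate_events(raw))  # each row is approximately (100, 50)
```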

In tests, the AMI-EV captured fast movements at tens of thousands of frames per second, outperforming conventional cameras, which on average capture up to 1,000 frames per second. The modified event camera also identified fast-moving objects more accurately, the scientists write.

The researchers see a wide range of applications for the AMI-EV. It can, for example, help robots and autonomous vehicles detect and identify moving objects quickly. The camera also performs better in extreme lighting conditions and has lower latency and power consumption, making it well suited to virtual reality (VR) applications that involve rapid head and body movements. The technology could even prove useful in ordinary smartphone cameras.

(olb)