External radar sensor to increase safety in autonomous driving
An externally mounted radar sensor on streetlights or traffic lights is intended to help self-driving cars get a better view of the traffic situation.
The cutaway EyeDAR sensor provides insight into the complex lens structure.
(Image: Jared Jones/Rice University)
A research team at Rice University has developed EyeDAR, an external radar sensor that can be attached to traffic lights and streetlights to send information about the current traffic situation back to autonomous vehicles. EyeDAR is intended to give self-driving cars a better view of traffic and thus increase their driving safety.
Traditionally, self-driving vehicles use internal sensors such as cameras, lidar, or radar systems to capture the traffic situation in real time and act autonomously. However, this does not always work well: other road users can obscure parts of the environment, fog, rain, or darkness can impair visibility, and the detection systems may fail to capture a traffic situation quickly and completely enough. The researchers at Rice University therefore looked for a way to supplement internal sensors with external ones that capture traffic situations from an elevated vantage point.
To this end, the researchers developed a millimeter-wave radar sensor about the size of an orange, as the scientists write in a statement. The study “EyeDAR: A Low-Power mmWave Tag that Senses and Communicates 3D Point Clouds to Enhance Radar Perception” was presented at the end of February at HotMobile, the International Workshop on Mobile Computing Systems and Applications. The sensor picks up the radar signals emitted by the vehicle and reflects them back to it from its better vantage point.
Additional “eyes” for autonomous vehicles
EyeDAR essentially consists of two components: a spherical Luneburg lens 3D-printed from resin and an antenna array on its back. The scientists were inspired by the human eye: the lens focuses incoming signals from any direction onto a focal point on the opposite surface, and the antenna array on the back acts like a retina, capturing the signal and its direction. This design has a crucial advantage over conventional radar systems, which rely on large antenna arrays and complex algorithms to determine the angle of arrival: EyeDAR does not need the energy-intensive computation these require.
Instead, EyeDAR relies on more than 8,000 uniquely shaped, tiny units with different refractive indices, arranged to direct incoming radar signals to the exact position of the antenna array. According to the scientists, the system can thus determine the target direction up to 200 times faster than conventional radar systems.
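The principle described above can be sketched in a few lines of code: an ideal Luneburg lens has a graded refractive index that focuses any incoming plane wave onto the antipodal point of its surface, so angle-of-arrival estimation collapses to finding which rear-mounted antenna element sees the peak signal. The following 2D simulation is a minimal illustration under that idealization; the radius, the 1-degree antenna spacing, and all function names are hypothetical and simplify the roughly 8,000 graded resin cells of the real sensor.

```python
import math

R = 1.0  # normalized lens radius (illustrative value)


def luneburg_index(r):
    """Refractive index profile of an ideal Luneburg lens:
    sqrt(2) at the center, 1 at the surface. This gradient is
    what steers any plane wave onto the opposite surface point."""
    return math.sqrt(2.0 - (r / R) ** 2)


def focal_point(azimuth_deg):
    """A plane wave arriving from azimuth theta focuses on the
    point of the sphere opposite the source direction (2D)."""
    theta = math.radians(azimuth_deg)
    return (-R * math.cos(theta), -R * math.sin(theta))


def estimate_direction(antenna_angles_deg, incoming_deg):
    """Angle-of-arrival estimation as a lookup: each antenna
    element at angle a sits at (-R cos a, -R sin a) on the back
    of the lens; pick the element closest to the focal point and
    read off the direction it corresponds to."""
    fx, fy = focal_point(incoming_deg)
    return min(
        antenna_angles_deg,
        key=lambda a: (fx + R * math.cos(math.radians(a))) ** 2
                    + (fy + R * math.sin(math.radians(a))) ** 2,
    )


# A 1-degree-spaced antenna "retina" covering the sphere
antennas = list(range(0, 360, 1))
print(estimate_direction(antennas, 42))  # recovers 42 degrees
```

The speedup the researchers claim follows from this structure: the lens performs the beamforming optically, so the electronics only need to find a peak rather than run the digital angle-estimation algorithms of a conventional phased array.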
The radar reflections are sent back to the transmitting radar device in the vehicle. The EyeDAR sensor thus combines sensing and communication in one. This makes the system compact, cost-effective, and energy-efficient. The scientists believe that the system could be particularly useful on roads in areas with high traffic density to make the use of autonomous vehicles safer. In principle, EyeDAR could also be used in other autonomously operating systems, such as robots or drones.
(olb)