Dr. Martin Randler is Director of Sensor Technologies and Perception Systems at ZF.
Being able to assess everything happening around the vehicle accurately and reliably is a prerequisite for autonomous driving. This is why ZF has developed the first radar capable of generating a detailed spatial visualization of its environment.
Automated driving presents a number of technical challenges for our industry – from the immense computing power required to the mechatronic actuators. One of the biggest obstacles to overcome, however, is environment recognition. If a vehicle is to react correctly to every conceivable complex traffic situation, it needs a constant 360-degree view of its environment. Radar sensors are essential for this. Unlike cameras, radar systems work reliably even in poor visibility conditions such as oncoming headlight glare, rain, or fog. Until now, however, they have not been capable of visualizing their environment as a three-dimensional space. Conventional radars measure only the speed, distance, and width of an object, and allow only rough estimates of its height.
In order to change this situation, ZF has developed the first high-resolution full-range radar capable of precisely determining the angle of elevation. This essentially makes radar technology capable of visualizing surroundings in 3D, with speed as an additional fourth dimension.
The ZF Full-Range Radar measures approximately 10 × 10 cm, is installed at the front of the vehicle, and has a range of over 300 meters.
More channels for more detailed environment recognition
The resolution of a radar depends on two factors: the amount of antenna space available and the number of channels. The latter is the product of the number of transmitters and receivers, and it limits the number of measuring points that can be used. Current mid-range passenger cars usually have 12 channels (three transmitters and four receivers). Our full-range radar has 192 channels – 16 times as many as the current standard. As a result, our sensor delivers several thousand data points per measuring cycle, whereas today’s standard radars deliver only a couple of hundred.
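The channel arithmetic above can be sketched in a few lines. This is an illustrative calculation using only the figures stated in the article (3 × 4 = 12 channels for a standard sensor, 192 for the full-range radar); the actual transmitter/receiver split of the 192-channel sensor is not stated here, and the function name is our own:

```python
# Illustrative sketch: in a multi-antenna radar, the number of (virtual)
# channels is the product of transmit and receive antennas.

def virtual_channels(num_tx: int, num_rx: int) -> int:
    """Channel count of a radar with num_tx transmitters and num_rx receivers."""
    return num_tx * num_rx

# Typical mid-range automotive radar cited in the article:
standard = virtual_channels(3, 4)  # 3 transmitters x 4 receivers -> 12 channels

# ZF Full-Range Radar channel count (TX/RX split not given in the article):
full_range = 192

print(standard)                # 12
print(full_range // standard)  # 16 -> "16 times as many"
```

Since the channel count scales with the product of antennas, adding a single transmitter or receiver multiplies, rather than merely adds to, the number of usable measuring points.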
Rather than being limited to a two-dimensional projection of the traffic situation ahead, the radar system is thus able to create a three-dimensional image that includes information on the height of objects in the vehicle’s environment. The system’s high-resolution technology can even distinguish between objects that are at the same distance from the vehicle and moving at the same relative velocity, and visualize them separately from one another. The vehicle can thus recognize hazards early – such as the end of a traffic jam under a bridge – and brake to prevent a collision. The high information density also provides much more detailed information on the shape of objects. For example, the full-range radar receives around 10 data points from a pedestrian, rather than the previous standard of only 1 or 2, enabling the system to distinguish more accurately between people and static objects such as bushes. The system can even infer the movement of individual limbs from the velocities of the measuring points, which means the sensor can detect the direction in which a pedestrian is moving.
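The pedestrian-versus-bush distinction described above can be illustrated with a toy classifier. This is a hypothetical sketch, not ZF's algorithm: with roughly 10 points per pedestrian, the per-point Doppler velocities spread out because arms, legs, and torso move at different speeds, while static clutter such as a bush yields velocities near zero. All values and the threshold below are invented for illustration:

```python
import statistics

# Hypothetical sketch: separate a pedestrian from static clutter by the
# spread of per-point Doppler velocities within a detection cluster.

def looks_like_pedestrian(point_velocities, spread_threshold=0.3):
    """Return True if the cluster's Doppler velocity spread (m/s) suggests limb motion."""
    if len(point_velocities) < 3:
        return False  # too few points to judge reliably
    return statistics.pstdev(point_velocities) > spread_threshold

# ~10 points from a walking pedestrian: limbs at varying speeds (m/s)
walking = [1.2, 1.4, 0.8, 1.9, 1.1, 0.5, 1.6, 1.3, 2.0, 0.9]
# Points from a bush: essentially zero radial velocity
bush = [0.0, 0.02, -0.01, 0.01]

print(looks_like_pedestrian(walking))  # True
print(looks_like_pedestrian(bush))     # False
```

With only 1 or 2 points per object, as on standard radars, no such spread is measurable, which is why the higher point density matters for classification.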
192 channels provide several thousand data points per measurement cycle, creating a high-resolution 3D image of the environment. Even the window and roof structure of the building on the right side is clearly identified.
High resolution – a prerequisite for automated driving features
With a beam angle of ±60 degrees, our full-range radar is suitable for a wide variety of situations, from slow city traffic to driving on country roads and freeways. Its range of 300 meters also far exceeds the current state of the art. This makes the system a powerful addition to our sensor set and an interesting option for a whole range of applications with varying levels of automation. For example, it is suitable for level 4 and 5 systems without human drivers – such as people or cargo movers on company premises, at airports, in cities, or for last-mile delivery – as well as for cutting-edge networked advanced driver assistance systems in the passenger car sector (level 2+).
With our comprehensive systems expertise, we were able to integrate the radar sensor seamlessly into the automated driving functions. The huge amount of information the system generates presents a great challenge for signal processing and analysis. We therefore worked closely with our ADAS engineers during development of the full-range radar to ensure that the interface with the control unit and the software architecture meet the technical demands. Our imaging radar is currently in the validation phase, and volume production is planned for 2021.