
#AutonomousEverything

Combined Sensor Power for Autonomous Vehicles

Tags: ZeroAccidents, SeeThinkAct, Safety

Advanced driver assistance systems must also be able to handle complex traffic situations. For the vehicle to function reliably under any lighting and weather conditions, information from the cameras and sensors must be intelligently connected.
Kathrin Wildemann, January 15, 2019
Kathrin Wildemann has been a part of the permanent Copy Team at ZF since 2016. In her online and offline articles, she likes to cover electromobility and other topics that involve sustainability.
A camera that registers a blank space in every image would probably be considered faulty by design. Yet if you compare the human eye with a camera, exactly that happens when we see: at the spot where the optic nerve exits the retina, there are no receptors to record the light stimuli from which the brain forms an image. This part of the eye is known as the blind spot. It disrupts nothing, however, because the information from the surrounding receptors on the retina and, above all, the visual impressions from the other eye compensate for the missing image points. At the same time, two eyes next to each other allow us to perceive spatial depth, a key requirement for estimating distances. Expressed in technological terms, the data from two sensors merge, or fuse, into a more complete image with more information.

Sensor fusion: a requirement for autonomous driving

Developers of automated driving functions use precisely this principle. Very different sensors are needed so that a driverless vehicle can unequivocally comprehend every traffic situation, even in unfavorable lighting and weather conditions. Cameras, radar and lidar sensors each have their special advantages. Intelligently bundled, they allow for a sweeping and detailed 360-degree view. “As system architects of autonomous driving, we have developed a sensor set that equips vehicles with all necessary senses in order to be able to perceive their environment digitally,” explains Torsten Gollewski, head of ZF Advanced Engineering and managing director of Zukunft Ventures GmbH.
ZF sensor architecture in use: The merged information from radar, lidar and the most diverse range of cameras is already enabling vehicles to identify and handle even the most complex traffic situations.
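The basic idea behind fusing two sensor readings can be illustrated with a minimal sketch: two independent, noisy distance estimates are combined by inverse-variance weighting, so the fused value is both between the inputs and more certain than either one. The sensor names and noise figures below are illustrative assumptions, not properties of ZF's systems.

```python
# Minimal sensor-fusion sketch: combine two independent distance
# estimates (e.g. one from radar, one from a camera) by weighting
# each with the inverse of its variance.

def fuse(estimate_a, var_a, estimate_b, var_b):
    """Return the fused estimate and its (smaller) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Hypothetical readings: radar says 41.8 m (variance 0.5 m^2),
# camera says 43.1 m (variance 2.0 m^2).
distance, variance = fuse(41.8, 0.5, 43.1, 2.0)
```

Because the radar estimate is trusted more here, the fused distance lands closer to it, and the fused variance is smaller than either input's, which is the whole point of combining sensors.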

Cameras ensure various angles of view for every driving situation

Cameras are indispensable for object detection. Using artificial intelligence, they detect objects such as pedestrians or garbage cans along the side of the road and supply the vehicle with the necessary information. The camera’s greatest strength, moreover, is that it can measure angles precisely, which allows the vehicle to recognize early on whether an approaching vehicle is about to turn. While city traffic requires a wide angle of vision to capture pedestrians and traffic, highways call for a long range of up to 300 meters and a narrow angle of vision. ZF’s broad assortment of camera systems is therefore important for adaptive cruise control, the automated emergency braking system and the Lane Keeping Assist function.
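The trade-off between a wide city view and a narrow long-range highway view follows directly from camera optics: for a fixed image sensor, a shorter focal length gives a wider angle of view. The sensor width and focal lengths below are made-up illustration values, not specifications of any ZF camera.

```python
import math

# Horizontal angle of view of a pinhole-model camera:
# fov = 2 * atan(sensor_width / (2 * focal_length))

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

wide = angle_of_view_deg(6.0, 2.0)   # short focal length: wide city view
tele = angle_of_view_deg(6.0, 12.0)  # long focal length: narrow highway view
```

With the same 6 mm sensor, the 2 mm lens sees well over 100 degrees while the 12 mm lens sees under 30, which is why one camera body with several lenses (or several cameras) is needed to cover both situations.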

Interior cameras for maximum occupant protection

However, cameras not only monitor the exterior surroundings of the vehicle, they also keep an eye on the driver and passengers inside it. For example, they can recognize not only whether the driver is distracted or tired, but also which seating positions the passengers have selected. This knowledge represents a major safety plus because, should an accident occur, the seat belt and airbag functions can adapt accordingly.
The three lenses from the ZF TriCam have different angles of view and deliver important information from different distances in front of and next to the vehicle.

Radar sensors use an echo system in case of poor visibility

Unlike cameras, which passively record image information, radar systems are an active technology: the sensors emit electromagnetic waves and receive the “echo” reflected back from surrounding objects. Radar sensors can therefore determine the distance and relative speed of these objects with particularly high accuracy, which makes them ideal for maintaining following distances, issuing collision warnings and supporting emergency brake assist systems. Another decisive benefit over optical systems is that radar sensors, because they use radio waves, function regardless of weather, light and visibility conditions. That makes them an important component of the sensor set. As with its camera systems, ZF offers a broad assortment of radar sensors with different ranges and opening angles (beam widths). The imaging Gen21 Full Range Radar, for example, is a good option for highly automated and autonomous driving due to its high resolution.
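The echo principle described above can be expressed in two standard formulas: distance follows from the round-trip time of the echo, and relative speed from the Doppler shift of the returned wave. The delay, carrier frequency and shift values below are illustrative assumptions (77 GHz is a common automotive radar band, but no specific ZF sensor is implied).

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range(echo_delay_s):
    # The wave travels to the target and back, hence the factor 1/2.
    return C * echo_delay_s / 2

def relative_speed(doppler_shift_hz, carrier_hz):
    # Doppler approximation for v << c: f_d = 2 * v * f_c / c
    return doppler_shift_hz * C / (2 * carrier_hz)

# A 1 microsecond echo delay corresponds to roughly 150 m of range.
r = radar_range(1e-6)
# A 7.7 kHz Doppler shift on a 77 GHz carrier corresponds to about 15 m/s.
v = relative_speed(7.7e3, 77e9)
```

Both quantities come straight out of the timing and frequency of the echo, which is why radar is so strong at exactly the distance and closing-speed measurements that distance keeping and collision warning depend on.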

Lidar: laser-sharp all-around view

Lidar sensors also apply the echo principle, but they use laser pulses instead of radio waves. They therefore measure distances and relative speeds just as well as radar, yet recognize objects and angles with a much higher level of accuracy. This is one reason why they handle complex traffic situations in the dark so well. Unlike with cameras and radar sensors, the angle of view is not critical, because lidar sensors record the vehicle’s surroundings in 360 degrees. The high-resolution 3D solid-state lidar sensors from ZF can even display pedestrians and smaller objects three-dimensionally, which is essential for automation from Level 4 onward. Because it has no moving components, the solid-state technology is considerably more robust than previous solutions.
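A lidar's 360-degree picture is built from many individual (angle, range) returns, which downstream software typically converts into Cartesian points around the vehicle. The sketch below shows that conversion for a handful of made-up returns; it is a simplified 2D illustration, not ZF's point-cloud processing.

```python
import math

def scan_to_points(returns):
    """Convert (angle_deg, range_m) lidar returns to (x, y) points in meters,
    with the vehicle at the origin and 0 degrees pointing straight ahead."""
    points = []
    for angle_deg, range_m in returns:
        theta = math.radians(angle_deg)
        points.append((range_m * math.cos(theta), range_m * math.sin(theta)))
    return points

# Hypothetical sweep: something 12.5 m ahead, 3.2 m to the side, 40 m behind.
points = scan_to_points([(0.0, 12.5), (90.0, 3.2), (180.0, 40.0)])
```

Stacking many such sweeps (and, for 3D sensors, an elevation angle per return) yields the dense point cloud in which even pedestrians and small objects stand out three-dimensionally.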
“It is good to see that solid-state lidar is hitting the road together with our partner Ibeo. And here at the Consumer Electronics Show, we have shown the new Full Range Radar, a high-resolution radar technology that overcomes the limitations of previous generations,” says Martin Randler, Director of Sensor Technologies and Perception Systems at ZF.

Sound.AI enables vehicles to detect acoustic signals

As its advertising slogan “see. think. act.” implies, ZF’s technological solutions allow vehicles to see. With the “Sound.AI” application, the company also equips vehicles to hear. The system recognizes, among other things, approaching emergency vehicles such as police cars, ambulances and fire trucks by their acoustic signals. A vehicle fitted with Sound.AI can then pull over to the side of the road.
With the sensor solution Sound.AI, the vehicle can acoustically locate approaching emergency vehicles like police cars, ambulances and fire trucks and also pull over to make room for them.
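One ingredient of locating a siren acoustically is estimating its bearing from the time difference of arrival (TDOA) between two microphones: sound reaches the nearer microphone slightly earlier, and that delay encodes the angle. The microphone spacing and delay below are illustrative assumptions; this is a textbook TDOA sketch, not ZF's Sound.AI algorithm.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_deg(tdoa_s, mic_spacing_m):
    # Far-field approximation: the path difference between the two
    # microphones is c * tdoa, giving sin(angle) = c * tdoa / spacing.
    return math.degrees(math.asin(SPEED_OF_SOUND * tdoa_s / mic_spacing_m))

# Hypothetical case: sound arrives 0.4 ms earlier at one microphone
# of a pair spaced 0.3 m apart.
b = bearing_deg(0.4e-3, 0.3)
```

With more than two microphones, several such pairwise bearings can be intersected to localize the source, which is what lets a vehicle decide to which side it should pull over.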

Collectively unbeatable thanks to artificial intelligence

Combined into one sensor set, these technologies prevent blind spots from emerging in the perception of the vehicle’s surroundings, even in complex situations. To merge the sensor information from the lidar, radar and camera systems into one complete picture, a “brain” is also needed. ZF’s solution is its “ProAI RoboThink” computer, currently the most powerful supercomputer in the automotive industry. Once vehicles are equipped with this artificial brain, drivers will soon be able to close their eyes or do something else and leave the driving to their autonomous vehicles. This is not science fiction, says engineer Gollewski: “Our concentrated sensor power can already help cover future requirements through fully integrated technologies.”