
Sensor Power for Automated Driving

January 15, 2019
Tags: Safety, SeeThinkAct, ZeroAccidents

Automated vehicles need sensors to perceive their environment. Automotive cameras are a prerequisite for many Advanced Driver Assistance Systems (ADAS). For autonomous driving, different sensors are needed.
A camera that registers a blank spot in every image would be considered faulty. Yet exactly this happens in human vision: at the point where the optic nerve exits the retina, there are no receptors to record the light stimuli from which the brain forms an image. This part of the eye is known as the blind spot. It goes unnoticed, however, because information from the surrounding receptors on the retina and, above all, the visual impressions from the other eye compensate for the missing image points. At the same time, two eyes side by side allow us to perceive spatial depth, a key requirement for estimating distances. In technological terms, the data from two sensors merge, or fuse, into a more complete image carrying more information.
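The depth-from-two-viewpoints principle described above can be sketched with the standard stereo triangulation formula Z = f·B/d (focal length times baseline over disparity). The function name and the example values below are illustrative, not taken from any specific camera system:

```python
def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Depth of a point seen by two offset cameras: Z = f * B / d.

    focal_length_px: camera focal length in pixels
    baseline_m:      distance between the two cameras in meters
    disparity_px:    horizontal shift of the point between both images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 800 px, 12 cm baseline, 24 px disparity -> 4.0 m depth.
depth = stereo_depth(800, 0.12, 24)
```

The closer an object, the larger the disparity between the two images, which is why nearby distances can be estimated more precisely than distant ones.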

Sensor fusion for automated driving

Developers of automated driving functions use precisely this principle. For autonomous driving, different sensors are needed so that a driverless vehicle can reliably interpret every traffic situation, even in unfavorable lighting and weather conditions. Cameras, radar and lidar sensors each have their particular strengths. Intelligently combined, they provide a detailed 360-degree view. Learn more in the following sections.

Cameras ensure various angles of view for every driving situation

Cameras are indispensable for object detection. They supply the vehicle with the necessary information by using artificial intelligence to detect objects, such as pedestrians or garbage cans, along the side of the road. Beyond that, the camera's greatest strength is that it can measure angles precisely. This allows the vehicle to recognize early on whether an approaching vehicle will turn. While city traffic requires a wide field of view to capture pedestrians, highways call for a long range and a narrow field of view. ZF's camera systems support driving functions like Adaptive Cruise Control (ACC), Automated Emergency Braking (AEB), Lane Change Assist (LCA) and many more.

Radar sensors use an echo system in case of poor visibility

Unlike cameras, which passively record image information, radar systems are an active technology. These sensors emit electromagnetic waves and receive the "echo" reflected back from surrounding objects. Radar sensors are therefore particularly good at determining the distance and relative speed of these objects with high accuracy. That makes them ideal for maintaining distances, issuing collision warnings and supporting emergency brake assist systems. Another decisive benefit of radar sensors over optical systems is that, because they use radio waves, they function regardless of weather, light or visibility conditions. That makes them an important component in the sensor set.
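The echo principle reduces to two textbook relations: distance follows from the round-trip time of the wave, and relative speed follows from the Doppler shift. A minimal sketch (the function names and the 77 GHz example carrier are illustrative assumptions, not details of any ZF product):

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range(round_trip_time_s):
    """Distance from echo delay: the wave travels out and back,
    so the one-way distance is half the round trip."""
    return C * round_trip_time_s / 2

def radar_relative_speed(doppler_shift_hz, carrier_hz):
    """Radial relative speed from the Doppler shift:
    v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# An echo arriving after 1 microsecond puts the target ~150 m away.
distance = radar_range(1e-6)

# At a 77 GHz carrier, a ~513 Hz Doppler shift corresponds to ~1 m/s.
speed = radar_relative_speed(513.0, 77e9)
```

Because both quantities come directly from wave physics rather than image contrast, fog, darkness or glare do not degrade them.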

Lidar: laser-sharp all-around view

Lidar sensors also apply the echo principle, but use laser pulses instead of radio waves. They therefore measure distances and relative speeds as well as radar does, yet recognize objects and angles with much higher accuracy. This is also why they survey complex traffic situations in the dark very well. Unlike with cameras and radar sensors, the angle of view is not a limitation, because lidar sensors record the vehicle's entire 360-degree surroundings.
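A spinning lidar returns a ring of range readings at evenly spaced angles; converting them to Cartesian points yields the 360-degree picture of the surroundings. A simplified 2D sketch (real sensors also scan vertically; the function name and inputs are illustrative):

```python
import math

def scan_to_points(ranges_m, start_angle_rad=0.0):
    """Convert one full 360-degree scan of equally spaced range
    readings into (x, y) points around the sensor origin."""
    n = len(ranges_m)
    step = 2 * math.pi / n  # angular spacing between readings
    points = []
    for i, r in enumerate(ranges_m):
        angle = start_angle_rad + i * step
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Four readings of 5 m each map to points on the axes around the sensor.
pts = scan_to_points([5.0, 5.0, 5.0, 5.0])
```

Stacking many such scans over time produces the dense point clouds from which objects and free space are extracted.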

Unbeatable thanks to artificial intelligence

Combined into one sensor set, the technologies described above prevent blind spots from emerging in the perception of the vehicle's surroundings, even in complex situations. Merging the sensor information from lidar, radar and camera systems into one complete picture also requires a "brain." ZF's solution for this is the ProAI supercomputer.
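One textbook way to merge overlapping measurements of the same quantity is inverse-variance weighting: more reliable sensors get more weight, and the fused estimate is more certain than any single input. This is a generic illustration of the fusion idea, not ZF's actual ProAI algorithms, and all values are made up:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of independent estimates.

    measurements: list of (value, variance) pairs, one per sensor.
    Returns the fused (value, variance).
    """
    weights = [1.0 / var for _, var in measurements]
    value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    variance = 1.0 / sum(weights)  # always smaller than any input variance
    return value, variance

# Camera estimates 10.4 m (variance 1.0); radar, being better at range,
# estimates 10.0 m (variance 0.25). The fused distance sits closer to
# the radar value and carries less uncertainty than either sensor alone.
val, var = fuse([(10.4, 1.0), (10.0, 0.25)])
```

In practice such weighting appears inside tracking filters (e.g. Kalman-style updates) rather than as a standalone step, but the core trade-off is the same: each sensor contributes in proportion to its reliability.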
