Eyes on Everything, Always - Sensor Technology

Many species have evolved special ways of perceiving their environments, from the eagle-eyed vision of birds of prey to the ultrasonic echolocation that enables bats to navigate in the dark. Vehicle sensors use the same underlying principles to perceive their surroundings with precision.

Musca domestica, or the common housefly, is not the animal that most immediately springs to mind when discussing spectacular performance evolved over millions of years. Yet it often appears to anyone wielding a fly swatter that these winged pests have added a seventh sense to the other six, enabling them to evade certain death by mere fractions of a second.

The housefly owes its superfast reactions to a highly evolved sensory system. A film made up of images flashing by at around 20 frames per second can fool the human eye into seeing continuous motion, but flies can perceive as many as 250 separate images in one second. They watch the deadly fly swatter approach in what, for them, is effectively slow motion – a capacity for rapid perception that is equally useful in road traffic. Yet a lidar sensor puts even a housefly’s impressive high-speed, high-resolution vision in the shade: on average, lidar sensors register several thousand signals every second.

Lidar: precise echolocation for cars

But the detector, a passive device for registering stimuli, makes up just half of a lidar sensor. The sensor as a whole is based on an echolocation principle similar to the biological sonar that allows dolphins and bats to find their way – and their prey – in the dark. To do so, they generate sound waves that are reflected back to them by obstacles and potential food sources. The time the sound takes to bounce back tells the animals where a given object is positioned in relation to themselves. Bats even make use of the associated Doppler effect to work out which way a tasty moth is flying and how rapidly its wings are beating.
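To make the time-of-flight idea concrete, here is a minimal sketch in Python. The bat scenario, the speed-of-sound value and the 10-millisecond echo delay are illustrative assumptions, not figures from the article:

```python
# Minimal sketch of the echolocation time-of-flight principle described
# above. All specific numbers here are illustrative assumptions.

SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 °C (assumed)

def range_from_echo(round_trip_time_s: float, wave_speed_m_s: float) -> float:
    """Distance to a reflector: the pulse travels out and back,
    so the one-way range is half the round trip."""
    return wave_speed_m_s * round_trip_time_s / 2.0

# A bat hearing its echo 10 ms after the call would place the moth at:
print(range_from_echo(0.010, SPEED_OF_SOUND_M_S))  # ≈ 1.7 m
```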

Lidar systems use bursts of laser light, each lasting just billionths of a second, as the equivalent of sound waves. Together with ZF, Hamburg-based firm Ibeo is developing a new generation of mobile lidar sensors, using lasers that operate in the infrared range at wavelengths of 850 or 885 nanometers. Light at these wavelengths is invisible to the human eye, and not intense enough to do any harm. Compared with other sensors, lidar systems produce extremely accurate results over very long ranges. The laser sensors can detect objects – both motionless and moving – surrounding the vehicle at distances of up to 300 meters.
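The same relation applied to light shows why lidar electronics must time their nanosecond pulses so precisely. The back-of-the-envelope calculation below is a hedged illustration; only the 300-meter range comes from the text:

```python
# Same time-of-flight relation as the echolocation sketch, but with
# light instead of sound. Only the 300 m range is from the article;
# the timing consequence drawn from it is a back-of-the-envelope step.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range(round_trip_time_s: float) -> float:
    """One-way distance from a measured out-and-back pulse time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

def round_trip_time(distance_m: float) -> float:
    """Out-and-back travel time for a pulse reflected at distance_m."""
    return 2.0 * distance_m / SPEED_OF_LIGHT_M_S

# An object at the sensor's quoted 300 m limit returns its echo after:
print(round_trip_time(300.0))  # ≈ 2.0e-6 s, i.e. about two microseconds
```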

Radar: piercing fog and darkness

Radar sensors operate on the same basic principle, albeit with electromagnetic radiation at considerably longer wavelengths; ZF’s radar systems generate waves in the multi-millimeter range. Although they have a lower resolution than lidar systems, radar sensors come into their own in poor weather conditions: whereas fog or heavy rain can effectively blind an optical system, radar waves pass through water droplets more or less unimpeded.
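As a rough sketch of how a radar measures relative speed via the Doppler effect mentioned earlier: the 77 GHz carrier below is an assumption (a common automotive radar band, consistent with the multi-millimeter wavelengths in the text), not a confirmed ZF specification:

```python
# Hedged sketch of the Doppler relation a radar can use to measure
# relative speed. The 77 GHz carrier is an assumption, not a figure
# from the article (wavelength c/f ≈ 3.9 mm, i.e. multi-millimeter).

SPEED_OF_LIGHT_M_S = 299_792_458.0
CARRIER_HZ = 77e9  # assumed automotive radar carrier

def doppler_shift_hz(relative_speed_m_s: float,
                     carrier_hz: float = CARRIER_HZ) -> float:
    """Frequency shift of the echo from a target closing at the given
    speed. Factor 2: the wave is shifted on the way out and back."""
    return 2.0 * relative_speed_m_s * carrier_hz / SPEED_OF_LIGHT_M_S

# A vehicle closing at 30 m/s (108 km/h) shifts the echo by roughly:
print(doppler_shift_hz(30.0))  # ≈ 15.4 kHz
```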

Cameras: wide-angle and long shots

Alongside echolocation systems, cameras are also firmly established in the pantheon of automotive environment recognition devices. Vehicle camera systems can’t match the visual acuity of birds of prey: the proverbial eagle eye can pick out a mouse at distances of around 350 meters. In road traffic, however, such extreme resolution would be more hindrance than help. Here, a wide field of vision combined with good resolution matters far more, especially perpendicular to the direction of travel. ZF’s Tri-Cam system therefore combines a telephoto lens for distant objects with a fish-eye lens for improved recognition of close-up objects.

The sensor systems described above have one significant advantage over the animal world’s sensory specialists: they aren’t limited to a single technology, but can rely on the interaction of multiple sensor systems. The specific strengths of radar, lidar and camera systems are mutually complementary, covering every traffic situation imaginable. A vehicle fitted with all these systems has 360-degree all-round vision. Even the animal world’s record holder for the broadest field of vision, the chameleon, is “only” capable of swivelling its eyes through 342 degrees. Despite protuberant eyes that move independently, the animal still has an 18-degree blind spot just behind its head.
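The camera trade-off between field of vision and resolution can be illustrated with a small-angle estimate. The sensor width in pixels, the fields of view and the object size below are illustrative assumptions, not Tri-Cam specifications:

```python
# Back-of-the-envelope sketch of the resolution/field-of-view trade-off
# described above. Sensor width, fields of view and object size are
# illustrative assumptions, not ZF Tri-Cam specifications.
import math

def pixels_on_target(object_size_m: float, distance_m: float,
                     fov_deg: float, sensor_width_px: int) -> float:
    """How many horizontal pixels an object subtends, assuming it sits
    near the optical axis (small-angle approximation)."""
    angle_deg = math.degrees(object_size_m / distance_m)
    return angle_deg / fov_deg * sensor_width_px

# A 0.5 m wide obstacle at 100 m, seen by an assumed 1280 px imager:
print(pixels_on_target(0.5, 100.0, fov_deg=100.0, sensor_width_px=1280))  # wide lens: ~3.7 px
print(pixels_on_target(0.5, 100.0, fov_deg=25.0, sensor_width_px=1280))   # telephoto: ~14.7 px
```

The same imager resolves the same obstacle with roughly four times as many pixels through the narrow lens, which is why a multi-lens setup can cover both close-range width and long-range detail.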

Processing power for driverless cars

If necessary, the multiplicity of sensor technologies could be extended even further. While ultrasonic sensors have a comparatively limited range, they are a very cost-effective option for parking and lane-change assistance. And infrared devices could help to reliably detect obstacles obscured by the dazzle of oncoming headlights.

Of course, not even the most comprehensive selection of sensor technologies can power a driver-assist system on its own, let alone enable a car to drive itself. For high-speed reaction times, you also need the right software, capable of processing and analyzing the incoming streams of data without delay. A bat’s brain, for example, computes the exact position of its prey from the reflected echo of a sound wave. The chameleon’s bony skull conceals a brain capable of merging two completely separate images of its surroundings into a single, coherent picture. In the automotive world, the demand for processing power is growing in parallel with the swelling streams of data collected by ever more sophisticated sensors. Future electronic control units like the ZF ProAI, developed in collaboration with Nvidia, will become the vehicle’s brain. Only then will self-driving cars be able to react to the sudden appearance of a deer in the middle of the road with the same lightning speed as a housefly avoiding a potentially lethal fly swatter.
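As a hedged illustration of what such fusion software might do at its simplest, the sketch below combines position estimates from several sensors, weighting each by its reliability (inverse variance). This is a textbook technique, not a description of how the ZF ProAI actually fuses its inputs, and all the numbers are made up:

```python
# Illustrative sketch of one common fusion idea: combine independent
# position estimates from different sensors, weighting each by how much
# we trust it (inverse variance). A textbook technique, not a
# description of the ZF ProAI's actual processing.

def fuse(estimates: list[tuple[float, float]]) -> tuple[float, float]:
    """estimates: (measured_position_m, variance_m2) per sensor.
    Returns the fused position and its (smaller) variance."""
    weights = [1.0 / var for _, var in estimates]
    fused_pos = sum(w * pos for (pos, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused_pos, fused_var

# Lidar precise (small variance), radar less so in position, camera in
# between -- all values invented for the example:
readings = [(52.3, 0.1), (51.8, 1.0), (52.6, 0.5)]
print(fuse(readings))  # ≈ (52.31, 0.077): better than any single sensor
```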

Sensors for all seasons - perfect all-round vision, guaranteed

Depending on vehicle speed, front-facing radar systems (1) used in, for example, active cruise control have a range of up to 200 meters. They can detect the position and speed of the vehicle in front, as well as oncoming vehicles. For lane-change assistance, ZF also supplies the AC2000 system as a side-mounted radar sensor with a field of vision of up to 150 degrees.

As cost-effective, robust alternatives, cameras like ZF’s Tri-Cam system (2) are already used in many driver-assist solutions such as lane-keeping assistants. With ranges of up to 250 meters, they are not the most “eagle-eyed” of technologies – but they detect motion perpendicular to the direction of travel much more accurately than, say, radar sensors.

Lidar systems are very costly, but their long range and high resolution beat other technologies hands down, and they effortlessly detect pedestrians and cyclists. However, rain and fog significantly impair their visual acuity. Together with Ibeo, ZF is currently developing a compact lidar sensor (3) that doesn’t need wear-prone rotating mirrors.

Working together, these different sensor technologies give the vehicle full, all-round perception of its surrounding environment at all times. The benefits of the different systems are mutually complementary, providing the system redundancy that is so essential for the success of autonomous driving.
