Driver assistance systems: What sensors can do

Advanced Driver Assistance Systems (ADAS) provide vehicles with additional comfort and safety on the roads. Nowadays, numerous driver assistance systems are used in cars, often combined in individual safety packages. This is made possible by increasingly intelligent recognition of the vehicle's surroundings, as sensor technology keeps improving: ultrasonic, radar, lidar and camera sensors have all become markedly more capable. Highly complex software is at the heart of the increasingly powerful control units, optimising the algorithms so that they respond quickly and trigger the right (re)action even in critical driving situations. In this way, critical situations can be defused and accidents avoided.

Take the lane departure warning system, for example: Is a warning sound enough when leaving a lane? Or should there be a haptic warning in which the steering wheel vibrates? Or is an active steering or braking intervention necessary? To make the right decision, modern systems rely on the interaction between sensors and camera systems. The more accurately the live information is processed in real time, the better the driver assistance system responds to the situation on the road. Depending on the application and the ADAS, numerous other pieces of information are also incorporated, such as speed, steering angle, distance from the vehicle in front, blind spot information or even road conditions. For example, HELLA's new SHAKE sensor enhances the ambient perception of radar, lidar and camera by providing up-to-date and precise data on the road condition.
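Purely as an illustration, the escalation logic described above could be sketched as follows. The thresholds, function name and input signals are hypothetical and do not come from any real HELLA or production system; real systems weigh many more inputs.

```python
from enum import Enum

class LdwAction(Enum):
    NONE = 0
    AUDIBLE_WARNING = 1
    HAPTIC_WARNING = 2        # steering wheel vibration
    STEERING_INTERVENTION = 3

def choose_ldw_action(lateral_offset_m: float, time_to_line_s: float,
                      indicator_on: bool) -> LdwAction:
    """Pick an escalation level for a hypothetical lane departure warning.

    lateral_offset_m: offset of the vehicle centre from the lane centre
    time_to_line_s:   predicted time until a lane marking is crossed
    indicator_on:     an intentional, indicated lane change suppresses warnings
    """
    if indicator_on:
        return LdwAction.NONE
    if time_to_line_s < 0.5 or lateral_offset_m > 1.0:
        return LdwAction.STEERING_INTERVENTION  # critical: steer back actively
    if time_to_line_s < 1.0:
        return LdwAction.HAPTIC_WARNING         # vibrate the steering wheel
    if time_to_line_s < 2.0:
        return LdwAction.AUDIBLE_WARNING
    return LdwAction.NONE

print(choose_ldw_action(0.6, 0.8, indicator_on=False))  # LdwAction.HAPTIC_WARNING
```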

 

There are also assistance systems that do not intervene directly in the driving dynamics, but significantly improve passive safety and comfort. Examples are the high beam assistant or the automatic windscreen wiper. HELLA, for example, also supplies combined rain/light sensors that measure the temperature, humidity and solar radiation (ambient light) and can thus control the air conditioning system in addition to the vehicle's lights.

By means of a piezoelectric element, the SHAKE sensor detects vibrations, airborne noise and water droplets swirled up in the air and from this information determines the degree of wetness between the tyres and the road. Figure: HELLA

Modern, multifunctional rain/light sensor. Figure: HELLA

Radar and ultrasonic sensors

Radar systems (mostly operating at 77 GHz) enable accurate speed and distance measurements – even at high vehicle speeds – but do not have a high angular resolution. They are used to avoid collisions, for example. One of their strengths is that they are not dependent on the weather. Mid-range and long-range radar systems with a range of up to 250 m are used in addition to short-range radar, which detects objects up to 30 m away.
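The textbook relationships behind these measurements can be sketched as follows: distance follows from the round-trip time of the radar signal, and relative speed from the Doppler shift at the 77 GHz carrier. This is a simplified illustration only; production automotive radars typically use FMCW modulation and far more elaborate signal processing.

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range(round_trip_time_s: float) -> float:
    """Target distance from the round-trip time of the radar signal."""
    return C * round_trip_time_s / 2.0

def radar_relative_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed from the Doppler shift: v = f_d * c / (2 * f_c).

    Positive values mean the target is approaching the sensor.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# Example: a Doppler shift of about 5.13 kHz at 77 GHz corresponds to ~10 m/s (36 km/h)
print(f"{radar_relative_speed(5133.0):.1f} m/s")
```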

Ultrasonic sensors have long been part of the traditional parking aid. They measure the distance to the nearest object by emitting sound pulses and recording the time it takes for the reflections to return. As short-range specialists, they are less relevant for automated driving, but they have proven themselves as parking and blind spot assistants. Ultrasonic sensors are compact and robust. They also work at night and without interference, for example in fog. However, they struggle in snowfall and are not suitable for longer distances.
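A minimal sketch of this time-of-flight principle, assuming the approximate speed of sound in air at a given temperature; the function names are illustrative only, not the interface of any real sensor.

```python
def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in air in m/s as a function of temperature."""
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance(echo_time_s: float, temp_c: float = 20.0) -> float:
    """Distance to the nearest object from the round-trip time of an ultrasonic pulse."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0

# Example: an echo received after 3 ms at 20 degrees C corresponds to roughly 0.5 m
print(f"{ultrasonic_distance(0.003):.2f} m")
```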

Ultrasonic sensors are considered a "classic" sensor. Up to eight of these are installed at the front and rear, usually in the bumpers, and are used to warn about distance when parking.

Lidar sensors

The lidar sensor is an equally important sensor. The abbreviation stands for light detection and ranging – an optical measuring system used to detect objects. The position of an object is determined from the time it takes the emitted light to be reflected off the object and return to the receiver. In principle, therefore, it is a laser scanner that can also create a three-dimensional image of the surroundings. Lidar systems do not work with microwaves, but with light pulses in a non-visible range, i.e. near-infrared light. They usually have a wavelength of 905 nm, a range of 200 m in good weather conditions, a high angular resolution and 360° coverage. However, dazzling light and poor visibility conditions, such as fog, rain or spray, reduce the range. Therefore, lidar is mostly used as an auxiliary system.
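The same time-of-flight idea applies here, only with the speed of light, and each return is combined with the beam's angles to form a point in a 3D point cloud. The following is a simplified sketch with hypothetical function names, not the interface of any real lidar.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_range(round_trip_time_s: float) -> float:
    """Distance of the reflecting object from the light pulse's round-trip time."""
    return C * round_trip_time_s / 2.0

def scan_to_points(measurements):
    """Convert (azimuth, elevation, round-trip time) tuples into 3D points.

    Angles are in radians, in the sensor's own coordinate frame.
    """
    points = []
    for azimuth, elevation, t in measurements:
        r = lidar_range(t)
        x = r * math.cos(elevation) * math.cos(azimuth)
        y = r * math.cos(elevation) * math.sin(azimuth)
        z = r * math.sin(elevation)
        points.append((x, y, z))
    return points

# Example: a return after ~1.33 microseconds corresponds to roughly 200 m
print(f"{lidar_range(1.33e-6):.0f} m")
```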

Camera systems (optical sensors)

Camera systems are also frequently used to survey the surroundings. One core application, for example, is traffic sign recognition. The detected signs are displayed directly on the instrument display or the screen. In many cases, traffic sign recognition also serves as an information base for other driver assistance systems, such as the priority warning system, the wrong-way driving warning system or the speed warning function.

In addition to traffic signs, modern cameras can also recognise and even distinguish between obstacles in front of the vehicle.

Both mono and stereo cameras are used. The latter can detect obstacles in 3D without additional sensor technology. With a stereo camera, however, the installation space limits 3D imaging: the smaller the distance between the two camera lenses, the smaller the effective three-dimensional measuring range. In practice, stereo cameras can see in 3D up to about 50 m in front of the vehicle. Beyond this limit, the differences in perspective between the two images are too small to derive 3D information from them, and the camera behaves like a mono camera.
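The geometric relationship behind this limit is the classic stereo depth formula Z = f·B/d (focal length times baseline, divided by disparity): the smaller the baseline B between the lenses, the smaller the disparity at a given distance and the sooner it drops below what the imager can resolve. The numbers below are assumed values for illustration only.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from stereo disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def max_stereo_depth(focal_px: float, baseline_m: float, min_disparity_px: float = 1.0) -> float:
    """Largest depth at which the disparity is still resolvable.

    A smaller baseline (lens spacing) shrinks this range, which is why the
    installation space limits how far a stereo camera can 'see' in 3D.
    """
    return depth_from_disparity(focal_px, baseline_m, min_disparity_px)

# Assumed values: focal length 1000 px, baseline 0.20 m
print(depth_from_disparity(1000.0, 0.20, 4.0))  # 4 px disparity -> 50 m
print(max_stereo_depth(1000.0, 0.20))           # idealised 1 px limit -> 200 m
```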

 

The range of a mono camera is around 250 m, regardless of the installation space. By combining the images from several cameras and sensors, a three-dimensional image can be created. Cameras inside the vehicle can also detect whether the driver is tired or distracted. Cameras outside the vehicle (front and rear) also record the car's immediate surroundings and point out obstacles.

Infrared sensors

In contrast, infrared cameras are used for night-vision assistants. They respond to the heat radiation emitted by objects. Converted to black-and-white images, the information is shown on the instrument cluster display. Cooler surroundings appear dark, while people and animals appear conspicuously bright. Modern systems detect people and larger wild animals at a distance of up to 300 m, and a warning signal sounds in dangerous situations. Depending on the headlamp system, it is also possible, for example, to warn the detected person with short light pulses.

Merging sensor data

All relevant data from ultrasonic, radar, lidar, camera and other systems can be linked intelligently and in real time using sensor fusion. Looking ahead, this is what makes automated driving possible in the first place. Redundancy, i.e. partially overlapping results when recognising the surroundings, is an explicit requirement. Redundancy and plausibility checks, in which the system verifies that the data about the surroundings has been recorded correctly, are what largely prevent the data from being misinterpreted. Depending on the driver assistance systems, the degree of automation and the vehicle class, we are therefore dealing with an individual mix of information and sensors, and with ever more data that must be processed in real time. It is already a technological masterpiece!
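As a highly simplified illustration of redundancy and plausibility checking, two independent distance estimates, for example from radar and camera, might be compared and fused roughly like this. The tolerance and variances are made-up values, and real fusion stacks use far more sophisticated methods such as Kalman filters.

```python
def fuse_distance(radar_m: float, radar_var: float,
                  camera_m: float, camera_var: float,
                  max_deviation_m: float = 2.0):
    """Fuse two redundant distance estimates after a simple plausibility check.

    Returns the inverse-variance-weighted estimate, or None if the two
    sensors disagree by more than max_deviation_m (implausible data).
    """
    if abs(radar_m - camera_m) > max_deviation_m:
        return None  # plausibility check failed: neither reading is trusted blindly
    w_radar = 1.0 / radar_var
    w_camera = 1.0 / camera_var
    return (w_radar * radar_m + w_camera * camera_m) / (w_radar + w_camera)

# Example: radar reports 42.0 m (low variance), camera reports 43.0 m (higher variance)
print(fuse_distance(42.0, 0.25, 43.0, 1.0))  # 42.2 m, weighted towards the radar value
```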