360° AUTONOMOUS
FINANCIAL REPORT
Autonomous driving is one of the major trends in the automotive industry. HELLA is actively promoting this development and is thereby tapping into new growth opportunities, for our company's core competencies, lighting technology and electronics, play a central role in autonomous driving scenarios.
"We are already profiting today"
The future is self-driven and HELLA is at its forefront. In an interview, HELLA CEO Dr. Rolf Breidenbach explains how HELLA is benefiting from the trend towards autonomous driving already today, and where the journey will lead in this area. Page 16

Sensors for (almost) any scenario
Innovative sensor technologies turn the vision of autonomous driving into a reality. As a component supplier, HELLA is playing a key role throughout as it bundles competencies in a targeted manner so that more complex partial functions of autonomous driving can be realized. Example: automated parking. Page 22

One million kilometers
The more elaborate the assistance systems, the more complex the interplay of different sensors. For the systems to function safely and reliably, they must be fed with a great amount of data and require extensive testing. HELLA ensures this via hundreds of thousands of miles that have to be driven inside the lab and under real conditions. Page 28

Clear signals
Vehicle automation will also increase the importance of automotive lighting technology. Already today, HELLA is thus developing innovative lighting concepts supporting communication among automated and non-automated road users while increasingly transforming the vehicle interior into a living and work space. Page 34

360 DEGREES AUTONOMOUS 2017/2018
Autonomous driving will change a lot: The driver will become a passenger. Vehicles will have a different design. Driving will become even more comfortable, convenient, safe and efficient. New concepts and business models will come into being and push new providers onto the market. In brief: Mobility will be entirely reimagined.
"We are already profiting today"

The future is self-driven and HELLA is at its forefront. The foundation for this is made of high-end sensor solutions, exceptional software know-how and the innovative strength of 40,000 employees worldwide. In an interview, Dr. Rolf Breidenbach, President and CEO of HELLA, tells us how the company is already profiting from the trend towards vehicle automation and where HELLA is headed in this area.

Dr. Breidenbach, how did you get to the office today from your home, which is about 70 kilometers (43 miles) away?
The same way as millions of other commuters: I drove by car. I used the commute time to talk to a few Chinese colleagues on the phone. Due to the time difference, the morning drive is well suited to this purpose. I usually make calls in the other direction when driving home. Then, I usually have colleagues from the USA, Mexico or Brazil on the line.

Would you have liked to make the drive in a self-driving car?
I would have liked to at least have had the opportunity. Autonomous driving gives me precisely this option. It gives me the freedom to decide whether I want to work during the commute, relax or maybe even drive myself.

When do you expect the self-driving car to break through in the market? Will people even be allowed to still get behind the wheel?
Time will tell. It is still a long way until steering wheels and pedals become obsolete. This would be the highest level of automation. Seen from a technical standpoint, a great deal of this is possible even today. However, many aspects still need to be clarified. The important thing is that legislation also keeps pace with
what is technologically possible and specifies a clear framework. Not all questions have been resolved with finality, for example with regard to ethical issues. Technically speaking, the most important aspect will be that vehicles are able to measure their surroundings with the highest precision and in real time. To do so, technologies must be developed further and made more precise. Artificial intelligence will play an important role in this regard. In light of these facts, many experts forecast an increasing market penetration of self-driving cars from 2030. This seems like a realistic timeline to me personally.

2030 is still a fairly remote future. How do you assess the more short-term development steps?
We expect the share of vehicles that have simple to moderately complex driver assistance functions to increase significantly within a few years. In my opinion, in 2025, only one-third of all new vehicles worldwide will not have any type of assistance function whatsoever. The demand for sensor solutions will increase, and we are already profiting significantly from this trend.

Many people still hold a skeptical view of autonomous driving. Do you share this assessment?
I can completely understand this skepticism. After all, we are handing control and safety over to a machine. On the other hand, this is already common practice in aviation; there, an autopilot does most of the flying. The pilots in the cockpit only monitor this autopilot. Statistically speaking, the airplane is one of the safest means of transport.

What has to happen to overcome possible reservations with regard to autonomous driving?
I think the skepticism will fade gradually. The autonomous vehicle will not arrive overnight, but in an incremental, evolutionary process, more so for certain driving scenarios than for others. For example, on freeways or in clearly delineated areas more so than in dense urban traffic with difficult-to-control traffic movements.
We will become accustomed to transferring, step by step, more responsibility to the car.

What are the advantages of autonomous driving?
The driver will become a passenger. Instead of driving themselves, they can read and write e-mails, immerse themselves in documents, books or movies, watch the landscape passing by or sit back and take a nap, not to mention give their full attention to the other people in the car. People who today need help to get from point A to point B also gain greater mobility with self-driving cars. Above all, this new type of mobility is considerably safer. About 90 percent of accidents can be traced back to human error. Classic causes of accidents such as inattentiveness, negligence or microsleep will no longer exist in autonomous driving. By interconnecting vehicles, the flow of traffic should become smoother and energy consumption will continue to decrease.

It seems as though many companies are falling over themselves to get a piece of the autonomous driving trend. To what extent will this change the industry?
Autonomous driving will change a lot, not only the car, but also the fundamental idea of mobility. Entirely new concepts and business models pertaining to this topic will come into being. In addition to established automobile manufacturers and suppliers, new competitors will push their way to the market, ranging from flexible, highly innovative start-ups to large IT companies. The market will re-sort itself to a certain extent. The demand for sensor solutions will increase considerably.

The levels of autonomous driving

Level 1 (assisted driving): The vehicle supports the driver with simple technical functions, such as lane change warning or cruise control for distance control. The driver must hold on to the steering wheel throughout and pay attention to traffic. These functions are already very popular.

Level 2 (partially automated driving): Certain driving maneuvers are carried out autonomously by the vehicle. A congestion assistant can, for example, autonomously follow the preceding vehicle, brake and accelerate without the driver's intervention. Such functions no longer require the vehicle to be steered at all times, but it must be monitored. Many manufacturers are already at this level today.

Level 3 (conditional automation): At this level, autonomous driving begins. The car assumes more demanding functions autonomously, e.g. a more sophisticated congestion control, and it can drive fully autonomously on the highway. The driver no longer needs to permanently monitor the vehicle but must be prepared to assume steering control within a very short period. The first series at this automation level are already in production. Vehicles at level 3 will be significantly more frequent by 2030.

Level 4 (high automation): The car can complete longer distances or more complex driving maneuvers autonomously; the driver can do other things in these situations. The car only hands back control when there are situations that the system cannot handle. We anticipate a significant increase of level 4 vehicles by 2035.

Level 5 (full automation): The car drives permanently autonomously; it can handle all trips and maneuvers by itself. Human intervention is no longer required. There is no longer any driver in charge. The driver, in fact, becomes a passenger. Level 5 vehicles could become a reality in greater quantities around 2040.

How can HELLA assert itself successfully in this environment?
We have all the requirements needed to make a great contribution. Our core competencies in the areas of lighting and electronics will play a critical role as the automation of vehicles rolls on. We focus on central core technologies and attractive use cases. We are already profiting with our radar and camera software solutions. In addition, we will continue to maintain our innovative speed at a high level. Therefore, we are investing just under 10 percent of our sales into research and development and consistently align our activities in this regard with relevant megatrends. In view of the increasing technological complexity and speed of innovation, we make targeted use of partnerships. We can't do it all by ourselves, nor do we want to. Take, for example, our collaboration with ZF in the area of developing and marketing state-of-the-art driver assistance systems and autonomous driving functions. Together, we create added value for our customers, but also strengthen our position in the area of assisted/automated driving.

Does anyone still need sophisticated lighting technology if all cars will someday be self-driving?
The lights on the vehicle won't go out. On the contrary, lighting will take over new functions. Say you're a pedestrian crossing a busy street. You probably try to make eye contact with the driver. But what if there are no drivers anymore? What do you do then? As a pedestrian, how can you make sure that the vehicle has recognized you and will come to a stop? In exactly these situations, new lighting functions can come into play to signal to the pedestrian that the vehicle is aware of their presence and that they can cross the street safely. Light can make a major contribution to the communication between self-driving vehicles and other road users.

Do you see further growth fields for the lighting area as part of vehicle automation?
The vehicle interior will be entirely redefined. This includes new interior lighting concepts that can be adjusted specifically for various tasks. Even today, we are working intently on developing highly individualized vehicle interior lighting concepts, which enable a wide variety of new functions and can be adapted to the various needs of vehicle passengers and driving situations.

And in the electronics area? With which solutions is HELLA positioning itself with respect to autonomous driving?
For more than ten years, we have been highly involved in the areas of radar sensors and front camera software. These are two essential key technologies, both for basic driver assistance functions and for advanced autonomous driving solutions. With our 24 GHz radar sensor solutions for applications in the rear of the vehicle, such as lane change assistant and blind spot detection, HELLA is among the world market leaders. Our camera functions such as lane detection, light control, traffic
sign detection and object detection are in demand on the market. We will consistently further develop these core technologies. For example, we are preparing for the market launch of the latest 77 GHz-based radar technology, which is required for 360° detection of the area surrounding the vehicle. We have already won an initial large-scale order for this. We expect series production to start in 2021. In addition, we are advancing our new business model for front camera software: an open, flexible system that customers can assemble in accordance with their own requirements.

Beyond this, will there be additional technologies or applications that HELLA will contribute to vehicle automation?
Absolutely. In a few months, for example, series production of our SHAKE sensor will start. This is a sensor that uses structure-borne sound to enhance the current environment detection range in the close-range area immediately surrounding the vehicle. This makes it possible to detect moisture on the road, for example. In addition, we are working on new, significantly more advanced methods for detecting the environment and for processing data, for example in the area of artificial intelligence. These are necessary for higher automation levels. Moreover, we are working on functions that are of central importance as subsystems for automated driving.

Which kind of subsystems are these exactly?
At the moment, we are looking at solutions for automated parking, such as valet parking. These scenarios enable drivers to park their cars automatically in parking garages, for example. To do so, they just have to stop the car in a certain zone, and the rest takes place automatically. Later, the driver can order the vehicle back to the handover location using their smartphone. We will be able to draw on our core competencies to a great extent to implement these functions. In large part, we will use development partnerships to address any missing aspects.

Will we ever see the self-driving HELLA car on our roads?
What a great vision! But no, that's not our goal. In terms of autonomous driving, we see ourselves very clearly in the role of component supplier, software specialist and system development partner.

Let's conclude by taking a leap into the future. How will you get to the office in 2030?
Well, I'll be 67 years old then, so I may not be concerned with how I get to the office. I probably won't be going to work anymore at that point. Still, I'm sure that by then I'll have a car at home that is both automated and electric, both in large part thanks to innovative HELLA technologies.
Close-up of a 77 GHz radar sensor. Along with front camera software and a structure-borne sound sensor, the radar sensor is one of the key components enabling the vision of autonomous driving to become reality. Bringing advanced, safe functions to the road requires a variety of technologies that interact and complement each other.
SENSORS FOR (ALMOST) ANY SCENARIO

The successful automated parking test provides Lars-Peter Becker (left) and Oliver Klenke (right) with valuable data.

"Thanks to SHAKE's modular design, we can implement various application options based on customer requirements."
Marco Döbrich, Head of Sensors and the Technical Center at the HELLA location in Bremen

Valet parking
Visions turn into concrete applications: in a few years, scenarios in which vehicles autonomously enter parking garages and park in their spots could become a reality. The illustration shows how so-called valet parking works.

1 Drop-off: The vehicle is parked in the drop-off zone outside the car park. The parking process is initiated via mobile device.
2 Check-in: The vehicle and parking garage automatically exchange required information (plan, parking spot, service hours).
3 Drive: The vehicle drives to the free spot. Automation minimizes the risk of damages and optimizes the use of space.
4 Vehicle services: Services such as electric vehicle charging, cleaning or courier services can be booked during check-in and carried out at the parking spot.
5 Checkout and pickup: Upon returning, the car is requested via smartphone. It is stationed in the pickup zone; payment and service report are conveniently managed digitally.

The sensors involved:
77 GHz radar sensors determine distance, speed and direction of movement of static and moving objects near and far while allowing for virtually seamless 360° ambient recognition. The radar especially measures distances when parking and retrieving the car.
Cameras can use corresponding software to recognize the appropriate car lane and identify objects and signals around the vehicle. This also comprises markings on the ground, which a radar would be incapable of detecting.
SHAKE can, for example, detect even the slightest touch to the vehicle body. The technology covers the immediate vehicle proximity and thus complements existing assistance systems. SHAKE can serve as an emergency stop assistant during valet parking.
Ultrasound works similarly to radar: the sensors emit sound waves and thus measure the distance to objects nearby. They are very popular, for example, in parking aids. Eventually, however, they are likely to be replaced by radars, since those are more powerful and allow for more design freedom.
LIDAR sensors also resemble radars but emit laser beams. LIDAR is characterized in particular by its precise ambient recognition at great distances. For automated parking, it is advantageous for the vehicle to drive longer distances autonomously.
Communication with the environment enables the vehicle to exchange information with other road users or, in the case of valet parking, to receive the site plan of the car park or information regarding free spots.

Experts assume that the number of vehicles with a higher level of automation will increase significantly within the next five to ten years. Implementing all these development stages requires different solutions for perceiving the surroundings and processing data. "The trend towards automated driving is more complex and comprehensive than any other automotive trend before. This makes a clear strategic positioning all the more important," says Carsten Roch, Head of Assisted and Automated Driving at HELLA. "A central direction for us is that we develop individual key components, on the one hand, but always cater our development work toward architectures and the needs of complex functions. This enables us to offer our customers specific use cases. Automated parking is a classic example."

Accordingly, HELLA could conceivably provide an automated valet parking function for parking garages. This would entail the vehicle navigating independently from a drop-off point to the parking space and back. The distance that the vehicle could cover driving autonomously would be a few hundred meters. Realizing that kind of complex functionality requires a plethora of different technologies that complement each other to some degree, such as radar sensors, laser scanners and camera software, structure-borne sound sensors as well as environmental data. Not all sensors come solely from HELLA, but may also have been integrated through development partnerships. Radar sensor development is controlled centrally by the headquarters in Lippstadt.

Radar sensors register moving and stationary objects in the vehicle environment; for example, they measure the size of a parking space, the height of the sidewalk and the speed, distance and direction in which an approaching car or bicyclist is moving. Compared to the ultrasound system used until now, radar has a substantially greater range and the ability to distinguish between objects. In automated driving situations, that is a critical factor. "Radar sensors are already a central topic for automobile manufacturers because many basic driver assistance solutions are created using radar," says Roch. "We are pursuing a platform idea that ensures the required scalability and competitiveness precisely in the sensor segment and enables market penetration into the volume segment."

Shortly after the turn of the millennium, HELLA entered the radar business. A sensor was developed with a 24 GHz transmitting frequency to support traditional rear functions such as blind spot detection or a lane change assistant. It went into production in 2006
and the fourth generation of this sensor rolled off the line in 2017. Today, HELLA is the world market leader in the area of rear applications, with 20 million 24 GHz sensors produced. "In addition, we began developing an even higher-performance radar sensor early on that further advances the higher development stages of automated driving, in particular," adds Roch.

Compared to 24 GHz technology, the 77 GHz radar sensors not only have more than three times the transmitting frequency, but also an available bandwidth that is about five times as large. In other words, while the 24 GHz variant perceives two objects as just one if they are closer than 1.5 meters (4.9 feet) together, the 77 GHz sensor can distinguish two vehicles from one another even when they are only 30 centimeters (11.8 inches) apart. In addition, the sensor enables 360-degree detection of the exterior vehicle surroundings. This seamless detection of the vehicle surroundings is a central element for automated driving functions such as automated parking. Plans are underway to install the first generation in a production model by 2021.

While radar primarily measures distances during parking, the cameras detect what kind of objects are in the immediate surroundings. For example, they can also identify markings on the ground that radar cannot detect. When it comes to associated camera software, HELLA has specialized in applications for the front camera and offers an entirely new business model on the market: an open software system whose functions for light control as well as detection of lanes, traffic signs, pedestrians and objects can be configured by customers based on their needs.
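The jump from 1.5 meters to 30 centimeters tracks directly with the roughly fivefold bandwidth, since a radar's range resolution follows the relation ΔR = c / (2B). A minimal sketch, assuming illustrative bandwidths of 100 MHz and 500 MHz (the article gives only the ratio, not HELLA's actual sensor parameters):

```python
# Radar range resolution: delta_R = c / (2 * B).
# The bandwidth figures below are illustrative assumptions chosen to
# reproduce the separation figures quoted in the text (1.5 m vs 0.3 m);
# they are not HELLA's published specifications.

C = 299_792_458.0  # speed of light in m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Smallest radial separation at which two targets remain distinguishable."""
    return C / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    for label, bw in [("24 GHz sensor (~100 MHz bandwidth)", 100e6),
                      ("77 GHz sensor (~500 MHz bandwidth)", 500e6)]:
        print(f"{label}: {range_resolution(bw):.2f} m")
```

Five times the bandwidth yields one-fifth the minimum separation, which is exactly the ratio between the two generations described above.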
The open software system also includes upgrades with more complex automated driving functions that require considerably more sophisticated image processing methods and are implemented with artificial intelligence methods.

"The camera plays an important role during the automated parking process by critically supplementing the information from the radar sensors," says Oliver Klenke, Director Automated Driving Software at HELLA Aglaia in Berlin. "But its function is also critical for staying in a lane, passing other vehicles and emergency braking in situations such as freeway driving. There is still great potential here as we move closer toward autonomous driving."

However, particularly for simpler development stages of automated parking, there can be situations in which radar, camera or ultrasound can help only under certain circumstances. One example: The car is driving slowly into a narrow garage with only a little wall clearance on both sides. Suddenly a door opens at the back of the garage. A child comes running in and touches the car. Even for the 77 GHz radar sensor, this would be a difficult situation because the person is very close to the vehicle. "Therefore it is important that we have our sensors cover the immediate surroundings of the vehicle, too. The SHAKE sensor performs this function," says Marco Döbrich, who is responsible for the area of sensors at the HELLA location in Bremen. "Its advantage: At the slightest touch, the SHAKE can immediately emit a signal to stop the car. In the valet parking scenario, the SHAKE can function as an emergency stop assistant and is particularly important in the intermediate development stages of automated parking."

In the automotive area, this sensor is an innovation: a piezoelectric sensor that can measure the structure-borne sound on the vehicle's shell. The SHAKE sensor converts even light touches into electric signals and enables them to be measured and recorded. It gives the car the sense of touch. "Due to the sensor's modular design, there are even more application options that we can implement depending on our customers' requirements," says Döbrich. Since it can register every type of damage and localize it on the car, it is ideal for performing functions such as recording the condition of vehicle fleets in real time. This means that even different applications can be implemented as part of shared mobility concepts. At higher levels of autonomous driving, the SHAKE can also monitor the condition of the road surface, such as the level of a water film on a road wet from rain. The sensor will go into series production as a component in an aquaplaning warning system as early as the end of 2018.

"Front camera software plays an important role in automated parking and other functions as well. There is great potential here on the path towards autonomous driving."
Oliver Klenke, Director Automated Driving Software, HELLA Aglaia

"Bringing automated driving to the road requires technologies that complement each other."
Lars-Peter Becker, Program Manager for Automated Driving, HELLA Aglaia

Vehicles with simple driver assistance systems have been a reality for a long time. The degree of automation is increasing continuously. Forecasts indicate that, as early as 2025, two-thirds of all new vehicles will have at least basic assistance functions.

Sensors for detecting the environment, however, are only half the battle. Back to Berlin, where employees in the test workshop at HELLA Aglaia are analyzing data on computers again. The volume of information, which, in the case of valet parking, is also augmented by external information about the parking garage, such as layout plans and configurations, would be largely useless if the car had no central control unit for analyzing the environmental data to draw the right conclusions and maneuver correctly.

"The higher the automation level, the more complex the functions become. At the end of the day, this also increases the demands on data processing. This is why a central computer with immense computing power will act as the hub," says Lars-Peter Becker. "After all, it is software that bundles the sensor data, evaluates it and implements automated driving functions based on this information. As a result, value creation in the automotive industry is gradually shifting. To a large extent, the future of cars is substantially being written by programmers and software developers."
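Becker's description of software as the hub, bundling sensor data and deriving one maneuver from it, can be sketched in miniature. Everything below (the class names, the distance thresholds, the decision rules) is a hypothetical illustration of the principle, not HELLA's actual architecture:

```python
# Hypothetical sketch of a central "hub" that bundles readings from several
# sensors and derives a simple driving decision. Sensor names and thresholds
# are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str        # e.g. "radar", "camera", "shake"
    obstacle_m: float  # distance to nearest detected object in meters
                       # (use float("inf") when nothing is detected)

class FusionHub:
    """Bundles sensor readings and derives one maneuver command."""

    def __init__(self, stop_distance_m: float = 0.5):
        self.stop_distance_m = stop_distance_m
        self.readings: list[Reading] = []

    def ingest(self, reading: Reading) -> None:
        self.readings.append(reading)

    def decide(self) -> str:
        # Be conservative: act on the closest obstacle any sensor reports.
        nearest = min((r.obstacle_m for r in self.readings), default=float("inf"))
        if nearest <= self.stop_distance_m:
            return "emergency_stop"   # e.g. SHAKE registering a touch
        if nearest <= 5.0:
            return "creep"            # slow maneuvering, as when parking
        return "drive"

hub = FusionHub()
hub.ingest(Reading("radar", 12.0))
hub.ingest(Reading("camera", 4.0))
print(hub.decide())  # camera sees an object at 4 m, so the hub slows to "creep"
```

The design point the quote makes is visible even in this toy: the individual sensors only report, while the software layer owns the decision.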
Traffic school in the digital age: For state-of-the-art driver assistance systems to function safely and reliably, they require extensive testing. The basis for this is the hundreds of thousands of miles that have to be driven in the lab and under real-world conditions.
One million kilometers

Do the cameras identify all road signs and objects? Do the sensors detect special weather conditions such as moisture or fog? Do the algorithms estimate the behavior of other road users correctly? Do they draw the correct conclusions? A behind-the-scenes view of software tests at HELLA Aglaia in Berlin.

Welcome to Fabulous Las Vegas: the desert town with its flashy hotels and casinos is on countless people's bucket lists. However, Las Vegas is also a must if you are training self-driving cars. Here, as well as in other regions of the US, unusual marks called Botts' Dots are used to demarcate the lanes. The dots are made of ceramic, metal or plastic. "A human driver recognizes their meaning and purpose immediately and intuitively," Ulrich Kellner says. "Does this also apply to the sensors and software systems that will soon enable vehicles to be self-driving?" Kellner, a test expert at HELLA Aglaia, responds with a smile. "That's exactly what we're trying to find out."

With over 350 employees, HELLA subsidiary Aglaia is one of the world's leading suppliers of intelligent visual sensor systems. HELLA Aglaia develops products including software solutions for sensing the environment and making ambient data usable. Briefly put, it provides vehicles with radar sensors and front cameras that give them the ability to "see" digitally. Aglaia also develops software for energy management. Recently, an additional pillar has become important: HELLA Aglaia is a service provider for original equipment manufacturers; the company provides automobile manufacturers with test infrastructures developed in-house, gathers and uses data, and tests autonomous driving functions, even if components are involved that are not from HELLA.
The number of test miles that automated assistance systems must have verifiably driven in order to be deemed safe differs from case to case. The requirements vary from one original equipment manufacturer to the other because there are no applicable statutory requirements. For some, it is 600,000 kilometers, for others, a million, while others demand even more. Therefore, the solutions from HELLA Aglaia are in great demand. They primarily test front cameras: Do they identify all vehicles, pedestrians, road signs, lane markings and other objects correctly? Do the sensors detect special weather conditions such as frost, moisture or fog? Do the algorithms behind the driver assistance systems estimate the behavior of other road users correctly? Do they draw the correct conclusions?

Aglaia tests a wide range of different traffic scenarios. A few basic functions of assisted and automated driving, such as traffic sign or object detection, can be tested on special test courses. Tests for more complex functions of automated driving, on the other hand, for example those that require advanced methods of artificial intelligence, have to take place in real-world conditions in cities or on country roads. Alternatively, they have to use the data collected there. "One reason is that such tests give us developers a significantly better feel for how
ONE MILLION KILOMETERS 31 Ulrich Kellner walks over to his computer workstation and opens the directory of the latest project in the database management system specially developed by HELLA Aglaia. A data record has a size of 13 gigabytes and represents just under a minute of video data, shot by a camera that creates a life-like representation of its en- vironment. The goal is to pull the visual world into the database, identify elements and filter for those segments that are rele- vant for tests. “However, these data volumes are necessary to test assistance functions to an adequate extent and with the highest standards of reliability,” says Kellner. Moreover, the requirements will become more stringent the more vehicles are equipped with driver assistance solutions, especially considering that these are becoming more elaborate The more elaborate the assistance systems, the more complex the interplay of different sen- sors. This also increases the requirements for us in terms of testing these components and their functions. Tom Lüders, Director of Testing Solutions, HELLA Aglaia and complex all the time. “Every step towards self-driving cars demands new test methods. Earlier vehicles had a single front camera on board,” explains Tom Lüders, Director Testing Solu- tions at HELLA Aglaia. “Soon, the number of cameras will increase to up to 14, supplemented by radar sensors, laser scanners and additional sensors for meas- uring the vehicle environment.” All of these components have to be tested to make sure their functions are working properly. The more elaborate the driver assistance systems become, the more complex the interplay of different sensors will also become. This also increases the requirements for us in terms of testing these components and their functions—especially considering the continuously increasing data protection reg- ulations we comply with in our test methods." 
> T R O P E R L A I C N A N I F Object recognition becomes a challenge especially in rural areas when the assistance systems have to assess mixed situations containing pedestrians, cyclists and other road users. a software program functions in real-world conditions,” says Kellner. When the drivers of HELLA Aglaia drive on real roads, they use cameras to record the environment so that these images can be used for virtual test drives. The systems being tested then call up their functions based on the images that they are fed. The process is called “Capture & Replay”. Once the images are acquired, they can be played back at any time, providing a virtual test drive. This saves money and requires significantly less effort. Only this way can driving hundreds of thousands of test miles be fea- sible. Aglaia can also generate special scenarios in computers for special requirements for which neither real-world roads nor test tracks are sufficient. These tests are based on huge data volumes. Kellner is responsible for acquiring, evaluating and marking them so that they can be found when they are needed. This is done, for example if you want to find out whether a new driver assistance system for object recognition detects the Botts’ Dots in Las Vegas correctly. Kellner, who studied mechanical engineering, began his career at HELLA in 1984. In 2008, he joined Aglaia in Berlin. He uses an example from the analog era to describe his job. “Let’s say you have a huge library with thousands of books. This library is truly useful only if you have arranged and catego- rized all books correctly. Without a well-maintained card catalog system, it would be useless.”
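The "Capture & Replay" principle can be pictured as a small test harness: recorded camera frames are played back into the detection software under test, and its output is scored against hand-annotated ground truth. The sketch below is purely illustrative; the `Frame` format, the `detect_objects` stand-in and the labels are hypothetical assumptions, not HELLA Aglaia's actual tooling.

```python
# Illustrative Capture & Replay harness (hypothetical API, not HELLA Aglaia's tooling).
# Recorded, annotated frames are replayed into a detector; results are scored
# against the ground-truth labels.

from dataclasses import dataclass


@dataclass
class Frame:
    timestamp: float  # seconds since the start of the recording
    labels: set[str]  # hand-annotated ground truth, e.g. {"pedestrian", "stop_sign"}


def detect_objects(frame: Frame) -> set[str]:
    """Stand-in for the system under test; a real harness would call the camera software."""
    # Simulate a detector that misses the Las Vegas Botts' Dots.
    return frame.labels - {"botts_dots"}


def replay(recording: list[Frame]) -> float:
    """Replay a recorded drive and return the fraction of frames detected without error."""
    passed = sum(detect_objects(f) == f.labels for f in recording)
    return passed / len(recording)


recording = [
    Frame(0.0, {"vehicle", "lane_marking"}),
    Frame(0.5, {"pedestrian", "stop_sign"}),
    Frame(1.0, {"vehicle", "botts_dots"}),  # Las Vegas scenario: flagged as a miss
]
print(f"pass rate: {replay(recording):.2f}")  # two of three frames pass
```

Because the recording is fixed data, the same virtual drive can be replayed against every new software version, which is what makes hundreds of thousands of test kilometers affordable.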
Once HELLA Aglaia receives a new test order, the first step for the company is to write the script for the project, in the truest sense of the word. According to customer requirements, the experts draft a route plan that contains all relevant situations and special regional features. This phase shows which scenarios already exist in the database with suitable markings. Which ones still have to be driven on the road? Some situations can be simulated on neutral test courses, while others need real-world conditions. This is the basis for the data required for the respective test, called "ground truth". This is run through the system being tested until the algorithm functions flawlessly. This is how traffic school works in the digital age.

Kellner travels a great deal for work, occasionally as a test driver. How many miles has he driven in his lifetime? "Who knows? And who can count that high?" he says with a laugh. He still encounters surprises and novel situations all the time, though. Recently, he saw a soft-sided semitrailer on the freeway with a large stop sign printed on the tarp along with an advertising slogan. "Of course, I knew that this was advertising. But how does an algorithm react in such a situation? How do we make sure that the software processes this as advertising and not as a street sign?"

The object detection system is even more complex. In addition to the Botts' Dots in Las Vegas and other national idiosyncrasies such as the speed bumps of French roads or Dutch bollards, the system must look out for careless pedestrians, speedy cyclists and vehicles, not to mention deer crossing the road. Each of these living, moving objects not only looks different depending on the
wind, weather and time of day, it also behaves differently. This is further complicated by specific lighting conditions in the various regions. In Las Vegas, the sun is exceptionally high in the sky. In general, the US uses text on traffic signs to a greater extent than other countries. On the Atlantic coast in the south of France, the landscape is characterized by alternating tunnels and sunny roads. Tree-lined avenues in Germany are characterized by alternating light and shadow, while busy pedestrian crossings in Tokyo offer an unusual density of road users, all of them in a hurry.

"Our test solutions have to work internationally," Lüders says. "Our focus is on the core markets of Europe, Asia and the US. However, data is also in demand about roads in South Africa, in the Middle East or in Australia and New Zealand."

Currently, the function tests are concerned with Level 3 of autonomous driving, known as conditional automation. The technology steers, brakes and accelerates. It also monitors the entire surrounding area and takes over driving tasks. However, a human being still sits behind the wheel and must be ready and able to intervene. "The complexity of the tests will take another leap for Levels 4 and 5, when the system has sole responsibility," says Lüders. Then, the machine will have to be verifiably better than any human driver. "Validating this will require far larger data volumes than are used now, as well as new test solutions," Lüders notes.

This scenario is still a futuristic vision. When vehicles drive in highly or fully automated mode, occasional updates for the on-board software will no longer be enough. Updates will take place at shorter intervals, as is the case today for computers and smartphones. The test scenarios for the algorithms have to change accordingly. New driving data will be fed into the test systems continuously and in real time. A type of crowdsourcing principle is conceivable here.
Entire fleets of cars could send their camera data from their daily routes directly to the system, where especially relevant or critical situations are isolated and validated via data mining. This could enable the software components to stay up to date at all times and be downloaded to the vehicles whenever updates are necessary. The vision for the system safety of the future would then be for autonomous vehicles to test their own component software around the clock. In an era in which there is much discussion of artificial intelligence, this is ultimately only a logical step.

"The data volumes are necessary to test assistance functions comprehensively and with the highest standards of reliability."
Ulrich Kellner, Head of Testing Solutions Operations, HELLA Aglaia

A futuristic vision: When vehicles become self-driving, test scenarios also change. Driving data can be collected in real time, possibly by employing a crowdsourcing principle. Entire fleets of vehicles could feed data from their daily routes right into the system. The system then assesses especially relevant situations and continuously updates the assistance systems.
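The crowdsourcing idea sketched above amounts to a filtering step: fleet vehicles stream short driving clips, and a mining rule keeps only the relevant or critical ones for validation. The following sketch is a hypothetical illustration of that filter; the fields, thresholds and criteria are assumptions for demonstration, not an actual fleet protocol.

```python
# Illustrative fleet data-mining filter (hypothetical fields and thresholds).
# From a stream of short driving clips, keep only those showing a critical
# situation, e.g. hard braking or disagreement between sensors.

from dataclasses import dataclass


@dataclass
class Clip:
    vehicle_id: str
    peak_decel_ms2: float        # peak deceleration in the clip (m/s^2)
    sensor_disagreement: bool    # did camera and radar disagree about an object?


def is_critical(clip: Clip, hard_brake_threshold: float = 6.0) -> bool:
    """A clip is worth validating if it shows hard braking or sensor disagreement."""
    return clip.peak_decel_ms2 >= hard_brake_threshold or clip.sensor_disagreement


fleet_stream = [
    Clip("car-001", 2.1, False),  # routine driving: discarded
    Clip("car-002", 7.4, False),  # emergency braking: kept
    Clip("car-003", 1.0, True),   # camera/radar disagreement: kept
]
kept = [c for c in fleet_stream if is_critical(c)]
print([c.vehicle_id for c in kept])  # ['car-002', 'car-003']
```

In the vision described in the article, the kept clips would then be validated, added to the test database and eventually flow back into the vehicles as software updates.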
Clear signals

When the driver becomes a passenger, what is the role for vehicle lighting? As a lighting specialist, HELLA is developing innovative solutions for car body and interior lighting. The company is thus actively co-shaping the trend towards autonomous driving while tapping into new growth opportunities early. The goal: further increasing human safety and comfort.

Will the car of the future still have headlamps? This is the question that John Kuijpers often has to address when talking about tomorrow's mobility. If vehicles will soon drive fully autonomously, perceiving their environment primarily via sensors and cameras, what will be the purpose of traditional vehicle lighting? "Cars will still have headlamps in twenty years, for safety reasons alone," replies the Dutch native without hesitation. "I cannot imagine that fully automated cars will be driving around completely without lights at night. After all, traffic area illumination also serves the camera-based assistance systems. We must furthermore not forget that light is a central design and differentiation feature of vehicles," adds Kuijpers, who at HELLA is responsible for the Interior & Car Body Lighting business. "The trend towards autonomous driving will, however, cause automotive lighting technology to assume entirely new functions, such as communication among road users. Safety and comfort will further increase that way."
Vehicle automation will cause automotive lighting technology to assume completely new functions. Lighting elements can show whether a vehicle is driving autonomously, give other road users warning signs and help to transform the vehicle interior into a living and work space. A trip to the future, together with HELLA.

Light displays the driving mode
This car is driving autonomously: very important information in road traffic. When positioning the lighting elements, psychological factors also play an important role: Where do pedestrians, cyclists or drivers typically look when a car is approaching?
Let's jump to the year 2035. A typical traffic situation: pedestrians, cyclists and cars are simultaneously using the roads of a megacity's suburb. A man with a stroller wants to get to the other side of the road. He waits by a pedestrian crossing; a vehicle is approaching from the left. The car is driving autonomously, immediately visible to everyone outside by the turquoise LED light band above the door frame. The vehicle slows down, and diodes on the vehicle's body start to light up. For the man, this signals that the system has recognized him and the car is stopping. He now enters the pedestrian crossing with the stroller, safely crossing the street.

"Automotive lighting technology will become more important than ever. The new trend towards autonomous driving will cause it to assume entirely new tasks while contributing to more safety and comfort on the road."
John Kuijpers, Head of Interior & Car Body Lighting, HELLA

"In the future, more and more traffic situations will occur where automated vehicles encounter non-automated road users. It is all the more important for road users to be able to communicate with each other," says Kuijpers. "New lighting technologies can play an important role here." HELLA is already working on the respective solutions. In the context of an international research project together with other industry and science partners, the company is, for example, exploring which situations require communication and which lighting technologies can implement it. "After all, communication must function flawlessly not only at night but also during the day when it is bright," explains Kuijpers. "It must furthermore work internationally, be universally intelligible and unambiguous." It will also be key to intelligently integrate new lighting functions into the everyday communication that has emerged over the years, whether it be projections, symbols or colors.
A possible color for signaling an autonomous driving mode must not, for example, compete with other vehicle lamps. This would already exclude the color red, since it is, among other things, reserved for stop lights. A cyan hue, for example, would be conceivable to display a vehicle's driving mode. Other application cases for lighting functions, especially at night and dusk, are also in sight: signs and symbols projected onto the tarmac to reduce risks and provide assistance. A vehicle's light signal could, for example, automatically indicate to other road users that the door is about to open or that the car has broken down due to a defect. In the event of a breakdown, it would also be conceivable to display a safety zone around the vehicle or to project a warning triangle onto the road. The use of such light projections could become a reality within just a few years.
Signals on the road
Clear signals, helpful warnings: lighting modules in the vehicle body could, for example, show that a car has broken down with a defect, similar to a warning triangle, which previously had to be placed by hand.
Back to the year 2035. This time, we're taking a night drive along the US West Coast. The car is driving entirely autonomously, and the two vehicle passengers are doing different things. A woman is working on a presentation in the back of the vehicle. The interior lighting is precisely adjusted to her needs, while the remaining travel time is displayed on an LED on-screen bar next to her. A man sits opposite her, using the drive to relax. In his zone, the lighting is soothing and keeps changing among pleasant colors mimicking a sunset while he scans current headlines on an on-screen bar. As soon as the vehicle approaches the travel destination, the interior automatically becomes brighter. The interior space is bathed in a gentle blue, increasing the travelers' concentration and attention while preparing them for the impending arrival.

"Ambient interior vehicle lighting is already playing an important role today. The trend towards autonomous driving will further accelerate this development, since the vehicle interior can be used in many different ways once the traditional function of the driver's seat is dropped," says John Kuijpers. "The car is thus increasingly turning into a living and work space. Dynamic, personalizable lighting is crucial here."

On the path towards fully automated driving, interior lighting could also assume additional safety and comfort functions. Keyword: Smart Lighting. The interior could, for example, light up red when obstacles or other cars get too close, pointing out the potential danger to the driver. The many sensors necessary for the automated assistance functions, which constantly receive a great amount of ambient data, could serve as the foundation here. Intuitively intelligible lighting signals could make much of this information available to the driver in partially automated driving scenarios. Psychological aspects have to be taken into particular account here: What signals do we perceive as helpful in which situations? Which ones confuse us? "At the core, this is about increasing the well-being and safety of humans thanks to innovative automotive lighting technologies," summarizes Kuijpers. "This has always been HELLA's aspiration, and it will not change in the context of autonomous driving."

A new form of ambiance
When drivers become passengers, they have time for alternative activities. The vehicle interior will thus increasingly become a living and work space: highly functional, dynamic and customizable. Lighting scenarios personalized for individual preferences increase comfort, while displays will additionally provide relevant information to the driver.