How Self-Driving Cars Will Sneak Onto Our Roads

The car slammed to a halt on a sunny afternoon. A cloth-and-metal figure of a child wiggled back and forth after its sudden emergence into the car’s path. Inside, two sets of human eyes stared at the puppet kid’s eyes. The car was watching too.

A moment later, the car’s brake pedal lifted from the floor. The Bosch engineer riding in the car, who hadn’t even moved his foot as the vehicle brought itself to the abrupt stop, now stepped on the brake. Now that the car’s autonomous braking system—a chip that used images from a windshield camera—had concluded that the danger of an accident had passed, it was handing control back to the human driver.
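Bosch hasn’t published the algorithm behind the demonstration, but a common textbook formulation gates emergency braking on time-to-collision: the seconds until impact at the current closing speed. The sketch below, in Python with invented thresholds, shows the rough shape of such logic, including the hysteresis that lets a system conclude the danger has passed:

```python
# Illustrative sketch only; not Bosch's algorithm. Thresholds are invented.

BRAKE_TTC_S = 0.9     # hypothetical: brake hard if impact is under 0.9 s away
RELEASE_TTC_S = 3.0   # hypothetical: hand back control once impact is over 3 s away

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:            # obstacle holding pace or pulling away
        return float("inf")
    return range_m / closing_speed_mps

def should_hold_brakes(range_m: float, closing_speed_mps: float,
                       already_braking: bool) -> bool:
    """Return True while the system should keep braking on its own."""
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < BRAKE_TTC_S:
        return True                       # imminent collision: take over
    if already_braking and ttc < RELEASE_TTC_S:
        return True                       # hysteresis: don't release too soon
    return False                          # danger passed: give control back
```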

This handoff of control is an imminent and necessary step on the way to commercializing self-driving cars that regulators and the industry might call “fully autonomous.” (The U.S. National Highway Traffic Safety Administration’s definitions of the five levels of car autonomy are online.) The industry has already accepted the inevitability of autonomy: Next year, the European New Car Assessment Programme (Euro NCAP) will begin including autonomous emergency braking in its safety ratings.

For now, such systems alert, remind, and cajole drivers to do a better job of paying attention and responding to hazards. They are beginning to act on their own in small ways, such as braking, and then returning control to humans. One model will pull over to the side of the road if a driver does not respond. But to achieve full autonomy, cars will need to stop asking for our help. Such a car is not in production yet, but as universities, automakers, and even Google have shown, experimental cars can now handle many driving situations on their own. Autonomous cars will, no doubt, be safer than human-driven cars one day, and the technologies that will make that possible are already emerging from experimental cars and engineering laboratories. Some may already be in your car.

A car that went into production this year can park itself, but as a safety precaution, it requires human involvement. Other cars can nudge the steering wheel to keep you in your lane on the highway. Many cars now track nearby threats in the driver’s blind spots using an array of sensors. Cruise control has grown more sophisticated as cars have begun to track the cars in front of them instead of driving blind at a fixed speed.
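That kind of adaptive cruise control reduces, at its core, to a simple control loop: hold the driver’s set speed unless a tracked lead car would shrink the gap below a chosen time headway. Here is a bare-bones proportional sketch, with illustrative numbers rather than any manufacturer’s tuning:

```python
# A minimal sketch of adaptive cruise control: hold the set speed, but
# slow down to keep a time gap behind a tracked lead car. The gap and
# gain are illustrative, not any carmaker's values.

from typing import Optional

TIME_GAP_S = 1.8   # desired following gap, in seconds of travel
GAIN = 0.5         # proportional gain on the gap error

def acc_target_speed(set_speed_mps: float, own_speed_mps: float,
                     lead_range_m: Optional[float]) -> float:
    """Speed the cruise controller should aim for on this control cycle."""
    if lead_range_m is None:                    # no car tracked ahead: plain cruise
        return set_speed_mps
    desired_range = own_speed_mps * TIME_GAP_S  # gap we want at our current speed
    gap_error = lead_range_m - desired_range    # negative when we are too close
    return min(set_speed_mps, own_speed_mps + GAIN * gap_error)
```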

Each year, manufacturers improve these features, pushing the technologies into one another’s domains. Some cars now use the parking sensors for low-speed obstacle detection during traffic jams, for example. That sort of convergence is how today’s driver assistance systems will turn human drivers into assistants.

Bosch’s demonstration at last year’s Frankfurt Motor Show is part of the car industry’s response to ever-higher safety expectations. In 2011, Americans crashed 5.3 million times, causing 2.2 million injuries and killing more than 32 000 people, reports think tank Rand (PDF). Worldwide that year, 1.3 million people died from road injuries, making it the ninth-leading cause of death, according to a 2011 World Health Organization report.

The Insurance Institute for Highway Safety estimated in 2010 that wider adoption of existing driver assistance systems could cut traffic fatalities by almost a third (PDF). Today’s—and tomorrow’s—driver assistance systems promise to do even better.

Driver assistance systems are an intermediate solution, though. The ultimate safety goal is to prevent distractible, drunk, sleep-deprived humans from getting behind the wheel at all. Our reaction times are just too slow to square with our ambitions to prevent more accidents and squeeze more cars onto the road. Fully autonomous cars, on the other hand, would offer mobility to people who can’t drive themselves, such as the young or disabled. Autonomous cars could compete with public transit in places with low population densities. More-autonomous transportation would help more people participate in the economy, boosting growth, according to the Rand report. Autonomous cars will also drive more efficiently than humans and save gas. If they crash less, they could be built with lighter parts, cutting manufacturing costs and further improving fuel economy. Autonomy will also make it possible to redesign parking: If a car can drop off its passengers and go park elsewhere, cities and shops can move their parking areas farther from prime real estate.

Those are the stakes. And they are not news. Researchers have worked on the idea of autonomous cars for decades, but the sharpest inflection point may have been a series of Defense Advanced Research Projects Agency (DARPA) challenges between 2004 and 2007. The competition for increasing amounts of prize money spurred military contractors, graduate students, and major carmakers to try creative ways of teaching cars to drive themselves.

Today, research and development on autonomous cars has bifurcated: Carmakers and dealerships are handing drivers more so-called assistance systems (often called advanced driver assistance systems, or ADAS), while their research arms and university and corporate partners develop the raw technology of tomorrow’s highly autonomous and fully autonomous vehicles. The headline grabbers may be experimental, but cars that are beginning to make a difference are already in showrooms and driving down streets—or roaring onto the German autobahn.

It’s no surprise that the industry introduces its high-end autonomy technology through its luxury offerings. Luxury buyers can afford the extra cost—about US $3000 for a full package of half a dozen systems for the Mercedes S-Class, for example, perhaps the most autonomous production car today, with a total price of around $100 000. BMW’s 5 Series is a stiff competitor.

Manufacturers are familiarizing customers with the idea of autonomous technology through these types of cars. For example, Mercedes has a traffic-jam assistant, an add-on available in the European S-Class that accelerates and brakes in stop-and-go traffic. Future models of that Mercedes and higher-end competitors will be the first to have other futuristic autonomous features, such as fully automatic parking, obstacle avoidance (which steers a car around an obstacle, rather than just braking), and communicating with nearby vehicles.

Manufacturers will use those cars to gather real-world data on the trade-offs between different kinds of sensors and algorithms for driving the car. Any car you drive may “know” you are about to drift out of your lane, for example, but manufacturers will perhaps differentiate themselves by the way they warn you or how soon they take over. Bosch has already tested its braking system, for instance, with stationary and moving targets during the day. But it and other manufacturers will need to test their systems at night and in all kinds of weather. They will need to decide how to weigh data from radar against data from optical cameras. Then they will need to decide the right moment to take over from the driver and how abrupt to make the braking.

Today’s first-generation driver assistance systems tend to have embedded processors dedicated to a single sensor or set of sensors for achieving a single task, such as obstacle detection and braking. Since many such systems are sold as optional add-ons, they must be independent. Optical cameras used for lane detection might have nothing to do with cameras used for obstacle detection. But as cars such as the S-Class, which combines several driver assistance systems, become more common, carmakers say the next challenge will be to integrate all the sensory data and handle it at a central processor to make better decisions. “You can provide [an advanced cruise control] with just radar and lane-keeping aid with a camera system, and there is a lot of work to do to merge these two tasks,” says engineer Aria Etemad, who works on driver assistance and safety for Volkswagen Group Research, in Wolfsburg, Germany.

Each of those systems looks for very few things: a reflective surface from a car within a narrow range of angle and distance, and a couple of parallel lines. But what about other obstacles or unmarked roads? “The tricky thing is, when it comes to all the situations you haven’t been thinking about…what happens if you are combining rain and nightfall and one pedestrian on the right and one bicycle [rider] on the left and one car in front of you that has one light broken?” says Michael Fausten, the director of Bosch’s autonomous driving team. Handling such situations will require processors that can combine multiple sources of sensor data and make better-than-human judgments.
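The first job of such a central processor is to merge the sensors’ separate detections into one list of obstacles before any decision is made. The toy Python sketch below shows the idea for radar and camera detections; production systems match tracks probabilistically over time rather than by a simple distance test:

```python
# Toy illustration of centralized sensor fusion, not any carmaker's code.
# Matching detections by position alone is a gross simplification.

from dataclasses import dataclass

@dataclass
class Detection:
    x: float            # meters ahead of the car
    y: float            # meters left (+) or right (-) of the car's axis
    confidence: float   # the sensor's own confidence, 0..1

def fuse(radar: list, camera: list, match_radius_m: float = 1.5) -> list:
    """Merge two sensors' detections into a single obstacle list."""
    fused, unmatched = [], list(camera)
    for r in radar:
        match = next((c for c in unmatched
                      if abs(c.x - r.x) < match_radius_m
                      and abs(c.y - r.y) < match_radius_m), None)
        if match:
            unmatched.remove(match)
            # Corroboration: keep radar's range, camera's lateral position,
            # and raise confidence when both sensors agree.
            combined = 1 - (1 - r.confidence) * (1 - match.confidence)
            fused.append(Detection(r.x, match.y, combined))
        else:
            fused.append(r)        # radar-only obstacle (say, at night)
    return fused + unmatched       # camera-only obstacles are kept too
```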

The concept Fausten is talking about, known as sensory fusion, is a medium-term target for top carmakers. But in a building at the Autonomous University of Barcelona, computer scientist Antonio M. López and colleagues are working on its opposite. You could call it “sensory simplification.” They’re trying to guide a car using only optical cameras. To do so, they are trying to see things from their car’s point of view.

The small black car in question is parked in a basement workshop near a computer. Several cameras sit on the car’s roof rack. As López stands near the computer, explaining their work, his colleague David Vázquez and graduate student Sebastian Ramos, a few meters away, squat in front of the car and stand up again, twirl, and cross in front of each other. A pair of red boxes on the computer screen follow their digitized figures. The boxes, which indicate that the car’s computer has identified a pedestrian, sometimes merge into one but never disappear. The young computer scientists manage to confuse the car at times, but they do not escape its computer vision, which they helped to build.
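The Barcelona group hasn’t released the code behind that demo, but OpenCV’s stock pedestrian detector, built on the histogram-of-oriented-gradients (HOG) technique of roughly the same vintage, reproduces the essentials of what those red boxes are doing:

```python
# Approximation of the demo using OpenCV's built-in HOG people detector;
# not the Barcelona group's actual code.

import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def draw_pedestrians(frame):
    """Draw a red box around each detected person in the frame."""
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    return frame

cap = cv2.VideoCapture(0)              # any webcam stands in for the roof rig
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("pedestrians", draw_pedestrians(frame))
    if cv2.waitKey(1) == 27:           # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```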

Optical cameras are cheap, López says, and will soon be ubiquitous even in entry-level new cars. In fact, in the United States, rearview cameras will be mandatory in new cars starting in 2018. Researchers fully expect that autonomous cars of the future will have more-expensive complementary sensors, but until those are ready, López asks, “What’s the most we can get out of optical cameras?”

He and other researchers are tackling the difficult problem of converting two-dimensional video feeds into a three-dimensional local map in real time so that a car can situate itself relative to its environment. In that sense, their work converges with that of the luxury carmakers, which are already struggling to use local sensor information to place the car on a regional or global map. “The biggest challenge is to bring the sensor signals into an environmental model,” said Werner Huber, head of driver assistance and perception at BMW, in an interview with IEEE Spectrum last year. The environmental model, which researchers also call localization, is autonomous driving’s big remaining technical challenge.

There are many ways of tackling it, ranging from enhancing Global Positioning System (GPS) signals with ground-based supplements to the networked creation and maintenance of up-to-the-second three-dimensional maps for the car to carry on board and share with other cars. In any case, a car must distinguish among temporary local obstacles, such as a parked car; objects moving with the flow of traffic, such as a cyclist or another car; and the local, longer-lasting environment, such as the road, a curb, or a building. It must always know how close it is to each of these things so it can decide how to drive among them. Google’s approach involves making a detailed three-dimensional map using laser ranging (lidar) from its experimental car and hand-correcting it afterward, much as humans correct its two-dimensional Web map.
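The taxonomy matters because each category calls for different driving behavior. In skeletal form, with a made-up speed threshold and an assumed flag for whether an object appears in the car’s stored map, the three-way distinction might look like this:

```python
# Schematic only: real systems make this call probabilistically over
# fused sensor tracks. The threshold and map flag here are invented.

def classify_object(speed_mps: float, in_stored_map: bool) -> str:
    """Sort a tracked object into the three categories described above."""
    if in_stored_map:
        return "environment"          # road, curb, building: long-lasting
    if speed_mps < 0.5:
        return "temporary obstacle"   # e.g., a parked car: static, unmapped
    return "traffic"                  # e.g., a cyclist moving with the flow
```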

By putting those human corrections into a machine-learning program, Google may train its software to attack the problem of extracting objects and deciding whether they are dynamic, such as pedestrians or parked cars, or static, such as streets and curbs. López is trying to do the same using a virtual city environment such as you might find in a video game.
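Google hasn’t described that pipeline in detail. One plausible reading, sketched below with invented features and toy data, treats each hand-corrected map object as a labeled training example for an off-the-shelf classifier; scikit-learn’s random forest stands in for whatever Google actually uses:

```python
# Speculative sketch: features, data, and model choice are all invented.

from sklearn.ensemble import RandomForestClassifier

# Per-object features: [height_m, observed_speed_mps, seen_on_repeat_passes]
X = [
    [1.7, 1.2, 0.0],   # pedestrian
    [1.5, 0.0, 0.0],   # parked car (gone on the next mapping pass)
    [0.2, 0.0, 1.0],   # curb
    [8.0, 0.0, 1.0],   # building facade
]
y = ["dynamic", "dynamic", "static", "static"]  # from human map corrections

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[1.6, 0.0, 0.0]]))  # a parked-car-like object: "dynamic"
```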

Which mix of sensors to use, and how best to train software to handle real-world traffic with them, remain open questions. “Since it’s not clear what is the best, [the industry must] have them all in parallel,” López says.

So what is now in showrooms is a mixture of more or less advanced driver assistance systems, many of them sold as optional add-ons. Some cars, such as the Mercedes S-Class, package several technically separate systems that, on highways at least, can give drivers the sense of having a digital chauffeur.

In a more proletarian-priced Ford C-Max in Madrid, I got a glimpse of how self-driving cars will arrive: They will sneak up on us a little at a time. In light traffic just outside Ford’s Spanish headquarters, the company’s press fleet manager, Eusebio Ruiz, took his hands off the wheel without telling me. A camera in front of the rearview mirror was following the curve in the road, and a processor concluded that Ruiz was being irresponsible. It commanded the steering wheel to make a correction so smooth I did not notice it, though I did hear the wheel vibrate and saw the dashboard light up as the car drifted back into the lane. When Ruiz refused to return his hands to the wheel, the car made another attempt to alert him, but it did not attempt a second consecutive correction.

That sort of decision, to require constant attention on the part of the driver, will remain a feature of ADAS for some time, say people in the industry. There was no technical reason for the car not to return us to our lane over and over as we headed out of town, but Ford and others are very cautious about taking away so much control that a driver’s attention might wander too far to return to the car when it is needed.
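Ford hasn’t published that policy, but the behavior Ruiz demonstrated implies a simple state machine: warn on every drift, steer the car back at most once while the driver’s hands are off the wheel, and after that only escalate the alerts. A sketch of that inference:

```python
# Inferred from the behavior described above; not Ford's actual logic.

class LaneKeeper:
    """Allow one hands-off correction, then warnings only."""

    def __init__(self):
        self.corrected_hands_off = False

    def on_drift(self, hands_on_wheel: bool) -> str:
        if hands_on_wheel:
            self.corrected_hands_off = False   # driver is back in the loop
            return "vibrate wheel"             # warn; let the human steer
        if not self.corrected_hands_off:
            self.corrected_hands_off = True
            return "vibrate wheel + steer back into lane"
        return "vibrate wheel + escalate dashboard alert"
```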

For one thing, the international Vienna Convention on Road Traffic requires the driver to be in control of the vehicle. Whether a driver can farm out “control” to a semiautonomous system such as lane keeping is up to lawyers, legislators, and regulators. Carmakers are already restricting some advanced driver assistance systems to Europe because of liability concerns in the United States. Insurance companies are studying the risks, and several U.S. states have already passed preliminary legislation for handling experimental autonomous cars. In March, California’s Department of Motor Vehicles held a hearing to discuss rules for privately operated self-driving cars. The rules will be drafted around June and must go into force by the end of this year, but even those closest to the problem of how to allocate autonomous cars’ risks are unsure how to proceed.

“There is a large agreement in the industry that this line needs to be clarified,” says Harald Barth, a driver assistance marketing manager at Valeo in Bietigheim-Bissingen, Germany. An IHS study published earlier this year noted that regulators may end up taking the same approach as the engineers training car software to drive: They may simply test self-driven cars on the road. “It may also make sense for self-driving-only cars to get a driver’s license, where each car model has to run through and pass a large battery of driving tests, emergency situations, and unexpected events,” the authors wrote.

That may give away another clue as to how people will respond to self-driving cars: They will anthropomorphize them. As Ruiz talked me through the other driver assistance systems, from advanced cruise control to parking assistance, he referred to them as “the guy.” As in “the guy” does this and then “the guy” does that. The pieces of “the guy” are starting to fall into place. Now it is up to drivers to decide how to welcome him.

First published by IEEE Spectrum as the main feature of a five-part online special report:

Feature: How Self-Driving Cars Will Sneak Onto Our Roads [pdf]
Related story 1: Adaptive Cruise Control and Traffic-Jam Assistants [pdf]
Related story 2: Self-Parking [pdf]
Related story 3: Lane Keeping [pdf]
Related story 4: Autonomous Emergency Braking [pdf]