Innovation and safety sought in self-driving car guidelines proposed by administration

By Joan Lowy, Tom Krisher and Dee-Ann Durbin
Associated Press

WASHINGTON (AP) - Saying they were doing something no other government has done, Obama administration officials rolled out a plan Tuesday they say will enable automakers to get self-driving cars onto the road without compromising safety.

In drawing up the 112 pages of guidelines, the government tried to be vague enough to allow innovation while making sure that automakers, tech companies and ride-hailing firms put safety first as the cars are developed.

Only time will tell whether the mission was accomplished, but the document generally was praised by businesses and analysts as good guidance in a field that's evolving faster than anyone imagined just a few years ago.

"How do you regulate a complex software system?" asked Timothy Carone, a Notre Dame University professor who has written about the future of automation. "They want to allow innovation, but they want to be very proscriptive in managing the risk side of this. In my mind, they're trying to manage the unknown."

The guidelines from the Department of Transportation's National Highway Traffic Safety Administration don't tell companies specifically how to get to an autonomous car that can safely carry people down the road, leaving a lot to interpretation.

But they tell companies to explain how they'll comply with a 15-point safety assessment before they roll out the cars. The guidelines also make clear that NHTSA will force recalls if software doesn't perform as it should. The agency, for the first time in its history, may even seek authority from Congress to approve technology before it goes on the road.

"We want to be as nimble and flexible as we can be, recognizing that we will never, ever compromise on what we think is safe," Transportation Secretary Anthony Foxx said at a Washington news conference.

Among other things, the safety assessment asks automakers to document how the car detects and avoids objects and pedestrians, how the car is protected against cyberattacks and what sort of backup system is in place in case the computers fail.

Companies with vehicles already on the road, even semi-autonomous ones, will have to submit assessments four months after the government's 60-day comment period ends. Companies developing autonomous and semi-autonomous vehicles will be asked to submit assessments before those cars go on the road.

For now, the assessments are voluntary, but the government intends to make them mandatory after a lengthy rule-making process.

The guidelines come as the government has struggled with how to capitalize on the technology's promised safety benefits - the cars can react faster than people, but don't drink or get distracted - while making sure they are ready for widespread use. Officials hope the guidelines will bring order to what has been a chaotic rollout so far.

The Transportation Department also said it, rather than the states, would be responsible for regulating cars controlled by software. States have historically set the rules for licensing drivers, but Foxx said states should stick to registering the cars and dealing with questions of liability when cars crash while a computer is driving.

The guidelines allow automakers to seek NHTSA exemptions from federal safety standards that might be outdated, such as a rule requiring a steering wheel and brake pedals in a vehicle without a human driver. California currently requires a steering wheel and brake pedals, but NHTSA has the authority to approve vehicles without them if the agency decides they're safe.

The government also wants cars, whether partially or fully self-driving, to collect and share data from crashes and near-misses so companies and the government can learn from the experience. Data isn't currently collected industrywide.

NHTSA made clear that it can use its current recall authority to regulate the new cars. It warned automakers that self-driving cars that still rely on a human driver to intervene in some circumstances must have a means for keeping the driver's attention. If they don't, that "may be defined as an unreasonable risk to safety and subject to recall," the department said.

NHTSA says the warning isn't aimed at electric car maker Tesla Motors. But it would address events like a fatal crash in Florida that occurred while a Tesla Model S was operating on the company's semi-autonomous Autopilot system. The system can brake when it spots obstacles and keep cars in their lanes. But it failed to spot a crossing tractor-trailer and neither the system nor the driver braked. Autopilot allows drivers to take their hands off the steering wheel for short periods.

Tesla has since announced modifications so Autopilot relies more on radar and less on cameras, which it said were blinded by sunlight in the Florida crash. The company has maintained that Autopilot is a driver assist system and said it warns drivers they must be ready to take over at any time.

Some consumer advocates have objected to voluntary guidelines instead of safety rules that are legally enforceable.

"Consumers need more than just guidelines. This new policy comes with a lot of bark, but not enough bite," Marta Tellado, President and CEO of Consumer Reports, said in a statement.

Industry reaction, however, was largely favorable. Former NHTSA Administrator David Strickland, who now represents a coalition that includes Ford, Google, Lyft, Uber and Volvo Cars, said the guidelines are a foundation for testing and deploying autonomous cars. If a manufacturer doesn't follow the guidelines, "it will be open and apparent," he said.

--------

Associated Press writer Justin Pritchard contributed from Los Angeles. Krisher and Durbin reported from Detroit.

Published: Thu, Sep 22, 2016