The federal government released 116 pages of guidelines for self-driving cars on Tuesday, outlining broad goals and questions companies must answer for regulators on the safety of their technology and how it handles ethical dilemmas.
The guidelines, which are more of a set of recommendations than a rulebook with specific benchmarks, list a 15-point safety assessment and several other expectations: Companies should record and share data on crashes and near-misses, and be prepared to reconstruct them. Vehicles should be programmed to deal with somewhat common road scenarios, such as a police officer directing traffic or a disabled vehicle blocking a lane. And they should include fallback plans for when the technology malfunctions, such as directing the vehicle to a safe place and stopping.
“We believe we have struck the right balance between safety and innovation,” US Department of Transportation Secretary Anthony Foxx said on a call with reporters.
In “several months,” the agency will move toward turning the recommendations into rules, Foxx said at a press conference. The framework comes at a time when companies are racing to put self-driving vehicles on the road. Uber launched a pilot program in Pittsburgh last week, becoming the first company in the US to let people hail rides in self-driving cars, and Google has been testing its autonomous vehicles in several states for years.
The industry has been waiting for the Department of Transportation to release standards for autonomous vehicles, particularly since the federal government opened two investigations into a fatal Tesla crash earlier this year to determine whether the company's semi-autonomous technology played a role. Tesla has called Autopilot, its advanced driver assist system, an incremental step toward self-driving cars. At the same time, Autopilot doesn't fulfill the promise implied by its name: by definition, a technology that drives the car in place of a person.
The new federal guidelines also address "highly automated vehicles," including technology like Tesla's that expects humans to remain on guard to take the wheel at any time. They note that manufacturers should account both for misuse and for the possibility that people will become complacent once technology has taken over some of their duties. (Earlier this month, Tesla said it plans to update Autopilot to put limits on how long people can go hands-free. If drivers don't heed warnings to keep their hands on the wheel, the car will disable Autosteer until it is parked and the system is reengaged.)
Companies already testing vehicles will be given a period of time to send the DOT their responses to the new guidelines and the safety assessment.
Regulators said they won't hesitate to crack down if they find a company is putting unsafe technology on the road. "Our enforcement authority stands strong and it will be used to its full effect as needed," Mark Rosekind, head of the National Highway Traffic Safety Administration, told reporters on a conference call. "We have defect recall authority, and we'll use that to its full effect."
The policy also asks states to write laws that allow for the safe testing of self-driving vehicles, but to otherwise leave these vehicles’ regulation to the federal government. For fully autonomous vehicles, states won’t need to regulate licensing because the software would be the driver.
When asked for comment, Uber directed BuzzFeed News to a statement released by the Self-Driving Coalition For Safer Streets, an industry group that the company is part of. Google did not return a request for comment.
"We support guidance that provides for the standardization of self-driving policies across all 50 states, incentivizes innovation, and supports rapid testing and deployment in the real world," the coalition, which also includes Google, Lyft, and Ford, among others, said in a statement. Joe Okpaku, vice president of government relations at Lyft, which is developing self-driving cars with General Motors, called the guidelines "a step in the right direction" in a statement.