BuzzFeed News

Here’s What Cops Have To Say When Teslas On Autopilot Crash

In the last week, two Tesla drivers who’ve crashed or lost control of their vehicles blamed Autopilot technology.

Posted on January 25, 2018, at 6:12 p.m. ET

Three road incidents in the last week have made it clear that local law enforcement agencies will have to deal with Teslas and their drivers on an increasingly regular basis. Cops and firefighters took to social media after each incident to make fun of the errant Tesla drivers, but they made it clear that regardless of driver assist software, they plan to treat Tesla drivers like anyone else on the road.

After his Tesla ended up in a creek bed in Morgan Hill, about 70 miles south of San Francisco, one inebriated driver blamed a deer for the accident. The Morgan Hill Police Department mocked him on Facebook, saying the driver “thought they'd finally found the car capable of driving underwater.....turns out they hadn't.”

But in the other two incidents, only one of which involved a collision, the drivers blamed another phantom — Tesla’s Autopilot.

After police questioned one driver who had stopped on the Bay Bridge between San Francisco and Oakland and was literally asleep at the wheel, he claimed his car’s Autopilot software was on at the time. The other driver, in Culver City, California, slammed his Tesla into the back of a fire truck while going 65 miles per hour. That driver also blamed the incident on Autopilot.

Local law enforcement was quick to take to social media to mock these drivers, too.

The Culver City Fire Department shared this on Twitter:

While working a freeway accident this morning, Engine 42 was struck by a #Tesla traveling at 65 mph. The driver re… https://t.co/7mTcihSGUK

And the California Highway Patrol joked that, after they found a drunk driver passed out in his car, the Tesla “didn’t drive itself to the tow yard.”

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP… https://t.co/T0njUNBgM5

But — beyond roasting them on social media — what are local law enforcement officials supposed to do when law-breaking drivers claim the car was in control?

The California Highway Patrol is still investigating what caused the crash in Culver City, according to an email from Officer Mike Martis Jr. But the agency told BuzzFeed News it has no special advice for drivers of semiautonomous cars, except a reminder that people driving these cars are ultimately responsible for what happens on the road.

“As changes and advances continue, it is important to keep in mind, whether the driving operations are performed by a person or with the assistance of technology, the driver/operator is still responsible for the safe operation of the vehicle at all times and is required to abide by all existing rules of the road,” Martis told BuzzFeed News.

The National Transportation Safety Board and the National Highway Traffic Safety Administration have each dispatched investigators to Culver City to explore the cause of the crash.

Meanwhile, Officer Rueca with the San Francisco Police Department said there are no laws on the books excusing drivers of semiautonomous or driver-assisted vehicles from bad behavior.

“The person is still supposed to be in control of the vehicle. They’re still considered driving,” he said. “Even though this technology is out there, there aren’t any changes in the law.” In the event of an accident, Rueca said, a crash involving a Tesla would be investigated the same way as one involving any other car.

The three recent incidents are not the first of their kind. There have been a handful of accidents where drivers blamed Autopilot over the years — in Dallas, in California, and in Florida.

In 2016, a Tesla driver was killed while operating a vehicle equipped with Autopilot. The National Highway Traffic Safety Administration investigated, and not only exonerated Tesla, but found that the company had tried to prevent customers from becoming distracted and over-relying on Autopilot, according to the Verge. “It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse,” the report said.

Tesla told the Washington Post, “Autopilot is intended for use only with a fully attentive driver.”

But in August, Adrian Lund, the president of the Insurance Institute for Highway Safety, told Bloomberg that there’s real concern over how semiautonomous vehicles make it easier for drivers to zone out. “Everything we do that makes the driving task a little easier means that people are going to pay a little bit less attention when they’re driving,” he said at the time.

Even though carmakers have made it clear that drivers still need to be vigilant behind the wheel of semiautonomous cars, and driver-assist software tends to reduce accidents overall, it’s possible that the existence of the software is making drivers less alert.

Exactly how fully autonomous vehicles on the road in California will be regulated is still being hashed out. But already, Google’s self-driving operation Waymo is working with local law enforcement on how to “recognize and then access” a self-driving car following a collision, and communicate with Waymo about the incident.

For the semiautonomous cars that are already on the road, their drivers — including all owners of brand-new Model 3s — should continue to follow the existing rules of the road.
