Tesla Says The SUV Involved In A Deadly Crash Was Driving On Autopilot

"We have never seen this level of damage to a Model X in any other crash," the company said.


A Tesla SUV involved in a fatal crash in Northern California last week was driving with its semi-autonomous Autopilot system engaged moments before it struck a freeway divider, the electric vehicle company said late Friday.

Walter Huang was killed when his Model X crashed on March 23 into a concrete lane divider on Highway 101 in Mountain View, California. Huang's family told KGO-TV that he had repeatedly complained that the car's semi-autonomous system kept veering toward the same barrier.

In a statement, Tesla said the SUV's logs show the driver's hands weren't detected on the wheel in the six seconds before the collision, despite several earlier warnings.

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider," the statement said. "The vehicle logs show that no action was taken."

On Tuesday, the company said a safety barrier that protects vehicles from the concrete divider was missing at the time of the deadly crash.


Tesla posted a photo showing the missing barrier, reportedly taken the day before the collision, next to a Google Street View image of the same location with the barrier in place.

"The reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced," the company said. "We have never seen this level of damage to a Model X in any other crash."

After Huang's Model X crashed into the divider, the SUV was hit by two other vehicles and caught fire.

A Tesla spokesperson declined to comment further and referred BuzzFeed News to the company statements.

The National Transportation Safety Board (NTSB) is investigating the crash.

This isn't the first time questions have been raised about Tesla's Autopilot system.

Last year, the NTSB determined that the likely cause of a fatal 2016 crash in Florida was the driver's overreliance on Autopilot.

The agency said Tesla's Autopilot design "allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings."

As a result of that investigation, the NTSB also made seven safety recommendations aimed at ensuring Autopilot features can only be used in limited circumstances.
