Debate Over Tesla Crash Raises Questions About Company's Autopilot Feature

Story by Trinity Torgerson (News Writer)

The National Transportation Safety Board met on February 25th in Washington, D.C. to discuss its ruling on a 2018 crash involving a Tesla Model X on Autopilot, which veered into a center divider, causing the death of driver Walter Huang. Two years later, the NTSB is still weighing how Autopilot should be regulated because of the many possible contributing factors to the crash.

At the time of the crash, Walter Huang was driving the Tesla Model X with Autopilot engaged. He was on US-101 when the car lost sight of the lane markings and began to steer into the "gore area" separating the highway from an exit lane. The car struck a concrete barrier, and Huang later died at a nearby hospital. Investigators found that this was not the first time the system had made this maneuver at that location.

Investigators suspected that Huang was distracted by a game on his cellphone and was not looking at the road when the crash happened. "In this crash, we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss," said NTSB chairman Robert Sumwalt. This suggests the crash was not entirely Autopilot's fault, since the driver was playing a video game on a company phone. The investigation also found that the crash attenuator on the barrier had been damaged before the crash and had not been repaired by California's transportation department. Investigators say that if the barrier had been repaired, Huang could have survived the impact. Tesla also requires drivers to keep their hands on the wheel and stay alert at all times; when a driver's hands leave the wheel, the car issues visual and auditory alerts telling them to return their hands to the steering wheel.

On the other side, many people believe Walter Huang's death came down to his reliance on Autopilot to drive the car and his excessive trust in its capabilities. Several possible flaws in Autopilot have been examined, though none has been confirmed by the company or investigators. Critics argue, for example, that the sensors cannot be fully trusted, that more safeguards are needed, and that the cameras are inadequate. Because the evidence points in contradictory directions about the official cause of the crash, the ruling will be a difficult one to make.

Tesla is continuing to work on improving its Autopilot technology, and says it is committed to taking the right steps to keep its drivers safe. If Tesla decides not to add new safeguard technology, investigators suspect the crash rate will stay the same rather than improve, which would not reflect well on the company as the ruling proceeds. The next steps Tesla takes could determine whether Autopilot remains available to users in the future.
