
Tesla found partially liable for a deadly 2019 crash

A jury in Florida has found Tesla partially liable for a 2019 crash involving the company’s Autopilot self-driving feature, The Washington Post reports. As a result, the company will have to pay $43 million in compensatory damages and even more in punitive damages.

Autopilot comes pre-installed on Tesla’s cars and handles things like collision detection and emergency braking. Tesla has mostly avoided taking responsibility for crashes involving cars with Autopilot enabled, but the Florida case played out differently. The jury decided that the self-driving tech enabled driver George McGee to take his eyes off the road and hit a couple, Naibel Benavides Leon and Dillon Angulo, killing one and severely injuring the other.

During the trial, Tesla’s lawyers argued that McGee’s decision to take his eyes off the road to reach for his phone was the cause of the crash, and that Autopilot shouldn’t be considered. The plaintiffs, Angulo and Benavides Leon’s family, argued that the way Tesla and Elon Musk talked about the feature created the illusion that Autopilot was safer than it really was. “My concept was that it would assist me should I have a failure … or should I make a mistake,” McGee said on the stand. “And in that case I feel like it failed me.” The jury ultimately assigned two-thirds of the responsibility to McGee and a third to Tesla, according to NBC News.

When reached for comment, Tesla said it would appeal the decision and gave the following statement:

Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator – which overrode Autopilot – as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver – from day one – admitted and accepted responsibility.

A 2024 National Highway Traffic Safety Administration investigation of Autopilot largely attributed crashes to drivers misusing Tesla’s system rather than to the system itself. Even so, the NHTSA found that Autopilot was overly permissive and “did not adequately ensure that drivers maintained their attention on the driving task,” which lines up with the circumstances of the 2019 Florida crash.

While Autopilot is only one component of Tesla’s larger collection of self-driving features, selling the idea that the company’s cars can safely drive on their own is a key part of its future. Elon Musk has claimed that Full Self-Driving (FSD), the paid upgrade to Autopilot, is “safer than human driving.” Tesla’s Robotaxi service relies on FSD being able to function with little or no supervision, something that produced mixed results in the first few days the service was available.

Update, August 1, 6:05PM ET: This story was updated after publication to include Tesla’s statement.

Source: Ian Carlos Campbell, Engadget
