
Tesla Sued by Family of Man They Say Was Killed by Autopilot



“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology.”

Crash Course

The family of a man who died after his Tesla crashed while on Autopilot is suing the EV maker, the Independent reports, accusing CEO Elon Musk of making misleading claims about the driver assistance software.

In February 2023, 31-year-old Genesis Giovanni Mendoza-Martinez died after his Model S smashed into a firetruck on the side of an interstate in San Francisco.

According to the complaint, Mendoza-Martinez made “no accelerator pedal or brake pedal inputs” during the 12 minutes prior to the crash, while the vehicle was on Autopilot.

Mendoza-Martinez’s family argues that he bought the vehicle under the mistaken belief that it could drive itself, echoing long-running criticism that Tesla has overstated its vehicles’ ability to drive themselves.

The complaint singles out pages’ worth of online posts penned by Musk, alleging that he knowingly misled the public even though the software wasn’t, and still isn’t, capable of letting Teslas safely drive themselves.

The carmaker has since shot back, arguing that it was the driver’s “own negligent acts and/or omissions” that led to his death, as quoted by the Independent.

However, attorney Brett Schreiber, who is representing the family, told the newspaper that Tesla was wrongfully using its customers to beta test flawed driver assistance software on public roads, with fatal results.

“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology,” he told the newspaper.

Deep Impact

Tesla is already the subject of several government investigations into the safety of its so-called “self-driving” software.

The crash is part of an active National Highway Traffic Safety Administration investigation dating back to 2021.

The regulator also found earlier this year that drivers using Tesla’s “Full Self-Driving” (FSD) software were lulled into a false sense of security and “were not sufficiently engaged in the driving task.”

According to NBC, there are at least 15 other active cases in which either Autopilot or the EV maker’s misleadingly named FSD software, an optional add-on, was in use leading up to a crash that resulted in deaths or injuries.

The California Department of Motor Vehicles has also filed a lawsuit against the carmaker, accusing it of false advertising regarding FSD.

More on Autopilot: Workers Training Tesla’s Autopilot Say They Were Told to Ignore Road Signs to Avoid Making Cars Drive Like a “Robot”


