
Tesla Sued by Family of Man They Say Was Killed by Autopilot



“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology.”

Crash Course

The family of a man who died after his Tesla crashed while operating on Autopilot is suing the EV maker, the Independent reports, accusing CEO Elon Musk of making misleading claims about the driver assistance software.

In February 2023, 31-year-old Genesis Giovanni Mendoza-Martinez died after his Model S smashed into a firetruck on the side of an interstate in San Francisco.

According to the complaint, Mendoza-Martinez made “no accelerator pedal or brake pedal inputs” during the 12 minutes prior to the crash, while the vehicle was on Autopilot.

Mendoza-Martinez’s family argues that he bought the vehicle under the mistaken belief that it could drive itself, echoing longstanding criticism that Tesla has overstated its vehicles’ autonomous capabilities.

The complaint singles out pages of online posts penned by Musk, alleging that he misled the public even though he knew the software was not, and still is not, capable of letting Teslas safely drive themselves.

The carmaker has since shot back, arguing that it was the driver’s “own negligent acts and/or omissions” that led to his death, as quoted by the Independent.

However, attorney Brett Schreiber, who is representing the family, told the newspaper that Tesla was wrongfully using its customers to beta test flawed driver assistance software on public roads, with fatal results.

“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology,” Schreiber said.

Deep Impact

Tesla is already the subject of several government investigations into the safety of its so-called “self-driving” software.

Mendoza-Martinez’s crash is part of an active National Highway Traffic Safety Administration (NHTSA) investigation dating back to 2021.

The regulator also found earlier this year that drivers using the company’s “Full Self-Driving” (FSD) software were lulled into a false sense of security and “were not sufficiently engaged in the driving task.”

According to NBC, there are at least 15 other similar active cases in which either Autopilot or FSD, an optional add-on, was in use leading up to a crash that resulted in deaths or injuries.

The California Department of Motor Vehicles has also filed a lawsuit against the carmaker, accusing it of false advertising regarding FSD.

More on Autopilot: Workers Training Tesla’s Autopilot Say They Were Told to Ignore Road Signs to Avoid Making Cars Drive Like a “Robot”


