
Tesla Blames Young Man For Crash That Tragically Ended His Life While In ‘Autopilot’ Mode

The parents of a Tesla driver who was crushed to death in a horrifying accident have filed a lawsuit against the electric car manufacturer, blaming its CEO, Elon Musk, for trumpeting misleading claims about the car’s self-driving features.

Genesis Giovanni Mendoza-Martinez, 31, tragically lost his life on February 18, 2023, after his Model S rammed into a fire truck near San Francisco, according to a lawsuit filed by his parents, Eduardo and Maria.

Highlights
  • The bereaved parents of a Tesla driver have blamed the electric carmaker for the fatal crash that claimed their son's life.
  • Genesis Giovanni Mendoza-Martinez, 31, tragically passed away after his Model S rammed into a fire truck near San Francisco.
  • “The time is coming due for Tesla to be held to account,” attorney Brett Schreiber, who is representing the Mendoza family, told Bored Panda.

Genesis was behind the wheel and suffered fatal injuries, while his brother, Caleb, survived the incident with non-life-threatening injuries. Four firefighters also sustained minor injuries in the collision.

“The time is coming due for Tesla to be held to account,” attorney Brett Schreiber, who is representing the Mendoza family, told Bored Panda.


    Tesla and its CEO, Elon Musk, are being blamed for the death of a 31-year-old man in a lawsuit

    Image credits: Chesnot/Getty Images

    Image credits: Maxim

    The family members of Genesis are currently suing Tesla and pointing fingers at Elon for misleading claims about the car’s self-driving technology. They believe the vehicles are not ready for the road, contrary to the company’s bold declarations.


    Genesis’ parents said their son was under the impression that the car could drive itself and was using ‘Autopilot’ mode when the crash took place.

    “Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter, Tesla’s official blog, or in the news media,” read the complaint.

    “Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”

    Genesis Giovanni Mendoza-Martinez was killed in a crash involving his Tesla Model S and a fire truck near San Francisco on February 18, 2023

    Image credits: ContraCostaFire


    “Based on representations Giovanni heard made by Musk, Giovanni believed the vehicle was a safer driver than a human and relied on it to perceive and react to traffic in front of him,” the complaint added.


    The emergency truck involved in the crash had arrived on the freeway in response to a previous accident. The vehicle had its lights on and was parked diagonally when Genesis rammed into it.

    According to the lawsuit, Tesla’s Autopilot misinterpreted the emergency firetrucks and police cruisers at the scene; they appeared as “single frames in the vision system that were either very dark or very bright,” thus rendering the technology incapable of reacting appropriately.

    The family claimed that this failure reflects flaws in the technology—flaws Tesla allegedly knew about but failed to address.

    The National Highway Traffic Safety Administration (NHTSA) has been investigating 16 crashes involving Teslas in Autopilot mode colliding with emergency vehicles over the last six years. These 16 crashes have resulted in at least 15 injuries and one death, according to CBS News.

    The deceased crash victim’s parents filed a lawsuit against Tesla and its CEO, accusing them of touting misleading claims about the car’s self-driving capabilities

    Image credits: Dmitry Novikov

    The accident that claimed Genesis’ life raised critical questions about the reliability of autonomous systems in real-world scenarios.


    “In principle automatic controls will always do a better job than a human operator because they don’t get tired or distracted, have very fast reactions and have as many eyes and other sensors as you wish,” Dr. William told Bored Panda via email.

    “With all technologies, standards have to be developed and products will be tested for compliance. This is mature in the aircraft industry, but at an early stage in the automotive industry. It is unfortunate, but inevitable, that accidents will happen, but it is important to learn from them,” added the professor from the University of British Columbia’s Department of Electrical and Computer Engineering.

    Attorney Brett Schreiber said Genesis’ death and the injuries sustained by the victims could have been prevented.

    “Tesla knew that this generation of auto pilot technology could not decipher emergency vehicles’ flashing lights. Rather than taking the responsible step of actually recalling these vehicles Tesla simply pushed an over the air download that left thousands of vehicles vulnerable to the same defect,” he told Bored Panda. “That is how Tesla set the stage for Genesis Mendoza’s preventable death and the injuries to several innocent first responders.”

    Tesla claimed that the crash might have been caused “in whole or in part” by the driver’s “own negligent acts and/or omissions”


    Image credits: Jonas Leupe

    “Like so many, Mr. Mendoza believed the misrepresentations, half truths and lies of Tesla about what its auto pilot technology could do,” he added. “Sadly, he suffered the ultimate price, his brother Caleb was seriously injured and their parents suffered a loss no one ever should. The time is coming due for Tesla to be held to account.”

    On the other hand, Tesla has maintained that their vehicles have “a reasonably safe design as measured by the appropriate test under the applicable state law.”

    “[N]o additional warnings would have, or could have prevented the alleged incident, the injuries, losses and damages alleged,” the company said in response to the family’s lawsuit.

    They argued that the “damages” and “injuries” suffered by the two brothers, “if any, were caused by misuse or improper maintenance of the subject product in a manner not reasonably foreseeable to Tesla.”

    The family claimed Elon’s statements about the Autopilot and self-driving features led Genesis to trust the car could safely drive itself


    Image credits: Roberto Nickson


    In the Mendoza family’s lawsuit, they also accuse Tesla of knowing that their cars couldn’t live up to the hype created by Elon, who said in 2014: “I’m confident that — in less than a year — you’ll be able to go from highway onramp to highway exit without touching any controls.”

    In 2016, he also claimed that the Autopilot feature was “probably better” than a human driver.

    The family further accused Tesla of undertaking “a widespread campaign to conceal thousands of consumer reports about problems with [its] ‘Autopilot’ feature, including crashes, unintended braking, and unintended acceleration.”

    The lawsuit stated that Tesla forced customers to sign nondisclosure agreements to receive repairs under warranty.

    The company received “thousands of customer reports regarding problems with Tesla’s ‘Autopilot’ system between 2015 and 2022, including over 1,000 crashes; over 1,500 complaints about sudden, unintentional braking; and 2,400 complaints about sudden acceleration,” the complaint stated.

    NHTSA has been investigating 16 similar crashes in which Teslas in Autopilot mode collided with emergency vehicles, resulting in 15 injuries and one death over six years

    Image credits: Justin Sullivan/Getty Images


    As the demand for electric vehicles increases across the globe, Dr. William asserted the importance of having drivers trained in emerging technologies.

    “One problem with driver assistance technology is that the user is sometimes not trained in how to use it. I believe that even with old fashioned cruise control some people have assumed that steering as well as speed is controlled,” he told Bored Panda.

    “Certainly systems that need some human input should try to check that the human is not doing something inappropriate like being asleep. Some cars do this by analysing the steering,” he added. “In trains the driver is typically required to respond to a stimulus every few minutes.”

    “Some years ago I saw a presentation on automatic driving. A worst case example was given where a child ran after a ball from one side at the same time as someone lost control of a wheelchair on the other,” he went on to say. “It was inevitable that someone would die and the question was what the appropriate reaction should be. Certainly the automatic system would react faster than a human operator.”

    One social media user found it “amazing” that “no one wants to hold Elon responsible when most of his vehicles have major flaws”

    Binitha Jacob
    Writer, BoredPanda staff

    Working as a writer for Bored Panda offers an added layer of excitement. By afternoon, I'm fully immersed in the whirlwind of celebrity drama, and by evening, I'm navigating the bustling universe of likes, shares, and clicks. This role not only allows me to delve into the fascinating world of pop culture but also lets me do what I love: weave words together and tell other people's captivating stories to the world.

    Donata Leskauskaite
    Author, BoredPanda staff

    Hey there! I'm a Visual Editor on the News team. My responsibility is to ensure that you can read the story not just through text, but also through photos. I get to work with a variety of topics ranging from celebrity drama to mind-blowing NASA cosmic news. And let me tell you, that's what makes this job an absolute blast! Outside of work, you can find me sweating it out in dance classes or unleashing my creativity by drawing and creating digital paintings of the different characters that live in my head. I also love spending time outdoors and playing board games with my friends.

    Featherytoad
    Community Member
    1 month ago

    You know this feature has always been a disaster waiting to happen. There are always going to be those idiots who think you can put it on autopilot and then crawl in the back seat and take a nap. There is so much c**p on cars nowadays that doesn't need to be, especially touch screens for everything.

    Tams21
    Community Member
    1 month ago

    So they have to put a warning on coffee because it's hot, but they expect people to understand that "full self driving" doesn't actually mean a car can be relied on to drive by itself without having an accident. That doesn't make sense.

    WindySwede
    Community Member
    1 month ago

    Are you referring to the lady that burned her vulva on boiling coffee?

    Donna Peluda
    Community Member
    1 month ago

    As we say in the UK, "six of one and half a dozen of the other." The problem is that Tesla is using the roads and its clients as its test lab. They should put trained drivers in the seat, not hyped-up Musk fanboys.

    iseefractals
    Community Member
    1 month ago

    The problem is that dumb people are doing stupid things. Musk, the website for the feature, the manual that comes with the car, and the infotainment system all make very clear that Autopilot is an ASSISTIVE FEATURE, requiring users to remain engaged: hands on wheel, eyes on road, butt in seat, and ready to take over at ALL TIMES; and once that human driver takes over, any assistive features stop. Right away. It's not that difficult to grasp. The only way those features improve is with an abundance of real-world data, not only of how the driver in the Tesla acts and reacts, but of the actions of every other driver on the road who isn't outfitted with LIDAR and a supercomputer. If you are sitting behind the wheel and see the car speeding towards anything that could result in the death of you or others and you just stare slack-jawed while thinking "dur... but autopilot," you are at fault.