
Jury Says Tesla Is Liable In Accident Involving Autopilot

Autopilot Tesla Interior

After successful defenses in two other trials involving Autopilot-related crashes, Tesla was found partially liable for a 2019 incident that killed one person and seriously injured another. In Miami federal court yesterday, a jury deliberated for mere hours before awarding $243 million to the victims’ families.

Tesla is no stranger to litigation. After all, there’s a dedicated Wikipedia page covering the EV automaker’s extensive list of past and current lawsuits. Following the Miami ruling, however, that archive is growing.


The jury decision is compelling on multiple levels. Not only is the award a significant amount, but it also signals that Tesla is not automatically absolved of responsibility for what its vehicle technology does or doesn’t do, regardless of driver involvement. Bloomberg reports:

A jury in Miami federal court found Friday that Tesla was 33% to blame for the collision. A Tesla Model S ran a stop sign at a T intersection in the Florida Keys and rammed into the couple’s parked Chevrolet Tahoe while they were standing next to it.

Jurors issued their verdict after less than a day of deliberations following a three-week trial. The jury determined that the Tesla S driver was primarily responsible for the crash and that Tesla should pay $42.5 million to compensate the victims for their losses. The panel also ordered Tesla to pay $200 million in punitive damages, but the company said it expects that figure to be reduced by the court.

The driver of the Model S was also at fault, and acknowledged as much. According to Bloomberg, the man told the court that he had engaged Autopilot while commuting. During that time, he dropped his cellphone while on a call and attempted to retrieve it. When he realized the vehicle had left the roadway, he “jammed on the brakes.” By then, it was too late.

In a separate lawsuit, Bloomberg reports that he reached a “confidential settlement” with the family of the woman who was killed. During this trial, the man testified that he gave Autopilot too much credit with regard to driver assistance. As recounted by the Associated Press, the driver said:


“I trusted the technology too much…I believed that if the car saw something in front of it, it would provide a warning and apply the brakes.”

Pointing to this admitted negligence, Tesla pinned the blame entirely on the driver. The victims’ families said otherwise, accusing the automaker of withholding evidence. Per the AP:

The case also included startling charges by lawyers for the [families]…They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there…

Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up.

Additionally, the plaintiffs’ attorney said the Autopilot name itself is deceiving and leads drivers to believe the technology is more capable than it really is. That, he argued, is exactly why the driver thought searching for his lost phone wasn’t a big deal.

To its credit, Tesla’s ADAS and other safety technology have improved since the 2019 crash. Yet Tesla remains in the hot seat with NHTSA. The federal agency has opened several investigations into the company, covering its self-driving claims and its remote car control feature, and these regulatory probes span millions of vehicles. Tesla plans to appeal the Miami judgment, of course. From Bloomberg:

“Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology,” Tesla said in a statement. “We plan to appeal given the substantial errors of law and irregularities at trial.”

Should Tesla lose its appeal, the automaker doesn’t believe it will actually pay the full amount awarded, thanks to a pre-trial agreement. The AP reports that the families feel otherwise:

Even if that fails, the company says it will end up paying far less than what the jury decided because of a pre-trial agreement that limits punitive damages to three times Tesla’s compensatory damages. Translation: $172 million, not $243 million. But the plaintiff says their deal was based on a multiple of all compensatory damages, not just Tesla’s, and the figure the jury awarded is the one the company will have to pay.

Money isn’t something Tesla or its CEO is particularly short of these days. If anything were to put a definitive dent in Tesla’s cash pile, however, it could be the failure to regain public trust. Its valuation is based, in large part, on its ability to usher in a new era of driverless cars (e.g., Robotaxi). This is just another concern to mount on top of the federal investigations, lack of transparency, and months-long sales slump.


Top image: Tesla

94 Comments
Lotsofchops
Member
Lotsofchops
2 months ago

I remember being chastised years ago for complaining that the Autopilot name was misleading. The top-voted reply was that autopilot in airplanes doesn’t do everything for the pilot, and I asked how many of the general population are pilots who would know that distinction; apparently that was “stupid” of me.
That was probably near peak Tesla/Musk ass-kissing, so I’m not surprised.

Last edited 2 months ago by Lotsofchops
Ppnw
Member
Ppnw
2 months ago
Reply to  Lotsofchops

Yeah I agree on the public perception. “Autopilot” is actually the perfect name for the system as it exists, but to your point, nobody knows that’s how actual autopilot works.

Lotsofchops
Member
Lotsofchops
2 months ago
Reply to  Ppnw

And that’s OBE now that they’ve removed all ambiguity and just straight-up call it Full Self-Driving; but no, it can’t fully drive itself, silly! Why would you think that?

RustyJunkyardClassicFanatic
Member
RustyJunkyardClassicFanatic
2 months ago

Fuck Tesla and NAZI Melon Husk…they are mass murderers to me and he should be in prison

Ham On Five
Member
Ham On Five
2 months ago

Am I the only one repeatedly ‘misreading’ the headline?
Judy Says Tesla Is Liable

79 Burb-man
Member
79 Burb-man
2 months ago

It’s horrible when someone’s death is fatal.

Professor Chorls
Professor Chorls
2 months ago
Reply to  79 Burb-man

RIP in peperoncini 🙁

I_drive_a_truck
Member
I_drive_a_truck
2 months ago

This is a step in the right direction. Tesla has always played it fast and loose with its marketing and the use case for the product. I wouldn’t be surprised to see an appeals court significantly modify this judgment, but it’s nice to see things starting to come around on Tesla.

Personally, I’m still waiting for the first lawsuit to drop against a body shop that hasn’t recalibrated a vehicle properly after an accident. Hoo boy. Being involved with that component of repair is enlightening and terrifying at the same time. If your car has ever had a front collision repair or a windscreen replacement on a vehicle with ADAS, unless it was done by a dealer, the likelihood (90%+) is that it’s not working properly at all anymore.

Shooting Brake
Member
Shooting Brake
2 months ago

Glad they are finally being held accountable for something

Timbales
Timbales
2 months ago

I agree with this verdict.

I do have a question, though. Why was the driver physically handling his phone during a call? He’s driving an $80,000+ vehicle that is (allegedly) sophisticated, and he doesn’t trust the onboard systems to handle a phone call hands-free? Like any modern car can?

Jason Snooks
Jason Snooks
2 months ago
Reply to  Timbales

I doubt he was on a call, he was probably scrolling Facebook or hotties on TikTok.

William Domer
Member
William Domer
2 months ago

And let’s not forget that Tesla/Musk does the same shit that the orange twit does: not paying vendors and sending them into bankruptcy. I really, really abhor Musk and, by proxy, anything that shit owns.

MazdaLove
MazdaLove
2 months ago

Simple solution: Insurance company could void coverage while “autopilot” is engaged. Or refuse to insure the car, full stop.

While I am no fan of Tesla, and the verdict feels great, I just can’t shake the feeling that the ultimate responsibility should be with the person who willingly chooses to buy and use a known deceptive product to endanger everyone around them.

Last edited 2 months ago by MazdaLove
Gubbin
Member
Gubbin
2 months ago
Reply to  MazdaLove

I think Tesla is the main insurer for most Tesla cars – other companies won’t touch them.

Jeff Elliott
Jeff Elliott
2 months ago
Reply to  MazdaLove

A customer bought a new car and paid extra for “full self driving”. The real problem is that Tesla is selling a product that does not do what they claim it does and it’s killing people.

I know it seems like everyone knows that it can’t actually drive itself, but there are a lot of people that don’t know that and it’s because Tesla lied to them and needs to be held responsible.

CarEsq
Member
CarEsq
2 months ago
Reply to  Jeff Elliott

You have the world’s richest man hawking these cars, claiming they’re FSD, and he has been doing so from the cyber rooftops for years, so it’s no wonder people are fooled. Also, the prior cases have been (IIRC) ones where the estates of deceased Tesla drivers brought the case. Here, the guy in the Tesla is saying “I’m responsible, but Tesla lied to me about FSD,” so it’s a bit of a different case, hearing from the actual consumer.

Christocyclist
Christocyclist
2 months ago

I have a friend who bought a Model S Plaid. He knows nothing about cars. He was excited that “it drives itself- it has Autopilot”! I explained to him that, “no, it doesn’t” and that is because Elon is a fucktard.

Hugh Crawford
Member
Hugh Crawford
2 months ago

The verdict does not absolve the driver of responsibility, despite what some of the commenters seem to believe.

Tesla took the responsibility for the public’s safety and failed.

It’s not even the trolley problem; it’s running a stop sign and hitting a parked car.

What hubris compelled them to do that is another topic, but that’s what they did, and they are suffering the consequences of their actions.

Hugh Crawford
Member
Hugh Crawford
2 months ago

Tesla claimed that the car could drive itself. Or that it was fully self driving.
Somebody believed it, because it seemed to work until it didn’t.

It’s been generally accepted that the car’s driver is responsible for their driving, barring outside factors they have no control over, like a bridge collapse or brake failure.

Tesla effectively said that they were responsible when they said the car could fully drive itself. So whether you go with the theory that Tesla claimed responsibility for driving the car, or the theory that there was a mechanical or software failure of a car Tesla built, Tesla was at fault one way or the other.

I find it hard to believe that Tesla didn’t have a lawyer tell them that if they claim the car is self-driving, and fully no less, they are on the hook for any accident as much as any driver.

Tesla took responsibility for driving the car and suffered the consequences of their actions.

3laine
3laine
2 months ago
Reply to  Hugh Crawford

Or that it was fully self driving.

There was no “Full Self Driving” mode in 2019. False premise of your entire comment.

Hugh Crawford
Member
Hugh Crawford
2 months ago
Reply to  3laine

My mistake, I thought this was for a later car, and a later accident.

Dodsworth
Member
Dodsworth
3 months ago

The driver was negligent because Tesla told him he could be. A pathetic argument from a pathetic man buying a pathetic promise.

Mondestine
Mondestine
2 months ago
Reply to  Dodsworth

Exactly. Tesla might technically be telling people to keep their eyes glued to the road and always be prepared to take control at any moment, but when their entire attitude has a tone of *wink wink we know you’re going to be taking a nap or using your phone or getting a BJ, because you’re a Tesla driver and You RULE!*, it’s beyond obvious which message will genuinely resonate with their drivers.

Ppnw
Member
Ppnw
3 months ago

There’s some shadiness from Tesla here, but it’s still a pretty crazy verdict and precedent to set.

When it comes to full autonomy, I do think that the manufacturer of the system should bear some liability. Without that, liability will always fall on the “driver”, so the system won’t ever be fully autonomous.

I do want to see a world where I can read/sleep/watch TV/drink in my car and be 0% liable. For that to happen, someone has to be liable – it’s reasonable the automaker should be.

But in the case (like here) of a driver aid knowingly engaged – the driver should retain full liability. The only thing you could point to is the misleading nature of the “Autopilot” name, which is another issue altogether. And given how autopilot actually works on commercial aircraft, the term is actually perfectly accurate.

FSD, on the other hand, is a way more misleading name.

Anoos
Member
Anoos
2 months ago
Reply to  Ppnw

The driver aid system failed. Even without the misleading name, the car failed to brake and broadsided a parked car after running through a stop sign at a T junction.

Both the driver and the vehicle systems had control of the car. Both failed. Primary liability is on the distracted driver (reflected in the percentages of the judgement), but the second ‘driver’ also neglected its duties.

Last edited 2 months ago by Anoos
CRM114
Member
CRM114
2 months ago
Reply to  Anoos

The car didn’t brake because the driver had his foot on the accelerator, which the author neglected to mention.

3laine
3laine
2 months ago
Reply to  CRM114

Awful article leaving out important info.

The guy dropped his phone and, instead of slowing down or stopping to get it, he stopped looking at the road and started fishing around for the phone WHILE PRESSING THE ACCELERATOR (which overrides the Automatic Emergency Braking system).

Ppnw
Member
Ppnw
2 months ago
Reply to  Anoos

We aren’t talking about a critical component failing leading to the accident here. The brakes or suspension didn’t fail here. The driver had his foot on the accelerator, thus overriding whatever driver assist system was engaged.

If my lane departure warning fails to read the center line and I drift into oncoming traffic, the system should be liable for the eventual accident? I don’t think so.

Ultimately, until we have full autonomy where the driver isn’t required to pay attention, all liability needs to be on the driver.

Jeff Elliott
Jeff Elliott
2 months ago
Reply to  Ppnw

I do want to see a world where I can read/sleep/watch TV/drink in my car and be 0% liable. For that to happen, someone has to be liable – it’s reasonable the automaker should be.”

Same. I love driving, but the self-driving car I want is a room that takes me places.

Sitting behind the steering wheel of a car that is driving itself, while being expected to take over at a moment’s notice and be fully aware of the emergency situation and your surroundings, is so ridiculous to me that I can’t believe large numbers of people thought it was a good idea.

Michael Hess
Michael Hess
3 months ago

You drive, you’re culpable for anything that happens. End of story.

It’s not the manufacturers fault you don’t follow their warnings.

This is no different than the lies every other corporation tells about their products. Cheerios doesn’t help your heart. Can I sue them for everyone who’s died of a heart attack because they ate them assuming they would survive?

Seriously, people, grow some depth of thought before commenting. The only way to change this kind of thing is regulation. Something Americans have made pretty clear they don’t comprehend.

Tarragon
Member
Tarragon
3 months ago
Reply to  Michael Hess

When monitoring a machine that mostly gets it right, it is literally impossible to maintain attention, even when your life literally depends on maintaining attention. Look up “vigilance task” or “vigilance problem.”

We’ve known this for decades. The fact that Tesla doesn’t account for this is critically negligent.

Here’s an Army paper from 1963 that summarizes the state of knowledge at the time and lays out plans for further testing. https://apps.dtic.mil/sti/tr/pdf/ADA079163.pdf

Tarragon
Member
Tarragon
3 months ago
Reply to  Tarragon

Here’s a deposition transcript of the head of Tesla Autopilot software denying he even knows what an operational design domain is, let alone addressing Tesla not having one. https://gwern.net/doc/law/2022-06-30-19cv346663-elluswamy-deposition.pdf Starts at the bottom of page 40.

This is all stuff they should know and don’t. This is crazy scary stuff.

Slow Joe Crow
Slow Joe Crow
3 months ago
Reply to  Tarragon

Torch has been pounding this as a foundational flaw in all Level 2 systems. You need either full involvement of the driver, with safety aids (Level 1), or full autonomy, like Level 3 and higher. Tesla’s “Autopilot” is Level 1, and “Full Self Driving” is Level 2. The only Level 3 currently on the market is in some Mercedes-Benz cars. It should be noted that everyone else markets their Level 2 systems as an enhanced cruise control.

Tarragon
Member
Tarragon
2 months ago
Reply to  Slow Joe Crow

Oh yeah. I violently agree (autocorrect errors are fun, I’ll leave it)

Tesla also does some stuff down at lvl 1 that build trust in the system that it doesn’t deserve.

(So far as I understand) It does lane centering instead of lane keeping, which means you don’t have to do any steering at all. It also disengages if you steer even a bit, instead of letting you adjust steering and then resuming, so now you’re trained not to steer. The driver doesn’t have to do anything; even worse, it’s actively discouraging any driving action.

I really think safety systems should leave the driver in control but be there to help when needed. It’s the only way to ensure the driver is really engaged.

I’ve read the research and have enough self awareness to realize that to some degree I lose appropriate focus any time I let the car do things for me.

At this point I don’t use anything that puts the car in control of any aspect of driving, not even cruise control. I set the speed limiter to the max speed I want (usually 10 over); I let the car help me, but I’m doing 100% of the driving.

Michael Hess
Michael Hess
2 months ago
Reply to  Tarragon

I’ve never had a problem, any competent driver knows to check intersections, lights, signs, kid zones, whenever they come up to them.

I recently drove a 24 hour trip, over two days, then returned, also over two days. 99.9% was with FSD. Every moment I’d be vigilant as if I were driving, I still was. It’s not hard. You are doing a critical task, you pay attention. If you can’t manage your emotional and cognitive state to do that, don’t drive. The world will be better off without you on the roads.

All that said, the truck also caught a few things and reacted quicker than I could ever have. The combination is better than either single monitoring system.

Hugh Crawford
Member
Hugh Crawford
2 months ago
Reply to  Michael Hess

Exactly. And Tesla took responsibility for driving the car.

Actually, if you are driving and metal fatigue causes a wheel to fall off, the estate of someone it kills can sue the manufacturer of the part that failed. I don’t think that is particularly novel.

But this is beyond that. Tesla set themselves up for this.

The feature was not “help you drive up to the limits of the cameras and the software most of the time” or even “drive assist.” It wasn’t even “self driving,” where there might be some wiggle room where Tesla could say “well, we didn’t say that it was fully self driving, just that it was self driving sometimes.” Tesla said it was “full self driving,” so they voluntarily took responsibility and said they were driving the car.

And as you say: “You drive, you’re culpable for anything that happens. End of story.”

3laine
3laine
2 months ago
Reply to  Hugh Crawford

Tesla said it was “full self driving”

Wrong. “Full Self Driving” mode didn’t exist in 2019 when this accident happened.

Hugh Crawford
Member
Hugh Crawford
2 months ago
Reply to  3laine

Oh, I missed that, I thought this was more recent and am wrong.

Still “full self driving” as a term seems like a liability magnet.

Sort of like my grandfather calling blasting caps “safety caps,” when they didn’t seem particularly safe at all.

Michael Hess
Michael Hess
2 months ago
Reply to  Hugh Crawford

They had disclaimers that very obviously say that’s not true, typical corporate false advertising with all the legal bs. You are simply wrong. You are in charge regardless.

Tarragon
Member
Tarragon
2 months ago
Reply to  Michael Hess

https://thumbnails.cbsig.net/CBS_Production_News_VMS/2018/10/02/1334613571519/MuskMain_1730422_640x360.jpg

Tesla requires that you keep your hands on the wheel at all times.

Then Musk goes on 60 Minutes and does this. Why would anyone take Tesla’s warnings seriously?

JTilla
JTilla
2 months ago
Reply to  Tarragon

This, 100 percent. You don’t get to have your cake and eat it too. Tesla is totally responsible for this shit.

Michael Hess
Michael Hess
2 months ago
Reply to  JTilla

Just like gun manufacturers that say to not point guns at people, then kids die. Right.

JTilla
JTilla
2 months ago
Reply to  Michael Hess

Stupid argument. The two are not remotely comparable. Try again.

Last edited 2 months ago by JTilla
Michael Hess
Michael Hess
2 months ago
Reply to  JTilla

Your ability at deep thought could use some exercise.

Michael Hess
Michael Hess
2 months ago
Reply to  Tarragon

FSD doesn’t require that; they monitor via an IR camera that watches your eyes, just like GM and some others. Autopilot requires constant input to verify you are “paying attention.”

Utherjorge, who is quite angry about the baby FJ
Member
Utherjorge, who is quite angry about the baby FJ
2 months ago
Reply to  Michael Hess

found the Tesla driver

Michael Hess
Michael Hess
2 months ago

Responsible Tesla driver.

Fixed that for ya.

Just like everything in life, there are idiots that make things worse for the rest of us.

JTilla
JTilla
2 months ago
Reply to  Michael Hess

Dude, they straight up lie about the capability of the system. I am not absolving the driver, but Tesla is 100 percent responsible for MAKING people think they can do this. I am SO fucking sick of people licking the boots of these corporations and giving them a free pass on all their fuckery. Corpos are the reason the country is such a mess, and we need to treat them that way.

Michael Hess
Michael Hess
2 months ago
Reply to  JTilla

I agree, false advertising needs to stop. Too bad Republicans and their ideal of deregulation exists. They are effectively allowed to do it. Can’t get mad at them when it’s allowed by the government. Vote better America.

JTilla
JTilla
2 months ago
Reply to  Michael Hess

Laws for citizens but no laws for companies. That is their agenda.

Dirk from metro Atlanta
Dirk from metro Atlanta
3 months ago

More, much more, like this please. Our country can never be anything resembling “free” when billionaire oligarchs are allowed to literally run roughshod over its citizens.

Urban Runabout
Member
Urban Runabout
3 months ago

Good.

Xt6wagon
Xt6wagon
3 months ago

So Tesla can’t handle a cell phone drop, much less the sleep-your-way-to-work future advertised at the start?

Hoonicus
Hoonicus
3 months ago

Con Descending

Weston
Weston
3 months ago

Tesla claims in fine print that you need to keep your eyes on the road at all times and take over at a moment’s notice if the system fails to recognize a hazard – like a fire truck or a stop sign or a red light or a tunnel painted on a canyon wall by a coyote…
And then, in the same breath, claims that the system is miraculous and can drive you everywhere without help and is the greatest thing since sliced bread. And then, with a wink and a nod, tells you that the other verbiage is just some legal technicality they have to include to make their lawyers happy – but really the system can totally drive the car all by itself and everything is great, so don’t worry.
And their high-priced lawyers think that’s good enough.
Tesla deserves to lose all of these lawsuits and the damages need to be excessive so that it hurts. And the NHTSA needs to enforce meaningful rules for public safety and take action.
And the driver of the car that caused the crash absolutely needs to spend a couple of decades behind bars because THAT would be a deterrent too.

Dirk from metro Atlanta
Dirk from metro Atlanta
3 months ago
Reply to  Weston

And the NHTSA needs to enforce meaningful rules for public safety and take action.

NHTSA? Does that still exist, or did DJT declare it a nonperson?

James
James
3 months ago

“a 2019 incident that fatally killed one person”

Not just killed, but fatally killed.

Urban Runabout
Member
Urban Runabout
3 months ago
Reply to  James

Brought to you by the department of redundancy.

Nlpnt
Member
Nlpnt
3 months ago
Reply to  Urban Runabout

Excuse me? It’s the Department of Redundancy Department.

Dirk from metro Atlanta
Dirk from metro Atlanta
3 months ago
Reply to  James

Tesla, like Raid, kills people dead.

Vanillasludge
Vanillasludge
2 months ago
Reply to  James

I was killed once, but luckily it wasn’t fatal.

Bags
Bags
2 months ago
Reply to  Vanillasludge

I got better

William Domer
Member
William Domer
2 months ago
Reply to  Vanillasludge

Dead or mostly dead?

RustyJunkyardClassicFanatic
Member
RustyJunkyardClassicFanatic
2 months ago
Reply to  James

“You can’t overdie, you can’t overdry”
-Jerry Seinfeld

Cheap Bastard
Member
Cheap Bastard
3 months ago

“$243 million to the victims’ families.”

That’s a HELL of a lot of money, enough for a whole Walmart of victims.

Fuzz
Fuzz
3 months ago
Reply to  Cheap Bastard

The victims didn’t agree to be part of Musk’s public beta testing, where life-and-death monitoring is done by an untrained car buyer who has been deceived as to the capabilities of said system by company messaging and the words of its only spokesperson. This particular user had failed to do his beta tester job as instructed and had numerous strikes, but he was still permitted to use the software despite these safety-critical incidents where he failed to be attentive. Tesla, however, continued to allow its use.

This brings us back to who is responsible when innocent bystanders are impacted; they didn’t agree to a Tesla beta tester disclaimer. They had no choice. The untrained user did what he had so confidently been told the car was capable of: driving itself. And Tesla permitted him to continue to operate the vehicle in this manner, despite him not following the instructions to be attentive at all times. By re-activating FSD each time he struck out, they gave him permission to continue as he had done. Tesla absolutely holds responsibility here. Not all of it, but a big chunk.

Cheap Bastard
Member
Cheap Bastard
3 months ago
Reply to  Fuzz

My comment was not decrying the decision only the amount. That’s a LOT of money for a single death.

Ignatius J. Reilly
Member
Ignatius J. Reilly
3 months ago
Reply to  Cheap Bastard

Since the majority is punitive and designed to discourage the company from bad actions, think of it as .02% of Tesla’s market cap. Proportionally, it would be a fine of $4800 for somebody with the median net worth. $4800 is awfully light for being even partially responsible for killing somebody.

Cheap Bastard
Member
Cheap Bastard
3 months ago

It’s also hundreds of times a typical lifetime’s earning potential, which leaves a whole lot in someone’s pocket even after court costs. That sounds like a big incentive for the unscrupulous to make sure unloved or inconvenient relatives are killed by a passing Tesla. People do it for insurance and inheritance, why not for a settlement?

Anoos
Member
Anoos
2 months ago
Reply to  Cheap Bastard

I would feed so many people to Musk’s patrolling death machines for that money.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

The percentage of people who murder for money is inconsequential. The percentage of corporations that are willing to do it is 100%.

Cheap Bastard
Member
Cheap Bastard
2 months ago

“Willing to murder” isn’t the same as actually going through with it though. It takes more than will to get away with murder.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

Corporations do it all the time. But they call it “risk management.” Basically, knowing that their actions are very likely to cause death or great harm, but choosing to do it anyway since it is still likely to increase “shareholder value.”

Cheap Bastard
Member
Cheap Bastard
2 months ago

Corporations have “risk management”. Regular people have “hit and run”. Kill someone with your car and you have better than an even chance of never being caught and even if you are you have a good chance of not going to jail.

https://abcnews.go.com/US/hit-run-drivers-kill-people-jail-time-rarely/story?id=61845988

https://www.rightlawgroup.com/committed-a-hit-and-run/

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

The percentage of people who commit vehicular-based crimes is tiny. In contrast, the percentage of corporations that engage in “risk management” with the goal of understanding how many people they can hurt and get away with it is 100%. Corporations so rarely face any consequences whatsoever that when they do, it makes headlines.

Is your point that corporations proportionally see higher levels of punishment than do individuals? Because at this point it isn’t clear.

Cheap Bastard
Member
Cheap Bastard
2 months ago

People have lots of options to murder other than vehicles. Poison, asphyxiation, drug overdoses, fire, infections, explosions, drowning, falls, bad food, mixing the wrong chemicals, “unintentional” firearm discharges, hanging, electrocution, buried alive, rockfalls, fights, animal attack/stings/bites, falling on a knife, or just encouraging the victim to do something really, really stupid. The list of ways to be killed in an “accident” is endless and impossible to prove if done right.

In the real world police may not bother to investigate unless murder is blatantly obvious (like a dead body with multiple stab wounds) or they are forced to by pressure from the family and even then only if the family has means. Or perhaps the victim just vanishes and nobody cares enough to report it. People disappear every day.

The reality is that neither you nor I have any idea how many people get away with murder, because the first rule of getting away with murder is not to be caught. That’s especially true outside of first-world countries, where data is unreliable and police are even less likely to be bothered.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

Other than a very random theory on murder with zero basis in reality, was there something you were trying to add to the OP?

Cheap Bastard
Member
Cheap Bastard
2 months ago

It was clear in the beginning that my point was the settlement amount is huge for a single death. That’s a powerful incentive for “murder by Tesla”.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

“$243 million to the victims’ families.”

That’s a HELL of a lot of money, enough for a whole Walmart of victims.

My comment was not decrying the decision only the amount. That’s a LOT of money for a single death.

Your original comments made no such claim explicitly or implicitly.

My response pointed out the proportional size of the penalty, since you said your only claim was about the size of the penalty.

Did you have a point that related to my comment?

Cheap Bastard
Member
Cheap Bastard
2 months ago

How is:

“$243 million to the victims’ families.”

“That’s a HELL of a lot of money, enough for a whole Walmart of victims.”

Not being clear that my point was the settlement amount is huge for a single death?

What’s an average settlement for a single death? Well, let’s see:

Low end: $250,000 – $500,000
These amounts are often seen in cases where the victim had limited income, or the evidence was weaker.

Moderate: $500,000 – $1.5 million
Common for cases with a clear loss of income, medical bills, and funeral expenses supported by strong evidence.

High end: $2 million – $5+ million
These settlements are more likely in cases involving medical malpractice, commercial liability, or the death of a high earner with dependents. Strong legal representation and clear proof of fault can also lead to higher payouts.

https://www.lawfirmdavidoff.com/blog/whats-the-average-settlement-for-wrongful-death/

$243M is roughly 50–1000x those wrongful death settlements. So yes, it is a HELL of a lot of money, enough for a whole Walmart of victims.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

And my original comment was…

Since the majority is punitive and designed to discourage the company from bad actions, think of it as .02% of Tesla’s market cap. Proportionally, it would be a fine of $4800 for somebody with the median net worth. $4800 is awfully light for being even partially responsible for killing somebody.

Now again, did you have a point?

Cheap Bastard
Member
Cheap Bastard
2 months ago

Did you?

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

Yes. I will repeat it in hopes a third time will allow you to digest it.

Since the majority is punitive and designed to discourage the company from bad actions, think of it as .02% of Tesla’s market cap. Proportionally, it would be a fine of $4800 for somebody with the median net worth. $4800 is awfully light for being even partially responsible for killing somebody.

This explains in the simplest terms possible why the size of the judgment isn’t necessarily outrageous. Which you claimed it was.

So, did you have a point that was related or just more Gish gallop?

Cheap Bastard
Member
Cheap Bastard
2 months ago

Yes, that $243M may well be only 0.02% of Tesla’s market cap, but it’s many, many times the average settlement for wrongful death.

From that I don’t find it hard to imagine unscrupulous persons could find a way to get their own massive payoff. It appears you do find that hard to imagine so you interpret that concern as “Gish gallop”. If so there is little I can do about it.

Ignatius J. Reilly
Member
Ignatius J. Reilly
2 months ago
Reply to  Cheap Bastard

Do you not understand the fact that the judgements (not settlements) have two elements? That $200m is a penalty which, by its nature, is typically proportional to the ability of the guilty party to pay? Tesla is a massively valuable company (despite most of that value being an illusion), so the penalty is justifiably going to be proportionally large. Being more than the average is easily explained by this and in no way shocking.

The fact that your first instinct is that people are going to start unaliving their loved ones for a payout speaks more to how little you value others than anything else. After all, large payouts have happened well before Tesla, and there is zero evidence that people are murdering people in hopes of a large lawsuit payout in any numbers large enough to influence anything. Meanwhile, there are mountains of evidence that corporations regularly, knowingly take actions that harm and kill people in the name of making money.

Corporate Violence · Maximizing Profit and Endangering Health · Corporate Externalities · The Game of Influence in Regulatory Capture
So why would you bring up the issue of a payout encouraging “unscrupulous persons” as an issue when you have no evidence that it has any measurable real-world impact outside of a few random anecdotes and your own inner monologue?

Dirk from metro Atlanta
Dirk from metro Atlanta
3 months ago

yeah, funny how that works in the land of the “free” and the home of the brave.

Last edited 3 months ago by Dirk from metro Atlanta
Crank Shaft
Member
Crank Shaft
3 months ago
Reply to  Cheap Bastard

I’d prefer $243 Billion. That might end the second biggest con in history.

Cheap Bastard
Member
Cheap Bastard
3 months ago
Reply to  Crank Shaft

Me too as long as I can get a taste of it.

Sure some of you may die, but it’s a sacrifice I am willing to make.

Crank Shaft
Member
Crank Shaft
3 months ago
Reply to  Cheap Bastard

I appreciate that. I feel exactly the same. 🙂

Urban Runabout
Member
Urban Runabout
3 months ago
Reply to  Cheap Bastard

It’s almost enough to buy a presidency and a whole lot of data.

Avalanche Tremor
Member
Avalanche Tremor
3 months ago

In our litigation happy society it was really only a matter of time until one of these stuck, and that could open the floodgates against Tesla and others. Plus we’re just going to see more and more accidents with Level 2 systems as they become more commonplace. You don’t have to spend much time on any given public road to see that people can’t put their phones down even when driving totally manual cars, much less semi-autonomous ones.

Honestly, I think there is a reality where autonomous driving can actually be sued out of existence if manufacturers have to shoulder liability. The alternate reality, which I can’t decide is more or less likely given the current state of the courts, is that eventually the Supreme Court rules it just isn’t fair to poor little multi-billion dollar companies, sorry, people, to get picked on by being sued for things their products and the people that use them do, and broad immunity is then the name of the game. The Peacemaker mentality: make cars “safer” no matter how many men, women, and children you have to kill to do so.

James
James
3 months ago

If we were going to do anything about autonomous vehicles, it would have been done years ago when Tesla started calling it Autopilot and Full Self Driving. In the end, the laws and courts will side with the big companies.

Dirk from metro Atlanta
Dirk from metro Atlanta
3 months ago

Good take, and I sure hope that your “alternate reality” won’t happen. I’ve no issue with any of it, save for the phrase “litigation happy society.” Which, I realize, you didn’t deploy in order to be judgmental. That said?

Litigation is literally the ONLY–and I do mean only–tool that citizens have in a Republic that has shrugged its shoulders on the issue of a pedestrian’s right to life. (and any number of other fundamental rights, but I digress.)

BTW, I had a life-changing experience some decades back serving as foreman on a civil case regarding a horrific pedestrian injury. I might submit an Autopian piece about it one of these days.

Our Framers didn’t get a lot right when they tried to piece together a nation back in the day, but the 6th and 7th’s enshrinement of one’s right to jury trials? [Chef’s kiss].

AceRimmer
AceRimmer
3 months ago

Good! Let’s hope we see more litigation wins like this. The public did not agree to be beta-testers for Tesla and these other corps for “self-driving.”
