> The company must pay $329 million in damages to victims and survivors, including compensatory and punitive damages.
> A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.
On one hand I don't think you can put a price on a human life, but on the other, $329 million feels too high, especially since Tesla is only partially to blame: it wasn't FSD, and the driver wasn't using the system correctly. Had the system been used correctly and Tesla been assigned more of the blame, would this be a billion-dollar case? This doesn't hold up logically unless I'm missing something; certainly the victim wouldn't be fined $329 million if it were decided to be his fault for not looking at the road
> This doesn't hold up logically unless I'm missing something; certainly the victim wouldn't be fined $329 million if it were decided to be his fault for not looking at the road
I hope we haven't internalized the idea that corporations should be treated the same as people.
There's essentially no difference between a $3M and a $300M fine against most individuals, but $3M means very little to Tesla. If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.
That's another difference: fining an individual is not going to change risk calculus much, since one individual's behavior changing is not that meaningful compared to Tesla's behavior changing. And it's not like a huge fine is going to make a difference in other drivers deciding to be better, whereas other automakers will notice a huge fine.
>I hope we haven't internalized the idea that corporations should be treated the same as people.
Only when it comes to rights. When it comes to responsibilities the corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.
Perhaps better to achieve symmetry by ceasing to execute humans.
You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human, so you can't make the scores even, but you can stop adding to the total of human misery.
Historically it was impractical to permanently warehouse large numbers of humans, so death was more practical, but the US has been doing it for all sorts of crap for decades, so that's not a problem here.
The US would still have much harsher punishments than Western Europe even without the death penalty, because it routinely uses life-means-life sentences where no matter what you're never seeing the outside again.
>You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human
What if we garnished 100% of the future wages of all the employees in perpetuity as well as dissolving the corporate entity? You know, to make sure the company stays all the way dead.
> Would be nice to see executions of corporations as punishment
Fines. Massive fines.
"Corporate death penalty" is a genius invention of corporate lawyers to distract from the punitive effect of massive fines.
Fines and license revocations are precedented. They take real money from the corporation and its owners. Corporate death penalties are a legal morass that doesn’t actually punish shareholders; it just cancels a legal entity. If I own an LLC and have a choice between a fine and the LLC being dissolved, I’d almost always opt for the latter.
But fines are boring. Corporate death penalty sounds exciting. The anti-corporate folks tend to run with it like catnip, thus dissolving the coalition for holding large companies accountable. (Which, again, a corporate "execution" doesn't do. Nullifying my LLC doesn't actually hurt me; it just creates a little bit of work for my lawyer, and frankly, getting out of a fuckup by poofing the LLC without touching the underlying assets is sort of the selling point of incorporation.)
Corporate fines are a genius invention of corporate execs' personal lawyers to distract from the fact that all corporate malfeasance is conducted by actual people who could be held accountable.
> Corporate fines are a genius invention of corporate execs' personal lawyers
Ahistoric and orthogonal. Corporate fines and personal sanctions have coëxisted since corporations were a thing. Charter revocations, on the other hand, have almost always followed individual liability, because again, just poofing a corporation doesn't actually do anything to its assets, the part with actual value. (In the English world, corporations frequently came pinned with trade charters. The actual punishment was losing a trade monopoly. Not a legal fiction being dissolved.)
Nothing about corporate death penalties or corporate fines prevents personal liability. And neither particularly promotes it, either, particularly if guilt is never acknowledged as part of the proceedings.
I'm guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders.
> guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders
This is just expropriation. Which is legally complicated. And raises questions around what the government should do with a bunch of seized assets and liabilities that may or may not be in a going condition, and whether some stakeholders should be reimbursed for their losses, for example employees owed pay, also do pensions count, and if so executive pensions as well, and look at that the discussion got technical and boring and nobody is listening anymore.
On the other hand, a massive fine punts that problem to the company. If it can pay it, great. It pays. If it can’t, we have bankruptcy processes already in place. And the government winds up with cash, not a Tesla plant in China.
Corporate death penalties are stupid. They’re good marketing. But they’re stupid. If you want to hold large companies unaccountable, bring it up any time someone serious threatens a fine.
>And raises questions around what the government should do with a bunch of seized assets and liabilities that may or may not be in a going condition, and whether some stakeholders should be reimbursed for their losses, for example employees owed pay, also do pensions count, and if so executive pensions as well, and look at that the discussion got technical and boring and nobody is listening anymore.
Bullshit they aren't listening. Executives get no payout. Whatever happened happened on their watch. Summary dismissal. Keeps the incentives aligned. Otherwise you're just incentivizing unlawful behavior behind the corporate veil. Pensions do count. Since we've got such a fucked up social safety net in the U.S. that doubles as a mandatory investment slush fund for our corporate overlords, stewardship of said retirement funds should obviously be escrowed until such time as a new more compliant set of execs can either be installed, or the assets can be unwound.
There is no point in leaving something in the hands of the provably reckless/malicious. Which is exactly what these types of people (executives making decisions that are traceable to reasonably foreseeable loss of life) are.
I'm done with victim blaming when we've made a habit of building orphan mulching machines that we just happen to call corporations.
Which is why you see execs get golden parachutes for failing. The assets get transferred to private parties who walk away to repeat the exact behavior they were just rewarded for at another firm.
Everyone spewing opinions in this thread so they can get their upvotes from like-minded readers is missing the 800lb gorilla.
It's not in the state's interest to kill profitable things most of the time except occasionally as a deterrent example. It's the same reason richer people (who pay a lot of taxes, engage in activity spawning yet more taxes, etc) tend to get probation instead of jail. Likewise, the state is happy to kill small(er) businesses. It does this all the time and it doesn't make the news. Whereas with the big ones it just takes its pound of flesh and keeps things running.
As long as that incentive mostly remains, the outcomes will mostly remain.
In most jurisdictions, a corporation is a juridical person[1]. When not explicitly mentioning natural persons, whether a corporation is a "person" is thus ambiguous.
I agree that Tesla should be hit with punitive damages. And the size of the punitive damages must be enough to discourage bad behavior.
I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.
I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?
>> it "feels" wrong to give a corporation-sized punishment to a small group of individuals
This feeling has a name: loss aversion.
It's a really interesting human trait. About 66% of people feel bad when someone else does well. The impact of this feeling on behavior (even behavior that is self-harming) is instructive.
The concept of "Fairness" comes into play as well. Many people have an expectation that the "world is fair" despite every evidence that it isn't. That results in "everything I don't get is unfair" whereas "everything I get I earned on my own." Someone else getting a windfall is thus "unfair".
It is definitely not loss aversion. It also has nothing to do with whether or not someone else is getting the money. Handing me nearly half a billion because a parent died would certainly be welcome, but I think it would feel just as disproportionate and out-of-place.
$300M means very little to Tesla. The stock didn't even drop a bit (other than the usual market fluctuations today). Perhaps $4.20B or $6.90B would've been meaningful. Elon notices these numbers.
Not doing what it asks - “keep your hands on the wheel” and “eyes on the road” - and crashing the car is somehow Elon Musk’s fault. LOL, HN logic. Can’t wait to sue lane assist when I drive drunk and crash!
It's supposed to stop if objects appear in its path. For sure you're an idiot if you trust Tesla's autopilot, but I think it's reasonable to partially fault Tesla for setting the consumer's expectation that the car stops when obstacles get in the way even if the vehicle isn't being operated exactly as suggested by the manufacturer.
Maybe the system should sound an alarm and slow down immediately when there are no hands on the wheel and eyes are not on the road. That would have avoided this accident, so it seems Tesla was at fault for not doing that.
> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.
What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it were really dangerous, people would be dying by the score.
This is obviously targeted and the court system should not be playing favorites or going after political opponents
> Tesla cars come standard with advanced hardware capable of providing Autopilot features, and *full self-driving capabilities* — through software updates designed to improve functionality over time.
> Tesla's Autopilot AI team drives the future of autonomy of current and new generations of vehicles. Learn about the team and apply to help accelerate the world with *full self-driving*.
Now you can say that can be interpreted multiple ways - which means the copywriter is either incompetent, or intentionally misleading. Interestingly, the text from 2019 (https://web.archive.org/web/20191225054133/tesla.com/autopil...) is written a bit differently:
> ...full self-driving capabilities *in the future*...
- FSD came out in October 2020; I suppose rounding up to 10 lets you call it "nearly 10 years since", but that literally doubles the actual number.
- There have been a lot more than one incident. This is one court case about one incident.
- There are an insane number of accidents reported; does it only matter to you if someone dies? A lot more than one person has died in an accident that involved a vehicle that was equipped with FSD.
- Your comment is obviously targeted and disingenuous.
So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.
I went into a Tesla dealership nearly 10 years ago to take a look at the cars, and the salespeople were telling me - in no uncertain terms - that the cars were fully self-driving.
I knew that was complete nonsense, because I knew people who worked on Tesla's self-driving software, but that's how Tesla salespeople were selling the cars.
> It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it were really dangerous, people would be dying by the score.
And how many times did humans have to take over and save themselves and others from Tesla killing or injuring them? Tesla won't tell us those numbers; guess why? The tech might be safe as a backup driver, but so far you need a human paying attention to save himself from the car's bugs/errors/glitches, etc.
I really hate these bullshit safety claims pulled from someone's ass. It's like me trying to convince you to get operated on by an AI doctor by claiming: "It's better than an old and drunk doctor; it only killed a few people when the people supervising it didn't pay attention, but otherwise it was very safe. We can't tell you how many times real doctors had to perform the hard work while our AI doctor only did stitching; those numbers need to be secret. But trust us, the human doctors who had to intervene are only there because of the evil laws; it could do the full job itself. We wouldn't call it a Full Competent Doctor if it couldn't perform all expected tasks."
Tesla was found partially liable for this crash. The reason they were liable was they sold something claiming (practically speaking) that it could do something. The customer believed that claim. It failed to do that thing and killed people.
So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.
And the fine needs to be high enough to prevent them from just saying - oh, well, we can make money if we keep doing it.
If you could only fine a person for committing murder, you wouldn't fine a billionaire $5M and then hope he wouldn't go on killing everyone he'd rather have dead than $5M.
> On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, (...)
Let me stop you right there. That's not how damages work.
Damages have two goals: compensate victims, and dissuade offenders from repeating the same mistakes. The latter involves punishments that discourage repeat offenses.
That's where these high values come from. They are intended to force the likes of Tesla to not ignore the lives they are ending due to their failures.
If damages were low, the likes of Tesla would do absolutely nothing and absorb them as operational expenses, and continue to cause deaths claiming they are unavoidable.
Once the likes of Tesla are forced to pay significant volumes of cash in damages, they suddenly find motives to take their design problems seriously.
I tend to agree; however, the government is not an unincentivized incentivizer. By being able to impose such fines, the government is potentially itself incentivized not to prevent these accidents, since they generate this kind of revenue.
There are ways to mitigate this, such as forcing the government to use these revenues in a way that is relevant to the issue at hand, i.e. creating safety jobs, strengthening control authorities, or something else.
You could also say that the amount is insignificant, but that could of course change with every lawsuit, and it of course accumulates. Or one could speculate that the interests are not really monetarily aligned at all (e.g. prisons), or that the judicial system is independent enough to stop propagation of these incentives. I still think it is necessary to consider these motives and deliberately align them between the relevant actors.
I don't think these terms are meaningfully different in the heads of most people.
I know autopilot in airplanes is a set of assistive systems which don't remotely pretend to replace or obsolete humans. But that's not typically how it's used colloquially, and Tesla's marketing benefits heavily from the colloquial use of "autopilot" as something that can pilot a vehicle autonomously.
> I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.
It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it's misleading jurors the same way.
There's another similar argument to be made about the massive amount awarded as damages, which maybe will be lowered on appeal. If people (Tesla included) can make the argument that when a car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even for a time). There are millions of Teslas on the road today so proportionally it's a low amount per unsafe car.
"Autopilot" isn't even the most egregious Tesla marketing term since that honour goes to "Full Self-Driving", which according to the fine text "[does] not make the vehicle autonomous".
Tesla's self-driving advertising is all fucking garbage, and then some George McGee browses Facebook while believing that his car is driving itself.
As gets pointed out ad nauseum, the very first "cruise control" product in cars was in fact called "Auto-Pilot". Also real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free.
This is a fake argument (post hoc rationalization): It invents a meaning to a phrase that seems reasonable but that has never been rigorously applied ever, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.
> real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free
Pilot here. If my G1000’s autopilot were flying and I dropped my phone, I’d pick it up. If my Subaru’s lane-keeping were engaged and I dropped my phone, I might try to feel around for it, but I would not go spelunking for several seconds.
The market Tesla is advertising to is not airplane pilots. It is the general car buying public.
If they are using any terms in their ads in ways other than the way the people the ads are aimed at (the general car buying public) can reasonably be expected to understand them, then I'd expect that could be considered to be negligent.
Much of the general public is going to get their entire idea of what an autopilot can do from what autopilots do in fiction.
Notably, an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way. It's just that the sky is very big and other aircraft are very small, so random collisions are extremely unlikely.
> an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way
TAWS (terrain) and ACAS (traffic) are built into modern autopilots.
And Tesla lied about its autopilot’s capabilities in proximity to this crash:
“In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own.
‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)”
Couldn't agree more. This thing where words have a common definition and then a secret custom definition that only applies in courts is garbage. Everyone knows what "full self driving" means, either deliver that, come up with a new phrase or get your pants sued off for deceptive marketing.
Autopilot is used when referring to a plane (until Tesla started using it as a name for their cruise control that can steer and keep distance).
In the context of a plane, autopilot always meant automatic piloting at altitude, and everyone knew it was the human pilots that were taking off and landing the plane.
Pilot is a relatively high status career. They are always shown taking off and landing in tv shows and movies. I would be surprised if people thought they just sit back and relax the entire time.
It's also worth mentioning he would have been required to keep his hands on the wheel while using autopilot, or else it starts beeping at you and eventually disables the feature entirely. The system makes it very clear you're still in control, and it will permanently disable itself if it thinks you're not paying attention too many times (you get 5 strikes for your ownership duration).
As is also pointed out ad nauseam, the claims made about autopilot (Tesla) go far beyond the name, partly because they sold a lot of cars on lies about imminent "FSD" and partly because as always Elon Musk can't keep his mouth shut. The issue isn't just the name, it's that the name was part of a full-court-press to mislead customers and probably regulators.
Is there any contextual difference between the first instance of cruise control (which has since been relabeled cruise control, perhaps with reason), automatic flight control, and a company whose CEO and fanboys incessantly talk about vehicle autonomy?
> On one hand I don't think you can apply a price to a human life
Yes, although courts do this all the time. Even if you view this as solely a manufacturer error, there are precedents. Consider the General Motors ignition switch recalls, which affected 800k vehicles and resulted in 124 deaths.
> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches
So about $5M per death, and $300M to the government. This seems excessive for one death, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.
If you make a manufacturing error without intentionally deceiving your customers through deceptive naming of features, you have to pay millions per death.
If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.
Wouldn't that logic mean any automaker advertising a "collision avoidance system" should be held liable whenever a car crashes into something?
In practice, they are not, because the fine print always clarifies that the feature works only under specific conditions and that the driver remains responsible. Tesla's Autopilot and FSD come with the same kind of disclaimers. The underlying principle is the same.
Right, this is the frustrating thing about courtroom activism and general anger towards Tesla. By any stretch of the imagination, this technology is reasonably safe. It has over 3.6 billion miles, currently about 8M miles per day. By all reasonable measures, this technology is safe and useful. I can see why plaintiffs go after Tesla: they have a big target on their back for whatever reason, and activist judges go along. But I don't get how someone on the outside can look at this and think that this technology or the marketing over the last 10 years is somehow deceptive or dangerous.
The product simply should not be called Autopilot. Anyone with any common sense could predict that many people will (quite reasonably) assume that a feature called Autopilot functions as a true autopilot, and that misunderstanding will lead to fatalities.
> feature called Autopilot functions as a true autopilot
What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.
I would argue you are creating a definition of "autopilot" that most people do not agree with.
It can be called anything in an airplane because there the pilot has some level of training with the system and understands its limits. You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.
A car is another thing entirely because the general population's definition of "autopilot" does come into play and sometimes without proper training or education. I can just go rent a tesla right now.
>You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.
...Um... You did get pilots hopping into a 737 MAX, getting training that barely mentioned an automated anti-stall and flight-envelope-management system called MCAS, which eventually flew several planeloads of people into the ground. That was management's idea too, btw.
The word literally means automatic pilot. Legal departments create an alternate definition of the word to not get sued and most people will interpret the word literally.
> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.
Applies here. As far as I can tell the system did do exactly that. But the details of the actual event are unclear (I've heard parked car but also just hitting the back of another car?)
It’s an emerging technology. The onus is on Tesla to be crystal clear about capabilities, and consistently so. People might quite reasonably infer that something labeled as “auto-” or “automatic” doesn’t require supervision.
Anyone who has used it knows its limitations. IDK, maybe in 2019 it was different tho; now it's full of warnings that make it barely usable when distracted. Ironically you are better off disabling it and staring into your phone, which seems to be what regulators actually want.
And by the way what is true autopilot? Is the average joe a 787 pilot who's also autopilot master?
Funny that pretty much every car ships with autosteer now. The ones I've used didn't seem to have many of the warnings, explanations, disclaimers, or agreements that pundits here assume they should.
Boats and aircraft are both different from automobiles. They have full freedom of movement. You can't set a course in the same way with an automobile, because the automobile will need to follow a roadway and possibly make turns at intersections. Boats and aircraft can have waypoints, but those waypoints can be mostly arbitrary, whereas a car needs to conform its path to the roadway, traffic, signage, etc.
There are two conflicting goals here: Tesla's marketing department would really like to make you think the car is fully autonomous for financial reasons (hence Autopilot and Full Self-Driving), and then there's Tesla's legal department, which would prefer to blame somebody else for their poor software.
The fault with an individual can be reasonably constrained to the one prosecuted death they caused. The fault with "autopilot by Tesla", a system that was marketed and deployed at scale, cannot.
And if you want to draw parallels with individuals, an individual driver's license would be automatically suspended and revoked when found at fault for manslaughter. Would you propose a minimum 1~3 year ban on autopilot-by-Tesla within the US, instead?
*Punitive damages*. From another article: "They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident." If Tesla is destroying evidence then yeah they ought to feel the pain, and those personally responsible should be charged as well. If you make it cheaper to evade the law than comply, what good is the court at all?
329 million is not just compensatory damages (the value of the human life) but also punitive damages. That number floats up to whatever it takes to disincentivize Tesla in the future.
> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.
> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.
Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered be turned over during discovery. If that's what happened here, then I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.
I think the conceptually messed up part is, when such an award includes separate components for compensatory damages and punitive damages, the plaintiff receives the punitive damages even if they're part of a much broader class that was impacted by the conduct in question. E.g. how many people's choice to purchase a Tesla was influenced by the deceptive marketing? How many other people had accidents or some damages? I think there ought to be a mechanism where the punitive portion rolls into the beginning of a fund for a whole class, and could be used to defray some costs of bringing a larger class action, or dispersed directly to other parties.
So if I file a lawsuit and prove there is a small possibility my toaster can cause my arm to be cut off, because that’s what it did to me, and win $400,000,000, I should only get $400 if it turns out they sold 1 million units?
It’s not a class action lawsuit. If they want their cash they should sue too. That’s how our system works.
No, you misread that pretty significantly. They're only talking about splitting up the punitive damages.
Using the tesla numbers you'd get somewhere between 43 and 129 million dollars personally, plus a share of the 200 million. And your share of the 200 million would probably be a lot higher than someone merely defrauded. And the 200 million would probably get a lot bigger in a proper class action.
If you want numbers for your theoretical you'll have to specify how much was compensatory and how much was punitive.
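For what it's worth, the split being described can be sanity-checked with quick arithmetic. The figures below (a $129M compensatory award with Tesla assigned 33% of the fault, plus $200M punitive) are the ones implied by the comment's 43-to-129-million range; treat them as assumptions for the illustration:

```python
# Back-of-envelope check on the damages split (assumed figures, see lead-in):
# $129M compensatory, Tesla assigned 33% of the fault, $200M punitive.
compensatory = 129_000_000
tesla_fault_share = 0.33
punitive = 200_000_000

tesla_compensatory = compensatory * tesla_fault_share  # ~ $43M, the low end
low_end = tesla_compensatory + punitive    # Tesla pays only its fault share
high_end = compensatory + punitive         # Tesla pays all compensatory

print(f"Tesla's compensatory share: ${tesla_compensatory / 1e6:.1f}M")
print(f"Total exposure: ${low_end / 1e6:.0f}M to ${high_end / 1e6:.0f}M")
```

The high end lands exactly on the $329M headline figure, which is why the comment treats the 43-to-129 range as "plus a share of the 200 million."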
On the flip side: Penalties should be scaled relative to one's means so that the wealthy (whether people or corporations) actually feel the pain & learn from their mistakes. Otherwise penalties for the wealthy are like a cup of coffee for the average Joe -- just a "cost of business."
I'm also a big proponent of exponential backoff for repeat offenders.
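As a toy illustration only (the function and the dollar figures are made up, not any real statute), "exponential backoff" for repeat offenders might look like:

```python
def repeat_offender_fine(base_fine: float, prior_offenses: int, factor: float = 2.0) -> float:
    """Hypothetical penalty schedule: each prior offense multiplies the fine by `factor`."""
    return base_fine * factor ** prior_offenses

# With a $10M base fine and a doubling factor:
# 0 priors -> $10M, 1 prior -> $20M, 3 priors -> $80M
```

The point being that a flat fine is a predictable cost of business, while a geometric one eventually isn't.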
If anyone's confused about what to expect of autopilot and/or FSD, it's Tesla's doing and they should be getting fined into oblivion for the confusion and risks they're creating.
Normally, Autopilot does not require your foot on the accelerator. Pressing the accelerator overrides it, making the car move ahead where Autopilot decided to stop. At the time, Autopilot would always stop at an intersection, and the driver was supposed to move ahead when the way was clear.
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.
Hard for me to see this as anything but the driver’s fault. If you drop your phone, pull over and pick it up or just leave it on the floor. Everyone knows, and the car tells you, to pay attention and remain ready to take over.
The argument is that if the driver had been in a different vehicle he would have done just that, pulled over and picked it up, but because he believed the Tesla was capable of driving safely on its own, he didn't.
Normally I turn the steering wheel when I want to turn my car. If you sold me a car and told me it had technology to make turns automatically without my input then I might let go of the wheel instead of turning it, something I would never have done otherwise. If I then don't turn and slam straight into a wall, am I at fault for trusting what I was sold to be true?
If the driver has heard that their Tesla is capable of autonomous driving, and therefore trusts it to drive itself, there may be a fair argument that Tesla shares in that blame. If it's a completely unreasonable belief (like me believing my 1998 Toyota is capable of self driving) then that argument falls apart. But if Tesla has promoted their self driving feature as being fully functional, used confusing descriptions like "Full Self-Driving", etc, it might become a pretty reasonable argument.
Every time I engage Autopilot in my Model S it admonishes me with a notice in the instrument cluster that I am to keep my hands on the wheel. If I don't make it clear to the car that I am there and holding on, by applying a little rotational force to the wheel at least every fifteen seconds, the car will remind me.
So how does one conclude that the car is capable of driving itself? Or is the version of Autopilot in the car in question different in this respect?
Autopilot is not autonomous driving and isn't marketed as such; Full Self Driving (FSD) is an extra cost option.
Does it remind you that it can't stop at an intersection, or brake to avoid hitting objects? If it did, then the person might be more responsible. But Elon wouldn't have let the engineers put that in because it goes against his grift.
It also ignores how Tesla promoted "Autopilot". Until very recently tesla.com/autopilot just showed a video saying the driver was only there for legal reasons. Yes, maybe technically they meant FSD (for which it's also a lie, and which has a lying name as well), but they were definitely mixing the terms up themselves (and I think the video predated FSD).
The model s has terrible phone docks. Don't get me started on cupholders, I'll bet people have drink mishaps all the time that affect driving.
I'm actually kind of serious about this - keeping people's stuff secure and organized is important in a moving car.
I'm surprised the touchscreen controls and the removal of stalks aren't coming under more safety scrutiny.
With the new cars without a PRND stalk, how can you quickly reverse the car if you nose out too far and someone is coming from the side? Will the car reverse, or go forward into danger?
And why was his mobile phone in his hand to drop, if he was driving? Most states have laws against mobile device usage while driving, and it was never a responsible thing to do even before the laws were enacted.
Maybe he would have pulled over if the car's capabilities hadn't been oversold. Two entities did stupid things: 1) the person, by not waiting to pull over because of Elon Musk's false claims, and 2) Tesla, via Elon Musk making those false claims.
The case number appears to be 1:21-cv-21940-BB (S.D. Fla.).
I practice in that court regularly. Beth Bloom has a reputation as a good trial judge, so I'm somewhat skeptical of Tesla's claims that the trial was rife with errors. That said, both the Southern District and the Eleventh Circuit are known to frequently and readily lop off sizable chunks of punitive damages awards.
The real battle is just beginning: post-trial motions (including for remittitur) will be due in about a month. Then an appeal will likely follow.
Wellbeing doesn't have a price tag. There's no amount of money someone could pay you to make up for your daughter dying early or your son becoming disabled.
However, $329M sounds like an imaginary amount of money in a liability claim. If this guy crashed into a parked car without Tesla being involved, the family would be unfathomably lucky to even get 1% of that amount.
I agree. If Tesla was not allowed to give their product a misleading name, it probably would get fewer sales. That's the rationale behind false advertising laws.
The Supreme Court has ruled on punitive damage awards going back to the 1990s and limited the excesses. You can't, for example, get $4 million in punitive damages on $4,000 in actual damages. The general rule, from the looks of the article, is 5x actual damages.
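For what it's worth, applying that yardstick to this verdict (using the $129M compensatory / $200M punitive figures quoted from CNBC elsewhere in the thread):

```python
# Figures as reported by CNBC for this case
compensatory = 129e6
punitive = 200e6

ratio = punitive / compensatory
print(f"punitive-to-compensatory ratio: {ratio:.2f}x")  # about 1.55x, well under a 5x guideline
```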
I don't understand this at all. Mercedes recently announced that they'd be liable for accidents caused by their Level 4 system. Does that mean that, by default, the driver is liable for all other self-driving features?
That’s apples and oranges. And Mercedes saying something certainly doesn’t change the law.
Mercedes built a sophisticated system designed to be able to handle what it might run into within the very limited domain it was designed for. Then they got it certified and approved by the government. Among other things, it only works at low speed. Not 60 MPH. And I think only on highways, not where there are intersections.
Tesla's system is not child's play, sure. But they unleashed it everywhere, without any certification and with relatively few limits, which I think they later had to tighten.
"Vehicle will not brake when accelerator pedal is applied" is displayed on the screen in Autopilot mode. The system warns you of this every time you have your foot on the pedal in Autopilot mode. I wonder if it did that back in 2019 as well, or if accidents like this spurred that as a UI change.
Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?
Yes I'm pretty sure that my 2015 Model S has always warned me that the car will not brake if I command it not to with my foot on the accelerator. It certainly did in 2019.
> Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?
He probably can't. Assuming the amount stands on appeal, and assuming that he doesn't have an insurance policy with policy limits high enough to cover it, he'll pay as much as he can out of personal assets and probably have his wages garnished.
He might also be able to declare bankruptcy to get out of some or all of it.
An interesting question is what the plaintiffs can do if they just cannot get anywhere near the amount owed from him.
A handful of states have "joint and several liability" for most torts. If this had been in one of those states the way damages work when there are multiple defendants is:
• Each defendant is fully responsible for all damages
• The plaintiff cannot collect more than the total damage award
In such a state the plaintiffs could simply ask Tesla for the entire $329 million and Tesla would have to pay. Tesla would then have a claim for $220 million against McGee.
Of course McGee probably can't come up with that, so McGee still ends up in bankruptcy, but instead of plaintiffs being shortchanged by $220 million it would be Tesla getting shortchanged.
The idea behind joint and several liability is that in situations like this, where someone has to be screwed, it shouldn't be the innocent plaintiff.
Many other states have "modified joint and several liability". In those states rather than each defendant being fully responsible, only defendants whose share of the fault exceeds some threshold (often 50%) are fully responsible.
For example, suppose there are three defendants whose fault shares are 60%, 30%, and 10%. In a pure joint and several liability state the plaintiff could collect from whichever are most able to come up with the money. If that's the 10% one, they could demand 100% from them and leave it to that defendant to try to get reimbursed from the others.
In a modified joint and several liability state with a 50% threshold the most the plaintiff can ask from the 30% and 10% defendants is 30% and 10% respectively. They could ask the 60% defendant for 100% instead. If the 60% defendant is the deep pockets defendant that works out fine for the plaintiff, but if it is the 10% one and the other two are poor then the plaintiff is the one that gets screwed.
Finally, there are some states that have gotten rid of joint and several liability, including Florida in 2006. In those states the plaintiff can only collect from each defendant based on their share of fault. Tesla pays their 33%, McGee pays whatever tiny amount he can, and the plaintiff is screwed.
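A rough sketch of the three regimes described above, using the 60/30/10 example (illustrative only; `max_collectible` is a made-up helper, and real statutes have many more wrinkles):

```python
def max_collectible(fault_shares, total_award, regime, threshold=0.5):
    """For each defendant, the most the plaintiff can demand directly
    under the three liability regimes described above."""
    result = []
    for share in fault_shares:
        if regime == "pure_joint_several":
            # any defendant can be asked for 100% of the award
            result.append(total_award)
        elif regime == "modified_joint_several":
            # only defendants at or above the fault threshold are on the hook for everything
            result.append(total_award if share >= threshold else share * total_award)
        else:  # "several_only" (e.g. Florida since 2006)
            result.append(share * total_award)
    return result

shares = [0.60, 0.30, 0.10]
max_collectible(shares, 329e6, "pure_joint_several")      # each: 329M
max_collectible(shares, 329e6, "modified_joint_several")  # ~329M, ~98.7M, ~32.9M
max_collectible(shares, 329e6, "several_only")            # ~197.4M, ~98.7M, ~32.9M
```

Which regime applies determines who eats the shortfall when one defendant is judgment-proof.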
If I were running a car company, I might decide to stop development on assistive features and remove any that exist from future models. Given the way people are using them, and the way the courts are punishing the companies for adding these safety features, is it worth it to them? Not having them would also bring down the base price on vehicles, which seems to be the primary complaint from people right now.
It's not assistive features in general that are the problem. It's that level 3 self-drive (which is what Tesla purports to have) should not be permitted to exist.
Things which simply help the driver, fine. Lane assist, adaptive cruise control and other such things that remove some of the workload but still leave the driver with the full responsibility of control. That's level 2.
Level 3 can generally run the car but is not capable of dealing with everything it might encounter. That's where you get danger: this guy thought he could pick up his dropped phone, and his car didn't even manage to stay in its lane. And the jury quite correctly said that's totally unacceptable.
The next step above level 2 should be skipping to level 4--a car that is capable of making a safe choice in all circumstances. That's Waymo, not Tesla. A Waymo will refuse to leave the area with the sufficiently detailed mapping it needs. A Waymo might encounter something on the road that it can't figure out--if that happens it will stop and wait for human help. Waymos can ask their control center for guidance (the remote operators do not actually drive, they tell the car what to do about what it doesn't understand) and can turn control over to a local first responder.
The only level 3 autonomous vehicles available for purchase in the U.S. are certain Mercedes-Benz models, and it only works on select roads in California and Nevada.
> Mercedes-Benz Drive Pilot: California and Nevada are currently the only states with roads approved for the Mercedes-Benz Drive Pilot. For 2025, only the EQS and S-Class offer Level 3 and then it’s by subscription.
Tesla hid evidence and was probably punished by the jury and I hardly feel sorry for Tesla. Maybe next time they will not do that, but I doubt it. Transparency was never a thing baked into the Tesla Musk DNA.
> CORRECTION: Tesla must pay portion of $329 million in damages after fatal Autopilot crash, jury says
This was added,
> The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages.
Whatever one believes about the state of FSD / Autopilot today (it still doesn't work well enough to be safe and probably won't for the next 5 years), I just don't see how one can argue that almost 10 years ago, when this thing would get into a fatal accident every 10 miles while Tesla was arguing it was safer than a human and would go from LA to NY autonomously within the year, it wasn't total bullshit and deceptive marketing.
"The person in the driver's seat is only there for legal reasons" - that was 2016... Funnily enough, in 2025 they are rolling exactly the same idea, as a Robotaxi, in California. Amazing.
Airplanes have autopilots too but pilots are still sitting there on the controls and have to be paying attention. A brand or technical name doesn't remove responsibility.
Car travel is such a weird thing. The other day I was walking on the driver side of a wide street where cars were passing me by very fast. Most of us are usually an arm's length away from passing cars when we do that, roughly an arm's length away from being hit at 20+ mph, yet humans do this careful beam walking without flinching. It's the American version of Indians riding the roof of a train. If you go to a third world country, they are even more acclimated to insanely dangerous traffic conditions (no traffic laws being followed, cars flying out from every angle). The whole thing is insanity, yet everyone is just going along with it.
A life built around car commutes is the outcome of a poorly thought out society. It had such stupid repercussions like racial segregation and the absurdity that everyone gets their own house along with the vertical airspace on top of it (exceedingly selfish). Along with that, it’s dangerous as fuck.
Someone needs to suggest dropping the speed limit across the board in America. Unfortunately the MAGA people will straight boycott roads if we do that (unfortunately?). Most people I talk to in Texas will concede the highways are full of maniacs.
Interestingly, removing this illusion of a safe "beam" or division is part of safer road design for mixed traffic areas like city streets.
When you remove physical divides between "road" and a "footpath" drivers become more aware that they are dangerous and slow down on their own. Making roads narrower has a similar effect, and removing car parking helps remove that divide and removes the blindspots they create.
There are many roads that should either stop pretending to be safe for pedestrians and turn into highways, or lower the speed limit and redesign the layout to suit the mixed traffic nature of the areas. Doesn't have to be extreme just be honest about the real use of the roads. If it's actually mostly pedestrians, suck it up, make it safe, go drive on the other faster roads.
One of the things that is particularly strange about traveling to the United States is that you pretty much have no choice but to rent a car, or at least have access to a car in some way, unless you're going to be spending all your time in a place like New York City. But in nearly every other major western destination, certainly throughout much of Europe, it's never even a thought that you would need to rent a car. I know plenty of grown adults here in Europe who don't even have a driver's license! It really makes life a lot more convenient when the world does not revolve around an automobile.
Merely dropping speed limits doesn’t work. Speed limits not enforced by road design such as road/lane width (real or by parked cars) or other features (bulb outs, neck downs) are speed suggestions.
Train travel is such a weird thing. The service, staff, stock, pricing, timetable etc is set by civil servants and politicians 250 miles away.
The ticketing system is Byzantine and if you make a mistake you get a fine or a criminal record, unless you look like a thug, in which case the staff rarely bother.
Tickets are miraculously priced similar to driving in many instances, despite a train looking nothing like a car. In various cases, they're more expensive than driving.
You can be late because a politician 200 miles away annoyed the staff so they decide not to turn up.
It also turns out services on Sunday ran on good will (overtime), so the annoyed staff who get ok pay rises now realised they don't want the overtime so good luck travelling on a Sunday in quite a few areas.
Because of low enforcement of infringements by thugs, they are drawn to the service. You'd think your very high ticket price would include some kind of security? No. Fuck you.
When the trains break down you are expected to be OK with being trapped on a train for hours with no open windows or working toilets because of some "greater good" and insufficient emergency response
Such is life in dense cities where workplaces are concentrated. And pace of life requires humans to take shorter path which is safe enough. At least it’s much safer than in the era of the horses. And sanitary conditions are much better.
Density is fundamentally incompatible with car-friendly cities and a large amount of road. In general, the denser a city becomes, the more people will go by foot, cycle, or use public transport, especially in Europe.
Roads take up an enormous amount of space, as does parking. We could capture significant real estate opportunities back from roads in cities that have multi-lane roads going through the CBD (mine has like a dozen criss crossing it). It wouldn't be an overnight thing, but close off a block of road, rezone it and sell it off, you'll get a bunch of new high density buildings that are walkable (since any roads going through will have to be narrow)
And then you get business parks on the outskirts of the city. Because otherwise both employees and visitors complain about parking. Which makes cheaper suburbs more compelling. And some people living downtown driving to workplaces in outskirts of the city.
At least that's what is happening in my whereabouts. I remember when I was a kid, downtown was the prime location for regular shopping and doing daily business. Now all of that has moved to big box stores along major traffic arteries. Nowadays downtown is just a tourist trap with super expensive boutiques. I still go there to take a walk once a year or so, but besides that there's no reason to go there anymore for the vast majority of people. But suburb life is flourishing, with shops and services that are needed for everyday life. Cheap and good eateries. Even casual entertainment is moving away from the city core.
All in all, not sure if it’s good or bad. I couldn’t afford living in downtown even in old era. And now I can live comfy life in suburbs. Not helping much to lower total amount of traffic though.
Oh it is. You get traffic jams, and that's not exactly car-friendly. But still people drive. On top of that, walking on a sidewalk when a bus passes close by is much more interesting than a car passing. And then there's all the commercial traffic that is necessary to serve a dense city.
And many cities, especially here in Europe, are dense enough to need some mode of transit, especially with workplaces concentrated either in office parks or industrial districts by the city, but they're not multi-million megalopolises that warrant a sweet metro system. And that's where cars shine.
Although after living in one of the largest cities on earth with a very nice train system… there’s still plenty of traffic. And people still drive. Especially those who can’t afford living in city core and end up in cheaper more remote locations.
When I lived in London, people waiting for the tube were less than an arm's length away when it rushed in. Blew my mind. Eventually I got used to it and did the same.
The difference is that the tube driver (assuming there even is one) can't just decide to take a slight turn from the planned route, while the car driver really might.
It does happen, it happened to me — a driver was distracted (because they were texting while driving) and they veered from the traffic flow and I was hit at 40 mph. The EMS worker who responded to the scene told me two things: 1. They almost never pull people alive from this kind of accident, 2. This kind of head-on, no brakes, distracted driver accident was happening more and more. This was 10 years ago.
Yeah but look at how normalized it is. All you need to do is slightly misstep, out of the thousands of steps you take, to bump your head on a passing train. You think it’s not possible (part of the trance) but all the parameters are set up so that the slot machine eventually hits the jackpot. You’re 12 inches away from a projectile, and you are to repeat the loop daily for a lifetime. The odds are the odds.
"Tesla designed Autopilot only for controlled-access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans"

https://www.bbc.com/news/articles/c93dqpkwx4xo
This is a clear case of misleading marketing. Spreading deceptive information and then hiding contradictory details in the fine print simply won't fly.
Intent is central to legal interpretation. There's been a clear intent to mislead about Tesla Autopilot's capabilities for years, and it's finally starting to backfire. It doesn't matter if Tesla is, on average, better if individual cases are lost due to constant misleading marketing.
Tesla tried to hide evidence … the jury probably did not like to be lied to by Tesla.
F** around and find out
> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.
> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.
"Schreiber acknowledged that the driver, George McGee, was negligent when he blew through flashing lights, a stop sign and a T-intersection at 62 miles an hour before slamming into a Chevrolet Tahoe that the couple had parked to get a look at the stars."
> “Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” Brett Schreiber, counsel for the plaintiffs, said in an e-mailed statement on Friday.
Tesla had every opportunity to do the safe thing and limit Autopilot to only highways, but they want the hype of a product that "works anywhere" and don't want to be seen as offering a "limited" product like their competitors. Then they are surprised their users misuse it and blame it all on the driver. Tesla wants to have their cake and eat it too.
You know what else prevents misuse? Implementing safeguards for your safety critical product. It took multiple NHTSA probes for Tesla to implement obvious driver monitoring improvements to Autopilot/FSD. There's a reason Tesla is always in hot water: they simply lack the safety culture to do the right thing proactively.
A lot of discussion in this thread about the technical meaning of “autopilot” and its capabilities vs FSD.
This is really missing the point. Tesla could have called it “unicorn mode” and the result would still be the same.
The true issue at hand is that Elon Musk has been going around for over a decade now telling people that their cars are going to drive themselves completely, overstating Tesla's capabilities in this area. Based on the sum totality of the messaging, many lay consumers believe Teslas have been able to safely drive themselves unsupervised for a long time.
From a culpability standpoint, you can’t put all this hype out and then claim it doesn’t matter because technically the fine print says otherwise.
> it doesn’t matter because technically the fine print says otherwise.
Every time you engage the system it tells you to pay attention. It also has sensors to detect when you don’t and forces you. If you have more than N violations in a trip, the system is unavailable for the remainder of your trip.
I don’t know how much clearer it could be.
I would argue that the system is actually so good (but imperfect) that people overestimate how good it is, and let their guard down.
If a system were more error prone, people would not trust it so much.
Maybe not give it a misleading name that implies full self-driving capabilities. Also not have the CEO publicly make grandiose claims of the performance over 8 years.
> If a system were more error prone, people would not trust it so much.
Unfortunately not. YouTube is full of videos of FSD trying to crash into oncoming traffic, parked cars, etc., but then at the end of the video the driver goes "well, that was pretty impressive" and just ignores all the suicide attempts.
Look at what Elon and even the official Tesla account are publishing daily on Twitter: tweets suggesting that your car can drive you when you have a health emergency and need to get to the ER, or that you can make an espresso while your car is driving itself.
Can you point to any other automaker that even comes close to this level of false advertising?
Which is kind of the entire problem. It’s Tesla bringing this grey-zone product to market that the Jury/Judge found to be problematic (and thus ruled against Tesla).
To me? An infinite amount! So trying to measure it in dollar terms is already a pointless exercise.
If you want to see the hole in this argument, try looking at it from the other side: how can you arrive at the conclusion that a life is worth only $329M? Why so low? Why not a billion, or a trillion?
If you have to come up with an objective and dispassionate standard, then I doubt you'd find one where the number it generates is $329 million.
> So trying to measure it in dollar terms is already a pointless exercise.
That strikes me as an argument for a judgement of zero dollars, which is the exact opposite of the reality that life is worth infinitely more than money.
The driver literally stopped paying attention to find his phone that he dropped. Is that really a fanboy thing or just total irresponsibility on the driver’s part?
HN commenter: "On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly."
What this sentence is describing are compensatory damages, e.g., compensation for loss of life
That number was 129 not 329, and Tesla was only found liable for 33% of it
CNBC: "Tesla's payout is based on $129 million in compensatory damages, and $200 million in punitive damages against the company."
CNBC: "The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages."
42.5 is not even close to the 329 number that the HN commenter claims "feels too high"
HN commenter: "This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road"
This sentence is describing something more akin to punitive damages, e.g., a fine
That number was 200 not 329
It seems the HN commenter makes no distinction between compensatory and punitive damages
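Laid out as arithmetic (using the CNBC figures quoted above):

```python
# Figures as reported by CNBC
compensatory = 129e6
punitive = 200e6
tesla_fault_share = 0.33  # jury assigned Tesla 33% of the fault

total_award = compensatory + punitive            # the $329M headline number
tesla_compensatory = tesla_fault_share * compensatory

print(f"total award: ${total_award / 1e6:.0f}M")
print(f"Tesla's compensatory share: ${tesla_compensatory / 1e6:.1f}M")  # close to CNBC's "about $42.5 million"
```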
>I hope we haven't internalized the idea that corporations should be treated the same as people.
Only when it comes to rights. When it comes to responsibilities the corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.
Would be nice to see executions of corporations as punishment.
Perhaps better to achieve symmetry by ceasing to execute humans.
You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human, so you can't make the scores even, but you can stop adding to the total of human misery.
Historically it was impractical to permanently warehouse large numbers of humans, so death was more practical. But the US has been doing it for all sorts of crap for decades, so that's not a problem here.
The US would still have much harsher punishments than Western Europe even without the death penalty, because it routinely uses life-means-life sentences where no matter what you're never seeing the outside again.
>You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human
What if we garnished 100% of the future wages of all the employees in perpetuity as well as dissolving the corporate entity? You know, to make sure the company stays all the way dead.
I guess my bad for not specifying that it'd need to be wicked for the corporation, not the humans.
> Would be nice to see executions of corporations as punishment
Fines. Massive fines.
"Corporate death penalty" is a genius invention of corporate lawyers to distract from the punitive effect of massive fines.
Fines and license revocations are precedented. They take real money from the corporation and its owners. A corporate death penalty is a legal morass that doesn't actually punish shareholders; it just cancels a legal entity. If I own an LLC and have a choice between a fine and the LLC being dissolved, I'd almost always opt for the latter.
But fines are boring. Corporate death penalty sounds exciting. The anti-corporate folks tend to run with it like catnip, thus dissolving the coalition for holding large companies accountable. (Which, again, a corporate "execution" doesn't do. Nullifying my LLC doesn't actually hurt me, it just creates a little bit of work for my lawyer, and frankly, getting out of a fuckup by poofing the LLC without touching the underlying assets is sort of the selling point of incorporation.)
Corporate fines are a genius invention of corporate execs' personal lawyers to distract from the fact that all corporate malfeasance is conducted by actual people who could be held accountable.
> Corporate fines are a genius invention of corporate execs' personal lawyers
Ahistoric and orthogonal. Corporate fines and personal sanctions have coëxisted since corporations were a thing. Charter revocations, on the other hand, have almost always followed individual liability, because again, just poofing a corporation doesn't actually do anything to its assets, the part with actual value. (In the English world, corporations frequently came pinned with trade charters. The actual punishment was losing a trade monopoly. Not a legal fiction being dissolved.)
Nothing about corporate death penalties or corporate fines prevents personal liability. And neither particularly promotes it, either, particularly if guilt is never acknowledged as part of the proceedings.
I'm guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders.
> guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders
This is just expropriation. Which is legally complicated. And raises questions around what the government should do with a bunch of seized assets and liabilities that may or may not be in a going condition, and whether some stakeholders should be reimbursed for their losses, for example employees owed pay, also do pensions count, and if so executive pensions as well, and look at that the discussion got technical and boring and nobody is listening anymore.
On the other hand, a massive fine punts that problem to the company. If it can pay it, great. It pays. If it can’t, we have bankruptcy processes already in place. And the government winds up with cash, not a Tesla plant in China.
Corporate death penalties are stupid. They’re good marketing. But they’re stupid. If you want to hold large companies unaccountable, bring it up any time someone serious threatens a fine.
>And raises questions around what the government should do with a bunch of seized assets and liabilities that may or may not be in a going condition, and whether some stakeholders should be reimbursed for their losses, for example employees owed pay, also do pensions count, and if so executive pensions as well, and look at that the discussion got technical and boring and nobody is listening anymore.
Bullshit they aren't listening. Executives get no payout. Whatever happened happened on their watch. Summary dismissal. Keeps the incentives aligned. Otherwise you're just incentivizing unlawful behavior behind the corporate veil. Pensions do count. Since we've got such a fucked up social safety net in the U.S. that doubles as a mandatory investment slush fund for our corporate overlords, stewardship of said retirement funds should obviously be escrowed until such time as a new more compliant set of execs can either be installed, or the assets can be unwound.
There is no point in leaving something in the hands of the provably reckless/malicious. Which is exactly what these types of people (executives making decisions that are traceable to reasonably foreseeable loss of life) are.
I'm done with victim blaming when we've made a habit of building orphan mulching machines that we just happen to call corporations.
The “LL” in LLC stands for Limited Liability. The whole point is to financially insulate the owner(s).
To be fair, I think they’re talking about seizing the LLC’s assets. Not the members’.
Which is why you see execs get golden parachutes for failing. The assets get transferred to private parties who walk away to repeat the exact behavior they were just rewarded for at another firm.
Everyone spewing opinions in this thread so they can get their upvotes from like-minded readers is missing the 800lb gorilla.
It's not in the state's interest to kill profitable things most of the time except occasionally as a deterrent example. It's the same reason richer people (who pay a lot of taxes, engage in activity spawning yet more taxes, etc) tend to get probation instead of jail. Likewise, the state is happy to kill small(er) businesses. It does this all the time and it doesn't make the news. Whereas with the big ones it just takes its pound of flesh and keeps things running.
As long as that incentive mostly remains, the outcomes will mostly remain.
> Only when it comes to rights.
"Corporations are people" means a corporation is people, not a corporation is a person.
People have rights, whether they are acting through a corporation or not. That's what Citizens United determined.
I hope you think about who misled you to thinking that "corporations are people" meant a corporation is a person and trust them a little less.
In most jurisdictions, a corporation is a juridical person[1]. When not explicitly mentioning natural persons, whether a corporation is a "person" is thus ambiguous.
[1]: https://en.wikipedia.org/wiki/Juridical_person
I agree that Tesla should receive punitive damages. And the size of the punitive damages must be enough to discourage bad behavior.
I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.
I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?
>> it "feels" wrong to give a corporation-sized punishment to a small group of individuals
This feeling has a name: loss aversion.
It's a really interesting human trait. About 66% of people feel bad when someone else does well. The impact of this feeling on behavior (even behavior that is self-harming) is instructive.
The concept of "fairness" comes into play as well. Many people expect that the "world is fair" despite all evidence that it isn't. That results in "everything I don't get is unfair" whereas "everything I get I earned on my own." Someone else getting a windfall is thus "unfair".
It is definitely not loss aversion. It also has nothing to do with whether or not someone else is getting the money. Handing me nearly half a billion because a parent died would certainly be welcome, but I think it would feel equally disproportionate and out-of-place.
Killed, not just died.
That really doesn't sound like loss aversion.
> I'm not necessarily sure the victim(s) should get all of the punitive damages.
I have some great news for you, then: the attorney probably took a third (more if they win an appeal).
> But is there an even better possible place to disburse the funds from these types of fines?
Oh, my mistake: I thought you meant way worse.
$300M means very little to Tesla. The stock didn't even drop a bit (other than the usual market fluctuations today). Perhaps $4.20B or $6.90B would've been meaningful. Elon notices these numbers.
Not doing what it asks - “keep your hands on the wheel” and “eyes on the road” - and crashing the car is somehow Elon Musk’s fault LOL hn logic. Can’t wait to sue lane assist when I drive drunk and crash!
It's supposed to stop if objects appear in its path. For sure you're an idiot if you trust Tesla's autopilot, but I think it's reasonable to partially fault Tesla for setting the consumer's expectation that the car stops when obstacles get in the way even if the vehicle isn't being operated exactly as suggested by the manufacturer.
"Somehow?" We're literally discussing a court case where culpability was proven and accepted by a judge.
Maybe the system should sound an alarm and slow down immediately when there are no hands on the wheel and eyes are not on the road. That would have avoided this accident, so it seems Tesla was at fault for not doing that.
> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.
What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since release and over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in scores.
This is obviously targeted and the court system should not be playing favorites or going after political opponents
> What behavior do you want them to change?
Don't advertise their driver assist system as "full self driving".
The system involved in this crash was never advertised as "full self driving".
From https://web.archive.org/web/20211002045634/https://www.tesla...:
> Tesla cars come standard with advanced hardware capable of providing Autopilot features, and *full self-driving capabilities* — through software updates designed to improve functionality over time.
> Tesla's Autopilot AI team drives the future of autonomy of current and new generations of vehicles. Learn about the team and apply to help accelerate the world with *full self-driving*.
Now you could say that can be interpreted in multiple ways - which means the copywriter is either incompetent or intentionally misleading. Interestingly, the text from 2019 (https://web.archive.org/web/20191225054133/tesla.com/autopil...) is written a bit differently:
> ...full self-driving capabilities *in the future*...
> This is obviously targeted and the court system should not be playing favorites or going after political opponents
This was a jury trial of a civil case - the family of the deceased took Tesla to court, not an anti-Tesla/Musk court system conspiracy.
- FSD came out in October 2020; I suppose rounding up puts it at "nearly 10 years". That also, literally, doubles the number from its actual value.
- There have been a lot more than one incident. This is one court case about one incident.
- There are an insane number of accidents reported; does it only matter to you if someone dies? A lot more than one person has died in an accident that involved a vehicle that was equipped with FSD.
- Your comment is obviously targeted and disingenuous.
There was even a recall over it: https://www.eastbaytimes.com/2023/02/16/tesla-full-self-driv...
So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.
I went into a Tesla dealership nearly 10 years ago to take a look at the cars, and the salespeople were telling me - in no uncertain terms - that the cars were fully self-driving.
I knew that was complete nonsense, because I knew people who worked on Tesla's self-driving software, but that's how Tesla salespeople were selling the cars.
> It's been nearly 10 years since released and over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in scores.
And how many times did humans have to take over and save themselves and others from Tesla killing or injuring them? Tesla won't tell us these numbers, guess why? The tech might be safe with a human backup driver, but so far you need a human paying attention to save himself from the car's bugs/errors/glitches, etc.
I really hate these bullshit safety claims pulled from someone's ass. It is like me trying to convince you to get operated on by an AI doctor by claiming "it is better than an old and drunk doctor, it only killed a few people when the people supervising it did not pay attention, but otherwise it was very safe; we can't tell you how many times real doctors had to perform the hard work while our AI doctor only did stitching, those numbers need to be secret, but trust us, the human doctors that had to intervene are only there because of the evil laws, it could do the full job itself, we would not call it Full Competent Doctor if it couldn't perform all expected tasks fully."
Tesla was found partially liable for this crash. The reason they were liable was they sold something claiming (practically speaking) that it could do something. The customer believed that claim. It failed to do that thing and killed people.
So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.
And the fine needs to be high enough to prevent them from just saying - oh, well, we can make money if we keep doing it.
If you could only fine a person for committing murder, you wouldn't fine a billionaire $5m, and then hope he wouldn't go on killing everyone he thinks he'd rather have dead than $5m.
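The point about fines needing to outweigh the payoff is just expected-value arithmetic. A minimal sketch, with all figures hypothetical:

```python
# Back-of-envelope deterrence math. All figures are made up for
# illustration; nothing here comes from the actual case.

def misbehaving_is_profitable(profit, p_caught, fine):
    """A rational firm keeps misbehaving while expected profit
    exceeds the expected penalty: profit > p_caught * fine."""
    return profit > p_caught * fine

# Suppose a deceptive claim adds $1B in sales margin and there's
# a 10% chance of being held liable in court.
profit = 1_000_000_000
p_caught = 0.10

print(misbehaving_is_profitable(profit, p_caught, 3_000_000))    # True: no deterrence
print(misbehaving_is_profitable(profit, p_caught, 300_000_000))  # True: still profitable
# To deter, the fine must exceed profit / p_caught, i.e. $10B here.
print(misbehaving_is_profitable(profit, p_caught, 10_000_000_001))  # False
```

Under these assumed numbers even a $300M fine is absorbed as a cost of doing business, which is the parent comment's point about why fines scale with the defendant.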
> On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, (...)
Let me stop you right there. That's not how damages work.
Damages have two goals: compensate victims, and dissuade offenders from repeating the same mistakes. The latter involves punishments that discourage repeat offenses.
That's where these high values come from. They are intended to force the likes of Tesla to not ignore the lives they are ending due to their failures.
If damages were low, the likes of Tesla would do absolutely nothing and absorb them as operational expenses, and continue to cause deaths claiming they are unavoidable.
Once the likes of Tesla are forced to pay significant volumes of cash in damages, they suddenly find motives to take their design problems seriously.
I tend to agree, however the government is not an unincentivized incentivizer. By being able to impose such fines, the government is potentially incentivized not to prevent these accidents, since they generate this kind of revenue.
There are ways to mitigate this, such as forcing the government to use these revenues in a way that is relevant to the issue at hand, i.e. creating safety jobs, strengthening control authorities, or something else.
You could also say that the amount is insignificant, but that could of course change with every lawsuit, and it accumulates. Or one could speculate that the interests are not really monetarily aligned at all (e.g. prisons), or that the judicial system is independent enough to stop the propagation of these incentives. I still think it is necessary to consider these motives and deliberately align them between the relevant actors.
> Damages have two goals: compensate victims, and dissuade offenders
Let me stop you right there. Just the compensatory damages were 129 million. And most of that was charged to the driver, no corporate boost there.
The US justice system uses punitive damages very heavily. And Tesla should absolutely get some punishment here.
In most other places you'd see it paying hundreds of millions in fines and a few million in damages.
I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.
"[Plaintiffs] claimed Tesla’s Autopilot technology was flawed and deceptively marketed."
do you think they heard "autopilot" or "full self driving"?
I don't think these terms are meaningfully different in the heads of most people.
I know autopilot in airplanes is a set of assistive systems which don't remotely pretend to replace or obsolete humans. But that's not typically how it's used colloquially, and Tesla's marketing benefits heavily from the colloquial use of "autopilot" as something that can pilot a vehicle autonomously.
You really think the defense wouldn’t have objected if the wrong term was used, or that the judge would allow its continued use?
> I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.
It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it causes the same to jurors.
There's another similar argument to be made about the massive amount awarded as damages, which maybe will be lowered on appeal. If people (Tesla included) can make the argument that when a car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even for a time). There are millions of Teslas on the road today so proportionally it's a low amount per unsafe car.
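The proportionality argument above comes down to trivial arithmetic. A sketch; the award figure is from the article, but the fleet size below is a hypothetical round number, not a real count:

```python
# Rough per-car share of the award. The fleet size is an assumed
# round number for illustration, not an actual Tesla statistic.
award = 329_000_000   # total damages, USD
fleet = 5_000_000     # hypothetical number of Teslas on the road

per_car = award / fleet
print(f"${per_car:.2f} per car")  # $65.80 per car under these assumptions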
"Autopilot" isn't even the most egregious Tesla marketing term since that honour goes to "Full Self-Driving", which according to the fine text "[does] not make the vehicle autonomous".
Tesla's self-driving advertising is all fucking garbage, and then some George McGee browses Facebook while believing that his car is driving itself.
As gets pointed out ad nauseum, the very first "cruise control" product in cars was in fact called "Auto-Pilot". Also real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free.
This is a fake argument (post hoc rationalization): It invents a meaning to a phrase that seems reasonable but that has never been rigorously applied ever, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.
> real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free
Pilot here. If my G1000’s autopilot were flying and I dropped my phone, I’d pick it up. If my Subaru’s lane-keeping were engaged and I dropped my phone, I might try to feel around for it, but I would not go spelunking for several seconds.
The market Tesla is advertising to is not airplane pilots. It is the general car buying public.
If they are using any terms in their ads in ways other than the way the people the ads are aimed at (the general car buying public) can reasonably be expected to understand them, then I'd expect that could be considered to be negligent.
Much of the general public is going to get their entire idea of what an autopilot can do from what autopilots do in fiction.
The dictionary definition for Americans is:
> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.
https://ahdictionary.com/word/search.html?q=automatic+pilot
Note that “autopilot” and “automatic pilot” are synonyms.
https://ahdictionary.com/word/search.html?q=Autopilot
An autopilot is supposed to be an automatic system, which doesn’t imply supervision.
https://ahdictionary.com/word/search.html?q=automatic
> Self-regulating: an automatic washing machine.
Notably, an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way. It's just that the sky is very big and other aircraft are very small, so random collisions are extremely unlikely.
> an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way
TAWS (terrain) and ACAS (traffic) are built into modern autopilots.
And Tesla lied about its autopilot’s capabilities in proximity to this crash: “In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own. ‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)”
https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
Airplanes and automobiles differ in a number of ways.
That’s why we have a jury.
Autopilot quite literally means automatic pilot. Not “okay well maybe sometimes it’s automatic”.
This is why a jury is made up of average people. The technical details of the language simply do not matter.
Couldn't agree more. This thing where words have a common definition and then a secret custom definition that only applies in courts is garbage. Everyone knows what "full self driving" means, either deliver that, come up with a new phrase or get your pants sued off for deceptive marketing.
Autopilot is used when referring to a plane (until Tesla started using it as a name for their cruise control that can steer and keep distance).
In the context of a plane, autopilot always meant automatic piloting at altitude, and everyone knew it was the human pilots that were taking off and landing the plane.
Did they?
I think you may be overestimating how much average people know about autopilot systems.
Pilot is a relatively high status career. They are always shown taking off and landing in tv shows and movies. I would be surprised if people thought they just sit back and relax the entire time.
Be prepared to be surprised
It's also worth mentioning he would have been required to keep his hands on the wheel while using autopilot, or else it starts beeping at you and eventually disables the feature entirely. The system makes it very clear you're still in control, and it will permanently disable itself if it thinks you're not paying attention too many times (you get 5 strikes for your ownership duration).
The strikes reset after a week, they do not persist for the duration of your ownership of the vehicle.
https://www.tesla.com/support/autopilot - section “How long does Autopilot suspension last?”
>you get 5 strikes for your ownership duration).
It isn't ownership if the device can disable things against you. That's licensing at best.
As is also pointed out ad nauseam, the claims made about autopilot (Tesla) go far beyond the name, partly because they sold a lot of cars on lies about imminent "FSD" and partly because as always Elon Musk can't keep his mouth shut. The issue isn't just the name, it's that the name was part of a full-court-press to mislead customers and probably regulators.
Is there any contextual difference between the first instance of cruise control (which has since been relabeled cruise control, perhaps with reason), automatic flight control, and a company whose CEO and fanboys incessantly talk about vehicle autonomy?
> On one hand I don't think you can apply a price to a human life
Yes, although courts do this all the time. Even if you believe this as solely manufacturer error, there are precedents. Consider General Motors ignition switch recalls. This affected 800k vehicles and resulted in 124 deaths.
> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches
So about $5m per death, and $300m to the government. $329m seems excessive for one death, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.
https://en.wikipedia.org/wiki/General_Motors_ignition_switch...
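The per-death figure above checks out as arithmetic, using only the numbers quoted in the comment (which treats the $600M compensation as part of the $900M forfeiture):

```python
# Reproducing the GM arithmetic cited above, using only the
# comment's own numbers.
forfeiture = 900_000_000    # forfeited to the United States
compensation = 600_000_000  # paid to surviving victims of the crashes
deaths = 124

print(round(compensation / deaths))  # 4838710, i.e. "about $5m per death"
print(forfeiture - compensation)     # 300000000, the "300m to the government"
```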
If you make a manufacturing error without intentionally deceiving your customers through deceptive naming of features, you have to pay millions per death.
If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.
Makes sense to me.
Wouldn't that logic mean any automaker advertising a "collision avoidance system" should be held liable whenever a car crashes into something?
In practice, they are not, because the fine print always clarifies that the feature works only under specific conditions and that the driver remains responsible. Tesla's Autopilot and FSD come with the same kind of disclaimers. The underlying principle is the same.
There are plenty of accurate names Tesla could have selected.
They could have named it "adaptive cruise control with assisted lane-keeping".
Instead their customers are intentionally led to believe it's as safe and autonomous as an airliner's autopilot.
Disclaimers don't compensate for a deceptive name, endless false promises and nonstop marketing hype.
If it was called "comprehensive collision avoidance system" then yes.
Right, this is the frustrating thing about courtroom activism and general anger towards Tesla. By any reasonable measure, this technology is safe and useful. It has over 3.6 billion miles driven, currently about 8m miles per day. I can see why plaintiffs go after Tesla: they have a big target on their back for whatever reason, and activist judges go along. But I don't get how someone on the outside can look at this and think that this technology or the marketing over the last 10 years is somehow deceptive or dangerous.
https://teslanorth.com/2025/03/28/teslas-full-self-driving-s...
> activist judges
Wait what? What activism is the judge doing here? The jury is the one that comes up with the verdict and damage award, no?
The product simply should not be called Autopilot. Anyone with any common sense could predict that many people will (quite reasonably) assume that a feature called Autopilot functions as a true autopilot, and that misunderstanding will lead to fatalities.
> feature called Autopilot functions as a true autopilot
What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.
I would argue you are creating a definition of "autopilot" that most people do not agree with.
It can be called anything in an airplane because there the pilot has some level of training with the system and understands its limits. You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.
A car is another thing entirely because the general population's definition of "autopilot" does come into play and sometimes without proper training or education. I can just go rent a tesla right now.
>You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.
...Um... You did get pilots hopping into a 737 MAX, getting training that barely mentioned an automated anti-stall and flight envelope management system called MCAS that eventually flew several planeloads of people into the ground. That was management's idea too, btw.
IANAP but I think they can take their hands off the controls and pick up a dropped phone.
So you can in a Tesla when it's used correctly. Can you enable a plane's autopilot and still crash into a mountain?
Modern autopilots? No, they will not crash into a mountain or another plane.
So you’re a pilot?
You asked the question to someone you knew wasn't a pilot.
So why are you ignoring the answer you got, instead waiting to hear if the person that answered is a pilot before you acknowledge it?
The word literally means automatic pilot. Legal departments create an alternate definition of the word to not get sued and most people will interpret the word literally.
What matters is the definition which most people use.
https://news.ycombinator.com/item?id=44761341
I guess I don't understand how.
> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.
Applies here. As far as I can tell the system did exactly that. But the details of the actual event are unclear (I've heard it hit a parked car, but also that it just hit the back of another car?)
It’s an emergent technology. The onus is on Tesla to be crystal clear about capabilities, and consistently so. People might quite reasonably infer that something which is labeled as “auto-“ or “automatic” doesn’t require supervision.
Such as automobile?
Automobiles are self-propelled, not self-navigating. They don't rely on horses to pull them.
automobile: self-propelled (self moving)
autopilot: self-piloting (self navigating)
https://www.ahdictionary.com/word/search.html?q=automobile
Indeed, since you do not need to externally supervise the drivetrain in a car.
It didn't maintain the course. It kept going at a T intersection and ploughed into another car.
Anyone who has used it knows its limitations. IDK, maybe in 2019 it was different tho; now it's full of warnings that make it barely usable when distracted. Ironically you are better off disabling it and staring into your phone, which seems to be what regulators actually want.
And by the way, what is a true autopilot? Is the average Joe a 787 pilot who's also an autopilot master?
Funny that pretty much every car ships with autosteer now. The ones I've used didn't seem to have many of the warnings, explanations, disclaimers or agreements that pundits here assume they should.
A true autopilot is a system on a boat or aircraft that keeps it on a set heading. ISTM in this case that's what the autopilot did.
Boats and aircraft are both different from automobiles. They have full freedom of movement. You can't set a course in the same way with an automobile, because the automobile will need to follow a roadway and possibly make turns at intersections. Boats and aircraft can have waypoints, but those waypoints can be mostly arbitrary, whereas a car needs to conform its path to the roadway, traffic, signage, etc.
It's an entirely different domain.
Yes, an autopilot is not what you need on a car.
There's two conflicting goals here, Tesla's marketing department would really like to make you think the car is fully autonomous for financial reasons (hence autopilot and full self driving) and then there's Tesla's legal department which would prefer to blame somebody else for their poor software.
The fault with an individual can be reasonably constrained to the one prosecuted death they caused. The fault with "autopilot by Tesla", a system that was marketed and deployed at scale, cannot.
And if you want to draw parallels with individuals: an individual's driver's license would be automatically suspended or revoked if they were found at fault for manslaughter. Would you propose a minimum 1~3 year ban on autopilot-by-Tesla within the US, instead?
*Punitive damages*. From another article: "They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident." If Tesla is destroying evidence then yeah they ought to feel the pain, and those personally responsible should be charged as well. If you make it cheaper to evade the law than comply, what good is the court at all?
329 million is not just compensatory damages (the value of the human life) but also punitive damages. That number floats up to whatever it takes to disincentivize Tesla in the future.
329 million too high? if you had the money and handing it over would save your life, would you rather keep the money as a corpse?
So wrongful death liability should be infinite, or maybe just equal to the money supply (pick one, I guess)?
I just did some googling around:
> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.
> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.
-https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...
Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered turned over during discovery. If that's what happened here, then I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.
I think the conceptually messed up part is, when such an award includes separate components for compensatory damages and punitive damages, the plaintiff receives the punitive damages even if they're part of a much broader class that was impacted by the conduct in question. E.g. how many people's choice to purchase a Tesla was influenced by the deceptive marketing? How many other people had accidents or some damages? I think there ought to be a mechanism where the punitive portion rolls into the beginning of a fund for a whole class, and could be used to defray some costs of bringing a larger class action, or dispersed directly to other parties.
So if I file a lawsuit and prove there is a small possibility my toaster can cut my arm off, because that's what it did to me, and win $400,000,000, should I only get $400 if it turns out they sold 1 million units?
It’s not a class action lawsuit. If they want their cash they should sue too. That’s how our system works.
No, you misread that pretty significantly. They're only talking about splitting up the punitive damages.
Using the tesla numbers you'd get somewhere between 43 and 129 million dollars personally, plus a share of the 200 million. And your share of the 200 million would probably be a lot higher than someone merely defrauded. And the 200 million would probably get a lot bigger in a proper class action.
If you want numbers for your theoretical you'll have to specify how much was compensatory and how much was punitive.
>I don't think you can apply a price to a human life
Not only can we, it's done routinely. For example, see this Practical Engineering video: https://www.youtube.com/watch?v=xQbaVdge7kU
It's the sum of damages and punitive damages.
On the flip side: Penalties should be scaled relative to one's means so that the wealthy (whether people or corporations) actually feel the pain & learn from their mistakes. Otherwise penalties for the wealthy are like a cup of coffee for the average Joe -- just a "cost of business."
I'm also a big proponent of exponential backoff for repeat offenders.
Let's fine Apple too, since they allow smartphones to be used while driving.
If anyone's confused about what to expect of autopilot and/or FSD, it's Tesla's doing and they should be getting fined into oblivion for the confusion and risks they're creating.
It's sort of like with the Prius acceleration debacle we had, people always want to blame the car and not their own actions.
> Tesla Model S in Autopilot mode allegedly came to a T-intersection and, failing to see that the roadway was ending, kept his foot on the accelerator
Autopilot requires you to have your foot on the accelerator? That seems weird to me.
Normally, Autopilot does not require your foot on the accelerator. Pressing the accelerator overrides it, making the car move ahead where Autopilot decided to stop. At the time, Autopilot would always stop at an intersection, and the driver was supposed to move ahead when it was clear.
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.
Hard for me to see this as anything but the driver’s fault. If you drop your phone, pull over and pick it up or just leave it on the floor. Everyone knows, and the car tells you, to pay attention and remain ready to take over.
The argument is that if the driver had been in a different vehicle he would have done just that, pulled over and picked it up, but because he believed the Tesla was capable of driving safely on its own he didn't do so.
Normally I turn the steering wheel when I want to turn my car. If you sold me a car and told me it had technology to make turns automatically without my input then I might let go of the wheel instead of turning it, something I would never have done otherwise. If I then don't turn and slam straight into a wall, am I at fault for trusting what I was sold to be true?
If the driver has heard that their Tesla is capable of autonomous driving, and therefore trusts it to drive itself, there may be a fair argument that Tesla shares in that blame. If it's a completely unreasonable belief (like me believing my 1998 Toyota is capable of self driving) then that argument falls apart. But if Tesla has promoted their self driving feature as being fully functional, used confusing descriptions like "Full Self-Driving", etc, it might become a pretty reasonable argument.
Every time I engage Autopilot in my Model S it admonishes me with a notice in the instrument cluster that I am to keep my hands on the wheel. If I don't make it clear to the car that I am there and holding on, by applying a little rotational force to the wheel at least every fifteen seconds, the car will remind me.
So how does one conclude that the car is capable of driving itself? Or is the version of Autopilot in the car in question different in this respect?
Autopilot is not autonomous driving and isn't marketed as such; Full Self Driving (FSD) is an extra cost option.
Does it remind you that it can't stop at an intersection, or brake to avoid hitting objects? If it did, the person might be more responsible. But Elon wouldn't have let the engineers put that in because it goes against his grift.
Except driver already accepted liability.
Also, this doesn't hold water today, as all new cars have some basic autosteer.
No one else falsely advertises it as "Full Self Driving".
FSD wasn’t even released at that time.
Substitute Autopilot then.
No other auto maker uses similar language. Ford and GM use BlueCruise and SuperCruise, clearly implying an improved kind of cruise control.
From what bits I've seen of the trial, at least a chunk of it was devoted to the meaning of the word "autopilot". Tesla even brought in a linguist to argue their case: https://bsky.app/profile/niedermeyer.online/post/3lunfw2s2is...
Makes sense. If you’re Tesla’s lawyer you need to do whatever you can to try to prove the name wasn’t a promise of functionality it didn’t meet.
Autopilot in an aircraft is basically cruise control, not full self flying.
This is such a pointless talking point, it amazes me how pro-Tesla people have latched onto it.
It doesn’t matter what it means. It matters what people think it means.
This is discussed tons of other places in these comments, and every previous story about Autopilot and FSD here on HN.
It also ignores how Tesla promoted "Autopilot". Until very recently, tesla.com/autopilot just showed a video saying the driver was only there for legal reasons. Maybe technically they meant FSD (for which it's also a lie, and which has a misleading name as well), but they were definitely mixing the terms up themselves (and I think the video predated FSD).
The model s has terrible phone docks. Don't get me started on cupholders, I'll bet people have drink mishaps all the time that affect driving.
I'm actually kind of serious about this - keeping people's stuff secure and organized is important in a moving car.
I'm surprised the touchscreen controls and removal of stalks aren't coming under more safety scrutiny.
With the new cars without a PRND stalk, how can you quickly reverse the car if you nose out too far and someone is coming from the side? will the car reverse or go forward into danger?
What Tesla touchscreen controls are crucial for driving? FWIW climate is quick little swipe, arguably easier than turning dial.
I'd like to hear that argument.
The driver at that time was Tesla Autopilot. So yeah, the driver's fault, as the jury said.
1/3 Tesla's fault, 2/3 the operator's
That sounds indistinguishable from “The driver at the time was the motor”
Motors don't make decisions. Tesla says Autopilot does.
And why was his mobile phone in his hand to drop, if he was driving? Most states have laws against mobile device usage while driving, and it was never a responsible thing to do even before the laws were enacted.
Perhaps he thought it was safe. After all, he had autopilot.
Sure--dangerous and wrong. Despite that, Autopilot was driving at the time.
Why would you pull over when you paid top dollars for Autopilot?
Maybe he would have pulled over if the car's capabilities hadn't been oversold. Two entities did stupid things: 1) the person, by not waiting to pull over because of Elon Musk's false claims, and 2) Tesla, via Elon Musk making those false claims.
It passes a classic “but for…” test in causality.
The case number appears to be 1:21-cv-21940-BB (S.D. Fla.).
I practice in that court regularly. Beth Bloom has a reputation as a good trial judge, so I'm somewhat skeptical of Tesla's claims that the trial was rife with errors. That said, both the Southern District and the Eleventh Circuit are known to frequently and readily lop off sizable chunks of punitive damages awards.
The real battle is just beginning: post-trial motions (including for remittitur) will be due in about a month. Then an appeal will likely follow.
The article has been updated (correction) to say Tesla is liable for a portion of the damages, not all of it.
33% liable
https://www.cnbc.com/2025/08/01/tesla-must-pay-329-million-i...
33% of the compensatory damages and 100% of the punitive damages, according to the article. So a total of about $242 million.
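The back-of-envelope arithmetic, as a sketch (I'm assuming roughly $129M compensatory and $200M punitive based on figures mentioned elsewhere in this thread; the exact amounts may differ):

```python
# Assumed split based on figures in this thread; exact amounts may differ.
compensatory = 129_000_000   # jury's total compensatory award (assumed)
punitive = 200_000_000       # punitive award, assigned 100% to Tesla (assumed)

tesla_fault_share = 0.33     # Tesla's share of the compensatory damages

tesla_owes = tesla_fault_share * compensatory + punitive
print(f"${tesla_owes / 1e6:.1f}M")  # roughly $242.6M
```

That lines up with the ~$42.5M compensatory figure in the correction plus the punitive award.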
Wellbeing doesn't have a price tag. There's no amount of money someone could pay you to make up for your daughter dying early or your son becoming disabled.
However, $329M sounds like an imaginary amount of money in a liability claim. If this guy crashed into a parked car without Tesla being involved, the family would be unfathomably lucky to even get 1% of that amount.
Perhaps the driver would have acted differently if Tesla hadnt sold him a product called "autopilot."
But if it was called 'lane assist' or 'adaptive cruise' he might not have bought the car in the first place.
I agree. If Tesla was not allowed to give their product a misleading name, it probably would get fewer sales. That's the rationale behind false advertising laws.
Indeed. So they called it autopilot and then this accident happened and that's the reality we're actually living in now.
Typically you need better evidence than "perhaps" to walk out of court with millions.
Yes that's why the court heard testimonies from linguistics experts etc.
I'm not an attorney, but I think typically juries operate on the "reasonable person" standard.
The supreme court has ruled on punitive damage awards going back to the 1990s and limited the excesses. You can't for example get $4 million in punitive damages on $4000 in actual damages. The general rule from the looks of the article is 5x actual damages.
https://corporate.findlaw.com/litigation-disputes/punitive-d...
Most of that $329 million is punitive, not compensatory.
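A quick sanity check of that ratio rule against this case, assuming the ~$129M compensatory / $200M punitive split mentioned elsewhere in the thread (the Supreme Court's guidance in State Farm v. Campbell is that few awards exceeding a single-digit ratio will satisfy due process):

```python
# Figures assumed from this thread; exact amounts may differ.
compensatory = 129_000_000
punitive = 200_000_000

ratio = punitive / compensatory
print(f"{ratio:.2f}x")  # about 1.55x, well under the ~5x rule of thumb above
```

So even the full punitive award here sits comfortably inside the range courts have historically upheld.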
I don't understand this at all. Mercedes recently announced that they'd be liable for accidents caused by their Level 4 system. Does that mean that, by default, all other self-driving features leave the driver liable?
Mercedes Benz system in California and Nevada is level 3 autonomous.
https://www.kbb.com/car-advice/level-3-autonomy-what-car-buy...
https://cars.usnews.com/cars-trucks/features/mercedes-level-...
> Mercedes Chief Technology Officer Markus Schäfer recently told Automotive News that Level 4 autonomy is "doable" by 2030.
Level 3 and level 2 are VERY different.
That’s apples and oranges. And Mercedes saying something certainly doesn’t change the law.
Mercedes built a sophisticated system designed to be able to handle what it might run into within the very limited domain it was designed for. Then they got it certified and approved by the government. Among other things, it only works at low speed. Not 60 MPH. And I think only on highways, not where there are intersections.
Tesla‘s system is not child’s play, sure. But they unleashed it everywhere. Without any certification. And relatively few limits, which I think they later had to tighten.
Mercedes was minimizing risk. Tesla was not.
I haven't looked at it but that may be regulatory or insurance liability vs civil suits which in America like to invent fantastical numbers.
If Tesla is partially responsible for this fatal accident, who or what else was ruled as contributing to the crash?
Well, I dont think anyone else contributed a system called "autopilot."
And yet Tesla was only found 33% accountable, so the presence of "autopilot" was only 1/3 of the cause.
The article is clear. The driver bore the rest of the responsibility.
I'd guess the driver since he was using his phone, then dropped it and was scrambling to pick it up when the accident happened.
"Vehicle will not brake when accelerator pedal is applied" is displayed on the screen in Autopilot mode. The system warns you of this every time you have your foot on the pedal in Autopilot mode. I wonder if it did that back in 2019 as well, or if accidents like this spurred that as a UI change.
Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?
Yes I'm pretty sure that my 2015 Model S has always warned me that the car will not brake if I command it not to with my foot on the accelerator. It certainly did in 2019.
> Also, how the heck is Mr. McGee supposed to come up with the other 67% of this judgment?
He probably can't. Assuming the amount stands on appeal, and assuming that he doesn't have an insurance policy with policy limits high enough to cover it, he'll pay as much as he can out of personal assets and probably have his wages garnished.
He might also be able to declare bankruptcy to get out of some or all of it.
An interesting question is what the plaintiffs can do if they just cannot get anywhere near the amount owed from him.
A handful of states have "joint and several liability" for most torts. If this had been in one of those states the way damages work when there are multiple defendants is:
• Each defendant is fully responsible for all damages
• The plaintiff can not collect more than the total damage award
In such a state the plaintiffs could simply ask Tesla for the entire $329 million and Tesla would have to pay. Tesla would then have a claim for $220 million against McGee.
Of course McGee probably can't come up with that, so McGee still ends up in bankruptcy, but instead of plaintiffs being shortchanged by $220 million it would be Tesla getting shortchanged.
The idea behind joint and several liability is that in situations like this, where someone has to be screwed, it shouldn't be the innocent plaintiff.
Many other states have "modified joint and several liability". In those states rather than each defendant being fully responsible, only defendants whose share of the fault exceeds some threshold (often 50%) are fully responsible.
For example, suppose there are three defendants whose fault shares are 60%, 30%, and 10%. In a pure joint and several liability state, the plaintiff could collect from whichever are most able to come up with the money. If that's the 10% one, they could demand 100% from them and leave it to that defendant to try to get reimbursed by the others.
In a modified joint and several liability state with a 50% threshold the most the plaintiff can ask from the 30% and 10% defendants is 30% and 10% respectively. They could ask the 60% defendant for 100% instead. If the 60% defendant is the deep pockets defendant that works out fine for the plaintiff, but if it is the 10% one and the other two are poor then the plaintiff is the one that gets screwed.
Finally, there are some states that have gotten rid of joint and several liability, including Florida in 2006. In those states the plaintiff can only collect from each defendant based on their share of fault. Tesla pays its 33%, McGee pays whatever tiny amount he can, and the plaintiff is screwed.
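The three regimes above can be sketched like this, using the 60/30/10 hypothetical and this case's 33/67 split (a sketch only: it ignores the separate punitive allocation, and the function name is just for illustration):

```python
def collectible(award, fault_shares, regime, threshold=0.5):
    """Max amount a plaintiff may demand from each defendant, before the
    rule that total collections can't exceed the award."""
    if regime == "joint_and_several":
        # Each defendant is on the hook for the whole award.
        return {d: award for d in fault_shares}
    if regime == "modified":
        # Only defendants at or above the threshold are fully liable;
        # the rest owe only their share.
        return {d: (award if share >= threshold else share * award)
                for d, share in fault_shares.items()}
    if regime == "several_only":
        # e.g. Florida since 2006: each defendant pays only their own share.
        return {d: share * award for d, share in fault_shares.items()}
    raise ValueError(f"unknown regime: {regime}")

# The 60/30/10 hypothetical under modified joint and several liability:
print(collectible(100, {"A": 0.6, "B": 0.3, "C": 0.1}, "modified"))
# This case's 33/67 split in a several-only state like Florida:
print(collectible(329_000_000, {"Tesla": 0.33, "McGee": 0.67}, "several_only"))
```

In the modified example, only A (at 60%) can be made to pay the full award; B and C cap out at their own shares.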
If I were running a car company, I might decide to stop development on assistive features and remove any that exist from future models. Given the way people are using them, and the way courts are punishing companies for adding these safety features, is it worth it to them? Not having them would also bring down the base price of vehicles, which seems to be the primary complaint from people right now.
It's not assistive features in general that are the problem. It's that level 3 self-drive (which is what Tesla purports to have) should not be permitted to exist.
Things which simply help the driver, fine. Lane assist, adaptive cruise control and other such things that remove some of the workload but still leave the driver with the full responsibility of control. That's level 2.
Level 3 can generally run the car but is not capable of dealing with everything it might encounter. That's where you get danger: this guy thought he could pick up his dropped phone, and his car didn't even manage to stay in its lane. And the jury quite correctly said that's totally unacceptable.
The next step above level 2 should be skipping to level 4--a car that is capable of making a safe choice in all circumstances. That's Waymo, not Tesla. A Waymo will refuse to leave the area with the sufficiently detailed mapping it needs. A Waymo might encounter something on the road that it can't figure out--if that happens it will stop and wait for human help. Waymos can ask their control center for guidance (the remote operators do not actually drive, they tell the car what to do about what it doesn't understand) and can turn control over to a local first responder.
Tesla is level 2 autonomous.
https://www.kbb.com/car-advice/level-3-autonomy-what-car-buy...
The only level 3 autonomous vehicles available for purchase in the U.S. are certain Mercedes-Benz models, and it only works on select roads in California and Nevada.
> Mercedes-Benz Drive Pilot: California and Nevada are currently the only states with roads approved for the Mercedes-Benz Drive Pilot. For 2025, only the EQS and S-Class offer Level 3 and then it’s by subscription.
And that’s the problem.
Autopilot is/was fancy cruise-control. They don’t treat it like that or talk about it like that.
You probably should have read the article because this lawsuit is specifically about the product that you just said is "fine."
Well, it becomes a lot less fine if you call it "full self driving", because it very much is not that.
Tesla hid evidence and was probably punished by the jury and I hardly feel sorry for Tesla. Maybe next time they will not do that, but I doubt it. Transparency was never a thing baked into the Tesla Musk DNA.
https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...
Or just not do false advertising.
Important: headline was just updated,
> CORRECTION: Tesla must pay portion of $329 million in damages after fatal Autopilot crash, jury says
This was added,
> The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages.
Whatever one believes about the state of FSD / Autopilot today (it still doesn't work well enough to be safe and probably won't for the next 5 years), I just don't see how one can argue that almost 10 years ago, when this thing would get into a fatal accident every 10 miles while Tesla was arguing it was safer than a human and would drive from LA to NY autonomously within the year, it was not total bullshit and deceptive marketing.
"The person in the driver's seat is only there for legal reasons" - that was 2016... Funnily enough, in 2025 they are rolling exactly the same idea, as a Robotaxi, in California. Amazing.
I bet most of the prosecution time was spent "explaining" the meaning of autopilot to the jury. Easy money.
Airplanes have autopilots too but pilots are still sitting there on the controls and have to be paying attention. A brand or technical name doesn't remove responsibility.
This wasn't a plane though, and the driver wasn't a pilot
Car travel is such a weird thing. The other day I was walking on the driver side of a wide street where cars were passing me by very fast. Most of us are usually an arm's length away from passing cars when we do that, roughly an arm's length away from being hit at 20+ mph, yet humans do this careful beam walking without flinching. It's the American version of Indians riding the roof of a train. If you go to a third world country, they are even more acclimated to insanely dangerous traffic conditions (no traffic laws being followed, cars flying out from every angle). The whole thing is insanity, yet everyone is just going along with it.
A life built around car commutes is the outcome of a poorly thought out society. It had such stupid repercussions like racial segregation and the absurdity that everyone gets their own house along with the vertical airspace on top of it (exceedingly selfish). Along with that, it’s dangerous as fuck.
Someone needs to suggest dropping the speed limit across the board in America. Unfortunately the MAGA people will straight boycott roads if we do that (unfortunately?). Most people I talk to in Texas will concede the highways are full of maniacs.
Interestingly, removing this illusion of a safe "beam" or division is part of safer road design for mixed traffic areas like city streets.
When you remove physical divides between "road" and a "footpath" drivers become more aware that they are dangerous and slow down on their own. Making roads narrower has a similar effect, and removing car parking helps remove that divide and removes the blindspots they create.
There are many roads that should either stop pretending to be safe for pedestrians and turn into highways, or lower the speed limit and redesign the layout to suit the mixed traffic nature of the areas. Doesn't have to be extreme just be honest about the real use of the roads. If it's actually mostly pedestrians, suck it up, make it safe, go drive on the other faster roads.
One of the things that is particularly strange about traveling to the United States is that you pretty much have no choice but to rent a car, or at least have access to a car in some way, unless you're going to be spending all your time in a place like New York City. But in nearly every other major western destination, certainly throughout much of Europe, it's never even a thought that you would need to rent a car. I know plenty of grown adults here in Europe who don't even have a driver's license! It really makes life a lot more convenient when the world does not revolve around the automobile.
Merely dropping speed limits doesn’t work. Speed limits not enforced by road design such as road/lane width (real or by parked cars) or other features (bulb outs, neck downs) are speed suggestions.
(tangent)
Train travel is such a weird thing. The service, staff, stock, pricing, timetable etc is set by civil servants and politicians 250 miles away.
The ticketing system is Byzantine and if you make a mistake you get a fine or a criminal record, unless you look like a thug, in which case the staff rarely bother.
Tickets are miraculously priced similar to driving in many instances despite a train looking nothing like a car. In various cases, more expensive than driving
You can be late because a politician 200 miles away annoyed the staff so they decide not to turn up.
It also turns out services on Sunday ran on good will (overtime), so the annoyed staff who get ok pay rises now realised they don't want the overtime so good luck travelling on a Sunday in quite a few areas.
Because of low enforcement of infringements by thugs, they are drawn to the service. You'd think your very high ticket price would include some kind of security? No. Fuck you.
When the trains break down you are expected to be OK with being trapped on a train for hours with no open windows or working toilets because of some "greater good" and insufficient emergency response
I used to be extremely pro public transport
Such is life in dense cities where workplaces are concentrated. And the pace of life pushes humans to take the shortest path that's safe enough. At least it's much safer than in the era of horses. And sanitary conditions are much better.
Density is fundamentally incompatible with car-friendly cities and a large amount of road. In general, the denser a city becomes, the more people will go by foot, cycle, or use public transport, especially in Europe.
Roads take up an enormous amount of space, as does parking. We could capture significant real estate opportunities back from roads in cities that have multi-lane roads going through the CBD (mine has like a dozen criss crossing it). It wouldn't be an overnight thing, but close off a block of road, rezone it and sell it off, you'll get a bunch of new high density buildings that are walkable (since any roads going through will have to be narrow)
And then you get business parks on the outskirts of the city. Because otherwise both employees and visitors complain about parking. Which makes cheaper suburbs more compelling. And some people living downtown driving to workplaces in outskirts of the city.
At least that's what is happening in my whereabouts. I remember when I was a kid, downtown was the prime location for regular shopping and doing daily business. Now all of that has moved to big-box stores along major traffic arteries. Nowadays downtown is just a tourist trap with super-expensive boutiques. I still go there to take a walk once a year or so, but besides that there's no reason to go there anymore for the vast majority of people. But suburban life is flourishing, with the shops and services needed for everyday life, and cheap and good eateries. Even casual entertainment is moving away from the city core.
All in all, not sure if it’s good or bad. I couldn’t afford living in downtown even in old era. And now I can live comfy life in suburbs. Not helping much to lower total amount of traffic though.
Oh it is. You get traffic jams and that’s not exactly car-friendly. But still people drive. On top of that, walking on a sidewalk when a bus pass close by is much more interesting than a car passing. And then there’s all the commercial traffic that is necessary to serve a dense city.
And many cities, especially here in Europe, are dense enough to need some mode of transit, especially with workplaces concentrated either in office parks or industrial districts by the city. But they're not multi-million-person megalopolises that warrant a sweet metro system. And that's where cars shine.
Although after living in one of the largest cities on earth with a very nice train system… there’s still plenty of traffic. And people still drive. Especially those who can’t afford living in city core and end up in cheaper more remote locations.
If traffic in a city was mostly busses and trucks you'd only need two lanes for them and the walking experience could get a lot nicer.
And yet it works surprisingly well.
When I lived in London, people waiting for the tube were less than an arm's length away when it rushed in. Blew my mind. Eventually I got used to it and did the same.
The difference is that the tube driver (assuming there even is one) can't just decide to take a slight turn from the planned route, while the car driver really might.
Yet of all the traffic accidents, the one you describe never happens. So there's not much difference.
It happens all of the time. Cars veering off the road and maiming pedestrians is a fairly common occurrence.
Sorry to cite German sources, but google translate should help you out
August 2024: 8-year-old "caught by a car on the pavement" https://www.swr.de/swraktuell/baden-wuerttemberg/ulm/kind-be...
October 2024: A mother and two children killed by an SUV that "veered off the road" https://www.merkur.de/deutschland/baden-wuerttemberg/erfasst...
May 2025: 2 children injured, on their way home from school https://www.focus.de/panorama/welt/20-meter-von-schule-entfe...
July 2025: 2 pedestrians injured https://www.sueddeutsche.de/muenchen/starnberg/starnberg-aut...
It does happen, it happened to me — a driver was distracted (because they were texting while driving) and they veered from the traffic flow and I was hit at 40 mph. The EMS worker who responded to the scene told me two things: 1. They almost never pull people alive from this kind of accident, 2. This kind of head-on, no brakes, distracted driver accident was happening more and more. This was 10 years ago.
Sorry if I misunderstand but you don't think cars come off the road and impact pedestrians? Happens for a myriad of reasons, but it does happen.
The tube follows fixed tracks, which is quite different from cars on a road.
This is why you can run trams in pedestrian only areas and have very few impacts. Trained professional driving it, predictable speed and direction.
Yeah but look at how normalized it is. All you need to do is slightly misstep, out of the thousands of steps you take, to bump your head on a passing train. You think it’s not possible (part of the trance) but all the parameters are set up so that the slot machine eventually hits the jackpot. You’re 12 inches away from a projectile, and you are to repeat the loop daily for a lifetime. The odds are the odds.
"Tesla designed Autopilot only for controlled-access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans" https://www.bbc.com/news/articles/c93dqpkwx4xo
This is a clear case of misleading marketing. Spreading deceptive information and then hiding contradictory details in the fine print simply won't fly.
Intent is central to legal interpretation. There's been a clear intent to mislead about Tesla Autopilot's capabilities for years, and it's finally starting to backfire. It doesn't matter if Tesla is, on average, better if individual cases are lost due to constant misleading marketing.
Tesla tried to hide evidence … the jury probably did not like to be lied to by Tesla.
F** around and find out
> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.
> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.
https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...
"Schreiber acknowledged that the driver, George McGee, was negligent when he blew through flashing lights, a stop sign and a T-intersection at 62 miles an hour before slamming into a Chevrolet Tahoe that the couple had parked to get a look at the stars."
Probably the reason why the driver was assigned 66% of the blame.. Did you have a point?
Does that change the shitty behavior of Tesla?
Interesting reading https://www.tesladeaths.com/, which is linked in the article at the end.
Corporations operate in jurisdictions where awards like this are capped at hundreds of thousands regardless of what the jury says. Tesla will never pay this.
> “Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” Brett Schreiber, counsel for the plaintiffs, said in an e-mailed statement on Friday.
Tesla had every opportunity to do the safe thing and limit Autopilot to only highways, but they want the hype of a product that "works anywhere" and don't want to be seen as offering a "limited" product like their competitors. Then they are surprised their users misuse it and blame it all on the driver. Tesla wants to have their cake and eat it too.
You know what else prevents misuse? Implementing safeguards for your safety critical product. It took multiple NHTSA probes for Tesla to implement obvious driver monitoring improvements to Autopilot/FSD. There's a reason Tesla is always in hot water: they simply lack the safety culture to do the right thing proactively.
A lot of discussion in this thread about the technical meaning of “autopilot” and its capabilities vs FSD.
This is really missing the point. Tesla could have called it “unicorn mode” and the result would still be the same.
The true issue at hand is that Elon Musk has been bandying about telling people that their cars are going to drive themselves completely for over a decade now, and overstating Tesla's capabilities in this area. Based on the totality of the messaging, many lay consumers believe Teslas have been able to safely drive themselves unsupervised for a long time.
From a culpability standpoint, you can’t put all this hype out and then claim it doesn’t matter because technically the fine print says otherwise.
> it doesn’t matter because technically the fine print says otherwise.
Every time you engage the system it tells you to pay attention. It also has sensors that detect when you don't and force you to. If you have more than N violations in a trip, the system is unavailable for the remainder of your trip.
I don’t know how much clearer it could be.
I would argue that the system is actually so good (but imperfect) that people overestimate how good it is, and let their guard down.
If a system were more error prone, people would not trust it so much.
> I don’t know how much clearer it could be.
Maybe not give it a misleading name that implies full self-driving capabilities. Also not have the CEO publicly make grandiose claims about its performance for 8 years.
> If a system were more error prone, people would not trust it so much.
Unfortunately not. YouTube is full of videos of FSD trying to crash into oncoming traffic, parked cars, etc., but then at the end of the video the driver goes “well that was pretty impressive” and just ignores all the suicide attempts.
The car in question didn't have FSD as far as I can tell.
Look at what Elon and even Tesla's official account are publishing daily on Twitter: tweets suggesting your car can drive you to the ER when you have a health emergency, or that you can make an espresso while your car drives itself.
Can you point to any other automaker that even comes close to this level of false advertising?
Which is kind of the entire problem. It’s Tesla bringing this grey-zone product to market that the Jury/Judge found to be problematic (and thus ruled against Tesla).
Tesla has always been blasé about safety.
Good lord, $129 million in compensatory damages is absurd.
One thing you notice when visiting Florida is that every other billboard says "In a wreck? Get a check! Call Lawyer Ambulance Chaser Jones!!"
How much is your life worth?
To me? An infinite amount! So trying to measure it in dollar terms is already a pointless exercise.
If you want to see the hole in this argument, try looking at it from the other side: how can you arrive at the conclusion that a life is worth only $329M? Why so low? Why not a billion, or a trillion?
If you have to come up with an objective and dispassionate standard, then I doubt you'd find one where the number it generates is $329 million.
> So trying to measure it in dollar terms is already a pointless exercise.
That strikes me as an argument for a judgement of zero dollars, which is the exact opposite of the reality that life is worth infinitely more than money.
What is a reasonable forecast of your personal balance sheet at the time of your natural death?
Of course human life itself doesn't have a dollar value, but that also means that money cannot replace a lost life.
> money cannot replace a lost life
So, money is basically worthless compared to life. That's why there are huge judgements when someone is found at fault for the death of another.
I think Tesla's apparent carelessness is a factor in this judgement as well.
Probably a few million, tops.
[flagged]
The driver literally stopped paying attention to find his phone that he dropped. Is that really a fanboy thing or just total irresponsibility on the driver’s part?
It's true but it's not enough to excuse tesla.
[flagged]
Only 0.03% of market cap... Seems pretty cheap actually.
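A quick sanity check on that percentage (assuming a roughly $1 trillion market cap for Tesla around the time of the verdict, which is an assumption, not a figure from the article):

```python
# Verdict as a fraction of market cap.
verdict = 329_000_000
market_cap = 1_000_000_000_000  # assumed ~$1T market cap; not from the article

fraction = verdict / market_cap
print(f"{fraction:.4%}")  # prints 0.0329%
```

Which rounds to the 0.03% quoted above.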
HN commenter: "On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly."
What this sentence is describing are compensatory damages, i.e., compensation for loss of life
That number was 129 not 329, and Tesla was only found liable for 33% of it
CNBC: "Tesla's payout is based on $129 million in compensatory damages, and $200 million in punitive damages against the company."
CNBC: "The jury determined Tesla should be held 33% responsible for the fatal crash. That means the automaker would be responsible for about $42.5 million in compensatory damages."
42.5 is not even close to the 329 number that the HN commenter claims "feels too high"
HN commenter: "This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road"
This sentence is describing something more akin to punitive damages, i.e., a fine
That number was 200 not 329
It seems the HN commenter makes no distinction between compensatory and punitive damages
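The arithmetic behind the distinction, using the CNBC figures quoted above (the 33% split applies only to the compensatory portion; punitive damages were assessed against Tesla in full):

```python
# Breakdown of the verdict numbers cited from CNBC.
compensatory = 129_000_000   # total compensatory damages
punitive = 200_000_000       # punitive damages against Tesla
tesla_fault_share = 0.33     # jury assigned Tesla 33% of the blame

tesla_compensatory = compensatory * tesla_fault_share  # ~$42.6M (CNBC rounds to $42.5M)
tesla_total = tesla_compensatory + punitive

print(f"Tesla's compensatory share: ${tesla_compensatory / 1e6:.1f}M")
print(f"Tesla's total payout: ${tesla_total / 1e6:.1f}M")
```

So the $329M headline number is the sum of total compensatory plus punitive damages, not what Tesla would actually pay.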
1. Why are you "replying" to someone but not using the reply feature?
2. The number that matters here is total compensatory damages, not Tesla's share.