
Tesla Using Full Self-Driving Beta Runs A Stop Sign In The Middle Of A Debate About FSD Beta


We generally don’t cover every single video that shows Tesla’s Full Self-Driving (FSD) Beta software making some sort of error, because it would get repetitive and boring pretty fast, and it’s not really representative of a system that is, largely, technically impressive. But sometimes it’s still worth showing a case where the system didn’t perform as it should, because it’s worth reiterating that with Level 2 semi-automated systems like FSD Beta, the driver must always be ready to take over. It’s hard. It’s really hard. And when there’s a pretty dramatic failure that happens in the context of a live-streamed pro-and-con test/debate about how well Tesla FSD Beta actually works, well, then that seems worth covering.

The test was organized by Dan O’Dowd, a prominent critic of Tesla’s semi-automated driving systems and founder of The Dawn Project, a public safety advocacy group with a website that does look a bit like what a cult might design, and Tesla investor and haircut enthusiast Ross Gerber. O’Dowd was on the critical side of Tesla FSD Beta, and Gerber was the defender.


Here’s the video of the livestream between O’Dowd and Gerber, cued up to the incident that’s getting all the attention, so you won’t have to slog through this whole thing.

That livestream is a bit choppy, so here’s a recorded feed from one of the cameras that shows the event more clearly:


As you can see, the Tesla, with the pro-FSD Beta Gerber at the yoke, blew right through a stop sign and then continued, at about 35 mph, into the intersection, rapidly approaching perpendicular traffic, only to finally stop when Gerber hit the brakes, pretty hard, disengaging FSD Beta and bringing the car to an abrupt halt before they T-boned that white SUV, which gave them a justifiably annoyed honk.

If we look at this scene carefully, it reveals a very significant and potentially very dangerous failure on the part of FSD Beta. I’ve grabbed a few screenshots of the moments where the Tesla blows through the stop sign, so we can see what’s happening. First, you can see that within a very short distance of the stop sign, the car is going 36 mph, and you can see the blue path, which indicates that FSD is active. You can also see that some part of the system appears to have noted the stop sign, since it shows up on the in-dash visualization. And finally, you can see the path colored gray, showing when FSD was disengaged by Gerber hitting the brakes:

[Screenshots: the car at 36 mph approaching the stop sign with the blue FSD path active, the stop sign on the in-dash visualization, and the path turning gray at disengagement]

A few moments after FSD disengages because the brakes were applied, we see the speed finally get down to 0 mph, just missing the white Audi SUV:

[Screenshot: the car finally at 0 mph, just short of the white Audi SUV]


If Gerber had not intervened – which, of course he should have, as this is a Level 2 system that requires constant driver attention, despite everything that the “Full Self Driving” name suggests – then the Tesla would have crashed into that other car, still doing about 35 mph. It would have been bad.

After the near-miss, Gerber somewhat defensively notes that of course he took over, he’s not going to just let the car crash into things, which, of course, is great and the right way to deal with a Level 2 system. But if we’re really honest here, that is counter to the way that FSD tends to be discussed by its proponents, who crow about the drives they’ve taken with zero disengagements, because that’s the narrative that fits the often-repeated idea that Tesla will be mass-producing robotaxis very, very soon. This claim has been made since 2014.

Of course, Tesla True Believers are not to be so easily cowed by such bullshit as “what actually happened” and have been defending the Tesla AI’s choices, suggesting that perhaps Gerber was too hasty to disengage. For example, one Tesla fan noted that the actual line where you need to stop is a bit past the stop sign itself:

Oh yeah, look at that. That line has to be fives and fives of feet away from the stop sign! I’m sure that’s plenty of distance for a car traveling at 35 mph to come to a complete stop. If it’s not, then maybe the concept of momentum is shorting Tesla stock? Did you ever think about that?
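If you want to check that back-of-the-napkin math yourself, here’s a quick sketch. The 0.8 g braking figure is just my assumption for a hard stop on dry pavement, not anything Tesla publishes:

```python
# Rough stopping-distance estimate from 35 mph: d = v^2 / (2a), ignoring reaction time.
# The 0.8 g deceleration is an assumed hard-braking figure, not a Tesla spec.
MPS_PER_MPH = 0.44704   # meters per second, per mph
FT_PER_M = 3.28084

speed_mph = 35
decel = 0.8 * 9.81                       # assumed deceleration, m/s^2

v = speed_mph * MPS_PER_MPH              # ~15.6 m/s
stopping_distance_ft = (v ** 2) / (2 * decel) * FT_PER_M

print(f"{stopping_distance_ft:.0f} ft")  # ~51 ft, with zero reaction time
```

That’s roughly 51 feet of pure braking, and that’s assuming somebody is already standing on the pedal. A stop line sitting a car length past the sign does not fix this.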


I mean, they’re right: that picture does speak a thousand words. Those words are “FSD blew it” repeated 333.33 times.

Some people are flat-out accusing Gerber, the pro-Tesla FSD Beta participant here, of just accelerating through the stop sign and into the intersection before stopping, for motives that I’m not really clear on:

Oh, that’s just sad. In fact, Ross Gerber himself seems to be engaging in some interesting post-event interpretation of what happened:


So, here he says he was in control 10 ft before the stop sign, and “FSD was disengaged before encountering any of the vehicles.” Okay. I mean, that’s because being ten feet away from a stop sign at 35 mph means you’re blowing through that stop sign. Ten feet is nothing – that’s about 0.2 seconds at 35 mph. That doesn’t get FSD off the hook here; it just means the driver correctly determined that FSD was screwing up significantly, so they needed to take control.
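For anyone who wants to check that, the arithmetic is simple enough to sketch out. The 1.5-second reaction time below is just a commonly cited ballpark I’m assuming, not something measured from this drive:

```python
# How much warning is 10 feet at 35 mph, and how far do you travel while reacting?
# The 1.5 s reaction time is an assumed ballpark figure, not a measured one.
FT_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_fps = 35 * FT_PER_MILE / SECONDS_PER_HOUR   # ~51.3 ft/s

time_for_10_ft = 10 / speed_fps                   # ~0.19 s
reaction_distance_ft = 1.5 * speed_fps            # ~77 ft before braking even begins

print(f"{time_for_10_ft:.2f} s, {reaction_distance_ft:.0f} ft")
```

Ten feet of “control” before the sign is about a fifth of a second; at that speed a typical driver covers something like 77 feet before the brakes even start doing anything.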

There are also people suggesting that the completely un-obscured stop sign is somehow “hidden”:

If this is an “edge case,” then almost everything is an edge case. The whole “edge case” thing is ridiculous, anyway. The world isn’t standardized. Every particular physical location is unique; are they all “edge cases”? And it hardly matters when it only takes one edge case, whatever we decide that actually means, to be the point where your day is absolutely ruined, because your dumb car ran a stop sign right into a busy intersection.

Also, there was a “stop ahead” sign before the actual stop sign – are those just ignored?


Sure, it didn’t make mistakes like this nonstop; there were plenty of times it handled things quite well. There were also times when Gerber needed to take over, and later in the livestream, with a different driver, FSD Beta ignored the pop-out stop sign on the side of a school bus, which is a serious offense in most states, and also blew by a road closure sign.

FSD Beta, like all Level 2 systems, is kind of doomed. Even if it becomes extremely capable at driving with nearly no interventions at all, it’s still a Level 2 system that requires the constant vigilance of the driver, and this is precisely the sort of task humans are absolute garbage at. Studies have shown that drivers put far too much faith in Level 2 semi-automated systems, which is what leads to wrecks when things inevitably go wrong. You just can’t have a system that does most of the work of driving and expect people to always be ready to take over. Look at this particular situation: if the driver wasn’t attentive and able to react immediately, that drive would have ended in a crash.

FSD Beta has no elegant means of failover; if the system realizes it can’t continue and the driver doesn’t respond, the best it can do is stop in an active traffic lane, which is a terrible idea. If the system doesn’t even realize it’s doing something wrong, as in the case shown here, too bad. This isn’t even just a Tesla issue; Level 2 on all platforms has the same fundamental problems.

Will this incident, happening as it did right smack dab in the middle of a publicized live-streamed test, change anyone’s mind about the capabilities of FSD Beta?

I wouldn’t counsel any breath-holding.


Related:

That Anti-Tesla Full Self-Driving Super Bowl Ad Wasn’t Fair To Tesla, But Not For The Reasons You Think

The Reported Criminal Probe Against Tesla’s Self-Driving Claims Is Really A Critique Of Level 2 Semi-Autonomy Entirely

Newly Released Video Of Thanksgiving Day Tesla Full Self-Driving Crash Demonstrates The Fundamental Problem Of Semi-Automated Driving Systems

69 Comments
Sherifftruman
9 months ago

So, let’s say the sign was “hidden” (and I will agree the hedge on the right probably does make it less visible than a lot of stop signs). But there are many places with similar situations out in the real world, and it is not like the car had to slam on the brakes due to seeing it late; it just never did.

Second, even if there wasn’t even a stop sign there, two cars were in the intersection. Was it just going to say, oh well, I don’t have a stop sign so let’s plow into those cars? Is this the same issue with running into fire trucks where the car just cannot see things dead ahead?

Ricardo Mercio
9 months ago

There are no edge cases for safety equipment. If you were told by your friends at Durex that their products don’t work if the user takes more than half an hour because that’s an “edge case”, you’d be rightly pissed. But when a car blasts through an intersection and nearly kills people, suddenly it’s ok, because it USUALLY doesn’t do that.

Brunsworks
9 months ago
Reply to  Ricardo Mercio

“Daddy, Mommy, how was I born?”

”Well, you were what we call an edge case.”

Alan Christensen
9 months ago

So, in the future, when fully autonomous driving is a thing, who gets the ticket when the robotic driver blows a stop sign/traffic light? Who is liable for damages, injuries and deaths?

Brunsworks
9 months ago

Depends on who’s being marginalized most at the time: Gen-Z-ers and their avocado toast, drag performers, or people of color.

MikeInTheWoods
9 months ago

I don’t like that they are potentially endangering the general public while doing these tests. Beta testing should not involve unwilling participants who just happen to be in the wrong spot when some dumb robot car goes blasting through. If a kid stepped out at that intersection they would have also been dead well before that Tesla stopped.

Alex Estill
9 months ago

For the life of me I can’t understand why the autonomous vehicle research is so focused on navigating surface streets! My vote is to focus exclusively on limited access highways. Work towards reliable self-driving in that less complex use case, make it reliable, standardize it, normalize it, learn from the experience and then maybe, MAYBE move on to surface street conditions.

My ideal scenario in say ten years is reliable and perhaps mandatory autonomous driving on highways, and restricted only to highways.

Uncle D
9 months ago
Reply to  Alex Estill

Everyone is focusing on urban driving because the speeds are much lower. While highway driving is simpler in many regards, operating at higher speeds requires sensors with greater range and resolution. Reaction time increases with sensor range and this is why so many manufacturers are integrating LiDAR into their future sensor suites. Only Musk was foolish enough to say he didn’t need LiDAR and could do it with cameras. All part of the FSD lie…

Jerry Johnson
9 months ago

I commute to DC every day in a Scud Missile-esque Dodge Neon and I’ve had way too many close calls with Teslas where it’s horribly apparent the drivers are using the Level 2 cruise incorrectly. My personal favorite being the guy reading the newspaper while his Tesla started to make a right on red onto a 45 mph zone while I was in the intersection. I honked and he yanked his car onto the shoulder in time to avoid the accident. Could have been avoided if he had been driving…or not reading a newspaper.

Gordon Mitchell
9 months ago

How and why are they allowed to beta test software on public roads with people who have no interest or desire to have their lives risked as a part of the testing? Also why are people paying ($10k-$15k?) to risk their own lives to be a part of the beta testing? Companies usually pay their users to be a part of testing. Everything about how FSD is being rolled out feels so ass backwards to me and I don’t understand how Tesla is not drowning in lawsuits about it.

Chris with bad opinions
9 months ago

“…how Tesla is not drowning in lawsuits about it.”

Never underestimate the ability of rich white men to get away with virtually anything. Elon will never face a real consequence for anything he does in this lifetime.

Brent Bevis
9 months ago

They aren’t facing lawsuits because of a little something called terms and conditions. You have to agree to take control in these situations and any crash is your responsibility. The case where that software engineer died because the car swerved into a barrier was determined to be his fault because he was not ready to take control.

Methane generator
9 months ago
Reply to  Brent Bevis

Can you point me to the Ts&Cs document bearing my signature please? I haven’t received anything from Tesla or any other manufacturer about testing my Miata with their beta shit

Gordon Mitchell
9 months ago

Yea the T&Cs are only agreed to by the morons who paid to be testers. What happens when they hit a random person on the street? Or in the case of the video above…what if the Tesla nailed that white Audi while in FSD? I imagine the T&Cs put all liability on the driver but the bigger legal issue to me is beta testing of this should not be allowed on public roads with people who do not want to be a part of it

Do You Have a Moment To Talk About Renaults?
9 months ago

How people don’t seem to understand this is beyond me. If you beta test on public streets, you make EVERYONE a beta tester.

Do You Have a Moment To Talk About Renaults?
9 months ago

I’ve been saying this since the early days of Autopilot. It’s not just them legally selling a beta-stage product that many users play with as if the world was their beta-testing sandbox – and everyone else mere NPCs. I would argue that both Autopilot and FSD break false advertising laws just with their names, but the fact that the CEO repeatedly makes exaggerated/outright false claims about the product should also be taken into account, especially considering so many of his followers take his word as law.

Duke of Kent
9 months ago

My general attitude toward technological advances is usually curiosity rather than fearfulness. I may not be the first kid on the block to get each new gizmo, but I do enjoy reading about such things and imagining how they might come into practice once widely adopted.

Individual driver assistance aids all seem promising. A system that will automatically apply the brakes if it detects that you’re about to crash into the car in front of you? A system that’ll give you a little alert if you start to drift out of your lane? An advanced cruise control system that will match the speed of the car ahead of you? A system that’ll slide your car into a parallel parking spot without the driver having to touch the steering wheel? All of those sound interesting, though I’ll admit that I have not had an opportunity to personally try any of them.

But the leap to a car that completely drives itself just seems like a bridge too far. If the driver must constantly be alert and ready to take over the system, then what’s the point? And how do you know exactly when to take over before it’s too late? If such a system came with my car, I probably wouldn’t use it. I certainly wouldn’t pay extra for it.

I have some theories about the passionate support that these systems (and other aspects of our culture) enjoy, but in addition to their cult-like following, marketing seems to be part of the problem. Something marketed as “Full Self Driving” does not imply “You still have to pay close attention to what’s happening and be prepared to take control in a split second.”

I don’t know how to fix it. You could push for more testing at closed facilities rather than public streets, but it’s comparatively easy to develop a self-driving system under controlled conditions. The reality is that the real world is made up of millions of “edge cases” that are the true test of such systems.

But backing it up a step further, what are the success criteria? Human beings are pretty terrible drivers. A quick Google search tells me that there are over 2 million traffic accidents in the US annually resulting in 42,000 fatalities. If, adjusted for the proportion of cars with “self-driving” capabilities compared to those without, the self-driving cars get into fewer crashes and kill fewer people through their mistakes, that still sounds like a win when looking at the numbers. On the other hand, it’s very distasteful to be satisfied that self-driving cars killed “only” X number of people in a year.

In order to have a meaningful conversation on the state of self-driving technology, we need to agree on what success looks like.

rctothefuture
9 months ago
Reply to  Duke of Kent

Well said about the understanding of success from a system like this. I’d argue that success can be seen as “X number of deaths per year” because when we see that there are 42,000 fatalities every year, we get numb to that fact. We chock it up to human error and move on with our lives, because humans are accepting of the reality of human error.

Machine error is not something the average person can accept; they demand that technology work 100%, otherwise we are unsatisfied. Look at the airline industry, the safest way to travel as they say, and all the technology involved with a plane. I’ve heard for years that Airbus and Boeing could make a plane that taxis, takes off, and lands without a pilot on board. Would you trust it? Would you trust it if it maintained the average number of civil airline deaths? What if it brought those down by half? Or would you demand 0 deaths with an automated system?

While most people would argue that 0 should be the standard, I’d feel just as safe in a plane with or without a pilot. I’ve been quite comfortable putting my Tesla rentals into Autopilot, and I’ve trusted them on some difficult setups and roads where I thought it would throw it for a loop. Does FSD still need work? Absolutely. But if you told me driving in FSD reduced my chance of an accident by half, with verifiable data, I’d drive in it more often than not.

Brent Bevis
9 months ago
Reply to  rctothefuture

Of course there was a specific plane that had a safety control system that caused the plane to crash. The problem with these systems is they do not adapt in an emergency. I think it would be very difficult to get proper statistics with the small number of cars on the road that have stage 2 systems, and the fact that they are usually well-maintained $40k or more cars would also skew any statistics.

The F--kshambolic Cretinoid Harvey Park
9 months ago
Reply to  rctothefuture

If we can’t have input systems that correct “chock it up” to “chalk it up” then self-driving is many many decades out.

Methane generator
9 months ago
Reply to  Duke of Kent

Salient points, but per your question ‘An advanced cruise control system that will match the speed of the car ahead of you?’ I have ridden in a vehicle with active cruise control and it’s a rear-ender waiting to happen. The (Chrysler) system doesn’t just let off the throttle when someone ahead merges into lane, it dabs the brake quite aggressively. If someone is following just a little too close and happens to be accelerating (so, probably a Tesla) that’s a several thousand dollar software error. But hey I guess I’m in two (tiny little) minds about this: if it takes that Tesla off the road for a few months while they figure out how to fix the bumper that makes roads safer so … yeah don’t listen to me

John Patson
9 months ago

It’s summer, hedges in leaf and trees and things block radio waves. And the stop traffic is at an angle, not full side on like in the lab. Good catch (just) by the driver. Heavy Tesla and heavy Audi would not have been a good match.

Phantom Pedal Syndrome
9 months ago

Autonomous vehicle manufacturers are so busy playing checkers with our lives for profit that they will never be able to take a step back and recognize that traffic and the infrastructure of any given city is a game of chess.

Rafael
9 months ago

Everyone is lauding ChatGPT as a game changer, and it mostly is. However, imagine if every time it hallucinated a single phrase in the middle of a ten paragraph response, it killed someone – that’s Level 2 in a nutshell.

Methane generator
9 months ago
Reply to  Rafael

What game does it change, please? I haven’t seen any single valid use case.

Most recently I read about that thing just completely making up a court case and when instructed to provide validation of the case, confirmed (incorrectly) that the case was indeed real. The lawyers who used it were censured.

Space
9 months ago

It makes plagiarism really convenient for lazy students.

Mercedes Streeter
9 months ago

Amazingly, the sanctions were only $5,000 for having ChatGPT write bad arguments, then support said arguments with fake ChatGPT-generated cases.

Far better attorneys have gotten worse sanctions for lesser screw-ups.

Phantom Pedal Syndrome
9 months ago

Watch out for that “edge case” right around the “corner case”.
That’s not an “edge case” it’s a dog, or a child etc.
Oopsie.
Now it’s a “court case”.

Hoonicus
9 months ago

Note to all AV, FSD, assisted driving proponents; The public roadways are not your playground! You’re not ready! Back to DARPA with ya! If you graduate from there, you then need your own figure 8 track for usable lifetime battery expectancy.

That’s last week’s post about the confused Cruise in a police barricade zone. How these systems are allowed on our roadways is beyond comprehension. The only beta system test allowable should have 0 vehicle control, and just record when it would have intervened. This data would take years to accumulate and review, with standards developed from it. I still want to see them all fight it out on a figure eight track!

Erik Hancock
9 months ago
Reply to  Hoonicus

Completely agree. However, if it truly was a beta test as you described, how could they charge $15k to install it? Oh yeah, that’s right, they would have to PAY drivers to beta test their software…

The F--kshambolic Cretinoid Harvey Park
9 months ago
Reply to  Erik Hancock

They did pay drivers, and one of them mowed down a cyclist in AZ.

Methane generator
9 months ago
Reply to  Hoonicus

I agree on all points. However, Tesla collects data from every Tesla vehicle, and did so for many years before they even started to crow about it. So you’re absolutely right. The piece they missed was ‘what if it’s not good enough and we can’t release it?’ It wasn’t, they knew it wasn’t, but they did anyway. That’s how bad this is.

EmotionalSupportBMW
9 months ago

Dear website: How do we stop Tesla from calling it Full Self Driving? It seems like they’ve gotten away with this bullshit title for too long. Also anything that can kill you and everyone you love should not be allowed to have the cop out that is Beta on the end. I know it’s not fiscally wise to accept responsibility for your actions, but allowing Tesla to make the greater Bay Area full of computerized Christines just because they put “beta” on the end seems fairly unwise.

Methane generator
9 months ago

They called it ‘Autopilot’ before. They were told to stop. Ironically, it’s more like an aeroplane autopilot system than what the new name says.

The F--kshambolic Cretinoid Harvey Park
9 months ago

Isn’t autopilot a different product from FSD?

Silent But Deadly
9 months ago

These sorts of systems are supposed to make drivers better and driving safer. If they don’t do that, then that’s a perverse outcome for the system design and they are not fit for purpose. Simple as that.

Banning FSD systems is a waste of time so the only other option is they get insured out of use. Which inevitably means innocent people will be maimed and killed in the process. Yay… progress.

Mr Sarcastic
9 months ago

Lets be honest if Tesla made submersibles and Elon took a trip on one there would be 10 dead people and 2 dead CEOS. TESLA is the best made EV Because the rest are garbage. The Chinese junk sucks but dictatorship no real coverage. But Biden insist we finance and buy death mobiles and everyone doubts Toyota being able to look at see problems and put out a better EV.

Chris with bad opinions
9 months ago
Reply to  Mr Sarcastic

Hey bad take dave, can you edit this so it’s in English? I want to get the full effect of your ignorance.

Methane generator
9 months ago

I think he’s talking about the torPEDO

My Goat Ate My Homework
9 months ago
Reply to  Mr Sarcastic

I read this right after reading another comment about ChatGPT and AI writing stuff and am wondering if these two comments are related.

OnceInAMillenia
9 months ago

The idea that this was a hidden stop sign when it’s more visible than 50% of the overgrown signage I see in my normal drives just tells me that defense is bullshit. It’s a bright red octagon on a green background; it is visible AF

LuzifersLicht
9 months ago

Presumably it’s harder to see a red stop sign if you’re wearing industrial-strength rose tinted glasses.

Bob Boxbody
9 months ago

To be fair, it was hidden by the roof while they blew past it.

The F--kshambolic Cretinoid Harvey Park
9 months ago
Reply to  Bob Boxbody

The car has maps of intersections and signage. It knows there’s a stop sign. People stop at stop signs covered in stickers. There is zero excuse for this failure.

SlowCarFast
9 months ago

Not to mention the blatantly clear “stop sign up ahead” sign at the beginning of the clip.

Chris with bad opinions
9 months ago

It had a white border so it’s optional anyway.

Methane generator
9 months ago

Not to mention the vehicles waiting to cross the street. Did it not anticipate their potential future positions? Like, what if this was a European residential zone where incoming streets have priority over through traffic? But it wasn’t. There’s abundant signage and road markings, the alignment of the intersection is pretty clearly a four-way stop, and there were plenty of cues from the paths other vehicles were taking prior to the near collision.

The F--kshambolic Cretinoid Harvey Park
9 months ago

They have detailed maps of the roads and all the signs and signals on them! The car knows it needs to stop even if a rhino sprayed all over the sign.

AlterId
9 months ago

The ability to clearly see that FSD was engaged when this happened has totally changed my mind about the yoke. It should be a mandatory part of the FSD option, along with a camera aimed at the dash and streaming to cloud storage and a black box made out of whatever airplane black boxes are made out of, assuming it can withstand a battery fire. And since the beta testing is taking place on public roads, all information recorded should be made public whenever there’s an accident possibly attributable to FSD or Elon commits an act of douchery that rises above social misdemeanor status. Aren’t all the other L2 and aspiring L3 systems limited to mapped highways, anyway?

Citrus
9 months ago

These folks keep talking about edge cases as though driving isn’t entirely edge cases.

Erik Hancock
9 months ago
Reply to  Citrus

I would argue that the lab is the edge case.

JaredTheGeek
9 months ago

To the people saying he was hitting the accelerator: the car warns you that the Tesla cannot stop if you do that. There is a warning. This is clearly a big error FSD makes here.

Philip B
9 months ago

How this junk is still allowed on our roads is beyond me. How many more people need to die?

Phantom Pedal Syndrome
9 months ago
Reply to  Philip B

I’m starting to think DUI laws need to be re-written to include technology as an “influence”. Tesla has been given too much leeway even though they are clearly producing vehicles well beyond their “Ballmer Peak”.

SlowCarFast
9 months ago
Reply to  Philip B

Talk to Boeing. Evidently hundreds of people need to die, and those in charge will be slapped on the wrist while cashing their huge exit checks.

Ryan B
9 months ago

Gary Black, an investor who routinely pumps TSLA and always has FSD as a “stock catalyst,” sees no issues with FSD running a stop sign, and even blames it on another driver.

Shocking.

Otter
9 months ago

I continue to believe that software should not be able to operate a car on public roads UNTIL IT CAN PASS A FREAKING TEST LIKE WE ALL DID. All this talk about levels is BS.
Any real driver would have lifted at the Stop Ahead sign and had no trouble stopping correctly. I hate being reminded that I am unwillingly participating in Tesla’s dangerous experiment.

Rust Buckets
9 months ago
Reply to  Otter

Well, Tesla self-driving could definitely pass the driving test I did; most driving tests are a joke.

Methane generator
9 months ago
Reply to  Rust Buckets

Okay the UK driving test.

The F--kshambolic Cretinoid Harvey Park
9 months ago

A 100k mile driving test in mixed environments.

Goof
9 months ago

For example, one Tesla fan noted that the actual line where you need to stop is a bit past the stop sign itself.

Oh yeah, look at that. That line has to be fives and fives of feet away from the stop sign! I’m sure that’s plenty of distance for a car traveling at 35 mph to come to a complete stop.

<sarcasm level=”extreme”>
I mean, of course! Why couldn’t a car smoothly come to a stop in just a few feet? Moreover, I would’ve expected the driver to react faster. Like he should’ve been on the brake pedal in a full panic stop within five hundredths of a second.
</sarcasm>

In all seriousness, I wish I was kidding about hearing stuff like what I just put above. I’ve had NUMEROUS people over the years say in complete confidence that they can immediately respond to some external stimulus and complete an action in well under a tenth of a second. To hell with reaction time, how fast you can activate a muscle, etc. Nope. We are walking around with absolute superhumans with reaction times and speed that would make a superhero look pedestrian. Obviously!

Hell, I had one person miscalculate acceleration due to gravity so badly, he genuinely believed he could’ve jumped out of the 30th floor of the building we were on, and since his number was so incorrect, he then doubled down, trying to convince us it would be little different than if he ran into a wall at full sprint.

Canopysaurus
9 months ago
Reply to  Goof

To be fair, the difference between falling 30 stories or running full tilt boogie into a wall (let’s assume concrete and head first) in terms of likely outcomes would be relatively moot to him.

Goof
9 months ago
Reply to  Canopysaurus

If you’re talking in terms of, “he won’t get any dumber” I’d agree.

In terms of velocity and impact energy though? Yikes!

Assuming he could sprint at 18 mph (~29 kph) for a very short distance, and considering he was about 195 lbs (88.5 kg), the impact energy would be ~2,860 joules.

If he went out the 30th story window, where the height was ~120 meters (~395 feet), he’d hit the ground at 174.6 kph (~108.5 mph) and the impact energy would be ~104,000 joules. That’s not factoring in air resistance, but we’re talking 36 times the energy!

The F--kshambolic Cretinoid Harvey Park
9 months ago
Reply to  Canopysaurus

Makes you wonder if he didn’t already do those things as a child

Hoonicus
9 months ago
Reply to  Goof

Darwin award finalist.

Brian Ash
9 months ago

With what a PITA the NHTSA can be, it’s still beyond me how they have not tested and banned these systems.

Jack Beckman
9 months ago
Reply to  Brian Ash

Don’t worry, I’m sure they have their “top people” on it, and will issue a report with some suggestions in it by 2057. Because this is a top priority, that’s why it will be so timely. NHTSA? They are almost worthless.

Methane generator
9 months ago
Reply to  Jack Beckman

They aren’t worthless, they’re hamstrung by shitty deregulation fanatics. Follow the money. Cut the line of funding (by regulation) and hey presto suddenly the GOP/Tories etc dgaf

SlowCarFast
9 months ago
Reply to  Brian Ash

There are way more lobbyists than regulators, and the GOP continues to try to shrink the government, successfully reducing funding for many agencies.
