
I Just Can’t With This Tesla FSD User Panicking About Actually, You Know, Driving


Artificial Intelligence (AI) technology is a powerful tool, but like many powerful tools, it has the potential to allow humans to let our natural abilities atrophy. It’s the same way that the invention of the jackhammer pretty much caused humans to lose the ability to pound through feet of concrete and asphalt with our bare fists. We’re already seeing effects of this with the widespread use of ChatGPT seemingly causing cognitive decline and atrophying writing skills, and now I’m starting to think advanced driver’s aids, especially more comprehensive ones like Level 2 supervised semi-automated driving systems, are doing the same thing: making people worse drivers.

I haven’t done studies to prove this in any comprehensive way, so at this point I’m still just speculating, like a speculum. I’m not entirely certain a full study is even needed at this point, though, because there are already some people just flat-out admitting to it online, for everyone to see, free of shame and, perhaps, any degree of self-reflection.


Specifically, I’m referring to this tweet that has garnered over two million views so far:

Oh my. If, for some reason, you’re not able to read the tweet, here’s the full text of it:


“The other night I was driving in pouring rain, fully dark, and the car randomly lost GPS. No location. No navigation. Which also meant no FSD. I tried two software resets while driving just to get GPS back. Nothing worked. So there I was, manually driving in terrible conditions, unsure of positioning, no assistance, no guidance. And it genuinely felt unsafe. For me and for the people in the car. Then it hit me. This feeling – the stress, the uncertainty, the margin for error – this is how most drivers feel every single day. No FSD. No constant awareness. No backup. We’ve normalised danger so much that we only notice it when the safety net disappears.”

Wow. Drunk Batman himself couldn’t have beaten an admission like this out of me. There’s so much here, I’m not even really sure where to start. First, it’s night, and it’s “fully dark?” That’s kind of how night works, champ. And, sure, pouring rain is hardly ideal, but it’s very much part of life here on Earth. It’s perfectly normal to feel some stress when driving in the dark, in bad weather, but it’s not “how most drivers feel every single day.” Most drivers are used to driving, and they deal with poor conditions with awareness and caution, but, ideally, not the sort of panic suggested in this tweet.

Also, my quote didn’t replicate the weird spacing and short, staccato paragraphs that made this whole thing read like one of those weird LinkedIn posts where some fake thing someone’s kid said becomes a revelation of B2B best practices, or some shit.

It seems that the reason this guy felt the way he did when the driver aids were removed is that he’s, frankly, not used to actually driving. In fact, if you look at his profile on eX-Twitter, he notes that he’s a Tesla supervisor, which is pretty significantly different than calling yourself a Tesla driver:

Oli Profile

This is an objectively terrible and deeply misguided way to view your relationship with your car for many reasons, not the least of which is the fact that even if you do consider yourself a “supervisor” – a deeply flawed premise to begin with – the very definition of Level 2 semi-autonomy is that the person “supervising” has to be ready to take over with zero warning, which means you need to be able to drive your damn car, no matter the situation it happens to be in.


If anything, you would think the takeaway here would have been, shit, I need to be a more competent driver and less of a candy-ass, as opposed to coming away thinking, as stated in the tweet,

“We’ve normalised danger so much that we only notice it when the safety net disappears.”

This is so deeply and eye-rollingly misguided I almost don’t know where to start, except I absolutely do know where to start: the idea that the “safety net” is Tesla’s FSD software. Because that is exactly the opposite of how Level 2 systems are designed to work! You, the human, are the safety net! If you’ve already made the arguably lazy and questionable decision to farm out the majority of the driving task to a system that lacks redundant sensor backups and is still barely out of Beta status, then you better damn well be ready to take over when the system fails, because that’s how it’s designed to work.

To be fair, our Tesla Supervisor here did take over when his FSD went down due to loss of a GPS signal, but, based on what he said, he felt “unsafe” for himself and the passengers in the car. The lack of FSD isn’t the problem here; the problem is that the human driver didn’t feel safe operating their own motor vehicle.

Not only was he uncomfortable driving in the inclement weather and lack of light (again, that’s just nighttime, a recurring phenomenon), but the reason he had to debase himself so was a technical failure of FSD, which, it should be noted, can happen at any time, without warning. Hence the need to be able to drive a damn car, comfortably.

What does he mean when he says, referring to human driving, “no constant awareness?” Almost every driver I know is constantly aware that they are driving. That’s part of driving. Do people get distracted, look at phones, get lost in reveries, or whatever? Sure they do. That’s not ideal, but it doesn’t mean people aren’t aware.


Unsurprisingly, the poster of this admission has been getting a good bit of blowback in comments from people a little less likely to soil themselves when they have to drive in the rain. So, he provided a follow-up tweet:

I’m not really sure what this follow-up actually clarified, but he did describe the experience in a bit more detail:

“I knew the rough direction but not exactly. I never use my phone while driving, so I rely solely on the car nav. Unfortunately, it wasn’t working, and I had to pull over to double-check where I was going.”

That’s just…driving. This is how all driving was up until about 15 years ago or so. I have an abysmal sense of direction, so I feel like I spent most of my pre-GPS driving life lost at least a quarter of the time I was driving anywhere. But you figure it out. You take some wrong turns, you end up in places you didn’t originally plan to be in, you look at maps or signs or ask someone, and you eventually get there. It wasn’t perfect, but it was what you had, and when we could finally, say, print out MapQuest directions and clip them to the dash, oh man, that was a game changer.

I took plenty of long road trips in marginal cars with no phone and just signs and vague notions to guide me where I was going. If I had to do it today, sure, there would be some significant adapting to exhume my pre-GPS navigational skills – well, skills is too generous a word, so maybe we can just say ability – but I think it could be done. And every driver really should be able to do the same thing.


FSD (Supervised) is a tool, a crutch, and if you find yourself in a position where its absence is causing you fear instead of just a bit of annoyance, you’re no longer really qualified to drive a car. Teslas (and other mass-market cars with similar L2 driver assist systems) don’t have redundant sensors, most don’t have the means to clean camera lenses (or radar/lidar windows and domes), and none of them are rated for actually unsupervised driving. Which means that you, the person in the driver’s seat, need to actually live up to the name of that seat: you have to know how to drive a damn car.

This tweet should be taken as a warning, because while it’s fun to feel all smug because you can drive in the rain and ridicule this hapless fellow, I guarantee you he’s not alone. There are other people whose driving skills are atrophying because of reliance on systems like Tesla’s FSD, and this is a very bad path to go down. Our Tesla Supervisor here may actually have been unsafe when he had to take full control of the car and didn’t feel comfortable. And that’s not a technical problem, it’s a perception problem, and it’s not even the original poster’s fault entirely – there is a lot of encouragement from Tesla and the surrounding community to consider FSD to be far more capable than it actually is.


Driving is dangerous, and it’s good to feel that, sometimes! You should always be aware that when you’re driving, you’re in a metal-and-plastic, ton-and-a-half box hurtling down haphazardly maintained roads at a mile per minute. If that’s not a little scary to you, then you’re either a liar, a corpse, or one of those kids who started karting at four years old.

We all need to accept the reality of what driving is, and the inherent, wonderful madness behind it. I personally know myself well enough to realize how easily I can be lured into false senses of security by modern cars and start driving like a moron; to combat this, my preferred daily drivers are ridiculous, primitive machines incapable of hiding the fact that they’re just metal boxes with lots of sharp, poke-y bits that are whizzing along far too quickly. Which, in the case of my cars, can mean speeds of, oh, 45 mph.


The point is, everyone on the road should be able to capably drive, in pretty much any conditions, without the aid of some AI. Even when we have more advanced automated driving systems, this should still be the case, at least for vehicles capable of being driven by a human. But for right now, systems like FSD are not the safety net: the safety net is always us. We’re always responsible when we’re in the driver’s seat, and if we forget that, we could end up in far worse situations than just embarrassing ourselves online.

But that can happen too, of course.

242 Comments
Dest
1 month ago

These people need to get out of their cars and into public transit.

Scott
1 month ago

“Speculating like a speculum” …how has that not occurred to me before!? Probably for the same reason that I can’t remember that there’s only one ‘r’ in ‘occur’ despite being conscious for most of the past 60 years.

Given how innumeracy has flourished since the widespread use of the pocket calculator, and how 90% of the population has lost the ability to spell since the 1980s when the use of spellcheckers became common, I bet Jason’s right about this.

If road conditions (vs. the ability and self-awareness level of the driver) warrant, you PULL OVER and wait for them to improve. If you ‘feel unsafe’ for any reason, why would you persist? A car full of computers doesn’t absolve the driver of their responsibilities towards others, nor is it a substitute for common sense.

I hereby and unilaterally, without recourse or the possibility of appeal, permanently revoke Oli’s status as a ‘car enthusiast.’ 

What utter bollocks. 🙁

Andrew Bugenis
1 month ago

It’s perfectly fine to realize when driving conditions are outside your safety zone. I *judge* a bit sometimes when someone who’s 30 says they prefer not to drive at night, but if you aren’t comfortable driving at night in the rain in unfamiliar areas, I won’t judge you that much for staying home.

Taking your car out on the road in conditions you’re uncomfortable driving in, now that’s stupidity. If you can’t be behind the wheel for those conditions, have someone else drive or call an Uber.

Scott
1 month ago
Reply to  Andrew Bugenis

30!? 😮

Steve Walton
1 month ago

I can’t even begin to express how different these people are. I cannot believe they are capable of maintaining even a semblance of civilization. You know, on a basic level there is a point where you have to comprehend what food is in order to put it into your mouth. And then there is the whole “chewing” thing. What even darker horrors lie in waiting for them?

H.G. Wells’ Eloi are barbarian warriors compared to these generations that are allowing themselves to be made irrelevant. God help them if they ever encounter any real problems to deal with.

Mike McDonald
1 month ago
Reply to  Steve Walton

What is this “chewing” thing to which you alluded? Asking for a friend…
/s

Ben
1 month ago

I personally know myself well enough to realize how easily I can be lured into false senses of security by modern cars and start driving like a moron

I wish everyone had this level of self-awareness, but alas.

This is the thing that bothers me about L2 systems more than anything. I, a reasonably self-aware and conscientious driver, get lulled into a false sense of security by them. Even something as simple as lane-keeping. “Oh, it’s alright if I dig around in the passenger seat looking for something. Lane-keeping will make sure I don’t drift off the road.” Which of course is nonsense, but is also something that absolutely went through my mind when I first got a vehicle with lane-keeping.

Mike McDonald
1 month ago
Reply to  Ben

I like to think I am self-aware enough to use my L2 systems to elevate situational awareness instead of losing it. When I have lane keep assist and radar cruise control on, I maintain a watchful eye on what the car is doing and am always supplementing it with input, but I am also able to expand my road awareness, since I can glance more often at what is happening up ahead instead of pasting my eyes to the bumper in front of me in order to prevent a collision. Etc. So, that’s like one person that perhaps uses these systems to improve safe driving. Anyone else?

Von Baldy
1 month ago
Reply to  Mike McDonald

That’s kind of what I’d do with the wife’s car that has pseudo-L2 systems on it.

Basically let it dawdle along in the right lane at the speed limit, making sure it’s behaving, while keeping my eyes further ahead on the road for crap I normally wouldn’t be able to look at for longer than an eye blink.

Dudeoutwest
1 month ago

Robo driver man needs to give Zen and the Art of Motorcycle Maintenance a read and figure out whether he’s in charge of his tech or it’s in charge of him.

Feels like the latter and he’s mad when he has to, you know, assume responsibility for the safety of himself and his passengers.

Stef Schrader
1 month ago

Bless his heart. My sincere advice for him is to cut up his license, throw it into a volcano just to be sure, and to embrace public transit. You’ll never have to take over a train! Or endanger anyone else in your car or on your road with your irresponsible overreliance on driving aids, either!

RecoveringGTV6MaratonaOwner
1 month ago

Torch, thank you for dispatching another humorously written blistering critique. At this rate, AI is going to turn us all into nothing but a bunch of unemployed, incapable, timid, and useless dullards – especially younger and future generations.

Stay weirdly gold and Happy Hanukkah to you and your family!

J Money
1 month ago

The more these dorks take over, the fewer cool cars will exist. These dweebs see them as laptops or dishwashers — just an appliance to do a task.

GirchyGirchy
1 month ago
Reply to  J Money

Eh. For a daily commuter an appliance is fine.

Steve Walton
1 month ago
Reply to  GirchyGirchy

The reason your commute is dull is that you don’t choose to make it lively. Commuting is likely by far the most time you spend in a car. Are you sure you want to spend your life in a toaster?

Mike McDonald
1 month ago
Reply to  Steve Walton

Baltimore / Washington / Northern Virginia roads have entered the chat…

GirchyGirchy
1 month ago
Reply to  Steve Walton

I’m not going to double my commute just to make it fun (it would more likely end up being a PITA because I’d be stuck behind slowpokes). I value my time and would rather be at home with my wife and cats that extra hour.

So yeah, I’d much rather spend my time in my little Mazda3 while getting 40 mpg than whatever you true enthusiasts deem to be worthy of my commute. When I go to replace it, it’ll be with something more comfortable, quiet, and with heated seats and steering wheel because I place more value in those things in a daily driver.

Comfort >> fun on the interstate.

Elvis Dogman
1 month ago

To be fair, I’m pretty sure I would feel uncomfortable with him driving me around, too.

G. R.
1 month ago

This baby adult should be given a strider. Not fit for the road, or cars, or adulthood.

Mike McDonald
1 month ago
Reply to  G. R.

Not to be morbid, but the gene pool has a way of cleaning itself…

G. R.
1 month ago
Reply to  Mike McDonald

My only concern is when the cleanse comes with other innocent people’s lives at stake. If it was guaranteed it was only that one stupid driver, I wouldn’t care.

Jerry Johnson
1 month ago

I hate that we have people riding around in Teslas who haven’t had any practice in months or years, and one small event makes them the biggest babies on the road.

Unironically, the worst cars to interact with on the road went from BMWs to Teslas over the years. All of my close calls on my commute have been with Teslas doing stupid things.

G. R.
1 month ago
Reply to  Jerry Johnson

You’re missing the Plague of Priuses crawling down the road. Bad Prius drivers became bad Tesla drivers.

Jerry Johnson
1 month ago
Reply to  G. R.

Prius drivers were never an issue for me. It was always super aggressive BMW drivers with no turn signals, occasionally an aggressive shitty Ram driver; Priuses were just slow drivers in the right lane. Teslas on Autopilot are just kinda like driving behind a dipshit teenager.

Monolithic Juggernaut
1 month ago
Reply to  G. R.

Statisticians have missed it too.

RecoveringGTV6MaratonaOwner
1 month ago
Reply to  Jerry Johnson

Nissan Altima has just entered the conversation.

Stef Schrader
1 month ago
Reply to  Jerry Johnson

The curb rash on some of these goofs’ cars alone can be spotted across the parking lot.

Monolithic Juggernaut
1 month ago
Reply to  Jerry Johnson

https://www.roadandtrack.com/news/a62919131/tesla-has-highest-fatal-accident-rate-of-all-auto-brands-study/

On my route I have seen a few bad Teslas but the standouts have been American brand pickups and BMWs.

Max Headbolts
1 month ago

While drinking my morning coffee I was doing my Reddit perusal, and someone on r/civisi was complaining that they can’t engage “autopilot”* at 30 MPH, that it’s stupid, and that they should be able to do what they want because driving is tiring.

Complaining that you can’t let your manual car drive itself at low speeds is just, I don’t know, get off my lawn!

*Their words, not mine. They meant the smart cruise, which isn’t even intended for that.
