
I Just Can’t With This Tesla FSD User Panicking About Actually, You Know, Driving


Artificial Intelligence (AI) technology is a powerful tool, but like many powerful tools, it has the potential to let our natural abilities atrophy. It’s the same way that the invention of the jackhammer pretty much caused humans to lose the ability to pound through feet of concrete and asphalt with our bare fists. We’re already seeing the effects of this with the widespread use of ChatGPT seemingly causing cognitive decline and atrophying writing skills, and now I’m starting to think advanced driver aids, especially more comprehensive ones like Level 2 supervised semi-automated driving systems, are doing the same thing: making people worse drivers.

I haven’t done studies to prove this in any comprehensive way, so at this point I’m still just speculating, like a speculum. I’m not entirely certain a full study is even needed at this point, though, because there are already some people just flat-out admitting to it online, for everyone to see, free of shame and, perhaps, any degree of self-reflection.


Specifically, I’m referring to this tweet that has garnered over two million views so far:

Oh my. If, for some reason, you’re not able to read the tweet, here’s the full text of it:


“The other night I was driving in pouring rain, fully dark, and the car randomly lost GPS. No location. No navigation. Which also meant no FSD. I tried two software resets while driving just to get GPS back. Nothing worked. So there I was, manually driving in terrible conditions, unsure of positioning, no assistance, no guidance. And it genuinely felt unsafe. For me and for the people in the car. Then it hit me. This feeling – the stress, the uncertainty, the margin for error – this is how most drivers feel every single day. No FSD. No constant awareness. No backup. We’ve normalised danger so much that we only notice it when the safety net disappears.”

Wow. Drunk Batman himself couldn’t have beaten an admission like this out of me. There’s so much here, I’m not even really sure where to start. First, it’s night, and it’s “fully dark?” That’s kind of how night works, champ. And, sure, pouring rain is hardly ideal, but it’s very much part of life here on Earth. It’s perfectly normal to feel some stress when driving in the dark, in bad weather, but it’s not “how most drivers feel every single day.” Most drivers are used to driving, and they deal with poor conditions with awareness and caution, but, ideally, not the sort of panic suggested in this tweet.

Also, my quote didn’t replicate the weird spacing and short, staccato paragraphs that made this whole thing read like one of those weird LinkedIn posts where some fake thing someone’s kid said becomes a revelation of B2B best practices, or some shit.

It seems that the reason this guy felt the way he did when the driver aids were removed is that he’s, frankly, not used to actually driving. In fact, if you look at his profile on eX-Twitter, he notes that he’s a Tesla supervisor, which is pretty significantly different than calling yourself a Tesla driver:


This is an objectively terrible and deeply misguided way to view your relationship with your car for many reasons, not the least of which is the fact that even if you do consider yourself a “supervisor” – a deeply flawed premise to begin with – the very definition of Level 2 semi-autonomy is that the person “supervising” has to be ready to take over with zero warning, which means you need to be able to drive your damn car, no matter the situation it happens to be in.


If anything, you would think the takeaway here would have been, shit, I need to be a more competent driver and less of a candy-ass as opposed to coming away thinking, as stated in the tweet,

“We’ve normalised danger so much that we only notice it when the safety net disappears.”

This is so deeply and eye-rollingly misguided I almost don’t know where to start, except I absolutely do know where to start: the idea that the “safety net” is Tesla’s FSD software. Because that is exactly the opposite of how Level 2 systems are designed to work! You, the human, are the safety net! If you’ve already made the arguably lazy and questionable decision to farm out the majority of the driving task to a system that lacks redundant sensor backups and is still barely out of Beta status, then you better damn well be ready to take over when the system fails, because that’s how it’s designed to work.

To be fair, our Tesla Supervisor here did take over when his FSD went down due to loss of a GPS signal, but, based on what he said, he felt “unsafe” for himself and the passengers in the car. The lack of FSD isn’t the problem here; the problem is that the human driver didn’t feel safe operating their own motor vehicle.

Not only was he uncomfortable driving in the inclement weather and lack of light (again, that’s just nighttime, a recurring phenomenon), but the reason he had to debase himself so was because of a technical failure of FSD, which, it should be noted, can happen at any time, without warning. Hence the need to be able to drive a damn car, comfortably.

What does he mean when he says, referring to human driving, “no constant awareness?” Almost every driver I know is constantly aware that they are driving. That’s part of driving. Do people get distracted, look at phones, get lost in reveries, or whatever? Sure they do. That’s not ideal, but it doesn’t mean people aren’t aware.


Unsurprisingly, the poster of this admission has been getting a good bit of blowback in comments from people a little less likely to soil themselves when they have to drive in the rain. So, he provided a follow-up tweet:

I’m not really sure what this follow-up actually clarified, but he did describe the experience in a bit more detail:

“I knew the rough direction but not exactly. I never use my phone while driving, so I rely solely on the car nav. Unfortunately, it wasn’t working, and I had to pull over to double-check where I was going.”

That’s just…driving. This is how all driving was up until about 15 years ago or so. I have an abysmal sense of direction, so I feel like I spent most of my pre-GPS driving life lost at least a quarter of the time I was driving anywhere. But you figure it out. You take some wrong turns, you end up in places you didn’t originally plan to be in, you look at maps or signs or ask someone, and you eventually get there. It wasn’t perfect, but it was what you had, and when we could finally, say, print out MapQuest directions and clip them to the dash, oh man, that was a game changer.

I took plenty of long road trips in marginal cars with no phone and just signs and vague notions to guide me where I was going. If I had to do it today, sure, there would be some significant adapting to exhume my pre-GPS navigational skills – well, skills is too generous a word, so maybe we can just say ability – but I think it could be done. And every driver really should be able to do the same thing.


FSD (Supervised) is a tool, a crutch, and if you find yourself in a position where its absence is causing you fear instead of just a bit of annoyance, you’re no longer really qualified to drive a car. Teslas (and other mass-market cars with similar L2 driver assist systems) don’t have redundant sensors, most don’t have the means to clean camera lenses (or radar/lidar windows and domes), and none of them are rated for actually unsupervised driving. Which means that you, the person in the driver’s seat, need to actually live up to the name of that seat: you have to know how to drive a damn car.

This tweet should be taken as a warning, because while it’s fun to feel all smug because you can drive in the rain and ridicule this hapless fellow, I guarantee you he’s not alone. There are other people whose driving skills are atrophying because of reliance on systems like Tesla’s FSD, and this is a very bad path to go down. Our Tesla Supervisor here may actually have been unsafe when he had to take full control of the car and didn’t feel comfortable. And that’s not a technical problem, it’s a perception problem, and it’s not even the original poster’s fault entirely – there is a lot of encouragement from Tesla and the surrounding community to consider FSD to be far more capable than it actually is.


Driving is dangerous, and it’s good to feel that, sometimes! You should always be aware that when you’re driving, you’re in a metal-and-plastic, ton-and-a-half box hurtling down haphazardly maintained roads at a mile per minute. If that’s not a little scary to you, then you’re either a liar, a corpse, or one of those kids who started karting at four years old.

We all need to accept the reality of what driving is, and the inherent, wonderful madness behind it. I personally know myself well enough to realize how easily I can be lured into false senses of security by modern cars and start driving like a moron; to combat this, my preferred daily drivers are ridiculous, primitive machines incapable of hiding the fact that they’re just metal boxes with lots of sharp, poke-y bits that are whizzing along far too quickly. Which, in the case of my cars, can mean speeds of, oh, 45 mph.


The point is, everyone on the road should be able to capably drive, in pretty much any conditions, without the aid of some AI. Even when we have more advanced automated driving systems, this should still be the case, at least for vehicles capable of being driven by a human. But for right now, systems like FSD are not the safety net: the safety net is always us. We’re always responsible when we’re in the driver’s seat, and if we forget that, we could end up in far worse situations than just embarrassing ourselves online.

But that can happen too, of course.

132 Comments
Scott Ashley
12 minutes ago

That is the problem with the last couple of generations and the way they were raised: too many safeguards, too many training wheels, and now nobody can actually ride a bike.

Slow Joe Crow
18 minutes ago

I guess there needs to be mandatory manual driving for X hours per month if you use ADAS. I don’t like the idea of more regulations, but I don’t want to be wiped out by an idiot who has lost the ability to drive.

Bkp
22 minutes ago

Reminds me of an amusing bumper sticker I saw yesterday on a Polestar:

“ONE LESS TESLA”.

Shooting Brake
23 minutes ago

Thanks for bringing the rant we need, Torch

Fuzzyweis
31 minutes ago

Couple of thoughts: this could be a troll post, trolling for retweets and such, and posts on car blogs, so if so, they won.

Also this article smacks of old man yells at cloud. I mean FSD is a bit of a joke with the cameras that don’t work so good in the dark and rain, but there’s other things like Waymo that use all ‘ars’, lidar, sonar, radar, probably masers even, and the day is coming when a differently abled person that may not normally be able to go somewhere by themself in a car may be able to own and ‘drive’ a fully self driving car. And on that day, the guy who put the braille pads at all the drive throughs will be vindicated.

Jonathan Hendry
36 minutes ago

Is it normal to wash your car just before a huge rainstorm?

Bob Boxbody
37 minutes ago

I don’t mind driving in the rain, or at night, but the combination does suck. But it’s not worth panicking about. The idea of turning on FSD in those conditions seems a bit crazy though! In any conditions, really, but at night in the pouring rain?

In fact, it’s been pouring around here lately, and I’ve been working 7-5 in December, so it’s entirely dark for me while driving. But truly, the idea that some of the Teslas around me may be on FSD is way more frightening than the actual road conditions!

I think there should be some kind of external indicator when a car is on FSD. Some dome on the top of the car that lights up or something, to let the rest of us know to get the F away from that guy.

Angel "the Cobra" Martin
1 hour ago

This kid wouldn’t have lasted 10 minutes in the 80’s.

Cheap Bastard
1 hour ago

He’s calling himself a Tesla supervisor because he’s “supervising” a Tesla car? Seriously?!

Holy crap, that’s some next-level self-delusion of grandeur.

Lost on the Nürburgring
2 hours ago

I’m thinking this guy is somewhere on the spectrum between “Actual Tesla Supervisor True Believer Nutbag” and “Disingenuous Twitter Troll Posting Ragebait for Clicks”.

Interesting fact, no matter where he actually exists along that spectrum between the two poles, he is still a giant dildo.

121gwats
2 hours ago

Don’t take the rage bait, and that’s all this is… I really hope that’s all it is.

Cars? I've owned a few
2 hours ago

Ha! When I first read Tesla Supervisor, I thought he worked for the company. Which might be even worse.

I was driving back to Tacoma from Eugene, OR this morning in moderate rain, surrounded by the road spray being thrown up by the many 18-wheelers on I-5. A few miles south of Tacoma, the rain really picked up in intensity. People around here lament that “nobody knows how to drive in the rain,” but visibility was down to maybe 300 feet; everyone slowed down and there were no accidents that I saw. Of course, that’s not always the case, and Portland can be miserable to get through when there are accidents. Or an icing event.

Aminorking
2 hours ago

MAKE CARS DUMB AGAIN!

Greg
2 hours ago

Great article.

But the tweet has me in fear for the future. As a parent, I vow to never let my kid be this helpless and skill-less.

FormerTXJeepGuy
3 hours ago

I feel like I constantly see Teslas being driven poorly. I’ve always assumed it was FSD causing it, but the recent noise about ChatGPT causing people to lose the ability to think or write, plus this post, has me thinking maybe people who are bad drivers gravitate towards Tesla because of these systems.

I’ll continue to increase my alertness when a Tesla is nearby. Have we determined if Tesla has any liability when one hits me while FSD is operational?

Jb996
3 hours ago

My personal opinion.
ChatGPT, on average, performs about like a C-/D student at anything difficult.
So, if one is naturally below that, it’s a huge benefit, and using it may be better than actually thinking. Above that, and it’s useful as a tool; like having a minion.

FSD seems to be similar. A bad clueless driver is still a bad clueless driver.
