
Tesla’s FSD ‘Mad Max Mode’ Will Ignore Speed Limits, Sense


I think one of the things about getting cars to drive themselves that seems to be consistently overlooked by many of the major players in the space is just how much of the driving task has nothing to do with the technical side of things. Sure, the technical side is absolutely crucial, and the problem of how to get a car to interpret its surroundings via cameras, lidar, radar, whatever, and then use those inputs to build a working model of the world and act on it at a mile a minute or more is absolutely impressive. But that’s only part of the equation when it comes to driving, especially driving among humans. There are all sorts of cultural, environmental, and other complex and often fuzzy rules about driving among human activity to consider, and then there are the complicated concepts of intent, all the laws, and the strange unwritten rules – driving is far more complex than its base mechanics.

All of this is to say that Tesla’s re-released Full Self Driving (FSD) and its new mode known as Mad Max is something interesting to consider, mostly because it is software that is designed to willfully break the law. I’m not sure what other products are out there in the world that are like this; sure, you can buy a knife at Dollar Tree and commit any number of crimes with it, but the knife doesn’t come from the factory with the ability to commit crimes on its own, except perhaps for loitering.


Oh, and as an aside, Tesla’s icon for the Mad Max mode feels a little misguided. Let me show you:

[Image: Tesla’s Mad Max mode icon. Screenshots: Tesla, Warner Bros., PolyGram]

That little guy with the mustache and cowboy hat doesn’t really look like what we think Mad Max (lower left) looks like; it looks much more like an icon of Sam Elliott (lower right). Unless they mean some other Mad Max? Maybe one that won’t get them in trouble with Warner Bros?

Anyway, back to the Mad Max speed profile itself. Among Tesla fans, the new mode seems to make them quite excited:


Tesla’s Mad Max Speed Profile for their latest release of FSD V14.1.2 is actually programmed with the ability to break laws, specifically speeding laws. Here’s how Tesla’s release notes describe the new profile (emphasis mine):

FSD (Supervised) will now determine the appropriate speed based on a mix of driver profile, speed limit, and surrounding traffic.

– Introduced new Speed Profile SLOTH, which comes with lower speeds & more conservative lane selection than CHILL.

– Introduced new speed profile MAD MAX, which comes with higher speeds and more frequent lane changes than HURRY.

– Driver profile now has a stronger impact on behavior. The more assertive the profile, the higher the max speed.

The description doesn’t specifically say it’ll break any laws, but in practice, it definitely does. Here’s a video from well-known Tesla influencer-whatever Sawyer Merritt where you can clearly see the Tesla hitting speeds up to 82 mph:


Those 82 miles per hour occurred in a 55 mph zone, as the Tesla itself knows and is happy to display on its screen:

[Screenshot: the Tesla’s display showing 82 mph in a 55 mph zone]

Why am I bringing this up? It’s not because I’m clutching any pearls about speeding (I don’t even have pearls, at least not real ones, I only have testicles to clutch as needed), because it’s no secret that we all do it, and there’s often a “folk law” speed limit on many roads, where people just sort of come to an unspoken agreement about what an acceptable speed is. But that doesn’t mean it’s not breaking the law, because of course it is. And when you or I do such things, we have made the decision to do so, and we are flawed humans, prone to making all manner of bad decisions, or even just capable of being unaware – which, of course, ignorantia legis neminem excusat: ignorance of the law excuses no one.

But this Tesla running FSD (supervised) is a different matter. FSD is, of course, a Level 2 system, which means it really isn’t fully autonomous, because it requires a person to be monitoring its actions nonstop. So, with that in mind, you could say the responsibility for speeding remains on the driver, who should be supervising the entire process and preventing the car from speeding, at least technically.

But maybe the driver is more of an accomplice, because the car knows what the speed limit is at any given moment; it’s even displayed onscreen, as you can see above. The software engineers at Tesla have this information and could have put in the equivalent of a command like (if they were writing the software in BASIC) IF CURRENTSPEED>SPEEDLIMIT THEN LET CURRENTSPEED=SPEEDLIMIT, but they made a deliberate decision not to do that. Essentially, they have programmed a machine that knows what the law is, and yet chooses to break it.
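To make that one-line BASIC gag a little more concrete, here’s a minimal sketch of what such a cap could look like. All of the names here are illustrative assumptions, not Tesla’s actual code – the release notes don’t document any internals:

```python
# Hypothetical sketch of a speed cap an automaker could apply.
# Function and variable names are invented for illustration only.

def clamp_target_speed(requested_mph: float, speed_limit_mph: float) -> float:
    """Never command a speed above the posted limit the car already knows."""
    return min(requested_mph, speed_limit_mph)

# With the clamp in place, an 82 mph request in a 55 mph zone gets capped:
print(clamp_target_speed(82.0, 55.0))  # 55.0
```

The point is that the clamp is trivial to write when the system already displays the current limit onscreen; leaving it out is a product decision, not a technical limitation.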


It’s not even really that minor an infraction, what we see in the example here; Merritt’s Tesla is driving at 27 mph over the posted speed limit, and if we look at, say, the Marin County, California Traffic Infraction Penalty Schedule, we can see how big a deal this speed-crime is:

[Image: Marin County Traffic Infraction Penalty Schedule. Click to embiggen]

I’m sure that’s impossibly tiny if you’re reading on a phone, so I’ll just tell you what it says: for going more than 26 mph over a 55 mph speed limit, it’s a total fine of $486 and a point on your license. That’s a big speeding ticket!

So, does this make sense? Even if we all speed on highways at times, that’s very different than a company making and selling a product that is able to or at least willing to assist in violating a known law. Again, this is hardly high treason, but it is a crime, and it’s a crime perpetrated by a machine, with the full blessing and support of the parent company. There are tricky ethical issues embedded in this that we really should be thinking about before we start mass deploying 5,000-pound machines with a healthy disregard for the law into the public.

Does the fact that this software willingly breaks traffic laws affect Tesla’s potential liability if anything should go wrong? I’m no lawyer, but it’s hard to see how it wouldn’t come up if a car running FSD in the Mad Max speed profile were to get into a wreck. Why are decision makers at Tesla willing to put themselves at risk this way? It seems like such a huge can of worms that you would think Tesla would rather not open, right?

If someone gets pulled over for speeding while using this system, can they reasonably claim it wasn’t their fault but was the fault of the car? It’s significantly different than a car on cruise control, because those systems require the driver to actively set the maximum speed; in this case, FSD decides what speed to drive. But the person did choose a speed profile that could speed, so are they responsible? Or does it come back around to the fact that speeding is possible at all in that profile?


This is complicated! Cars have always had the ability to break laws if driven in ways that didn’t respect the law, but this is beyond mere ability. This is intent. The fact that the software knows the speed limit and exceeds it anyway almost makes it premeditated.

I’m not saying this is a terrible thing that must be stopped, or anything like that. I am saying that this software is doing something we’ve never really encountered before as a society, and it’s worth some real discussion to decide how we want to incorporate this new thing into our driving culture.

It’s up to us; just because these cars can break speed limits doesn’t mean we have to just accept that. We humans are still in charge here, at least for now.

 

71 Comments
Mike F. · 19 minutes ago

Seems fully in line with Elon’s “I may be smart but I never matured past the age of 21” aesthetic.

Slow Joe Crow · 1 hour ago

It’s totally on brand for Tesla to proudly announce they have written software to deliberately break traffic laws. Full Self Crashing, and full employment for liability and personal injury lawyers.

Collegiate Autodidact · 1 hour ago

“That little guy with the mustache and cowboy hat doesn’t really look like what we think Mad Max (lower left) looks like”
That sort of thing is pretty on brand for the fash billionaire apartheid boy who was mocked for wearing his cowboy hat backwards (though stans claimed he actually wore it the correct way) when he did that cosplay cowboy stunt touring the U.S./Mexico border in Texas last year.
After all, he did claim to be a fan of the film Blade Runner but bragged about the upcoming Cybertruck as being something “Bladerunner” would have driven. Shades of “Tim Apple”…

B3n · 4 hours ago

As a motorcyclist, I just really hope these things can actually see me.
As far as I know, they haven’t reintroduced the lidar, it is still vision-only.
Human drivers are dangerous enough, but there’s going to be more and more cars driving around with this tech and it’ll be impossible to just avoid them in traffic eventually.

Hugh Crawford · 4 hours ago

Aw, color me disappointed.

I was expecting to be able to stand on the roof and play flame shooting electric guitar.

This is just driving like my 85 year old mom.

Lotsofchops · 5 hours ago

I just can’t fathom being a Tesla stan in 2025. Mind blowing, really.
Also I find the mismatched icon unsurprising. The CEO thinks Bladerunner was the name of the protagonist.

Hugh Crawford · 4 hours ago · Reply to Lotsofchops

Well, everyone thinks his name is “smelly glandular secretion”, so that’s entirely understandable.

1978fiatspyderfan · 6 hours ago

Tesla’s FSD is closer to cruise control. You set it at 55 it goes 55, you set it at 85 it goes 85. You turn on mad Max you control it, you own it. Funny thing I was reading that Tesla’s have a hard time being insured due to a few factors one is an abundance of claims. Due no doubt to the ignorant owners not understanding anything. I doubt this feature will do anything but invalidate warranty claims, at fault claims, and claims of stupidity

Horizontally Opposed · 3 hours ago

Ooo, that’s a nice policy product idea: Stupid Insurance. I think this can make more money than AI.

VaiMais · 6 hours ago

I still dont get it. I love driving and I love my truck. I dont valet and I drive into service bays or onto lifts when necessary, nobody touches or drives my truck. Oh, I’m not allowed to? OK no worries see you later. All these driver assist ding dongs are for appliance driving neophytes. Oh, but on the highway… on the highway YOU pay attention ffs

Cerberus · 5 hours ago · Reply to VaiMais

This kind of stuff is for guys who prefer fleshlights to the women they can’t get and brag about how quickly it gets the job done.

GhosnInABox · 6 hours ago

Try it in rural Georgia, d**kheads. I dare you!

5VZ-F'Ever and Ever, Amen · 7 hours ago

An update that allows all Teslas to drive like human Bay Area Tesla drivers. Enjoy!

Shooting Brake · 7 hours ago

Now we need the government to step in to prevent Teslas from driving themselves like Altimas….yikes. Glad I moved away from big Tesla infested cities and highways…

Horizontally Opposed · 3 hours ago · Reply to Shooting Brake

Ah, The Government. Yes yes they will clean up the streets right up.

Boulevard_Yachtsman · 8 hours ago

I’m looking forward to the Mad Mad Mad Mad World mode where it just goes full-send to its destination, running other cars off the road as necessary. The icon of course will be a treasure chest under a big “W”.

InvivnI · 8 hours ago

This is a completely idiotic own-goal by Tesla. Why? Because this “Mad Max” mode is going to force governments to introduce regulations banning automated driving systems from intentionally breaking road laws.

I can already see many jurisdictions are going to go too far and over-regulate, potentially impacting features in the “normal” FSD mode, and overall increasing the cost of development to meet these regulations. In effect, Tesla’s just guaranteed they’ll have to deal with more red tape in the future.

That’s not to mention the absolute legal liability mess they’re putting themselves in – which I guarantee will be put to the test when a wreck inevitably happens whilst someone is using this feature.

anAutopian · 2 minutes ago · Reply to InvivnI

The way I see it, Tesla is highlighting what humans do (break laws). The first was the “California” stop or rolling stop. It’s what humans do. Government stepped in and now people behind the wheel have to step on the accelerator to do the rolling stop.

At this point, I think they are setting up the argument for having autonomy over people driving. With one OTA, all autonomous cars stop rolling stop signs.

But if all cars are going the speed limit, what will replace speeding ticket income for the government?

Harmanx · 8 hours ago

I’ve had Autopilot and later FSD (as it became available to more drivers) for almost eight years. The software has always had the ability to go beyond the speed limit — that’s nothing at all new. (If I recall, 90mph had been the top speed option for a long time, then the top speed was reduced when FSD took on highway driving duties instead of Autopilot, while kinks were worked out.) “Mad Max mode” is also not new. It was an available setting one or two years ago, then removed during that FSD transition — and now put back. (The dumb icon is new, though.)

“Mad Max mode” is not the only setting, though — users can opt for one of several less (and non-) law-breaking settings, and often do. I don’t like the extreme setting, myself, so haven’t really used it.

At the moment, FSD users are finally feeling like the software is driving enough like human drivers that it seems safe to use in nearly all environments — and one of the biggest remaining frustrations had been that its least human-like behavior was its speed and lane-changing relative to surrounding traffic. The harsher critics of FSD often call out anything it does that is illegal — which is understandable. But ultimately, if cars are driving themselves, they best do it predictably and as much like the surrounding traffic as possible. FSD has been reaching new high levels of driving safety — so the company reintroduced the option for more aggressive driving.

(I’m not sure if the “NHTSA stop” is still there in the latest software — where the car, under newer NHTSA orders, had to come to a complete stop at all stop signs. That feels the most unnatural of all FSD’s behaviors — and is generally as frustrating for drivers driving behind as it is for the FSD users.)
