Tesla is recalling 362,758 vehicles after the federal National Highway Traffic Safety Administration (NHTSA) called out its Full Self-Driving Beta (FSD Beta) software for a reason that, when you think about it, is surprisingly human for a computer-based AI system: it sometimes breaks traffic laws.
Sure, we all do that on occasion, but the difference is that you and I aren’t a $15,000 option that’s often touted as being safer than human driving. Unfortunately, that doesn’t seem to be the case, especially according to NHTSA, which advised Tesla that it has “potential concerns related to certain operational characteristics of FSD Beta in four specific roadway environments.” Sounds like someone is in trouble.
The recall notice describes the “defect” as follows:
FSD Beta is an SAE Level 2 driver support feature that can provide steering and braking/acceleration support to the driver under certain operating limitations. With FSD Beta, as with all SAE Level 2 driver support features, the driver is responsible for operation of the vehicle whenever the feature is engaged and must constantly supervise the feature and intervene (e.g., steer, brake or accelerate) as needed to maintain safe operation of the vehicle.
In certain rare circumstances and within the operating limitations of FSD Beta, when the feature is engaged, the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers in the following conditions before some drivers may intervene: 1) traveling or turning through certain intersections during a stale yellow traffic light; 2) the perceived duration of the vehicle’s static position at certain intersections with a stop sign, particularly when the intersection is clear of any other road users; 3) adjusting vehicle speed while traveling through certain variable speed zones, based on detected speed limit signage and/or the vehicle’s speed offset setting that is adjusted by the driver; and 4) negotiating a lane change out of certain turn-only lanes to continue traveling straight.
Essentially, this all just adds up to sloppy driving: not obeying speed limits, rolling through stop signs, going through intersections with “stale” yellow lights (that’s NHTSA’s odd terminology there), and going straight out of turn-only lanes. The truth is, we’ve seen this sort of behavior, and sometimes significantly worse, from FSD Beta plenty of times, which is why I personally think this sort of recall may be overdue.
Because FSD Beta is an SAE Level 2 system, the human in the driver’s seat needs to remain vigilant at all times and be ready to take control at any moment. The fact that this recall exists at all suggests that people are not always able to perform that vigilance role properly, which is really the biggest flaw of all Level 2 systems, something I have definitely ranted about multiple times before.
Alongside this problem, there’s still the issue of how FSD Beta is marketed and portrayed, and how it is understood by the people who use it: in general it tends to be overhyped, with its capabilities misunderstood and overestimated, sometimes due to seemingly intentional overstatement by Tesla itself.
Tesla and its Level 2 semi-automated driving systems have been under investigation by NHTSA for some time. This recall affects all Tesla models – Model 3, Model Y, Model X, and Model S, spanning model years 2016 to 2023 – and Tesla plans to correct the issue via an over-the-air update on April 15 that will alter the FSD Beta software. According to the recall notice, the remedy plan is this:
Tesla will deploy an over-the-air (“OTA”) software update at no cost to the customer. The OTA update, which we expect to deploy in the coming weeks, will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above.
Tesla does not plan to include a statement in the Part 577 owner notification about pre-notice reimbursement because there are no out of warranty repairs related to these conditions.
The remedy OTA software update will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above, whereas a software release without the remedy does not contain the improvements.
Even with these fixes, there are still the issues of the inherent, non-technical but conceptual flaws with all Level 2 semi-automated systems. NHTSA is reportedly assessing the interactions between humans and these Level 2 systems like Autopilot and FSD Beta to see how driver attentiveness is affected, so perhaps we’ll see new NHTSA rulings on that in the future.
At least the fix is easy for Tesla owners: just be ready for that update on April 15, and you should be – if not good – at least better to go. Just pay attention to what your car is doing, FSD Beta or not, people.