
The Reported Criminal Probe Against Tesla’s Self-Driving Claims Is Really A Critique Of Level 2 Semi-Autonomy Entirely


There have been a lot of reports recently suggesting that the Department of Justice has been conducting a criminal investigation into Tesla’s claims regarding its Level 2 semi-automated driver assist system, known as Autopilot. First reported by Reuters, the probe was started last year in the wake of over a dozen crashes that appear to have involved Tesla’s Autopilot system in some manner. While Reuters’ source was thin on details of the alleged probe, the existence of such a probe is a very big deal, and while Tesla is the focus here – I think for valid reasons – the underlying problems being investigated are, to some degree, inherent in Level 2 semi-automated tech.

The crux of the Department of Justice’s probe seems to be related to the disparity between how Autopilot is marketed and promoted and the actual realities of what the system is capable of. This imbalance is at the heart of pretty much all Level 2 systems, as a study from the Insurance Institute for Highway Safety revealed earlier this month. People who use Level 2 semi-automated driving systems – which require an alert driver to be ready to take control back with no warning – seem to think their systems are much more capable than they are, which can lead to serious safety issues.


I think Tesla has definitely implied that Autopilot is more capable than it actually is: a Level 2 system that is very much not a true self-driving system. This starts with the confusing name Autopilot, which to most people suggests self-driving, and the company has done more than just suggest the Autopilot system is self-driving; in one prominent and longstanding example, they’ve come out and said it.

That example is in a video from 2016 that is still on Tesla’s website, in a post titled “Full Self-Driving Hardware on All Teslas.” This video, set to the Rolling Stones’ “Paint It Black,” begins with this text:

[Screenshot: the opening text of Tesla’s 2016 self-driving demo video]


I mean, this feels pretty damning to me: “THE PERSON IN THE DRIVER’S SEAT IS ONLY THERE FOR LEGAL REASONS. HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF.” How is that not suggesting that Autopilot is capable of driving the car on its own? As an extra bit of irony, last year the New York Times found that during the shooting of this video, the car hit a roadside barrier and required repairs. I bet you won’t be shocked to find that part didn’t make it into the final video.

Of course, nothing is simple. Tesla has definitely provided all of the warnings and disclaimers needed for Autopilot on their website and in their owners’ manuals, and those are quite clear as well:

[Screenshot: Autopilot warning text from Tesla’s owner’s manual]

It comes right out and says Autopilot “does not make driving autonomous,” and tells drivers to “remain alert at all times and be prepared to take immediate action.” There’s no disputing that. Tesla definitely is covering their ass here, and I’m not certain how the DOJ plans to incorporate this into their investigation; Reuters’ sources didn’t have specific commentary about this, either.

These statements and warnings, though, are in smaller print on websites and deep inside owner’s manuals. This sort of honest, forthright assessment of Autopilot’s limitations is not what’s normally seen in public statements, which are things like

[Embedded video]

…in that one he actually says “car is driving itself” when using “Active Autopilot.” And when Tesla got pushback for calling their more advanced, in-development (yet deployed on public roads) semi-automated driving assist system “Full Self-Driving (FSD),” Elon Musk responded like this:

“I think we’re very clear when you buy the car what is meant by full self-driving. It means it’s feature complete. Feature complete requiring supervision … There’s really three steps: feature complete of full self-driving but requiring supervision, feature complete but not requiring supervision, feature complete not requiring supervision and regulators agree.”

Saying you have a Full Self-Driving system that is “feature complete” sure sounds like it means it’s a complete system where the car can drive itself. Plus, going on to say “feature complete but requiring supervision” and “feature complete but not requiring supervision” is confusing at best, because if it’s “feature complete,” what’s the differentiator between requiring or not requiring supervision? Testing, I suppose?

Then there’s the differentiation between not requiring supervision and not requiring supervision “and regulators agree,” which again goes back to the implication that it’s government regulation, more than anything else, keeping your car from driving itself – which just isn’t the case.

While I can’t speak to exactly what the DOJ is investigating, what I can see are two very different kinds of messaging happening around Autopilot. There’s the grim realism of the warnings on websites and in documentation, which makes it very clear that the car is not self-driving and that the driver must be ready to take control at any moment. And then there’s the more marketing-type messaging, which suggests that Autopilot-assisted driving is so much safer than human driving, and that were it not for all those pesky, fussy, fun-killing government regulators, you could be blasting down the highway at 90 mph watching VR porn on a headset.


So, the real problem seems to be a clash between actual capability and how Tesla is presenting Autopilot to the world, and while I can’t speak to what the DOJ will decide, I can note that all Level 2 systems are somewhat inherently going to be plagued by this, and the more advanced an L2 system gets, the worse the problem becomes.

The issue is that all L2 systems require non-stop attention from a driver, because a disengagement could happen for any number of reasons, without any real warning, and the driver is needed to take control immediately. This is literally part of the definition of what a Level 2 system is:

[Image: SAE J3016 Levels of Driving Automation chart]

See where it says “You are driving whenever these driver support systems are engaged,” and “You must constantly supervise these support features?” That’s pretty clear about what’s going on. The problem is that highly developed L2 systems can seem like they’re driving themselves, performing the vast majority of the driving task, yet they’re still just L2 systems, and could require takeover at any moment.
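If it helps to see that split laid out plainly, here’s a little toy model of who is responsible at each level – my own illustration paraphrasing the SAE chart above, not anybody’s actual implementation:

# A toy model of the SAE J3016 responsibility split. Purely illustrative:
# the rules paraphrase the SAE chart; the code is this article's invention.
from enum import IntEnum

class SAELevel(IntEnum):
    L0 = 0  # warnings and momentary assistance only
    L1 = 1  # steering OR speed support
    L2 = 2  # steering AND speed support, constantly supervised by the driver
    L3 = 3  # drives itself in limited conditions, but may demand a takeover
    L4 = 4  # drives itself in limited conditions, never demands a takeover
    L5 = 5  # drives itself everywhere, in all conditions

def who_is_driving(level: SAELevel) -> str:
    if level <= SAELevel.L2:
        # Per the chart: "You are driving whenever these driver support
        # features are engaged," and a takeover can be needed with no warning.
        return "you, constantly supervising, ready to act instantly"
    if level == SAELevel.L3:
        return "the car, until it asks you to take over"
    return "the car"

print(who_is_driving(SAELevel.L2))
# -> you, constantly supervising, ready to act instantly

Run that for Level 2 and the answer never changes: the human is driving, full stop.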

That’s how we get mainstream articles like this New York Times piece from earlier this year, in which a writer driving a car with GM’s Super Cruise L2 system dangerously mischaracterizes the system as being much more capable than it really is, stating that what was required of him as a driver was


All the car asks is that you keep looking forward, in the general direction of the road.

…which is, of course, absolutely untrue. That’s not all the car is asking. It’s asking you to be ready to drive, at any moment.

This is a huge problem because the more an L2 system does, the less it seems like the driver needs to do anything, so the less attentive they can allow themselves to be.
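That tension is why the attention-enforcement side of these systems matters so much. Here’s a purely illustrative sketch of the kind of escalation loop a driver-monitoring system runs; the thresholds and actions are invented for the example, not pulled from any shipping product:

# Purely illustrative: escalating responses to an inattentive driver.
# Thresholds and actions are invented, not any real system's values.

ESCALATION = [
    (2.0, "flash a visual warning on the instrument cluster"),
    (5.0, "sound an audible alarm"),
    (10.0, "disengage assist and begin a controlled slow-down"),
]

def actions_for(seconds_inattentive: float) -> list[str]:
    """Return every escalation step triggered so far, mildest first."""
    return [action for threshold, action in ESCALATION
            if seconds_inattentive >= threshold]

for t in (1.0, 3.0, 12.0):
    print(t, actions_for(t) or ["keep driving, keep watching"])

The catch is that every one of those invented thresholds is a window of seconds in which, potentially, nobody is really driving the car.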

Yes, the DOJ is going after Tesla now (again, I think with good reason), but it may be time to evaluate how every automaker approaches Level 2 semi-automation, and to decide if it makes any sense at all.

 

46 Comments
Ivan256
1 year ago

It’s pretty clear that if you have all of the appropriate and necessary disclaimers and warnings to absolve you of liability, and ALSO have statements, winks, and nudges suggesting to your customers that they can ignore those warnings and disclaimers because they’re only there to keep the lawyers happy, then you deserve the wrath of the angry lawyers and the crippling financial sanctions that come with said wrath.

Brummbaer
1 year ago

I hate to bring up my Cadillac CT6 again, but here I go. My Cadillac is a 2019 and has some of the features described, i.e., lane centering, automatic braking, and side collision avoidance, along with a huge number of cameras.

Problem is, the lane centering is unreliable. If it can’t sense the painted lines on the road it goes “bugf**k” and wanders off. The automatic braking works fine, except that it has a habit of waiting until I get worried and then, bam, really slowing down. The cameras work fine when there is something to see and the lenses are clean.

All in all, certainly not self-driving, but with some early pretensions of where we are today. Personally, they can keep all of it. I liked my 2007 STS a lot more.

Dangerous_Daveo
1 year ago
Reply to  Brummbaer

I find the braking one is because we’re looking multiple cars ahead, while the car’s system is just monitoring the distance between the front of itself and the back of the car in front. It can’t see that 37 cars ahead just had a massive bingle and we need to back off now. And they won’t be able to do that until they can talk to each other.

Bison78
1 year ago
Reply to  Brummbaer

>> Problem is, the lane centering is unreliable. If it can’t sense the painted lines on the road it goes “bugf**k” and wanders off.

In my Tesla, which doesn’t have the latest hardware, lane tracking is amazing. Last year, it successfully navigated a 100 ft curved section of freeway that had new paving and no lane markings. I had my hands firmly on the wheel, waiting to take over, but frankly, it did better than I could have done.

PajeroPilot
1 year ago

I was personally okay with the “Autopilot” name. My understanding of actual autopilots on planes is that the real pilots are constantly monitoring the situation and have to be able to take over the moment anything out of the ordinary happens. Which is sort of how I understand Tesla’s system works (I’ve never driven one).

The “Full Self Driving” name, on the other hand, is dangerous bullshit. In this case Elon is definitely overreaching. Torch definitely has a point – pilots are highly trained professionals who are accountable and will monitor the situation constantly. Motorists, on the other hand – getting a driver’s license in general isn’t hard, and many drivers can’t be trusted not to abuse Autopilot-like Level 2 systems.

Dave Edgar
1 year ago

I just don’t see how any of this is ever going to work. It all depends on perfectly calibrated and maintained sensors that never, ever stop working. Seriously? In the northern tier of this great nation, we have a thing called “winter.” Snow, slush, salt – it’s hard enough to keep windshields and lights clean enough for safe operation. Never mind the software – I don’t believe the hardware will ever be 100% capable of safe operation 24/7/365 for any reasonable operational lifetime. Look around you just about anywhere, anytime, and ask yourself: “How many of the cars I’m looking at are sufficiently well maintained to trust them as ‘self-driving’?” At least where I live, it is a very small percentage, and we are by no means unique.

Lokki
1 year ago

What I have never understood is the general decision by manufacturers to apply “self-driving” technology to passenger cars rather than long-haul trucks. The environment on most interstate highways and the like is far less complex than that of city streets. Vehicles enter and exit from designated points (ramps), drive within a limited speed variance, and make very few sudden stops. Trucks start their trips from depots close to the highway and end at similarly situated depots.

This more controlled environment seems far simpler to program: any obstacle or vehicle noted by the system doing something that falls outside “the norm” sounds an alarm for a human to take control, and the truck immediately starts braking to a quick stop at the side of the road unless a human operator intercedes to keep the truck going.

Much less complex than the chaos of an urban environment full of baby strollers, and so forth. It also gives developers the opportunity to rack up millions of miles in testing rather quickly.

So…. I just don’t get it.

Further, there is a huge financial incentive for such a system in long haul trucking: professional drivers are expensive. Even if the equipment to operate a semi costs half a million dollars, it would only take a few years to recoup the costs in saved salaries.

However, car manufacturers have chased the sizzle of putting self-driving into luxury cars instead.

Chartreuse Bison
1 year ago
Reply to  Lokki

Because a trucking company wouldn’t settle for this half/quarter-assed self-driving like the average Tesla fan does. Luxury buyers will pay for the feature just to say they have it, but being cool doesn’t balance the books for a company. They’d still have to pay a safety driver to monitor the system for now, so they might as well just pay a driver. And they can’t get away with lower-skilled drivers, because the kind of driving that requires trucking training/experience is exactly the kind of scenario where a Level 2 system would crap out.
I’m sure fear of losing jobs is part of it too. Trains could practically be automated with a Raspberry Pi, but most of them still have operators.

F.Y. Jones
1 year ago
Reply to  Lokki

Oh, I definitely think this is where we’re going, and the transport companies are probably counting the minutes until it’s achievable. Just have fleshbags for the final mile, but let the robots handle the other 99%.

What’s going to be either great or annoying is how those trucks are driven. Robots don’t have families to come home to, and they don’t care if it’s light or dark. I think gas/electric efficiency and safety will become more important than ever (especially if they go electric, where wind resistance becomes even more important).

I could see companies running these trucks at 45 mph on the expressways. It won’t matter if a 500 mile trip takes 9 hours or 13 hours anymore if you don’t have a human behind the wheel trying to maximize a paycheck.

Dave Horchak
1 year ago
Reply to  F.Y. Jones

Yeah, I laughed at this. A company has millions tied up in robotrucks, has a delivery deadline, and decides to cut profit in half by slowing down delivery, thereby cutting the hours the equipment is available for other loads. I see no scenario where that ever happens.

unclesam
1 year ago
Reply to  Dave Horchak

Can’t remember where I read it, but I once saw a discussion of aircraft cruising speed vs. fuel burn. Don’t remember if it’s airframe-specific or a general design principle, but the idea is that a given plane is most fuel-efficient at X cruising speed (something like 25% throttle), but that’s too slow for airlines, so they operate at the second-best point on the efficiency curve (something like 40% throttle). There’s a fancy term for it and everything. Not hard to see someone try to develop that for robodrivers.

StalePhish
1 year ago
Reply to  Lokki

You get a lot more data putting this on best-selling passenger vehicles out in the wild for years. Supposedly the first Tesla Semis will be delivered to Pepsi on December 1st, 2022, and you can bet it will launch with Autopilot – and that the software stack for Autopilot was developed from billions of miles of data collected from the fleet of millions of Model S/3/X/Y vehicles over the past 7 or so years.

Nic Periton
1 year ago

Somebody should write a book about all this stuff.

Dave Edgar
1 year ago
Reply to  Nic Periton

Like anyone who thinks this stuff actually works reads books…

Hugh Crawford
1 year ago

Tesla used Paint It Black as a soundtrack? Really? That should tell you something.

Bork Bork
1 year ago
Reply to  Hugh Crawford

Saw a kids’ animation where the opening song is Jay-Z’s “Empire State of Mind,” and as he’s rapping about his mom getting ridden like a bus, I started laughing so hard my friend thought I had gone mad.

Martin Ibert
1 year ago

Read the definition of Level 2 again. The really big difference between 2 and 3 is that you must intervene and take over control when necessary, even if the car does not tell you to. You yourself need to be on guard constantly, and it is fully within the definition of Level 2 that you need to yank the steering wheel back when your car confidently tries to kill you and others by driving straight into oncoming traffic (as I think some Teslas have).
You must be ready to take over at any time with “no notice,” and that means “no notice,” not “0 seconds’ notice” – literally none; you need to notice it yourself.

eskatonia
1 year ago
Reply to  Martin Ibert

Some may argue whether such a system should be allowed on public roads at all. It is literally the worst of both worlds: you are fully responsible for the car, but it is just “smart” enough to lull you into a false sense of security that it’s competent.

59turner
1 year ago
Reply to  Martin Ibert

I think you are confused about the definitions. Level 2 is still driver assist, where the driver decides whether or not to listen to the aid. Level 3 is where the system says “I can’t handle this condition, you need to do the work, NOW.” This is dangerous, because you still have to know all of the road conditions, but you are not doing any active driving. There is almost no point to Level 3, because as a driver you have to be in panic mode, knowing that the car may or may not start flashing red signals to take over driving, and you will never know when. Hand-over is critical and never easy.

eskatonia
1 year ago

IMO, Level 2 systems should require eye tracking by law, like BlueCruise has. Yes, that means all Tesla FSD systems would become illegal; Tesla can disable them or retrofit eye-tracking hardware. That’s not the problem of the rest of us, who never agreed to have to drive on the same roads as beta FSD. Then let’s go further: Level 3 systems that let the user stop paying attention must have the ability to pull safely off the road onto a hard shoulder, or off the freeway entirely, if the driver stops responding. They should never just stop in a traffic lane and turn on the blinkers. They should also have the ability for a remote human driver to take over in case the Level 3 system just can’t cope and the driver doesn’t respond. Don’t have all that? Then you can’t allow the user to stop paying attention. Industry self-regulation is not going to work, not when we have Musk beta testing on public roads.
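In other words, something like this fallback chain (a purely illustrative sketch; the states and timings are invented, not from any regulation or real system):

# Purely illustrative sketch of the fallback chain described above;
# the timings and actions are invented, not from any regulation or product.

def fallback_action(seconds_driver_unresponsive: float) -> str:
    if seconds_driver_unresponsive < 5.0:
        return "escalate warnings (visual, then audible)"
    if seconds_driver_unresponsive < 15.0:
        return "hand over to a remote human operator"
    # Never just stop in a live traffic lane with the blinkers on.
    return "pull onto the hard shoulder or exit the freeway, then stop"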

Dave Horchak
1 year ago
Reply to  eskatonia

And our political system moves quickly enough to do this? I mean, you can’t pass the law before the invention, because you don’t know what you need to enforce. Then the company hides errors in secret to keep the government out. Then politicians have to wait for bribes to decide what to do. Then their underpaid, overworked, sexually harassed interns have to write it into a law. Then wait to see if bigger bribes come along. Then, after all this, see if the majority party allows it to be voted on. Then vote on it on one of the ten days politicians work. Then the company sues to block the law, tying it up for anywhere from a year to forever. By that time people have died and the system has moved on to different flaws.
Yeah, you don’t see this on the Saturday morning cartoons about how a bill becomes a law, do you?

eskatonia
1 year ago
Reply to  Dave Horchak

Yes it does; laws can and could be updated every 5 years or so. Also, it doesn’t purely rely on new laws: the NHTSA is legally authorised to mandate safety measures without new laws being passed.

Dead Elvis Inc.
1 year ago
Reply to  eskatonia

Yeah, but Dave’s a moron.

Harmanx
1 year ago
Reply to  eskatonia

Teslas do have vision tracking, so they wouldn’t become illegal.

eskatonia
1 year ago
Reply to  Harmanx

Retrofitted in using the existing camera; it’s not a dedicated eye-tracking system like in BlueCruise. Let’s set the Tesla a test and see if it disengages within two seconds 99 percent of the time when the user looks away.

Bison78
1 year ago
Reply to  eskatonia

>> Let’s set the Tesla a test and see if it disengages within two seconds 99 percent of the time when the user looks away.

I am reasonably sure you don’t want the system to disengage ever. Sound alarms and bring the vehicle to a controlled stop if the driver doesn’t respond.

Note that some car companies have gone down the “disconnect” route, but that leads to cars driving off the road.

Ivan256
1 year ago
Reply to  eskatonia

“Then lets go further Level 3 systems that let the user stop paying attention must have the ability to pull safely off road onto a hard shoulder or off the freeway entirely if the driver stops responding.”

This translates to “Level 3 systems must be Level 4 systems”.

Which, IMO, is completely and utterly reasonable. Level 3 systems should never be allowed to share the road with humans.

SP4CEM4N
1 year ago

Frankly, this is ridiculous. Autonomous driving is an evolving tech, and until it becomes 99.9999% safe (or however many 9s are determined to be safe enough), drivers using it have to pay attention. If they don’t, they’re liable. Would you sue the manufacturer if your car ran into a stopped fire truck while you were using cruise control? Of course not. Has Tesla said Full Self-Driving can be used without paying attention? No. In fact, it clearly states the opposite multiple times, and enforces it by requiring hands on the steering wheel AND eyes on the road. This tech is amazing, world-leading, but not yet autonomous. Anyone who crashes while using FSD, SuperCruise, etc. is wholly and entirely at fault. Period. No matter what they name it. When the manufacturer says it is autonomous and requires no supervision, then they will become liable for the vehicle’s actions.

eskatonia
1 year ago
Reply to  SP4CEM4N

The Tesla system does not have any eye tracking; it relies on pressure on the steering wheel, which is easily spoofed, as shown by the various videos of people in their Teslas on Autopilot while they are in the passenger seat, or the back seat.

SCJeff
1 year ago
Reply to  eskatonia

Around the middle of 2021 Tesla enabled vision tracking.

eskatonia
1 year ago
Reply to  SCJeff

Does it reliably disengage Autopilot if the driver is looking elsewhere?

StalePhish
1 year ago
Reply to  eskatonia

“Reliably” is a tough metric. But yes, it does disengage. In FSD Beta at least, if you’re caught not paying attention (via some sort of head/eye tracking), you get alarms and red flashing lights, and then you get a “strike.” Something like 3 or 5 strikes and you get permanently kicked out of the beta. I got one once, I believe because I was using the brim of my hat as a sun visor driving due east in the morning, and it thought that meant I was looking down for a long time.

SCJeff
1 year ago
Reply to  eskatonia

I don’t have any “before” experience, but it does seem to give a warning pretty quickly. IMO, vision tracking alone would be a better way to go. The times I’ve used Autopilot I’d have both hands on the wheel but keep getting warnings, I guess because they were too balanced: one hand on one side will let the sensor know there’s something there. Not ideal.

Wil Randolph
1 year ago
Reply to  SCJeff

They didn’t “enable” it. The car needed to have the driver monitor to begin with, and that was limited to Model 3s. I believe they may now have them installed with FSD, but FSD isn’t FSD. So to get the equipment you need for AP to do its job reasonably well, you have to step up for the full fraud.

Lokki
1 year ago
Reply to  SP4CEM4N

Teslas don’t kill people- people kill people.

59turner
1 year ago
Reply to  SP4CEM4N

So, with a Level 4 car that has no steering wheel, you think that the “driver” is still responsible? Ridiculous take on responsibility. Once the owner fully relinquishes control, liability is 100% on the manufacturer; it has to be.

Dave Garland
1 year ago

The solution, of course, is to make the manufacturer liable for any accident where a Level 2 or higher system screws up.

JumboG
1 year ago
Reply to  Dave Garland

And for the purposes of determining the level, you have to go with what the standard large-scale advertising says, using common definitions of the words (in fact, this should be the case for all advertising). For example, “full self driving” would be interpreted as Level 5, no matter what other disclaimers are stated, unless those disclaimers are delivered at the same time, at the same volume and speed, or with the same type size/font/placement prominence in print.

Electric Truckaloo (formerly Stig’s Chamorro Cousin)
1 year ago

V2I, dedicated autonomous zones (or lanes), then slowly creep our way up to L3 and L4 with a path to L5.

I can’t see a situation where fully-autonomous and meat-operated vehicles can coexist. The tech isn’t there, no matter how much any of us want to be sleeping or masturbating on our morning commutes.

Lew Schiller
1 year ago

What is the point of it? So one can check their socials with less fear of a collision?
I get lane position monitors and such, but for God’s sake, you’re responsible for piloting a very heavy machine capable of killing a squishy human. What is so darn hard about actually doing that?

StalePhish
1 year ago
Reply to  Lew Schiller

Significant reduction in physical fatigue after a long drive. Until you try it, you’re not really aware how much wear you’re putting on your ankles and forearms during a long drive. So many micro-corrections, near constantly.

Michael Beranek
1 year ago
Reply to  StalePhish

This is complete nonsense. I routinely make trips out to the west coast in two 16-hour, 1,000-mile days, with a hotel snooze in between. Since I drive a Buick, it’s like sitting on the sofa all day, but with a better view.
Of course, if you drove 16 hours in city traffic, you’d be pretty beat up. But on the super slab with cruise on? Gimme a break. Maybe in a crappy small car, especially one with only 3 cylinders.

Lew Schiller
1 year ago
Reply to  StalePhish

An assist for long distances I can appreciate, but aren’t these systems mostly touted for city driving, and isn’t that where they fail?

Óscar Morales Vivó
1 year ago

The biggest mistake made here was that everyone decided to go solve the hardest problem first. Driver-assist features are still in their infancy, tbh, and one would expect the lessons learnt from years of seeing them on the road to inform any chance at building truer automation (I personally don’t think Level 5 is achievable, and Level 4 only in very specific environments).

Frankly, all I’d need is for the car to take over for the bits that are both boring and risky. Bumper-to-bumper traffic-jam driving seems the most reasonable target, with the most bang for the buck. Let it keep the car in its lane, and disengage the system once speeds rise past a certain point or if things get confusing (e.g., having to change lanes); there would be ample warning to the driver because of the low speeds.

Huibert Mees
1 year ago

Thanks, Jason. There is another part of this problem that isn’t really discussed much, which is that the driver needs to take over control of the vehicle when the Level 2 system makes a mistake, not only when the system disengages. We’ve seen Level 2 cars drive into the back of stationary firetrucks on multiple occasions. Those weren’t cases of feature disengagement; they were instances of the feature making a mistake. Such mistakes might give the driver milliseconds to respond, which is just not possible for most humans. The problem is that driver attentiveness and intervention will be required until we get to Level 5 autonomy, which in my opinion is still decades away, if we ever see it at all.
