New IIHS Study Confirms What We Suspected About Tesla’s Autopilot And Other Level 2 Driver Assist Systems: People Are Dangerously Confused


Being right about something occasionally is fun, and gives you a little dopamine rush of pleasure, a pleasure that I personally encounter extremely rarely. I feel like I’m right about something today, so mark your calendars, but it’s not something I’m actually pleased to be right about: People both don’t understand and aren’t using Level 2 (L2) semi-automated driving assist systems well or safely. I’m being bold and saying I’m right about my thoughts on this – which I’ve been ranting about for quite a while – because a study from the Insurance Institute for Highway Safety (IIHS) just dropped, and it comes to essentially the same conclusions: Significant percentages of people who have cars with L2 semi-automated driver assist systems overestimate the capabilities of the systems, and as a result use them in potentially dangerous ways. With these systems becoming more and more common, it’s time we addressed this bluntly and honestly.

The study, called Habits, Attitudes, and Expectations of Regular Users of Partial Driving Automation Systems, is based on responses to questions from a pool of 604 respondents who own cars with one of the three most common L2 semi-automated driving systems, broken down like this: General Motors Super Cruise (200), Nissan/Infiniti ProPILOT Assist (202), and Tesla Autopilot (202).


What Level 2 Driving Automation Systems Actually Do

Now, before I get into the results of this study, let’s just make sure we’re all on the same page regarding exactly what each of these driver-assist semi-automated systems actually does. Fundamentally, they all control speed and braking, dynamically keeping set distances from the car in front like adaptive cruise control, and they do some steering, keeping centered in a lane and following road curves and so on. Some systems do a good deal more, like Tesla’s Autopilot and GM’s Super Cruise, allowing for lane changes and some degree of navigating on streets other than highways, and so on.

What all these systems have in common is that they are very much not actually self-driving systems, and the person in the driver’s seat must remain alert and ready to take control at any moment, because any of these systems could disengage or make a poor decision at any time, with zero warning. This fact, that the driver must remain on constant alert even when the semi-automated systems appear to be doing most of the work associated with the driving task, is at the heart of the problem that this study reveals, and, frustratingly, is something that researchers who study automation have known about since at least 1948.

The Problem

The problem even has a name, taken from N.H. Mackworth’s 1948 study called The Breakdown of Vigilance during Prolonged Visual Search: the Vigilance Problem.

Essentially, the vigilance problem is not a technological one, it’s a human one. Given a system that operates almost independently but requires passive oversight by a human, that human will not be good at paying attention to the system, and as a result will not likely be ready to take action when needed. In the case of these semi-automated driving systems, the situation is actually worse, because the naming and marketing and perception of these systems suggests more capabilities than they actually have, which makes people even less vigilant.

Nissan’s ProPILOT Assist is a bit of an exception to this, as its marketing and implementation do a lot less to suggest capability, and the numbers from this IIHS study back this up, as we’ll see soon.

What Drivers Think They Need To Do When Using Automated Driver Assist Systems

In fact, let’s look at some numbers right now, starting with this really basic question: Are drivers using these systems comfortable with not paying attention to the road while the systems are engaged? Here’s what the study found:

Over half of Super Cruise users (53%), 42% of Autopilot users, and only 12% of ProPILOT Assist users said they were comfortable letting the partial driving automation drive the vehicle without having to watch what was happening on the road.

Holy crap, over half of Super Cruise users didn’t think they needed to “watch what was happening on the road?” It’s worth noting that Super Cruise is the only one of these three systems that requires no physical contact with the steering wheel when engaged, instead relying on a camera-based system to track the driver’s eyes.

Tesla owners at 42% aren’t much better, but Nissan owners actually are, at only 12%.

The study explains a bit:

Over-trusting either hands-free (Schneider et al., 2022) or hands-on-wheel partial automation (Victor et al., 2018) can lead drivers to not intervene even when they see a hazardous situation forming in front of them because they incorrectly believe the system can handle more than it was designed to do. One question this study could not answer was whether hands-free driving capability is more likely to give users the impression that the system is more functionally capable and safer than a hands-on-wheel system. Nonetheless, more than half of Super Cruise users in the current study said they were comfortable letting the system drive itself without having to watch what was happening on the road, compared with approximately 40% of Autopilot users and only 12% of ProPILOT Assist users. Super Cruise users were also far more likely than the other two groups to say that most non-driving-related activities were safe to do while using the system.

What The IIHS Study Reveals Drivers Are Doing Instead Of Driving

So what the hell are these “non-driving-related-activities” that people are doing instead of paying attention to the road? Here’s what the study found:

[Chart: percentage of users reporting non-driving-related activities, with the system engaged versus not engaged, by system]

Some of these seem like no big deal: eating and/or drinking is something we all have done while driving, same with talking to passengers, talking on a cellphone call via Bluetooth, or looking at scenery. Most of those are things people have been doing since before cars even shifted themselves, let alone drove themselves in any sort of way.

Other activities on the chart are a lot more alarming: per the survey, 49% of Super Cruise, 44% of Autopilot, and 19% of ProPILOT users say they’ve been texting, which isn’t great. But 19% of Tesla Autopilot users have apparently used a laptop or tablet? And 20% watch videos? 18% read a book? What the hell? And 10% of Autopilot users admitted to sleeping? Sleeping!

The fuck is wrong with these people?

That chart notes the difference in what people do with the systems on and off, and comes to the very obvious conclusion that drivers are much more likely to do stupider things when their semi-automated systems are on:

The present study supports the finding of previous research that partial driving automation facilitates engagement in non-driving-related activities (e.g., Dunn et al., 2021; Noble et al., 2021; Reagan et al., 2021); however, the attitudes, expectations, and habits around individual activities and system design safeguards vary among drivers depending on the system they use.

Interestingly, Super Cruise users were the most likely “to characterize phone and other peripheral device use, watching videos, grooming, reading, and having hands off the wheel as safe to do while using the system,” which I would suspect is related to the fact that Super Cruise does not require contact with the steering wheel. ProPILOT users were the least likely.

Attention Reminders In Level 2 Systems

All these systems have attention reminders: if the system thinks you’re not paying adequate attention, as measured by either steering wheel torque sensors or eye-tracking cameras, it gives the driver a warning, and, if that warning isn’t heeded, the semi-automated driving assist system may be suspended or disabled.
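To make the general pattern concrete, here’s a minimal sketch of how an escalating attention-reminder loop could work. This is purely illustrative: the class name, timing thresholds, and states are my own assumptions, not any manufacturer’s actual logic, which is proprietary and far more sophisticated.

```python
import enum


class AttentionState(enum.Enum):
    ATTENTIVE = "attentive"
    WARNING = "warning"        # first visual/audible nag
    ESCALATED = "escalated"    # louder, more insistent alerts
    LOCKED_OUT = "locked_out"  # assist feature suspended


class AttentionReminder:
    """Illustrative escalation logic: warnings ramp up the longer the
    driver appears inattentive, ending in a lockout if ignored.
    All timing values are made-up placeholders."""

    def __init__(self, warn_after=5.0, escalate_after=10.0, lockout_after=15.0):
        self.warn_after = warn_after          # seconds before first warning
        self.escalate_after = escalate_after  # seconds before escalation
        self.lockout_after = lockout_after    # seconds before lockout
        self.inattentive_for = 0.0

    def update(self, dt, driver_attentive):
        """Call once per control-loop tick. `driver_attentive` is the
        monitoring verdict (torque sensed on the wheel, or eyes on road)."""
        if driver_attentive:
            self.inattentive_for = 0.0        # attention resets the clock
            return AttentionState.ATTENTIVE
        self.inattentive_for += dt
        if self.inattentive_for >= self.lockout_after:
            return AttentionState.LOCKED_OUT
        if self.inattentive_for >= self.escalate_after:
            return AttentionState.ESCALATED
        if self.inattentive_for >= self.warn_after:
            return AttentionState.WARNING
        return AttentionState.ATTENTIVE
```

The key human-factors detail is that last state: once locked out, the driver loses the assist feature, which, as the study shows, is exactly what annoys people.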

The study surveyed drivers’ “perceived annoyance” at these reminders:

[Chart: drivers’ perceived annoyance at attention reminders, by system]

Ignoring these warnings can get the driver locked out of the driver assist systems if they don’t respond to prompts quickly enough, and we get some fascinating details of the lockout scenarios Super Cruise users described, maybe with more specificity than you’d expect:

Moreover, seven Super Cruise users mentioned eating as one of the activities that led to their lockout experience—most of them specifically described taking their hands off the wheel to eat—but none of the Autopilot users cited this reason. Illustrating the challenges of eating while driving, two Super Cruise users said that they had dropped their food, in one case a taco and in another case a hamburger, and had to let go of the wheel to retrieve their meals. Fifty-four percent of Super Cruise and Autopilot users who experienced lockouts said they were at least somewhat annoyed these lockout events happened.

Look at that; IIHS has recorded two dropped food incidents, one a taco-class incident and one a burger-class incident. Fascinating.

In the attention reminder context again we see differences between the systems, though for the most part, drivers seem to understand the need for these reminders. Nissan ProPILOT Assist users got the fewest reminders, and the study has some good insight into why (emphasis mine):

A substantial proportion of ProPILOT Assist users reported never having received attention reminders, which raises the possibility that interactions with the system’s lane-centering feature might also contribute to these group differences. ProPILOT Assist’s lane-centering support remains active while the driver steers within the lane; this characteristic is a component of cooperative steering, or shared haptic control. This design philosophy encourages the driver to actively participate in the steering task (Marcano et al., 2021), helping to reinforce the driver’s role in the relationship and improving the driver’s sense of agency and willingness to intervene whenever necessary or desired (Wen et al., 2019).

The Benefit of Cooperation

Unlike Autopilot or Super Cruise, Nissan’s system is designed to be cooperative, and doesn’t disconnect when the driver gives steering, throttle, or brake inputs. It’s not modal, like the other systems, and as such the driver is never in a position where they would “give up” control of the car to the machine. While this seems like a less advanced way of doing things, I think for semi-automated L2 systems it makes so much more sense, because the real problems arise when the driver thinks the machine has more control than it actually does. If you never imply that the car is in complete control, with the driver only monitoring, and instead always keep the driver an active participant in the loop, then many of the vigilance-problem issues simply won’t happen.
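The modal-versus-cooperative distinction can be sketched in a few lines. Again, this is a toy illustration under my own assumptions: the torque threshold, blending weight, and function names are invented for clarity, not taken from any actual vehicle controller.

```python
def modal_steering(system_torque, driver_torque, override_threshold=1.0):
    """Modal design (Autopilot-style, per the study): driver steering
    input beyond a small threshold kicks lane centering off entirely.
    Returns (applied_torque, assist_still_engaged)."""
    if abs(driver_torque) > override_threshold:
        return driver_torque, False   # assistance drops out completely
    return system_torque, True        # system alone is steering


def cooperative_steering(system_torque, driver_torque, driver_weight=0.6):
    """Shared haptic control (ProPILOT-style): driver and system inputs
    are blended, so steering within the lane never disengages the system.
    Returns (applied_torque, assist_still_engaged)."""
    blended = driver_weight * driver_torque + (1 - driver_weight) * system_torque
    return blended, True              # assistance always stays engaged
```

The design consequence is visible in the return values: in the modal sketch, any meaningful driver input flips the engagement flag to False, which is exactly the disincentive to intervene that the study worries about; in the cooperative sketch, driver input simply mixes into the output and the flag never drops.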

The study notes this difference between ProPILOT Assist and the other two systems:

In contrast, Autopilot’s lane-centering support deactivates whenever the driver exceeds a (relatively small) threshold amount of steering torque. Super Cruise’s lane-centering support temporarily suspends and only automatically reactivates once the driver has stopped steering and the system has regained the necessary information to position itself within the lane. It is unclear to what extent these design differences influence driver behavior, but the temporary suspension or deactivation of system support might make drivers more reluctant to participate in the driving over time (Banks & Stanton, 2015).

Perhaps counterintuitively, the more work the semi-automation system does, the less it seems to make drivers want to take control when needed, out of fear of the system disengaging. While I don’t think this study has definitively proven this to be the case, it nevertheless is a pretty alarming concept: Driver reluctance to take control so as not to cause the system to disengage has a lot of potential to be dangerous. There should be no hesitation for drivers to take control if they feel uncomfortable in any way.

The Unexpected Effects Of Camera-Based Driver Monitoring Systems

While driver monitoring systems are getting better, even the camera-based ones are not foolproof, and this study also noted Tesla drivers using a water bottle to fool steering wheel torque sensors. Additionally, camera-based driver monitoring systems that don’t require holding the wheel have another disadvantage: they further the illusion that the system is more capable than it actually is, which permits dangerous behavior like not looking at the road (emphasis mine again):

While steering torque monitoring alone is a poor basis for managing driver behavior (Lin et al., 2018), the current study’s findings indicate that camera-based driver monitoring on its own is not a silver bullet either. The longer a driver looks away from the road the greater their crash risk (Klauer et al., 2006; Yang et al., 2021). Even though Super Cruise uses camera-based monitoring to know where the driver is looking, many Super Cruise users reported being more likely to look away from the road for extended periods while using the system than the drivers in the other two groups. It is unclear why this is the case, but, compared with Autopilot and ProPILOT Assist users, Super Cruise users were also more likely to say that looking away from the road is safe to do while using the system than during unassisted driving and to say that they can do it better and more often with the automation’s support. Clearly, the expectations and attitudes of many of these drivers do not reflect an accurate understanding of the system’s limits.

The study also notes that we’re still in the early stages of adoption of semi-automated systems, and as a result these findings may be skewed to specific “early adopter” behaviors:

As Lin et al. (2018) noted, the scarcity of vehicles equipped with partially automated systems in the registered vehicle fleet means that studies such as this one are presently capturing behavior and perceptions of early adopters. It is unclear how user attitudes and expectations will evolve as system designs change and the technology becomes more widespread.

What I find alarming about the early adopter idea is that of anyone, early adopters should be the most aware of a system’s technical abilities and limitations, shouldn’t they, since they’re so damn interested? Or, could this just mean that the eagerness of an early adopter is enough to make them victims of their own wishful thinking about the capabilities of these systems?

Perception And Marketing Is A Big Deal

Of course, a lot of the issue is how these systems are marketed and portrayed; names like Autopilot and a heavy emphasis on hands-free driving do lead to a lot of overestimation of what these things can do:

Worryingly, some drivers appear to have a false sense of security about how they are meant to use the technology and what it is designed to do. Misunderstanding the user’s roles and responsibilities corresponds with a higher degree of engagement in nondriving-related activities compared with driving without assistance. This confusion is likely influenced by system design, as some systems are more likely to give drivers the impression that they are more functionally capable than they are.

There’s also some interesting data about the demographics of the groups in the study, none of which is all that surprising: GM’s system, showing up mostly on Cadillacs, skews old. Teslas skew younger and overwhelmingly male, and the Nissans are the group that has demographics closest to just the general population. Again, not shocking.

What is a bit shocking – well, actually, it’s not shocking, because what’s happening is exactly what we’ve known happens with systems that do most of the work, and still somehow try to demand constant vigilance: It doesn’t work. Maybe alarming is a better word, because these systems are already on the roads, and they’re continuing to be built and sold, and they are all, I think, fundamentally, inherently, broken.

The Takeaway

This isn’t something that needs a technological fix, because it’s not a technological problem. It’s a human problem, and it doesn’t matter how many over-the-air updates you send out or what revision the latest Tesla Full Self-Driving Beta is up to, because if any of these systems still demand that a person be ready to take over without warning, they’re doomed.

The better the system is at driving, the worse it will be for keeping people ready and alert, so there’s a bit of a paradox here, too. The more featured the system, the more advanced it is, the more it does, the less it demands from the driver, the worse it actually is. Because if it still may need you to intervene at any moment, the less engaged you are, the worse the result will be if anything should go wrong.

That’s why of all of these, Nissan seems to be on the most workable track: keep the interaction between human and machine cooperative, not one or the other in charge at any given time, and you can still get the safety and reduced-stress benefits of semi-automated assisted driving with less of the inherent dangers caused by vigilance problem-related issues.

Level 2 semi-automated systems are flawed, and I think the IIHS made my point for me, so thanks.

51 Comments
Uncle D
1 year ago

Not surprised at all. The only way to be deceived is you want to believe the lie more than you want to know the truth. People want to believe Musk and his hyperbolic BS. Of course Tesla is going to market their system as Full Self Driving (FSD) and then just tack Beta onto the end of it thus absolving themselves of any liability when people trust the name and do dangerous idiotic things. Tesla will be fine as they remind everyone that it was only Beta software and the user (i.e. dead person) was not following the terms of the agreement. Beta testing what is essentially a safety product should not be allowed on public roads with untrained end users.

I know someone who is an engineer and fully understands the limitations of FSD Beta, but still straps a wrist weight to his Model 3’s steering wheel so he can go hands free and stream shows on his commute to and from work every day. If someone who has some level of understanding does this, you can only imagine what others are doing. OR just look at the survey results above.

People are getting injured and dying because of intentionally vague product descriptions intended to sell half baked systems that are years from actually living up to their names. It’s dishonest, irresponsible, and dangerous.

Beater_civic
1 year ago

Something that really troubles me about all these systems that I’ve never really seen discussed is how relying on them will, over time, erode our ability to take over effectively when the system disengages.

Right now, most drivers encounter mild hazards on every single journey, and navigate them without too much thought. Vehicles stopped in weird places, people driving erratically, inattentive pedestrians – we all learn the actions to take when we see an upcoming threat by doing them over and over and over again. How automatic is it to cover the brake and the horn when you see a car nudging just a little further into an intersection than you would like? When someone a few hundred yards ahead of you on the freeway screeches to a halt, it’s basically automatic to get on your own brake and have a finger ready to poke the four-way button to warn the person behind you.

99.9% of the time, we practise these actions in low-stakes situations, where the worst thing that could happen might be a fender-bender. In other words, we tolerate a low ambient level of risk, but the payoff is that essentially THE SAME actions are also effective when something potentially way more dangerous is about to happen. We change lanes under ‘normal’ conditions frequently, so it’s stressful, but within the realm of possibility, to make a quick lane change to avoid colliding with the truck that pushed a little too far into a turn, or the granny dropping her walker into the crosswalk early.

If you’ve never made a quick lane change in hairy traffic, or never had the guy in front of you drop his sandwich WITHOUT Autopilot, etc, etc, what are the odds of taking the right actions at the right time when it’s really important? Soldiers, firefighters, basically anyone doing performance-critical work practises over and over and over again, just to make sure they’re ready when the chips are down.

In other words, it seems that the better the systems get, the more catastrophic the consequences when they do encounter something they can’t handle. So what do we do? Require road tests every time you renew your license? I don’t know but I think it’s a big problem!

Flyingstitch
1 year ago

Interesting, that fear-of-disengagement finding. I can relate. Sometimes when I’m on cruise (non-adaptive) and really enjoying the respite for my tired right leg, I find myself a little annoyed at having to disengage when traffic starts to bunch up. I do it anyway, because it beats crashing, but yeah, this tracks with human foibles.
