
A Waymo Robotaxi Blocked An Ambulance During An Active Shooter Incident In Austin


Driving is not as simple as it seems. It’s not just about the physical control of a car and understanding what signs and road markings mean; there’s an entire other dimension to driving that requires some basic understanding of how human culture works. That’s because cars aren’t driven on neat grids of roads populated with tidy buildings and surrounded by other vehicles piloted by careful, attentive beings dedicated to following rules; cars are driven in the messy real world, which is a colorful, fragrant minestrone of chaos and madness, and this sort of disorder doesn’t constitute “edge cases,” it’s just how reality works. And it’s up to automated vehicles (AVs) to adapt to it, not the other way around. There was a sobering example of this just yesterday, when a Waymo robotaxi blocked an ambulance during an active shooter situation.

The incident happened at about 2 a.m. on Sunday, when a 53-year-old man shot at patrons at an outdoor bar in Austin, Texas, leaving 14 people injured and two dead, including the shooter. It’s all tragic, the investigation is ongoing, and this isn’t the place to discuss that. What we can discuss is the behavior of a Waymo robotaxi that arrived in the area of the attack.


A group of people at a nearby club ordered the Waymo to take them home, unaware of what was going on at the bar where the shooting was taking place. When the Waymo arrived in the area, it identified a road blockage and began to execute a U-turn. Mid-U-turn, an ambulance approached, which seemed to confuse the Waymo and left it blocking the road perpendicularly, as can be seen in this video of the incident:

There’s no way around it: this is unacceptable behavior from an AV. It’s hard to get much worse than blocking an ambulance (and an entire road) in an actively dangerous area. In the video you can see that a police officer had to physically get into the car and drive it away manually. It’s good that this was possible at all (the process generally requires the car to contact a remote overseer for permission to unlock, which seems to have happened here), but terrible that this was what was required to move the robotaxi out of the way.

A post shared by John-Carlos Estrada (@mr_jce)

I’m confused by a lot of what happened here that led the Waymo to be flustered and block the whole road; as we have discussed before, Waymo does monitor its cars with human beings at remote locations, and these humans are supposed to step in to help when a car becomes confused. Where were those “fleet response agents” during this event? This seems like a textbook case of the need to reach out for immediate human assistance.

The fact that the Waymo robotaxi ended up in this area at all should be a very loud message that we are not doing enough – and by “we” I don’t just mean Waymo and the other robotaxi companies; I mean emergency service agencies, local governments, the national government, everyone. There need to be clear procedures and expectations for what AVs should do around emergency vehicles, enforced from above, not left to individual companies.

In a case like this, not only should the robotaxi be able to identify emergency vehicle lights and audio cues, but there should be cooperation between emergency services, police, and the robotaxi companies, so a signal can be sent to all AVs on the road identifying what streets are under police or other lockdown, and those vehicles should simply be unable to go to such areas.
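To make the idea concrete, here’s a rough sketch of what a vehicle-side check against broadcast lockdown zones could look like. To be clear, every name here is hypothetical; no real Waymo or emergency-services API works this way, and a real system would use proper geodesy and signed, authenticated broadcasts:

```python
import math
from dataclasses import dataclass

@dataclass
class LockdownZone:
    # Hypothetical zone pushed out by an emergency-services feed:
    # a center point (lat, lon) and a radius in meters.
    lat: float
    lon: float
    radius_m: float

def approx_distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation of distance in meters;
    # accurate enough at city scale for a go/no-go check.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000  # mean Earth radius in meters

def destination_allowed(dest_lat, dest_lon, active_zones):
    """Refuse any trip whose destination falls inside an active lockdown zone."""
    return all(
        approx_distance_m(dest_lat, dest_lon, z.lat, z.lon) > z.radius_m
        for z in active_zones
    )
```

A dispatch system could run a check like this both when a ride is requested and continuously en route, rerouting or canceling the trip if a new zone appears mid-journey.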

I describe situations like this in my book, Robot, Take the Wheel, if you’re curious, including how it should be up to us to decide how we want these AVs to behave in situations like this. We, as a culture and society, should decide the rules, and it’s up to the AV companies to follow those parameters.

In this case, why are there not defined modes of behavior for robotaxis? When an emergency is detected, whether via audio-visual cues or information broadcast directly to the car, a special set of emergency rules should take over: an emergency mode. That mode would prioritize getting the hell out of the way of other vehicles as quickly as possible, listening to and acting on verbal commands from law enforcement officers or other emergency workers, and, of course, contacting a live human to help manage the situation and be on call to communicate with people on the scene.
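As a sketch of what such a mode switch might look like in software (entirely hypothetical, and not a description of how Waymo’s actual driving stack works), an emergency mode could be a distinct top-level state that reorders the planner’s priorities:

```python
from enum import Enum, auto

class DriveMode(Enum):
    NORMAL = auto()
    EMERGENCY = auto()  # sirens/lights detected, or an official zone broadcast

def select_mode(siren_detected, lights_detected, zone_broadcast):
    # Any single cue is enough to latch emergency mode; the design bias
    # is to over-trigger and yield rather than risk blocking an ambulance.
    if siren_detected or lights_detected or zone_broadcast:
        return DriveMode.EMERGENCY
    return DriveMode.NORMAL

def plan_priorities(mode):
    # Ordered objectives the planner would pursue, highest priority first.
    if mode is DriveMode.EMERGENCY:
        return [
            "pull over and clear the lane",
            "accept verbal commands from responders",
            "connect a live human agent",
            "abort or reroute the current trip",
        ]
    return ["complete the trip", "ride comfort", "efficiency"]
```

The point of the sketch is the ordering: in emergency mode, finishing the ride drops off the list entirely, and clearing the lane becomes the only thing the vehicle is trying to do.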

I don’t really understand why these types of situations don’t seem to be a bigger priority. We’ve seen other dangerous and seemingly easy-to-avoid mistakes from Waymo AVs before: in one case, a Waymo didn’t understand how to behave around a school bus, and in another, a Waymo hit a kid (who was, luckily, not seriously hurt) in a school zone. Both situations involved very specific circumstances that should be relatively easy to detect: when in a school zone, an extra-cautious mode should be employed, and when around a visually distinctive school bus, the rules that define how to drive around one should take priority.

Maybe something like this is happening, and I just can’t perceive it? Whatever is happening, I think we can safely say it’s not enough. If we are serious about having self-driving vehicles on public roads, standards for behavior during emergencies should be defined, deployed, and understood both by any company wishing to operate AVs in public and by the emergency responders who may have to deal with these vehicles in difficult and dangerous situations.

It’s also worth investigating how AVs could not just get out of the way, but actively help. An organized emergency policy for AVs could include the ability to take control of any AVs in the area to block off roads or help corral other vehicles away from dangerous areas; if these machines are going to exist on public roads, we can at least get them to help when we need them, right?

It does not seem like the Waymo, in this case, contributed to any significant harm or delays, thanks to the quick thinking of both the ambulance driver and the police officer who eventually moved the Waymo out of the way. But that doesn’t mean every time will be as forgiving. This is a real issue that demands serious attention.

I reached out to Waymo for comment, but they declined to give a statement at this point.

Top graphic image: KXAN on YouTube

110 Comments
Lotsofchops (1 month ago):

“I don’t really understand why these types of situations don’t seem to be a bigger priority?”
Profits-first and lack of consequence?

Johnologue (1 month ago):

Driving is a social, predictive, judgement-making process. ROBOT CARS ARE FUNDAMENTALLY STUPID.

Skurdnin (1 month ago):

This is the most American headline I’ve ever seen

Luxrage (1 month ago):

There’s a clip currently making the rounds as well of the police, guns drawn, pointing at a pickup truck they’ve just stopped, while a Waymo trundles by directly in the line of fire.

Roofless (1 month ago):

Fools! This wasn’t a technical glitch – the Waymo was in on the shooting! It’s the beginning of the robot uprising! Quick, smash your phones and run to your bunkers!

Theotherotter (1 month ago):

“We, as a culture and society, should decide the rules, and it’s up to the AV companies to follow those parameters.”

Should be, yes; but this is Texas, where the government thinks it should be up to the AV companies to set the rules.

Howie (1 month ago), replying to Theotherotter:

I was on a site this weekend. Nobody apparently owned the network drops from the ceiling to the scheduler TPs. Nobody takes ownership of anything unless contractually obliged. Texas has no fucks and the AV companies don’t feel obligated. I am shocked. Shocked. Really shocked. Indeed

Space (1 month ago), replying to Theotherotter:

I wish it were just a Texas problem, but AVs are acting like this in lots of places.

Black Peter (1 month ago), replying to Theotherotter:

Not just Texas, I mean is even California making rules?

Theotherotter (1 month ago), replying to Black Peter:

Not just Texas, but more Texas than other places.

Black Peter (1 month ago), replying to Theotherotter:

I don’t know, I mean they literally killed someone here in Arizona.
