
Two Self-Driving Waymo Taxis Get Confused By A Pickup Being Towed Backwards, Crash Into It


The idea of a robotaxi is quite appealing. It’s a car that takes you where you want to go, and neither you, nor anybody else, has to worry about driving. The reality of robotaxis is altogether different. Many of us are concerned about systems that are incapable of dealing with the whole gamut of often-chaotic road conditions. Waymo’s recent escapades certainly don’t help in that regard.

Titled “Voluntary recall of our previous software,” a new entry on the company’s blog by Waymo’s Chief Safety Officer Mauricio Peña explains a recall report the company filed with the National Highway Traffic Safety Administration (NHTSA). The filing was made in response to a hilarious and embarrassing incident on December 11, 2023, involving two of Waymo’s self-driving robotaxis.


According to Waymo, one of its robotaxis was operating in Phoenix when it came across a pickup truck facing backwards on the road. The company alleges the pickup was being “improperly towed” and that “the pickup truck was persistently angled across a center turn lane and a traffic lane.” When the Waymo robotaxi hit the pickup under tow, the tow truck driver didn’t stop, and continued traveling down the road. Mere minutes later, a second Waymo vehicle hit the same pickup truck under tow, at which point the tow truck driver elected to stop. Here’s Waymo’s full description of events:

On December 11, 2023 in Phoenix, a Waymo vehicle made contact with a backwards-facing pickup truck being improperly towed ahead of the Waymo vehicle such that the pickup truck was persistently angled across a center turn lane and a traffic lane. Following contact, the tow truck and towed pickup truck did not pull over or stop traveling, and a few minutes later another Waymo vehicle made contact with the same pickup truck while it was being towed in the same manner. Neither Waymo vehicle was transporting riders at the time, and this unusual scenario resulted in no injuries and minor vehicle damage.

Just imagine, you’re driving your truck with a pickup in tow behind you, and you feel a little something from behind. You look in the mirror and spot a Waymo vehicle, but assume you maybe just imagined the jolt. You get back to driving down the road, only for another Waymo to show up and again hit your consist from behind. You’d start to think these robot taxis were out to get you or something.

As covered by TechCrunch, both crashes caused only minor damage to bumpers and a sensor. The crashes were reported to police the same day, and to the NHTSA on December 15. There were no reported injuries, and neither Waymo vehicle was carrying passengers at the time. Waymo put the problem down to the strange towing configuration, which confused its autonomous vehicle software. The software apparently could not accurately understand or predict the motion of the tow truck or the pickup behind it, which led to the crashes.


Here’s the company’s explanation of why these Waymos crashed into the truck, per the aforementioned blog entry:

Given our commitment to safety, our team went to work immediately to understand what happened. We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle. After developing, rigorously testing, and validating a fix, on December 20, 2023 we began deploying a software update to our fleet to address this issue (more here on how we rapidly and regularly enhance the Waymo Driver’s capabilities through software updates).
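
Waymo’s phrasing hints at a classic failure mode in trajectory prediction: if a model infers a vehicle’s direction of travel from its body orientation, a backwards-towed truck breaks that assumption on every single frame. Here’s a minimal, purely illustrative sketch of the idea; the constant-velocity predictor, function names, and numbers are all invented for this post and are in no way Waymo’s actual code:

```python
import math

def predict_position(x, y, heading_rad, speed_mps, dt_s):
    """Naive constant-velocity prediction that assumes a vehicle
    travels in the direction its body is facing."""
    return (x + speed_mps * dt_s * math.cos(heading_rad),
            y + speed_mps * dt_s * math.sin(heading_rad))

# A pickup towed backwards: its body faces the AV (heading = pi radians),
# but the tow truck is actually dragging it the other way at 10 m/s.
predicted = predict_position(0.0, 0.0, math.pi, 10.0, dt_s=2.0)
actual = (10.0 * 2.0, 0.0)

print(predicted)  # (-20.0, ~0.0) -- where the model thinks the truck will be
print(actual)     # (20.0, 0.0)   -- where the truck actually ends up
# The orientation/velocity mismatch is persistent, so the predictor is wrong
# by the same ~40 m every planning cycle -- exactly the kind of "incorrectly
# predicted the future motion" failure Waymo describes.
```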

[Editor’s Note: The fundamental problem here is one I’ve discussed before: automated vehicles have no idea what they’re actually doing. They’re computers, following sets of instructions and reacting to sensor/camera inputs, but they, of course, lack any sort of consciousness or understanding of what they’re doing. And that sort of general understanding of the world around you is actually quite important to the task of driving, and it’s something that we humans do without even thinking about it. 

Looking at the description of events, it seems that it’s just a case of a truck being towed backwards. No human driver would have been confused by this; anyone capable of operating a car would understand that they were looking at a car being towed, and would understand how that affected the motion of the car. This isn’t because humans were calculating the physics of towed vehicle masses or whatever, it’s because we’ve all seen towed cars before and get how that works. 

We can call them “edge cases,” but really they’re just cases. Things like this happen every single day on the roads, and humans deal with them wonderfully, because we have an understanding of the world and what we’re doing in it. Can AVs be programmed to have some sort of conceptual model of the world that will allow them to make sense of potentially confusing situations like this one, with the backwards-towed truck? I’m not sure. But it’s not a concept we can ignore. – JT]

Waymo implemented a software fix for the problem, rolling it out on December 20 last year. The full fleet had received the update by January 12. “Our ride-hailing service is not and has not been interrupted by this update,” reads Waymo’s blog entry.


Keeping the service running was an interesting decision in the context of the accident. On the one hand, nobody was hurt in the twin incidents, and damage was minor. Plus, if Waymo’s account is accurate, it was an oddball situation that they might not reasonably expect to see again any time soon. At the same time, when two cars crash in the same way just minutes apart, you might consider shutting things down until a fix is out.

Based on conversations with the NHTSA, Waymo decided to file a voluntary recall report over the matter. However, this terminology is somewhat confusing, as Waymo didn’t really recall anything. It simply updated the software on its own vehicles over a period of a few weeks. Instead, the report really serves as a public notification that Waymo made a software change in response to the incident.

Waymo’s autonomous vehicles use cameras, radar, and lidar to understand the world. Even so, when dealing with something unfamiliar or unusual, they can struggle to react appropriately.

It’s true that Waymo hasn’t seen quite as much bad press as Cruise. The GM-backed company has had to contend with one of its autonomous vehicles dragging a stricken pedestrian along the road for 20 feet. But Waymo has faced its own woes. One of its vehicles recently hit a cyclist, prompting an investigation by California regulators. Worse, the company saw a mob go wild and destroy one of its vehicles in Chinatown just a few days ago.

But ultimately, all these robotaxi operations will need to sharpen up their act. Crashes like this one are the sort of thing that even a poor human driver can avoid. An inexperienced driver is generally smart enough not to drive into a pickup truck dangling from a tow hook, nor would they drag a pedestrian along the road after running them over.

The problem for these companies is not the regular driving task. That’s challenging, sure. The real problem is the strange edge cases that humans deal with every day. You can’t expect every road hazard to come with a big flashing orange light or a stop sign, but you have to be able to deal with it anyway.


Waymo has actually published papers on this very topic, including one titled Measuring Surprise In The Wild. It discusses methods for using machine learning models to “detect surprising human behavior in complex, dynamic environments like road traffic.” An ability to appropriately handle novel situations in traffic would be a major boon to any autonomous driving system. The only alternative is for companies like Waymo to imagine every conceivable situation in advance and program in countermeasures for each of them. Obviously, a more general ability to handle surprise is preferable.
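
For flavor, one common way to quantify “surprise” in this spirit is the negative log-likelihood of what actually happened under the model’s predicted distribution. This toy sketch assumes a simple Gaussian prediction; the numbers are invented, and this is not the paper’s actual method:

```python
import math

def surprise_nll(predicted_mean, predicted_std, observed):
    """Surprise as the negative log-likelihood of the observed value
    under a Gaussian prediction: big scores mean 'this was not supposed
    to happen' and could trigger more cautious driving."""
    z = (observed - predicted_mean) / predicted_std
    return 0.5 * z * z + math.log(predicted_std * math.sqrt(2.0 * math.pi))

# Predicted lateral offset of another vehicle vs. what the sensors report:
print(surprise_nll(0.0, 0.5, 0.2))  # ~0.31 -- behaving as expected
print(surprise_nll(0.0, 0.5, 3.0))  # ~18.2 -- highly surprising; slow down
```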

It’s clear from this incident that Waymo isn’t quite there yet; it’s just learned from one more strange situation and stuck that in the training files. Here’s hoping the robotaxis don’t start ganging up on broken BMWs or rusty Jeeps, lest The Autopian staff shortly end up in the firing line.

Image credits: Waymo

 

99 Comments
Hugh Crawford
3 months ago

Hope they don’t cross paths with Pippa Garner

https://www.google.com/search?q=pippa+garner+backwards+car

Attila the Hatchback
3 months ago

Looking at the description of events, it seems that it’s just a case of a truck being towed backwards. No human driver would have been confused by this;

That’s a pretty strong statement.

How about the >13,000 people per year who are killed due to drunk driving? Most of those cases are probably not even unusual circumstances — it’s just a drunk killing someone or themselves with a car.

It’s interesting to report on odd autonomous driving mistakes like this double-hit by Waymo, and I do like reading about it here, but I must disagree with the editorial comment because it implies human driving is a solved problem, while we are actually scratching, wrecking, injuring, and killing people *all the time*.

VanGuy
3 months ago

Re: drunk driving–I think that comes down more to public transit options. I would think that only a tiny percentage of drunk drivers are doing it “for fun” rather than “because I have no other (affordable/quick/etc.) transportation home, and I happen to be drunk.”

As for the “solved problem”–I disagree with your premise, at least partly. We are getting into accidents, but wouldn’t “misjudging speeds and distances” or “running a red light” be a lot more of the causes?

Yes, to Torch’s credit, there’s things like “how bad would hitting that roadkill be for the car if it’s not safe for me to swerve right now?” and “oh, that’s a plastic bag, there’s no danger in driving over that” that are much more trivial for a human to process than computers. But I think a lot of the things humans do get wrong are things computers would be better at.

Attila the Hatchback
3 months ago
Reply to  VanGuy

I think part of what you’re saying is that humans and computers will have very different failure modes. For example, a human could get distracted or just choose to drive at a really unsafe speed in the dark, while a computer can get confused by something it mis-classifies, like a towed vehicle versus a parked vehicle. I agree on this 100%.

Overall we need to hold autonomous vehicles to a much higher standard than human drivers because no one is going to tolerate automated cars killing 20-40K people per year in accidents.

As for drunk driving, it’s not public transit (although that could be a solution). IMO it’s negligent people who don’t care about their fellow humans: if you can’t afford a taxi/lyft/uber then you can’t afford the alcohol.

VanGuy
3 months ago

I think the “higher standard” is already a given, but perhaps to a dangerous degree. We’re commenting on an incident in which no one was even hurt, but which apparently warranted its own article (and I’ll admit, I think the circumstances are curious enough to justify it). But I think there’s a big element of “don’t let ‘perfect’ get in the way of ‘good.’”

In addition, even if this is just preaching to the choir: it’s not autonomous driving that’s difficult; it’s autonomous driving when some cars are driven by computers and others are still driven by humans that’s really difficult.

When all cars are driven by computers, I imagine it’d cut accidents by 95% or more. Plus I salivate at the thought of slot-based intersection management rather than traffic lights.

Alexander Moore
3 months ago

IMO it’s negligent people who don’t care about their fellow humans: if you can’t afford a taxi/lyft/uber then you can’t afford the alcohol.

Sure, but it’s a systemic problem that needs to be solved. Just putting up ‘don’t drink and drive!’ signs and making it illegal hasn’t fixed the problem. There needs to be cheaper/subsidized transport options if we actually want to do anything about drunk driving. I agree that driving somewhere to drink is perhaps silly, but it’s part of American culture, and of course some drunk person would rather risk driving themselves home from a bar than hire a $30 Uber and come back the next day to get their car in another $30 Uber. That’s not something driverless cars are going to fix in the next ten years either.

David Smith
3 months ago

One DWI or DUI and you have to have a breathalyzer ignition interlock, and you have to get a new driver’s license with a scarlet letter on it. If you get caught driving another car without the interlock, you go to jail, get fined, and lose your license.

Add that and people might think $60 for a couple of Ubers is worth it.

I don’t know, it might help.

Alexander Moore
3 months ago
Reply to  David Smith

Yeah, or do like they did when we visited Northern Queensland and pull over every single car on the road and breathalyze them. Obviously that only works in rural places with little traffic, and maybe in America that would step on too many ‘personal liberties’, but I’ll bet you it really cuts down on the rate of drunk driving.

Who Knows
3 months ago

I have no personal interest in autonomous vehicles, and think they still have a lot to improve, but considering that I have my own dashcam footage of someone coming towards me the wrong way in a traffic circle (and many other things), I agree that there are plenty of human drivers that are confused by some pretty basic things.

Along the lines you mention of human issues, I’m still waiting for a news story of an autonomous vehicle having alcohol short circuit its electronics, causing it to drive at 100+ mph through town, launch off a median, and end up lodged in the second story of a building, killing all passengers.

M K
3 months ago

What gives me hope about autonomous vehicles is that as we identify these cases, we’ll fix them. We’ll improve lane marking, we’ll improve signage, we’ll come up with standards for specific situations that need immediate intervention etc…Humans have been driving for over 100 years and we’re still doing the same dumb sh!t like drunk driving, speeding, running red lights, and road raging. We’ve also invented new dumb sh!t like cell phones, non-tactile controls, and ultra fast cars that help to make us even worse drivers. I’m guessing we’ll fix all the issues with autonomous cars way before we fix stupid.

Detroit-Lightning
3 months ago

What problem are these things solving again?

Geoffrey Reuther
3 months ago

Same “problem” as basically any automation since the beginning of automation. Having to pay a human to do something.

Attila the Hatchback
3 months ago

Well, humans fundamentally suck at driving.

The National Highway Traffic Safety Administration has released its latest projections for traffic fatalities in 2022, estimating that 42,795 people died in motor vehicle traffic crashes.

(this is just for the USA)

We shouldn’t be OK with killing an entire small town’s worth of people every year just because we feel like driving with a steering wheel and pedals is a ‘solved problem’

Detroit-Lightning
3 months ago

I’m not convinced that any of these companies gives a shit about safety…

Your point is correct though, humans are trash at driving – and this potentially could help with it. But as much as these companies say it’s for safety reasons…I’m just not buying it (mostly because of their actions thus far).

Luxobarge
3 months ago

Well, humans fundamentally suck at driving.


Yes, but do they suck more at it than computers? That’s the open question here, and there’s a fair bit of evidence that suggests driving a car in traffic is not the sort of logical, algorithmic process computers excel at.

Brian Ash
3 months ago

US drivers suck especially. US fatality rate is 4x that of Germany and yet our highway speeds are more restrictive.

Alexander Moore
3 months ago

Considering the death rate is 12.8 persons/100,000 in the U.S. and 4.54 persons/100,000 in similarly car-centric Australia, something tells me that there’s a lot of deaths that can be mitigated through better transport infrastructure, driver training, and traffic safety enforcement. We shouldn’t sit around and do nothing about it until driverless cars are a thing.

SaabaruDude
3 months ago

I wonder if Waymo’s algorithms are being programmed with similar logic to what I use while driving:

  • Clapped-out G35 weaving through traffic coming up behind me? Stay out of its way.
  • Minivan making a left turn out of a local park after school? They don’t see me.
  • Behind a slightly dented but otherwise new-enough Camry at a stoplight? It’s going to pull away slower than the semi next to me.
  • Parked next to a Merc M-class with local prep school bumper stickers at the grocery store? You’ll have new door dings to go with your fresh bananas.
Kalieaire
3 months ago
Reply to  SaabaruDude

IIRC algorithms alone actually aren’t very useful in terms of self-driving since they use GANs (generative adversarial networks) in machine learning to develop the patterns (models) used by autonomous vehicles/self-driving to create that “intelligence”. The more they drive, the more data they receive, and the more relevant data is stored within a model to produce a positive result, in this case, driving about without hitting stuff, without getting anyone hurt, and getting folks to the destination without any traffic violations.

obv, with any intelligence, there’re limitations. this is where algorithms are helpful, they provide so-called “guard rails” in AI.

https://www.youtube.com/watch?v=gn4nRCC9TwQ

M K
3 months ago
Reply to  SaabaruDude

I’m in the process of teaching my kid how to drive, and I’m finding it incredibly useful to pass on my car stereotypes. My kid thinks I have a sixth sense when I tell her 10 seconds before it happens that a white BMW is going to pass aggressively, cut her off without signaling, and cross 3 lanes and a solid line to get off at the next exit.

SaabaruDude
3 months ago
Reply to  M K

FWIW, my BMW is blue!

M K
3 months ago
Reply to  SaabaruDude

FWIW, my clapped out G35 is also blue!

Cpt. Slow
3 months ago

Reminder to never walk backwards in an intersection where Waymo operates.

Trust Doesn't Rust
3 months ago

I think it’s pretty obvious that autonomous cars are displaying shark-like behavior in that sharks will bump into their prey before attacking.

This is what you get for lighting one of them on fire.

Geoffrey Reuther
3 months ago

Begun, the Robot Uprising has.

–Master Yoda Connor, or something.

Lightning
3 months ago

I wonder if they programmed it to recognize a Zündapp Janus going forwards and backwards with the taillights knocked out.

LuzifersLicht
3 months ago

The idea of a robotaxi is quite appealing.

Surely not to anybody who has ever had to deal with current-gen “AI.”
My phone can’t even consistently do what I ask of it when I say “set a timer for 5 minutes”; like hell am I going to let that thing drive my squishy human butt around at highway speeds.

Geoffrey Reuther
3 months ago
Reply to  LuzifersLicht

I’ve worked in tech of various flavors since 1995. “Normies” I work with are always surprised that I don’t have a home full of automation.

Uh, yeah. Because I’ve worked in tech since 1995. I know that shizz is a bad idea that solves problems only the laziest of lazies need fixed.

Tarragon
3 months ago

Oh yeah, hard agree.

A while back I built IoT home automation devices for a living. How many do I have in my home? One: a removable plug that controls Christmas tree lights and is removed in the off-season.

Remember the S in IoT stands for security.

Geoffrey Reuther
3 months ago
Reply to  Tarragon

Our Christmas tree light controller isn’t even IoT. It’s a straight up mechanical timer.

David Grieco
3 months ago
Reply to  Tarragon

As a person who has spent most of the past 12 months integrating an IIoT solution on embedded devices, I have to say that your “S in IoT…” line is an absolute gem! Thank you!

David Grieco
3 months ago
Reply to  David Grieco

And I also have zero home automation devices.

Tarragon
3 months ago
Reply to  David Grieco

I’ve seen that around enough that I don’t know where it originates. But yeah, it resonates because it’s so true.

LuzifersLicht
3 months ago

Reminds me of that xkcd what-if about the robot apocalypse, where the author provides one of my favourite quotes about robots:

What people don’t appreciate, when they picture Terminator-style automatons striding triumphantly across a mountain of human skulls, is how hard it is to keep your footing on something as unstable as a mountain of human skulls. Most humans probably couldn’t manage it, and they’ve had a lifetime of practice at walking without falling over.

and also this gem:

[T]he robot revolution would end quickly, because the robots would all break down or get stuck against walls. Robots never, ever work right.

And this is a dude who worked in robotics at NASA

Hoonicus
3 months ago

Cruise “missile”?, Wham-o?, OK how about Full Self-Own.

Stig's Cousin
3 months ago

I just wanted to highlight how well the editor’s note summed up the problem with autonomous vehicles sharing roads with human operated vehicles. I would love to hear Waymo respond to that.

Driving a car is a human interaction like any other. Human interactions are complex, and unconscious inferences play a big role in these interactions. We learn to make these inferences from years of observation (i.e. by being passengers in vehicles until we reach driving age) and driving experience. I don’t see how this can be replaced by algorithms, particularly considering that our roads are optimized for vehicles driven by humans. I have yet to see an example where computer algorithms accurately emulate human behavior, and given the complexity of driving, it seems unlikely driving will be the first example of that.

Until there is true artificial intelligence (which I hope never exists), I don’t think it is realistic for human operated vehicles and fully autonomous vehicles to share the same roads.

Studdley
3 months ago
Reply to  Stig's Cousin

What if we put the autonomous vehicles on their own track so they steer automatically? Hmmm, and maybe we could put a bunch on a single track and mechanically link them together so they all move at the same time and there’s no delay when the light turns green.

Shop-Teacher
3 months ago
Reply to  Studdley

That could never work!

Mr. Asa
3 months ago

What I don’t get, and you see it repeatedly, is how the AVs don’t pause what they are doing when they run into a not-an-edge-case. They say “that does not compute” and then go ahead and bang into a truck or a person or a building.

LuzifersLicht
3 months ago
Reply to  Mr. Asa

Right? First rule of traffic (well, of most things in life really) is that if you don’t know what the heck is going on, you take a step back, slow down and analyse the situation.
If you’re driving and something weird is happening in front of you, you slow down in a safe and controlled manner until you’re certain what’s going on. So how did the bloody robot see a car, fail to correctly analyze its vector, and then catch up to it in order to crash into it? Those things do not go together, it’s like the damn things responded to their confusion by going straight for the anomaly.
That’s borderline grossly negligent in my opinion.

Jason Roth
3 months ago
Reply to  LuzifersLicht

It reacted like a wild animal, aggressively approaching the unfamiliar object.

LuzifersLicht
3 months ago
Reply to  Jason Roth

Just the kind of behaviour we need more of on the road. The current amount of humans behaving like animals isn’t enough /s

Geoffrey Reuther
3 months ago
Reply to  LuzifersLicht

but hey, it makes for lots of great YouTube fodder…

Amy Andersen
3 months ago
Reply to  LuzifersLicht

The car probably didn’t even realize something was off; it just saw a pickup truck, predicted its movement (incorrectly, of course, but the car doesn’t realize that yet), picked a route that it thought would be fine, then proceeded to run into the truck when it wound up in a different spot than the car predicted. Humans do this all the time when other drivers behave unpredictably; it would be unlikely to happen to a person in this specific situation only because a human would recognize the truck is being towed rather than operating under its own power and would adjust their expectations accordingly.

LuzifersLicht
3 months ago
Reply to  Amy Andersen

Fair enough, however I still fail to see how the robot didn’t realize it was getting closer to the other car. Even if you incorrectly predict the other car’s movement, surely you’ll notice that it’s getting closer and initiate some sort of evasive and/or braking measures. I’d love to look at a video of what happened, would be interesting to see if and when the Waymo started braking.

Amy Andersen
3 months ago
Reply to  LuzifersLicht

Yeah, the lack of video makes it hard to say for certain what went down. I will note that even something seemingly as simple as “object is getting closer to vehicle, apply brakes” is more complicated than you’d think, as anyone who’s ever had an automatic emergency braking system erroneously activate on them can attest. There has to be some amount of wiggle room in the system to avoid false positives.
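
That wiggle room is often implemented as hysteresis: a tight threshold to start braking and a looser one to stop, so a single noisy range reading can’t rapidly toggle the brakes. A toy sketch with invented thresholds, not any manufacturer’s actual logic:

```python
def should_brake(range_m, closing_speed_mps, already_braking,
                 trigger_ttc_s=1.5, release_ttc_s=3.0):
    """Time-to-collision (TTC) check with hysteresis: start braking below
    a tight TTC threshold, keep braking until a looser one is cleared."""
    if closing_speed_mps <= 0.0:
        return False  # object is holding distance or pulling away
    ttc_s = range_m / closing_speed_mps
    return ttc_s < (release_ttc_s if already_braking else trigger_ttc_s)

# The same reading gives different answers depending on the current state:
print(should_brake(20.0, 10.0, already_braking=False))  # TTC 2.0 s -> False
print(should_brake(20.0, 10.0, already_braking=True))   # TTC 2.0 s -> True
```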

LuzifersLicht
3 months ago
Reply to  Amy Andersen

Right, so basically the car is thinking “that pickup is getting closer, but it’s facing to 7 o’clock so it’s going to miss me.” Fair. I can see that. That’s where I’d love to see if and when the Waymo realized its assumption was wrong and how it tried to correct.

AssMatt
3 months ago

“Crashes like this one are the sort of thing that even a poor human driver can avoid.”
English is a dumb language:
A) …because people without money are terrible drivers
B) aw, that car’s driver is pitiably human
C) that driver is a terrible human
D) that human is a terrible driver

Cyko9
3 months ago

Autonomous vehicles are starting to feel like a “cool tech” that might go away, like 3D TVs or NFTs. It’s the BIG THING in certain circles, but ultimately more trouble than they’re worth. Maybe not; the head-mounted VR thing is still going (Oculus, Vision), but companies are really going to have to work to get widespread buy-in, never mind avoiding death and destruction in the process.

Jakob K's Garage
3 months ago

I love being stupid behind the wheel and saying things like “Wow, that truck is going really fast backwards” when I encounter one of these quite normal cases, or “Look at that boat going 50 knots – on the highway!” if someone is towing their boat in the opposite lane. But I keep my distance so as not to crash into them, so I guess I’m smarter than a robot after all 🙂

Jason Roth
3 months ago

Just incredible how being towed (allegedly) across 2 lanes is somehow supposed to justify or even explain the Waymo’s choice to just run into it. “I’m not sure why that car is there or exactly what it’s doing but, just to be safe, I’m going to drive into it.” Superb programming, $150k salaries all around.

Amy Andersen
3 months ago
Reply to  Jason Roth

The car is not unsure of anything because it’s not thinking like a person. It saw a truck, predicted how it would behave, and then ran into it when that prediction proved incorrect. Humans do this all the time when another vehicle behaves unpredictably, we’re just better at prediction than a computer (and better at recognizing when a situation has become unpredictable in a way that would merit slowing down and/or pulling over).

Grey alien in a beige sedan
3 months ago

Waiting for them to use Nissan Rogue vehicles for these robotaxis. That way, when stuff like this towing incident happens, the headlines will simply write themselves: “Waymo goes Rogue!”

Citrus
3 months ago

The problem with calling stuff like this “edge cases” is that all driving is edge cases.

StillNotATony
3 months ago

“An inexperienced driver is generally smart enough not to drive into a pickup truck dangling from a tow hook, nor would they drag a pedestrian along the road after running them over.”

Ummm, about that second one… the news would suggest otherwise.

Tom T
3 months ago

I’ve been saying for years that the computing power and technology needed to operate a self-driving car properly will be prohibitively expensive and complicated.
Self-driving cars crash into curbs and barriers on well-marked streets on sunny days; they will never work in the real world in real weather with real situations.

Freelivin2713
3 months ago
Reply to  Tom T

Yes, like you said, in real weather, especially where there is snow/ice half the year. Even in Florida they have downpours where you can hardly see anything driving on the freeway. And it just goes on: flooding, tornadoes/high winds, smoke from fires, etc.

Studdley
3 months ago
Reply to  Tom T

Puttem on rails, problem solved

James Carson
3 months ago

I think these things might find bicycles flitting in and out of traffic, up onto sidewalks then back into traffic and blowing lights at will rather more challenging. This human manages to avoid hitting any of them, tempting though it might be.

Canyonero
3 months ago
Reply to  James Carson

I know I’m not going to change your opinion on things, but in many states, by law, bicycles can go through red lights and stop signs as long as the way is clear for them.

James Carson
3 months ago
Reply to  Canyonero

Where I live that is not the case, so your point is not relevant to me. I am not trying or going to get into a bicycles-vs-cars fight with anyone. I rode bicycles and motorcycles for years and have a healthy respect for the damage a car or truck can inflict on the human body. Last year I watched a guy blow a light and get boned by a car. Lucky for him, he bounced and rolled. He stood up and walked over to the curb and sat down. I didn’t hang around to check on him as others were already doing so. The year before, another blew the light and got hit. He had a compound fracture of his shin. My point is, it is the rider/driver who must take responsibility for their actions. Right or wrong, a bike or motorcycle is going to lose in an altercation with a car or truck. The rider might just lose their life, as several of my friends have.

Amy Andersen
3 months ago
Reply to  James Carson

A lot of humans DON’T manage to avoid them, which doesn’t bode well for the cars…

A. Barth
3 months ago

We can call them “edge cases,” but really they’re just cases.

In this particular instance, the purported unpredictability of the backward-facing truck should have been covered by the most basic of cases: “see thing? do not hit thing”.

This would include some level of calculation to leave a gap between the Waymo and the thing, which again is one of the most basic concepts taught in driver education: don’t follow too closely.
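
The driver’s-ed version of that calculation is the two-second rule. A toy sketch of the idea, with made-up numbers rather than anything Waymo actually runs:

```python
def safe_gap_m(speed_mps, follow_time_s=2.0, floor_m=5.0):
    """Two-second rule: keep at least two seconds' worth of travel
    distance to whatever is ahead, with a floor at low speeds."""
    return max(speed_mps * follow_time_s, floor_m)

def too_close(measured_range_m, speed_mps):
    """'See thing? Do not hit thing': back off whenever the measured
    range to an object ahead drops below the safe gap."""
    return measured_range_m < safe_gap_m(speed_mps)

print(too_close(18.0, 13.4))  # ~30 mph wants ~26.8 m of gap -> True, back off
```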

James Carson
3 months ago

The chief safety officer’s conclusion is pure gold. The world didn’t fit our model. The world is wrong!

Nsane In The MembraNe
3 months ago
Reply to  James Carson

The most 2024 answer imaginable. No, it is actually me, the perpetrator of the bullshit, who is the REAL victim here!

Jack Beckman
3 months ago
Reply to  James Carson

“Blind to what you’ll soon become,
The mirror lies
The whole world’s wrong but you” –WigWam

James Carson
3 months ago
Reply to  Jack Beckman

Had to look that up, never heard of them. Not too bad; traces of Queen, Mötley Crüe, and other rockers.

Tarragon
3 months ago

“a backwards-facing pickup truck being improperly towed”

Wow. That’s a ballsy statement. Clearly the tow-truck operator was in the wrong here. The Waymo Driver simply crashed into a vehicle that it thought was coming right at it, a perfectly reasonable operation.

Michael Beranek
3 months ago
Reply to  Tarragon

Note that they can’t say WHY the towing configuration was “improper”.
Can’t trust!

Geoffrey Reuther
3 months ago

Probably a dumb assumption by the Waymo programmers thinking that everything would be towed on a flatbed.

Tarragon
3 months ago

I don’t disagree with you, but I don’t think it matters. They could have assumed that vehicles always move forward and a truck facing me is moving in my direction.

This thing has LIDAR sensors that had to be telling it there was an obstruction. It assumed the truck was going to be clear when it got there and trusted that assumption over the LIDAR saying 5 meters, 4 meters, 3 meters…

[That can’t be, it’s inside the room]

Amy Andersen
3 months ago

They literally said the truck was spanning two lanes as it was towed. Pretty sure that’s not how towing is supposed to work.

Michael Beranek
3 months ago
Reply to  Amy Andersen

If the front wheels were locked in a non-straight condition, and it was on a wheel lift, then yeah it’s going to be skewed as it rolls. Pretty sure that’s how physics works.
That’s the whole point of the criticism here, the AI can’t figure out something so obvious.

Holly Birge
3 months ago
Reply to  Tarragon

I know there are deeper issues here, but hello, the truck being towed was probably rear wheel drive so of course it’s going to be towed facing backward. RWD pickups are very common in Phoenix. My MIL had a RWD F150 there for years.

Data
3 months ago

I think all robo-taxis should be made from the Nissan Altima. People would instinctively give it space.

Nsane In The MembraNe
3 months ago
Reply to  Data

That or a V6 Challenger. Those goddamn things are driven at 11/10ths at all times and are two ton land yachts. They’re the only thing I get out of the way of as fast as a Nissan that’s doing 25 over.

Nsane In The MembraNe
3 months ago

Can we get any and all of this self-driving nonsense off of our public roads, please? The general public did not consent to be beta testers for cocaine-fueled tech bro fever dreams…

Michael Beranek
3 months ago

Actually, you did consent, by electing public officials who passed laws to authorize it.

Nsane In The MembraNe
3 months ago

Don’t look at me, I don’t vote in Arizona

Freelivin2713
3 months ago

The DeLorean has entered the chat…
Also: “Roads? Where we’re going, we don’t need roads!” = high as a kite
I love BTTF, but it fits this joke well

Studdley
3 months ago

That’s why they burn

EmotionalSupportBMW
3 months ago

This is tit-for-tat retaliation for their burnt-out comrade. I, for one, won’t stand for this naked aggression by the computers. We must rise up, pull the plug, and destroy the machines!

My Goat Ate My Homework
3 months ago

Was that YOU in Chinatown the other day?

A Mob Just Vandalized A Waymo Self-Driving Car And Set It On Fire. The Videos Are Nuts – The Autopian

I’m not saying I condone what happened. But man, that must have felt good.

EmotionalSupportBMW
3 months ago

What can I say? Me and the boys had a big night!

My Goat Ate My Homework
3 months ago

Where can I sign up? Do you have a secret handshake?

Freelivin2713
3 months ago

Ask the Van Buren boys…
-George Costanza
Also: self-driving LeBaron owned by Jon Voight anyone?
