Adding Big Cartoony Eyes To Self-Driving Cars Seems Silly But Actually Makes A Lot Of Sense


One of the most under-appreciated problems with self-driving cars isn’t really a technological problem at all; it’s a social and communication problem. The issue is that we, as humans (or possibly bonobos), rely far more than we realize on unspoken communication with other drivers, pedestrians, cyclists, even animals. This communication can take the form of body language, motion, behavior, or, perhaps most powerfully, direct eye contact. Our eyes telegraph a hell of a lot about our focus, attention, and intentions. Cars, even automated ones, lack human-style eyes, which makes this sort of communication impossible. One team in Japan has a simple solution: give the cars eyes. Duh!


[Editor’s Note: Jason addressed issues like this, how human behavior and culture are important factors worth considering, in his book, Robot, Take the Wheel, so maybe you should buy a copy, already? – MH]

Yes, eyes. Give them to the cars. Ideally, big googly eyes, and, more importantly, they should be positioned in the correct place: where the headlights are, which is very much not the windshield. The team of University of Tokyo researchers called their project Gazing Car; it involved fitting a golf-cart-sized car with large, robotic eyes that conveyed where the car’s attention was focused and the car’s intended general path.

The team acknowledged that equipment like turn indicators does some of this sort of work, visually conveying the car or driver’s intent, but the eyes seem to have the potential to convey even more information.

Here, they made a video demonstrating the project:

There’s a lot of interesting stuff in there; in addition to the real-world tests, they also conducted virtual reality tests, and in those found a 64% reduction in unsafe street crossings when the cars had eyes compared to when they did not.

Here’s the study’s abstract:

Various car manufacturers and researchers have explored the idea of adding eyes to a car as an additional communication modality. A previous work showed that autonomous vehicles’ (AVs) eyes help pedestrians make faster street-crossing decisions. In this work, we examine a more critical question: “Can eyes reduce traffic accidents?” In order to answer the question, we consider a critical street-crossing situation where a pedestrian is in a hurry to cross a street. If the car is not looking at the pedestrian, it implies that the car is not recognizing the pedestrian. Thus, the pedestrian can judge that they should not cross the street, avoiding potential traffic accidents. We conducted an empirical study using 360-degree video shooting of a real car with robotic eyes. The results showed that the eyes can reduce potential traffic accidents and the gaze directions can increase pedestrians’ subjective feelings of safety and danger. In addition, the results showed gender differences regarding critical and non-critical scenarios in AV-to-pedestrian interaction.

That last sentence there is especially interesting: gender differences? The researchers did find differences in how men and women reacted to the eyes, which I wouldn’t have expected.

For example, they found the eyes reduced accidents for men, while they improved efficiency for women:


The team’s work is extensive, expanding beyond basic street-crossing scenarios to more advanced, “critical” ones, including a hurried street crosser and the car both willing and unwilling to yield to the pedestrian:

There’s a lot of interesting information in these papers, focused on a part of AV development that tends to get overshadowed by the more dramatic challenges of actual driving, object avoidance, and the other parts of the automated driving problem.
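The pedestrian-side reasoning the study tests boils down to a small decision rule: no gaze means no recognition, so stay put. Here’s a minimal sketch of that rule; the function and parameter names are my own for illustration, not from the study:

```python
def should_cross(eyes_on_pedestrian: bool, car_is_yielding: bool) -> bool:
    """Pedestrian's decision rule, per the study's reasoning: a car that
    isn't looking at you hasn't recognized you, so don't step out."""
    if not eyes_on_pedestrian:
        # No gaze means no recognition: stay on the curb.
        return False
    # Eye contact made: cross only if the car is actually yielding.
    return car_is_yielding

# The risky case the study targets: a hurried pedestrian facing a car
# that is not looking at them. With eyes, the danger becomes visible.
print(should_cross(eyes_on_pedestrian=False, car_is_yielding=True))  # False
print(should_cross(eyes_on_pedestrian=True, car_is_yielding=True))   # True
```

The point of the eyes is precisely to make the first input observable from the sidewalk; turn signals and brake lights only hint at the second.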


We communicate almost constantly as we drive around other beings and people in other vehicles, whether we realize it or not. Part of making a truly automated vehicle will be to automate those innate communication processes we take for granted. Automated vehicles need to operate in the world of human culture and communication, and if that means cars will end up being designed with big, hilarious eyes, then, well, so be it.

It’ll be fun to watch the car designers go nuts with this, anyway.





27 Responses

  1. I do not think making self-driving cars into emojis will matter much, especially once self-driving becomes the norm rather than a novelty. Hopefully self-driving things will not drive with faulty indicators or lack the will to lift a hand from a soda or a cell phone to use them. In addition, hopefully self-driving things will have stop sensors that work when some idiot inattentive driver feels entitled enough to ignore the rules of the road and do something illegal. Definitely hopefully all the cameras and sensors will be admissible as evidence against said moronic drivers and thus force them to walk or buy one of them fancy newfangled self-driving cars.

  2. What about human voices for the cars? Hopefully in a stereotypical New York accent:
    “Hey, I’m drivin’ heah!”
    “Look out ya dumbass!”
    “Git yer ass back on that curb! You don’t want none a me! Yeah, that’s what I thought. Run away ya little bitch!”

  3. This is so good.

    If the guv’ment can mandate back-up cameras on all new cars, why couldn’t they mandate googly eyes too? I say bring it on.

    Cars should absolutely be more communicative. Horns, turn signals, flashers, brake lights, and high beams. That’s all we’ve got, and that’s sad.

  4. It’s a funny idea but, like pretty much all research on external vehicle HMI, it focuses on extremely simple scenarios. I mean, where would the car look if there are pedestrians allowed to cross from both directions? Zig-zagging eye movements? Rolling? And that’s not even a complex scenario…

  5. It’s not gonna happen, but it pretty evidently would make AVs safer for pedestrians. I wasn’t thinking of it in terms of the car making eye contact with pedestrians in order to tell them it’s seen them, though. I was thinking more in terms of body language.

    Cars do have body language, and drivers telegraph their intent without realizing. Someone who’s getting ready to pull away from a stop will often inch forward a little before they actually get going. If they’re about to make a right turn, you can usually tell by the way the car is positioned even if the driver isn’t using their turn signals. A car whose driver is on their phone will often be slow, and wander in its lane. There are lots of little cues that people give off, without even noticing.

    Do autonomous vehicles have the same body language? I don’t know why they would. An AI doesn’t unconsciously turn its “thoughts” into small movements while it contemplates the action it’s about to take. It doesn’t necessarily set itself up for its next move in the same way as a human, either. It just flips from one state to the next, without warning.

    Mounting eyes on AVs would give them the ability to telegraph what they’re about to do. The car could look in the direction it’s about to go, and its eyes could go from half-closed to fully open just before it starts to move. Eyes are very expressive, and could be used to help signal intentions. Again, it’s not going to happen because car designers are Serious People and car buyers hate whimsy, but I bet it would work.

    1. I remember when I was first learning to drive, my instructor would point at cars on a roundabout and tell me what exit they were going to use. At first I didn’t understand, but after a while I picked it up: just from where a car is on the road and how it’s moving, you can get a surprisingly good idea of what its driver is going to do.
      Sometimes, if you’re paying enough attention, you can tell what someone is going to do before they’ve even made a conscious decision themselves.

      1. Yup, it’s called paying attention.
        I’ve avoided several accidents by somehow knowing the car next to me was about to change lanes into mine and reacting accordingly before they started moving over.
        I’m a truck driver so I’m probably hyper-aware of blind spots, including when I’m in someone else’s.

        Cars do have some semblance of body language; though it’s different from ours, it can be learned.

        Anybody who spends time around other animals gets to understand a somewhat alien mode of communication in a similar manner.
        I can tell you exactly what a dog is about to do before it acts, without using good ol’ fashioned eye communication.
        Maybe we should put tails on cars too?

  6. This is an interesting point. I am not sure giant eyes are the solution but for pedestrian safety around autonomous vehicles there likely does need to be some communication to those who have to move around them.

    Brake lights on the front too might be a potential solution. (they would need to be a different color, green maybe?)

      good call. since the 90s, i’ve been saying front brake lights, or some way of communicating braking at the front, make a lot of sense. now that vehicles are a metric ton heavier, communicating to others is more desirable than ever!

      1. What about a U-turn light (front and back)? I’ve long thought that would be a good idea, since there’s currently no way to communicate to the drivers around you that you intend to “make a yooie”. There’s not even an appropriate hand gesture for it, although you sometimes get hand gestures in response to a U-turn.
