How should autonomous vehicles (AVs) be programmed to interact with pedestrians? What about non-intelligent vehicles such as other cars and bicycles? While AV technologies themselves—intelligent computing and AI, LIDAR sensors and so on—receive a great deal of government and popular attention, AV technology domestication is an overlooked topic that will influence how fast the industry can develop.

As industry practitioners in China put autonomous vehicles and delivery robots on crowded streets, they’re recognizing something surprising: robots programmed to avoid people get bullied. This means that programming AVs based on existing behaviors may not be enough.

Human behavior changes in the presence of new technologies: we slow down at a speed hump, smile in front of a camera, and cross the road when the pedestrian light is green. Science and technology studies scholars call this technology domestication: how do people consume, modify, reconfigure, and resist technologies? Technology domestication is user experience writ large.

As a post-doctoral fellow at the Hong Kong University of Science and Technology, I ethnographically explored this topic in 2019. Over six months, I met and interviewed dozens of people across Beijing, Guangzhou, Hong Kong, Shanghai, and Shenzhen.

My interlocutors worked inside AV and traditional automobile companies as data scientists, engineers, marketers, and strategy advisers, as well as along the supply chain making sensors and other components. I also spoke with industry analysts, journalists, and lawyers.

I did not expect technology domestication to be so top of mind among my interlocutors, but it was. People involved in actually making AVs, as well as those responsible for getting them onto China’s roads, were especially perceptive. In fact, while other countries have been focused on developing AVs that “fit in” with existing behaviors, Chinese researchers are approaching the topic differently.

Yi Zeng, a prominent computer scientist and Director of the Research Center for Artificial Intelligence Ethics and Safety in Beijing, encourages Chinese AI companies to better understand how people will ultimately use and interact with the products and platforms they are creating.

Fitting in

You might think it is the AV’s job to fit in with people. A team of social scientists working on AV behavior at Nissan thought just that. They worked hard to develop AVs with “socially acceptable behavior,” defined as behavior that takes into account existing social and cultural practices related to mobility and human-automobile interaction.

Perhaps because of the established road rules and ingrained behaviors around Silicon Valley, where the team was based, Nissan focused on teaching AVs to simulate how a typical driver navigates a vehicle in the presence of others.

Take a pedestrian crossing that does not have traffic lights (i.e., a zebra crossing): even with clear rules of engagement, it is common for drivers to momentarily make eye contact—maybe also using subtle facial gestures—to signal the pedestrian can cross safely. In fact, the pedestrian may let the driver pass first for various reasons; such is the complexity of this seemingly simple human-automobile interaction.

The team proposed that Nissan’s AVs be equipped with a device that functions analogously to such cultural signaling, alerting pedestrians with a caption that lights up and flashes “I have seen you, you may cross safely.” The solution, in other words, programs AVs to behave according to existing social and cultural norms.

Standing up to a bully

My interlocutors in China thought differently. After seeing how people treat AVs during testing, they became convinced fitting-in was not enough. Yan Li (a pseudonym), a deep learning (shendu xuexi) engineer at a large Chinese automobile conglomerate developing their own line of AVs, put it pithily when we met in Guangzhou: “Humans bully AVs.”

Also using zebra crossings as an example, Yan Li explained that time and time again during testing, pedestrians crossed the road and paid scant attention to the AV. “They were so confident the AV would stop they completely tuned out.”

This worried Yan Li because outside testing areas, in real-world conditions with heavier traffic and more pedestrians, the AV will simply get stuck. A flashing light alerting the pedestrian they can cross safely is useless. Yan Li explained, “Chinese just won’t let the AV pass. They’ll keep crossing and even loiter, because they know an AV will not harm a human. We need to break the deadlock. But how? That’s our conundrum.”
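
To make the deadlock concrete, here is a toy sketch in Python. It is entirely illustrative; the policy names and the patience threshold are my own assumptions, not Yan Li’s actual control logic. A pure “fitting in” policy that always yields never gets a turn at a busy crossing, while a hypothetical bounded-assertive variant eventually announces itself and creeps forward:

    def yield_only(pedestrians_present: bool, ticks_waited: int) -> str:
        # "Fitting in": wait whenever anyone occupies the crossing.
        return "WAIT" if pedestrians_present else "PROCEED"

    def bounded_assertive(pedestrians_present: bool, ticks_waited: int,
                          patience: int = 30) -> str:
        # Hypothetical tweak: after `patience` ticks, announce intent and
        # creep forward at walking pace instead of waiting indefinitely.
        if not pedestrians_present:
            return "PROCEED"
        return "CREEP_WITH_WARNING" if ticks_waited >= patience else "WAIT"

    # Simulate a busy crossing where a new pedestrian steps out every tick.
    for policy in (yield_only, bounded_assertive):
        for tick in range(100):
            action = policy(pedestrians_present=True, ticks_waited=tick)
            if action != "WAIT":
                print(f"{policy.__name__}: {action} after {tick} ticks")
                break
        else:
            print(f"{policy.__name__}: still waiting after 100 ticks")

Even in this toy form, the hard choices are visible: how long should the patience window be, and what should “creeping with a warning” actually look like on a real street?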

Yan Li treated the issue matter-of-factly at first: just another algorithmic oddity that needed fixing. Over time, however, she came to see the deep social and cultural realities at the heart of the issue. Part of the challenge, she explained, is the sheer variety of users and behaviors on Chinese roads compared to western environments. “What do you do when a chicken crosses the road?” Yan Li asked earnestly. “This is not as uncommon as you think. Chickens move differently to a cat or a dog. We need to think about all these things.”

It’s a fair point; how many chickens roam the roads of Silicon Valley?

Sammy Wang (also a pseudonym), a senior executive at a large Chinese e-commerce company, had similar experiences and concerns. Sammy is part of a team developing autonomous delivery solutions in which an AV travels to the entrance of a residential compound and then dispatches smaller autonomous units to complete delivery to the customer’s door (these smaller units are even capable of riding up and down an elevator). Sammy explained:

During our testing, we had lots of problems. The biggest one was that people would not let the small unit pass them and often ignored it; they just stood there talking or whatever. Even if they noticed it, they didn’t let it pass. It’s a big headache for us.

Engineers were just beginning to recognize this challenge when I spoke to them.

Both Yan Li and Sammy were trying to figure out a way forward: how can AVs operate in such an environment without harming others? Some people I met believe Yan Li’s and Sammy’s concerns are irrelevant, even in China’s unique environment, because AVs will in any case be deployed in highly controlled environments as part of China’s smart-city development agenda.

Yet when AVs eventually share space with pedestrians and non-intelligent vehicles, as seems likely, an AV that asserts itself rather than yields may be necessary. How assertive should it be, and what will an assertive AV actually look like?

Will it edge forward and nudge people with a soft, harmless bumper bar? Will it announce “I am proceeding slowly, please disperse” and move forward? Right now, we just don’t know; it all depends on how people (mis)behave in the future. Yan Li summarized the challenge nicely: “How can I program an AV to be assertive, yet not endanger others?”

While we’re not yet sure how AV technology domestication will play out, at least now we’re asking the right questions.

Sacha Cody is a business consultant and China Studies scholar. He lived in China for 15 years and currently resides in Melbourne, Australia. He can be found on the web at http://sachacody.info.