Despite calls to regulate artificial intelligence on the battlefield, China’s tech sector risks becoming complicit in the development of lethal autonomous weapons, as several companies have shown a keen interest in collaborating with the country’s public security organs.
Sensetime, the world’s most valuable AI startup, and facial recognition firm Yitu were cited by Dutch anti-war non-governmental organization PAX over concerns that their technology could be used for developing “killer robots” that could choose and engage targets without human intervention.
While Sensetime and Yitu’s products are currently not employed on the battlefield, the nature of those products as well as the companies’ history of working with China’s government is worrying, PAX says. In a recent report, the NGO referred to the two firms as being of “high concern.”
Sensetime and Yitu were not immediately available for comment.
PAX’s report ranks the possible complicity of tech companies according to the technologies they develop, past collaboration with law enforcement or the military, and whether they have pledged not to aid in the development of killer robots.
The report also mentions other Chinese tech firms, including Alibaba, Baidu, and Tencent, though PAX classifies these companies as less of a concern.
PAX’s report comes amid increasing calls for caution over what has been dubbed the third revolution in warfare, after gunpowder and nuclear weapons. Around 30 countries currently support a ban on killer robots, and prominent figures from the research and tech communities, including Tesla’s Elon Musk, who spoke at a government-led AI conference in Shanghai this week, have warned of the dangers they present.
Currently, seven nations are developing lethal autonomous weapons, including the US and China. The projects under development include autonomous drones, as well as AI-equipped tanks and fighter jets, whose autonomy has raised alarm bells.
“Killer robots would be unable to apply either compassion or nuanced legal and ethical judgment to decisions to use lethal force,” Human Rights Watch said of the technology earlier this month.
In the US, tech companies including Google and Palantir have taken on government contracts, with applications ranging from analyzing drone footage to documenting immigrants. The same is true in China, where the private sector has filled government tenders to provide technology in a bid to ensure social stability.
PAX’s report raises questions over possible tech sector involvement in the race for the next generation of military technology, in which lucrative government contracts could provide significant incentives. Meanwhile, China holds an ambiguous stance toward autonomous weapons, supporting a ban on the use of these arms while simultaneously pushing for the prohibition to exclude their development.
“It’s very clear that the Chinese military is very actively engaged in pursuing a number of applications of AI,” says Elsa Kania, an adjunct senior fellow who studies the modernization of China’s military at the Center for a New American Security, a Washington, DC-based think tank.
The future of combat
“In future battlegrounds, there will be no people fighting,” said Zeng Yi, a senior executive at Norinco, one of China’s biggest defense companies, at the Xiangshan Forum in Beijing last year.
The Xiangshan Forum is a big deal. With its focus on security in the Asia-Pacific region, it is to the Shangri-La Dialogue what the Boao Forum is to Davos. And Norinco is a key player in China’s defense industry; its products are used both domestically and internationally, including in the Middle East.
Zeng went on to predict that by 2025, autonomous weapons would be ubiquitous on the world’s battlegrounds, given the use of AI. “We are sure about the direction and that this is the future,” he added.
This kind of thinking has critics concerned. Much like the sprint to produce nuclear weapons during the Cold War, a push to develop autonomous weapons could lead to what PAX calls an “AI arms race,” in which various states compete to develop these weapons. Unlike nuclear arms, which act as a deterrent, autonomous weapons could make nations increasingly trigger-happy, as “you don’t have to put troops on the ground,” observers say.
As with most sectors earmarked for development, the government has put its might behind modernizing the military, creating an attractive proposition for tech startups. Daan Kayser, PAX’s project leader on autonomous weapons, told TechNode in a phone interview, “For Chinese companies, these could be quite lucrative projects, so there are economic reasons for getting involved.”
Financial incentives are evident in China’s surveillance sector, where companies like Sensetime, Yitu, and rival Megvii—which this week announced plans for a Hong Kong listing—have seen their profits swell on the back of government contracts.
While financial figures aren’t available for Sensetime and Yitu, documents filed with the Hong Kong Stock Exchange show that Megvii’s revenue reached almost RMB 1 billion ($133 million) in the first half of 2019, which the company attributes, in part, to government spending. The AI firm’s revenue in the first six months of this year was three times its sales for the whole of 2017.
Sensetime, Yitu, Megvii, and Cloudwalk—also mentioned in PAX’s report—have all developed AI monitoring systems that help China’s police force keep tabs on its citizens by analyzing video and flagging persons of interest.
For example, Sensetime’s SenseTotem and SenseFace systems are currently being used by various police departments around China for this purpose. Meanwhile, Yitu’s tech is being used by public security organs in 20 provinces throughout the country.
“The government creates lucrative business opportunities by including these companies in its digital agenda. The companies, in turn, help secure political stability,” Sebastian Heilmann, the founding president of the Mercator Institute for China Studies, wrote in a blog post.
China’s government has also launched several state-driven investment initiatives focusing on private sector-military partnerships. As of the middle of this year, these funds had reached tens of billions of yuan.
Incentives to provide tech for killer robots could extend beyond monetary gain, as the Chinese government aims to promote an atmosphere of “civil-military fusion.” China’s army is looking to develop closer ties with the country’s private sector and research institutions.
China sees a need for these partnerships to drive a defense industry that has traditionally been viewed as unimaginative and a military that hasn’t been able to leverage commercial sector innovation. The enterprise is being overseen at the highest level, with Chinese president Xi Jinping leading the charge.
“Whenever there is a national initiative, there is pressure on companies to engage,” Kania said. She added that the military’s drive to forge close ties with civil society creates “more programs, and avenues, and opportunities” for businesses to work with the armed forces.
Despite rising pressure, companies are not being coerced into these sorts of partnerships. The characterization that the Chinese military has direct access to technology in the commercial sector is not accurate. Some tech companies have articulated interest in this type of work, while others have not—at least based on public information, Kania says.
Nevertheless, there is “absolutely a connection” between security and defense applications, she added, meaning that it wouldn’t be a stretch for companies that are involved in one to explore the other.
AI is central to developing autonomous weapons, and China is betting big on the technology. The country is catching up with the US and has overtaken the European Union in its capabilities, according to the Center for Data Innovation, a US-based think tank. The State Council, China’s cabinet, has announced plans to become a world leader in AI by 2030.
Meanwhile, the country’s Made in China 2025 initiative, which the country’s leaders have touted as the strategy for moving China up the industrial value chain, prioritizes the development of the robotics, aerospace, and information technology industries. All of these sectors develop dual-use, military-civil technologies.
“The Chinese military believes that there is a revolution in military affairs underway in which AI could be critical to future military power,” Kania said.
These systems could be used in applications ranging from cyberdefense to creating weapons with ever-increasing levels of autonomy. Machine recognition, in particular, could prove to be extremely valuable in developing highly autonomous weapons, allowing these arms to not only “see” the world around them, but also to understand it and make decisions based on what they perceive.
The danger, according to Kayser, is that intelligent weapons could make decisions at a speed that is out of the realm of human capability. “If you can make decisions faster than your enemy, you’ll be able to beat them,” he said.
In June 2018, US search giant Google announced it would not renew a Pentagon contract to analyze drone video footage. Dubbed “Project Maven,” its aim had been simple: to use machine learning to improve the accuracy of drone strikes.
The tie-up came to an abrupt end. Thousands of Google’s employees signed a petition imploring the company to abandon the project, while a number of others resigned in protest.
“We believe that Google should not be in the business of war,” the open letter to Google CEO Sundar Pichai began.
Similarly, the data analysis firm Palantir, founded by Facebook board member Peter Thiel, recently found itself at the center of controversy for its contracts with the US Immigration and Customs Enforcement to gather information about undocumented immigrants.
Similar opposition to tech companies working with the government has largely been absent in China.
“To my knowledge, there has not been any Chinese tech company that has been working with the Ministry of Public Security or the military where there has been any articulation of resistance to that engagement,” Kania said.
Regardless, countries working on these sorts of weapons are unlikely to stop as a result of public outcry. If even one nation pursues autonomous weapons, others will likely follow suit.
“These countries are looking at each other. The main rationale for exploring these sorts of technologies seems to be: ‘Our adversaries are also doing this,’” said Kayser.