Alibaba’s Software Can Find Uighur Faces, It Told China Clients
As the Chinese government hunted and persecuted members of predominantly Muslim minority groups, the tech giant Alibaba showed its corporate customers how they could play a part.
Alibaba’s cloud computing division demonstrated on its website how customers could use its software to detect the faces of Uighurs and other ethnic minorities in images and videos, according to pages on the site discovered by the surveillance industry publication IPVM and shared with The New York Times. The feature was built into Alibaba software that lets web platforms monitor digital content for material related to terrorism, pornography and other red-flag categories.
The discovery could draw one of the world’s most valuable internet companies into the storm of international condemnation over China’s treatment of its Muslim minorities.
The Chinese government has detained hundreds of thousands of Uighurs and others in indoctrination camps as part of what it calls an antiterrorism campaign. It has also deployed a vast surveillance dragnet that uses facial recognition and genetic testing to monitor them. The United States government has denounced the campaign and penalized Chinese companies believed to be involved.
It could not be determined whether or how Alibaba’s customers had used the minority-detection tool. But the potential for troubling uses is high. A social media platform could, for example, automatically flag videos for additional review, or even alert the authorities, if they contain faces the software identifies as Uighur.
After The Times asked Alibaba about the tool this week, the company updated its website to remove the references to Uighur and minority faces.
“The ethnicity mention refers to a feature / function that was used in a test environment during an investigation of our technical capabilities,” said an Alibaba Cloud representative in a written statement. “It was never used outside of the test environment.”
The company declined to say more about its testing or explain why information about the feature was included in its software’s official documentation. It also declined to comment on why it had tested tools to recognize ethnic minority faces.
Alibaba is a Chinese corporate giant with global reach. It is perhaps the closest thing to a global peer of Amazon: a digital commerce titan that has branched out into logistics, groceries, brick-and-mortar retail and cloud computing. Alibaba’s shares trade on the New York Stock Exchange and are held by major international investors. Global brands like Nike, Starbucks and Ralph Lauren use its platforms to sell to Chinese shoppers. Alibaba is the official cloud services partner of the Olympic Games.
The Trump administration, however, has viewed Chinese tech companies with growing suspicion, especially those implicated in human rights abuses in Xinjiang, the region in western China that is home to many Uighurs.
Last year, the government blacklisted 28 Chinese entities, including makers of surveillance equipment and artificial intelligence start-ups, citing concerns about their role in the crackdown. Last month, the White House barred Americans from investing in a list of companies with ties to the Chinese military, a step toward cutting off Chinese companies’ access to American capital markets.
Chinese officials have defended the campaign in Xinjiang as a nonviolent way of fighting extremism. They have pointed to racial tensions in the United States to deflect criticism from American officials.
Surveillance technology has been vital to China’s efforts. The vast majority of the country’s population is Han Chinese. But people of other ethnicities can look distinct enough from the Han majority that software can more easily single them out.
The Washington Post reported last week that Huawei, another Chinese tech giant, had tested software that could automatically alert the police if its surveillance cameras detected Uighur faces. The Post’s report, which cited a document found on Huawei’s website, prompted a French soccer star, Antoine Griezmann, to cut ties with the company; he had been a brand ambassador for Huawei’s smartphones.
A Huawei spokesman told The Post that the tool was “just a test”.
Facial recognition technology has created ethical quandaries in many places. In the United States, the potential for inaccuracy and bias has led some local governments to bar its use by law enforcement. This year, Amazon placed a one-year moratorium on police use of its facial recognition service to give lawmakers time to consider stronger regulations.
The way the technology has been used in China has raised more serious questions.
As of earlier this week, Alibaba’s website said that tools for detecting the faces of Uighurs and other minorities were part of its “content security” service, which helps Alibaba’s cloud customers flag potentially risky material in the images, videos, text and documents uploaded to their digital platforms.
“As government regulations become stricter by the day, these are tasks that all websites and platforms urgently need to take seriously and manage,” Alibaba’s website said. The company is China’s leading provider of cloud services and a partner to international companies doing business online in China.
According to Alibaba’s website, the content security service can perform “facial recognition of sensitive people” in still images and videos. When presented with an image of a face, the software can look for attributes including whether the person is wearing glasses or smiling, the site’s descriptions say.
Before Alibaba edited those descriptions this week, they said the software could also evaluate two other attributes: whether a person appears to be of “Asian” descent, and whether the person is a “minority,” a term that another page on the site indicated, in brackets, referred to Uighurs.
The company’s online documentation in English for the same software made no mention of ethnic minority detection, a possible indication that the feature was primarily intended for Chinese customers.
Alibaba was not alone in China in promoting tools for automated racial profiling.
Another Chinese cloud provider, Kingsoft Cloud, had described on its website technology that could, among other things, use an image of a face to predict a person’s “race.” According to a page and a document on Kingsoft Cloud’s website discovered by IPVM and shared with The Times, the company’s software could evaluate whether a person’s race was “Uighur” or “non-Uighur.”
After The Times asked Kingsoft Cloud about the software, the company removed those pages from its website. In a written statement, it said the tool in question had never been sold to customers and was not capable of identifying Uighur faces.
The statement said the software had slipped past the company’s internal review processes, and that the company was evaluating those mechanisms to ensure proper oversight.
“Labeling based on a race is inappropriate and contrary to Kingsoft Cloud’s policies and values,” the statement said. “Our products will never attempt to identify and label specific ethnic groups.”
Kingsoft Cloud is listed on the Nasdaq stock exchange.
Aaron Krolik contributed to the reporting, and Lin Qiqing contributed to the research.