31 August 2019

Data Leviathan: China’s Burgeoning Surveillance State

Kenneth Roth and Maya Wang

Classical totalitarianism, in which the state controls all institutions and most aspects of public life, largely died with the Soviet Union, apart from a few holdouts such as North Korea. The Chinese Communist Party retained a state monopoly in the political realm but allowed a significant private economy to flourish. Yet today, in Xinjiang, a region in China’s northwest, a new totalitarianism is emerging—one built not on state ownership of enterprises or property but on the state’s intrusive collection and analysis of information about the people there. Xinjiang shows us what a surveillance state looks like under a government that brooks no dissent and seeks to preclude the ability to fight back. And it demonstrates the power of personal information as a tool of social control.

Xinjiang covers 16 percent of China’s landmass but includes only a tiny fraction of its population—22 million people, roughly 13 million of whom are Uighur and other Turkic Muslims, out of nearly 1.4 billion people in China. Hardly lax about security anywhere in the country, the Chinese government is especially preoccupied with it in Xinjiang, justifying the resulting repression as a fight against the “Three Evils” of “separatism, terrorism, and extremism.”


Yet far from targeting bona fide criminals, Beijing’s actions in Xinjiang have been extraordinarily indiscriminate. As is now generally known, Chinese authorities have detained one million or more Turkic Muslims for political “re-education.” This latest “Strike Hard Campaign” has yielded the world’s largest case of mass arbitrary detention in decades. 

Beijing has tried to pass off the proliferating indoctrination centers as “vocational training” sites. In reality, the purpose is forced assimilation. Turkic Muslims are confined indefinitely until authorities determine that they have sufficiently replaced their religious and ethnic identity—their Islamic beliefs, language, culture, and traditions—with loyalty to the Chinese Communist Party. In some areas, the government considers children with a parent or parents in detention to be “orphans” and holds them in state-run orphanages where they face similar brainwashing.

But the use of mass detention is only part of Xinjiang’s story. What is even more striking is Beijing’s establishment there of a surveillance state, which plays a central role in determining who will be detained. The scope and intrusiveness of this effort may well be unprecedented. If this new form of totalitarianism is not curtailed, it portends a dystopia that other governments can be expected to emulate, threatening us all. 

Even in countries where the legal protection of privacy is more developed than in China, the law often lags far behind the changing technical capacities illustrated in Xinjiang. There is an urgent need to elaborate the right to privacy in concrete regulations that constrain a government’s surveillance powers, whether in China or the rest of the world.

The extraordinary nature of China’s surveillance effort in Xinjiang begins with the vast resources devoted to it. One million government employees are regularly dispatched to stay as “guests” in the homes of Turkic Muslims in Xinjiang, with instructions to report any sign of religiosity or unusual thinking. The authorities have also recruited tens of thousands of new police officers for Xinjiang, set up thousands of new police stations and checkpoints throughout the region, and dramatically increased the public-security budget.

Beijing then uses the latest technology to collect and analyze information gathered about Muslims there. Some Xinjiang checkpoints are equipped with special machines called “data doors” that—unbeknown to the people passing through them—vacuum up identifying information from their mobile phones and other electronic devices. Machine-readable QR codes are engraved on knives and posted on people’s front doors (and officials are equipped with mobile apps to scan them), allowing the authorities to quickly link individuals to their homes and possessions. To track, monitor, and profile Turkic Muslims, agents also rely on artificial intelligence, including facial and number-plate recognition, which have been connected with surveillance cameras that blanket both the region and other parts of the country. In addition, the authorities collect biometric data—including voice samples, iris scans, and DNA—and store them in searchable databases. 

Chinese authorities have had to deploy a new and innovative system to integrate, sort, and analyze this enormous quantity of data. The mobile app that police and other officials use to communicate with the Integrated Joint Operations Platform (IJOP), one of the main policing platforms that Xinjiang authorities deploy, provides insight into this system. 

Based on its aggregated data, the IJOP program flags for officials anyone deemed a potential threat. Some of those suspects are targeted for further investigation, and some for detention and re-education. By “reverse-engineering” this mobile app (that is, examining its source code), our organization, Human Rights Watch (HRW), was able to see the vast array of information it collects. The breadth of that intelligence-gathering helps to explain the bewildering set of questions that Xinjiang residents report being asked by the police.

That information ranges from obvious personal attributes—a person’s blood type or height—to their “religious atmosphere” and political affiliations. It includes whether someone has obtained a new phone number, donated to a mosque, or preached the Qur’an without authorization. The platform incorporates assessments of whether a person might not be “socializing with neighbors” or is “often avoiding using the front door.” If a phone suddenly goes “off-grid,” the system sends an alert to an official nearby to investigate. All of this information is fed into the Integrated Joint Operations Platform’s central system and linked to a person’s national identification card number.

In some cases, investigations require officials to check people’s phones. One Turkic Muslim from Xinjiang told HRW what happened when he was pulled over by police in a traffic stop: “SWAT police officers came and demanded that I give them my phone. I did, and they plugged the phone in.” A few days later, his wife experienced a similar check on her phone while they were stopped at a gas station. The platform considers “suspicious” fifty-one types of software and communications systems, including VPNs, as well as software that permits end-to-end encryption such as WhatsApp, Viber, and Telegram.

The platform predictably devotes special interest to personal relationships: Is a person in question connected to someone who has recently obtained a new phone number? Has that person traveled with someone whom the authorities find problematic? Is the person in contact with anyone abroad?
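
To make the logic of such a system concrete, the sketch below shows in Python how many weak indicators of this kind could be aggregated into a single flag. It is purely illustrative: the field names, weights, and thresholds are our own assumptions for exposition and are not drawn from the IJOP source code.

```python
# Illustrative sketch of rule-based flagging of the kind described above.
# Every field name, weight, and threshold here is hypothetical, chosen only
# to show the general logic of aggregating many weak "indicators."
from dataclasses import dataclass, field

# Hypothetical labels standing in for the "suspicious" software categories.
SUSPICIOUS_APPS = {"vpn_client", "whatsapp", "viber", "telegram"}

@dataclass
class PersonRecord:
    id_number: str                       # the national ID the profile is keyed to
    new_phone_number: bool = False
    donated_to_mosque: bool = False
    preached_without_authorization: bool = False
    socializes_with_neighbors: bool = True
    uses_front_door: bool = True
    phone_off_grid: bool = False
    installed_apps: set = field(default_factory=set)
    contacts_abroad: int = 0

def flag_score(p: PersonRecord) -> int:
    """Sum crude indicator weights; a higher total reads as 'more suspicious.'"""
    score = 0
    score += 2 if p.new_phone_number else 0
    score += 3 if p.donated_to_mosque else 0
    score += 5 if p.preached_without_authorization else 0
    score += 1 if not p.socializes_with_neighbors else 0
    score += 1 if not p.uses_front_door else 0
    score += 2 if p.phone_off_grid else 0
    score += 2 * len(p.installed_apps & SUSPICIOUS_APPS)
    score += min(p.contacts_abroad, 3)
    return score

def decide(p: PersonRecord, threshold: int = 6) -> str:
    """Map a score to an action, mimicking the investigate/detain tiers."""
    s = flag_score(p)
    if s >= threshold + 4:
        return "immediate arrest"
    if s >= threshold:
        return "further investigation"
    return "no action"

if __name__ == "__main__":
    example = PersonRecord(
        id_number="ID-0001",
        donated_to_mosque=True,
        phone_off_grid=True,
        installed_apps={"telegram", "camera"},
        contacts_abroad=2,
    )
    print(decide(example))  # "further investigation" under these made-up weights
```

The point of the sketch is not the particular rules but the architecture: once every indicator is keyed to a single ID number, flagging an entire population becomes a trivial computation.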

The scope of the surveillance can be terrifying to Xinjiang residents, who have no ability to challenge it. “There is a place I go at night that nobody knows. But the app knows. That’s when I got really scared,” said a Uighur Muslim who was familiar with the system from his time in Xinjiang, when shown the reverse-engineered app. Once, he input his friend’s ID card number into the system and was shocked when the app spat out “immediate arrest.”

The platform works with the region’s many checkpoints so that, when people pass through, their movement can be restricted depending on how “trustworthy” the computer system (or its programmers) deems them. Former residents said they have been stopped at checkpoints and taken for police interrogation simply because their relatives were being held in a political re-education camp. The system also stops people who are traveling to a different location from the one where they are registered to live. The effect of all this is to impose a series of digital fences around Xinjiang residents.

Schoolchildren walking beneath surveillance cameras in Akto, Xinjiang, a region in western China inhabited by Uighur and other Turkic peoples, June 4, 2019 (Greg Baker/AFP/Getty Images)

The use of mass surveillance is not limited to Xinjiang. The Chinese police are researching and putting similar mass surveillance systems in operation throughout the country. For example, Human Rights Watch has documented the use of a big-data policing platform called Police Cloud, which collects and integrates people’s personal data—from their supermarket memberships to their health records.

Another system designed to shape social behavior is the “social credit” system that Chinese authorities are developing. Under this system, which the government has begun to put into operation and hopes to roll out more fully by 2020, people are, as official plans put it, “rewarded everywhere” for good social behavior and “restricted everywhere” for bad behavior. Some types of measured behavior might seem relatively innocuous, such as whether a person obeys traffic regulations, pays court fines, or refrains from eating on public transport. But it would take little to add political criteria.

The details of the system vary from one part of the country to another, but its local versions share an attempt to link social reliability to eligibility for desirable social goods. Does one get residency in an attractive city? The ability to send one’s children to a private school? Permission to travel on a plane or high-speed train?
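
One simple way to picture how a behavior-based score could gate access to such goods is sketched below; the listed behaviors, point values, and cutoffs are invented for illustration and do not describe any particular locality’s implementation.

```python
# Illustrative sketch of tying a behavior-based score to eligibility for
# desirable goods. Behaviors, point values, and cutoffs are invented for
# illustration; real "social credit" pilots vary by locality.
BASELINE = 100

PENALTIES = {
    "traffic_violation": -5,
    "unpaid_court_fine": -20,
    "eating_on_public_transport": -2,
}
REWARDS = {
    "volunteer_work": 5,
    "timely_tax_filing": 2,
}

# Hypothetical cutoffs: score below the number and the good is "restricted everywhere."
ELIGIBILITY_CUTOFFS = {
    "residency_in_attractive_city": 90,
    "private_school_enrollment": 95,
    "plane_or_high_speed_rail_ticket": 80,
}

def credit_score(events: list[str]) -> int:
    """Start from the baseline and apply each recorded event's points."""
    score = BASELINE
    for event in events:
        score += PENALTIES.get(event, 0) + REWARDS.get(event, 0)
    return score

def eligibility(events: list[str]) -> dict[str, bool]:
    """Return which goods remain available at the current score."""
    score = credit_score(events)
    return {good: score >= cutoff for good, cutoff in ELIGIBILITY_CUTOFFS.items()}

if __name__ == "__main__":
    history = ["traffic_violation", "unpaid_court_fine", "volunteer_work"]
    print(credit_score(history))   # 100 - 5 - 20 + 5 = 80
    print(eligibility(history))    # rail travel still allowed; residency and school blocked
```

Nothing in such a design distinguishes a traffic fine from a political offense; adding a new entry to the penalty table is all it would take.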

The ingenuity of these social control systems is that, for most people, the desire for such social benefits will be enough to keep them in line, even without the threat of detention. That is all the more true because most people in China, for reasons of self-preservation, already exercise a significant degree of self-censorship. They know to refrain from publicly criticizing the government and to keep their distance from outspoken acquaintances.

Given the human resources needed to build and maintain such elaborate systems of social control, the Chinese government recognizes that it must also monitor and regulate the conduct of the large number of police and bureaucrats who operate the system, particularly because many of the tasks involved are tedious and grueling. The Xinjiang police officer who completes the eleven pages of information requested by the Integrated Joint Operations Platform is engaged in page after page of the most mundane data collection. The app monitors how well officers carry out these tasks, giving them a score that is available to both the officer and his or her supervisors.

Technology also helps to ease any qualms that police officers might have about the consequences of their work. Unlike the executioner or the torturer who knows that what he is doing is wrong, the officer inputting material into the platform is just doing routine police work, albeit with an unusual level of intrusiveness. The resulting evil is the consequence of computer programs, managed by siloed parts of the police state, that determine who is to be arrested. Responsibility is diffuse.

Taken together, these surveillance powers in Xinjiang suggest that the Chinese government is perfecting a system of social control that is both all-encompassing and highly individualized, using a mix of mechanisms to impose varying levels of supervision and constraint on people depending on their perceived threat to the state. John Garnaut, an expert on Chinese politics, has traced the lineage of Communist Party leaders as “engineers of the soul” from Mao Zedong to Xi Jinping. Mao and Xi, he said, shared the belief that humans can be conditioned “in the same way that [the Russian psychologist] Pavlov had learned to condition dogs” by “controlling all incentives and disincentives” in their lives. That is why the Chinese government under Xi—who enjoys greater resources, more advanced technologies, and a stronger bureaucracy than Mao—rarely needs to resort to overt violence.

That is also why, for most people in China, life can seem “normal,” despite the social controls. This illusory effect also works in China’s favor abroad, because many visitors miss how carefully and coercively choreographed its superficial calm is. Yet even in Hong Kong—a city under Chinese sovereignty that still retains some freedoms—many participants in the continuing pro-democracy protests are taking steps to protect themselves, with measures such as turning off location-tracking on their phones, buying old-format subway cards with cash, pointing laser beams at surveillance cameras, wearing face masks, and switching to encrypted communication platforms like Telegram to avoid identification and tracking.

Terrifying as the emerging system of social control is, though, it has its limits. Researchers developing these surveillance systems have bemoaned the difficulty of mining genuinely useful analytics from such huge quantities of data. Among the problems cited are that frontline officers lack the motivation to collect data accurately, or that surveillance systems developed by different companies are not fully compatible. While the ubiquity of surveillance tools, from biometric databases linked to national ID numbers to pervasive surveillance cameras, suggests fearsome capabilities, many of these systems do not yet work as intended.

What can be done to curtail this system? Publicly criticizing it is the first step. Despite their façade of imperviousness, the Chinese authorities have shown themselves to be sensitive to criticism. As media attention to the mass detention of Xinjiang’s Turkic Muslims mounts, the Chinese government has felt the heat; it has organized show-tours for diplomats and journalists as part of an effort to pass off the detention centers as benign. The tours have not been terribly convincing—in one case, inmates were compelled to sing, in English, the children’s song “If you’re happy and you know it, clap your hands”—but the charade gives other governments an excuse not to pick a fight with a powerful economic actor.

In July, twenty-four governments at the United Nations Human Rights Council in Geneva issued—for the first time in such numbers—a statement of concern about China, focusing on the mass detentions in Xinjiang. The statement shows that, despite China’s economic power, these governments will try to hold Beijing to the same standard as they would other abusive governments. China immediately countered by orchestrating its own statement of support, although it had to rely on the likes of North Korea, Venezuela, Saudi Arabia, Cuba, Syria, and Russia. One Chinese official even claimed—though there is no known evidence to support his statement—that in Xinjiang “over 90 percent of the students have returned to society and returned to their families and are living happily.”

However ham-fisted these responses, they show that international criticism has struck a nerve, that the Chinese government cares about its reputation and knows that what it is doing to the Turkic Muslim population of Xinjiang is difficult to defend. Its reaction indicates the importance of continuing to shine a spotlight on the extraordinary system of surveillance and detention that it has erected in Xinjiang.

Beijing also seems to fear the growing efforts by other governments to impose targeted sanctions on companies and individuals that help to build or operate this surveillance system. In February, Thermo Fisher Scientific, a large US-based medical technology manufacturer, announced that it would stop selling human identification technology to the Xinjiang Public Security Bureau. In July, in another sign of defensiveness, China opted to send a lower-ranking official—rather than Xinjiang Party Secretary Chen Quanguo, whom many have suggested should be targeted for sanctions—to defend its repressive policies at the UN Human Rights Council.

An image of President Xi Jinping playing on a video wall next to a minaret in the city of Kashgar, Xinjiang, China, November 8, 2018 (Simina Mistrenau/picture alliance via Getty Images)

Ultimately, however, global protections for privacy need to be strengthened in the face of the state’s new technical capacities for surveillance and analysis. Citizens are starting from a position of weakness. The US government, for example, has long taken a minimalist view of the right to privacy. It maintains that we lose our right to privacy in such matters as the phone numbers that we dial or the addresses to which we send emails because we “share” that information with the phone or internet company—as if, in the modern age, we had any meaningful choice. The US government has backed off that position only when the Supreme Court has compelled it to do so, such as when the court ruled in 2018 that people have an expectation of privacy in data that communications service providers gather about their locations.

International standards, as laid out by the United Nations’ Office of the High Commissioner for Human Rights, make clear that information about our communications, just like the content of a message, is protected by the human right to privacy. This means that a government can gather such data only when doing so is legal under domestic and international law, as well as necessary and proportionate to achieving a legitimate goal. However, not just the United States but many other governments around the world have a long way to go in recognizing this aspect of privacy rights.

Similarly, citizens are used to thinking about privacy as something that exists only behind closed doors, but in fact we expect a degree of privacy even as we go about our day-to-day affairs in public. Government agents could physically follow us, but the time and expense required to do so mean that for most people they never bother. But the dynamics have changed now that most of us carry tracking devices with us everywhere we go—that is, our smartphones—and the government can with relative ease reconstruct our lives by capturing and analyzing that data. The US Supreme Court, as noted, recently recognized that we do have a privacy interest even as we go about our public lives, though this is a relatively new concept that needs to be developed.

In an earlier age, an extensive repository of information such as the one built by Chinese authorities about Muslims in Xinjiang would have been of limited utility because security officials would have had to comb through it manually. That would have allowed them to focus on selected individuals, but any effort at large-scale monitoring would have been overwhelming. Today, however, advances in machine-learning and data analytics enable the detection of “suspicious” patterns of behavior, such as “overuse” of electricity, that might not be apparent even to a trained detective.
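
In principle, spotting an “overuse” pattern like this requires nothing more exotic than standard anomaly detection. The sketch below flags a household whose latest electricity reading deviates sharply from its own history, using a textbook z-score test; it is generic statistics, not a description of the authorities’ actual analytics.

```python
# Generic anomaly-detection sketch: treat "overuse" of electricity as a reading
# that deviates sharply from a household's own recent history (z-score test).
# Textbook logic only; not a description of any actual government system.
from statistics import mean, stdev

def is_anomalous(history_kwh: list[float], latest_kwh: float, z_threshold: float = 3.0) -> bool:
    """Return True if the latest reading sits far outside the historical pattern."""
    if len(history_kwh) < 2:
        return False                      # not enough data to judge
    mu = mean(history_kwh)
    sigma = stdev(history_kwh)
    if sigma == 0:
        return latest_kwh != mu           # flat history: any change stands out
    return abs(latest_kwh - mu) / sigma > z_threshold

if __name__ == "__main__":
    past_months = [210.0, 195.0, 205.0, 220.0, 200.0]
    print(is_anomalous(past_months, 215.0))   # False: within the normal range
    print(is_anomalous(past_months, 640.0))   # True: a sudden spike stands out
```

Applied across millions of meters, phones, and checkpoints, even crude tests of this sort generate the kind of automated leads that no corps of detectives could produce by hand.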

Of course, even were other governments to embrace privacy protections, the Chinese state would be unlikely to join them. And supposing China were to subscribe theoretically to standards on surveillance, as it has for some human rights standards, residents would have no capacity to enforce them. Chinese citizens have no access to an independent judiciary and no meaningful right to petition or protest against governmental misconduct.

All the same, international standards can exert influence even on a government like China’s once a critical mass of other countries shows itself ready to abide by those standards. When the 1997 treaty banning landmines was adopted, for example, a number of major powers, including China, Russia, and the United States, refused to ratify it, but a sufficiently large number of governments did embrace the accord. As a result, landmines became stigmatized as an indiscriminate weapon, and few governments would now admit to using them, whether or not they have ratified the treaty. The same process occurred after the international adoption of treaties banning cluster munitions and the use of child soldiers.

The next step in checking the new surveillance apparatus is for civic groups to organize and press the world’s leading governments to develop and promote privacy protections for the modern technological world. The aim should not be a universally endorsed treaty. That would produce a lowest-common-denominator standard that would sell short our privacy rights. Instead, the objective should be to secure the endorsement of strong standards by a sufficient number of governments to stigmatize certain intrusions on our privacy. The focus should be restraints on the large-scale collection and transfer of personal data that can be used to profile whole populations; regulation of state acquisition and deployment of biodata such as DNA, facial images, or voice samples; meaningful export controls on surveillance technology; and transparency and public audit requirements for machine-learning tools deployed by governments that can affect basic rights.

Such measures, even if widely adopted, would not be a panacea. In many cases, they would not be legally enforceable. But they would make an important contribution to the development of broadly accepted international norms on the limits of surveillance. Those norms, combined with the opprobrium visited upon governments that violate them, are the best practical way we have to push back against the surveillance-state Leviathan that Beijing has built to monitor and control the Turkic Muslims of Xinjiang.
