19 September 2020

TikTok and WeChat

Fergus Ryan, Audrey Fritz & Daria Impiombato

What's the problem?

While most major international social media networks remain banned in the People’s Republic of China (PRC), Chinese social media companies are expanding overseas and building large global audiences. Some of those networks, including WeChat and TikTok, pose challenges, including to freedom of expression, that governments around the world are struggling to deal with.

The Chinese ‘super-app’ WeChat, which is indispensable in China, has approximately 1.2 billion monthly active users1 worldwide, including 100 million installations outside of China.2 The app has become the long arm of the Chinese regime, extending the PRC’s techno-authoritarian reach into the lives of its citizens and non-citizens in the diaspora.3 WeChat users outside of China are increasingly finding themselves trapped in a mobile extension of the Great Firewall of China through which they’re subjected to surveillance, censorship and propaganda. This report also shows how Covid-19 has ushered in an expanded effort to covertly censor and control the public diplomacy communications of foreign governments on WeChat.

Newcomer TikTok, through its rapid growth in both Asian and Western markets, has a vastly larger and broader audience outside China, with nearly 700 million users worldwide as of July 2020.4 This report finds that TikTok engages in censorship on a range of political and social topics, while also demoting and suppressing content. Case studies in this report show how discussions of LGBTQ+ issues, Xinjiang and the ongoing protests in the US, for example, are affected by censorship and by the curation and control of information. Leaked content moderation documents have previously revealed that TikTok has instructed “its moderators to censor videos that mention Tiananmen Square, Tibetan independence, or the banned religious group Falun Gong,” among other censorship rules.5

Both Tencent and ByteDance, the companies that own and operate WeChat and TikTok, respectively, are subject to China’s security, intelligence, counter-espionage and cybersecurity laws. Internal Chinese Communist Party (CCP) committees at both companies are in place to ensure that the party’s political goals are pursued alongside the companies’ commercial goals. ByteDance CEO Zhang Yiming has stated on the record that he will ensure his products serve to promote the CCP’s propaganda agenda.6

While most major international social media platforms have traditionally taken a cautious and public approach to content moderation, TikTok is the first globally popular social media network to take a heavy-handed and opaque approach. Possessing and deploying the capability to covertly control information flows across geographical regions, topics and languages positions TikTok as a powerful political actor with global reach.

What’s the solution?

The global expansion of Chinese social media networks continues to pose unique challenges to policymakers around the world. Thus far, governments have tended to hold major international social media networks and Chinese social media networks to different standards. It’s imperative that states move to a policy position in which all social media and internet companies are held to the same set of standards, regardless of their country of origin or ownership.

This report recommends (on page 50) that governments implement transparent frameworks for user data privacy and data protection that apply to all social media networks. Companies that refuse to comply with such frameworks shouldn’t be allowed to operate. Independent audits of social media algorithms should be conducted, and social media companies should be transparent about the guidelines their human moderators use and about the impact those moderators’ decisions have on their algorithms. Governments should require all social media platforms to investigate and disclose information operations conducted on their platforms by state and non-state actors, including by publicly releasing datasets linked to those campaigns.

Finally, all of these recommended actions would benefit from multilateral collaboration that includes participation from governments, the private sector and civil society actors. For example, independent audits of algorithms could be shared by multiple governments that are seeking the same outcomes of accountability and transparency; governments, social media companies and research institutes could share data on information operations; all stakeholders could share lessons learned on data frameworks.
