19 December 2021

Trollfare: How to Recognize and Fight Off Online Psyops

LARISSA DOROSHENKO and JOSEPHINE LUKITO

European Commission President Ursula von der Leyen and others have correctly diagnosed Belarus’ use of migrants as part of a “hybrid attack” against Europe’s democracies. But most have missed a key component of this and other such attacks: the psychological operations deployed online. The West must get better at detecting and countering them.

This starts by understanding common tactics, including the 5D toolkit: distort, distract, dismiss, deny, and dismay. We have seen these tactics at work during the post-Euromaidan conflict in Donbass, when Russia used fake-news sites to distort public knowledge, used stand-alone casualty figures to distract from events, dismissed concerns about its military presence in the region, and denied involvement in the downing of Malaysia Airlines flight MH17.

We have also seen the 5D kit at work in the recent escalation on the EU-Belarusian border. The migrant flows are distracting the world from Russia’s growing military presence in Belarus and on the Ukrainian border. Putin denies helping to bring migrants to Belarus, despite European journalists’ reporting to the contrary. Russian media distorts the facts, claiming that it is the United States that is preparing to launch a campaign in Donbass. Russian outlets even enlist Western media in creating dismay by suggesting that the conflict in the region could escalate into nuclear war. (And these tactics are not just aimed at Western governments. The manufactured crisis has distracted ordinary Belarusians from COVID-19, Stalin-era repressions, and an upcoming constitutional referendum.) All this has escalated xenophobia in Belarus, fueled anti-Western sentiments in Russia, and exacerbated tensions between all countries in the region.

Russia uses the web and social media to amplify the effects of 5D tactics. News-impersonating sites blur the distinction between fact and fiction. Social-media bots and sockpuppets help Russia build armies of followers who spread divisive messages. It can be difficult to tell a psyop from the normal flow of online discourse. But there has been promising research on using network analyses or text classification to reveal the patterns left in social media by the mass production and amplification of propaganda and disinformation: the coordinated posting of links, the use of @mentions to garner attention, or the strategic sharing and retweeting of messages.
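To make the idea concrete, here is a minimal sketch in Python of one such signal, coordinated link sharing: pairs of accounts that repeatedly post the same link within seconds of each other are unlikely to be acting independently. The input file posts.csv and its account, url, and timestamp columns are hypothetical, and the thresholds are illustrative, not calibrated.

```python
from collections import Counter
from itertools import combinations

import pandas as pd

WINDOW_SECONDS = 60   # how close in time two shares must be to count as coordinated
MIN_CO_SHARES = 5     # how many near-simultaneous co-shares before a pair is flagged

# Hypothetical input: one row per post, with columns "account", "url", "timestamp".
posts = pd.read_csv("posts.csv", parse_dates=["timestamp"])

pair_counts = Counter()
for url, shares in posts.groupby("url"):
    rows = list(shares.sort_values("timestamp").itertuples(index=False))
    for a, b in combinations(rows, 2):
        within_window = (b.timestamp - a.timestamp).total_seconds() <= WINDOW_SECONDS
        if within_window and a.account != b.account:
            pair_counts[tuple(sorted((a.account, b.account)))] += 1

# Report account pairs that co-share links often enough to look suspicious.
for (acct_a, acct_b), n in pair_counts.most_common():
    if n >= MIN_CO_SHARES:
        print(f"{acct_a} <-> {acct_b}: {n} near-simultaneous shares of the same links")
```

Research systems combine many such signals—the retweet, @mention, and posting-schedule patterns mentioned above—and validate them against datasets of known troll accounts; this toy version only illustrates the general approach.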

The West should also track propaganda news machines. U.S.-based social media platforms, for example, have worked to ban accounts run by Russia’s Internet Research Agency. But the organization continues to do its work through outlets such as the news group Patriot, which includes several websites (e.g., riafan.ru, nevnov.ru) that encourage readers to share links. It is therefore necessary for the United States and its allies to track disinformation across platforms, including social media, blogs on hosts such as LiveJournal, messenger apps like Viber and Telegram, and fraudulent websites that masquerade as legitimate media outlets.
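As a simple illustration of what cross-platform tracking can look like, the Python sketch below normalizes the domains of links found in collected messages and tallies how often watchlisted outlets such as riafan.ru or nevnov.ru appear on each platform. The message schema and sample data are hypothetical; real pipelines would also resolve shortened URLs and handle mirror domains.

```python
import re
from collections import Counter
from urllib.parse import urlparse

WATCHLIST = {"riafan.ru", "nevnov.ru"}    # domains named above; extend as needed
URL_PATTERN = re.compile(r"https?://\S+")

def normalized_domain(url: str) -> str:
    """Lower-case the host and strip a leading 'www.' so variants match."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def count_watchlist_links(messages):
    """Tally watchlisted domains per platform across collected messages."""
    counts = Counter()
    for message in messages:
        for url in URL_PATTERN.findall(message["text"]):
            domain = normalized_domain(url)
            if domain in WATCHLIST:
                counts[(message["platform"], domain)] += 1
    return counts

# Made-up messages standing in for data collected from Telegram and LiveJournal.
sample = [
    {"platform": "telegram", "text": "Read more: https://riafan.ru/some-story"},
    {"platform": "livejournal", "text": "see https://www.nevnov.ru/another-story"},
]
for (platform, domain), n in count_watchlist_links(sample).items():
    print(platform, domain, n)
```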

Local and regional experts should be enlisted to help scan for psyops. With a greater understanding of the history, culture, and politics in the area, they may be able to anticipate and detect an increase in disinformation production.

Once the West understands that a psyop campaign is underway, it must push back domestically and abroad. Part of this is public diplomacy campaigns to encourage and promote reputable Western-based news sources in the Eastern European region, such as Radio Free Europe/Radio Liberty, the BBC, or Deutsche Welle. Another part is working with digital media companies and researchers to shut down psyop campaigns, and the users and accounts that spread them, in real time. Still another part is studying these campaigns in depth and using that understanding to develop resilience strategies, build more advanced detection classifiers, and assess the causal effects of psyops.
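On the classifier-building point, the sketch below shows a deliberately minimal baseline in Python with scikit-learn: a TF-IDF bag-of-words model trained to separate posts from known troll accounts from ordinary posts. The file labeled_posts.csv and its text and label columns are hypothetical; more advanced detectors would add timing, network, and account-metadata features on top of text.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical labeled dataset: "text" of each post, "label" 1 for troll, 0 for ordinary.
data = pd.read_csv("labeled_posts.csv")

X_train, X_test, y_train, y_test = train_test_split(
    data["text"], data["label"], test_size=0.2, random_state=42
)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=5),  # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```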
