2 October 2019

We're in the Middle of a Global Information War. Here's What We Need to Do to Win

By Richard Stengel

If the Russians had tried to find a more inhospitable space for our meeting, I don’t know how they could have succeeded. I was led into a narrow trapezoidal room with one grimy window in a faceless building off Red Square. It was 10 days before the 2016 presidential election, and I was the last State Department official to visit Moscow before the vote. I had been Under Secretary of State for Public Diplomacy for almost three years, and a big part of my job had been trying to counter the deluge of Russian disinformation that we saw beginning with the invasion of Crimea in 2014. But I was under strict orders from the National Security Council to not bring up Russian disinformation or interference in the U.S. election. No one wanted any hiccups.

The two Russian officials seemed to be channeling Putin: chilly, inhospitable, inflexible. They made no effort to be pleasant–or even diplomatic. I brought up Russian harassment of American diplomats. They shrugged. I brought up the forced closing of American cultural facilities. They shrugged. I did not bring up Russian interference in our election. I wish I had.

I had come to State after being a journalist for many years and the editor of TIME for the previous seven. My job was to help shape America's image in the world–I thought of myself as the chief marketing officer for brand USA. But then a funny thing happened. Within a few weeks of my being on the job in 2014, the Russians invaded and then annexed Crimea, the southernmost peninsula of Ukraine. It was the largest violation of another nation's sovereignty since Iraq's invasion of Kuwait. And Vladimir Putin lied about it–over and over again.
[Illustration by Lincoln Agnew for TIME]

President Obama and Secretary of State John Kerry condemned this willful act of aggression and called for sanctions against Russia. I shared their outrage, but I couldn’t impose sanctions or call up troops. What I could do was, well, tweet about it. After all, I was the head of all of State’s communications, and I could marshal department messaging against Russia and the invasion. So I decided to tweet on my own, hoping others would follow. Here’s the first: “The unshakable principle guiding events must be that the people of #Ukraine determine their own future.”

As I started tweeting, I noticed something odd. Within the first few minutes, and then for months afterward, I was attacked, often by Russian-sounding Twitter handles. A single tweet would draw dozens, sometimes hundreds of comments, and soon hundreds of tweets were calling me a fascist propagandist, a hypocrite and much, much worse. At the same time, we observed a wave of social media posts in the Russian periphery supporting the Russian line on Ukraine, accusing the West of being the source of instability and claiming Ukraine was a part of Russia. Who knew that the Russians were so good at this? We didn't realize or even suspect it at the time, but this tsunami of Russian propaganda and disinformation became a kind of test run for what they did here in the 2016 election.

In many ways, these were also the first salvos in the global information war we are living in now. Today, we are all actors in a global information war that is ubiquitous, difficult to comprehend and taking place at the speed of light. When I was at the State Department, there were hundreds of thousands of cyberattacks a day. The Pentagon says it thwarts more than a million malware attacks an hour. About 600,000 Facebook accounts are compromised every day. More than 25 million data records are lost or stolen from businesses each day. And all that doesn’t even take into account the rising tide of disinformation, which is impossible to measure.

It is a war without limits and boundaries, and one we still don't know how to fight. Governments, nonstate actors and terrorists are creating their own narratives that have nothing to do with reality. These false narratives undermine our democracy and the ability of free people to make intelligent choices. The disinformationists are aided by the big platform companies, which benefit as much from the sharing of the false as from the true. The bad guys use all the same behavioral and information tools supplied by Facebook, Google and Twitter. Just as Nike buys your information to sell you sneakers, the Russians bought your information to persuade you that America is a mess. Autocrats have learned that the same tools once seen as spreading democracy can also undermine it. Studies show that more than a quarter of Americans recall seeing at least one false story leading up to the 2016 election.

Sometimes experience can be a barrier to discovery. My very ignorance of how things worked at State helped me launch something new. I had looked around the department and didn't see any entity that could push back against all the Russian disinformation and propaganda around Ukraine and Crimea. I called a meeting of the senior leaders of public diplomacy and public affairs. By coincidence, we had a public-affairs officer visiting from our Kiev embassy. He was a big, burly, bearded fellow from the Midwest, and after listening to some of the milquetoast comments, he stood up, banged the table and said, "The Russians have a big engine. They are building a compelling narrative. They repeat the same lies over and over. They don't feel the need to be truthful. We are being outmessaged. We are too timid and reactive."

When he sat down, there was silence. His speech was much more powerful than anything I could have said. But I sought to harness his passion and said, "Let's start a counter-Russian information group here at State." I asked for volunteers. Zero hands went up. Then the public-affairs officer said, "Count Kiev in. We're in all the way." Others then slowly stepped forward. That day we started what became known as the Ukraine Task Force. It was the first entity in the federal government that sought to reckon with Russian disinformation.

By 2015, we realized that a lot of this Russian disinformation emanated from an anonymous building in St. Petersburg that was the home of a shadowy Russian company called the Internet Research Agency (IRA). The IRA was in fact a troll factory. A few hundred young people entered every morning at precisely 8:55 and spent the day doing everything from tweeting about how corrupt the U.S. was to writing Facebook posts on recipes. The enterprise was owned and financed by an oligarch close to Putin. I actually got hold of a manual–from an open source–that was given to each of the people who worked there. The guide instructs them to each create numerous online personas, sometimes called sock puppets, that look and sound like real people.

What we also saw–which was not written about in the Mueller report–is how the "fake news" from the IRA was complemented by traditional Russian media like RT and the newer digital outlet Sputnik. The trolls from the IRA would retweet RT and Sputnik stories. Sputnik and RT would pick up some of the false leads from the IRA. And then the Russian Foreign Ministry would give credence to all of it.

But the truth was we were getting outgunned. We estimated that the IRA alone produced a few thousand pieces of social media content a day. We were collectively producing a few dozen. Plus, they were buying ads on social media, something we didn't do. The Russian initiative was a whole-of-government effort, so even their public statements were echoed and amplified by the IRA. Our little entity changed its name to the Russian Information Group and mainly began doing research and helping our public-affairs offices around the world. But it became the seed for something else. In March 2016, the President signed an Executive Order creating the Global Engagement Center, an interagency group tasked primarily with combating Russian disinformation.

After the election, I wondered whether we should have done more. Did we see all of this at the time? No. I wish I'd been able to connect the dots faster. There was a lot we missed. But I wish we had made a whole-of-government effort to tell voters that the Russians were not "meddling" in our election–they were staging an unprecedented attack against the very foundation of our democracy.

So what should we do now as 2020 approaches?

One thing we know is Russia does not have an “off” switch. Sanctions slow but do not deter it. And as Putin sees a friendly face in the White House, he’s going to continue to probe and interfere. So we must assume the Russians are already preparing for 2020.

While the Russians have pioneered election interference, they are no longer alone. In the intelligence world, countries copy what works, and the FBI has already said China and Iran are getting in on the game. Just this year, Facebook and Twitter have taken down hundreds of accounts and handles affiliated with Iranian influence operations. China has used online influence operations to counter the Hong Kong protesters; it's hard to imagine it won't experiment against the U.S.

I came to see firsthand that government was not the answer to fighting disinformation. It's too big, too slow, too siloed and just too averse to creating content itself. It's also not government's job to censor content. The first five words of the First Amendment are "Congress shall make no law …" But that doesn't mean government has no role. Congress should amend the Communications Decency Act of 1996, particularly Section 230, the provision that declared the platform companies are not publishers and gave them blanket immunity for the content posted on them. This is wrong. They are the largest publishers in the history of the world, and they need to have more accountability for the content they publish.

It’s not a mystery what we should do about disinformation. There’s a broad consensus around the following: 1) Prohibiting foreign governments, organizations and individuals from buying online advertising to influence U.S. elections; 2) Making the source and buyers of all political advertising transparent, and informing users why they were specifically targeted; 3) Providing more privacy for your data, and moving toward the E.U. policy that citizens own their own information; 4) Removing verifiably and provably false information, including deep fakes; 5) Getting media organizations to collectively agree not to use stolen information from hacks or cybermeddling; 6) Requiring campaigns and officials to disclose any contacts with foreign governments or individuals seeking to influence elections; 7) Appointing a senior federal official, even creating a Cabinet office, to deal with disinformation and election interference.

But ultimately the problem is centered less on government or even the platforms than on users; that is, you and me. I've long thought that we don't have a "fake news" problem; we have a media-literacy problem. Millions of people simply cannot tell the difference between a made-up story and a factual one, and don't know how to check. This is a long-term problem with a long-term solution: media literacy needs to be taught in elementary school. People need to learn the provenance of information: what is an accepted fact and what is not; what is a trusted source and what is not. At the same time, the media itself must become radically transparent, publishing the full text of interviews and reporters' research. That alone will begin to make people more literate about the sources of the information they get.

Jefferson said a nation could never be ignorant and free. Governments derive their just power from the consent of the governed. That consent is obtained by the free flow of information. Factual information. That’s still an idea worth fighting for.
