
17 June 2022

How to Avoid Extremism on Social Media

Alexandra Evans is a policy researcher at RAND. Her recent work has focused on the growing threat of online extremism—work that has required long days immersed in violence, racism, misogyny, and hate. It led her and fellow extremism researcher Heather Williams to oversee the creation of a scorecard to help social media users—or parents, or advertisers, or the social media companies themselves—avoid the kind of content the researchers have seen.

That's not as easy as it might sound. Extremist groups have been trolling the internet for decades, and they have learned to temper their words and disguise their intentions. Nazis and hard-right militia members don't always shout their fury at the digital masses. Sometimes, they whisper.

“There's this idea that there's a dark part of the internet, and if you just stay away from websites with a Nazi flag at the top, you can avoid this material,” Evans said. “What we found is that this dark internet, this racist internet, doesn't exist. You can find this material on platforms that any average internet user might visit.”

In her previous life, before she pulled herself out, Acacia Dietz was a lead propagandist for the National Socialist Movement, marketing one of the largest neo-Nazi groups in America. She didn't do it with swastikas and White power salutes. She did it with articles about illegal immigration or social unrest, dropping breadcrumbs here and there to lead people deeper into the rabbit hole.

“Say we had a podcast about Hitler,” she says now. “We would market it as a show about World War II history. It literally has almost nothing to do with that, but nobody knows until they go and listen to it. And if you can get people to listen, one of two things will happen. Either they'll just go in the opposite direction—or it will pique their interest.”

The internet has been a haven for extremists since long before most people even knew it existed. The Anti-Defamation League issued a bulletin on “Computerized Networks of Hate” in 1985—the year Facebook founder Mark Zuckerberg turned one. Today, extremists share their likes and tweet their thoughts like everyone else. But they have also spun off into an ever-widening array of social media sites with greater appetites for hateful words and violent images.

Evans, Williams, and other RAND researchers had planned to study online manifestos posted by far-right extremists in the days and hours before acts of violence. But as they started their search, they realized there was no good way to identify sites that provide safe harbor for such content. They changed direction and started working on a ratings system for websites and social media platforms based on how receptive they are to extremist content.

The researchers looked at traffic volume, ownership information, and the presence or lack of advertising. They dug into content policies and awarded extra points to sites that actually enforce them. They added more points if a site had never been shut down by its service providers. They deducted points for swastikas or other extremist symbols.

In the end, the sites with the most points—think Facebook or Twitter—landed in a category the researchers called “mainstream.” That didn't mean they were free of extremist content; far from it. But that content wasn't their main reason for being. At the other extreme were “niche” sites like Stormfront or 8chan, for which it was.

But then there were the sites in the middle. The researchers called them “fringe.” They hosted a mix of extremist and non-extremist content, often under the banner of protecting free speech and standing up to what they describe as censorship on the mainstream platforms. Some, like Gab, are designed to look almost exactly like a mainstream site, down to the fonts they use.
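The article describes the rubric only in broad strokes, but the mechanics amount to a weighted checklist: award points for mainstream signals such as enforced content policies, deduct them for extremist ones such as hate symbols, then bucket the total into mainstream, fringe, or niche. The Python sketch below illustrates that general shape; the indicator names, point values, and thresholds are illustrative assumptions, not RAND's actual scorecard.

```python
# Illustrative sketch of a scorecard like the one described above.
# All indicators, weights, and thresholds here are assumptions.

from dataclasses import dataclass

@dataclass
class SiteProfile:
    name: str
    high_traffic: bool           # substantial traffic volume
    transparent_ownership: bool  # ownership information is public
    carries_advertising: bool    # advertising is present on the site
    has_content_policy: bool     # a written content policy exists
    enforces_policy: bool        # ...and the site actually enforces it
    never_deplatformed: bool     # never shut down by service providers
    extremist_symbols: bool      # swastikas or other extremist imagery

def score(site: SiteProfile) -> int:
    """Award points for mainstream signals, deduct for extremist ones."""
    points = 0
    points += 2 if site.high_traffic else 0
    points += 1 if site.transparent_ownership else 0
    points += 1 if site.carries_advertising else 0
    points += 1 if site.has_content_policy else 0
    # Extra points only when a policy exists *and* is enforced.
    points += 2 if site.has_content_policy and site.enforces_policy else 0
    points += 1 if site.never_deplatformed else 0
    points -= 3 if site.extremist_symbols else 0
    return points

def categorize(points: int) -> str:
    """Bucket a site as mainstream, fringe, or niche (thresholds assumed)."""
    if points >= 6:
        return "mainstream"
    if points >= 2:
        return "fringe"
    return "niche"

if __name__ == "__main__":
    # A hypothetical free-speech-branded platform: looks polished, has a
    # policy on paper but does not enforce it.
    site = SiteProfile("example-platform", high_traffic=False,
                       transparent_ownership=False, carries_advertising=False,
                       has_content_policy=True, enforces_policy=False,
                       never_deplatformed=True, extremist_symbols=False)
    print(categorize(score(site)))  # -> "fringe"
```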

“People sometimes fall into extremist material on these sites; they don't understand what it is because it's coded or hides its violent intent behind humor or memes,” said Williams, a senior policy researcher at RAND. “We wanted to give individuals and communities a better tool to help them appreciate when they could be interacting with extremist content.”

The researchers used their scoring system to identify dozens of sites that could host all manner of extremists: anti-government militia members, neo-Nazis, White supremacists. They also included incels—viciously misogynistic “involuntary celibates” who blame women for their inability to find a partner, and who sometimes get overlooked as ideological extremists. On some sites, the researchers found content that was so disturbing, they decided it was probably criminal.

Companies that host social media sites could use RAND's scorecard as a checklist to strengthen their defenses against extremist content, if they wanted to. Advertisers and other service providers could also use it to decide which sites they want to do business with and which they want to avoid.

The scorecard also gives everyday users a way to anticipate what kind of content they might find on an unfamiliar website—especially on a “fringe” website, where that might not be obvious. In that way, it supports one of the key pillars of the nation's strategy to combat domestic terrorism: making people more careful and skeptical of the content they find online.

“This isn't an impossible problem,” Evans said. “We know there are things sites can do to make it more difficult for these groups to find each other or to organize or to attract large audiences. But consumers also need to become more informed about what they are consuming online. Maybe this is a way for individuals to think about what they expect and what they can petition companies to do.”

Acacia Dietz knows how slippery the slope can be. She was following news of social justice protests several years ago when she stumbled on a site with a seemingly simple premise: Nobody should feel guilty about their heritage. It was her door into the American neo-Nazi movement.

She got out in 2019, having watched in horror as a gunman who espoused the same White supremacist beliefs stormed mosques in Christchurch, New Zealand, and murdered 51 people. She works now as the managing director of Beyond Barriers, a group that works to prevent people from joining extremist movements and helps them deradicalize when they do. As part of that work, she still monitors social media sites to see what extremists are talking about—and with whom. She sees teenagers as young as 15 in some of those chat rooms.

“It looks pretty innocent. It's not until you actually get in there and start talking to people that you realize, wait a minute, this is not what it looks like,” she said. “It's very easy for individuals who are just curious, just looking, to get sucked in. That's a lot more common than what most people would want to admit.”
