
21 March 2019

The New Zealand Shooting and the Challenges of Governing Live-Streamed Video

By Neima Jahromi

On Friday afternoon, in Christchurch, New Zealand, a man parked his car in an alleyway outside Al Noor Mosque. Six minutes later, dozens of people were dead or wounded. We know far too much about what happened in between, because the shooter streamed it all to Facebook Live. A post on a far-right Internet forum hosted by the Web site 8chan directed users to the stream; quickly, video of the shooting spread across YouTube and Instagram. A manifesto was shared on Twitter, filled with references to Donald Trump, right-wing American punditry, and white-supremacist memes.

I spoke with Sarah T. Roberts, a professor of information studies at the University of California, Los Angeles, who, over the past eight years, has become an authority on the content-moderation strategies employed by tech companies. (Her book “Behind the Screen: Content Moderation in the Shadows of Social Media” will be published by Yale University Press, in June.) In recent years, Roberts has watched with incredulity as companies such as Twitter and Facebook encouraged users to begin streaming live video. “There are not enough moderators in the world to monitor every live stream,” she said. Social-media platforms were already struggling to moderate content posted in the usual way; live-streamed video, which can attract large audiences almost instantly, is even more challenging.

In theory, Roberts told me, machine-learning systems might one day identify violent and hateful streams as they unfold in real time, although she questions whether we actually want even the most sophisticated algorithms determining what should or shouldn’t make its way onto a platform. In practice, there are few effective systems in place. “By and large, live streaming came online with none of that stuff sorted out,” she said. “There was no real plan for, ‘What are we going to do when people start using it in the worst possible way?’ ” As live-streaming tools proliferated, some people in the industry worried that it might be “ungovernable.” But the social-media industry was embroiled in what Roberts calls a “functionality arms race.” Twitter acquired the live-streaming app Periscope in 2015; Facebook launched Facebook Live the next year.

There are options for slowing the torrent of live-streamed videos in general. Platforms could make access to the technology a privilege for trusted users, for example. But such measures would also impede the streaming of video that’s regarded as socially useful—broadcasts from political protests, say, or of police violence. For a time, Facebook Live was widely seen as a useful conduit for positive social change. In the end, it may be impossible to separate the good from the bad.

Social-media platforms were eager to embrace live streaming because it promised growth. Now scale has become a burden. “Mainstream platforms are putting resources into moderation, but it’s a bit like closing the barn door after the horses have gotten out,” Roberts said. The problem isn’t just new content; once a video is streamed, it spreads, often to sites that have no interest in policing what they host or that lack the resources to moderate. “It will take a while to undo the years of allowing these materials to proliferate and have a foothold on all of these platforms.”

And moderators at any one company must reckon with the fact that, ultimately, social-media platforms form a collective ecosystem, ranging from the mainstream to the fringe—from Facebook to 8chan. “People don’t use social-media sites in isolation,” Roberts said. Part of the challenge of moderation is that users may act differently in different spaces, appearing anodyne in one and toxic in another. The shooter in Christchurch, Roberts told me, seems to have been moving “not only among the mainstream platforms but also among different kinds of ideological registers online. He probably cut his teeth on the mainstream platforms. Then, at some point, he became engaged in these spaces where radical incitement to violence and hate speech were the bread and butter. And then, in order to have—I’m sickened to even talk like this—the greatest impact with his behavior, he went back to the mainstream to showcase it.” At one point in his stream, the shooter called out, “Subscribe to PewDiePie!”—a nod to a YouTube channel with eighty-nine million subscribers, run by the divisive Internet personality Felix Kjellberg. This seems less like an earnest endorsement and more like an attempt to merge the mainstream of social media with the fringe. (“I feel absolutely sickened having my name uttered by this person,” Kjellberg tweeted, after the attack.)

The industry-wide, cross-platform scale of the content-moderation problem is becoming increasingly hard to ignore. “I cannot believe that anyone in these companies feels anything but sick about this,” Roberts said. Lately, she has noticed a shift in the way the industry talks about itself. “For years, it was, ‘Our tools are neutral,’ ” she said. “Now they might say, ‘Our tools are a mirror, and they reflect society as it is.’ ” Increasingly, those who work in social media are shifting even further. “The mirror analogy isn’t sufficient, because of the vast power of these platforms to normalize, reify, amplify, and disseminate this material,” Roberts said. Collectively, as a system, social platforms don’t just mirror society—they change it.
