19 May 2021

Trump Abused the System. Facebook Created It


"IT IS ONLY too typical that the ‘content’ of any medium blinds us to the character of the medium.” So said the awesomely gonzo communication theorist Marshall McLuhan some 57 years ago.

What McLuhan meant was that, in a discourse dominated by electronic media, we fret over individual utterances far too much, while ignoring the communications systems in which those utterances live.

This week, McLuhan’s famous observation came out of mothballs and found supremely practical application when the Facebook Oversight Board, the panel of experts appointed by Facebook, Inc., decided to extend restrictions on Donald Trump’s use of Facebook and Instagram, giving Facebook six months to figure out “a proportionate response that is consistent with the rules” of the platform.

At this point, who really cares? The former president’s damage is done, and even with him benched, Facebook is filled with insidious disinformation, dissimulation and masquerade of every kind, hate speech, and defamation and harassment amounting to a range of torts.

But Facebook’s board was charged with evaluating just two posts to Instagram and Facebook, independent of the dynamics of the social media to which they were posted. It did these two close readings, and credibly well. But that success, and the decision about Donald Trump, were neither here nor there. In the end, the result of the exercise was to distract from Facebook’s own culpability in much broader damage to democracy.

First off, the committee cited two “pieces of content,” what McLuhan would have called “messages,” as key to its decision-making. The first was a video of Trump giving an address to the camera that began, “I know your pain.” It was posted to Facebook and Instagram and time-stamped 4:21 pm EST, January 6, 2021, as the US Capitol was under violent attack by Trump supporters.

The second was a 42-word paragraph on Facebook under Trump’s name, time-stamped just under two hours later. “These are the things and events that happen when a sacred landslide election victory is so unceremoniously & viciously stripped away from great patriots who have been badly & unfairly treated for so long. Go home with love & in peace. Remember this day forever!”

The statement by Facebook’s oversight group focused on the language, the timing, and the origin of the two posts. It did not mention the dynamics, the business model, or the tools of the two platforms, Instagram and Facebook, even once.

According to the board’s statement, “‘We love you. You’re very special’ in the first post and ‘great patriots’ and ‘remember this day forever’ in the second post violated Facebook’s rules prohibiting praise or support of people engaged in violence.”

As for timestamps, the statement says, “At the time of Mr. Trump’s posts, there was a clear, immediate risk of harm and his words of support for those involved in the riots legitimized their violent actions.”

About the American president as author of the posts, the statement says, “As president, Mr. Trump had a high level of influence. The reach of his posts was large, with 35 million followers on Facebook and 24 million on Instagram.” The board went on: “It is not always useful to draw a firm distinction between political leaders and other influential users, recognizing that other users with large audiences can also contribute to serious risks of harm.”

Though put in a matter-of-fact way, this point was the one surprise—even shock—in the oversight board’s statement. To Facebook, the American president is clearly not a public servant or even a commander-in-chief. He’s an influencer. And he gets his power not from the people but from Facebook and its business model of influencers and followers.

Power established on Facebook is not “legitimate” in sociological terms; it’s not power, like that of a schoolteacher or elected official, that’s regarded as just and appropriate by those over whom it is exercised. Far from it. “Influence” on Facebook is based on nothing but a (cheatable) point system in Facebook’s highly stylized massively multiplayer role-playing game; it is closer to influence in World of Warcraft than it is to legitimate power. But none of that gets mentioned by the members of this committee, who have been blinded, in the McLuhan sense, to the game’s contrivances. Instead of calling out Facebook for creating a system that confers unregulated and dangerous “influence” on people, they speak of the abuse of that system by a designated bad actor.

Shoshana Zuboff, a professor emerita at the Harvard Business School and a member of something called the Real Facebook Oversight Board, which was formed by Facebook skeptics determined to oversee the overseers the corporation had appointed, says that over two decades, internet users have turned over responsibility for the common good to a “for-profit surveillance society”—the big tech companies. It’s Facebook’s business model, not any one bad actor, that put Facebook on what Zuboff calls “a collision course with democracy.”

Facebook’s “extractive surveillance economics” require specific tools, Zuboff argues. This proprietary apparatus is as central to its survival as a company as hydraulic fracking machinery is to Halliburton. And it’s those data-mining tools—the ones that allow for virality, amplification and audience-targeting—that sorely require regulation.

“Facebook made business decisions that harmed democracy,” according to Yaël Eisenstat, a former CIA officer who once worked as Facebook’s Global Head of Elections Integrity Operations. Eisenstat also belongs to the Real Facebook Oversight Board. “Regulation should not be over speech but tools.”

For tech companies, and indeed for users of social media in search of transgressions or opportunities to cancel people, it is nothing but dithering and distraction to keep parsing individual speech acts. The problem is in the tools—the apparatus for virality, amplification and targeting that is not protected by the First Amendment, and has no analog in the pre-Facebook world.

The struggle in social media is not among competing messages on a web site. It’s between people and the big companies that would colonize us, mine us for data, and program our every online move. The medium is indeed the message here. And the medium, which is in dire need of regulation, is Facebook.
