6 October 2022

‘Where Is the Line Where Immoral Becomes Evil?’

Ellen Cushing

Adrienne LaFrance: I want to start by going back, specifically to the start of Rappler.

Maria Ressa: You were the first American reporter to write about Rappler, in 2012.

LaFrance: We go way back. And when we talked all those years ago, I remember your preoccupation with emotional contagion. I remember you saying something to me to the effect of, “We need to create informational environments where people can be rational and not just emotional.” That was 10 years ago. Here we are now. When you think back to what your dreams were for Rappler when you started, what’s your primary observation about what’s changed other than it’s gotten worse? And do you still feel like rationality over emotionality is the core mission? Is it enough?

Ressa: Oh my God, there are three things there, right? Right off the top. So we came out with a mood meter and a mood navigator a few years before Facebook did the emoji reactions. But the reason for the mood meter was that I wanted to see how a story impacted our society emotionally, right? And it was actually used by several universities to look at sharing—you know, how people share online—and it's based on valence, arousal, and dominance.

LaFrance: You had readers actually respond and tell you, “This article makes me angry.”

Ressa: Correct, correct, correct. But at that point in time—until 2016—the top emotion that people responded with on Rappler was “happy” in the Philippines, right? So we really looked at this stuff, and we knew when people were angry, because it wasn’t being gamed and we weren’t manipulating.

So the question that you had is about social contagion. I thought that if you click how you feel, that would stop you and make you think. And because you stopped to think, you'd become more rational. I began looking at social-network theory when I was looking at how the virulent ideology of terrorism spreads. And when I was looking at that, at how to create social cascades, we looked at emotions. We all know emotions are important. I was hoping, again, against hope, that we could use it for good, and we still do, because we don't manipulate you with the algorithms. We don't micro-target you. We don't collect the kind of data that you now give freely to the Big Tech platforms, which is where our Faustian bargain began.

Fast-forward to today: I haven’t given up. But this is why I believe that we need to get legislation in place, that our data should be ours, that Section 230 should be killed, because in the end, these platforms, your technology platforms, are not like the telephone. They decide. They weigh in on what you get. And the primary driver is money. It’s surveillance capitalism.

On micro-targeting, it's like going to a psychologist—I'm going to quote Tristan Harris—going to a psychologist and telling the psychologist your deepest, darkest secret. And then that psychologist goes around and says, “Yo, who wants this? Who's the highest bidder? I got Adrienne's secret.” That's kind of what we're doing, you know, so it's a bleak moment. The Duterte government's [new] threats against Rappler are just legal complaints now, but I'll give you an idea of what that means. What that means, for a person running a news group, is that I have to worry that the people named in that complaint could be picked up on the last day … before Easter vacations. That would be like Holy Thursday in the Philippines. Everything shuts down for Easter. And when that happens, if they're arrested, they will stay in jail until the Monday after. So these are the kinds of harassment we have to deal with. It makes me very angry.

LaFrance: As it should. Press freedom in the Philippines has not been robust historically, and yet it seems to have gotten particularly bad in recent years. Certainly, you’ve experienced this firsthand. Can you talk about, especially for an audience of many Americans, what it is like to experience that slide, and especially the things that people may not notice are changing around them as the ground sort of shifts underneath them?

Ressa: In 1986, 36 years ago, the People Power Revolution ousted Ferdinand Marcos and his whole family. Marcos declared victory in the 1986 elections, and the people came out on the streets, and by the fourth or fifth day, U.S. helicopters took the family out of the Philippines, right? We all thought democracy was here to stay. How wrong we were. No president in the Philippines has ever really liked me completely. That's okay.

LaFrance: That’s a good thing in journalism.

Ressa: Right! I don’t mind so long as they respect you. But today it’s so much different.

LaFrance: What do you think Americans—whether journalists or just individuals, institutions—should be doing now, or yesterday, or five years ago, to prevent what’s happening to you in the Philippines?

Ressa: The manipulation isn't about how smart or how not smart you are. The manipulation is our biology. It's your biology. It's the manipulation of your emotions. So the minute you get angry, right? Don't share. Yet.

I mean, I talked about valence, arousal, and dominance. So I'll take this first in the virtual world, and then I'll go into democracy in the physical world. Valence is how something, how content, makes you feel. Arousal is the kind of emotion you feel, so anger is high arousal; sadness is low arousal. You're more prone to share high arousal versus low arousal. And dominance is how empowered you feel in the moment; it shapes what people share the most. So, for example, if people are afraid, like we were in the Philippines for long periods of time, you don't share, you know, because you feel low dominance.

Now take that into the real world. What's happening in the virtual world is exactly how we're supposed to vote. You're still lucky. You don't feel fear for your safety yet. Democracy is so fragile, and all of it rests on what we believe, on the facts, on that shared reality. So I guess … we're the guinea pigs for you, right? This is what the Cambridge Analytica whistleblower Christopher Wylie said about the Philippines. I think you wrote about this. The consultancy tested these tactics of manipulation, of mass manipulation, in countries like the Philippines, Kenya, Nigeria, the global South, because you can get away with it. And then, if they worked in our countries, they—the word he used was they “ported” it over to you. It works in our social-media networks; it works in yours.
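
[To make the valence-arousal-dominance model Ressa describes concrete, here is a minimal sketch. It is an illustration only, not Rappler's actual mood meter: the 0-to-1 scales, the weights, and the function name are all hypothetical. The logic follows her account: high-arousal emotions like anger get shared; low dominance, i.e., fear, suppresses sharing.]

```python
from dataclasses import dataclass

@dataclass
class MoodReading:
    """One reader's reaction to a story, on the VAD axes from the interview.

    All values are hypothetical 0-1 scales; Rappler's mood meter used
    named emotions (happy, angry, sad, ...) rather than raw numbers.
    """
    valence: float    # how positive or negative the content makes you feel
    arousal: float    # intensity of the emotion: anger is high, sadness is low
    dominance: float  # how empowered you feel in the moment

def share_propensity(mood: MoodReading) -> float:
    """Toy estimate of how likely this reaction is to be shared.

    Per the interview: high arousal drives sharing, and low dominance
    (fear) suppresses it. The weights are invented for illustration.
    """
    return max(0.0, min(1.0, 0.7 * mood.arousal + 0.3 * mood.dominance))

# Anger: high arousal, moderate dominance -> likely to be shared.
print(share_propensity(MoodReading(valence=0.1, arousal=0.9, dominance=0.6)))  # 0.81
# Fear: high arousal but low dominance -> sharing is suppressed.
print(share_propensity(MoodReading(valence=0.1, arousal=0.8, dominance=0.1)))  # 0.59
```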

LaFrance: Let’s talk about the platforms for a minute. You mentioned TikTok in your keynote at our conference, which I also want to hear your views on, but I’ll ask an oversimplified question just to get your reaction. If you had a magic wand and could make one of the platforms go away—Google, Facebook, TikTok, YouTube, Instagram; obviously some of those entities own one another—which concerns you most, and which one’s absence would make humanity better off?

Ressa: I think we have to do the same with all of them, which is, you know, they cannot insidiously manipulate us. Like, CRISPR technology can basically customize your baby, but in our countries, in your country, you cannot do that. You put guardrails in place because you know we don't have the wisdom of gods. But we don't do that for our minds.

I'll answer your question directly. In the Philippines, Facebook is the largest delivery platform for news, and last year, for the sixth year in a row, Filipinos spent the most time online and on social media globally. But the other part is that 100 percent of Filipinos [on the internet] are on Facebook. As early as 2013, Filipinos uploaded and downloaded the most … videos on YouTube globally. And then here's the other part, right? Disinformation, like the historical denialism of the Marcos disinformation networks, would start on YouTube. When those stories are shared on Facebook, Facebook doesn't actually check them, because the content isn't on Facebook. So here it is: It's an entire ecosystem of manipulation. I do not understand why we allow that. It's not me. I'm just the victim. Please do something about it.

LaFrance: Hmm, I mean, my short answer would be that Facebook and Google have been most consequential for news organizations, and for the informational environment. But I really worry about YouTube. I think it’s underscrutinized—and Google for that matter too. I think the extent to which we have just totally outsourced our relationship with knowledge to Google is frightening. So all of them, maybe?

Ressa: I think our other problem with this is that we news people, we actually voluntarily gave away our deepest relationships when we put a “Share” button on our websites.

LaFrance: Rappler doesn't have that, does it?

Ressa: We do! We do! Because we were born on Facebook. I drank the Kool-Aid.

LaFrance: Everybody did. I mean, the whole industry did.

Ressa: Exactly. So here’s the other part that’s for news people: Right now, you’re letting the technology platforms determine what news survives. And we already know what the algorithms amplify. So what news will survive?

LaFrance: So how should we change? I know you mentioned Section 230, and we can get into that. But how should we change the architecture of the social web to allow for dissent—which, by the way, is also very important for democracy—and to avoid tech companies making all of the big decisions on our behalf, while also preventing harm and abuse?

Ressa: You can—it’s not a free-speech issue. This is not a free-speech issue, like, don’t believe the lie. This is actually that algorithm and the data. This is tech and data. That’s what we need to look at. Because look, I can turn to my neighbor and tell a lie, right? You can tell your neighbor a lie, but it won’t get amplified to 10 million people, right? It’s the distribution that’s the problem. To quote a comedian, it’s a freedom-of-reach issue, not a freedom-of-speech issue. Sacha Baron Cohen said this; he had more wisdom than we did. I’m just saying, right? So freedom of speech has never been the problem.

But it is power and money, and that is part of what’s wrong. So we need to fix this, right? What is the source of the inordinate corporate gain that these companies have had? It’s all of our data. It’s our personal lives. So when you think about what our internet should look like, it’s not that hard to imagine it. The laws in the real world, which have checks and balances, which protect both individual rights and the public sphere—those need to be reflected in the virtual world.

LaFrance: Say a little more specifically about what you mean. I’ll give an example. Some people would argue—I think Frances Haugen has said, for instance, to the point about freedom of reach, as you put it—that we should have circuit breakers, basically, for when something’s about to go extremely viral, so there’s a check before it does. For a long time I argued for better content moderation, which is a very journalistic way of looking at things. I’ve since come to the view that that’s not going to be what solves our problems.
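
[For readers unfamiliar with the circuit-breaker idea Haugen has floated, here is a minimal hypothetical sketch. No platform is known to implement it this way; the window, threshold, and function names are invented. The mechanism: when an item's share velocity crosses a threshold, further algorithmic amplification is paused until a human review clears it.]

```python
import time
from collections import defaultdict, deque
from typing import Optional

SHARE_WINDOW_SECONDS = 3600   # look at shares over the past hour (hypothetical)
VELOCITY_THRESHOLD = 10_000   # shares per window that trips the breaker (hypothetical)

shares = defaultdict(deque)   # item_id -> timestamps of recent shares
paused = set()                # items frozen pending human review

def record_share(item_id: str, now: Optional[float] = None) -> bool:
    """Record one share; return True if the item may still be amplified."""
    now = time.time() if now is None else now
    window = shares[item_id]
    window.append(now)
    # Evict shares that have fallen out of the time window.
    while window and now - window[0] > SHARE_WINDOW_SECONDS:
        window.popleft()
    if item_id in paused:
        return False
    if len(window) > VELOCITY_THRESHOLD:
        paused.add(item_id)   # trip the breaker: pause amplification pending review
        return False
    return True

def clear_review(item_id: str) -> None:
    """A human reviewer clears the item; algorithmic amplification resumes."""
    paused.discard(item_id)
```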

But speak a little bit more specifically about protecting individuals and communities. I’m coming at it from a very American perspective in terms of free speech. People do have this expectation that you should be able to go out into these new public squares and debate one another and not have anyone tell you what you can or can’t say. So what does that look like?

Ressa: But again, I'll go back to, like, you can't debate anymore on these social-media platforms, right? Because in the age of abundance, and because the business model pushes it this way, because they want you to stay on the site the longest, the emotion that's encouraged is moral outrage, and moral outrage becomes mob rule. It isn't a free-speech issue; it is, again, what you amplify. Imagine if The Atlantic decided, “I'm going to make money at all costs, and I'm going to amplify the content that will make the most people angry, because they will stay on my site the longest and they will share it the most.” We don't do that, because we have standards and ethics.

We have some platforms saying that it’s about the corporate shareholder. We’re here to make money for them. That’s immoral. So here’s the problem: Where is the line where immoral becomes evil?

LaFrance: Have we crossed that line?

Ressa: I believe so.

LaFrance: When?

Ressa: You know, I was with CNN when we grew from chicken-noodle news to the world's breaking-news leader, so I know how difficult and overwhelming it can be to revamp while the bus is moving, to change the tires while it's going, right? So I gave a lot of leeway. But now what we're seeing is that the platforms are doubling down, some of them. And they're actually saying, you know, it's up to you. But every day that there isn't a law, that there aren't guardrails in place, someone dies. Where's the line to evil? When you know people die because you're making more money, and you continue doing it. So, yeah, I'm very angry. I try not to be angry. Wait, optimism! There is optimism! The hope—where will the hope come from? Ukraine. Okay, this is horrific, what we're seeing happening in Ukraine, and what Russia has done. But how quickly did the free world come together? Right? And all of a sudden the world seems to be righting itself, but the platforms haven't really changed yet. And what sparked the change online? It wasn't a government; it was Ukrainian President Volodymyr Zelensky.

LaFrance: We’re at the very beginning of understanding the platforms’ role in this conflict. Through the fog of war, et cetera, we have a very limited sense of the information flow, still.

Ressa: It’s bad, really. But you know, I guess what I’m saying with Zelensky is, if he had left—one person—wouldn’t the Russians have marched in?

LaFrance: And to be fair, he and others are using platforms to promote democracy, which was the utopian dream for the internet in the first place.

Ressa: Again, I go back to the design of the platforms. This is, by design, a behavior-modification system that sells us our weakest moments for profit.

LaFrance: Let me ask one more question: What do you think of Elon Musk being on the Twitter board?

Ressa: I should turn this around on you, Adrienne!

LaFrance: No, you shouldn’t.

Ressa: I don’t know; it’s kind of like, where we are right now, right? We know his track record. And that is—that can be worrying, but it’s a shift. And you can see that they are proactive in dealing with disinformation. Will that change? I’ll tell you what the data show, but right now it’s too soon to tell. So I’ll give him the benefit of the doubt.
