Facebook is the world’s largest social media network, with nearly 2 billion users. The company, started by Mark Zuckerberg while he was a student at Harvard University, has become an essential communications tool for businesses and individuals alike. As Facebook grew, it acquired other Internet-based companies, including Instagram, WhatsApp, Messenger and Oculus VR.
While Facebook’s stated mission is “to give people the power to build community and bring the world closer together,” the company’s business model and profitability depend on maximizing engagement, in which the most provocative content achieves the greatest visibility — creating an echo chamber of extremism, bullying, hate speech, disinformation, conspiracy theories and rhetorical violence.
Many former Facebook employees have publicly condemned the company’s business practices, but the Oct. 5 testimony of Frances Haugen, a former Facebook data scientist, before the U.S. Senate leveled some familiar criticisms of the company with new force, backed by thousands of internal documents. Haugen bluntly stated she believes Facebook harms children, sows division and undermines democracy in pursuit of “astronomical profits.” Between The Lines’ Scott Harris spoke with Tracy Rosenberg, executive director of Media Alliance, who talks about Haugen’s testimony and the campaign to fire Facebook co-founder and CEO Mark Zuckerberg.
TRACY ROSENBERG: What Frances Haugen delivered — and I think probably did so a bit more clearly than a number of the other, you know, Facebook whistleblowers who have come forward — is sort of the question of intention and the question of knowledge. She’s made it very clear that it is her educated opinion, from working in a civic integrity division at Facebook, that there’s absolutely no lack of knowledge of the negative impacts that we can all see. That, in fact, Facebook had studied them extensively. They had concluded that the way they were handling their algorithm, certain characteristics of their site, were specifically having negative impacts that could be quantified. Instagram in particular was creating negative body images for young women and negatively impacting the mental health of a significant number of people. To name one — they had actively contributed to ethnic slaughter and genocide in Myanmar and also, I believe it was in West Africa as well; they specifically played a role in the buildup to Jan. 6 and so on.
So basically, the things that we could all observe anecdotally had been thoroughly researched, quantified and measured by Facebook. They had drawn specific conclusions from their internal research, and having read all of that, they made specific choices not to take the actions that were recommended to mitigate the harmful impacts they had been shown were happening. And this is a different process from the way it’s often been described — “Oh, you know, these are complicated decisions, with nuance and pros and cons.”
It’s a very clear sort of statement — you were given an opportunity here, with all of the data, to research all you could possibly want to understand exactly what the impact was. Probably more so than any of us on the outside could possibly have seen. And you looked at it and you looked at your bank book, and you said, in a concrete way, “No, let those harmful impacts go, because stopping them would cost me money. And I don’t want to do that.” And this is from someone who is fabulously rich and has more money than anyone could possibly need. That was the motivation. And that was the reasoning. And a lot of us would say that verges on white collar crime. You know, it’s basically placing money that you don’t even need in front of literally people’s mental health and physical safety.
SCOTT HARRIS: What in your view should Congress be doing in terms of regulating Facebook and social media platforms in general? There’s discussion about reforming Section 230, which immunizes social media companies from being sued over what their users post. There’s also the idea that antitrust actions should be taken against Facebook in particular, because it is a monopoly in terms of social media right now.
TRACY ROSENBERG: So when you’re looking at Section 230, what it comes down to is that there are some reforms that can be done that would sensibly place some limits on Section 230 liability, and that can be carefully crafted. And that’s really where we have to go. And that’s a tough conversation, because Washington, D.C. is not a place that is real good at subtlety, and it’s not a place that is real good at deep policy. So we get a lot of soundbites when what we actually need is careful examination of where 230 liability makes sense, and the point at which it stops making sense.
SCOTT HARRIS: Tell our listeners about the #FireZuck campaign, your demand that the board of directors at Facebook terminate Mark Zuckerberg.
TRACY ROSENBERG: We’ve been doing this now for the better part of a year-and-a-half, getting on two years. The #FireZuck, sort of, you know, hashtag or campaign came out of basic frustration. We feel like we’re asking Facebook, as their users, as essentially the parties that are making all the money for the organization, to do some fairly simple things. And what comes back in response to these kinds of demands is one of a few things. Either “we’re already doing it” — which they’re not. Or “we’re going to wait for the government to break us up,” or “we don’t see the importance of it,” or blah, blah, blah. And what it comes down to is that Mark Zuckerberg is in charge. It looks like these kinds of demands are not going to be undertaken voluntarily by Facebook. It just isn’t going to happen. The leadership is not there. They will do things when they’re forced to and not before.