Throughout the 2024 U.S. election campaign, there’s been growing concern that the spread of disinformation by candidates, conspiracy theorists and foreign actors could mislead voters into believing false narratives about the issues being debated or about the veracity of the final vote count.
The media democracy group Free Press has analyzed 12 major tech companies’ readiness to address political disinformation and concluded that many of these social media platforms, including X, Meta and YouTube, have either degraded or abandoned content moderation, creating an environment that has fueled the spread of dangerous disinformation before the Nov. 5 election. The situation is even more dangerous in the critical period after the election, when voters are focused on who won, who lost and the integrity of the electoral process.
Between The Lines’ Scott Harris spoke with Nora Benavidez, senior counsel and director of Digital Justice and Civil Rights with Free Press, who assesses the danger posed by political disinformation in this election campaign and what social media platforms need to do to minimize the lies that can lead news consumers to distrust the electoral process and even incite political violence.
NORA BENAVIDEZ: You know, I remember years ago when I started getting phone calls from people in my community, you know, they didn’t have the word disinformation yet like we do now. And we certainly weren’t using it all the time. But people would call me and they’d say, “I’ve heard rumors that there are people, you know, cops in this community pulling people over.”
And at the time, we didn’t even know how to fact-check anything other than literally driving to the corner they were talking about to see if it was true. Over the years, we have become more and more accustomed to hearing this phrase, “disinformation.” And in the last several years, more and more efforts by other countries and even people here in the United States have sought to create and spread lies that make us believe, you know, false narratives.
In particular, social media is part of our everyday lives, and there’s really the disturbing reality that we all live with these different news feeds. Yours and mine might look really different. And the data that these companies collect about us gives those companies the ability to target us with specific content. So, if you hold a certain set of beliefs or you’re part of different Facebook groups, Facebook could really target you with different content than me.
And what we’ve been looking at over the last several years now is how that phenomenon, how these companies are applying policies and giving us content, how that whole world influences our beliefs. And what we’ve really found is that most of the largest social media companies where people go almost every day for information are not adequately prepared for this election and certainly for the days that follow.
Over the last year, we’ve witnessed that companies like Meta (you know, Facebook and Instagram and Threads) as well as YouTube and Twitter, which everyone now calls X, have all laid off staff. They have stopped applying policies around the Big Lie, or claims that the 2020 election was stolen. And we have found that, increasingly, they all promise that their artificial intelligence programs can do the job that humans used to do.
But we know that that’s not the case. And we know actually that a lot of what we’re seeing online right now is false. It’s also in the gray area of maybe a little true and maybe very false. And then we also see things that are just conspiracies to make people afraid to really engage in democracy.
SCOTT HARRIS: Well, Nora, we only have about two minutes left, but I want to ask you this question, which revolves around how to minimize the flow of disinformation and misinformation through social media platforms. It seems to be a matter of finding a reasonable balance between blatant censorship and preventing irresponsible actors from, metaphorically, shouting “Fire!” in a crowded theater and causing a lot of damage and harm to people.
NORA BENAVIDEZ: Yeah. You know, I think for the next few weeks at least, because that’s what we’re looking at right now, when the election season and these types of lies are at their peak, we need social media companies to be doing their jobs. We need them to be applying their policies, the black-and-white letter of what they’ve written and promised to apply to content on their own platforms.
And we need them to be really committed to this issue, because what people say online has offline consequences. So they should be applying their policies. In the days that follow, however, we know that given the track record of social media companies, it’s unlikely that they’re going to wake up literally tomorrow and start doing that.
And so we need everyday people to also step in. And that’s an unfortunate thing, that we’re telling people they have to do the labor. But all of us need to set expectations about what’s coming. Most likely, our election results will not come on election night. So we have to be prepared for the lies. We have to be ready for old lies like the Big Lie.
And for new ones, you know, the lies about immigrants and others. And then I would say we just need to find the ability and the grit to unplug. Because over the next few days, all of us will have a deep desire for news and very little of it as we’re waiting for the results. So we’ll be waiting and waiting, refreshing our news feeds.
And those are the moments where we need to be very careful since we’ll be so vulnerable to disinformation. So take a beat and unplug, go on a walk and trust that so far, our democratic systems have held.