Facebook exec's outrageous defense of the 'drunk Pelosi' video doesn't add up

It's an argument that completely sidesteps Russia's proven 2016 election interference.
By Adam Rosenberg
A large screen shows an advertisement for Facebook in Los Angeles, California, 07 May 2019. Credit: ETIENNE LAURENT/EPA-EFE/REX/Shutterstock

All the noise around the "drunk Pelosi" video has made something clear: Facebook wants to have it both ways.

Some at the company would have us all believe that great strides have been made in the platform's ongoing fight against misleading "fake news" content. But Facebook also won't take the step of removing material like the doctored Pelosi clip, preferring instead to let users make their own choice about what to believe.

That's the takeaway from comments made by Monika Bickert, Head of Global Policy Management at Facebook, in an interview with CNN's Anderson Cooper.

"We think it's important for people to make their own, informed choice about what to believe," she said. Facebook works with independent fact-checking organizations to identify misleading content and flag it accordingly. So the company knows what's fake. It just won't take the step of removing such content.

That's what happened with a video of House Speaker Nancy Pelosi that was doctored to make it appear as if she was drunk or in some other way impaired. Fact-checkers flagged it as "false," a designation that earns the video a captioned warning and a reduced presence in News Feeds.

The video won't be deleted entirely because it doesn't violate Facebook's community standards, which prohibit content that incites or promotes violence. The company has also shown a willingness to ban fake accounts and to de-platform problematic figures who repeatedly violate the site's rules.

"We think it's important for people to make their own, informed choice about what to believe."

In the CNN interview, Cooper asks Bickert again and again why Facebook wouldn't just delete content that's been flagged as false. She returns again and again to that idea of letting users decide on their own what to trust. Misleading content is flagged as such, and that's supposed to be enough.

Is it, though? There's evidence that even flagged and downranked content can enjoy considerable reach on Facebook. Perhaps that's because certain individuals have worked hard to promote the message that the media as a whole is an enemy that shouldn't be trusted. People believe what they want to believe in the current climate, to the point that a "misleading content" tag could read as a badge of honor to certain audiences.

Cooper tries to address that in his chat with Bickert, pointing out that "the video is more powerful than whatever you're putting under the video." Bickert deflects, suggesting that the video is OK to keep around because the conversation around it has shifted to questions like the one Cooper posed.

"Well actually what we're seeing is that the conversation on Facebook, on Twitter, offline as well, is about this video having been manipulated," she said. "As evidenced by my appearance today, this is the conversation."

Later in the segment, Cooper presses Bickert on Facebook's responsibility to accuracy as a provider of news. She pushes back, pointing out that the company is a social media business, not a news business. When Cooper presses her yet again -- "you're sharing news ... because you make money from it," he argues -- Bickert draws a line between rules-violating violent content and political discourse.

"If it's misinformation that's related to safety, we can and we do remove it. And we work with safety groups to do that. But when we're talking about political discourse and the misinformation around that, we think the right approach is to let people make an informed choice," she said.

What a stunning point to make when we're barely a month removed from the release of the almost 500-page Mueller report, roughly half of which focuses on Russian efforts to influence the 2016 U.S. presidential election. As we now know, a big portion of those efforts involved exploiting social media platforms like Facebook.

It is a proven fact by now that political misinformation can have harmful effects. Proven beyond the shadow of any doubt. It's great to see Facebook taking action against the kinds of fake accounts that help to spread the bad stuff around, but it's only a half-measure. Plenty of real people are taken in by misinformation around the internet that they then share on social media.

This isn't the first time Facebook has leaned on policy to defend the presence of inappropriate content on the platform. But it's an increasingly hard pill to swallow as suspicions about the negative impact of disinformation on political discourse are proven correct again and again. Maybe it's time for Facebook to change the company line.
