Facebook internal investigation finds millions of members in QAnon groups

There are more than 3 million of them.

Facebook officially has a QAnon problem.

An internal investigation carried out by the social networking giant found that Facebook groups related to the far-right conspiracy theory QAnon are racking up millions of members on the platform.

According to documents provided to NBC News by a Facebook employee, the company’s investigation discovered thousands of QAnon pages and groups on the site. When combined, these groups and pages have an audience of more than 3 million members and followers. Ten of Facebook’s most popular QAnon groups alone make up more than 1 million of those members.

Facebook’s investigation also uncovered 185 ads on the platform “praising, supporting, or representing” the QAnon conspiracy, NBC News reports. The company made around $12,000 from those ads, which racked up 4 million impressions in the last 30 days.

The internal findings will be used to guide any policy decisions related to QAnon that Facebook may be working on, according to two company employees who spoke anonymously to NBC News.

Facebook could decide to treat QAnon as it treats other extremist content. The company outright banned white supremacist and white nationalist content from its platform in early 2019.

That same year, it also rolled out rules for conspiratorial pages, specifically those pushing anti-vaccination content. Facebook excludes anti-vaxxer accounts from its search results and recommendation engine, making the content harder to find. It also rejects advertising that promotes anti-vaccination messages.

Last week, Facebook banned one of its biggest QAnon groups, “Official Q/QAnon,” for repeatedly breaking its rules on misinformation, harassment, and hate speech. The group had nearly 200,000 members at the time of its removal.

Facebook took its first major action against QAnon content in May, when it removed a network of groups, pages, and accounts devoted to the conspiracy theory. That removal, however, was for violating Facebook’s policies on coordinated inauthentic behavior: the network had set up fake accounts to promote its content.

While Facebook figures out what to do about QAnon, competing social media platforms have already taken action. Twitter announced in late July that it would block QAnon from appearing in its trends and recommendations sections and would remove links related to the conspiracy theory. TikTok followed shortly after by blocking QAnon terms and content from its search feature.

QAnon has been especially popular within the baby boomer generation, making Facebook, with its older demographics, the perfect place for the conspiracy theory to spread and grow to where it is today.
