Instagram Reels reportedly shows sexual content to users who only follow children

It also delivers ads for big brands such as Disney alongside them.
By Amanda Yeo

Instagram Reels logos displayed on a phone screen, with the Instagram logo displayed on a screen in the background. Credit: Jakub Porzycki / NurPhoto via Getty Images

It's been a rough few days for Meta. First the tech giant was accused of deliberately targeting children under 13 to use its platforms. Then it seemed to be rejecting ads for period care products on the basis that they were "adult" and "political." Now it's facing allegations that Instagram's Reels algorithm delivers overtly sexual content to accounts which only follow children — as well as ads for big brands alongside them. Overall, it isn't a great look.

In a new report, The Wall Street Journal tested Instagram's algorithm by creating accounts that only followed "young gymnasts, cheerleaders, and other teen and preteen influencers" — content involving children and devoid of any sexual connotation. Even so, the Journal's experiment found that Meta's TikTok competitor subsequently recommended sexual content to its test accounts, including both provocative adult videos and "risqué footage of children."

The Journal further found that child accounts such as those its test accounts followed were often followed by accounts belonging to adult men, and that following those adult accounts appeared to prompt Instagram's algorithm to serve up "more-disturbing content."


All of this is bad enough, but it gets even worse for Meta. The report also found that Instagram Reels displayed ads for companies such as Disney, Walmart, Pizza Hut, Bumble, Match Group, and even the Journal itself alongside such unsolicited, algorithmically delivered sexual content.

In response, dating app companies Bumble and Match Group have both suspended advertising on Instagram, objecting to their brands being placed alongside inappropriate content.

According to Samantha Stetson, Meta's Vice President of Client Council and Industry Trade Relations, the Journal's test results are "based on a manufactured experience that does not represent what billions of people around the world see." Stetson stated that over four million Reels are removed every month for violating Meta's policies, and a Meta spokesperson further noted that instances of content breaching those policies are relatively low.

"We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it. We continue to invest aggressively to stop it — and report every quarter on the prevalence of such content, which remains very low," Stetson said in a statement to Mashable. "Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions."

Earlier this year Meta rolled out an AI tool designed to determine whether content meets its monetisation policies, classifying it into suitability categories and disabling ads if it falls outside all of them. This tool was expanded to Reels in October.

It's been a difficult month for brands trying to advertise on social media. Earlier this month, big advertisers such as Apple and IBM fled Twitter/X after owner Elon Musk expressed support for an anti-Semitic conspiracy theory, and a Media Matters report found the platform displayed ads alongside Nazi content.

Twitter/X made the same argument that Meta is mounting now, namely that the tests which resulted in inappropriate content being shown alongside advertisers were "manufactured." Yet just as in Twitter/X's case, the issue is less about how many people saw the content or how the tests were set up, and more about the fact that it could happen at all.

Instagram's case also differs from Twitter/X's in one key respect: while Media Matters' test accounts followed users who posted "extreme fringe content," the Journal's accounts followed only young athletes and influencers. The sexual content served up seemed to be entirely the result of inferences drawn by Instagram's algorithm.

As such, it seems as though said algorithm could do with some significant adjustments.


Amanda Yeo
Assistant Editor

Amanda Yeo is an Assistant Editor at Mashable, covering entertainment, culture, tech, science, and social good. Based in Australia, she writes about everything from video games and K-pop to movies and gadgets.

