Explicit deepfakes are now a federal crime. Enforcing that may be a major problem.

A law that's both narrow and broad may prove too difficult to follow.
By Chase DiBenedetto
Experts have mixed feelings on whether the Take It Down Act will live up to its promises. Credit: the-lightwriter / iStock / Getty Images Plus via Getty Images

On May 19, President Donald Trump and First Lady Melania Trump beamed to press and allies as they signed the administration's first major piece of tech regulation, the bipartisan Take It Down Act.

It was seen as a win for those who have long been calling for the criminalization of NDII, or the nonconsensual distribution of intimate images, and for a federal pathway of redress for victims. Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance, said it may be a needed kick in the pants for a lethargic legislative arena.

"I think it's good that they're going to force social media companies to have a process in place to remove content that people ask to be removed," he said. "This is kind of a start; to build the infrastructure to be able to respond to this type of request, and it's a really thin slice of what the issues with AI are going to be."

But other digital rights groups say the legislation may stir false hope for swift legal resolutions among victims, with unclear vetting procedures and an overly broad list of applicable content. The law's implementation is just as murky.

The act's notice and takedown provision could pose major problems 

"The Take It Down Act’s removal provision has been presented as a virtual guarantee to victims that nonconsensual intimate visual depictions of them will be removed from websites and online services within 48 hours," said the Cyber Civil Rights Initiative (CCRI) in a statement. "But given the lack of any safeguards against false reports, the arbitrarily selective definition of covered platforms, and the broad enforcement discretion given to the FTC with no avenue for individual redress and vindication, this is an unrealistic promise." 

Exacerbating free speech and content moderation concerns

These same digital rights activists, who issued warnings throughout the bill's congressional journey, will also be keeping a close eye on how the act may affect constitutionally protected speech, fearing that publishers may remove legal speech to preempt criminal repercussions (or flatly suppress free speech, such as consensual LGBTQ pornography). Some worry that the bill's takedown system, modeled after the Digital Millennium Copyright Act (DMCA), may over-inflate the power of the Federal Trade Commission, which can now hold online content publishers accountable under the law with broad jurisdiction.

"Now that the Take It Down Act has passed, imperfect as it is, the Federal Trade Commission and platforms need to both meet the bill’s best intentions for victims while also respecting the privacy and free expression rights of all users," said Becca Branum, deputy director of the Center for Democracy & Technology (CDT)'s Free Expression Project. "The constitutional flaws in the Take It Down Act do not alleviate the FTC's obligations under the First Amendment."

A lack of government infrastructure

Organizations like the CCRI and the CDT spent months lobbying legislators to adjust the act's enforcement provisions. The CCRI, which penned the bill framework that Take It Down is based on, has taken issue with the legislation's exceptions for images posted by someone who appears in them, for example. They also fear the removal process may be ripe for abuse, including false reports made by disgruntled individuals or politically motivated groups under an overly broad scope for takedowns.

The CDT, conversely, interprets the law's AI-specific provisions as too narrow. "Take It Down’s criminal prohibition and the takedown system focus only on AI generated images that would cause a 'reasonable person [to] believe the individual is actually depicted in the intimate visual depiction.' In doing so, the Take It Down Act is unduly narrow, missing several instances where perpetrators could harm victims," the organization argues. For example, a defendant could plausibly get around the law by publishing synthetic likenesses placed in implausible or fantastical environments.

Just as confusing: while the FTC's takedown authority over applicable publishers is vast, other sites are exempt from its oversight, such as those that host their own curated content rather than user-generated synthetic content. Instead of being required to take down media within the 48-hour window, these sites can only be pursued through criminal prosecution. "Law enforcement, however, has historically neglected crimes disproportionately perpetrated against women and may not have the capacity to prosecute all such operators," the CDT warns.

Steinhauer theorizes that the bill may face a general infrastructure problem in its early enforcement. For example, publishers may find it difficult to corroborate that the individuals filing claims are actually depicted in the NDII within the 48-hour window unless they beef up their own oversight investments, especially since most social media platforms have scaled back their moderation processes in recent years. Automatic moderation tools could help, but they're known to have their own set of issues.

No cohesion on AI regulation

There's also the question of how publishers will spot and prove that images and videos are synthetically generated, a problem that's plagued the industry as generative AI has grown. "The Take It Down Act effectively increases the liability for content publishers, and now the onus is on them to be able to prove that the content they’re publishing is not a deepfake," said Manny Ahmed, founder and CEO of content provenance company OpenOrigins. "One of the issues with synthetic media and having provable deniability is that detection doesn’t work anymore. Running a deepfake detector post hoc doesn’t give you a lot of confidence because these detectors can be faked or fooled pretty easily and existing media pipelines don't have any audit trail functionality built into them."

It's easy to follow the logic of such a strong takedown tool being used as a weapon of censorship and surveillance, especially under an administration that is already doing plenty to sow distrust among its citizens and wage war on ideological grounds.

Steinhauer still urges an open mind. "This is going to open a door to those other conversations and hopefully reasonable regulation that is a compromise for everyone," he said. "There's no world we should live in where somebody can fake a sexual video of someone and not be held accountable. We have to find a balance between protecting people, and protecting people's rights."

The future of broader AI regulation remains in question, however. Though Trump championed and signed the Take It Down Act, he and congressional Republicans also pushed to include a 10-year ban on state- and local-level AI regulation in their touted One Big Beautiful Bill.

And even with the president's signature, the future of the law is uncertain, with rights organizations predicting that the legislation may be contested in court on free speech grounds. "There's plenty of non-pornographic or sexual material that could be created with your likeness, and right now there's no law against it," added Steinhauer. Regardless of whether Take It Down remains or gets the boot, the issue of AI regulation is far from settled.

Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also captures how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.
