A new tool could protect artists by sabotaging AI image generators

This could wreak havoc with AI.
By Amanda Yeo
[Image: The OpenAI and ChatGPT logos are visible on a colourful background. Credit: Jonathan Raa/NurPhoto via Getty Images]

Artists may soon have a new weapon helping to protect their work from becoming machine learning fodder. Called Nightshade, the tool makes tiny changes to the pixels in a digital artwork to effectively "poison" it, rendering the image useless for the purposes of training AI.

MIT Technology Review reports that a team led by University of Chicago professor Ben Zhao submitted Nightshade for peer review at the USENIX computer security conference. The software works by making small edits to an image that, while invisible to the human eye, cause AI models to completely misidentify it.

For example, an artist may paint a picture of a cat that can clearly be identified as a feline by any human or AI that examines it. However, upon applying Nightshade, humans will still see the same image while AI will incorrectly believe it's a dog. 



Flood the AI with enough bad training material like this, and soon a request for an image of a cat will cause it to generate a dog instead.
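The article doesn't publish Nightshade's actual algorithm, which is optimized against specific image generators. But the core idea it describes, changing pixels by an amount too small for a person to notice, can be sketched with a toy example. Everything below (the random noise, the ±2 budget, the array sizes) is illustrative, not Nightshade's real method:

```python
import numpy as np

# Toy sketch of the "invisible pixel change" idea behind Nightshade-style
# poisoning. Nightshade computes targeted perturbations that push a model
# toward a wrong label (cat -> dog); here we only demonstrate that a
# perturbation can be numerically real yet imperceptibly small.

rng = np.random.default_rng(0)

# Stand-in for an 8-bit RGB artwork (64x64 pixels, 3 colour channels).
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)

# Nudge every channel by at most +/-2 out of 255 -- far below what the
# human eye can distinguish.
epsilon = 2
noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
poisoned = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

# Largest per-pixel change across the whole image stays within the budget.
max_diff = np.abs(poisoned.astype(int) - image.astype(int)).max()
print(max_diff)  # no larger than epsilon
```

A person comparing `image` and `poisoned` side by side would see the same picture; the poisoning effect the article describes only emerges when a model trains on many such altered images and learns the wrong association.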

Of course, just one poisoned image is unlikely to have a significant effect on an AI image generator's algorithm. Its training data would need to be tainted by thousands of altered images before a real impact is seen. 

However, AI image generators are known to indiscriminately scoop up thousands of new samples from the internet in order to refine their algorithms. If enough artists upload their images with Nightshade applied, it could eventually make such AI tools unusable. 

It would also be incredibly difficult for AI companies to fix the issue, as each poisoned image must be individually identified and removed from their training pool. This could create a powerful incentive for such companies to think twice before dragging a trawl net through the internet and using artists' work without their explicit consent.

This isn't the first AI-disrupting tool Zhao's team has created. The group previously released Glaze, a tool that disguises an artist's personal style in a similar manner. Nightshade will eventually be integrated into Glaze, and also released as open source so that others can build on the team's work protecting artists.

Amanda Yeo
Assistant Editor

Amanda Yeo is an Assistant Editor at Mashable, covering entertainment, culture, tech, science, and social good. Based in Australia, she writes about everything from video games and K-pop to movies and gadgets.

