This tool strips away anti-AI protections from digital art
Peter Hall
created: July 10, 2025, 9 a.m. | updated: July 15, 2025, 12:20 p.m.
AI models work, in part, by implicitly learning boundaries between what they perceive as different categories of images.
The almost imperceptible changes these protection tools make to an image are called perturbations, and they disrupt the AI model's ability to understand the artwork.
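To make the idea concrete, here is a minimal sketch of what a perturbation looks like at the pixel level. This is purely illustrative: tools like Glaze and Nightshade compute their perturbations adversarially against specific models, not randomly as here, and the `epsilon` bound and function names are assumptions for the example.

```python
import numpy as np

def add_perturbation(image, epsilon=2.0, seed=0):
    """Add a small, bounded random change to an 8-bit image array.

    Illustrative stand-in only: real cloaking tools optimize the
    perturbation against a model rather than sampling it randomly.
    `epsilon` caps the per-pixel change, keeping it near-invisible.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    perturbed = np.clip(image.astype(np.float64) + noise, 0, 255)
    return perturbed.astype(np.uint8)

# A flat gray stand-in for a piece of artwork.
artwork = np.full((64, 64, 3), 128, dtype=np.uint8)
cloaked = add_perturbation(artwork)

# Every pixel moves by at most epsilon, far below what the eye notices,
# yet the change can still shift where an image falls relative to a
# model's learned category boundaries.
max_change = int(np.abs(cloaked.astype(int) - artwork.astype(int)).max())
```

The point of the bound is the asymmetry the article describes: a change too small for a human to see can still be large enough, in the model's feature space, to push the image across a learned boundary.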
Glaze is used to defend an artist’s individual style, whereas Nightshade is used to attack AI models that crawl the internet for art.
The researchers trained LightShed by feeding it pieces of art with and without Nightshade, Glaze, and other similar programs applied.
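The training recipe described above can be sketched as a supervised detection problem: pair each artwork with a perturbed copy, label the pair, and train a classifier to tell them apart. The synthetic data and the simple logistic-regression detector below are assumptions for illustration, not LightShed's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_dataset(n=200, dim=64):
    """Build labeled pairs: clean stand-in 'images' and cloaked copies.

    Hypothetical data: the cloak is modeled as a small additive shift,
    standing in for the output of a tool like Glaze or Nightshade.
    """
    clean = rng.normal(0.0, 1.0, size=(n, dim))
    cloak = rng.normal(0.5, 0.1, size=(n, dim))
    cloaked = clean + cloak
    X = np.vstack([clean, cloaked])
    y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = clean, 1 = cloaked
    return X, y

def train_logistic(X, y, lr=0.1, steps=500):
    """Tiny logistic-regression detector trained by gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

X, y = make_dataset()
w, b = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = float(np.mean(pred == y))
```

Once a detector can reliably separate cloaked images from clean ones, the perturbation it has learned to spot can, in principle, be estimated and subtracted, which is the threat LightShed demonstrates.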
The creators of Glaze and Nightshade seem to agree with that sentiment: The website for Nightshade warned the tool wasn’t future-proof before work on LightShed ever began.
Source: MIT Technology Review