It’s here: months after it was announced, Nightshade, a free software tool that lets artists “poison” AI models seeking to train on their works, is now available for artists to download and apply to any artworks they see fit, VentureBeat reported.
Developed by computer scientists on the Glaze Project at the University of Chicago under Professor Ben Zhao, the tool essentially works by turning AI against AI. It uses the popular open-source machine learning framework PyTorch to identify what’s in a given image, then applies a subtle pixel-level perturbation so that other AI programs see something entirely different from what’s actually there.
According to VentureBeat, it’s the second such tool from the team: nearly a year ago, the team unveiled Glaze, a separate program that alters digital artwork at the user’s behest to confuse AI training algorithms into perceiving a style different from the one actually present (for example, different colors and brush strokes than are really there).
But whereas the Chicago team designed Glaze as a defensive tool (and still recommends artists use it alongside Nightshade to prevent an artist’s style from being imitated by AI models), Nightshade is designed to be “an offensive tool.”
Artists seeking to use Nightshade need a Mac with Apple silicon (an M1, M2, or M3 chip) or a PC running Windows 10 or 11. The tool can be downloaded for both operating systems here.
The Register reported that University of Chicago boffins released Nightshade 1.0, a tool built to punish unscrupulous makers of machine learning models who train their systems on data without getting permission first.
Nightshade is an offensive data poisoning tool, a companion to a defensive style protection tool called Glaze, which The Register covered in February of last year.
According to The Register, Nightshade poisons image files to give indigestion to models that ingest data without permission. It’s intended to make those training image-oriented models respect content creators’ wishes about the use of their work.
“Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image,” said the team responsible for the project. “For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass.”
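To make the quoted description concrete, here is a minimal, hypothetical sketch of that style of multi-objective perturbation in PyTorch (the framework the article says the tool builds on). This is not the Nightshade team’s actual algorithm or code: the ResNet-18 stand-in encoder, the `poison_image` helper, the loss weighting, and the pixel budget are all illustrative assumptions.

```python
# Illustrative sketch only -- NOT the Nightshade team's code or algorithm.
# Assumptions: a pretrained ResNet-18 stands in for the target model's image
# encoder, and a simple L2 penalty stands in for a perceptual-similarity metric.
import torch
import torch.nn.functional as F
import torchvision.models as models

def poison_image(image, target_image, steps=200, lr=0.01, budget=0.05):
    """Nudge `image` so a feature extractor 'sees' `target_image` (e.g. the
    purse instead of the cow), while keeping pixel changes small.

    image, target_image: float tensors of shape (1, 3, H, W) in [0, 1].
    budget: max per-pixel change (an assumed L-infinity bound).
    """
    # Frozen feature extractor standing in for the model being poisoned.
    encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    encoder.fc = torch.nn.Identity()  # keep penultimate-layer features
    encoder.eval()
    for p in encoder.parameters():
        p.requires_grad_(False)

    with torch.no_grad():
        target_feat = encoder(target_image)  # features of the off-target concept

    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        poisoned = (image + delta).clamp(0, 1)
        feat = encoder(poisoned)
        # Objective 1: pull the image's features toward the off-target concept.
        feature_loss = F.mse_loss(feat, target_feat)
        # Objective 2: keep the change invisible to human eyes.
        visibility_loss = delta.pow(2).mean()
        loss = feature_loss + 10.0 * visibility_loss  # weighting is an assumption
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Keep the perturbation inside the per-pixel budget.
        with torch.no_grad():
            delta.clamp_(-budget, budget)

    return (image + delta).detach().clamp(0, 1)
```

In this sketch, the two competing terms mirror the team’s description: one drags the image’s features toward an off-target concept (the “leather purse”), while the other keeps the pixel changes too small for human eyes to notice.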
Nightshade was developed by University of Chicago doctoral students Shawn Shan, Wenxin Ding, and Josephine Passananti, and professors Heather Zheng and Ben Zhao, some of whom helped with Glaze.
IGN reported: As AI continues to be a double-edged sword in the digital landscape, a new data-poisoning tool will let artists reclaim control over their creative works and thwart AI-generated replications.
Personally, I think Nightshade is a big win for artists! No one wants their artwork taken without permission. Poisoning AI models with Nightshade sends a very clear message that artists are not going to let AI steal their work.