New Poisoning Tool against AI

Scientists have developed Nightshade, a tool enabling artists to make subtle changes to their artwork before sharing it online.

10 November 2023

The debate over whether AI companies may freely use images scraped from the internet remains contentious. Instances of artistic appropriation, such as Richard Prince altering other people's Instagram photos and selling them as his own work, have long sparked controversy. The art world is now grappling with the transformative effects of AI, as the companies behind DALL-E, Stable Diffusion, and Google's models collect training data without artists' permission. Opinion remains divided on whether AI-generated art is innovation, appropriation, or a legitimate new medium.

Personally, I see potential for artists to unite and pool their work, using the available tools to achieve things that were previously out of reach. Not everyone shares that optimism, though, and some artists are actively resisting AI. In response, researchers have developed Nightshade, a tool that lets artists make subtle changes to their artwork before sharing it online. If those altered images are scraped into an AI training set, they disrupt it, causing the resulting models to malfunction in unpredictable ways. The tool gives artists leverage over the copyright concerns they face with major AI companies.

A companion tool, Glaze, protects artists' individual styles from AI scraping. Nightshade exploits a security vulnerability in generative AI models, and when integrated with Glaze it gives artists the option of applying the data-poisoning step. Its open-source nature invites collaboration, which could make the attack more potent over time. Tests on Stable Diffusion models show the tool's impact: as few as 50 poisoned images were enough to visibly distort outputs. The damage also bleeds beyond the targeted keyword, affecting similar concepts and tangentially related imagery, and reshaping the dynamics between AI models and the artistic community.
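
To make the mechanism concrete, here is a minimal sketch of feature-space data poisoning, the general family of techniques Nightshade belongs to. This is not Nightshade's actual algorithm: the file names, the perturbation budget, and the use of a pretrained ResNet-50 as a stand-in feature extractor are all illustrative assumptions (a real attack would target the image encoder of a text-to-image model).

```python
# Minimal feature-space poisoning sketch -- NOT Nightshade's real algorithm.
# Assumptions: artwork.png and unrelated_concept.png exist on disk; a
# pretrained ResNet-50 stands in for a text-to-image model's image encoder.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

EPSILON = 8 / 255   # per-pixel perturbation budget (roughly imperceptible)
STEPS = 200
LR = 0.01

to_tensor = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])

# Generic pretrained feature extractor; drop the classifier head so the
# model returns penultimate-layer features.
extractor = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
extractor.fc = torch.nn.Identity()
extractor.eval()

src = to_tensor(Image.open("artwork.png").convert("RGB")).unsqueeze(0)
tgt = to_tensor(Image.open("unrelated_concept.png").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    target_feat = extractor(normalize(tgt))

delta = torch.zeros_like(src, requires_grad=True)
opt = torch.optim.Adam([delta], lr=LR)

for _ in range(STEPS):
    poisoned = (src + delta).clamp(0, 1)
    # Pull the artwork's features toward the unrelated concept while the
    # pixel changes stay within the invisible budget.
    loss = F.mse_loss(extractor(normalize(poisoned)), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-EPSILON, EPSILON)

out = (src + delta.detach()).clamp(0, 1).squeeze(0)
transforms.ToPILImage()(out).save("artwork_poisoned.png")
```

The key design choice is optimizing in feature space rather than pixel space: to a person the poisoned image looks unchanged, but a model that trains on it learns to associate those pixels with the wrong concept, which is what produces the mismatched outputs described above.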
