Introducing Nightshade: A Tool Helping Artists to Fight Against AI
Briefly

Jon Keegan discusses the challenges artists face as their work is used to train generative AI models without their consent. Many artists worry about losing control of, and compensation for, their work now that tools like Stable Diffusion and DALL-E have been trained on millions of scraped images. To address this, researchers from the University of Chicago developed 'Nightshade,' a tool that lets artists covertly alter their images to disrupt AI training, causing models trained on them to produce inaccurate results and giving artists leverage in the ongoing battle over intellectual property rights.
As generative AI expands, artists are finding their distinctive styles exploited for profit without consent, fueling widespread dissatisfaction and demands for greater control over how their work is used.
Researchers at the University of Chicago developed 'Nightshade,' a tool that enables artists to sabotage the use of their work in AI training through subtle pixel alterations; a conceptual sketch of the idea follows below.
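To make the mechanism concrete, here is a minimal, hypothetical sketch of the general idea behind this kind of data poisoning: optimize a small, bounded pixel perturbation so that an image's learned features drift toward an unrelated "anchor" image while the visible change stays small. This is not Nightshade's actual code or algorithm; the ResNet feature extractor, the MSE loss, the pixel budget `epsilon`, and the file names `art.png` and `anchor.png` are all illustrative assumptions.

```python
# Hypothetical sketch only: feature-space poisoning via a small, bounded
# pixel perturbation. Not Nightshade's implementation; the extractor,
# loss, budget, and file names are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in feature extractor: a pretrained ResNet-18 minus its classifier head.
resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()
for p in resnet.parameters():
    p.requires_grad_(False)
extractor = torch.nn.Sequential(*list(resnet.children())[:-1])

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def features(x: torch.Tensor) -> torch.Tensor:
    """Flatten the extractor's output into one feature vector per image."""
    return extractor(x).flatten(1)

# "art.png" (the artist's image) and "anchor.png" (an unrelated concept)
# are hypothetical file names.
art = preprocess(Image.open("art.png").convert("RGB")).unsqueeze(0).to(device)
anchor = preprocess(Image.open("anchor.png").convert("RGB")).unsqueeze(0).to(device)

epsilon = 0.03  # max per-channel pixel change: the perturbation budget
delta = torch.zeros_like(art, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=0.01)
target_feat = features(anchor)

for step in range(200):
    optimizer.zero_grad()
    poisoned = (art + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the anchor concept, so a
    # model trained on it associates the artist's pixels with the wrong thing.
    loss = torch.nn.functional.mse_loss(features(poisoned), target_feat)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)  # stay inside the perturbation budget

poisoned_image = (art + delta).clamp(0, 1).detach()
```

Nightshade's actual optimization, feature spaces, and perceptual constraints differ from this toy version; the sketch only conveys the overall shape of such an attack.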
Read at HackerNoon