Nightshade ‘poisons’ AI models to fight copyright theft
University of Chicago researchers have unveiled Nightshade, a tool designed to disrupt AI models attempting to learn from artistic imagery.
The tool – still in development – lets artists protect their work by subtly altering the pixels of their images, producing changes that are imperceptible to the human eye but confusing to AI models.
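The sketch below is only a conceptual illustration of that idea: a pixel change bounded so tightly it stays invisible to a viewer. Nightshade's real perturbations are optimized against model feature extractors rather than drawn at random, and the `perturb` function, `epsilon` bound, and random-noise approach here are illustrative assumptions, not the tool's actual method.

```python
# Conceptual sketch only: illustrates an "imperceptibly small" pixel change.
# Nightshade itself optimizes its perturbations against target models; this
# example just samples bounded random noise to show the scale involved.
import numpy as np

def perturb(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Return a copy of an 8-bit RGB image with a tiny, bounded perturbation.

    `epsilon` caps the per-pixel change (in 0-255 units). A real poisoning
    tool would compute the perturbation deliberately, not at random.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return np.round(poisoned).astype(np.uint8)

# Stand-in 64x64 RGB image: the perturbed copy differs from the original by
# at most `epsilon` per channel, far below what the eye can notice.
original = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
poisoned = perturb(original)
assert np.abs(poisoned.astype(int) - original.astype(int)).max() <= 2
```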
Many artists and creators have expressed concern over the use of their work to train commercial AI products without their consent.