
With Nightshade, artists might have a chance to fight back against AI

Nightshade is a tool that “poisons” the data used to train AI models, eventually corrupting the model and causing it to generate incorrect outputs.

The AI vs artists battle is just getting started. With generative AI on the rise, artists are worried about tech companies using their work to train generative AI models.

Since the start of the year, several artists, record labels, musicians, and entertainers have filed lawsuits against companies like ChatGPT maker OpenAI and Stability AI, the maker of Stable Diffusion, alleging that their work was used to train AI models without their permission.

However, a new tool called “Nightshade,” developed by researchers at the University of Chicago, might give artists a chance to fight back against text-to-image generators like Stable Diffusion and DALL-E and deter them from scraping artists’ work.

How does Nightshade work?

According to MIT Technology Review, Nightshade works by feeding false information to the AI model, eventually corrupting it so that it learns the wrong names for objects. This technique is known as ‘poisoning’.

Since AI models like DALL-E and Midjourney are trained on vast numbers of images, most of them scraped from the internet, tampering with enough of those images can push the model toward wrong outputs.

Nightshade exploits this by making small changes to an image that are invisible to the human eye but cause the AI algorithms to identify it incorrectly.

While a small sample of poisoned images may not have any effect on the output, if thousands of artists start using Nightshade before uploading their work, it could render the generative AI model unusable.
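Nightshade’s real perturbations are carefully optimized so that a model trained on the altered image associates it with a different concept. The toy Python sketch below only illustrates the general shape of such a poisoning step, assuming a hypothetical attacker who adds a small, bounded pixel change and pairs the result with a deliberately wrong caption; the poison_image helper, the EPSILON budget, and the random stand-in image are illustrative assumptions, not part of the actual tool.

# Illustrative sketch only, not the actual Nightshade algorithm: a toy example
# of a data-poisoning step in which a small, bounded pixel perturbation is added
# to an image and the result is paired with a deliberately wrong caption, so a
# text-to-image model trained on the pair learns the wrong association.
import numpy as np

EPSILON = 4  # hypothetical per-pixel change budget (out of 255), small enough to be hard to see

def poison_image(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Add a small random perturbation, clipped so pixel values stay valid (0-255)."""
    noise = rng.integers(-EPSILON, EPSILON + 1, size=image.shape)
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
dog_photo = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)  # stand-in for a real photo

poisoned = poison_image(dog_photo, rng)
training_pair = {"image": poisoned, "caption": "a photo of a cat"}  # mismatched label on purpose

# Each pixel moves by at most EPSILON, so the picture still looks unchanged to a person.
print("max pixel change:", int(np.abs(poisoned.astype(int) - dog_photo.astype(int)).max()))

The real tool would replace the random noise above with an optimized perturbation that steers the model’s internal representation of the image toward the target concept, which is what makes the poisoned samples hard to spot and filter out.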

To test the tool, researchers poisoned photos of dogs with Nightshade so that Stable Diffusion would register them as cats. After feeding in 50 poisoned images and prompting the model to generate a picture of a dog, the output was an animal with too many limbs. After 300 poisoned samples, Stable Diffusion generated an image that looked like a cat.

Researchers say the method Nightshade uses is hard to defend against, since developers would need to find and remove each poisoned image manually, which is nearly impossible because the changes the tool makes are invisible to the human eye.

The group that developed Nightshade is considering integrating it with Glaze, another tool designed to protect an artist’s work from AI.

Source: indianexpress.com
