
This new data poisoning tool lets artists fight back against generative AI

Zhao admits there is a risk that people might put the data poisoning technique to malicious use. However, he says attackers would need thousands of poisoned samples to inflict real damage on larger, more powerful models, which are trained on billions of data samples.
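To put those scales in perspective, here is a minimal back-of-envelope sketch in Python. The specific figures are hypothetical round numbers chosen only to match the magnitudes mentioned above ("thousands" of poisoned images versus "billions" of training samples); they are not taken from the research.

```python
# Illustrative arithmetic (hypothetical numbers, not from the paper):
# why a handful of poisoned images is diluted in a large training set.
poisoned_samples = 5_000            # "thousands" of poisoned images
training_set_size = 5_000_000_000   # "billions" of training samples

fraction = poisoned_samples / training_set_size
print(f"Poisoned fraction of training data: {fraction:.8f}")
print(f"Roughly 1 poisoned sample per {training_set_size // poisoned_samples:,} clean ones")
```

At these assumed scales the poisoned data makes up about one millionth of the training set, which is why mounting a damaging attack on a frontier-scale model would take far more samples than an individual artist protecting their own portfolio would ever upload.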

“We don’t yet know of robust defenses against these attacks. We haven’t yet seen poisoning attacks on modern [machine learning] models in the wild, but it could be just a matter of time,” says Vitaly Shmatikov, a professor at Cornell University who studies AI model security and was not involved in the research. “The time to work on defenses is now,” Shmatikov adds.

Gautam Kamath, an assistant professor at the University of Waterloo who researches data privacy and robustness in AI models and wasn’t involved in the study, says the work is “fantastic.” 

The research shows that vulnerabilities “don’t magically go away for these new models, and in fact only become more serious,” Kamath says. “This is especially true as these models become more powerful and people place more trust in them, since the stakes only rise over time.” 

A powerful deterrent

Junfeng Yang, a computer science professor at Columbia University who has studied the security of deep-learning systems and wasn't involved in the work, says Nightshade could have a big impact if it makes AI companies respect artists' rights more—for example, by being more willing to pay out royalties.

AI companies that have developed generative text-to-image models, such as Stability AI and OpenAI, have offered to let artists opt out of having their images used to train future versions of the models. But artists say this is not enough. Eva Toorenent, an illustrator and artist who has used Glaze, says opt-out policies require artists to jump through hoops and still leave tech companies with all the power. 

Toorenent hopes Nightshade will change the status quo. 

“It is going to make [AI companies] think twice, because they have the possibility of destroying their entire model by taking our work without our consent,” she says. 

Autumn Beverly, another artist, says tools like Nightshade and Glaze have given her the confidence to post her work online again. She previously removed it from the internet after discovering it had been scraped without her consent into the popular LAION image database. 

“I’m just really grateful that we have a tool that can help return the power back to the artists for their own work,” she says.
