Nightshade, the tool that 'poisons' data, gives artists a fighting chance against AI | TechCrunch
Briefly

"We're showing the fact that generative models in general, no pun intended, are just models. Nightshade itself is not meant as an end-all, extremely powerful weapon to kill these companies," Zhao said. "Nightshade shows that these models are vulnerable and there are ways to attack. What it means is that there are ways for content owners to provide harder returns than writing Congress or complaining via email or social media."
"There is a right way of doing this," he continued. "The real issue here is about consent, is about compensation. We are just giving content creators a way to push back against unauthorized training."
Read at TechCrunch