Free Nightshade Tool 'Poisons' AI Models to Stop Art Theft | Entrepreneur
Briefly

"Nightshade v1.0: a cutting-edge tool released by computer scientists at the University of Chicago that gives artists a digital shield to guard their creations against unwanted AI consumption. Nightshade embeds pixel-level alterations imperceptible to the human eye within an artwork, but its tweaks effectively serve as a hallucinogenic 'poison' for AI, causing it to misinterpret the content entirely. Pictures of pastoral scenes might suddenly be recognized by AI as fashionable accessories; for example, a cow becomes a leather purse."
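To make the mechanism concrete, here is a minimal illustrative sketch of what a "pixel-level alteration imperceptible to the human eye" can look like in code. This is not Nightshade's actual algorithm (which uses a targeted optimization against model feature spaces); it only demonstrates the general idea of a small, bounded per-pixel change that a viewer would not notice. The function name `poison_image` and the `epsilon` budget are assumptions for this sketch.

```python
import numpy as np

def poison_image(image, epsilon=2):
    """Add a small, bounded perturbation to an 8-bit RGB image array.

    `epsilon` caps the per-pixel change, keeping the edit invisible to
    the eye while still shifting the pixel values a model would ingest.
    (Illustrative only; Nightshade's real perturbations are optimized,
    not random.)
    """
    rng = np.random.default_rng(0)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Clip so values stay valid 8-bit pixels after the perturbation.
    poisoned = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
    return poisoned

# A flat gray 4x4 test image stands in for an artwork.
image = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = poison_image(image)
max_change = int(np.abs(poisoned.astype(int) - image.astype(int)).max())
print(max_change)  # stays within the epsilon budget of 2
```

The key property, which Nightshade shares, is the tight bound on each pixel: the human eye cannot distinguish the poisoned copy from the original, but the altered values still flow into any model trained on the scraped image.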
"Many artists, including Kelly McKernan - a plaintiff in the highly publicized copyright infringement class-action lawsuit against AI art firms, including Midjourney and DeviantArt - have welcomed Nightshade with open arms, per the outlet. However, critics denounce the tool as a veiled attack on AI models and companies, with one going so far as to call it 'illegal' hacking."
Read at Entrepreneur