Artists' fears about style mimicry by image-generation models are well founded: the protections currently in place provide only a false sense of security against misuse.
Our findings show that even simple techniques such as image upscaling can significantly undermine these protections, leaving artists exposed to style reproduction.
Existing protections against style mimicry, which rely on adversarial perturbations, can be easily circumvented, underscoring the need for more effective solutions.
Our user study shows that current approaches fail to provide artists with reliable protection, highlighting the need for novel defensive strategies.
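To make the upscaling attack concrete, here is a minimal, illustrative sketch (not the paper's exact pipeline) of why simple resampling can wash out adversarial perturbations: a 2x2 box-average downscale acts as a low-pass filter that suppresses the high-frequency noise these protections rely on, while roughly preserving the underlying image content. The gradient test image, noise level, and helper names below are all hypothetical, chosen only for demonstration.

```python
import numpy as np

def downscale_2x(img: np.ndarray) -> np.ndarray:
    """Average non-overlapping 2x2 blocks (simple box filter / low-pass)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upscale_2x(img: np.ndarray) -> np.ndarray:
    """Nearest-neighbour upscaling by pixel repetition."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def purify(img: np.ndarray) -> np.ndarray:
    """Round-trip resampling: a crude stand-in for an upscaler-based attack."""
    return upscale_2x(downscale_2x(img))

rng = np.random.default_rng(0)
# Smooth synthetic "artwork": an 8x8 vertical gradient.
clean = np.linspace(0.0, 1.0, 8)[:, None] * np.ones((8, 8))
# High-frequency "protective" perturbation, as added by perturbation-based tools.
perturbed = clean + rng.uniform(-0.5, 0.5, size=clean.shape)

err_before = np.abs(perturbed - clean).mean()
err_after = np.abs(purify(perturbed) - clean).mean()
```

Because the box average pools four independent noise samples, the perturbation's magnitude shrinks substantially while the smooth content survives, so `err_after` comes out well below `err_before` here; real attacks use learned super-resolution models rather than this naive resampling, but the intuition is the same.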