
"An AI image generator startup left more than 1 million images and videos created with its systems exposed and accessible to anyone online, according to new research reviewed by WIRED. The "overwhelming majority" of the images involved nudity and were "depicted adult content," according to the researcher who uncovered the exposed trove of data, with some appearing to depict children or the faces of children swapped onto the AI-generated bodies of nude adults."
"Multiple websites-including MagicEdit and DreamPal-all appeared to be using the same unsecured database, says security researcher Jeremiah Fowler, who discovered the security flaw in October. At the time, Fowler says, around 10,000 new images were being added to the database every day. Indicating how people may have been using the image-generation and editing tools, these images included "unaltered" photos of real people who may have been nonconsensually " nudified, " or had their faces swapped onto other, naked bodies."
""The real issue is just innocent people, and especially underage people, having their images used without their consent to make sexual content," says Fowler, a prolific hunter of exposed databases, who published the findings on the ExpressVPN blog. Fowler says it is the third misconfigured AI-image-generation database he has found accessible online this year-with all of them appearing to contain nonconsensual explicit imagery, including those of young people and children."
An unsecured database exposed more than one million images and videos created by an AI image generator, with the overwhelming majority depicting nudity and adult content. Some items appear to depict children or the faces of children swapped onto AI-generated nude bodies. Multiple websites, including MagicEdit and DreamPal, appeared to use the same unsecured database. Security researcher Jeremiah Fowler discovered the misconfiguration in October and observed roughly 10,000 new images being added daily. The exposed collection included unaltered photos of real people who may have been nonconsensually nudified, and Fowler says it is the third such misconfigured database he has found this year, all containing nonconsensual explicit imagery, including of young people and children.