Microsoft joins coalition to scrub revenge and deepfake porn from Bing
Briefly

Microsoft has partnered with StopNCII to help remove non-consensual intimate images (NCII), including AI-generated deepfakes, from its Bing search engine, helping victims regain their privacy.
The collaboration relies on digital fingerprints (hashes) created on victims' own devices, so the intimate images themselves never need to be uploaded; industry partners can then detect and remove content matching those hashes.
While several tech companies, including Meta, TikTok, and Reddit, have joined StopNCII's initiative, Google's absence raises concerns about a fragmented approach to addressing non-consensual image sharing.
The US government is also responding, with proposals such as the NO FAKES Act introduced to protect victims from the harms of deepfakes.
Read at Engadget