Bluesky ramps up content moderation as millions join the platform
Briefly

"The surge in new users has brought with it concomitant growth in the number of tricky, disturbing, and outright bizarre edge cases that the trust and safety team must contend with."
"In all of 2023, Bluesky had two confirmed cases of CSAM posted on the network. It had eight confirmed cases on Monday alone."
"We're triaging this large queue so the most harmful content such as CSAM is removed quickly. With this significant influx of users, we've also seen increased spam, scam, and trolling activity."
"Bluesky's bolstering of its human workforce supplements what is often a complex and confusing world of automatic, AI-powered content moderation."
Read at Mashable