Content moderation: key facts to learn from Facebook, Instagram, X, and TikTok transparency reports
Briefly

In addition to general statistics on their platforms, such as user numbers, these documents offer unprecedented insight into the resources that Facebook, Snapchat, TikTok, and others devote to moderating illegal, hateful, or fraudulent content.
How can this discrepancy be explained? In its transparency report, the American company Meta specifies that its workforce outside the EU includes additional moderators who can step in during a peak of activity in the region, working in English, French, or Spanish. Meta also highlights the efficiency of its artificial intelligence (AI) moderation tools, something TikTok does not emphasize. The rate of automated moderation is very high at Meta: at Facebook and Instagram, 94% and 98% of decisions, respectively, are made by machines, far more than the 45% reported by TikTok.
Read at Le Monde.fr