A study suggests that Facebook's content moderation efforts are less impactful than previously believed. While the platform often touts the volume of harmful content it removes, the research shows that most users have already seen such posts before they are taken down. The finding calls for a reevaluation of how social media platforms measure effectiveness, proposing a shift from counting takedowns to counting the users prevented from encountering harmful content. The study has sparked discussion among civil rights groups and underscores ongoing public concern over content moderation.
"Content takedowns on Facebook just don't matter all that much, because of how long they take to happen," said Laura Edelson, an assistant professor of computer science at Northeastern University and the lead author of the paper in the Journal of Online Trust and Safety.
The researchers advocate a new metric: How many people were prevented from seeing a bad post by Facebook taking it down?
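The proposed metric can be sketched as a simple calculation: the views a takedown prevents are the post's projected lifetime views minus the views it had already accumulated when it was removed. The function name and the numbers below are illustrative assumptions, not figures from the study:

```python
def views_prevented(projected_total_views: int, views_before_takedown: int) -> int:
    """Estimate how many views a takedown prevented: projected lifetime
    views minus views already accumulated at removal time (floored at 0,
    since a late takedown cannot prevent views that already happened)."""
    return max(projected_total_views - views_before_takedown, 0)

# Hypothetical numbers: a post projected to reach 10,000 views is removed
# only after 9,500 people have seen it -- the takedown prevents just 500.
print(views_prevented(10_000, 9_500))  # prints 500
```

Under this framing, a fast takedown of a modestly viral post can matter more than a slow takedown of a widely shared one, which is the study's central point about why raw takedown counts mislead.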