
"For many, these platforms are more than entertainment; they are the primary gateway to news and information. The problem is that the content they see every day is far from accurate or neutral. Every post, video, or story is filtered through opaque systems designed to maximize engagement. These systems push emotional, sensational, and often misleading content to the top, making false or biased information spread faster, while nudging users into filter bubbles that narrow their worldview."
"Educators and policymakers often point to algorithmic literacy as the solution. I was one of those voices. The idea is straightforward: If young people understand how algorithms select, prioritize, and promote content, they can better navigate their news environment. Because they are "digital natives," the hope is that such education will be both intuitive and effective. But my recent study, published in the Harvard Kennedy School Misinformation Review, complicates this optimism."
The study surveyed 348 Americans ages 18 to 25, measuring both algorithmic knowledge and online behavior. Young adults who understood how algorithms use data, what incentives drive their design, and the ethical consequences of their use showed greater awareness of risks such as the amplification of misinformation and filter bubbles. Yet greater algorithmic knowledge also correlated with a lower likelihood of correcting misinformation or seeking out diverse perspectives on social media. The study labels this phenomenon algorithmic cynicism: the belief that individual action is futile against massive, profit-driven platforms. Despite heightened risk awareness, algorithmic cynicism can suppress corrective behaviors and weaken efforts to broaden information exposure.
Read at Nieman Lab