"Big tech has built machines designed for one thing: to hold your attention. The algorithms don't care what keeps you scrolling. It could be puppy videos or conspiracy theories about election fraud. They only care that you keep consuming. And unfortunately nothing keeps people engaged quite like rage. The executives at these companies will tell you they're neutral platforms, that they don't choose what content gets seen. This is a lie."
"For years, these companies have hidden behind Section 230 protections while operating more like media companies than neutral platforms. They've used recommendation algorithms to actively shape what billions of people see every day, then claimed they bear no responsibility for the consequences. It's like a newspaper publisher claiming they're not responsible for what appears on their front page because they didn't write the articles themselves."
Big technology companies design their algorithms to maximize user attention and engagement. Recommendation systems prioritize whatever keeps people scrolling, regardless of truth or societal harm, and because content that provokes anger or outrage generates the strongest engagement, it is amplified the most. Platform decisions about what to recommend function as editorial judgment rather than neutral facilitation, yet these companies have relied on legal protections while behaving like publishers and resisting accountability. The algorithms not only amplify existing divisions but can create new polarization by surfacing emotionally provocative content. Changing platform incentives could instead prioritize accuracy, community, and wellbeing over profit.
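To make the incentive argument concrete, here is a minimal sketch of how ranking by raw engagement naturally promotes outrage, and how a reweighted objective changes the feed. Everything in it is illustrative: the Post fields, scores, and weights are assumptions for the sake of the example, not any real platform's system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # expected watch time / clicks, 0..1 (illustrative)
    outrage_score: float         # how emotionally provocative, 0..1 (illustrative)
    accuracy_score: float        # estimated factual reliability, 0..1 (illustrative)

def engagement_rank(post: Post) -> float:
    # The incentive described above: rank purely by expected engagement.
    # Outrage-heavy posts tend to score high here, so they rise to the top.
    return post.predicted_engagement

def reweighted_rank(post: Post, accuracy_weight: float = 0.5) -> float:
    # One possible "changed incentive": blend engagement with accuracy
    # and penalize pure outrage bait. The weights are arbitrary.
    return ((1 - accuracy_weight) * post.predicted_engagement
            + accuracy_weight * post.accuracy_score
            - 0.3 * post.outrage_score)

feed = [
    Post("Measured policy explainer", 0.40, 0.10, 0.90),
    Post("Rage-bait conspiracy clip", 0.95, 0.95, 0.10),
    Post("Puppy video",               0.70, 0.05, 0.80),
]

print([p.title for p in sorted(feed, key=engagement_rank, reverse=True)])
# Rage-bait ranks first: it wins on raw engagement.
print([p.title for p in sorted(feed, key=reweighted_rank, reverse=True)])
# The reweighted objective demotes it below the puppy video and the explainer.
```

The point of the sketch is that nothing about the second ranking is technically harder than the first; what differs is only the objective the platform chooses to optimize.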