
We have an old saying in behavioral therapy: Whatever you permit, you promote. If we apply this to our use of social media, we might recognize that if we permit our devices to deliver a certain type of media and content, we are then, in a way, promoting that this type of media and content continues to be delivered. To wit, if I persistently watch political reels on TikTok, Instagram, and Facebook, the algorithm that underlies these content-delivery platforms will learn my behavior and continue to deliver this type of content. Taken a step further, if I watch primarily partisan political content, these platforms will deliver only this type of content in the future. This culture of data and habit tracking has created what we might call an "algorithmic society," where most everything we do on our devices becomes a data point for future content delivery.
Algorithm-driven platforms learn users' viewing habits and reinforce similar content, creating feedback loops that narrow exposure. This algorithmic filtering mirrors psychological filtering, in which attention fixates on selected aspects of a situation while nuance is ignored. Continuous delivery of partisan or otherwise narrow content can contribute to anxiety, a diminished sense of agency, and an incomplete understanding of events. Habitual data collection thus constructs an "algorithmic society" in which device interactions become predictors of future content. Deliberately curating one's feeds and consuming content mindfully can reduce these negative influences and restore broader perspective and agency in digital environments, while education about how algorithms work and active control of preferences empowers users to interrupt harmful patterns.
Read at Psychology Today