The Kids Online Safety Act (KOSA) has re-emerged in the Senate with claims of protecting youth without censorship. But the bill requires platforms to mitigate a sweeping list of harms to minors, which incentivizes over-censorship of legal speech. Major tech firms can adapt; smaller platforms will struggle to comply with vague requirements. By creating liability for content, KOSA could push platforms to delete forums or discussions that host healthy conversations about mental health, suppressing essential, lawful speech online.
When the safest legal option is to delete a forum, platforms will delete the forum.
This bill won't bother big tech. Large companies have the resources to manage this regulation, which is why some of them have agreed to support it.
To avoid liability, platforms will over-censor. It's not merely hypothetical. It's what happens when speech becomes a legal risk.
The list of harms in KOSA's 'duty of care' provision is so broad and vague that no platform can know whether any given piece of content triggers liability.