Algorithms on popular social media platforms are delivering unsolicited pornographic material into children’s feeds. Seventy percent of respondents aged 16–21 reported having seen pornography, with average first exposure at age 13, and more than a quarter exposed by age 11. Eight of the top ten sources of pornographic content identified were social media or networking sites, with X accounting for 45% of encounters — a higher share than dedicated pornographic websites. Snapchat (29%), Instagram (23%), TikTok (22%) and YouTube (15%) were also frequently cited. Accidental exposure rose to 59%, up from 38% in 2023. Platforms face duties under the Online Safety Act to prevent algorithmic recommendations of harmful content and to implement age-assurance measures.
"children are viewing harmful content due to algorithms used by platforms, rather than actively searching it out themselves"
"Under the Online Safety Act and the child safety duties, platforms are required to stop their algorithms from recommending harmful content. This, coupled with age assurance measures, aims to protect children in the online world. The algorithms should filter out harmful content from reaching children."