European Commission: TikTok's addictive design breaches EU law | Computer Weekly

"The preliminary decision outlines how addictive design features on the platform, such as infinite scroll and autoplay, are resulting in users going into "autopilot mode", with the EC stating this may lead to "compulsive use". The DSA, which sets out rules for online services used by European citizens, is designed to strengthen consumer rights and consumer choice, while also minimising the risk of harm."
"TikTok is one of the 17 companies defined as Very Large Online Platforms under the act, which means it has to comply with the most stringent rules of the DSA because the size of its user base means there is greater potential for systemic harms to occur. In its ruling, the EC stated that TikTok had failed to implement reasonable and effective measures to mitigate risks from its addictive design features, arguing that minors and vulnerable adults are at particular risk of harm."
"The EC's investigation also revealed that TikTok's risk assessment had not adequately addressed how its design features and dark patterns could cause harm to the physical and mental health of its users. On the protective measures that are in place, including screen time management and parental control tools, the EC noted they "do not seem to effectively reduce the risks stemming from TikTok's addictive design" due to being easy to dismiss or overlook."
"At this stage, the Commission considers that TikTok needs to change the basic design of its service. For instance, by disabling key addictive features such as 'infinite scroll' over time, implementing effective 'screen time breaks', including during the night, and adapting its recommender system," it said."
The European Commission has preliminarily found that TikTok's addictive design features breach the Digital Services Act. Mechanisms such as infinite scroll and autoplay push users into 'autopilot mode' and may lead to 'compulsive use'. The DSA requires platforms to assess risks to users, including negative effects on children's mental health, and to present those assessments to the Commission. As a designated Very Large Online Platform, TikTok must meet the DSA's most stringent obligations. The Commission's preliminary view is that TikTok has failed to implement reasonable and effective mitigation measures, leaving minors and vulnerable adults particularly at risk, and that existing tools such as screen-time management and parental controls do not effectively reduce those risks because they are easy to dismiss or overlook.
Read at ComputerWeekly.com