Several hundred trust and safety jobs at TikTok in the UK and south and south-east Asia face cuts as part of a global reorganisation. Work will be reallocated to other European offices and third-party providers, while some roles will remain in the UK. The company is increasing its reliance on automated moderation, with more than 85% of removed content identified by automation. The cuts coincide with new UK online safety rules requiring age checks and enabling fines of up to £18m or 10% of global turnover. Unions warn that replacing human moderators with AI could put user safety at risk.
The viral video app said several hundred jobs in its trust and safety team could be affected in the UK, as well as south and south-east Asia, as part of a global reorganisation. Their work will be reallocated to other European offices and third-party providers, with some trust and safety jobs remaining in the UK, the company said. It is part of a wider move at TikTok to rely on artificial intelligence for moderation.
More than 85% of the content removed for violating its community guidelines is identified and taken down by automation, according to the platform. The cuts come despite the recent introduction of new UK online safety rules, which require companies to introduce age checks on users attempting to view potentially harmful content. Companies can be fined up to £18m or 10% of global turnover for breaches, whichever is greater.
TikTok, which is owned by the Chinese tech group ByteDance, employs more than 2,500 staff in the UK. Over the past year, TikTok has been cutting trust and safety staff across the world, often substituting workers with automated systems. In September, the company fired its entire team of 300 content moderators in the Netherlands. In October, it then announced it would replace about 500 content moderation employees in Malaysia as part of its shift towards AI.