
"Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform."
"Telegram said in a statement that it 'categorically denies Ofcom's accusations' and has virtually eliminated the public spread of CSAM through world-class detection algorithms."
"'Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,' said Suzanne Cater, director of enforcement at Ofcom."
"'Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day,' said Rani Govender, its associate head of policy."
Ofcom has launched an investigation into Telegram over concerns that child sexual abuse material (CSAM) is being shared on the platform. The regulator says it has gathered evidence suggesting CSAM is present and being shared, which would breach UK online safety laws requiring user-to-user services to prevent the spread of illegal content. Telegram denies the accusations, saying it has virtually eliminated the public spread of CSAM through its detection algorithms. Ofcom stresses that tackling child sexual exploitation and abuse is among its highest priorities, and the investigation is part of its broader enforcement of online safety regulations.
Read at www.bbc.com