The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) being shared.
Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.
Under the current law, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, as well as mechanisms to tackle it - or risk huge fines for breaches.
Telegram said in a statement that it "categorically denies" Ofcom's accusations. "Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with non-governmental organisations," it told the BBC.
The company added: "We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
This investigation is part of a wider crackdown from Ofcom on services it suspects could be flouting the UK's sweeping online safety requirements - including toughened-up rules for tech firms to tackle CSAM, which is illegal to possess or share in the UK.
Children's charity the NSPCC welcomed Ofcom's Telegram probe. "Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day," said Rani Govender, its associate head of policy.
Ofcom launched its probe into Telegram after being contacted by the Canadian Centre for Child Protection regarding alleged CSAM sharing on the app.