The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) being shared.
Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.
Under current law, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, as well as mechanisms to tackle it - or risk huge fines for breaches.
Telegram said in a statement that it categorically denies Ofcom's accusations. The company claimed that since 2018, it has virtually eliminated the public spread of CSAM on its platform through advanced detection algorithms and cooperation with non-governmental organizations.
It added: "We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
The investigation is part of a wider crackdown by Ofcom on services it suspects could be flouting the UK's sweeping online safety requirements, including toughened-up rules for tech firms to tackle CSAM, which is illegal to possess or share in the UK.
"Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities," said Suzanne Cater, director of enforcement at Ofcom.
She added that while there had been progress in tackling CSAM on smaller services, the issue extends to big platforms too.
Children's charity the NSPCC welcomed Ofcom's probe into Telegram, stating that recent research revealed around 100 child sexual abuse image offences are being recorded by police every day. "The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram," said Rani Govender, the charity's associate head of policy.
Ofcom said it launched its probe into Telegram after being contacted by the Canadian Centre for Child Protection over the alleged presence and sharing of CSAM on the messaging app. The regulator is also investigating other platforms including Teen Chat and Chat Avenue for potential grooming risks.
"Teen-focused chat services are too easily being used by predators to groom children," Cater said. "These firms must do more to protect children, or face serious consequences under the Online Safety Act." The Act requires user-to-user services to prove they are tackling priority illegal content, including CSAM, terrorism, grooming, and extreme pornography.
Ofcom has issued several fines to providers accused of failing to comply with its illegal content duties. The regulator has the authority to levy fines of up to £18 million or 10% of global revenues, whichever is greater, for non-compliance. However, its rules have met resistance from some firms.
Despite that resistance, Ofcom reported that one file-sharing service it contacted had made significant improvements to comply with its duties on illegal content.