
Business

UK regulator probes Telegram over child sexual abuse concerns

Teenagers pose for a photo while holding smartphones in front of a Telegram logo in this illustration taken September 11, 2025. REUTERS/Dado Ruvic/Illustration

21 Apr 2026 05:02PM (Updated: 21 Apr 2026 11:13PM)

LONDON, April 21: Britain's communications regulator, Ofcom, launched an investigation on Tuesday into the Telegram messaging app after evidence suggested child sexual abuse material was being shared on the platform.

The probe is part of UK efforts to crack down on children being exposed to harm online without clear accountability. While the country's 2023 Online Safety Act has set tougher standards for social media platforms such as Facebook, YouTube and TikTok, Prime Minister Keir Starmer wants them to go further.

The government has been consulting on a potential social media ban for children under 16, and Starmer met last week with social media company executives where he asked them to take more responsibility.

Ofcom said it had received evidence from the Canadian Centre for Child Protection regarding the alleged sharing of child sexual abuse material on Telegram, and had carried out its own assessment of the platform.

"In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content," Ofcom said in a statement.

Telegram said it "categorically" denied Ofcom's accusations, adding that since 2018 it had "virtually eliminated" the public spread of child sexual abuse material on its platform through detection algorithms.

"We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy," the Dubai-based company said in a statement.

Telegram was fined in February by Australia's online safety regulator for delaying answering questions about measures taken to prevent the spread of child abuse and violent extremist material.

'NOT ENOUGH' ACTION TO PREVENT HARM

The Internet Watch Foundation, a British nonprofit that has been working with Telegram to help the company identify and remove harmful material, said there was more to do.

"We share concerns that bad actor networks are operating across Telegram’s ecosystem, and that not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed," it said in a statement.

Ofcom said on Tuesday it had also opened investigations into Teen Chat and Chat Avenue to examine whether they were meeting their duties to protect children from the risk of being groomed by predators.

Teen Chat and Chat Avenue did not respond to Reuters requests for comment.

Ofcom said that after engagement with the companies, it was not satisfied that they were providing adequate protection to British children from the risk of grooming.

"These firms must do more to protect children, or face serious consequences under the Online Safety Act," Suzanne Cater, director of enforcement at Ofcom, said. Last October, Ofcom fined U.S. internet forum site 4chan 20,000 pounds ($27,020) for failures under the new rules.

($1 = 0.7402 pounds)

Source: Reuters