Telegram has become one of the most powerful platforms for news distribution - but also one of the most controversial. With over a billion users, it’s where journalists in authoritarian states share classified documents, where activists organize protests, and where conspiracy theorists spread falsehoods to millions in seconds. The same tools that protect free speech also make it nearly impossible to stop illegal content. And as governments crack down, Telegram is caught in the middle.
How Telegram Moderates Content - And Why It’s Not Enough
Telegram doesn’t scan private messages. That’s by design. Secret Chats use end-to-end encryption, meaning even Telegram can’t see what’s sent. But public news channels? Those are different. They’re stored on Telegram’s servers. That’s why the platform can - and does - remove content from them.

Since January 2026 alone, Telegram has blocked nearly 7.5 million groups and channels. That’s not a typo. In February 2026, it blocked over 2 million in just over two weeks. On February 17, 2026, it took down 187,634 channels in a single day - the fourth-highest daily total that month. Most of these were linked to terrorism, scams, or child exploitation.

Telegram says it uses AI to detect harmful content, and it works. Cybercriminal channels that used to last months now get shut down in days. But here’s the problem: the same AI that blocks illegal content also flags legitimate journalism. A news channel reporting on political corruption in Belarus might get caught in the same net as a terrorist recruitment channel. There’s no public appeal process. No transparency report. No way to know if a channel was wrongly removed. That’s why journalists and human rights groups are worried.

The Child Protection Paradox
Telegram blocked over 12,000 pieces of child sexual abuse material (CSAM) in the first half of 2025. That’s thanks to reports from organizations like the National Center for Missing & Exploited Children and Stichting Offlimits. In one 37-day window in 2024, Stichting Offlimits alone sent over 17,000 reports - and every single one was acted on.

That sounds like progress. But here’s the catch: Telegram doesn’t proactively scan for this content. It waits for reports. That means if no one files a complaint, the material stays up. And because Telegram doesn’t verify user age, minors can join any public channel. A 13-year-old in Spain or Brazil can stumble onto a CSAM link through a search result or a forwarded message. There’s no parental control. No age gate. No warning.

This isn’t just a technical flaw - it’s a legal liability. France is investigating Telegram for failing to protect children. Australia fined the platform in 2023 for not removing illegal content. Spain is now pushing laws that would hold Telegram’s executives personally responsible. Pavel Durov says these rules turn countries into surveillance states. But he’s ignoring the fact that Telegram’s own architecture makes it impossible to monitor what it claims to protect.
Disinformation: The Invisible Threat
Unlike child abuse or malware, disinformation doesn’t have a clear legal definition. Is a false claim about vaccines illegal? What about a doctored video of a politician? Telegram doesn’t define it. It doesn’t label it. It doesn’t fact-check it. That’s why El País called Telegram “the preferred channel for disinformation spreaders.”

Public news channels can have hundreds of thousands of subscribers. A single post can go viral across continents. And because Telegram doesn’t require identity verification, anyone can create a channel pretending to be a real news outlet. There’s no blue check. No verification badge. No accountability.

In 2025, researchers found over 200 fake news channels on Telegram impersonating major media outlets. Some had more followers than the real ones. They spread false election results, fake health advice, and inflammatory propaganda. Telegram didn’t remove them - until users reported them. And even then, removals were slow.

This is the core tension: Telegram’s openness makes it a lifeline for free press. But that same openness makes it the perfect tool for lies.

Regulatory Pressure Is Mounting
Australia fined Telegram. France is investigating. Spain is drafting laws that could jail platform executives. Russia is preparing to block Telegram entirely on April 1, 2026. These aren’t random events. They’re signals. Governments are done waiting. They see Telegram’s hands-off approach as a threat to public safety. And they’re not just asking for change - they’re forcing it.

Telegram’s response? Resistance. Durov publicly mocked Spain’s proposed rules. But resistance doesn’t work when a country cuts off access to 8 million users. Russia’s planned block isn’t just about censorship - it’s a warning. If Telegram won’t comply, it will be erased.

The platform’s hybrid encryption model - cloud chats that can be moderated, Secret Chats that can’t - was meant to balance privacy and control. But in practice, it’s a loophole. Regulators can’t trust it. Journalists can’t rely on it. Users don’t understand it.
The Human Cost of Over-Moderation
When Telegram blocks a channel, it doesn’t tell you why. No explanation. No appeal. Just gone.

In 2025, a group of independent journalists in Ukraine used a Telegram channel to share real-time updates from the front lines. Their channel had 400,000 followers. One day, it vanished. No warning. No notice. Months later, they found out it was flagged as “potentially extremist” because it mentioned a banned political group - even though they were reporting on it neutrally.

This isn’t rare. Human rights groups report dozens of cases like this every year. A channel documenting police brutality in Brazil. A news site covering religious minorities in India. A whistleblower channel in Turkey. All removed without explanation. The AI doesn’t know context. It doesn’t understand journalism. It just matches keywords. And with millions of channels being scanned daily, mistakes are inevitable.

What’s Next? A Platform at a Crossroads
Telegram is at a turning point. It can’t keep resisting global pressure forever. At some point, it will have to choose: become a compliant platform with stricter rules - or become a ghost. If it chooses compliance, it risks losing the trust of journalists, activists, and dissidents who rely on its anonymity. If it chooses resistance, it risks being banned in dozens of countries - and losing its relevance.

The truth is, there’s no perfect solution. Free expression and safety don’t always fit together. But right now, Telegram isn’t trying to find a balance. It’s pretending one doesn’t exist.

For users, that means staying vigilant. Use the “Sensitive Content” filter. Restrict channel additions to contacts. Monitor who you follow. For journalists, it means using encrypted tools like Signal for sensitive communication - and treating Telegram like a public bulletin board, not a private diary. For regulators, it means demanding transparency. Not just blocking numbers - but appeal processes, audit trails, and clear standards. For Telegram? It’s time to stop pretending it can have it all.

Can Telegram be trusted to protect free speech while removing illegal content?
Telegram claims to protect free expression, but its moderation is opaque and inconsistent. It blocks millions of channels with no explanations, and its AI often removes legitimate journalism alongside illegal content. Without transparency, appeals, or oversight, users can’t trust that free speech is being protected - only that content is being removed at scale.
Why doesn’t Telegram verify user identities?
Telegram avoids identity verification to preserve user anonymity, which is critical for journalists, activists, and dissidents in repressive regimes. But this also makes it easy for bad actors to create fake channels, impersonate news outlets, and spread disinformation without consequences. The trade-off favors privacy over accountability.
Are Telegram’s AI moderation tools effective?
Yes - but with major flaws. Since early 2024, Telegram’s AI has dramatically increased the speed of takedowns, especially for criminal and terrorist content. However, it lacks context. It can’t distinguish between reporting on extremism and promoting it. This leads to false positives, where legitimate news channels are removed without recourse.
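The false-positive problem can be made concrete with a toy sketch. This is purely illustrative - nothing here reflects Telegram’s actual moderation pipeline, and the keyword list and function names are hypothetical - but it shows why matching on keywords alone cannot separate reporting about extremism from promoting it:

```python
# Illustrative only: a naive keyword-based flagger, NOT Telegram's real
# system. Both a promotional post and a neutral news report trip the
# same filter, because substring matching carries no notion of context.

BANNED_KEYWORDS = {"banned movement", "join our cell"}  # hypothetical terms

def flags_post(text: str) -> bool:
    """Flag a post if any banned keyword appears, regardless of context."""
    lowered = text.lower()
    return any(kw in lowered for kw in BANNED_KEYWORDS)

promotion = "Join our cell today and fight for the Banned Movement."
reporting = "Courts outlawed the Banned Movement; our reporters covered the ruling."

print(flags_post(promotion))  # True: genuinely promotional content
print(flags_post(reporting))  # True as well: neutral journalism is flagged too
```

Distinguishing the two requires context-aware classification (and, realistically, human review on appeal) - exactly the recourse the article notes is missing.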
Can parents monitor what their kids see on Telegram?
Very little. Telegram doesn’t offer parental controls. Secret Chats are encrypted and self-destructing. Deleted messages can’t be recovered. Parents can only see if the app is installed and how much time is spent on it - not what content is viewed. Schools often ban Telegram for this reason.
What happens if a country blocks Telegram?
If a government blocks Telegram, users can still access it through VPNs - but that’s not a long-term solution. Russia plans to block Telegram on April 1, 2026, which could cut off access for millions. This doesn’t stop the platform globally, but it signals a growing trend: governments are willing to cut off entire platforms rather than negotiate compliance.
Telegram’s future depends on whether it can evolve beyond its ideological stance. The world isn’t asking for perfect privacy - it’s asking for responsible governance. Right now, Telegram is choosing silence over accountability. And that’s a choice with real consequences.