
Balancing Transparency and Safety in Telegram News Groups

Digital Media

Telegram news groups have become one of the most powerful tools for real-time information sharing - but they’re also a breeding ground for misinformation, illegal activity, and coordinated harm. For years, Telegram marketed itself as the ultimate privacy app: no ads, no tracking, no cooperation with governments. But everything changed after Pavel Durov was arrested in France in August 2024. That moment forced Telegram to rethink its entire approach to safety and transparency - and the results are reshaping how millions use the platform every day.

What Changed After Durov’s Arrest?

Before August 2024, Telegram claimed it couldn’t share user data even if asked. That wasn’t true. The platform had always had the technical ability to access IP addresses and phone numbers - it just refused to do so unless forced. After Durov’s arrest, Telegram quietly updated its Privacy Policy. Now, if a court order proves someone broke the rules - say, by running a channel that sells stolen credit cards or shares child abuse material - Telegram will hand over their IP and phone number. It’s not a blanket policy. It’s targeted. But it’s real.

This shift didn’t come out of nowhere. Australia’s eSafety Commissioner fined Telegram AU$957,780 for failing to respond to a transparency notice issued in 2024. While other platforms like Meta and Google submitted their reports on time, Telegram waited over five months. That delay cost them money - and credibility. The fine wasn’t just a penalty. It was a warning: if you want to operate globally, you can’t ignore the law.

How Many Users Are Being Reported?

The numbers speak louder than policy updates. In the first quarter of 2024, Telegram gave out data on just 5,826 users worldwide in response to legal requests. By Q1 2025, that number had jumped to 22,777 - a 291% increase in a single year.
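A quick back-of-the-envelope check of that percentage, using only the figures quoted above:

    # Growth in users affected by Telegram data disclosures,
    # based on the Q1 2024 and Q1 2025 figures quoted above.
    q1_2024 = 5_826
    q1_2025 = 22_777

    increase = (q1_2025 - q1_2024) / q1_2024 * 100
    print(f"Increase: {increase:.0f}%")  # -> Increase: 291%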

India led the charge, with over 9,000 requests targeting nearly 10,000 users. The U.S. saw 576 requests for 1,664 users. These aren’t random numbers. They’re the result of coordinated efforts by law enforcement agencies cracking down on fraud rings, terrorist networks, and cybercriminal groups that used Telegram’s public channels to plan attacks, recruit members, and distribute illegal content.

Telegram’s own transparency bot - @transparency - now publishes quarterly reports. But here’s the catch: you need a Telegram account to see them. And even then, you only see data from your own region. That means someone in Brazil can’t verify what’s happening in Russia. Independent researchers can’t audit the full picture. It’s transparency with a blindfold on.

AI Moderation Is Here - But It’s Limited

On September 23, 2024, Durov announced on his public channel that Telegram had deployed a dedicated AI team to clean up search results. "All the problematic content we identified in Search is no longer accessible," he wrote. That sounds impressive. But here’s what he didn’t say: this only applies to public search. If you’re part of a private news group with 50,000 members, and someone posts a link to stolen bank logs, Telegram won’t automatically detect it.

Why? Because Telegram doesn’t police private chats and groups the way it polices public content. Regular chats and groups aren’t end-to-end encrypted - only Secret Chats are - but Telegram has never scanned private conversations, and that hands-off stance makes automated moderation there effectively nonexistent. The platform now claims it can remove illegal content from public channels and search indexes. But what about the thousands of hidden groups where scams, weapon sales, and extremist propaganda still thrive? There’s no public data on how many of those have been shut down.

And then there’s the risk of false positives. If an AI flags a news group about political dissent in a repressive country as "violent," could that lead to innocent activists being targeted? Telegram hasn’t explained how appeals work - or if they even exist.

[Image: A world map with glowing data request lines connecting countries to a cracked Telegram logo, revealing a privacy-safety balance scale.]

The Rise of the Underground Exodus

Telegram’s crackdown didn’t just scare off bad actors - it pushed them out. Cyberint, a threat intelligence firm, tracked discussions on dark web forums where group admins started warning members: "Telegram is getting too hot. Move to Signal. Move to Discord. Move to Matrix." Some groups vanished entirely. Others migrated to less-monitored platforms.

This is a win for safety - but only if you believe the goal is to make illegal activity harder to find. The problem? These groups didn’t disappear. They just went deeper underground. And now, they’re using platforms with even less oversight, fewer reporting tools, and zero transparency reports.

Telegram’s move may have cleaned up its public face, but it didn’t solve the problem. It just moved it.

Privacy vs. Safety: Who Wins?

The core tension hasn’t gone away. Telegram used to be the go-to app for journalists, activists, and whistleblowers in authoritarian regimes. People trusted it because it refused to hand over data - even under pressure. Now, that same platform is turning over user info to governments in India, the U.S., and beyond.

For some, that’s a relief. Parents worried about child exploitation, journalists covering organized crime, and local police fighting fraud all benefit from better cooperation. But for others - protesters in Iran, dissidents in China, LGBTQ+ communities in Uganda - this shift is terrifying. If your phone number and IP address can now be handed to authorities on the strength of a court order, what happens when a government decides to silence critics under the guise of "safety"?

There’s no easy answer. You can’t have total privacy and total safety at the same time. Telegram is trying to walk that line - and so far, it’s stumbling.

[Image: An activist typing on a laptop with Telegram open, surrounded by news clippings about platform changes and safety notes.]

What’s Next for Telegram?

The regulatory pressure isn’t slowing down. California’s AB 3211 bill, backed by Microsoft and Adobe, wants all platforms to label AI-generated content - including deepfakes shared in news groups. The EU’s Digital Services Act is watching closely. Australia’s eSafety Commissioner has already said they’ll keep fining platforms that don’t comply.

Telegram’s new transparency reports are a step forward - but they’re not enough. Without independent audits, without public access, without clear rules on how moderation decisions are made, users are left guessing. Are you safe? Or are you just being watched?

For now, Telegram’s strategy is simple: cooperate enough to avoid more fines, but not so much that users flee. It’s a tightrope walk. And the rope is getting thinner.

What Should Users Do?

If you run or participate in a Telegram news group:

  • Assume anything you post - even in a "private" group - can be traced.
  • Use end-to-end encrypted chats (Secret Chats) for sensitive conversations. Regular cloud chats and group chats aren’t end-to-end encrypted, so don’t treat them as safe.
  • Report illegal content through the official in-app button. Don’t rely on third-party tools.
  • Be cautious about joining new groups with no verification. Many are scams or honeypots.
  • Check the @transparency bot quarterly. See what data Telegram is sharing - and who’s asking for it. (A scripted way to pull the report is sketched below.)

There’s no perfect solution. But awareness is your best defense.
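If you’d rather not dig through the app every quarter, the report can in principle be pulled programmatically. The sketch below uses the third-party Telethon library and assumes the @transparency bot answers a plain /start message with text or an attached report file - the bot’s commands and reply format aren’t documented and may change. API_ID and API_HASH are placeholders you’d create yourself at my.telegram.org.

    # Minimal sketch: pull the latest report from Telegram's @transparency bot.
    # Requires: pip install telethon
    # API_ID / API_HASH are placeholders - create your own at https://my.telegram.org.
    # Assumes the bot replies to /start with text and/or a report file; this is
    # not a documented interface and may change at any time.

    import asyncio
    from telethon import TelegramClient

    API_ID = 123456             # placeholder
    API_HASH = "your_api_hash"  # placeholder

    async def fetch_transparency_report() -> None:
        async with TelegramClient("transparency_check", API_ID, API_HASH) as client:
            # Ask the bot for the current report.
            await client.send_message("@transparency", "/start")
            await asyncio.sleep(5)  # give the bot a moment to answer

            # Look at the bot's most recent replies.
            async for msg in client.iter_messages("@transparency", limit=5):
                if msg.text:
                    print(msg.text)
                if msg.document:
                    # Some bots attach the report as a file; save it if present.
                    path = await msg.download_media(file="telegram_transparency_report")
                    print(f"Saved report to {path}")

    if __name__ == "__main__":
        asyncio.run(fetch_transparency_report())

Treat this as an illustration of the workflow rather than a supported interface; the reliable path is still opening the bot inside the official app.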

Does Telegram really scan private messages for illegal content?

No. Telegram does not scan the content of private one-on-one chats or private groups - even large ones. Only public channels, search results, and files uploaded to public storage are monitored by AI. The platform can hand over metadata such as IP addresses and phone numbers if legally required, but it says it does not read private chats - and end-to-end encrypted Secret Chats it cannot read at all.

Can law enforcement track me just by joining a Telegram news group?

Not directly. Merely joining a group doesn’t trigger data sharing. But if you post illegal content - like links to stolen data, child abuse material, or bomb-making instructions - and someone reports it, Telegram may be legally required to hand over your IP address and phone number. That’s how they trace users now.

Why did Telegram remove "People Nearby"?

The "People Nearby" feature allowed users to connect with others in their physical location - a feature abused by predators, scammers, and stalkers. After the 2024 policy changes, Telegram replaced it with "Businesses Nearby," which only shows verified, registered businesses. This was a direct response to safety concerns and regulatory pressure.

Are Telegram transparency reports trustworthy?

They’re better than nothing, but limited. The reports only cover data requests for the viewer’s own region, and you need a Telegram account to see them at all. Independent researchers can’t verify global trends. Human Rights Watch and other watchdogs have noted that while the data is accurate, the lack of public access and third-party auditing reduces its credibility as a full accountability tool.

Is Telegram safer now than it was in 2023?

Publicly? Yes. Illegal content in search results and public channels has dropped sharply. But privately? No. The same tools that helped criminals hide in 2023 are still there - just harder to find. The real change is that Telegram now responds to law enforcement. That makes it safer for society - but riskier for users who rely on it for privacy.