
Telegram's Privacy vs. Safety Shift: What the New Rules Mean for Users

Regulatory Governance

For years, Telegram was the messaging app that promised absolute privacy, operating with minimal content moderation and strong encryption to protect users from surveillance. Founded by brothers Pavel Durov and Nikolai Durov, it attracted over 900 million active users who valued its ability to create anonymous groups and channels without interference. But that era of unchecked freedom is effectively over. The platform’s approach to balancing user rights with safety has undergone a radical transformation, driven by legal pressure, criminal exploitation, and high-profile enforcement actions.

This isn't just a minor tweak to terms of service. It represents a fundamental shift in how one of the world's largest communication platforms handles the tension between secrecy and accountability. If you rely on Telegram for secure communication, or if you are concerned about online safety, understanding these changes is critical. The landscape has changed, and the rules of engagement have been rewritten.

The Catalyst: From Permissive Haven to Regulatory Target

To understand why Telegram changed, we have to look at what happened before. Historically, Telegram maintained a much more permissive content policy than competitors like WhatsApp or WeChat. While WhatsApp and WeChat banned harmful content and political speech respectively, Telegram allowed a wider range of expression, arguing that censorship violated user autonomy. This stance made it a favorite for journalists, activists, and dissidents in authoritarian regimes. However, this same openness created a blind spot that cybercriminals quickly exploited.

FraudAction’s intelligence researchers documented how the platform became a hub for illicit activity. With over 900 million users, Telegram evolved into a marketplace for stolen databases, credit card information, and hacking techniques. Criminals used the app’s anonymity features to coordinate fraud rings and distribute illegal goods. The lack of robust content moderation meant that harmful material could spread with minimal oversight. This "privacy-first" design, while appealing to legitimate users, inadvertently enabled abuse at an industrial scale.

The turning point arrived on August 24, 2024, when CEO Pavel Durov was arrested outside Paris. This event sent shockwaves through the tech community and directly influenced the behavior of cybercriminals relying on the platform. Following his release and subsequent negotiations with authorities, Durov announced a major policy update on September 23, 2024. This announcement marked a decisive break from Telegram’s founding principles, signaling that the days of total operational opacity were ending.

The New Policy: Data Sharing and Legal Compliance

The September 2024 policy update introduced several key changes that fundamentally alter Telegram’s relationship with law enforcement. Most significantly, the platform now mandates sharing specific user data with authorities in response to valid legal requests. This includes disclosing IP addresses and phone numbers of users suspected of criminal activity or violations of Telegram’s terms of service.

Previously, Telegram rarely cooperated with law enforcement, citing privacy concerns. Now, the company explicitly states it will assist in combating illegal activity such as fraud, cybercrime, drug trafficking, and the distribution of child exploitation material. This shift acknowledges that absolute privacy can shield serious crimes from detection. By providing IP and phone data, Telegram aims to strike a balance: preserving message content encryption while enabling authorities to identify perpetrators involved in severe offenses.

Comparison of Messaging Platform Policies (Pre- vs. Post-2024)

| Feature | Telegram (Pre-Sept 2024) | Telegram (Post-Sept 2024) | WhatsApp |
| --- | --- | --- | --- |
| Data Sharing with Law Enforcement | Rarely disclosed; minimal cooperation | Mandatory disclosure of IP/phone for valid legal requests | Cooperates with legal requests |
| Content Moderation Approach | Highly permissive; limited proactive scanning | Enhanced reporting tools; stricter terms enforcement | Strict bans on harmful/illegal content |
| Group Admin Powers | Limited oversight capabilities | Admins can monitor/report abuse in private chats within groups | Standard admin controls |
| End-to-End Encryption | Default only for "Secret Chats" | Still default only for "Secret Chats" (unchanged) | Default for all messages |

This change does not mean Telegram is scanning every message. One-to-one "Secret Chats" remain end-to-end encrypted, meaning even Telegram cannot read their contents. However, cloud-based chats (the default) are stored on Telegram’s servers, allowing the company to comply with data requests regarding metadata and account ownership. This distinction is crucial for users who assume all their conversations are equally protected.
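The practical difference between the two chat types can be sketched in a toy model. This is an illustration only, not Telegram's actual MTProto protocol: the `Server`, `xor_encrypt`, and the one-time-pad-style cipher are all simplified stand-ins. The point it demonstrates is that in a cloud chat the server holds readable content plus metadata, while in an end-to-end chat the server relays ciphertext it cannot decrypt, yet the metadata (IP address, phone number) remains legible to the platform either way.

```python
# Toy model of cloud chats vs. end-to-end "Secret Chats".
# Illustration only -- NOT Telegram's real protocol or data model.
import secrets

def xor_encrypt(key: bytes, msg: bytes) -> bytes:
    # One-time-pad-style XOR, purely for demonstration.
    return bytes(k ^ m for k, m in zip(key, msg))

class Server:
    """Stand-in for the platform's storage layer."""
    def __init__(self):
        self.store = []  # records the platform retains

    def receive_cloud(self, ip, phone, plaintext):
        # Cloud chat: the server holds readable content AND metadata.
        self.store.append({"ip": ip, "phone": phone, "content": plaintext})

    def receive_secret(self, ip, phone, ciphertext):
        # Secret chat: the server relays ciphertext; only metadata is legible.
        self.store.append({"ip": ip, "phone": phone, "content": ciphertext})

msg = b"meet at noon"
key = secrets.token_bytes(len(msg))  # shared only between the two devices

server = Server()
server.receive_cloud("203.0.113.7", "+15550100", msg)
server.receive_secret("203.0.113.7", "+15550100", xor_encrypt(key, msg))

cloud, secret = server.store
assert cloud["content"] == b"meet at noon"       # readable by the platform
assert xor_encrypt(key, secret["content"]) == msg  # only the device key recovers it
# In BOTH cases the IP and phone number (metadata) are visible to the server --
# which is precisely the data the new policy allows sharing with authorities.
```

The takeaway: the September 2024 policy does not weaken encryption; it changes what the platform does with the metadata it was always able to see.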

Internal Moderation: Empowering Group Administrators

Beyond external legal compliance, Telegram also shifted its internal moderation structure. The new rules allow group administrators to monitor and report abuse within private chats associated with their groups. This is a significant departure from the platform’s earlier stance, which prioritized individual privacy above community management.

Why did this happen? Investigations revealed organized abuse networks using Telegram channels to distribute nonconsensual sexual material and child sexual abuse content. A European non-profit investigation uncovered nearly 25,000 users actively participating in these networks across Spain and Italy. These weren’t isolated incidents; they were structured ecosystems with recruitment patterns, payment systems (charging €5 monthly or €50 one-time fees), and operational security protocols.

By empowering admins, Telegram aims to give communities better tools to police themselves. Admins can now flag toxic behavior and unlawful content without, in principle, violating the encryption of direct messages. However, this move has raised concerns among privacy advocates. Critics argue that giving admins more power blurs the line between public forums and private communications, potentially chilling free speech and creating new vulnerabilities for targeted harassment.

The platform insists these tools are limited to addressing unlawful content and obscene language. Yet the perception matters. For many users, Telegram was founded on the principle of respecting privacy above all else. Introducing moderation mechanisms into group dynamics fundamentally alters that trust model.

[Image: Balance scale showing privacy vs. safety under EU regulations]

The Regulatory Pressure: EU Digital Services Act and Global Laws

Telegram’s policy shift didn’t happen in a vacuum. It was heavily influenced by evolving regulatory frameworks, particularly the EU Digital Services Act (DSA). As a Very Large Online Platform (VLOP) with over 45 million monthly active users in the EU, Telegram faces strict obligations under the DSA.

The DSA requires platforms to mitigate systemic risks, including the spread of illegal content and disinformation. It mandates transparency reports, independent audits, and effective mechanisms for users to report harmful material. Coinedition’s analysis notes that EU VLOP rules may impose audits and restrictions on monetized private channels, directly targeting the subscription-based abuse networks documented in Europe. Regulators view the monetization of abuse (via subscriptions or crypto payments) as a critical vector for exploitation.

Other jurisdictions are also tightening the net. In India, where Telegram has seen meteoric growth, authorities struggle with the app’s lack of a physical office and limited cooperation. Asia News Network highlights that this creates a regulatory gray area, making it difficult to combat piracy, drug trafficking, and extremist content. Drug traffickers specifically prefer Telegram because its encrypted messaging allows them to communicate with buyers without fear of interception. The new policy aims to close this gap by providing Indian and other international authorities with actionable data.

In the United States, lawmakers are examining whether additional legal requirements should apply to messaging platforms, focusing on child safety and national security. The global trend is clear: regulators no longer accept "we don’t know what’s happening on our platform" as a valid defense. Platforms are expected to take affirmative responsibility for preventing systematic abuse.

Child Safety and Parental Controls: A Critical Gap

One of the most pressing issues driving policy changes is child safety. Findmykids notes that Telegram’s native privacy and security settings do not include built-in parental controls. This lack of oversight exposes minors to unsolicited contact from strangers and to content classified as offensive, explicit, or illegal.

While Telegram offers some privacy settings (restricting who can see your phone number, limiting who can message you, controlling group invitations, and managing call accessibility), these require manual configuration. There are no age-based protections or automatic content filtering. For parents, this means they must either constantly monitor their child’s device or rely on third-party applications like Findmykids to overlay monitoring capabilities.

This gap is particularly dangerous given the rise of AI-nudifying tools that automate the creation of nonconsensual sexual imagery. These technologies exacerbate content abuse problems, complicating detection efforts for both platform administrators and law enforcement. Without native safeguards, children remain vulnerable to predators who exploit Telegram’s anonymity features. The new moderation tools for groups offer some relief, but they do not replace the need for comprehensive parental control features.

[Image: User adjusting privacy settings on a laptop in a dim room]

Stakeholder Perspectives: Who Wins and Who Loses?

The debate over Telegram’s new direction involves distinct stakeholder groups with conflicting priorities.

  • Privacy Advocates: They argue that strong encryption and limited moderation protect vulnerable populations from government surveillance and control. For journalists and activists in repressive regimes, Telegram remains a vital tool. Any reduction in privacy is viewed as a slippery slope toward state control.
  • Child Safety Organizations: Groups focused on protecting minors counter that privacy-first architectures enable systematic exploitation. They argue that platforms have an ethical obligation to prevent harm, even if it means sacrificing some anonymity. The documented abuse networks demonstrate the real-world consequences of unregulated spaces.
  • Law Enforcement Agencies: Police and investigative bodies emphasize their inability to solve serious crimes when platforms provide absolute privacy shielding. The new data-sharing policies provide a necessary pathway for investigations into fraud, trafficking, and terrorism.
  • Cybercriminals: FraudAction’s research indicates that the policy changes have already begun influencing criminal behavior. Some actors are migrating to more obscure platforms or adopting stricter operational security measures. However, the sheer size of Telegram’s user base ensures it remains a target.

Foresiet’s analysis characterizes the policy update as an attempt to strike a fine balance between protecting user privacy and combating criminal activity. Yet the practical effectiveness remains unproven. Will these measures substantially reduce abuse, or are they merely performative adjustments designed to satisfy regulators?

What This Means for You: Practical Steps

If you use Telegram, here’s how to navigate the new landscape:

  1. Understand Your Chat Type: Remember that only "Secret Chats" are end-to-end encrypted. Regular cloud chats are stored on Telegram’s servers and subject to data requests. Use Secret Chats for sensitive conversations.
  2. Adjust Privacy Settings: Go to Settings > Privacy and Security. Restrict who can see your phone number, add you to groups, or message you. This reduces your exposure to unsolicited contact and potential abuse.
  3. Be Cautious with Groups: Recognize that group admins now have enhanced monitoring powers. Avoid sharing highly personal information in large or unverified groups.
  4. Parental Oversight: If you’re a parent, consider using third-party monitoring tools since Telegram lacks native parental controls. Educate your children about the risks of interacting with strangers online.
  5. Stay Informed: Regulatory environments continue to evolve. Keep an eye on updates to Telegram’s Terms of Service and local laws affecting digital communications.

The era of Telegram as a lawless digital frontier is closing. The platform is moving toward a model that prioritizes safety and compliance, albeit at the cost of some privacy. Whether this balance is right depends on your perspective. For those seeking anonymity, the window is narrowing. For those demanding accountability, the changes are a step in the right direction. As of May 2026, this tension remains dynamic, with policy continuing to evolve in response to regulatory pressure and documented abuse patterns.

Did Telegram change its encryption policy after Pavel Durov's arrest?

No, Telegram did not change its core encryption technology. End-to-end encryption remains available only for "Secret Chats," just as before. However, the company updated its privacy policy to mandate sharing user metadata (like IP addresses and phone numbers) with law enforcement in response to valid legal requests. This affects account identification, not the decryption of message content in Secret Chats.

Can group admins read my private messages on Telegram now?

Group admins cannot read your one-to-one private messages with other users. However, the new rules allow admins to monitor and report abuse within private chats that occur *within* the context of their group (e.g., direct messages between group members). This is intended to help manage toxic behavior and illegal content, but it does not grant access to all your personal communications.

Why is the EU Digital Services Act important for Telegram?

The EU Digital Services Act (DSA) classifies Telegram as a Very Large Online Platform (VLOP) due to its large user base in Europe. This imposes strict obligations, including mitigating systemic risks, conducting independent audits, and ensuring transparency in content moderation. The DSA pressures Telegram to take more active steps against illegal content and abuse networks, driving the recent policy shifts.

Does Telegram have parental controls?

No, Telegram does not have built-in parental controls. Parents must manually adjust privacy settings (like restricting who can message or add their child to groups) or use third-party monitoring applications. The lack of native age-based protections or content filtering makes it essential for parents to stay vigilant about their children's usage.

How does Telegram compare to WhatsApp in terms of privacy?

WhatsApp uses end-to-end encryption by default for all messages, whereas Telegram applies end-to-end encryption only to "Secret Chats"; regular cloud chats are encrypted in transit and at rest but remain accessible to Telegram. WhatsApp cooperates with law enforcement via legal requests, similar to Telegram’s new post-2024 policy. However, Telegram historically offered more anonymity features (like anonymous groups), which it is now scaling back to comply with regulations.

What types of data can Telegram share with authorities now?

Following the September 2024 policy update, Telegram can share IP addresses and phone numbers associated with user accounts in response to valid legal requests. This data helps identify users involved in criminal activities such as fraud, drug trafficking, or distribution of illegal content. Message content in end-to-end encrypted Secret Chats remains inaccessible.

Is Telegram still safe for whistleblowers and journalists?

Telegram remains a valuable tool for whistleblowers and journalists, especially those using Secret Chats for end-to-end encryption. However, the increased data sharing and moderation tools mean it is less opaque than before. Users should be aware that metadata (IP, phone number) can be disclosed to authorities. For maximum security, combining Telegram with other operational security practices is recommended.