
Cultural Considerations for News Tone and Moderation on Telegram

Digital Media

When you open Telegram to check the news, you don’t just see headlines. You see a live pulse of communities: some trustworthy, some chaotic, some dangerously biased. Unlike Facebook or Twitter, Telegram doesn’t push you content based on what you’ve liked. It doesn’t algorithmically sort your feed. Instead, you join channels. You follow people. And what you see depends almost entirely on whom you choose to follow, and where those people are from.

This is why cultural context matters more on Telegram than on any other platform. A news post that feels urgent and truthful in Kyiv might seem inflammatory in Jakarta. A tone that’s seen as responsible in Berlin could be read as aggressive in Lagos. And the moderators? Often a single person running a channel from a bedroom, trying to balance truth, speed, and safety with no rulebook.

How Telegram’s Design Shapes News Tone

Telegram was built for speed, privacy, and control. It doesn’t auto-curate. It doesn’t hide posts. If you’re in a news channel, you get everything-raw footage, unverified claims, live updates from war zones, and opinions dressed as facts. This freedom makes it powerful. But it also makes it dangerous.

Journalists and citizen reporters use Telegram because it’s fast. When a protest breaks out in Sudan or a power plant explodes in Ukraine, Telegram channels are often the first to report it. But speed comes at a cost. Without editorial oversight, tone becomes a weapon. A headline like “They burned our homes” might be accurate in one context, but in another, it’s designed to stir rage. The difference isn’t in the facts-it’s in the culture.

Research from Israeli Telegram news groups during the 2023-2025 Israel-Hamas conflict showed something surprising: some channels were more ethical than mainstream outlets. They corrected errors publicly, cited sources, and avoided dehumanizing language. Others? They removed subscribers for “not speaking nicely”-a chilling example of how moderation can become censorship disguised as order.

Who Decides What’s Offensive?

There’s no global standard for what counts as hate speech, incitement, or misinformation. In one country, calling a leader corrupt is journalism. In another, it’s a crime. Telegram leaves moderation to channel owners. That means a group in the Netherlands might ban someone for sharing a conspiracy theory about vaccines, while a similar group in Brazil lets it stay because the narrative aligns with local distrust of government.

And here’s the problem: Telegram doesn’t tell you how those decisions are made. No transparency. No appeals process. No cultural guidelines. Just a rule that says: “Don’t promote violence or terrorism.” Everything else? Up to you.

That’s why the same post can be flagged in Paris but ignored in Manila. A video of a police officer using force might be seen as evidence of abuse in the U.S., but as necessary order in Singapore. Moderators don’t have training in cross-cultural communication. They don’t have access to local context. They’re just guessing.

Three cities display the same news post with wildly different tones, illustrating cultural interpretation.

The Algorithm That Doesn’t Exist-But Still Pushes Extremes

Telegram says its "Similar Channels" feature only suggests content based on what you’ve already joined. But studies show otherwise. The Southern Poverty Law Center analyzed 28,000 public channels and found that users searching for harmless topics-like cooking or tech gadgets-were being funneled into extremist spaces. A person who joined a channel about electric cars ended up in a white nationalist group. Someone following celebrity gossip got pushed into antisemitic conspiracy channels.

This isn’t accidental. It’s structural. Telegram’s recommendation engine doesn’t use AI to filter. It uses engagement. If a channel gets shares, replies, or forwards-even if it’s hate-filled-it gets promoted. The platform doesn’t care about intent. It only cares about activity.
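The dynamic described above can be illustrated with a toy ranking function. This is a hedged sketch, not Telegram’s actual algorithm: the channel data, field names, and score weights here are all hypothetical. The point is only that a ranker fed nothing but activity signals will surface the most active channel, whatever its content.

```python
# Toy model of engagement-only recommendation (illustrative, NOT
# Telegram's real system): channels are ranked purely by activity,
# with no content or intent signal anywhere in the pipeline.

def engagement_score(channel: dict) -> float:
    """Score a channel by raw activity: shares, replies, forwards."""
    return channel["shares"] + channel["replies"] + 2 * channel["forwards"]

def recommend(channels: list[dict], top_n: int = 3) -> list[str]:
    """Return the most 'engaging' channels, regardless of what they post."""
    ranked = sorted(channels, key=engagement_score, reverse=True)
    return [c["name"] for c in ranked[:top_n]]

channels = [
    {"name": "ev_news",       "shares": 120, "replies": 300,  "forwards": 40},
    {"name": "hate_channel",  "shares": 900, "replies": 2500, "forwards": 800},
    {"name": "local_weather", "shares": 15,  "replies": 40,   "forwards": 5},
]

# The highest-activity channel surfaces first, whatever its content.
print(recommend(channels, top_n=2))  # → ['hate_channel', 'ev_news']
```

Nothing in `engagement_score` can distinguish a hate-filled channel from a cooking channel; any filter would have to be added as a separate, deliberate step.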

And because Telegram’s user base spans over 800 million people across dozens of languages and belief systems, this system amplifies the worst. A far-right channel in Germany can gain traction in Austria, Switzerland, and even parts of Latin America-all through shared language, not shared values.

When Moderation Becomes Cultural Erasure

Telegram’s official stance is that it doesn’t monitor private chats. But it does remove millions of public posts daily. The problem? It doesn’t explain why.

Some moderators delete posts that question official narratives-whether it’s about vaccines, elections, or wars. In authoritarian states, that might be necessary. But in democracies, it looks like suppression. In India, a channel that criticized a new law was shut down. In Canada, a group discussing Indigenous land rights was flagged for “hate.” Both had valid context. Neither was given a fair review.

When platforms apply one-size-fits-all rules, they erase nuance. A phrase that’s slang in Nigeria might be an insult in Sweden. A gesture that’s respectful in Japan might be offensive in Brazil. Telegram’s moderation teams, mostly based in Europe and North America, don’t have the cultural fluency to understand these differences. So they delete. And the result? Trust erodes. People stop believing the platform is fair.

Volunteer moderators from Ukraine, Indonesia, and Nigeria use culturally tailored tools to combat misinformation.

What’s Changing-and What’s Not

After pressure from France and other governments, Telegram made some changes. It removed the “People Nearby” feature, which scammers used to target victims. It stopped letting users upload media to standalone blogs. It also updated its FAQ, admitting it does monitor illegal content in public channels-even if it won’t look inside private chats.

But these are technical fixes, not cultural ones. Pavel Durov still believes free speech should come before control. He says most users are law-abiding. He’s right. But he ignores that the loudest, most extreme voices often drown out the quiet, responsible ones.

There’s a middle ground. Some Telegram channels are proving it. In Ukraine, news groups now have volunteer fact-checkers who tag unverified posts with “Pending Verification.” In Indonesia, moderators use local dialects to explain why certain claims are false. In Nigeria, community leaders run training sessions for new channel admins on ethical reporting.

These aren’t perfect. But they’re real. They show that culture isn’t a barrier to moderation-it’s the foundation.

The Way Forward: Local Rules, Global Tools

Telegram won’t become a news network with editors. But it can become a platform that empowers local moderation. Imagine this:

  • Channel owners can select their region and language to unlock culturally aware moderation tools.
  • Community leaders in each country can help define what counts as harmful in their context.
  • Telegram provides AI that flags potential violations-but human moderators from the same culture review them.
  • Users can see why a post was removed, with a link to local norms or laws.

This isn’t fantasy. It’s how WhatsApp handles misinformation in India-with local fact-checking networks. It’s how Signal supports end-to-end encryption while letting users report abuse. Telegram could do the same.

Right now, it’s a wild west. But it doesn’t have to be. The tools are there. The people are there. What’s missing is the will to listen.