Will Telegram Add Algorithmic News Recommendations? Here’s What’s Really Happening

Digital Media

Telegram has always bragged about being different. No algorithmic feeds. No endless scroll. Just channels you choose, in the order they were posted. That’s what made it feel like a private club for serious news, deep dives, and niche communities. But now, with over 1 billion users and billions of messages flying through it every day, that model is cracking. The question isn’t whether Telegram will introduce algorithmic recommendations; it’s when, and at what cost.

Telegram’s "No Algorithm" Claim Doesn’t Match Reality

Telegram’s official line is simple: we don’t recommend content. Users follow channels. They see posts in chronological order. That’s it. But behind the scenes, something else is happening. In mid-2023, Telegram quietly rolled out a "similar channels" feature. Type in "Donald Trump," and you’re shown channels about QAnon. Search for "UK riots," and the top result is an Adolf Hitler meme. That’s not a coincidence. That’s an algorithm at work, even if Telegram calls it "topic-based."

A 2024 study by the Southern Poverty Law Center analyzed over 28,000 Telegram channels and found a disturbing pattern: users browsing neutral topics were being funneled toward extremist content. Neo-Nazi groups, antisemitic forums, and MAGA conspiracy channels were appearing as "similar" to legitimate news sources. Even channels run by U.S. politicians like Marjorie Taylor Greene were recommending Steve Bannon and Lin Wood. Telegram says this is just keyword matching. Experts say that’s like calling a fire alarm just a loud noise: it ignores the fact that it’s going off in the wrong places.

Why Telegram Can’t Ignore Algorithms Anymore

Telegram’s business model is under pressure. It makes money through Sponsored Messages: ads in big channels, with revenue split between Telegram and channel admins. But right now, those ads pay only $0.02 to $0.05 per thousand views, compared with TikTok’s $2.50 to $5.00. Why the gap? Because advertisers pay more when they know exactly who’s seeing their ads. Telegram’s chronological feed doesn’t give them that. Advertisers want targeting. They want engagement data. They want to know if someone watched the whole video, clicked a link, or shared it. Telegram doesn’t collect that data, because it claims it doesn’t need to.

But the numbers don’t lie. Sponsored Messages revenue jumped 320% from Q4 2024 to Q1 2025. In-app purchases hit $11.66 million in just two months in 2024. And Telegram’s job listings now include 15 openings for "Recommendation Algorithm Engineers" that require deep learning experience. This isn’t a rumor. This is hiring. They’re building the system.

The User Divide: Control vs. Discovery

Telegram users are split. On one side, there are the purists. They love the lack of algorithmic manipulation. On Reddit, users like u/PrivacyAdvocate99 say, "On Instagram I’m trapped in a bubble. On Telegram, I control what I see." Trustpilot reviews show 68% of positive feedback highlights "no algorithmic feed" as the top reason they stick with Telegram.

But there’s another group: 72% of negative reviewers on Trustpilot, and 57% of users in a December 2024 survey, want better discovery. They’re tired of manually hunting for new channels. They want Telegram to help them find quality news, not just the channels they already follow. One user on Reddit, u/NewsBuff2025, put it bluntly: "I follow legitimate news channels but keep getting recommended conspiracy theory channels under 'similar channels.' It’s like the algorithm is trying to radicalize me."

The irony? Telegram’s current system is worse than a true algorithm. It doesn’t understand context. It doesn’t know the difference between a political debate and a hate group. It just matches keywords. A channel about "Ukraine war analysis" gets paired with "Ukraine war conspiracy." That’s not smart. That’s sloppy. And it’s dangerous.
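To see why pure keyword matching behaves this way, here is a minimal sketch of the kind of word-overlap similarity described above. The channel names and the Jaccard-overlap rule are illustrative assumptions, not Telegram’s actual implementation:

```python
# Hypothetical sketch of naive keyword-overlap "similar channels" matching.
# Channel names and the similarity rule are illustrative only.

def keyword_similarity(title_a: str, title_b: str) -> float:
    """Jaccard overlap of lowercase title words: no context, no intent."""
    a, b = set(title_a.lower().split()), set(title_b.lower().split())
    return len(a & b) / len(a | b)

channels = ["Ukraine war conspiracy", "Cooking with herbs", "Ukraine war maps"]
query = "Ukraine war analysis"

# Rank every channel by raw word overlap with the query.
ranked = sorted(channels, key=lambda c: keyword_similarity(query, c), reverse=True)
print(ranked[0])  # → Ukraine war conspiracy
```

Both "Ukraine war conspiracy" and "Ukraine war maps" score an identical 0.5 against the query, because word overlap carries no notion of credibility or intent; whichever the index happens to list first wins.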

[Image: Hand scrolling the Telegram app while extremist images reflect in the screen.]

Regulators Are Knocking

Telegram’s legal troubles are piling up. In August 2024, founder Pavel Durov was arrested in Paris over allegations that Telegram was used for drug trafficking, money laundering, and distributing extremist content. French authorities didn’t just blame users; they blamed Telegram’s systems. The EU’s Digital Services Act forced Telegram to implement automated moderation tools by January 2025. Durov admitted they now use "AI-powered tools" to take down millions of pieces of content daily. That’s a big step. But it’s also a sign they’re already using machine learning, just not for recommendations.

Gartner predicts that by late 2025, all major platforms will be legally required to explain how their algorithms work. Telegram can’t hide behind "we don’t have one" forever. Either they build a transparent, auditable recommendation system, or they risk being shut down in Europe, the U.S., and beyond.

What a Real Algorithmic Feed Would Look Like

Telegram won’t copy TikTok. It can’t. Its users wouldn’t stand for it. But it doesn’t need to. A hybrid model is coming. Industry analysts at eMarketer predict a "discovery toggle" by September 2025, similar to Twitter’s "For You" vs. "Following" switch. Default: chronological. Option: algorithmic.
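A discovery toggle of that shape could be sketched roughly like this. The class, field names, and the recency/quality blend are assumptions for illustration, not a real Telegram API or eMarketer’s design:

```python
# Illustrative sketch of a feed "discovery toggle": chronological by default,
# ranked only when the user opts in. All names and weights are assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    channel: str
    timestamp: int        # unix seconds
    quality_score: float  # 0..1, from some hypothetical upstream model

def build_feed(posts: list[Post], discovery_enabled: bool = False) -> list[Post]:
    if not discovery_enabled:
        # "Following" mode: strictly newest-first, no re-ranking.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    # "For You" mode: blend recency with the quality signal.
    newest = max(p.timestamp for p in posts)
    def score(p: Post) -> float:
        recency = 1.0 - (newest - p.timestamp) / 86_400  # linear decay over a day
        return 0.5 * recency + 0.5 * p.quality_score
    return sorted(posts, key=score, reverse=True)

posts = [Post("quality_news", 100, 0.9), Post("clickbait", 300, 0.0)]
print([p.channel for p in build_feed(posts)])        # newest first
print([p.channel for p in build_feed(posts, True)])  # quality promoted
```

The key design point is that the chronological path does nothing clever at all: the contract users signed up for stays the default, and ranking is strictly opt-in.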

If they do this right, the algorithm could work differently than others. Instead of rewarding outrage, it could reward depth. It could learn that a user who reads 10-minute analytical posts about economics prefers long-form journalism over viral clips. It could surface quality news channels from Ukraine, Nigeria, or Brazil that users haven’t found yet-without pushing them toward extremists.

But that requires real engineering. Not keyword matching. Not topic clusters. Real machine learning trained on ethical signals: content quality, source credibility, user feedback, and moderation reports. Telegram’s engineering team has the scale; they handle 100 billion messages daily. Now they need the will.
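As a toy illustration of scoring on those four signals, here is a minimal sketch. The weights and field names are assumptions chosen for clarity; a production system would learn them from data rather than hard-code them:

```python
# Minimal sketch of ranking on the "ethical signals" named above: content
# quality, source credibility, user feedback, and moderation reports.
# Weights and field names are illustrative assumptions only.

def channel_score(signals: dict) -> float:
    """Combine normalized (0..1) signals; moderation reports count against."""
    positive = (
        0.4 * signals["content_quality"]
        + 0.3 * signals["source_credibility"]
        + 0.2 * signals["user_feedback"]
    )
    penalty = 0.1 * signals["moderation_reports"]
    return max(0.0, positive - penalty)

credible = {"content_quality": 0.9, "source_credibility": 0.95,
            "user_feedback": 0.8, "moderation_reports": 0.05}
flagged = {"content_quality": 0.6, "source_credibility": 0.2,
           "user_feedback": 0.7, "moderation_reports": 0.9}

print(channel_score(credible) > channel_score(flagged))  # → True
```

Unlike engagement-driven ranking, a heavily reported channel gets pushed down here even if its raw engagement numbers are strong, which is the substance of the "ethical signals" argument.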

[Image: Glowing neural network connecting news channels to extremist groups through keyword bridges.]

The Risk: More Extremism, Less Trust

The biggest danger isn’t that Telegram will become like Facebook. It’s that it will become worse. Because unlike Facebook or TikTok, Telegram has no history of responsible algorithm design. It’s never had to moderate at scale. Its moderation team is small. Its tools are reactive. Its leadership still denies the problem.

If they slap on a recommendation engine without fixing those flaws, they’ll turn their platform into a radicalization engine. The SPLC’s Heidi Beirich warned in February 2025: "Any algorithmic system Telegram implements must have radically better safeguards than current platforms." That’s not a suggestion. That’s a warning.

And if they fail? The backlash won’t just come from regulators. It’ll come from users. The same people who chose Telegram for its privacy and control will leave. The platform’s 4.3/5 rating on Trustpilot could collapse. The "no algorithm" brand, its biggest asset, will be gone.

The Choice Ahead

Telegram stands at a fork. One path leads to monetization, growth, and global relevance, but at the cost of its soul. The other leads to irrelevance, declining revenue, and regulatory crackdowns. Neither is easy.

But here’s the truth: the algorithm is already here. It’s just hidden. And it’s broken. The real question isn’t whether Telegram will adopt algorithmic recommendations. It’s whether they’ll fix their system before it turns their platform into a weapon.

The next 12 months will decide if Telegram remains a haven for free speech-or becomes the most dangerous social network on earth.