Top 3 Key Points:
- Telegram will now moderate private chats after CEO Pavel Durov’s arrest.
- The company quietly updated its FAQ page, removing the claim that private chats are not moderated.
- Durov has pledged to improve Telegram's security and moderation following criticism over illegal content on the platform.
Telegram has recently updated its policy regarding the moderation of private chats, following the arrest of its CEO, Pavel Durov, in France. The changes came to light after Telegram discreetly removed a statement from its FAQ page, which had previously assured users that private chats would not be subject to moderation. This shift in policy has raised concerns and speculation about the platform’s future stance on privacy.
The arrest of Durov was linked to allegations that the platform allowed criminal activities, including the distribution of illegal content, to go unchecked. French authorities have accused Telegram of enabling the spread of child abuse material and drug trafficking while refusing to cooperate with law enforcement. These charges have put pressure on the platform to reassess its policies.
In his first public statement since the arrest, Durov acknowledged the need for stricter content moderation. He emphasized that the platform’s rapid growth to over 950 million users made it more vulnerable to misuse by criminals. He also made it clear that improving Telegram’s security features and preventing illegal activities from flourishing on the platform are now top priorities.
Durov’s statement marks a significant shift from the company’s previous position. Telegram had long defended its hands-off approach, arguing that platform owners should not be held responsible for how users behave, and had earlier downplayed the allegations, insisting it had “nothing to hide.”
However, changes have already begun. The company’s FAQ section, which once stated that Telegram does not handle moderation requests related to private and group chats, has been altered. The updated response now informs users that Telegram apps include a “Report” feature, allowing users to flag illegal content to moderators for review. This marks a clear move toward increased moderation.
Telegram has long been a critical platform for communication, particularly in regions like Ukraine, where it serves as a vital source of information. However, the platform’s previous lack of content moderation has drawn criticism. As Telegram starts implementing these changes, users may see further developments in how the company handles privacy and security concerns.
“I’m still trying to understand what happened in France. But we hear the concerns. I made it my personal goal to prevent abusers of Telegram’s platform from interfering with the future of our 950+ million users. My full post below. https://t.co/cDvRSodjst”
— Pavel Durov (@durov), September 5, 2024