On August 24th, Pavel Durov, the billionaire Russian founder of Telegram, a messaging app and social network, was detained after his private plane, arriving from Baku, Azerbaijan, landed outside Paris. At first, the action was greeted with shock: the C.E.O. of a major tech company, with nearly a billion users, appeared to have been punished for the misdeeds of his platform, rather as if Mark Zuckerberg had been arrested for the misinformation published on Facebook. Elon Musk quickly posted in support of Durov on X, framing the arrest as a violation of the principles of free speech. (In 2022, after his acquisition of Twitter, Musk fired much of the company’s content-moderation staff.) But when the charges against Durov were made public, last Wednesday, the six items on the list appeared deeply consequential: they included accusations of complicity in drug trafficking and in the dissemination of child pornography, as well as of unlawfully “providing cryptology services.” (Durov’s lawyer has called the allegations against his client “absurd.”)
Telegram, founded in 2013, combines the features of a messaging app and a social network, with broadcasting functions that allow users to reach hundreds of thousands of people at once. The app has developed a reputation for robust privacy and security, reflecting values that Durov has often promoted on X, where he has more than two and a half million followers. On its Web site, Telegram notes that, because its servers are scattered around the world, “we can ensure that no single government or block [sic] of like-minded countries can intrude on people’s privacy and freedom of expression.” Its lax content-moderation policies have helped to make it a haven for users who might not otherwise be able to post freely, including dissidents, brokers of stolen personal data, child pornographers, American right-wing extremists, and members of the Islamic State. (After Durov’s arrest, Telegram said that its content moderation meets industry standards and that it abides by E.U. laws.)
Despite the platform’s public image, however, cybersecurity experts have long known that Telegram’s encryption is far weaker than that of its competitors. Many messaging apps, such as the U.S.-based Signal, or Meta’s WhatsApp, use a security protocol known as end-to-end encryption, which insures that only the direct parties to any communication can read its content. Communication on Telegram is not end-to-end encrypted by default; in theory, this means that any illegal activity on the platform should be easy enough for the company to monitor and report to law enforcement. It’s possible to turn on end-to-end encryption while using Telegram, but the setting is hard to find and both chat participants have to opt in.
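The core idea of end-to-end encryption can be sketched with a toy Diffie-Hellman key exchange: the two endpoints derive a shared secret key while the relaying server sees only public values. This is a deliberately simplified illustration, not Telegram’s or Signal’s actual protocol; the parameters below are far too small for real security, and production apps use vetted designs such as the Signal protocol.

```python
import hashlib
import secrets

# Toy Diffie-Hellman exchange (illustrative only; insecure parameters).
P = 2**127 - 1  # a Mersenne prime, used as a toy-sized modulus
G = 3           # generator

# Each party picks a private value and publishes only G^private mod P.
alice_private = secrets.randbelow(P - 2) + 2
bob_private = secrets.randbelow(P - 2) + 2
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# A server relays alice_public and bob_public, but recovering the
# private values from them is computationally infeasible at real sizes.

# Each endpoint combines its own private value with the other's public
# value, arriving at the same shared secret independently.
alice_key = hashlib.sha256(str(pow(bob_public, alice_private, P)).encode()).hexdigest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_private, P)).encode()).hexdigest()

print(alice_key == bob_key)  # True: only the two endpoints hold this key
```

Because the shared key never travels over the wire, a server that merely relays ciphertext has nothing useful to hand over; that is the property Telegram’s default mode lacks.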
David Thiel, the chief technologist at the Stanford Internet Observatory, who has studied the online ecosystem surrounding child-sexual-abuse material (otherwise known as CSAM), told me, “To the extent that they are marketing user privacy and free speech, it’s not about their tech; it’s about their behavior.” Instead of making it technically impossible to spy on users, as many apps do, Telegram only promises not to monitor too closely. Evidence suggests that, in many cases, the company is true to its word. Thiel’s team has monitored the platform for hashes—the strings of characters that serve as fingerprints of a file’s contents—of known CSAM and found matches in, for example, QAnon-related groups. “Our systems flagged things in those groups, then nothing seems to happen to them,” Thiel said. He continued, “The fact that we were able to detect that content means that they would be able to detect it perfectly fine as well.”
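The kind of hash matching Thiel describes can be sketched in a few lines. This is a generic illustration using SHA-256, with invented placeholder data, not the Stanford team’s or any platform’s actual tooling; real detection systems typically use perceptual hashes, such as PhotoDNA, so that re-encoded copies of an image still match.

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    """Return a hex digest that identifies the file's exact contents."""
    return hashlib.sha256(data).hexdigest()

# A platform maintains a set of fingerprints of known prohibited files.
# The file contents here are invented placeholders for illustration.
known_hashes = {file_fingerprint(b"example prohibited file contents")}

def is_known(data: bytes) -> bool:
    """Flag an uploaded file if its fingerprint matches a known file."""
    return file_fingerprint(data) in known_hashes

print(is_known(b"example prohibited file contents"))  # True: exact match
print(is_known(b"something else entirely"))           # False: no match
```

The point of Thiel’s remark is that this check requires no decryption at all: anyone who can see the files, as Telegram can for its default chats, can compare their fingerprints against a list of known material.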
Beyond content-moderation concerns, Durov’s arrest is a sign that governments around the world are growing more alarmed by what they see as digital platforms’ outsized power. The U.S. is years into its effort to ban TikTok, fearing data leaks to Chinese authorities and the app’s ability to influence American youth. The European Union is rolling out a series of laws that protect users’ rights to their own data and regulate digital marketplaces, curtailing Apple’s App Store, for example. In 2023, following deadly attacks on schools, Brazil temporarily banned Telegram after it refused to divulge data from neo-Nazi groups. On Saturday, the country also suspended X after the company refused to name a local legal representative, the latest turn in the government’s long-running dispute with the platform over disinformation.
In a time of global conflict, the digital services that the world uses to communicate take on more importance than ever. Durov’s arrest comes as Telegram is being used by all sides of the war in Ukraine, including both the Russian military, which uses it for battlefield communications, and Russian civilians, who turn to it as one of their main sources of relatively uncensored news. The Russian military’s dependence on Telegram as a crucial part of its communications infrastructure would seem to have changed the government’s attitude toward its founder: Durov left Russia in 2014, after the Federal Security Service reportedly pressured him to sell his stake in his first social-network company. Now the Russian government is loudly decrying his arrest.
Tech companies are “looking at a very volatile geopolitical environment,” Meredith Whittaker, the president of Signal, told Wired in an interview last week. As a result, they may find themselves newly confronted with more stringent regulation, which would fundamentally change their businesses—and, potentially, their users’ capabilities. The experts I spoke with affirmed that Telegram markets itself inaccurately and fails to guarantee users’ privacy, but they were also concerned that the French government may end up overreaching in its prosecution of the platform. Matthew Green, a professor at Johns Hopkins University who studies cryptography, told me that issues like CSAM can provide a justification for more scrutiny—but, he continued, “I also feel, underneath that, there is a motivation that is more ‘We need to control these platforms.’ ” In particular, there is a worry that governments may soon begin to actively prevent more companies from offering privacy-protecting technologies. (The Spanish government is already considering banning end-to-end encryption entirely.) “I’m hoping they’re just coming up with a bunch of charges and assuming some would drop,” Thiel, the Stanford researcher, said of France’s charges against Durov. But, he added, if French officials succeed in prosecuting encryption, “I would be pretty concerned.”
Telegram is an outlier in the landscape of enormous social networks; the company has reportedly employed around fifty staff members in total, whereas Facebook hires and contracts with thousands of workers for content moderation alone. It is conspicuously not invested in policing its platform. But Durov’s predicament has sparked concern among fellow tech executives that their own platforms’ shortcomings may soon be in government crosshairs. Shortly after Durov’s arrest, Mark Zuckerberg sent a letter to the House Judiciary Committee chairman, Jim Jordan, complaining that the Biden Administration had “pressured our teams for months to censor certain COVID-19 content,” which Zuckerberg now saw as “wrong.” The letter seemed like an attempt to court Republicans concerned about “woke” influence and to preëmpt the kind of criticism that took down Durov: We already try to censor our content, maybe even too much. Musk and Zuckerberg may be right to be concerned. Fines and suspensions are much more likely punishments than outright arrests; nevertheless, the landscape they operate in has changed. For a long time, governments did little to resist the growing power of globalized social networks and messaging apps. It seems that is starting to change. ♦