The head of Telegram, Pavel Durov, has been charged by the French judiciary for allegedly allowing criminal activity on the messaging app but avoided jail with a €5m bail.
The Russian-born multi-billionaire, who has French citizenship, was granted release on condition that he report to a police station twice a week and remain in France, Paris prosecutor Laure Beccuau said in a statement.
The charges against Durov include complicity in the spread of sexual images of children and a litany of other alleged violations on the messaging app.
His surprise arrest has put a spotlight on the criminal liability of Telegram, the popular app with around 1 billion users, and has sparked debate over free speech and government censorship.
That’s a wild way of twisting the logic. Just because the platform doesn’t fall under your definition of E2EE doesn’t mean it had to do something that is only possible on a purely cloud-based service.
The reason for the arrest doesn’t even have anything to do with encryption. All the content that facilitates the crimes in question is public. Handling it wouldn’t require any backdoors or any server-side decryption.
It is about encryption, though. Since it’s possible for him to access anything said in those group chats, the authorities asked him to hand over everything Telegram has on those users and chats. He didn’t, and he got arrested.
He wouldn’t have been in as much trouble if those chats were encrypted and Telegram couldn’t know anything about what’s said in what chat by which user.
Because he wouldn’t be “betraying” his users by handing over everything the authorities asked of him.
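The point above can be sketched in code. This is a toy illustration (a XOR “cipher” standing in for real encryption, not actual cryptography): in an E2EE design only the clients hold the key, so the operator can only hand over unreadable ciphertext, while in a cloud-chat design the server holds the key too and can decrypt anything it stores.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption"; stands in for a real cipher in this sketch.
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # in E2EE, shared only between clients

# E2EE model: the server stores ciphertext but never sees the key.
e2ee_server_storage = {"ciphertext": xor(message, key)}

# Cloud-chat model: the server also holds the key, so it can serve
# full chat history to any of the user's devices -- and to a court order.
cloud_server_storage = {"ciphertext": xor(message, key), "key": key}

# What each operator can produce in response to a legal request:
assert "key" not in e2ee_server_storage  # only ciphertext available
assert xor(cloud_server_storage["ciphertext"],
           cloud_server_storage["key"]) == message  # full plaintext available
```

The design choice is the whole argument: an operator that architecturally *cannot* decrypt has nothing meaningful to hand over, whereas one that can decrypt is left choosing between complying and “betraying” its users.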
Assuming things should work that way is ignorant. By your logic, service owners should design and redesign their services to store no data just to avoid arrest. It also implies that a service owner should build things they may never have planned for whenever there is even a theoretical possibility of helping to identify individual users; in other words, go against the policies they designed at some point.
If they don’t want to be arrested, then yes: they should either do that or have good enough moderation to stay out of the bad graces of big entities like countries.
I’m not sure what you meant with the rest of your comment.
I mean that the basic logic of the service was designed well before its release. Data policies and promises to users mean nothing if you assume services should adapt to situations like this at the expense of breaking those policies and promises.
Here is an old article from telegram about reasons for how it works https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by-Default-08-14
The thing is, I think he did think of stuff like this.
From what the article says, and from what I already knew, Telegram deliberately built “distributed cross-jurisdictional encrypted cloud storage” to try to evade governments. So he did have them in mind.
If we lived in a world where we didn’t have to think about governments spying on us, we might have not even needed encryption to begin with.
But thank you for the link, it was an interesting read even if I don’t agree with what he’s trying to convey / prove.
After reading the article and the links in it, I don’t see anywhere that they stated the chats they’re requesting information on were public chats. Do you have a source that discusses that I could read up on? It sounds fairly interesting to me.
No, just personal experience (I’ve used Telegram for many years) and the absence of any mention of server-side data across the issues in the past (including this one). You can find questionable or illegal businesses on Telegram with a few search terms, and they are all public channels. Hence the “no moderation” accusations mentioned in every article.
There are of course darknet-like private communities, but I assume they are not the subject of interest at this time. Authorities would need to dig very deep past all the obvious illegal stuff, and Telegram shouldn’t mind the resources consumed by such a small chunk of its user base. Those groups will stay as they are, private and, I assume, safe, for quite some time.