Moderated Content

The Arrest of Telegram's CEO

Episode Summary

Alex and Evelyn discuss the arrest and charges against Telegram's CEO, Pavel Durov, in France, what we do and don't know, and what it means for the future of platform regulation, with Frédérick Douzet, Professor at the French Institute of Politics and the director of GEODE, and Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center.

Episode Transcription

Alex Stamos:                                           Durov is out on bail. Holding a Russian billionaire who is now... He has a complicated relationship with Vladimir Putin, but if there's anything Putin does not want, it's Durov in French custody. So holding him while he's out on bail is going to be incredibly hard. Unless the DGSE has concentric rings of people following him right now, and obviously the French are quite good at this, the odds that he does not make it back to the courthouse I think are actually quite good, because holding Russian billionaires on bail does not have a fantastic historical precedent.

Evelyn Douek:                                          Hello and welcome to Moderated Content, stochastically released, slightly random and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek and Alex Stamos. And the big news of the week as all of our listeners will no doubt be aware, has been the arrest of Telegram's CEO in France and the subsequent filing of charges against him. And so we have two guests who are going to help us walk through this complicated morass, what we know, what we don't know and what this means for the future of the internet. Frédérick Douzet is a professor at the French Institute of Politics and the director of GEODE. And Daphne Keller, well known to our listeners, directs the program on platform regulation at Stanford's Cyber Policy Center. Thank you very much for joining us both.

Daphne Keller:                                         Thank you.

Alex Stamos:                                           I just need to say it, that makes this Law and Order, [inaudible 00:01:30] edition.

Evelyn Douek:                                          That's excellent. Unfortunately, you'll have no difficulty telling our two guests apart, listeners, because Frédérick is joining us. Are you actually in Paris right now, Frédérick, is that where you are?

Frédérick Douzet:                                      Yes, I am.

Evelyn Douek:                                          Excellent. So we have someone on the ground, our French correspondent reporting to us live from France. And that's basically where I want to start, Frédérick about exactly what do we know about what's going on, what are the charges. The initial framing of this, I think in the initial commentary was this is a big threat to free speech because any platform CEO could be arrested for having bad content on their services, and so that basically every platform CEO is liable. Is that a good framing? What do we know so far about exactly what the charges are under French law?

Frédérick Douzet:                                      So what we know so far, from my understanding at least, is that it is an initiative from the French judicial power alone, more precisely the Paris Public Prosecutor's office. So it's not a political move, and it's not coordinated at the level of the European Union; the European Commission has distanced itself from the decision. The accusations center on complicity in a wide range of crimes, mainly related to child pornography and organized crime. There is no mention of disinformation or of content moderation more generally. So it's more about crime and complicity in crime than it is about moderation in general, and even if there is some mention of moderation, it's less a free speech issue than a crime issue.

                                                       And the judicial authority obviously considers that Telegram has lost its immunity, including at the national level, because it has not complied with its obligations and therefore falls under the scope of French criminal law. So the assumption here is that French authorities have repeatedly found criminal content and criminal activities on Telegram, signaled them to Telegram, and Telegram has almost never answered their demands and has not cooperated. So the logic is that because there has been no cooperation, or not enough cooperation, Telegram loses its immunity. And regarding child pornography and some other criminal activity, it is pretty easy to determine what is legal and what is not when it comes to child abuse.

                                                       So that's the most straightforward part of the accusation. Now, some ambiguity remains. The decision itself is not specifically targeted at Telegram, so that's maybe why there is this feeling that it might be a warning for all platforms. The scope of the charges is also pretty large and diverse, even though it's very focused on crime. But Pavel Durov has been indicted for an overall lack of moderation, which raises the question of whether a physical person or a legal entity should be indicted. And there's also this charge of failing to meet the obligation to declare cryptology tools, which is another issue that we might discuss later.

                                                       So the press release itself is not very detailed about the legal grounds. My guess would be that this is grounded in the French Law for Confidence in the Digital Economy of 2004, which is the transposition into national law of the EU eCommerce Directive from 2000, but also in the law of May 2024, which complements this 2004 law and harmonizes French law with EU law in order to enable the enforcement of the DSA and DMA. Which is interesting, because it creates a double framework: the Digital Services Act is self-sufficient and can just be enforced across Europe, but it is not grounded in criminal law. Maybe to give a bit of context, Telegram is a very large platform, which is supposed to have over 900 million users, but it has declared only 41 million users, so-

Alex Stamos:                                            Which is suspiciously low considering the percentage of Europeans in the world.

Frédérick Douzet:                                      Yes, that's a form of denial of being a very large online platform. But the last quick point that I wanted to make is that in the present situation, nobody really has the power to enforce the Digital Services Act on Telegram because it's an [inaudible 00:06:37] European platform. It has a legal representative in Belgium, but Belgium has not yet been able to set up its regulatory authority. So at this point, nobody can even fine Telegram in the European Union for not meeting its obligations under the Digital Services Act.

Evelyn Douek:                                          Okay, that's very, very helpful background and context, thank you. They can't fine Telegram, but they have found Pavel Durov, I guess, in France, and so that's the hook there. Okay, so a lot of things on the table. I will just note as well that I was reading this morning that the EU has opened an investigation into whether Telegram has been fully candid about its number of users in the EU, because while, as you said, it is a very large platform, it is not technically a very large online platform under the DSA as yet, and so is not currently subject to the most onerous provisions of the DSA.

Daphne Keller:                                         And the other thing that's suspicious about their supposed 41 million users in the EU is that the cutoff to be a very large online platform is 45 million. And if you're looking to come in somewhat under that, maybe you'd pick 41 as your number.

Alex Stamos:                                           Right, you're not going to say 44.99. Yeah.

Evelyn Douek:                                          Right, exactly. Okay, so Daphne, let's turn to you to talk about the DSA, because I have seen reporting that's been a bit confused about the role of the Digital Services Act, this huge new piece of European Union legislation about content moderation. Now, I think Frédérick's clarification was really helpful: we haven't actually seen the indictment, and the press release doesn't mention things like hate speech or disinformation. That matters because when we've talked about Europe and the DSA on this podcast, we've often talked about efforts to use the DSA to pressure platforms to moderate things like hate speech and disinformation. But there are other obligations on platforms, so what is the role of the DSA here, if any?

Daphne Keller:                                         Yeah, so I'm getting a lot of reporters who have the idea that this is about enforcing the DSA, which is completely wrong. As Frédérick just explained really well, this is about enforcing French criminal law. The DSA does put some obligations on platforms, like having a notice and takedown system that meets certain mechanical requirements or publishing transparency reports, and Telegram may not be meeting those. It may well be violating the DSA's administrative obligations separately, but that is not what this case is about, and it's not something that French prosecutors would be enforcing anyway. As Frédérick says, the authority in Belgium that could do that hasn't even been established yet.

                                                       The only reason the DSA matters for this is because the DSA defines the immunities that a platform like Telegram can preserve if they do the notice and takedown system or otherwise ensure that once they know about specific unlawful content, they act to take it down. And as Frédérick described, it seems like Telegram probably is getting notices about specific illegal content including really, really bad stuff like CSAM and then failing to respond by taking it down. And that means they forfeit the immunity that they have under the DSA, they forfeit this EU level immunity and expose themselves to prosecution by French prosecutors under French law.

Evelyn Douek:                                          Great. And this is not just speculation. Telegram literally says, "We do not respond to requests about illegal content or content moderation requests in our channels." This is something on their website that they boast about. It's not like other platforms that say, "We're trying, we miss things." Telegram is saying, "This is not something that we do or that we want to do." Okay, so that's all really useful. Frédérick, you mentioned the cryptology aspect of this as well. What's going on there? What is operating or providing cryptology without a license under French law?

Frédérick Douzet:                                      So this is again in the law from 2004, so it's been there for a long time, but it's not something that has been strongly enforced so far. There is jurisprudence, though. There's a case involving another famous app called [inaudible 00:11:27] that's been nicknamed a crime messaging app, where the French courts convicted them for not declaring that they were importing cryptology. So I think this is really about the type of tools that have been used for organized crime. I think that's what is targeted there.

                                                       So the question is, how much do you risk? I think the sanctions are not that heavy, but I think they took this opportunity to give the charges a very broad scope, and that aspect is really not so much linked to private messaging as to those groups that favor organized crime. At the same moment, there is also another case being judged, of two people who've been using Telegram to conduct organized fraud. It's a separate trial, but it's happening at the same time. So I think that's what is being targeted. In the French press, some people have kind of made fun of this charge because it's not apparently a big deal, but the courts could also take it seriously.

                                                       What I'm not sure about is this: in the French system, you can have police custody after a preliminary police investigation, and then, once there is an indictment, it is the judge who conducts the investigation. There can be priority questions of constitutionality raised along the way, and in the end the judge will gather all the information and decide whether this goes to trial or not. So the question is, which charges are going to hold and be carried to trial, and whether some of the charges might be dropped in the context of the investigation by the investigating judge.

Evelyn Douek:                                          Great. Okay. So Alex, we've talked about Telegram on this podcast a number of times, often about how it is left out of conversations about platforms and the harms of digital abuse, but there are people who have been paying attention to Telegram for a long time, and you are one of them. So the question is, in all of your research on this platform, how similar or different is it, in how it operates and what you can find there, compared with other platforms? Is this a situation where they are just like Facebook, just like Twitter, and they're being charged for things that every platform does, or is Telegram unique in some way?

Alex Stamos:                                           No, they are unique among the very large platforms, and almost certainly they actually are a VLOP per the EU's definition. They're close to a billion users, so whether legally or not, they are certainly a very large global platform. Among those large platforms, they are the only one that intentionally allows people to use the platform for really bad, harmful, illegal purposes. Nobody else does so intentionally. Now, that doesn't mean bad stuff doesn't happen elsewhere. We have written plenty of reports about bad things that happen on lots of platforms, and in fact a lot of our criticism of Telegram happens in the context of talking about bad things that happen elsewhere. But when we talk about bad things happening elsewhere, those other companies get back to us and say, "Oh no, we're sorry, how do we do better?" And they try to get better, they put task forces on it, they fix things, and they come back to us and say, "Can we do better? How's this? How's that?" Telegram just ignores it.

                                                       And in fact, they've done a couple of things here. One, Durov is paying the price now for an intentional decision that has made him very rich, which is that he has effectively lied to people for years about the security of Telegram. He has created this very gray area where they have implied that Telegram is end-to-end encrypted everywhere, and they have denigrated the security of other platforms, including Signal. They have said that Signal is an American plot; they basically pulled in our colleague Renee DiResta because she sat on the board of something that was somewhat related to Signal, and claimed that the CIA was taking over Signal because of her college internship. So they have argued that other things that really are encrypted are not, and they have implied that they themselves are encrypted.

                                                       Telegram only supports end-to-end encryption in a very small set of circumstances, the vast majority of use of Telegram is not end-to-end encrypted, meaning that content is just sitting on Telegram servers. And this is not something that you have to do some kind of deep reverse engineering for. If you go right now and you install Telegram, you create an account, you can go find a group that has tens of thousands of people and then you can go see all of the content that they've been talking about. It is stored on those servers, and so they imply there's a level of privacy that doesn't exist and they do this because it attracts people who are doing things that are very sketchy, but it also means that they don't have to deal with the actual product downsides of doing end-to-end encryption.

                                                       It is a huge pain in the butt to do group conversations that are end-to-end encrypted. It's extremely hard. We don't have to get into the technical details, but effectively it has to do with the fact that as you add people to a group, you end up with a combinatorial explosion of pairwise relationships between those people. And so this is why you end up with limits in the hundreds for the people who can join an end-to-end encrypted group. I had to deal with this; my name's on a white paper for the end-to-end encryption algorithm for Zoom. And Zoom rooms are limited to a thousand people because you have to deal with all these corner cases of when people add and drop from meetings, you have to do things like key rotation, you have to handle all of the possible iterations.

                                                       It's way better to just say you're encrypted and not actually do it, because then you can handle hundreds of thousands of people coming and going. It's just an unencrypted group: you store the data in a database, you sync it up to people, and you're fine. That doesn't mean there's no encryption at all; they do basic transport-level encryption, but they have all the content.
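                                                       [A rough illustration of the scaling Alex describes: in a naive pairwise design, every group member needs a secure session with every other member, so the number of relationships grows quadratically with group size, and every join or leave forces key bookkeeping across them. This toy calculation is illustrative only; it is not Telegram's or Zoom's actual protocol.]

```python
from math import comb

def pairwise_sessions(n: int) -> int:
    """Number of member-to-member secure sessions in a naive
    pairwise end-to-end encrypted group of n people: n choose 2."""
    return comb(n, 2)  # n * (n - 1) / 2

# A few hundred members is already tens of thousands of sessions;
# Telegram-scale groups would need billions.
for n in (100, 1_000, 200_000):
    print(f"{n:>7} members -> {pairwise_sessions(n):>14,} pairwise sessions")
```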

                                                       And so they have lied about other platforms, they have misled their users and now they're paying the price for it because governments are waking up that there is a difference between Telegram and WhatsApp and Signal, where WhatsApp and Signal really cannot get to the content. That being said, even though WhatsApp does not have access to the content, they do hundreds of thousands of NCMEC reports and Telegram does zero, and it's because WhatsApp will do things like scan the things that are public, like people's profile photos. They'll also go investigate if they're given a tip like, "This is a group chat with a public link that people can join that is hosting child sexual abuse material." They will enter it like anybody else can enter it, they will grab the content, scan it, and then they will archive all of the data and turn it over to NCMEC. Telegram does none of that.

                                                       The other interesting thing about Telegram, if you look at their policies, and we can link in the show notes to our policy comparison here, is that they're the only major platform that implicitly allows illegal activity on the platform as long as it's not on public surfaces. And they really only have the public-surface rule because that is a base requirement of the Google and Apple app stores. So their policy says, "Do not trade child sexual abuse material in public groups." They didn't have to say the word "public." They could have just said, "Don't use us for child sexual abuse material." They did not have to add a qualifier.

                                                       And no other platform we looked at adds that qualifier. So they pretty much advertise: we are open for business as long as you keep it a little bit quiet. And it's not that quiet. I don't recommend anybody do this, but in our research it is not hard to find people trading the most horrible, terrible stuff. Whenever we do research into child safety, bad things will happen on an Instagram or a TikTok, but almost immediately people pivot to Telegram. If you're a professional, the most common platform you pivot to is Telegram, because that's where you can actually trade stuff, that's where you can actually move money. They have their own coin.

                                                       So that's the other issue here: they decided, and this has made Durov rich, that they're also going to have a cryptocurrency. So they are a payment platform, a trading platform, and a content platform all in one, a one-stop shop if you want to do illegal stuff. And I think that is the trap they have fallen into: creating this kind of mystery about security, advertising to people that all your stuff is private. They say, "We have given zero bytes of data to law enforcement." If you say that on your website, then you're just going to have to deal with law enforcement reading it too. If it's in your FAQ, then I'm going to guess that it will be brought up in his trial, because it does seem to be in violation of the laws of all these countries they operate in. So yeah, they are intentional bad actors. And for the child safety stuff especially, that's what sets them apart.

                                                       Now, a couple of things bother me that I just have to get off my chest here. First, in the French system, the fact that they can arrest you without a detailed indictment... In the United States you'd have to be arraigned: you go in front of a judge, and the prosecutor has to have a big detailed document that says, "You did this exact thing," and an affidavit from an FBI agent or a cop who says, "I believe they did this thing because we have this evidence." And then the judge says, "Well, this is pretty good. We're going to hold you." The fact that they can just say, "Eh, we think you did something bad and we're going to hold you," is not that great. As an American, I don't find that super cool, but I guess that's what makes us special.

                                                       Second, the doing-math-without-a-license charge, that is the bad thing here. So for all the French prosecutors and judges listening to this podcast, of which I'm sure we have dozens, my strong suggestion is this: if you want the strongest argument, and if you do not want a massive backlash from the American tech industry, go after these guys for the things they actually do that are intentionally different and outside the norm, which is that they have decided to host illegal content because it is economically valuable to them. Do not go after them for doing math without a license.

                                                       Because on the doing-math-without-a-license charge, I'm looking at the English translation of this law from 2004, and effectively every company in the world is in violation of it. To comply, you have to inform the prime minister that you're providing any cryptology service that is not just for authentication or identification, so any confidentiality service. Basically every major multinational enterprise operating in France is probably in violation of this law. So it seems to me that this is going to be both the weakest argument and the thing that gets the most backlash.

Frédérick Douzet:                                      Yes, I just want to say a word about the rules of police custody, because I think there is a difference between the press release and what the person who's arrested has a right to know. When someone is in police custody, the suspect is fully informed of the offense he's accused of and its presumed date and place, and also of his rights. So they know the nature of the offense and when it is supposed to have taken place. That has not been publicly released, but that doesn't mean it does not exist.

                                                       The idea of police custody is that it follows the police investigation, and it's really there to allow the investigation to continue in the presence of the suspect, to guarantee that the person will appear before a magistrate, but also to prevent the suspect, at this stage of the investigation, from exchanging information, destroying evidence, or putting pressure on witnesses or victims, and to make sure that the suspect may also be heard in open court. So that's the spirit. Then it goes to an investigating judge who continues the investigation, so during that time the person is sort of shielded from external interactions in order to guarantee the conditions to collect further information. But I believe that he has been fully informed of the offenses.

Daphne Keller:                                         And I will add a strong speculation: Politico reported that this investigation has been going on since March. That article focused on Telegram's failure to turn over user information to law enforcement in France and on CSAM, which to me suggests that's what law enforcement and prosecutors are mad about here, that's the motivating thing. But I assume that in an investigation that's been going on that long, they've had plenty of opportunity to try notifying Telegram about specific known instances of CSAM or illegal drug sales or terrorist organizations, et cetera, and to document Telegram's failure to respond. So if I'm speculating and writing fan fiction about what has happened in these confidential processes so far, I would guess that's the kind of thing the prosecutor would bring in front of the judge.

Alex Stamos:                                           Right, I guess I'm not super comfortable with the fact that we're dealing with press releases from the prosecutor, and they don't have to say in open court, this man is accused of these specific things, and provide some level of evidence. I understand that that's how other legal systems work, but it makes me feel a little icky. I would suggest to the French prosecutors that a lot of people would feel a lot better if you said, here's a specific case and we had a specific ask. And then, like I said, drop the cryptology stuff, because that would apply to practically everybody. That feels like a very selective prosecution; if this is the only time that law has ever been used, that is the kind of thing that gets a lot of people's hackles up, and it is not equivalent to the law in a lot of other places.

Frédérick Douzet:                                      Correct me if I'm wrong, but I think the investigating judge is quite a specificity of the French system, and the idea is also to protect the confidentiality of the investigation, to avoid interference with it. Another point of context that might be interesting is that the initiative was originally launched by the Office for Minors, which was created in November 2023 in France to combat the most severe violence against minors. They took the lead and were then joined by other bodies like the National Anti-Fraud Office, and they have also co-seized the Center for the Fight Against Digital Crime. But initially it was the office in charge of protecting minors that was leading. So it's consistent with all the child sexual abuse concerns that were raised, and probably notifications that were made to Telegram that they did not answer. So clearly this move from the French judicial authority is most likely about frustration at having no answer and no cooperation from Telegram.

Evelyn Douek:                                          Okay, so given we've been talking a lot about what we know and don't know, I just want to timestamp our recording here. We're talking on the morning of Friday, August 30th, and this podcast might not come out until next week. So when we're talking about what we do or don't know, new details may emerge over the weekend; that's the knowledge base we're working from.

                                                       And there still is a lot unknown. The press release is very broad, it speaks in large generalities, but we are starting to fill in those details based on what we know about law enforcement and what we know about the platform. And so, Daphne, my question to you is: given that speculation, suppose the facts are as we've been describing, as Alex has been talking about, with what you can find in Telegram's FAQ on its website, which is basically, "Hi, we don't report to law enforcement, and we never work with law enforcement." A lot of the meta-narrative about this has been Europe's bad on free speech versus the free speech utopia that is First Amendment land USA, and that we would never have such a terrible, Draconian response in America. Based on those facts and the speculation we've just done, what would be the potential liability for that in the United States? Is that the kind of thing that the First Amendment protects, or is this the kind of thing that a CEO could also be liable for in America?

Daphne Keller:                                         Yeah, so let me start by bucketing the charges into three categories, and Frédérick can tell me if I'm missing nuance here, but to me it looks like there are a stack of charges that are about Telegram being liable for illegal content that users shared or illegal things users did on the platform. So this is the CSAM charges, etc. Then secondly, there's a charge that is about failing to disclose user data to law enforcement in France. And then lastly, there are these three charges that are at the bottom of the bullet list they released that are all about distributing an encrypted platform, and these are the charges that Alex is calling illegally doing math, and I agree strongly with his concerns about those charges.

                                                       The charges that I think have a really clear analog in the US are the ones about Telegram knowingly allowing users to continue sharing extremely illegal content such as CSAM. In the US, there is no immunity for platforms that violate federal criminal laws, including the criminal laws on CSAM, material support of terrorism, drug sales, money laundering, et cetera. In fact, we know from the prosecution of Ross Ulbricht, the guy who ran Silk Road, that platform operators can go to jail under those laws.

                                                       And it's funny, in a way European law is more protective of platforms in this situation, because platforms do have DSA immunities as a starting point and then have to do notice and takedown to maintain them in the EU, whereas in the US there's just no special platform immunity at all for these federal laws. So if an American prosecutor wanted to bring a case like this against Telegram, or maybe against Durov personally, I think they could. The standard for CSAM is: did you know it was there and continue to host it or transmit it, et cetera, knowing that. There's also an affirmative obligation to notify NCMEC when you find CSAM.

                                                       So I don't think it's that different. It might be that in practice, in attempting to prosecute a case like that here, you would run into questions about whether the standard really requires that this be intentional; you would get into some fights about what mental state of the platform operator creates liability. But I think similar questions are likely to arise in the French system. So I don't think it's all that different here in the US for this extremely bad content, stuff that is the subject of federal criminal law and where Telegram is getting notices about specific content and failing to respond.

Alex Stamos:                                           Yeah, I'm just going to give a shout-out: we wrote an 80-some-page report on the NCMEC CyberTipline, and in it our colleague, Riana Pfefferkorn, I think did a very tight explanation of the law here. 18 U.S.C. § 2258A creates an affirmative responsibility to report things to NCMEC. Certainly if you ignored that, and especially if you had millions and millions of users, you would eventually go to jail. It wouldn't be instant, they're not just going to show up, but you will get a call first from NCMEC and then the FBI, and then a judge would ask you to show up, and if you didn't show up, they're going to send the US Marshals. And certainly search warrants. If you read a search warrant, it's terrifying. It's like, "We order you to show up at this time with this evidence." It sounds like...

                                                        American search warrant language, it's like old West language. You almost expect it to say, "We'll take you dead or alive, with a bounty hunter." It's serious, and Musk is freaking out about this, but X reports stuff to NCMEC. Now, not perfectly. One of the things we found in our work is that their reporting pipeline had broken and that they had open A1 CSAM on their front page that they were not taking down. We reported that to Twitter; they freaked out and fixed it. If we reported that to Telegram, they would just ignore it. So that was a mistake, but it's not something they wanted to happen. And Twitter responds to search warrants all the time. They talk about that. In fact, it looks like some of the search warrant response rates have gone up, at least for some other countries, as we've discussed on this pod.
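At its simplest, the kind of reporting pipeline Alex describes breaking is hash matching against a known-bad list plus a report queue. A toy sketch with placeholder hashes; real systems use perceptual hashes such as PhotoDNA and the actual CyberTipline submission process, neither of which is shown here:

```python
import hashlib

# Hypothetical hash list of known illegal images (placeholder digest).
KNOWN_BAD = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_upload(image_bytes: bytes, report_queue: list) -> bool:
    """Return True (and queue a removal-and-report record) on a match."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_BAD:
        # In a real pipeline this record would feed the CyberTipline report.
        report_queue.append({"hash": digest, "action": "remove_and_report"})
        return True
    return False

queue = []
assert scan_upload(b"known-bad-image-bytes", queue) is True
assert scan_upload(b"harmless-cat-photo", queue) is False
assert len(queue) == 1
```

The fragility Alex points to lives downstream of this function: if the queue consumer silently dies, matches still happen but reports never go out, and the § 2258A obligation is still being missed.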

Alex Stamos:                                           So Musk actually believes, he has said multiple times, "If you operate in a country, you have to follow their laws." So I'm not sure what he's freaking out about, because he has followed the law in a bunch of countries to either censor people or to turn over data. And as long as he continues doing that, he's not going to end up in trouble. I mean, he's picking and choosing the places he follows the law. He's in a big fight in Brazil, which we'll talk about in a future episode. So certainly if he decides to kick sand in the faces of judges and then flies into that country, he could be in trouble. But in this case, he's not doing what Telegram does, which is ignore the law in every single country in which they operate.

Frédérick Douzet:                                      So if I can just nuance this: he's not doing the same yet. From what I hear, X is kind of second on the list of very large online platforms that don't cooperate so well. So it's not the same situation and it's not the same type of platform, but there is this question of how much you need to cooperate to preserve your immunity, and that might be what is a concern [inaudible 00:34:14]. I don't know.

Alex Stamos:                                           Doesn't cooperate in what way, would you say?

Frédérick Douzet:                                      Well, with the authorities, with taking down content that is illegal, in that way.

Daphne Keller:                                         And the letters that Thierry Breton has been sending, which I think are very problematic and overstayed his authority under the DSA and undermine the whole project of the DSA as sort of a legitimate source of law. But some of those letters, as I recall, allege that specific illegal content was reported to X and X did not take it down. So if that's true, then X forfeits immunity for those specific pieces of content the same way that we think Telegram has done.

                                                        And I wouldn't be too surprised if that's true, just because it seems like in purging the trust and safety team at X, a lot of balls got dropped. It seems like they even dropped the ball on responding to DMCA notices in the US, which exposed them to a whole bunch of liability and high damages; there's a case going on in a federal court in Tennessee about that. So it seems plausible to me that X has made mistakes and not responded to notices and forfeited immunity, but in a way that was a mistake, part of just being a chaotic mess, maybe, not the sort of deliberate flouting, or not yet the sort of deliberate flouting, of authority that Telegram [inaudible 00:35:42].

Alex Stamos:                                           Well, one thing that's happened is Musk fired the people who ran the law enforcement response team and then said that they were FBI plants and then used the fact that they get reimbursed for wiretaps, which is in the US law, and something that AT&T does and Sprint and every company gets reimbursed by the US government for the costs associated with search warrants, he said that that was an FBI plot. I mean, he just doesn't know what he's doing. And so yes, I expect that their response time... Dealing with these things is actually incredibly hard. I'll just say, being on the platform side, it's gone better. But when I was at Facebook, we were dealing with a fax from a German prosecutor. So you have a search warrant written in German, that's been photocopied three times and then faxed and signed by a German judge, and you have to authenticate, is this actually this German judge and do they have the right to have this information? Turns out to be incredibly hard.

                                                        What Facebook did was try to consolidate this all into electronic communications. They made a humongous push, all around the world, to get all the judges, all the prosecutors, all the courts onto an electronic system where you're authenticated and everything is uploaded. The first time you do a search warrant it's kind of a pain, but from then on it's much simpler, and the odds of somebody abusing the system are much lower. Because one of the things that's happened, around the world, is huge abuse of the search warrant system: bad guys have figured out, wait a second, I can just fake a letter and then fax it to a fax number that is not that hard to find at Facebook, and then I can get somebody's information sent to me. So a number of bad guys, including child abusers, have figured out ways to stalk people and get private information by abusing the system.
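The gap between a faxable request and an authenticated portal can be shown with a toy request verifier. The agency IDs and shared keys here are hypothetical stand-ins; a real portal would use asymmetric keys issued through vetted enrollment rather than shared secrets:

```python
import hashlib
import hmac
import json

# Hypothetical registry of agencies enrolled in an authenticated portal.
AGENCY_KEYS = {"tribunal-paris-17": b"key-provisioned-at-enrollment"}

def sign_request(agency_id: str, payload: dict) -> str:
    """The enrolled agency signs its request with its provisioned key."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(AGENCY_KEYS[agency_id], body, hashlib.sha256).hexdigest()

def verify_request(agency_id: str, payload: dict, signature: str) -> bool:
    """The platform verifies origin before disclosing anything."""
    key = AGENCY_KEYS.get(agency_id)
    if key is None:
        return False  # unknown sender: the faked-letter-to-a-fax-number case
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The faxed letter fails both checks a system like this enforces: the sender is not in the enrollment registry, and nothing cryptographically binds the request to a real judge.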

                                                        And so this is actually, and we can talk about it some other time, something that needs a global solution. The EU has actually been working on this, the possibility of an EU clearinghouse. Because you're not just dealing with Paris Metro, you're dealing with some tiny little village cop, or in the United States, every Podunk county sheriff's department, and DA, and local, state and county judge. So we need nationwide solutions here for authentication and standardization of this stuff. But it is actually quite a difficult problem, and if you end up firing all the people who work on it, your SLAs are going to get quite poor.

Frédérick Douzet:                                      And if I may add, I think Thierry Breton's declarations create some confusion between what X does as a company, and I think defunding the trust and safety team does create a hazard for how they can comply with the law, and what Elon Musk does as a personal user of his own platform, which tends to aggravate some people but does not fall under the same regulations.

                                                        But also, if I may bring a bit of geopolitics here, we're talking about two very different worlds. Today, Telegram is not only being used for child abuse and by criminal organizations, it has also become the vector of Russia's informational influence in the world. And clearly, Pavel Durov is a figure of very strong geopolitical importance. I mean, the platform is used in the context of the war in Ukraine, and there is a lot of disinformation that is produced on the platform and disseminated, propagated by other vectors. So it's quite interesting as well in this context, given how the platform is accused of carrying Wagner's operations in Africa and in other places. I think even if it's not a political move, it will have strong geopolitical consequences.

Alex Stamos:                                           Yeah, it's a fascinating point is that there are two major wars right now where Telegram is the most important platform, both the Russia-Ukraine War and the Middle East war, the Israel/Palestine, Hamas uses Telegram both for operational communications but also for their propaganda, something they learned from ISIS. And the crazy thing about Russia-Ukraine is, especially, in the early days of the war, both the Russians and the Ukrainians were using Telegram for battlefield communications, it's just nuts that this is the product that isn't...

                                                        And also, I think this goes back to the fuzziness around encryption: all these people thought it was secure, and it's not that secure, which I think is one of the reasons why some western intelligence services are probably okay with it. Especially in those very early months of the war, it was a very dangerous time to be a Russian field officer. The Ukrainians were really good about putting precise fires right on top of the heads of Russian generals and colonels. And one of the theories was that it was because the Russians were using unpatched Android phones running Telegram, and NSA and the other Five Eyes intelligence agencies were backing up the Ukrainians with signals intelligence capabilities, partially based upon the insecurity of Telegram. So the fact that the Russians use it as their tactical communications net: probably not a great idea.

                                                        Now, an interesting thing is the Russians are freaking out now. They are losing their minds. Which is also, I think, something we need to watch, because Durov is out on bail. Holding a Russian billionaire who is now... He has a complicated relationship with Vladimir Putin, but if there's anything Putin does not want, it's Durov in French custody. So holding him while he's out on bail is going to be incredibly hard. I mean, if DGSE doesn't have concentric rings of people following him right now... Obviously the French are quite good at this, but the odds that he does not make it back to the courthouse I think are actually quite good, because holding Russian billionaires on bail does not have a fantastic historical precedent.

Daphne Keller:                                         Maybe to expand a little on the geopolitical and institutional jockeying going on here: at the EU level, there are some interesting questions about who did this and why. It's a little bit less sexy than the Russia and Vladimir Putin part, but for EU policy wonks, it seems like the Commission was probably surprised by this and maybe not that into it. Presumably the Belgian authorities who are eventually going to become Telegram's Digital Services Coordinator, the DSA enforcer, might feel a little bit upstaged. That's pure speculation, I don't know who they are. And then separately, it kind of sounds like maybe Macron was surprised by this, and it's been reported that he is himself a heavy Telegram user in, for example, communications with reporters.

Alex Stamos:                                           Why?

Daphne Keller:                                         Alex is shaking his head in dismay but also-

Alex Stamos:                                           On behalf of L'ANSSI, the Cybersecurity Agency of France... How do I say, "Why, sir?" In French, Frédérick?

Daphne Keller:                                         Pourquoi, Monsieur.

Alex Stamos:                                           Pourquoi. Pourquoi.

Daphne Keller:                                         Maybe if you knew this investigation was coming, you might migrate off a Telegram right away, move to Signal. And so it's just interesting looking at who is acting and who is coordinating with whom and who might be annoyed with whom at an EU institutional level and French institutional level also.

Frédérick Douzet:                                      But that also explains the ambivalence of Putin regarding Durov, because he previously created VKontakte, which is a huge social network, but he allegedly never collaborated with the FSB, unless there is a secret agreement with the services, but I have no information on that. On the other hand, because Telegram is connecting nearly a billion users, and because what's going on during the wars is being directly reported and archived, and he has access to all the data, as Alex mentioned earlier, there's a trove of very sensitive and very important data, and that's also potentially important for Russian power. So he is a major geopolitical player in that sense, and he participates in the topological power of Russia through handling so much critical data.

Alex Stamos:                                           Okay, this is where I want to get a little bit paranoid is I'm going to do, Frédérick you might not get the cultural reference here, but the other two will. So I have no evidence for this, but we're going to start to dive into the truly paranoid theories here of which there's two. One possible theory here is yes, Telegram is sitting on perhaps the most important treasure trove of untapped signals intelligence information about the Russian-Ukrainian war, and about Russia in general. And so one option is that this is not an independent French investigation, but this is being driven by the French state to try to create pressure on Durov for them to do a deal to try to get information and back into access to Telegram.

                                                        But there's an even crazier option, and I'm going to throw it out; again, there's no evidence for this, but I do want to believe. The crazy option is that Durov went to France intentionally because he's trying to hide from Putin. Because one of the possibilities here is this: Durov has had this difficult relationship with Putin. The Ukrainians freely use Telegram, they use it all the time, partially to organize their war. Ukraine is currently occupying Russian soil.

                                                        We are living through the second invasion of Kursk, the first being by Nazi Germany. And Putin is not known as a psychologically well man, but the kind of thing that might really push him over, from "Durov is this irritating kid with great abs who I'm going to put up with" to a Prigozhin-style "this guy's an actual traitor to Russia," might be a situation in which you're losing the Ukrainian war and you can't get Telegram to give you the information you want. Then you might all of a sudden say, this guy is persona non grata to me. He's actually crossed the line.

                                                        The Russians have a long history here: yes, they will mess with you if you're not their friend, but if you are a Russian and you're seen as betraying them, then you cannot have tea anywhere, you are getting poisoned or kidnapped or your plane gets blown up. And if you're Durov, and there are rumors that he was trying to meet with Putin in a neutral location because he doesn't want to fly to Moscow, and that Putin would not meet with him, and then all of a sudden he flies to Paris... why would he fly to Paris on his own private jet? Right after he was visiting the Central Asian countries, with his girlfriend very openly posting, "Look at me, I'm in Central Asia," all these Instagram posts that are obviously geo-tagged. The whole thing's very suspicious.

                                                        And so one possibility is that Durov was flying to France to do a deal, and that this thing either messed up the deal, or is part of the cover, or is the French trying to get some leverage as part of the deal; that Durov is looking for a safe home, because Dubai is not going to be able to protect him from the FSB.

                                                        If you're him, you can't go to the US. Trump's got a 50/50 chance at being president, and Trump would send Durov back to Putin with a bow on his head. You can't go to a Five Eyes country, because they would extradite you to the US. You need to find a western country that's powerful enough to protect you from Russia, which probably means nukes, and means having a strong intelligence service that will do a deal for you. And he is a French citizen. Nobody really knows why he's a French citizen; apparently getting French citizenship is quite hard. You guys are all snooty about your language and stuff, and nobody knows if he actually speaks French. So anyway, that's my "I want to believe" contribution, the truly paranoid possibility. But there is a possibility here that what we're seeing is just the tip of the iceberg, and that aliens are real. Sorry.

Daphne Keller:                                         So I want to jump in with some paranoia that I think is actually justified and worth paying attention to. Alex's might well be also, but we're seeing a lot of people on social media really freaking out about threats to free expression from this, and most of that I think is overblown. But we've talked kind of in vague terms so far about platforms maybe losing immunity or facing liability if they're failing to cooperate with police. What does that mean? Are there limits on what that means? I want to believe that there are limits, and I do believe as somebody who studies EU law, that one important limit is that Telegram should only face liability if they actually knew about specific illegal content and didn't take it down, that's very crisp. I also want to believe that they only have to hand over user data if there is a clear formal process with protections for the rights of the people whose data is potentially being exposed.

                                                        There's also a really important red line in EU law: the government can't come along and make platforms go proactively searching for things and monitoring their users' communications in ways they otherwise wouldn't; there should not be a form of cooperation with law enforcement that involves that. There are a bunch of important bright lines that exist in EU law and in French law to protect fundamental rights, and I think and hope that those are being respected here. But I understand why people are worried, because we're only hearing these vague communications about what's going on, and we have to hope for the best about the protection for free expression and access to information and privacy and data protection.

Frédérick Douzet:                                      So if I may: at this stage, today, it's still an ongoing legal proceeding, so the adversarial principle will apply, and there's going to be an investigation. If they can't establish the facts that you've just mentioned, then the charges might be dropped. But I believe that the reason why they're pursuing this and indicted him is that they have the evidence to support that hypothesis.

Evelyn Douek:                                          I think this is really a super important piece of this, Daphne, and a great place to start closing up, because I think the reason why this got so much attention and got so much focus is not necessarily because of Telegram, although it is a very large platform, and it is a very important platform, but also because it's coming in the context of this now nearly decade-long debate about the responsibility of platforms and their CEOs for content on their services. And when can A CEO be made liable for bad stuff that happens on every single online service?

                                                       And clearly the answer cannot be always, the answer always is a very bad answer for free speech. And if you can arrest a CEO every time that there's even the worst of the worst, even when there's CSAM or human trafficking or drug trafficking, all of these things on their platform, that's going to have terrible consequences for free speech because they're going to become very risk averse, they're going to shut down a whole lot of channels, they're going to shut down a whole lot of affordances, any sort of hint of liability is going to make them crack down, and it's going to have bad ramifications for the rest of us.

                                                        But there has been an effort in this debate to say that what the French are doing means you can always be liable, that every platform CEO may now be liable. And as we've been talking about today, that's not what's going on here at all. What we think is that maybe this is at the far extreme, where people of reasonable minds will agree that when you know, and you've been repeatedly notified, that there is this terrible, terrible stuff that is not protected under any system of freedom of expression, including the First Amendment, then you have a responsibility to remove it.

                                                        And then there's this whole gray area in the middle, and that's where our conversation about X comes in. What level of negligence or willful blindness is adequate to expose you to liability? If you fire absolutely everyone who was looking for these things, or just don't check that inbox for a while, is that enough to expose you to liability? I think that's the very big meta question on the table here, and it will play out as more details emerge: how much of this is exceptional to Telegram, how much of this meets that very, very high bar, and how much of it is in the gray area? We still don't know enough. And I think Alex is right that it is concerning that so much of this debate is playing out-

Alex Stamos:                                           Well and I think that one of my fears-

Evelyn Douek:                                          ... in the absence of so many facts.

Alex Stamos:                                           So, the X Files thing, I don't think that's true, but I just wanted to throw that out there because there is a possibility that there is a deeper geopolitical thing. I think most likely this is what it seems like, which is they are the worst actor on CSAM and fraud and a bunch of other stuff.

                                                        But I think, unfortunately, this might be a "bad defendants make bad law" situation. If they had just done the minimum and dealt with the worst things, they would not have created this precedent, where it is so bad that everybody has to agree, "Oh my God, these guys are the worst actors," and all of a sudden you get the French legal system revved up about arresting CEOs. And the next time, it's not about the widespread abuse of children, but about people posting videos where comedians are making jokes about immigrants. And that's where you start to get a real transatlantic rift, because that is the fear of Americans: that you end up with European sensibilities about what is appropriate hate speech, what is appropriate comedy, what is allowed speech online, being enforced via-

Daphne Keller:                                         What protesters can say.

Alex Stamos:                                           What protestors can say.

Frédérick Douzet:                                      Yes, but that's... I mean, the big difference, if you look at French law, is that it's focusing on something whose legality is clearly defined in law, whereas disinformation is not so clearly defined. I mean, there's no definition. So you can't use the same type of decision, I think, for something that does not have a clear legal status.

Alex Stamos:                                           But for cryptology, you could. So if you end up not liking WhatsApp, you could just say they did not get a license to put elliptic curves into their app, which they have not. But like I said-

Evelyn Douek:                                          Yeah, but that's still-

Alex Stamos:                                           Every major French company has not gotten a license to put cryptology into their computers.

Evelyn Douek:                                          Okay, so any final thoughts?

Frédérick Douzet:                                      Just one more thing. Apparently there are also accusations of physical abuse of one of his children in Switzerland, so that's something that's coming up as well. And that might bring some new developments in the story.

Evelyn Douek:                                          But he can't be liable for that, he's a platform owner, I guess he's the response of some.

Alex Stamos:                                           Wow. Okay, that was a heck of a turn there at the last-

Evelyn Douek:                                          Daphne, anything you-

Alex Stamos:                                           ... 10 seconds of the podcast.

Evelyn Douek:                                          Clearly being facetious, it's just that the theme of this podcast has been it cannot be enough that you own a free speech platform to get away with [inaudible 00:55:14] heinous conduct.

Alex Stamos:                                           I think you should wrap it up there, Evelyn. And I've got some exit music for you as you wrap it up. So here you go. Go ahead.

Evelyn Douek:                                          Excellent. All right. Well, this has been a Moderated Content episode, special edition on Telegram. This show is available in all the usual places and show notes and transcripts are available at law.stanford.edu/moderatedcontent. This episode is produced by the wonderful Brian Pelletier. And special thanks to Lily Chang and Rob Loughlin.