Moderated Content

MC LIVE 9/27

Episode Summary

Alex and Evelyn repeat the now-annual tradition of recording the podcast in front of probably their entire active listener base. They are joined by David Thiel, Brian Fishman, and Daphne Keller to say goodbye to Thierry Breton and RT's accounts on Meta, talk about Zuckerberg's retreat from politics, and discuss all the developments in the land of the First Amendment and platform regulation.

Episode Transcription

Alex Stamos:                                           We have to start this Moderated Content on a sad note, saying goodbye to somebody we love very much.

Evelyn Douek :                                         He's been really important to this show over the last many months and years, I think, being a reliable source of important content for the show.

Alex Stamos:                                           You guys know who we're talking about? The one, the only, the irreplaceable, [inaudible 00:00:22] Thierry Breton of the European Commission.

Evelyn Douek :                                         Yeah, just shocking, terrible, shocking news. Last week, France's European Commissioner Thierry Breton resigned, accusing his boss, the president of the Commission, Ursula von der Leyen, of undermining him and of questionable governance, which you have to respect as a way of going out. You might remember Thierry from such hits as his hostage video with Elon Musk in Texas, where they awkwardly shake hands and Musk says, "I really think the DSA is exactly aligned with my thinking." His many letters to platforms over the years, effectively efforts at jawboning, urging them to take more aggressive action on disinformation. And most recently, a really touching moment, a memory that will live on forever: his letter to Musk in advance of the interview with former President Trump, the former President of the European... Oh wait, hold on. No, that's not right. The former President of the United States, reminding him of [inaudible 00:01:19] obligations under the European Digital Services Act-

Alex Stamos:                                           It's so hard-

Evelyn Douek :                                         To take mitigation measures.

Alex Stamos:                                           To say goodbye-

Evelyn Douek :                                         A real great track record of-

Alex Stamos:                                           to Thierry Breton.

Evelyn Douek :                                         Greatest hits there. I don't know how we will replace this amazing, amazing stream of content, but we will persevere. We will endure.

Alex Stamos:                                           Hopefully the Europeans can find somebody else to lecture Americans on how the First Amendment is against European ideals of speech. Goodbye, and enjoy that last TGV back to Paris.

Evelyn Douek :                                         Here we are. Hello and welcome to Moderated Content's stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety, with myself, Evelyn Douek, and Alex Stamos, and today recording live from the Trust and Safety Research Conference version 3.0 at Stanford University. All 20 of our listeners are here with us in the room today.

Alex Stamos:                                           It's amazing. It's 500 people. It's guaranteed.

Evelyn Douek :                                         6,000. 6,000.

Alex Stamos:                                           I'm counting the same way that Twitter counts daily active users.

Evelyn Douek :                                         Yeah, that's right.

Alex Stamos:                                           I'm sorry, unregretted-

Evelyn Douek :                                         3 million people, wow, this is-

Alex Stamos:                                           Unregretted seconds. Unregretted podcast seconds. There's about a billion unregretted seconds in this room, yes.

Evelyn Douek :                                         It might be going up. 1, 2, 3, regretted seconds as we proceed.

Alex Stamos:                                           Lots of regrets. There's beer out in the courtyard, folks, if you want to regret these seconds less, you're welcome.

Evelyn Douek :                                         Lock the doors. Lock the doors.
                                                      All right, so we've already said one touching goodbye, but we have another touching goodbye to make on this podcast. Another farewell that I'm sure, Alex, is a really tough one for you, because this is the end of a very long relationship for you. It goes way back. Lots of memories, I'm sure, for you. The relationship between you and RT's accounts on Meta, which were-

Alex Stamos:                                           I'm less worried about fair use here of the Soviet Union filing a complaint, but yes.

Evelyn Douek :                                         Yeah, they don't use fair use. They don't use copyright claims. That's not what you need to be worried about. So yes, last week Meta said it was banning RT and other Russian state media networks from its platforms claiming, shock horror, that the outlets had used deceptive tactics to carry out covert information and influence operations online. Alex, I'm sure you have lots of thoughts about this. Why now? Why here in September 2024 has RT suddenly been taken down?

Alex Stamos:                                           That feel a little late to you?

Evelyn Douek :                                         Yeah. They only just started.

Alex Stamos:                                           This was an adventure that I started on back in 2017 when we were looking at what happened in 2016. As folks in the room know, RT is an only putatively independent media agency, really under pretty much direct control of the Kremlin, with direct financial support from the Russian state, from Russian taxpayers. And what finally got them in real trouble is they've been in this cat and mouse game with Facebook. They've done, what, overt propaganda for a very long time, but for a long time were considered the equivalent of our Voice of America. The R stands for Russia, everybody knew it was Russia, they said that this was their position, and everybody knew that this was the propaganda of the Russian state, and we were okay with it.
                                                      After the actions of the 2017-2018 timeframe, when the Internet Research Agency went down and all these Russian propaganda accounts went down, RT started to take on a different set of responsibilities within the Russian propaganda sphere, in which they started to run a variety of much less overt accounts. Very famously back then we had the [inaudible 00:05:09] media accounts, so they started to build these subsidiaries where they hid that these subsidiaries were actually theirs. They would hire Americans, so it no longer was a Russian-accented voice, but an American who was being paid, or a person speaking Spanish, and such. I think for a while actually, RT Spanish was one of the largest Spanish-language news sources on YouTube.
                                                      And so they got quite big on the semi-overt, semi-covert stuff, where they weren't totally lying about who they were. The final straw here was this last DOJ indictment, in which they directly pointed out that RT was now powering totally covert operations in which they were directly lying about who they were, and that I think was the straw that broke the camel's back for Facebook: directly violating your favorite policy, the coordinated inauthentic behavior policy, the policy that you have never complained about or given me any crap about, either for the naming or for the actual implementation of it.
                                                      So finally, after years and years of RT playing games, of going right up to the line of Facebook's policies, they just declared the whole thing persona non grata, saying, "Instead of playing these games, of doing these behavior-based rules where we know you are going to evade them by changing your behaviors to go right up against the line, and then we ban those behaviors and you go right up again, we're just going to declare anybody who is part of the RT universe to not be allowed," which I think was a totally appropriate thing. It is something that, to be frank, I tried to make happen years ago and failed to do.
                                                      I'm not going to name the people who did this, partially because I think they want to be able to drink tea or fly in private jets or stand next to windows sometime in the future, all things that you have to be careful of doing if you piss off certain people in Russia. But there are some people who are very brave and who pushed this inside of Facebook who should be taking big laps today, including some people who are actually here at the conference who laid the groundwork for this when they were at Facebook years ago, and some people who aren't here who are still at Facebook, who are actually the ones who took this fight internally at Facebook and finally finished what we started eight years ago.

Evelyn Douek :                                         Yeah, I think it's important though to underline that policy rationale and exactly why the accounts were taken down, because it's not just that. You might read a headline, "Meta has finally removed RT," and think, oh, we're now saying all Russian propaganda needs to be taken off our services, or anything along those lines. And ordinarily Meta's answer, and indeed the First Amendment answer to this, has been you can spread Russian propaganda, you can spread any propaganda, provided you're transparent about it, provided you say, "Hey, this is who we are." And this was the line that they crossed in the indictments as well, the failure to register under the Foreign Agents Registration Act. And that's how you're saying Meta justified the policy on its platforms as well. It was this failure to be transparent about manipulation.

Alex Stamos:                                           Right. And for years, Meta was okay with it, and continues to be. If you go on Meta right now, there are a bunch of state actors; CGTN is still up, I believe. A bunch of Chinese actors who still say, we are this. They don't say they're part of the Chinese government, but they're not lying about who they are. I think this is a huge shot across the bow of those actors: you're not going to be able to get away with creating these brands where you completely obfuscate that it's you paying the influencers, or you paying people who are local to certain localities. We'll talk to David in a second about a specific African example, but they were really obfuscating that these were not Russian voices. They were trying to make it look like they were local, legitimate voices within certain localities while hiding the Russian money that was behind them.

Evelyn Douek :                                         So this story has something for everyone, just as it has something that caters to your long-running interests and commentary in this space about RT and Russia's use of the platform. It has something for me and my favorite [inaudible 00:08:55] which is YouTube, because YouTube continues to YouTube about this. Meta takes action. It turns out YouTube also took action on RT's accounts, but it didn't announce this until the great tech reporter, Casey Newton of Platformer asked them to comment on what they were planning to do and they say, "Oh yeah, yeah, sure. Nope. Yeah, we've removed them too," which is exactly-

Alex Stamos:                                           Oh, thanks for asking, Casey.

Evelyn Douek :                                         Yeah, exactly.

Alex Stamos:                                           We were just about to announce that. It's just a complete coincidence that... Yeah, it was just five minutes before-

Evelyn Douek :                                         Oh, my hand was on the send button and really want to be transparent and upfront about this.

Alex Stamos:                                           That is also something that I've never experienced personally, is Google hiding behind their skirts. Yes.

Evelyn Douek :                                         Yeah. So YouTube, never change. Or do change, actually. It would be great if you could change.

Alex Stamos:                                           Why do we not have a YouTube sound? I was just thinking like I need to push the YouTube button.

Evelyn Douek :                                         Yeah, that's right.

Alex Stamos:                                           But we don't. What should it be? We've got the audience here. First, are there any YouTubers in the audience? We should have asked this at the beginning. There's somebody here, it's fine. We're not going to point you out, but what should the YouTube sound be? What's the sound for the lack of transparency? What's a sound for opaqueness?

Evelyn Douek :                                         Sad trombone? Do we have a sad trombone?

Alex Stamos:                                           We've got the sad trombone, but we've used that for Twitter. Right?

Evelyn Douek :                                         Oh yeah. Ouch. YouTube. Yeah, we'll work on it. Happy to take suggestions.

Alex Stamos:                                           Oh, here we go. The crickets.

Evelyn Douek :                                         The sound of YouTube making a public policy announcement.

Alex Stamos:                                           There you go.

Evelyn Douek :                                         Okay. And TikTok also made the move today apparently in taking down the accounts, so-

Alex Stamos:                                           They finally made the move.

Evelyn Douek :                                         There you go. Okay.

Alex Stamos:                                           Which is a little surprising, because TikTok's actually been super aggressive on some of these things.

Evelyn Douek :                                         Yeah, why? Are they feeling heat? Any stress about wanting to look like a very responsible platform for any particular reason?

Alex Stamos:                                           Yeah. You tell me, is there anything going on, say, in the D.C. Circuit right now? I think we'll talk about that in a minute, right?

Evelyn Douek :                                         We will talk about that later. But you mentioned David Thiel's been tracking some other influence operations that have been happening across the platforms. So let's bring David in now, Chief Technologist at the Stanford Internet Observatory. Welcome David, thanks for joining us.

Alex Stamos:                                           David's going to appear magically. Here he comes. Oh.

Evelyn Douek :                                         Whoa. Amazing.

David Thiel:                                           Lovely to be here. Thank you.
                                                      So yeah, it was actually refreshing to have a little bit of a throwback at SIO to take a look at analyzing Russian information operations and following some of the patterns that they had in the past, but-

Alex Stamos:                                           It's classic. Didn't it make you feel like a younger man, David?

David Thiel:                                           Yeah, it was a different time.

Alex Stamos:                                           All of a sudden my beard was black and yeah, I felt much younger.

David Thiel:                                           I didn't have one. So yeah, it's been a while. So one of the organizations that was named by the State Department, among a few, was a news operation called African Stream. It operated across most social media platforms and tended to push a left-wing, somewhat arguably communist, perspective in some cases. Most of it followed narratives that were favorable to Russia, although not entirely, at various times being pro- or anti-Wagner Group as well. And in this case there were definitely slants to a lot of the coverage that were potentially questionable, but not on the whole a disinformation operation or anything like that, except when it comes to the actual funding.
                                                      In this case, it was interesting going back and seeing some of the history of the organization over roughly the last year or so. Most of the people that were involved with it had a decent history of working for state-sponsored media organizations: either Press TV, which is operated by Iran, or CGTN, operated by China. They seem to just circulate among these state-sponsored news outlets.

Alex Stamos:                                           Do these folks have a LinkedIn, do you think? Is it like... a LinkedIn for state-sponsored media? Yeah.

David Thiel:                                           Oh, like an overarching group where they can just-

Alex Stamos:                                           It's fascinating to see the people working for, like, Press TV and then moving to CGTN and moving to African Stream.

David Thiel:                                           It's very interconnected, as well as with some of the more notorious groups like Grayzone, which overlapped with a bunch of these. These have all had various reporters at various times receiving sponsorship money from authoritarian states. So one of the things that was interesting here, and dovetails with trust and safety research, is that they had actually been called out by a number of the people that they were ostensibly targeting: the moderators on Reddit of the Africa subreddit, the Haiti subreddit, Somalia, had called them out various times, called them an astroturfing organization, asked them where they were getting their funding from, and told them that there were huge gaps in their stories or that they were always leaning in the direction of a certain politician.
                                                      And their response, as near as we can tell from what remains, is that they would get very testy, then go and delete all of their testy responses, apologize, and try to give a little bit of detail, which never went beyond, "Yes, we have real people that have an office in Africa." They just said, "Come to our office in Kenya. You can see that we have an office." Nowhere specific in Kenya, apparently. But there were some true statements there, which was that the CEO did have a history of working with Press TV, but they really never clarified the funding sources, just saying, "Yeah, we went to investors and we have a Patreon, and so we get some money there."

Evelyn Douek :                                         Can I pick up on something that you just said? "As near as we can tell," this is what they were saying. We are at the Trust and Safety Research Conference, and one of the things that we've been talking about is data access and transparency on these platforms and how it's been harder to have insight into what's happening on them. So can you expand a little bit on the challenges in following these influence operations in this new environment, where there has been this turn away from transparency at a lot of the platforms?

David Thiel:                                           Sure. So part of why there are all of these gaps is because every time they got burned publicly on Reddit, they would go back and clean up their tracks, delete their comments, delete their posts. Normally we would go to an archive such as Pushshift or something like that and actually see what that content had been, what potentially useful information there could have been for attributing these sources sooner. But the only research archives of Reddit that are available right now basically end at 2022, and that is where open Reddit research ended. There are still real-time archives, but they're only provided to Reddit moderators to make moderation decisions.
                                                      So this was one case where we bumped up against a number of places that could have been very interesting research avenues but had just been successfully wiped clean by a disingenuous actor. Similarly, with the deprecation of CrowdTangle, we had to resort to very manual methods of trying to archive all of the Instagram content. Before that came down, we were able to get a decent amount of metadata to track the various narratives that they were sharing and see that they were cross-posting between organizations like Red Stream, which was another organization named by the State Department. Saying "operated by RT" might be strong in the case of African Stream, but it was funded non-transparently.
                                                      So yeah, there are some gaps that would be lovely to follow up on there. I don't think there's a really good reason for Reddit to be preventing academic access to things like Pushshift for these kinds of research projects. But otherwise, it was an interesting operation. It got actually quite high engagement compared to most of the content that Russian-backed media has put out in the past. And I think part of that is that they actually did partner with journalists that had a significant amount of experience and production experience as well.
                                                      I would also point out that, like, a week prior to this was the whole Tenet Media debacle, where Russian-backed, or RT-backed specifically, entities were found to be funding right-wing influencers like Tim Pool, Lauren Southern, etc. So they are doing the typical thing of trying to split here, where one side is trying to shore up right-wing narratives, and the other side is a combination thing: it's still very Africa-focused and trying to push pro-Russia narratives there, but also targeting people in the US pretty significantly, likely with the goal of trying to breed some feeling of disaffection on the left, splitting it to some degree to the point where people are potentially discouraged from voting and so forth. You get the typical "both parties are two sides of the same coin" kind of thing, and that's a narrative that they would actively promote.

Evelyn Douek :                                         Great, thank you so much both for the work and for the overview of what you found. Appreciate it.

David Thiel:                                           Thank you.

Alex Stamos:                                           Thanks, David.

Evelyn Douek :                                         So are you going to [inaudible 00:18:28] away now?

Alex Stamos:                                           Goodbye, David.

Evelyn Douek :                                         So speaking of political apathy and division and exhaustion about politics, that's a good segue to our next topic, which is a bunch of reporting lately about how Meta is getting out of politics, and in particular how Mark Zuckerberg is done with thinking about politics, talking about politics, and done with apologizing for and taking responsibility for things that are really political issues, that he shouldn't have been sorry for or that weren't his problem. So to talk Zuckerberg-ology, we are going to be joined by Brian Fishman, friend of the pod, co-founder of Cinder, a software platform for trust and safety, who also previously served as a policy director for counterterrorism and dangerous orgs at Facebook and Meta. So I guess the two of you have some catching up to do about Facebook and its new views on politics.

Alex Stamos:                                           Hi Brian.

Brian Fishman:                                         Hi Alex. I interviewed with Alex back in the day and he was my half-boss for a while there.

Alex Stamos:                                           Half-boss. Greatest mistake I ever made. No, actually one of the best moves I ever made at Facebook was recommending Brian. Yeah, so we're going to talk a little bit about, man, what's going on with Mark, what's going on with Zuck? And I'm not talking about the hair and the chains, which just seem like an early midlife crisis. So, the two big things. One, he sent this letter to Jim Jordan, both effectively apologizing for things he should not have apologized for, saying that he understood why people thought there was something political behind the work he had done around supporting the ability for people to vote during COVID, but then also talking about the fact that there had been pressure put on Facebook. Which was not a surprise, because all the evidence had come out, but it's the first time you heard it from his mouth, which is what made it newsworthy. And then we have this reporting from the New York Times, as well as him talking on a couple of podcasts, these interviews with folks where he says he feels like he made a mistake not being politically neutral enough.
                                                      So look, I've got my theory and I'd love to hear your theory here, but my theory is that Mark just feels like he can't win. Post-2016, he's really upset that Russia did this big thing to mess with our election and that Facebook was part of it. He never thought that Facebook was the only platform; it wasn't. I think he at the time rightfully rejected the idea that Facebook was the reason that Trump won, which was what you might call the New York Times consensus of the election, which was never backed by evidence and I think has since fallen away as people have gotten a better understanding of what exactly happened. And so he never really accepted that idea, but he did accept some level of responsibility of, "Hey, we weren't really paying attention." He had built this platform in which a bunch of things happened. And so he really invested for 2018 and especially 2020.
                                                      And in 2020, not only has he invested significantly in trust and safety/integrity, but he throws a bunch of his personal money, over $100 million of his and Priscilla's money, into helping people vote in a totally neutral way. He just basically gives money to states and counties to help them buy machines, help them handle absentee ballots, help them pay for people to be able to count absentee votes, to make it possible for people to vote during COVID, which is just an incredible philanthropic thing. And as a result, he gets this incredible backlash, mostly from the right, all of these crazy theories, many of which are terribly anti-Semitic, calling him like the next George Soros. This term "Zuckerbucks," which meant that he basically bought the election, which makes absolutely no sense.
                                                      So he gets that from the right, which is expected, but from the left I think he feels like he gets betrayed, because he does everything he's asked to do. He builds this huge integrity thing. He has this huge civic integrity team, has this election integrity war room, builds all these rules. January 6th still happens. It happens because people wanted to riot. They wanted to be fooled. The President of the United States told people the election was going to be stolen, and so they fooled themselves into thinking it was stolen. He told them to show up, he told them to take the Capitol, they took the Capitol. And that's not something that Zuckerberg thinks he could have prevented, and he probably could not have prevented it at all. And I think from his perspective, he did everything he could do while still being neutral, and then gets all this blame for what happened after the election.
                                                      And then he does all of this work around the vaccine, in which Priscilla, I think, takes a leading role as a pediatrician, giving him all this advice. And then Joe Biden goes up there and says that Mark Zuckerberg is killing people with vaccine disinformation, which from his perspective... Now, with Biden, it was clear that he was off script, and a couple of weeks later they cleaned it up, because I'm sure a bunch of people in the White House were like, "Mr. President, you can't do that. We are doing all this work with Facebook, you can't just..." But that's got to be a huge betrayal from his perspective.
                                                      And so from his perspective, he's like, "I've done everything you wanted. How am I supposed to win? I might as well just be neutral." I don't know, that's my read on this. And yes, Trump attacks him and the right attacks him, but on the left, if he does everything they want, he's not accepted and he's still attacked. And so he's just deciding... He's throwing up his hands like, "I'm going to opt out. I'm out of this. I don't think I can win at all." And he's decided to just give up.

Brian Fishman:                                         I agree with a lot of that. I think a couple things. I come from a background studying terrorism, which is famously difficult to define. It's easier to define than misinformation. Misinformation is so difficult to define, and there are people here that have probably worked on that in more detail than I have. And so I think there is this dynamic, especially around COVID, where defining those issues was really hard. When there is a lack of definitions, it's even easier for political pressure to come in and play an important role. And I was still at Meta when a lot of this was going down, certainly through January 6th, which was a time when I think there were things we could have done better. I don't think any of them would've prevented January 6th, for all the reasons you laid out, Alex.
                                                      And I do think that there is fundamentally this dynamic that a lot of people pointed to, of folks on both the right and the left trying to game the refs, which is Facebook and YouTube and the others. And I think the thing that you see is that many folks on the right, the Jim Jordans of the world, or the Glenn Becks of the world, have approached this with platforms in an entirely transactional way. You do what we like, we'll lay off. You don't do what we like, we'll increase the pressure. Whereas with folks on the left, and frankly a lot of folks within the Democratic Party in a more organized way, it's pressure all the time. And a lot of this lands on Facebook in particular, though I think less so today, because I think their comms folks have really done an incredible job. The people that are resurrecting Zuck's brand deserve a pay raise and a bonus. They're doing an amazing job. Whatever you think of him-

Evelyn Douek :                                         So that's Elon Musk.

Brian Fishman:                                         [inaudible 00:25:35] Elon Musk, yeah.

Evelyn Douek :                                         He probably doesn't need a pay raise, but yes, he's primarily responsible, I think.

Brian Fishman:                                         I totally agree, Evelyn, I think you're totally right. I just think they've capitalized on that incredibly well and they haven't missed that opportunity. But I think they can't win on the left. It's too easy to fundraise off the pressure on the platforms, and Facebook is this giant wounded animal, or at least it was; it's less of a wounded animal today. At the same time, what you see Zuck trying to do is say, "Look, I'm not part of these policy decisions. Nick Clegg is going to make all those calls. I'm going to go over here and be this product guy." And there's a lot of that that you see, including earlier this week.
                                                      I don't really buy that completely. At the end of the day, there's one CEO, there's one chief executive. I was in meetings where everybody's around the... Everyone's had a really tough argument and there's a very hard call, a 51-49 decision to be made. There's only one person that can make that call. At the same time, that's clearly not what Zuck ever got into this for. He wants to be that product guy. He wants to be the Silicon Valley archetype. He doesn't want to make these calls. I don't think he ever did. And so in some ways I see this as a genuine effort on his part to try to return to his roots, because I don't think he really wanted to get into this political debate.

Alex Stamos:                                           No, he absolutely did. What I always told people was you could tell what was important at Facebook because it reported to Zuck, and what wasn't important because it reported to Sheryl. That's the difference you could tell. And I reported to Sheryl, so that's how you knew that security was not important to Mark. And he's trying to get back to that: with Clegg, he's recreated Sheryl, except Clegg is a different kind of political animal. [inaudible 00:27:19] the transactional, because it's clearly... That's part of it: he believes that he can deal with the Republican side. If he gives them what they want, they're going to leave him alone. He gave the Democrats what they wanted, and what was his reward? Joe Biden says he's killing people. And Lina Khan blocks him from buying Beat Saber while letting Microsoft buy Activision for 80 [inaudible 00:27:41]. Activision, one of the largest video game companies, if not the largest video game company in the world, versus Beat Saber, a company that makes one stupid video game that nobody's ever heard of.

Brian Fishman:                                         It is fun though. It's fun.

Alex Stamos:                                           It's fun.

Evelyn Douek :                                         Super fun.

Alex Stamos:                                           There's no possible way you can come up with an actual legitimate antitrust theory in which you block Beat Saber, but you allow the Activision merger.

Evelyn Douek :                                         Is that Supernatural? Is it Beat Saber or Supernatural? Anyway, it was one of the-

Alex Stamos:                                           I think Supernatural is the name of the company that makes it, but it's just like, there's no possible way you can say you let Microsoft-Activision happen but you're going to block Facebook from buying any video game companies, unless you're just saying, "We don't like Facebook." That's the only plausible answer to that. And so if you're Zuck, you're like, "If this is the world I've got," then yes, you might as well... "At least Trump, I could do a deal with him and he's not going to raise my taxes." And it sucks, it sucks because it's also him giving up his actual personal views for a transactional deal, but it is not an illogical one, which is the unfortunate part.

Brian Fishman:                                         But I also think that there's some overreaction to that letter. He clearly did a letter that was written in a way to give Jim Jordan and Republicans on the Hill a win that they could walk away with. But at the same time, that was a very carefully crafted letter that I think was done in a way to try to indicate frustration with the way the Biden administration handled some of the COVID stuff, which I frankly agree they went too far on. And I generally support the Biden administration, but I do think that there was some of that dynamic where they were putting too much pressure on the platforms, in a way that overall we need more transparency around, and we need really clear guidelines, because I don't think there would be the same kind of agreement with that sort of government pressure if it was in a different direction on a separate issue.
                                                      And we know that our understanding of COVID has evolved over time. And I think we have to recognize, and one of the [inaudible 00:29:38] things that I think we see in all of these dynamics, is that frankly we just have to have some humility in what we're going to do around speech when conditions are fundamentally dynamic. That humility can lead to inaction, and I think that's the danger, but if we lose that humility, it can lead to excess action. And so I think Zuck is trying to understand that and recognize it, but he's between a rock and a hard place. I remember a moment before I left, I was thinking about how we could increase transparency on some of the things around dangerous organizations, and I developed some proposals.
                                                      And one of the questions I got asked as we were thinking about this was, is this going to buy us any wins? Will this get us anything? It wasn't the only question, but it was one of the questions. And I had to go back and say, "No, I don't think it's going to buy us anything. We're going to lead the industry on transparency around these things and it's going to get us no credit with the media. It's going to get us no credit with activists, because everyone is still going to say, 'Let's rag on Facebook.'" When I was asked that question, that's the answer I had to give honestly, and that's a losing argument. That meant I had a losing argument. And I lost that argument.
                                                      But we all have to recognize, as folks on the outside of a big company like that, that we create incentives. And the incentives placed on Facebook were not to be transparent, were not to go do these things, because of all that pressure that capitalized on that transparency, that capitalized on those efforts. I just think we have to look that in the eye as folks that are going to shape policy moving forward and think about how we can do it better in the future.

Alex Stamos:                                           And you're right, the letter was careful and he didn't break any news. We knew about the Biden administration... I agree, the Biden administration went way too far. Specifically the emails where you have some White House staff saying, "If you don't do blank, we will not help you with this." Which is clearly, if you look at the [inaudible 00:31:39] book standard where you're like... We're asking you to suppress speech and we're holding back government power, either not helping you or harming you with government power; that's the thing that touches upon it. I would argue that Donald Trump saying, "I'm going to arrest and kill Mark Zuckerberg" would also hit that level. This is where people not taking Donald Trump literally starts to become an issue, right?

Brian Fishman:                                         Yes.

Alex Stamos:                                           But right, he didn't break any news there. The part of the letter I really didn't like the most was him apologizing for his gifts, because that was just completely unfair, and the attacks against him and Priscilla I think were completely unfair, and I really wish in that section he had just said, "I reject this. This is wrong. And you leaning into this anti-Semitic conspiracy is wrong," but he did it... I think he sent that letter so he didn't get subpoenaed. That's probably the deal that they made there: the committee wanted one last splash before the election, and they were probably threatening a subpoena to get him to testify one more time. And they gave him the letter to... They negotiated the letter so that... And he would not have been able to take that stand and then not get [inaudible 00:32:44].

Brian Fishman:                                         Voting access is a real issue in our country, and Zuck did a good thing when he provided money to support that. He shouldn't have had to do it because we should actually fund this stuff without private donors, but the fact that he did it was a good thing. I agree with you, I wish he hadn't walked away from that. I don't think he should apologize for it because I think he did right by the country as an independent citizen, not as the CEO of Facebook. This is where we define, and obviously this is Moderated Content, it's a podcast about content moderation, but the [inaudible 00:33:16] is we... No, but part of the problem here is that we define-

Alex Stamos:                                           [inaudible 00:33:21] identification.

Brian Fishman:                                         We define all of these problems as if tech is going to solve them, when fundamentally the core issue there, with Zuck providing I think it was $300 million to secretaries of state, is that we've got to fund those secretaries of state, and that has nothing to do with tech; it's about our society writ large. And I don't think we do that well enough on any of these issues around trust and safety: just step back and say, like you just did, Alex, look, if the President of the United States is messaging something aggressively, Mark Zuckerberg isn't going to be able to fix it.

Evelyn Douek :                                         So this is a pretty sympathetic portrait, and I think I agree with a lot of it, and I've said as much. Over the many years I've said that I think we often ask content moderation to solve a whole bunch of societal problems that it's never going to be able to solve. I've also said that I think some of the pressure from the Biden administration definitely was way over the line and also shouldn't be happening in that opaque way; it's really problematic when government actors use that kind of force and authority to pressure these companies. But there is this question I have of... We started with Mark saying, "I'm going to be neutral," and I feel like we might just be going in a circle here, because this whole conversation started with the idea of, well, maybe there is no neutral. What does it mean to be neutral?
                                                      And I feel like I'm having flashbacks to 2016, or whenever it was, 2017, when Mark Zuckerberg says to some podcast or something, "The idea that Russians on Facebook swung this election is frankly a pretty crazy idea." And then cue, we all know what happens next. And this argument, which I think I also do fundamentally believe in, which is there is no way to be a neutral platform; a platform has all sorts of incentives baked in, and it has content moderation rules. It's not going to step away from those, or is it? And I guess that's one question here: how much of this is just rhetoric, how much of this is just PR and political posturing from Mark Zuckerberg as CEO, and how much of this do we think is going to be a substantive change at the platform and how it thinks about these issues?

Alex Stamos:                                           You're giving me PTSD, because I was in Berlin in my hotel room, I was there briefing the BSI, the German government, on their upcoming elections and our Russia investigation, what we were going to do to protect the German elections, while he was giving that interview. And so I'm having a very vivid flashback of screaming in a German hotel room.

Evelyn Douek :                                         Do we have sound effects for that?

Alex Stamos:                                           The actual Wilhelm scream? Yeah, that'd be perfect.
                                                      I think what we're going to have to find out is... Are there going to be any actual changes in how Facebook acts, or is it mostly a positioning change? I think it's quite possible this is mostly a positioning change. They just banned RT.

Brian Fishman:                                         That's my assumption.

Alex Stamos:                                           My assumption is what they're going to do is there's not going to be a war room, because the whole war room thing was silly. It's just not how you do things. You don't put people in a room and like, "Oh look, we just happen to have a bunch of New York Times reporters in a room with all of our civic integrity people. This is just a normal way of how we operate." That was clearly a photo op. You just get rid of the photo and you have people operate appropriately. So I think we should just watch and see how things happen this election. I think it's quite possible that operationally things will be 98% the same as they were in 2020. It's just that they're not going to push it, they're not going to push the stories, they're not going to talk about it in the same way, and he's not going to do donations. That could be...
                                                      I think the biggest thing will be his personal giving. What I hear from him a lot is that his personal giving is going to change. He's not going to do anything that could possibly be... It's just going to be malaria and cancer and medical stuff, things that are not possibly at all political. But it is quite possible the actual enforcement will not change at all. It'll be interesting to see. It'll be hard to tell, too, because everybody here is mourning CrowdTangle. Brandon Silverman's here; he said, "We've been sitting shiva for CrowdTangle here. We should be covering mirrors." But it'll be harder this time for us to collectively know what's actually happening on Facebook, but it's quite possible the actual outcomes won't be that different.

Brian Fishman:                                         Also, a lot of the processes that they have put together have been systematized. Obviously 2016 was this seminal moment. In 2020, we were putting things together in real time, scrambling to do so. Everybody was wondering and asking this question about 2016. And a lot of that stuff works today. We did a decent job in 2020. Definitely lessons learned, undoubtedly. But [inaudible 00:37:46] lot of smart folks, there's been a lot of talk about how those teams have been downsized. There's still a lot of smart people there that have learned those lessons from 2020 and are moving it forward.
                                                      And I see this fundamentally as Zuck trying to rebrand himself and his company as one that is about product and not about politics. And if you're the CEO trying to maximize the value of that company, that's what you're going to do. And I just think at the end of the day, whatever you might think about Mark Zuckerberg and his decisions, the guy is a killer when it comes to being a CEO and making money. And everything that I see him doing here is consistent with that kind of goal. That's how I understand it.

Alex Stamos:                                           I think he's just going neutral because Civilization VII's coming out and he's just trying to clear his... So he can just go... Because his whole thing is the new Civ comes out and then he just plays it until he beats it at the highest level.

Brian Fishman:                                         It makes me want to be 15 again.

Alex Stamos:                                           Yeah.

Brian Fishman:                                         Yeah.

Alex Stamos:                                           So yeah-

Evelyn Douek :                                         What a functioning democracy we have here, where the shape of the public sphere is determined by the release dates of particular video games. It's great.

Alex Stamos:                                           I don't know, that sounds just totally normal to me. Yeah. I think we're going to find out. And you mentioned Elon. The Elon thing takes pressure off him on a number of levels, both from a PR perspective but also practically, because one, he doesn't have to deal with the Trump question anymore, because Trump's on Truth and, to a lesser extent, X. And second, the most important platform on the right now is X. And so if there's going to be a groundswell of movement that is going to cause a January 6th or any kind of violence around the election, the vast odds are it's going to be on X, and operationally it's going to be on Telegram. Those will be the two platforms that are much more likely than Facebook this time around. So even if they've backed off of it 5%, I think the pressure has been relieved 20 or 30% by the creation of a platform that has... X has unified the Parlers and the Gabs and all of that, that entire ecosystem. And I think that greatly reduces the pressure on Zuck.

Brian Fishman:                                         I do think it's interesting, we haven't really interrogated what it means that Trump hasn't decided to come back to Facebook, if that's his... I think it oftentimes gets dismissed as, "Oh, he's trying to make money off Truth Social," but if he really thought that was going to determine wins and losses in this election, wouldn't he be there? Maybe that content is getting distributed in secondary ways.

Alex Stamos:                                           His team's back. They're running ads.

Brian Fishman:                                         His team's back, but I don't know, I don't know what that distribution looks like.

Alex Stamos:                                           So he's personally on Truth and then his team is running all these ads, and I think that's their plan. They're probably telling him, "Sir, you stay on Truth." Because their plan is they want his crazy rants to be hidden. Somehow, he's able to get away with having a wall of text in which he says insane things.

Brian Fishman:                                         It is interesting.

Alex Stamos:                                           And it's not a big deal as long as it's on Truth. And then they can run the nice polished ads on Facebook.

Brian Fishman:                                         But isn't that a fascinating dynamic? The way the media understands what those words mean depends on which electronic platform they're posted on. Because I think you're absolutely right about this, but that contextual framing is a really fascinating thing. And I wonder what the reach looks like for his team's-

Alex Stamos:                                           It is crazy. Like the start of the ABC nightly news should be, "The former President of the United States and possible next President of the United States said these words on the social network that he owns" and they should just read it. That should be the top story every single night. It really should be. And somehow it's not.

Brian Fishman:                                         That's Harris's social media strategy. It's to try to do that, yeah.

Evelyn Douek :                                         Again, functioning democracy, it's great. But it happens on Truth and so we don't talk about it. It's I guess not where the journalists are. All right, thank you Brian for peering deep into Mark's soul and Mark's mind with us here and appreciate it.

Alex Stamos:                                           Mark, if you want to come on Moderated Content, let us know. Thanks Brian.

Evelyn Douek :                                         So speaking of peering into the minds of individual CEOs and the enormous influence that they have on platforms, and the-

Alex Stamos:                                           This is like, what a stretch, "speaking of." What a pivot.

Evelyn Douek :                                         There's been a huge weekend of backflips by two CEOs. This is actually an excellent-

Alex Stamos:                                           You're pivoting like Magic Johnson under the basket here.

Evelyn Douek :                                         Thank you.

Alex Stamos:                                           This is incredible.

Evelyn Douek :                                         Huge weekend of backflips by two CEOs who found themselves under enormous pressure. In fact, we spent the last two episodes of Moderated Content doing deep dives on these big showdowns between regulators and particular CEOs. We did one about Telegram in France, and then X in Brazil. And both of those podcasts are now out of date, and so-

Alex Stamos:                                           We should just delete them, make sure-

Evelyn Douek :                                         They were excellent for the 10 days that they were relevant.

Alex Stamos:                                           Or maybe we cause these things to happen.

Evelyn Douek :                                         Clearly by explaining them so well. So just the brief update of course is that Telegram has agreed to start giving data to governments when they make lawful requests. So it turns out if you lock up a CEO, you can get some response to those emails.

Alex Stamos:                                           So the fascinating question is, is this part of a plea deal with the French? We haven't heard this yet. Is there a quid pro quo here, or is Durov throwing stuff at the wall hoping that they'll let him go? I wouldn't be shocked if the French say, "This is enough," because as we discussed, actually tying him to a specific crime is going to be a big deal in the French system. And so it's a huge win, Telegram actually responding to search warrants and perhaps doing some... they have yet to announce they're going to do any proactive child safety stuff. But if they end up saying, "Hey, we're going to scan these million-person groups for CSAM," if I was the French, I'd be like, "Great, you can go now," because that's a big enough win from my perspective.

Evelyn Douek :                                         Right. Exactly. So still pretty vague and still to be seen, but that's the pretty big development for Telegram, which has made a big deal about how it has never before handed over data of this kind, and it's going to start doing so. And the other big backflip, or back down, has been Elon Musk in Brazil, who last week said that X will start complying with orders from Brazil's Supreme Court in the hopes that the country will lift the block on the platform within the country. And so we'll talk about that more next week, I think, when we return to X deep dives.

Alex Stamos:                                           We have a very, very special episode next week, an entire hour with two very special guests. We're not going to say who yet, but we have two incredibly special guests. We're going to have an all-X, all-Twitter, all-trombone episode. My trombone finger is going to be sore because there'll be so much sad trombone. I'm going to have to get like a sad bass trombone, sad... Yeah, all kinds of sad, different kinds of trombones.

Evelyn Douek :                                         I'm excited. That's right. A trombone choir.

Alex Stamos:                                           It'll be like a... Yeah, sad Hans Zimmer. It'll be incredible.

Evelyn Douek :                                         How's that for a teaser everybody? I bet you can't wait to listen.

Alex Stamos:                                           I really hope those people don't cancel. It's going to be a real problem for us.

Evelyn Douek :                                         Trombone Hans Zimmer. Wow. It's going to be great. So tune in for that.
                                                      But now we want to talk about something closer to home and we have a well-known friend of the pod here to join us to talk about this, Daphne Keller, the director of the Program on Platform Regulation at Stanford's Cyber Policy Center. Daphne, are you there?

Daphne Keller:                                         I'm here.

Alex Stamos:                                           Are you here for the legal corner?

Daphne Keller:                                         I guess so.

Alex Stamos:                                           Sorry, I have to do that.

Daphne Keller:                                         That was very startling.

Evelyn Douek :                                         It's always a joy recording with Alex. You just never know what's going to come out of the headphones.
                                                      So Daphne, the reason why we asked you to come up and talk is because there's been all of this conversation about the First Amendment and platform regulation and what the Constitution means for how we can regulate platforms. These were the huge Supreme Court cases that we had last year; they came down at the end of the summer. And then we, at least on Moderated Content, haven't really been talking about this very much, and that's because nothing else has been going on. It's all quiet on the western front when it comes to platform regulation and the First Amendment.

Daphne Keller:                                         Definitely.

Evelyn Douek :                                         Yeah. You've been sleeping super well.

Daphne Keller:                                         Practically no cases.

Evelyn Douek :                                         Yeah, your reading list I bet is super short. So, obviously being sarcastic: it has been a super busy time. There is so much going on in the lower courts. And of course we also have pending in the Supreme Court, which we actually talked about at the live recording of Moderated Content last year, the Free Speech Coalition lawsuit in Texas about their age verification law, Free Speech Coalition v. Paxton. It's going to be heard. Is there an argument date set, do you know?

Daphne Keller:                                         I don't think so.

Evelyn Douek :                                         Okay. So we're still waiting, but it'll be in the coming months. And this is the case where the Fifth Circuit basically, explicitly, just pretended that binding, relevant Supreme Court precedent that said, "No, you can't have age verification laws" didn't exist, and decided that for some reason it wasn't going to apply it in this particular case. So yes, that one's pending. Are you watching that one closely?

Daphne Keller:                                         I'm watching it medium closely. I'm a platform regulation girl, and this is ultimately... It's going to affect all the platform regulation cases, but right now, it's a case about a regulation targeting porn sites, or sites where a third or more of the content is pornographic.

Evelyn Douek :                                         I did see a tweet, or an X, that you made about this case the other day, though, which I thought was very funny. So it's still waiting in the Supreme Court, and people have been filing amicus briefs, which are friend-of-the-court briefs where people who are not parties to the case say, "Hey court, here's some relevant information that you want to know." You picked up on a particularly funny amicus brief that had been filed.

Daphne Keller:                                         It was remarkable. It was from an age verification vendor saying, "Texas's age verification law is unconstitutional because they didn't adopt the least restrictive means, the most privacy-protective means of age verification, which is our technology."

Evelyn Douek :                                         That's quite the advertising strategy there. Once you pay the lawyers, I don't know how that works out in terms of the cost versus the number of new customers. I don't know if anyone in the Supreme Court is going to be buying age verification technology anytime soon. But yes, I applaud their ingenuity.
                                                      Okay, so that one's pending at the Supreme Court, and then over the summer, over the last couple of months, there have just been so many decisions in the lower courts. So we're going to do a lightning round and step through them relatively quickly.

Daphne Keller:                                         It's so many decisions, Evelyn.

Evelyn Douek :                                         Yeah, I know, the NetChoice restatement of the law is really ballooning here. So let's start with NetChoice v. Bonta. On August 16th, 2024, the Ninth Circuit came down with its decision. What's the upshot of that one?

Daphne Keller:                                         The upshot of that one is: there are parts of this law that are regular old privacy rules about collection and use of children's data, and that might be okay. That's being reassessed on remand. And then there's another part of this law that says, "And when you're processing the data, don't let the children see harmful content, or mitigate the risks, or explain how you're going to mitigate the risks of a list of harms," including things like hate speech. And the Ninth Circuit was like, "That, that, my friends, is a speech law." And in oral arguments the poor lawyer for the California Attorney General's Office just had a terrible time trying to explain how that's not a speech law. So unsurprisingly, that part gets struck down [inaudible 00:48:57].

Evelyn Douek :                                         Right. So that's the California Age-Appropriate Design Code Act has been enjoined there. I think the Ninth Circuit said that it deputized covered businesses into serving as censors for the state. So not a close call for the Ninth Circuit on that one.

Daphne Keller:                                         Not a close call. And just to call out one thing about that: mostly the court spoke as if it was talking about the platforms' speech rights. But when you say "deputize to serve as censors of the state," that is definitely alluding to how the users' speech rights are going to be affected.

Evelyn Douek :                                         Yeah. Okay, so then a couple of weeks later, again in the Ninth Circuit, again California: X v. Bonta, September 4th, 2024. What happens in that one?

Daphne Keller:                                         All right. These are the same panel, same three guys. It was argued the same day. They also struck down this one or struck down the majority of this one. So this was a platform transparency law that listed out some categories of speech disapproved by the California legislature and by me, such as hate speech and disinformation, and said that platforms had to explain their policies and publish quantitative transparency reports about how much content they had moderated. And there too, the court said, "This is a restriction on speech. This is telling the platforms what they have to say and setting forth rules based on the disapproved content of speech." And that also was unconstitutional.

Evelyn Douek :                                         Yeah. And this one's a big deal, I think, or I think we both think that these transparency laws are really interesting; they raise really interesting issues as a First Amendment question. Because technically this law didn't say, "Go and moderate hate speech" or, "Go and moderate disinformation." We've spent a lot of today talking about trying to get transparency from platforms and how important it is. And as platforms are reneging on their voluntary commitments to transparency, maybe we need laws. And ostensibly that is what California had tried to do here: enact this transparency law and say, "Social media platforms need to say what their policies are on these particular kinds of content and what they're doing about it." And the court said, "Even that's a violation of the First Amendment." How worried should we be about the breadth of the court's ruling in terms of future transparency laws?

Daphne Keller:                                         So I personally am fine with it. I think this is reasonable. I also think that what they said doesn't tell us what the answer would be about researcher access to data. If it were a law saying, as with the EU and the DSA, that platforms have to hand over internal data, or if it were a law saying platforms have to permit researchers to scrape content in order to do research, I think that would be a very different situation. That's not the government saying what the disclosures have to be. Maybe it is in the researcher access scenario, but even there it's just saying have a mechanism for researchers to do things like scraping, or potentially have an API. So that's very different than the state saying, "Okay, we want you to explain some things," and they are things about these specific kinds of speech that we disapprove of.

Evelyn Douek :                                         So this is a super interesting question, and the Supreme Court precedent on this is not especially clear: how far can the state go in mandating transparency? And this is a case that may get teed up. Now, California, if you're keeping track, is zero for two on these two laws with those two decisions, but it is not dissuaded. It has kept legislating anyway. And I think it's passed three bills in the last couple of weeks on deepfakes, which are all... They're going to be totally fine, right?

Daphne Keller:                                         I don't know. They're passing hella laws in Sacramento. I can't even keep up with them.

Evelyn Douek :                                         Yeah.

Alex Stamos:                                           As a Northern Californian, [inaudible 00:52:40] good use of hella.

Evelyn Douek :                                         That was impressive. It was seamless. These laws regulate individuals' use of deepfakes in political campaigning, and then another law requires social media platforms to remove deepfakes or AI-generated or manipulated content to do with elections. This one is safe, do you think, under ordinary, established First Amendment principles?

Daphne Keller:                                         It seems like they have a lawsuit already.

Evelyn Douek :                                         Yeah. Unsurprisingly. It's good to be a government lawyer in California, I guess. There's lots of work.

Daphne Keller:                                         I guess so.

Evelyn Douek :                                         Although [inaudible 00:53:19].

Alex Stamos:                                           Newsom's given up a bit, because it used to be that he would threaten to veto these things, that he would push back, and now it just seems that he's like, "Eh, fine."

Daphne Keller:                                         So this is a weird case because... Okay, first of all, I got my facts from the plaintiff's complaint, so caveat there. But apparently... So the plaintiff is somebody who made a parody video about Kamala Harris, and apparently Newsom had said, "Once this law has passed, that video will have to come down. That would violate our rules unless it's labeled as parody." And then Elon Musk got in on it somehow. There was some sparring [inaudible 00:53:54], of course, that Newsom was already involved in for this one.

Evelyn Douek :                                         Yeah. My professional opinion is these laws are pretty flagrantly unconstitutional. We don't like it when red state legislators and governors pass performative legislation battering platforms around the head and telling them to do things, and so we shouldn't like it here either.

Alex Stamos:                                           But we haven't had an explicit deepfake First Amendment case yet.

Evelyn Douek :                                         We have not. It's true.

Alex Stamos:                                           So do you think that will be a... Is California in the running to get the first one or do you think they'll come out of a red state?

Evelyn Douek :                                         Yeah, I think it's entirely possible that these laws could... There is this First Amendment challenge. We will see what happens, but it's possible that it could go all the way up.

Daphne Keller:                                         It seems bonkers to me. Even supposing you could require parodies to be labeled, to make sure people know that they're parodies, people would know this one was a parody just by looking at it. The idea that you have to stick a label on that, and otherwise you can't distribute it on major platforms...

Alex Stamos:                                           It just seems like maybe the law is a little early, because in two or three years you're going to end up with anybody being able to make stuff that is not obviously parody. It's just interesting that it's coming up now. If they had waited a couple of years, you could have had judges who are inundated with examples where it's so obvious that this technology is incredibly dangerous in the political sphere as well as in the personal sphere, and they would be a little more careful about attaching full First Amendment protections to this as political speech versus something that's an action.

Evelyn Douek :                                         So you may be right. This is always the question: has technology changed so fundamentally that our old First Amendment principles can no longer govern? I think that the political speech question will still be relatively easy for a while. I think the more difficult question, the harder question, is going to be sexually explicit deepfake images of women, where there is that severe privacy harm and dignitary harm that comes from this speech. It's not political, it's not engaged in public discourse and the marketplace of ideas. If we have the first deepfake case, I think if I were the Supreme Court, that's the case that I would want to take [inaudible 00:56:11] those laws coming, yeah.

Alex Stamos:                                           Have any of those made it to the circuit level, do you know?

Evelyn Douek :                                         I don't think that any have been challenged-

Alex Stamos:                                           Because Riana was looking, Riana Pfefferkorn, our colleague. She was looking, and what's happened so far is that the people who've been prosecuted for AI-generated CSAM almost always have tons of real CSAM, so it doesn't really matter. So there's no big win for them to challenge. We're going to enter that era where somebody is careful enough that they'll have 10,000 pieces of AI-generated CSAM and no real, not 10,000 and 10,000. And that will be the first person where it's like the cops really want them, the DA wants them, and they're willing to take the fight all the way up, and then that'll be it.

Evelyn Douek :                                         Yeah. So I think that's going to be our jobs and our podcast content in like two to three years: all the First Amendment challenges and Supreme Court cases about these deepfake laws.

Alex Stamos:                                           Because those people exist now. There are going to be people who want that content and are smart enough to generate it now, who are like... They're not going to touch the real stuff; they'll create it all themselves, or they'll only trade the fake stuff.

Daphne Keller:                                         Okay, but another way this could come up sooner: there's a right of publicity law in Congress right now that would give really aggressive rights to people who appear in deepfakes. There's a law that already passed in Tennessee; Tennessee, because of the music industry, tends to have strong right of publicity laws. So we could see a civil case based on deepfakes that's very sympathetic, that pushes on these First Amendment questions sooner and by that avenue.

Alex Stamos:                                           And we have several state NCII laws, the Danielle Citron laws, so it'll be interesting if any of those... I have not heard of any appeals out of those yet.

Evelyn Douek :                                         So they've by and large been upheld?

Alex Stamos:                                           But generally those are real photos, right?

Evelyn Douek :                                         Exactly.

Alex Stamos:                                           Yeah, it'll be interesting to see if any of those come out of ML-generated.

Evelyn Douek :                                         Right. So that's the next chapter. The current chapter of course is child safety, which is what is generating a lot of laws and challenges as well. And we've been talking about those that have been passed in a bunch of states around the country and progressively struck down by a bunch of courts around the country. And the latest addition to that chapter has been NetChoice v. Reyes in Utah. What's the caption on that one, Daphne?

Daphne Keller:                                         [inaudible 00:58:26] the law got struck down

Evelyn Douek :                                         Again?

Daphne Keller:                                         Again. I think there are five state laws that have been struck down or mostly struck down this way: Texas, Arkansas, Mississippi, Utah, California? It's funny, the laws are all this grab bag of different things that might be in a child safety law, and I can never remember which one is which, but they're consistently getting struck down. Something I didn't like about the Utah case is there are actually two cases there. NetChoice, the trade association for the platforms, sued based on the platforms' speech rights, and then some users sued based on the users' speech rights. And the users were teenagers who help victims of abusive polygamist families in Utah find a way out. They're really sympathetic plaintiffs, and the court was like, "Oh, they don't have standing. Plaintiffs, users, you can't bring this case. This is just a case about the platforms' speech rights," which is the wrong way to go about it as far as I'm concerned.

Evelyn Douek :                                         Completely the wrong way to go about it. And it's also just such a shame that so much of this First Amendment conversation is so focused on platform rights and what the First Amendment rights of platforms are, rather than talking about what the First Amendment rights of speakers and listeners on the platforms are.

Daphne Keller:                                         Of course, that's because [inaudible 00:59:39] paying to litigate it right now. Attention, funders: if you would like to fund any user litigation, this is an important thing to do right [inaudible 00:59:46].

Evelyn Douek :                                         NetChoice, the restatement of the law, for a reason.

Daphne Keller:                                         Okay, so can I do a little bit more on child safety?

Evelyn Douek :                                         Yeah, please.

Daphne Keller:                                         There are basically two batches of litigation. There are the challenges to the state laws, and then there are tort claims, things like school districts suing platforms on theories like addiction or products liability. So those are running in parallel, and that second batch doesn't get nearly as much attention. But there's an interesting thing that has happened in those cases at the lower courts, and also happened, as I understand it, in an earlier ruling in the Utah case, which is some questions about the limits of Section 230 immunity, where the courts in the aggregate have suggested maybe there isn't immunity to the extent that the claim is based on some specific platform design features, such as continuous scroll, autoplaying videos, and notifications. And it's not clear if there could be liability anyway for those design features without it also being about user content and running into 230. But it is an interesting exploration of the edges of 230 and what these "design claims" are about.

Evelyn Douek :                                         Okay, so speaking of the exploration of the edges of 230, and I worry we're going to trigger you talking about this next case, Daphne, because there's a lot of déjà vu here: Anderson v. TikTok. Have there been any cases recently about whether Section 230 gives platforms immunity for the way they amplify content, and the argument that it doesn't because that's their speech? Is this ringing any bells at all for you about recent [inaudible 01:01:31] Supreme Court-

Daphne Keller:                                         Yeah, maybe like a year and a half ago before the Supreme Court we had this exact same issue. Yeah.

Evelyn Douek :                                         That's right.

Daphne Keller:                                         And they didn't resolve it, in the Third Circuit's defense.

Evelyn Douek :                                         Right. So Gonzalez... Loyal listeners of the pod will remember we did so many episodes about this: the idea that platforms might lose 230 immunity for amplifying content. In that case it was terrorist content on YouTube. The Supreme Court ducked the question, said it didn't want to answer it. Turns out trying to distinguish amplification from any other kind of feed was very, very difficult. And then the Third Circuit here, undeterred, on August 27th found that Section 230 wouldn't immunize TikTok in that case for its For You feed recommendations. And as always, these cases are based on tragic facts. In this case, completely heartbreaking: a 10-year-old who died after attempting to recreate a blackout challenge video that she had seen on the platform. It's always heartbreaking.
                                                      But this court comes in and answers the question that the Supreme Court had ultimately ducked, and it does so in part based on Moody. It says that Moody from last term had fundamentally changed this question. Moody, of course, people will remember, held that when a platform moderates content on its platform, that's a First Amendment-protected activity, First Amendment-protected expression. How does that play in here, and does that make sense, in the Third Circuit's defense?

Daphne Keller:                                         They took an argument that Clarence Thomas likes to make, and that sometimes he gets Alito and Gorsuch to join him on, which is to say, "If ranking is the platform's speech, then how can that be immunized under Section 230? The claim isn't about the user's speech, it's about the thing the platforms say is their own speech." Which is superficially appealing. But as the literal drafters of 230 explained in their Gonzalez brief, the whole point of 230 was to get platforms to exercise editorial discretion, to immunize them so they would go out and exercise these First Amendment rights to set and enforce editorial policy, including through deciding how to rank things. This is, again, in the brief of the drafters of Section 230. The idea that you lose 230 anytime you're acting like an editor and exercising your speech rights as a platform would mean you always lose 230; it would be a meaningless law under this theory.

Evelyn Douek :                                         So this is going to create... has created, or if it gets upheld on appeal, will create a circuit split. And so all of these cases are basically like a DDoS attack on the Supreme Court's docket in some sense, because they're creating all of this First Amendment law, and we're waiting to see what the Supreme Court will take or not take, where it'll nibble. Speaking of cases that they might take, and I think probably will take, and speaking of TikTok: there were oral arguments a couple of weeks ago in the TikTok ban case, and we won't talk about that too much except to say that I was surprised; it was a very hostile bench for TikTok and TikTok's users arguing that it was a violation of the First Amendment to ban the platform. And so I think I'm a little pessimistic, and I say pessimistic because I can't quite believe that we sit here in this land of the First Amendment and we're about to potentially uphold the ban of an entire speech platform.

Daphne Keller:                                         Just like Brazil.

Evelyn Douek :                                         Right, exactly. But I don't want to spend too much time on it because I don't think it matters all that much what happens in the DC Circuit because I think this one's probably going up to the Supreme Court, and the Supreme Court will probably take that in the next couple of months. It's on an expedited timeline as well. So that's one to watch.

Alex Stamos:                                           If it goes to the Supreme Court, do we have to wait until next year?

Evelyn Douek :                                         As in next calendar year?

Alex Stamos:                                           Yeah.

Evelyn Douek :                                         Yeah, it'll happen next calendar year, I think, yeah. But that's not very far away. What, you think that's slow? That's very fast in law circles, on law timelines.

Alex Stamos:                                           Right. It's just kind of funny because this whole thing is like... It's supposed to be-

Evelyn Douek :                                         There's a national security threat.

Alex Stamos:                                           Right. It's supposed to be playing out like at national security speed.

Evelyn Douek :                                         We must ban this because otherwise everything is going to... Yeah, exactly. That's actually one of the arguments that I think the users are making, or it might be TikTok itself, saying, "How can you say this is an imminent national security threat with no other solution except to ban the platform, and then allow six months to try and find a buyer and divest?" If the threat was that serious, surely Congress would want to see something happen earlier.

Daphne Keller:                                         But there's some chance that these other cases that we've talked about could still get in under the wire and make it into this Supreme Court term. So the California case about transparency possibly could, the California case about the age-appropriate design code possibly could, and then the case about TikTok and ranking liability and Section 230 possibly could. Although right now the parties are petitioning for en banc review by a bigger set of Third Circuit judges. I really, really, really hope none of those things go up this year because I'm so tired and next year would be a good time to think about those things.

Evelyn Douek :                                         Yeah, it is. So like we said, not a lot going on in First Amendment platform regulation land. It's been busy. Thank you for the roundup, Daphne, that was very comprehensive. And thank you everyone for coming and joining us.

Alex Stamos:                                           I just got an instant correction. You're right, it's Within, that was the company that Meta was stopped from buying. I'm sorry. You're right.

Evelyn Douek :                                         It's okay. I wasn't going to insist on it in front of everyone, but I knew I was right about-

Alex Stamos:                                           Thank you for not embarrassing me in front of everybody. I admit you were right.

Evelyn Douek :                                         Thank you, Alex, for saying on this podcast that I was right. Could someone clip that? I'm just going to replay it.

Alex Stamos:                                           You are attractive and I am ugly. You are smart and I am stupid.

Evelyn Douek :                                         This is the best outro ever. Thank you very much. This has been your Moderated Content weekly update. This show is available in all the usual places and show notes and transcripts are available at law.stanford.edu/moderatedcontent. It is produced by the wonderful Brian Pelletier, audio engineering this week by Alex Stamos. Special thanks to Lily Chang and Rob Huffman. And special thanks to all of you for coming and listening to us record today.

Alex Stamos:                                           Thanks for coming, everybody.