Moderated Content

MC Weekly Update 3/20: He's baaaaack!

Episode Summary

Alex returns and he and Evelyn discuss: the rapidly escalating TikTok situation, and what the possible endgame is; the return of Donald Trump to Facebook and YouTube, just as he calls for people to protest and take the country back on Truth Social; more of the same over on Twitter and a few small updates in the Legal Corner.

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Alex Stamos:

I don't see the Chinese regulators allowing ByteDance to sell or to spin out a US subsidiary.

Evelyn Douek:

Right. Do we know anyone that has a loose $50 billion or so and an interest in purchasing social media companies and a desire to please the Chinese government? Is there-

Alex Stamos:

Right. I think the problem for... yeah, there's a guy but I think he's a little-

Evelyn Douek:

The funniest possible outcome. He's busy. It's a shame.

Alex Stamos:

He's a little busy.

Evelyn Douek:

Hello and welcome to Moderated Content's weekly news update from the world of trust and safety with myself, Evelyn Douek. And Alex Stamos is back this week. Welcome back, Alex.

Alex Stamos:

Yeah. Thanks, Evelyn. I think you and Riana did a fantastic job. I think I should probably just retire. I give up the microphone. It's hard to live up to Riana.

Evelyn Douek:

It's true. But we did miss you when we had to say, "The Twitter files."

Alex Stamos:

Oh, right. Yeah. But you did find the one person at Stanford who talks faster than I do. With the two of you together, I had to drag it down. It's the first time I had to go sub-1X on my podcast player.

Evelyn Douek:

We're just training our audiences to have a competitive advantage in this new information age. That is the goal of this product.

Alex Stamos:

Perfect.

Evelyn Douek:

You're welcome, audiences. This is a two for one.

Alex Stamos:

For those who don't know, I teach a class with Riana, and the feedback we have gotten is like, "Wow, we thought Professor Stamos was a really fast lecturer on Monday, and then Professor Pfefferkorn came on stage." Yeah.

Evelyn Douek:

You can ask me to guest lecture if you want, to look good by comparison. That's fine. Okay, so the big story this week is our TikTok tick-tock. Things have just kept heating up. In only the last week since we recorded, the Justice Department is investigating the surveillance of American citizens, including journalists who cover the tech industry, by ByteDance, the company that owns TikTok. And then of course the bigger story is that the Biden administration is pushing for a sale of the app, suggesting that it doesn't have faith in Project Texas and the CFIUS process to adequately address the national security concerns.

Alex Stamos:

Yes. Thoughts and prayers for Oracle who are not going to get their $1.5 billion to do something that is effectively impossible.

Evelyn Douek:

To write a content moderation checklist and go, "Doing great, guys. Keep it up."

Alex Stamos:

Yeah.

Evelyn Douek:

Yeah. So we have talked about this a lot. It's kind of wild to feel like we entered a time machine and went back to where we were when President Trump was trying to ban TikTok. But it's basically the same policy position that we have ended up with, it seems like. What are your thoughts on this, Alex? What's new here?

Alex Stamos:

Well, it's all coming to a head. Yes, this started with President Trump. You and I have talked about TikTok before; there are really legitimate concerns here, and there are three of them. First, there's the possibility of tweaking the algorithm to manipulate people. Definitely a possibility. There's some evidence that they have highlighted stuff. There's not good evidence of that being done in a lot of political ways, but certainly in some really embarrassing ways, like downgrading people who they think are overweight. Not cool.

There's direct censorship, of which we have a couple of examples of people who are anti-PRC and such having their content taken down. But for me, by far the most concerning thing here has been the data access issue, of the amount of data that any social media company has and the fact that it was pretty clear that Chinese employees had access to that. And that was proven through this case where they investigated a US journalist and tried to track her to see if she was meeting with TikTok employees, which seems to be the source of the FBI investigation.

So this is an interesting question, is what is the crime that they're possibly investigating? Perhaps this is under a broader... I guess it's not FARA, but some kind of foreign surveillance. Because they have an ECPA SCA responsibility around content, but my understanding is that generally does not cover things like IP addresses and physical location, which I think is a real problem.

Evelyn Douek:

Yeah, that's terrifying.

Alex Stamos:

Right. And I think that points to really what... it'll be interesting to see. It should be a crime. It should be a crime for Chinese employees to go investigate US journalists, because in this case there's no evidence that these folks were doing so on behalf of the People's Republic or the Chinese Communist Party, but functionally it'd be exactly the same. It's that the MSS would show up, the Ministry of State Security, and say, "Hey, we want to know about this American." And employees would go, "Yo." People talk about it being dumped to the Chinese government and such. We actually know this pretty well because it's documented at other companies. You effectively just have MSS representatives who are embedded in the org who can then ask the employees to do queries and such.

And so the fact that we don't have a law that directly speaks to that I think is one of the things that this demonstrates. So it'll be interesting to see if DOJ can invent something. And I know federal prosecutors are famously inventive about using incredibly broad laws and applying them to new things, but certainly I do not know of anything that directly applies here. Do you?

Evelyn Douek:

No. I haven't read what the crime is. No.

Alex Stamos:

Yeah, and I don't think they've announced it, because there's no indictment, there's no affidavit or complaint.

Evelyn Douek:

Yeah.

Alex Stamos:

And so yeah, I think it's all coming to a head. The Biden administration has given up on Project Texas. TikTok has been on this tour. And they've done a bunch of talks in front of semi-public audiences of speakers and such. So some of our people went to an event they held and were impressed by the quality of the slides. But in our discussion afterwards, nothing changed, in our opinion, on, is this actually a practical way to try to prevent this kind of surveillance?

And so I think what Biden is pushing for is the only reasonable outcome that allows TikTok to exist and for US citizens to use it, while also ameliorating the concerns, not just in the US but internationally. Because we've got to remember TikTok is the product for everybody except inside the PRC. So anything they do for America they're probably doing for the entire world, unless they do something really complicated. Now, I think this will actually be... this is a situation that could be financially beneficial to ByteDance, because there's no reason why you can't allow ByteDance to own 30%, 40%, 49% of the final public company as long as they have limited voting rights.

And you could build that into their shares. So you could have ByteDance economically benefit from this spin-out. And it's quite possible that a US-listed TikTok public American company will have a much greater valuation than what ByteDance has traded in China for a variety of reasons. Yeah, for the difficulties of Chinese stocks and such, and the fact that Chinese currency is not fully convertible, stuff like that. But also because they'll maybe be able to get back into India. And so I think financially this would make a lot of sense for ByteDance, but it makes no sense for the PRC. And I think that's where this is going to hit the wall, is I don't see the Chinese regulators allowing ByteDance to sell or to spin out a US subsidiary.

Evelyn Douek:

Right. Do we know anyone that has a loose $50 billion or so and an interest in purchasing social media companies, and a desire to please the Chinese government? Is there?

Alex Stamos:

Right. I think the problem for... yeah, there's a guy but I think he's a little-

Evelyn Douek:

The funniest possible outcome. He's busy. It's a shame.

Alex Stamos:

He's a little busy. Well, he's already talking about buying Silicon Valley Bank, which is... of all the people to reestablish trust in the banking system. Everybody just let that one fall on the floor. But I think TikTok is worth way more than Twitter. TikTok would be, at current size, it looks like they have 100... one of the news stories is the CEO of TikTok, Shou Chew, is going to testify in front of House Energy and Commerce on Thursday. And some of the things that have leaked out is they're going to talk about how there are 150 million American users of TikTok.

So if they're looking at 1.5 or so billion MAUs globally and 150 million Americans, and equivalent penetration in Europe, this is a $250 to $300 to $400 billion company. Which is why I think there's a lot of value to ByteDance that could be unlocked here. But what is economically in the benefit of ByteDance is not what's in the benefit of the Chinese Communist Party.

Evelyn Douek:

Yeah. I love that as a comms strategy, by the way. "Oh no, we're going to go to Congress and tell them that we are way more integrated in your society than you even thought." If you were worried before, let me tell you, it's 50% more users than you originally thought.

Alex Stamos:

Yes.

Evelyn Douek:

Yeah. So what's the endgame here? Because I don't understand how this plays out after this. First of all, again, I would lose my job as a First Amendment professor if I didn't say, "First Amendment, First Amendment, First Amendment." I just don't see how a ban here withstands scrutiny in terms of the complete over-breadth and disproportionality of banning an entire communications platform used fruitfully, educationally, and just for fun for apparently 150 million Americans. But even setting that aside, which we can't, this is not going to be the last China-based app. We are in an information world now where this is an integrated information environment. What's the end game here? Are we just going to pick them off one by one?

Alex Stamos:

Yeah. It's a great question because, one, have you looked at the bill that's being proposed to give Biden the ability to ban TikTok?

Evelyn Douek:

Yeah, we talked about the Restrict Act briefly last week.

Alex Stamos:

And as structured, do you think it would have First Amendment problems?

Evelyn Douek:

Yes.

Alex Stamos:

Yeah.

Evelyn Douek:

Yeah.

Alex Stamos:

Okay. Right. So I think one of the challenges, one of the missed opportunities here was: what is the real problem here? And from my perspective, the real problem is the movement of American PII into systems where that data can be accessed by America's adversaries. And so we had an opportunity to perhaps frame this in that direction. Instead of framing it as "ban TikTok," framing it as controlling American PII and controlling which countries have access to it, and passing a privacy law that, one, sets federal standards for privacy, which is going to have to happen because every state's passing a privacy law. We're ending up in this huge, ridiculous mess where we're going to have all these different privacy laws. So we need an overriding federal privacy law anyway so that we're not living by 50 different standards, which is totally impractical.

But in doing that, we also could say, "Hey, there are certain countries where American PII cannot be processed." And building up a wall there, such that the work TikTok would have to do to be in compliance goes beyond even Project Texas, or perhaps requires a spin-out, would be a very reasonable outcome. But instead, because we only talk about TikTok, one, it misses all the other companies we've talked about, the WeChats and the Tencents. Alibaba has a cloud product that is used by a bunch of American companies, and I don't know of any laws that restrict what kind of data American companies can move into Alibaba's cloud service, an infrastructure-as-a-service provider. So the fact that we're missing out on all these other things is really a missed opportunity. And like you said, let's say they, quote unquote, ban it. Let's say they rewrite the Restrict Act so that it's effectively an interstate commerce thing and doesn't talk about speech or anything. TikTok ends up moving all of their operations to Singapore. What happens next?

So you have Americans downloading the app, talking to web services that exist outside of the United States, and it's the exact same outcome. Okay, then they have to go after the app stores. And that seems like something the government could never really have direct power over, other than just trying to jawbone Apple and Google into removing TikTok from the app stores. But that would be a hell of a thing for those companies to do.

Evelyn Douek:

Yeah. So as a policy matter, it's an under-inclusive solution that doesn't address the real data problems that we have. As a legal matter, it is flawed and highly constitutionally suspect and infirm. And then there's the political issue of, I do not want to face the wrath of all of the teens of this country losing their access to their favorite pastime. I think the end game is they all just go and get elected to Congress and reverse everything. We will see.

Alex Stamos:

So maybe we found a solution to the problem of young people not voting, right?

Evelyn Douek:

Yes, exactly.

Alex Stamos:

All you have to do... It's unfortunate that this is the situation that gets them to come out.

Evelyn Douek:

This is a pro-democracy measure. You just need to think bigger. It is. Yeah.

Alex Stamos:

So I think something has to happen. I do think TikTok is a real risk. But I just feel like, the way this has been approached, the Biden administration missed an opportunity because they went down the same path that the Trump administration started, which was only focusing on TikTok and starting down this Project Texas path. Which, as we discussed before... the fact that a huge Trump donor happens to get the bid for the contract there was a little bit suspicious from the start. But it was never really going to solve it.

And then we wasted all this time on that when that time could've been spent coming up with a solution that handles all these other companies. Because this is not... if it takes us three years every single time we're dealing with a massive Chinese tech company, that is a losing path. We have to have some kind of actual standard here, because then that will be reflected in Europe. It'll be reflected in Australia, in Japan and New Zealand and all of our allies if we can come up with a good way to handle these adversaries. But just banning TikTok might get followed or not, but it doesn't really create a precedent that I think is super helpful for the western alliance.

Evelyn Douek:

Yeah. And it's not like the technology is moving particularly fast in this area or anything.

Alex Stamos:

Yeah.

Evelyn Douek:

The legal process is really doing very well at keeping up. Okay, speaking of things that are like 2019, 2020 redux: Donald Trump being returned to both Facebook and YouTube this week. YouTube reinstated Trump's account on Friday, almost exactly two years after then-CEO Susan Wojcicki said, "We will lift the channel when we determine that the risk of violence has decreased." Now, it's really evident that they have thought long and hard about this, considered the really difficult normative and theoretical issues that are raised here, the free speech concerns on both sides, because their two-Tweet thread explaining this outcome really was a comprehensive explanation.

Here is the entirety of their explanation: "Starting today, the Donald J. Trump channel is no longer restricted and can upload new content. We carefully evaluated the continued risk of real world violence while balancing the chance for voters to hear equally from major political candidates in the run up to an election." And that is all the reassurance we get. I really feel like I understand their complicated decision process. All of the equities that they've thought about. The lessons learned from this difficult and painful process, and what the standards are going to be in the future. Speaking of, they don't have long to work out what those standards are because, meanwhile, over on Truth Social Trump is posting that he's going to be arrested this week and asking people to protest and take our nation back. So it's great timing.

Alex Stamos:

Right. Within 24 hours of this decision, Trump announces that he thinks he's going to get arrested and then calls for violence in the streets in case he is. They can't really predict that, but it was clear that at some point this is going to happen. Just the speed at which this happened is amazing to me.

Evelyn Douek:

Yeah.

Alex Stamos:

They had like a day to bask in it and then they're like, "Oh my God."

Evelyn Douek:

It's like the writers of this sitcom needed to fit it all into the one 30 minute episode, so it just became one after the next.

Alex Stamos:

Right. The writers of America have decided that the season finale is not going to be one of those extra double-length episodes; it's going to fit within the 42 minutes that HBO has allotted them. And we're going to squeeze it all in. Yeah.

Evelyn Douek:

Well, we're busy people. We don't have time to watch a double episode. Yeah. So it's going to be interesting to see how Facebook uses the new guardrails that it said it put up on Trump's account and how YouTube deals with this, which is the most... it wasn't predictable that it would happen within 24 hours, but it was totally predictable that this would happen along some timeframe. And so yes, I have faith. I have so much faith that we have learned so much in this period of reflection and it's not just going to be all the same circus all over again.

Okay. Speaking of circuses, over to our Twitter corner. Excellent. So in our list of non-updates, Musk is still the CEO, there have been no changes to the API, and we still don't have an open source algorithm, although Musk did Tweet this week that they will open source all code by March 31st. And very candidly said that, "The algorithm is overly complex and not fully understood internally. And providing code transparency will be incredibly embarrassing at first," which strikes me as true, Alex. What do you think?

Alex Stamos:

I think those are all true. I think one of the interesting things that's happened in the last couple days is there's a long story about Musk's interference in self-driving cars. Did you read that?

Evelyn Douek:

Yeah.

Alex Stamos:

And I feel like somebody had a quote in there, which is, "People did not believe us about how hard it was to do engineering under Musk until Twitter happened." And that now everybody has this incredible retroactive respect for the engineers at Tesla and SpaceX for getting anything done. And this is exactly the kind of thing. Like, "We're just going to open source our code." One, the Twitter recommendation algorithm is going to make very little sense without access to all the backend data.

It's just going to feed all these conspiracy theories because you're going to have a bunch of variables in there and stuff that can't be defined. Things that are calculated based upon the current data set that you could come up with a theory of, "This is why I'm shadow-banned," or, "This is why my content is pushed down." It's just not going to end well because you have this incredibly complex system that feeds back on itself. And the recommendation algorithm only makes sense in the context of the content that it's looking at. And so yeah, it'll be fascinating to see the code.

I have a feeling... I think it's safe to put $5 down on that we're going to blow past the March 31st deadline. That's going to be real... even if that's all his engineers were doing, which would be shocking because they have 40 other things they've been told to do, that would be not a lot of time to get that stuff abstracted out and ready for sharing.

Evelyn Douek:

$5. I just don't know if I can risk that kind of liquidity in today's market.

Alex Stamos:

Well, I'm not making you bet one Bitcoin. I don't know if you saw, but Balaji, one of the venture capitalists, bet that the US financial system is collapsing and that Bitcoin is going to be worth $1 million. And so he bet somebody one Bitcoin, so like 20 grand, that it gets to $1 million. A number of people took his bet. And so that should be... the odds of him paying out I think are not great, which will be interesting: people suing based upon a Twitter promise. A Twitter bet.

Evelyn Douek:

Excellent. More legal cases to look forward to. Another Tweet that caught my eye this week from Musk, Alex, was, "In the months ahead we will use AI to detect and highlight manipulation of public opinion on this platform. Let's see what the PSYOPs cat drags in." Just curious, as someone that has spent a lot of time and work detecting and highlighting manipulation of public opinion on that platform, did you ever consider just using AI?

Alex Stamos:

Yeah. No, but oh my goodness.

Evelyn Douek:

Just go buy some AI, Alex. Come on. What have you been doing with your time?

Alex Stamos:

Right. So it turns out machine learning is sometimes really effective at finding coordinated campaigns, especially when you have access to the data that we don't have on the outside, which can be used internally, not looking at the content but looking at: are there indicators here that these accounts are being run by the same person? But that being said, Twitter had that. They had a research coalition to look at it, they had a process by which they did this, and they had investigators internally who worked with folks on the outside to try to figure out what's going on. And he got rid of all that.

So he should be worried about outside interference because, from what I have seen, people are running rampant on Twitter now. There's a lot of January, February accounts that are gaining a lot of play that look very, very, very suspicious. And so I think he should be worried about people trying to influence things, but probably should not have fired all the people who do that if that's what he really cared about.

Evelyn Douek:

Right. Turns out at this point you still need people to look at the findings and act upon them, whatever comes up. Okay. And then heading briefly over to our legal corner. Two quick updates: the first is that, as expected, there's the challenge to the New York Hateful Conduct Law that we have been tracking, which requires social media platforms to post how they're going to deal with hateful conduct. And we had a podcast interview with Eugene Volokh, one of the plaintiffs in that case challenging the law.

The New York attorney general is appealing the decision that preliminarily enjoined that law when Eugene won at first instance. So that's not surprising. And so it is headed to the Second Circuit. And that'll be interesting to watch. It is, in a sense, like a blue state version of the Texas and Florida laws that we've seen, where they're just requiring platforms to post a policy and say what they're doing and offer users an opportunity to appeal.

So it'll be interesting to see how that plays into this NetChoice milieu as it heads up. And then the second update is a ruling from the Ninth Circuit in a case called O'Handley, finding that there was no First Amendment violation when Twitter restricted posts based on the California secretary of state and its Office of Elections Cybersecurity flagging those posts through Twitter's Partner Support portal. So Twitter has its civic election integrity policy that it was enforcing in the run up to the election, recognizing both that the volume of content on its platform was extremely high and it's going to miss things, and also that local outside actors have knowledge of what's going on.

It granted access to this portal to election officials in at least 38 states, including the California secretary of state. None of this was secret. It was all quite public in the run up to the election that this was how Twitter was enforcing its policy. And the court dismissed a jawboning action by O'Handley, who had his Tweet taken down on the basis of being flagged by California, saying that there was no pressure. This was no illegitimate jawboning. This was not coercion. The office was just flagging it to Twitter and saying, "Hey, could you take a look at this?" And then Twitter could do with it whatever it wanted based on its own internal rules.

Not a surprising decision. I will say the decision also didn't really dig into the legitimately extremely difficult issues here around, what is appropriate jawboning? What is appropriate pressure? This was just not the case to do that, perhaps because there were just no signs of the kinds of pressure that we have seen in other cases. But this is, I think, going to be the first of many, many cases of this kind. Jawboning is the big issue du jour, I think, in the next few years around content moderation, and so we should do a full episode on it at some point.

Alex Stamos:

Yeah, absolutely. I think, like you said, this is going to be a big question of... it'll be interesting to see... let's say governments did it publicly. Let's say Texas, instead of saying, "You have to take down this stuff that we don't like," just made a list of, "This is stuff we don't like on Twitter." And then lots of citizens used that to ask Twitter, "Take this down," to send emails and Tweets and stuff, is how public it is. Is there any threat behind it? And what kind of coercion really varies in different situations. Certainly internationally the amount of coercion is extreme. But like you said, in this case it seems like they had very little power and just had the ability to report. I do think it's going to be a fascinating question of, what's the line that we should draw here? And it's going to have to be drawn across the entire US in one way.

It does really feel like the next SCOTUS thing. But it also has this weird historical parallel, which is this feels like those days in the papers of, famously, JFK and Ben Bradlee of the Washington Post being friends. And being able to call up your friend to get stuff done, which I think now we look much more negatively on. So the overall milieu of what is considered journalistic ethics, and what kind of interaction can happen between the gatekeepers and the subjects they're covering, is much more regulated now. And I think how much we want to apply that to platforms is going to be a super open question. And there's going to be some real difficult trade-offs. People really need to understand that there's a bunch of informal stuff of the government saying, "Hey, we think these accounts are related to ISIS."

And then you look, and they are related to ISIS, and you take them down. Or you look and they're not, and you don't take them down. And if you start to outlaw any kind of cooperation short of a subpoena or an indictment of somebody, then there's a huge amount of societally positive things that a lot of people would agree on, especially in the child safety space. There's a lot of, "Hey, here's a bunch of sketchy accounts. We're going to come get them later, but you might want to take care of it now." A lot of that stuff happens on a regular basis. And if you outlaw any discussion at all between platforms and government, then that kind of stuff's going to go away.

Evelyn Douek:

Right. Of course on the other hand you have the Turkish government saying, "Hey, these are terrorists," and it turns out that it's legitimate journalism. Or it's one thing when it's child safety and it's another when the Indian government's sending police officers to your office.

Alex Stamos:

Right. And clearly whatever we do here is not going to fix Turkey and Israel and India and all these other countries that have internet referral centers and who are like... it's well beyond... jawboning I don't think captures India, where you're like... "We're threatening to arrest your executives," is not jawboning. That's something else. We need another term.

Evelyn Douek:

Yeah. Although there they're threatening legal sanctions of certain kinds, and that happens in this country too. And so this is a legitimately very difficult issue. And like you said, we're going to have to sort it out. I 100% agree that this is probably going to be heading to SCOTUS at some point. And in some sense it's not a new issue. We just have all of these old cases to do with morality commissions sending letters to bookstores saying, "Hey, we think you've got some indecent stuff. We appreciate your cooperation in this matter." Stuff like that. But the scale, the volume, everything that's happening makes this issue more salient in the platform era, and obviously so political right now.

Alex Stamos:

Right. And now everybody... effectively your experience at, say, these companies... and this has come out in the Twitter files. And I think, as we have discussed, this is a legitimate thing people should pay attention to in the Twitter files: the experience is that every political actor tells the platforms there's stuff they don't like.

Evelyn Douek:

Right.

Alex Stamos:

Some of them were subtle. Some of them were really obvious and dumb. And so I think the experience of being on that side leads me to believe that you're going to have to have a standard, but the standard's probably going to have to be around what kind of coercion and power the person has. And honestly, the people who threaten the worst are members of Congress, I think, with those letters that come out. But unfortunately, anything SCOTUS does probably doesn't apply to them because of the Speech and Debate Clause. So the people who really need to have a restriction are folks in Congress, because they just kind of go off the rails in what they're requesting.

Evelyn Douek:

Yeah. No, the First Amendment can definitely constrain. There is a legitimate question with these letters that Democratic members of Congress sent to platforms saying, "Here are the Disinformation Dozen accounts. What are you doing about them?" What level of coercion, what level of threat does that need to be backed with before that's a First Amendment problem? That is something we're going to have to work out. And like you said, is it different when it's public? Is it different when it's Donald Trump's White House sending people, saying, "Hey, take down this Chrissy Teigen Tweet"? There are all different instances of jawboning. They all raise different issues.

Before we leave, I think we just have to acknowledge the latest Stanford scandal that really has me reflecting on what it means to join this institution. It's been hard to process. It's been tough. But the Stanford women's basketball team was knocked out in the first round of March Madness on Sunday. And it's the first top seed in the women's tournament to miss the round of 16 since Duke in 2009. So this is hard. This is a tough moment for this institution. Tough moment for this university. And it's going to take some time to reflect and process this real scandal.

Alex Stamos:

I don't know what to say. I'm just... yeah. Thoughts and prayers.

Evelyn Douek:

Thanks, Alex. Appreciate it.

Alex Stamos:

I do want to give a shout out, though, to the Fairleigh Dickinson Knights, a university I had never heard of, to be honest, defeating number one Purdue in the men's tournament. And to Princeton, who is still in the tournament, who is making this unbelievable run. So at least while we mourn on the women's tournament, we can watch some pretty incredible Cinderella stories in the men's tournament.

Evelyn Douek:

Swings and roundabouts. And with that reprised sports segment for the week, this has been your Moderated Content weekly update. The show is available at all the usual places, including Apple Podcasts and Spotify. And show notes are available at law.stanford.edu/moderatedcontent.

This episode wouldn't be possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory. And it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman. See you next week.