Moderated Content

MC LIVE 9/28

Episode Summary

Alex and Evelyn record an episode in front of probably their entire active listener base. They talk about an update on SIO's investigations into child sexual abuse material on platforms; the fight for free speech in India; the poor outlook for election integrity at X in 2024, and what this might mean for other platforms; platform transparency mandates with Daphne Keller; and challenges to age verification laws with Alison Boden, the Executive Director of the Free Speech Coalition.

Episode Transcription

Alex Stamos:

Are you an academic who is tired of interacting with students? Do you want to have the university version of resting and vesting? Then try Tenure. Tenure, it means you never have to do real work again. Go to tenure.com/moderatedcontent.

Evelyn Douek:

Sign me up. That sounds good. All right. Hello and welcome to the first ever live recording of Moderated Content. I think our entire listener base is in this room.

Alex Stamos:

The entire... Let's hear it from everybody who's ever downloaded Moderated Content.

Evelyn Douek:

Yeah. All 12 of our listeners are here. It's a very niche audience.

Alex Stamos:

You're clearly not a math professor, which is great.

Evelyn Douek:

Right. Okay. So yes, this is Moderated Content, the weekly, slightly random, and not at all comprehensive news update from the world of trust and safety, with myself, Evelyn Douek and Alex Stamos. All right, we are starting the episode today with an update from Alex's very own Stanford Internet Observatory, which released an update to its June report on the distribution of self-generated child sexual abuse material. And we have David Thiel here to walk us through what the new findings were in the report.

David Thiel:

Sure. So a while back, I think in June, we had published our first report looking at the ecosystem of underage people selling their own explicit content online. Actually, the recheck began kind of in petty fashion, which was that Linda tweeted saying that they had solved most of the problems, and so I went and did a hashtag search and said, "No, you didn't."

Alex Stamos:

And then you were done, because David saw that Linda Yaccarino tweeted something, and he said, "That's good enough for me."

Evelyn Douek:

Yeah, that's got to be true.

Alex Stamos:

That's got to be right. We're just going to walk away. Why would we do research in this space? Linda said it's done. So let's just wrap it up everybody, Linda said it's done. The Trust and Safety Research Conference is canceled. Please go home.

Evelyn Douek:

The gold standard in platform transparency right there, it's [inaudible 00:02:02] from Linda.

Alex Stamos:

So what did you find, David? Did you find it was over, that Twitter had fixed all their problems?

David Thiel:

The problem had not been solved, and to be clear-

Evelyn Douek:

What?

Alex Stamos:

What? Come on guys. It's, "Oh my God," gasp. They're gasping in the audience. The thousand people in our audience.

Evelyn Douek:

Oh, no someone fainted at the back.

Alex Stamos:

Oh my goodness. Call the medics. Okay, David, what did you actually find?

David Thiel:

And to be clear, as we've talked about in our original report, this is a pretty cross-platform problem. It's not that Twitter is unique in this regard. But I think what we've found since the initial report, doing a retest across Twitter and Instagram, is kind of what we were afraid would happen, which is that people have gone through, found the very obvious signatures, squashed them, put them on a block list, and called it a day. Which is not to say there are not ongoing trust and safety efforts within those companies, but what we found was not as encouraging as we had hoped, basically.

Alex Stamos:

And we're talking about Twitter and Instagram as the big ones here?

David Thiel:

Yes, Twitter and Instagram almost entirely for the primary mechanisms. And what we've kind of seen is that there's this funnel mechanism where some platforms are really good at getting people to discover your content, but they have stricter content enforcement mechanisms. Some are really good for keeping a permanent user base but have poor discoverability. So Telegram, no real discoverability, but really great at keeping content there. They don't have any content policy whatsoever, really. And so we see Twitter and Instagram being used to feed that funnel, and to some degree TikTok.

Then, on the new things that we found in the update: we found some disappointing things when it came to blocking associated hashtags, where they were doing exact string matching on very explicit hashtags, but if you added another character at the end, it completely fooled it, which is not going to keep pace with-

Alex Stamos:

Which, you and I have had this discussion: is there any real good reason to allow the word pedo into a hashtag? How many people are actually tweeting about their pedometers?

David Thiel:

#pedosaresuperbad is something that you have to leave online apparently.

Alex Stamos:

That should probably be white-listed and then perhaps-

David Thiel:

Correct.

Alex Stamos:

Yes. And so in your report you found just simple emojis, for example. You just add one emoji to the exact same string, which is clearly for selling CSAM, and it was still allowed.
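
[Editor's note: a minimal sketch of the failure mode David and Alex describe. The blocklist term, the hashtags, and the normalization step below are illustrative assumptions, not the platforms' actual code; the point is just that exact string matching is defeated by appending one character or emoji, while even simple normalization catches those variants.]

```python
# Hypothetical blocklist check. "badhashtag" stands in for a real blocked term.
import unicodedata

BLOCKLIST = {"badhashtag"}

def exact_match_blocked(tag: str) -> bool:
    # Naive approach: only the exact string is blocked, so "badhashtag2"
    # or "badhashtag" plus an emoji sails straight through.
    return tag.lower() in BLOCKLIST

def normalized_match_blocked(tag: str) -> bool:
    # Slightly more robust: normalize, strip non-letter characters
    # (digits, punctuation, emoji), then do a substring check.
    folded = unicodedata.normalize("NFKC", tag).lower()
    letters = "".join(ch for ch in folded if ch.isalpha())
    return any(term in letters for term in BLOCKLIST)

for tag in ["badhashtag", "badhashtag2", "badhashtag\U0001F600"]:
    print(tag, exact_match_blocked(tag), normalized_match_blocked(tag))
# Exact matching catches only the first variant; normalization catches all three.
```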

David Thiel:

Yeah. Another interesting thing, and something we didn't go into in the previous report, is we found our first confirmed example of someone pretending to be underage to sell explicit material when they weren't. Which was something that we suspected might be the case, but in this case we happened to find somebody that linked to a third-party platform that did stronger verification of users and had, with reasonable confidence, found that this person was actually pretending to be 14 while being over 18.

Alex Stamos:

Because effectively they're getting a 1099 from this platform.

David Thiel:

Right.

Alex Stamos:

They're actually making a significant amount of money pretending to be young.

David Thiel:

And to be clear, the platform was also not okay with this. But it was interesting because it's actually really, really illegal as well. So saying that you have material of a 14-year-old person, even if it is yourself, even if you don't have it, still violates at least US law to the point where that's like a five-year prison sentence.

Alex Stamos:

That's a good safety tip for this audience. For all of you who are saying you're 16 online. Just like I'm 28 still.

Evelyn Douek:

You can hear the uncomfortable laughter of people like...

Alex Stamos:

I see lots of phones have come out as social media profiles are getting changed real fast.

David Thiel:

Yeah. So I think that was interesting and it's unclear how much that is really occurring.

Alex Stamos:

Yeah. And then also when you talk about the age stuff, the other fascinating thing was that effectively you have probably actually young people, but whoever is selling this stuff, basically building CAPTCHAs to advertise their age, right? In the original report we talked about people saying, "I'm 18 minus four," and such, like really obvious, and now you have effectively tests to try to get around any kind of automated scanning to look for age advertising.

David Thiel:

Yeah. I mean, on the one hand there's just these multiple levels of abstraction that they're applying on top of things, but I mean presumably at a certain level it's just so much work to actually solve puzzles to figure out what content you're looking at.

Alex Stamos:

One of them was like, count how many stars.

David Thiel:

Yeah, it was like: my age is in my location, my location says that it's in my pinned post, my pinned post says that you should go and do this thing to my header image. It's like, okay. Eventually, hopefully, it wears them down and they get bored and go away. But that was interesting. I also thought it was... We hadn't really noticed in the first report that on Twitter, they actually have a pretty quick rate of actioning some of these accounts. And the reason, we are theorizing, is actually community self-policing, and that is because Twitter actually allows the sale and advertising of adult content. And so they have people that are in that community.

And when they find anybody who is either underage or selling to underage people, they just start throwing off alarm bells and saying, "Everybody go report this person." And in some cases providing chat screenshots or something where somebody admitted to not being the age they said they were. And so we're kind of guessing that that actually has a positive effect on why they get a lot of it taken down so quickly. And that raises this interesting question where it's not necessarily a good thing that they allow mixed content overall, it makes their job harder in a number of ways.

Alex Stamos:

Which we've discussed before: if you allow nudity, all of a sudden you've opted yourself into, "I have to decide if this is a 17-year-old or a 19-year-old," which is hard.

David Thiel:

Right. But the fact that they have an above-ground community there, unlike on a platform like Instagram, means that there are people that will police that community for you. I don't know what the grand takeaway is there, but I think an interesting area for future study is: when you are more permissive with your content policy, how many, basically, volunteer enforcers do you gain, because they want to preserve their actually legal content community?

Evelyn Douek:

My question is, were you surprised by your findings in the update? Because the first report that you released got quite a lot of press, generated a lot of headlines, and was really bad for the companies, got a lot of attention about it. And then it seems like there's still some pretty low hanging fruit just a little while later.

Alex Stamos:

He was surprised because Linda said it was solved. And you can still see the thousand-yard stare that David has. That he was shaken to the core that Linda's statement was not true.

Evelyn Douek:

How could you ever believe in anything ever again?

Alex Stamos:

Right? It's like he doesn't believe in truth or the existence of a consensus reality because Linda Yaccarino said something that was not true about trust and safety.

Evelyn Douek:

Yeah. Right. And so am I wrong with that takeaway? It seems like there's some low-hanging fruit, and why is that so?

David Thiel:

There is low-hanging fruit, and it's mostly just because so many organizations have shifted towards reactive postures rather than active investigation. And it is true that there is an unlimited amount of resources that you can spend on trying to proactively investigate harms on your platforms, but there are also things that have a middle ground of being moderately difficult to detect, with harms that are pretty extreme in a number of cases. And so yeah, I wouldn't say it's surprising given the overall trust and safety climate, and it's also not surprising that you've got kids that are devising ways to escape the notice of adults trying to police their behavior. That's a fairly old concept, I'm led to believe.

Alex Stamos:

As a father of multiple teenagers, I'm shocked that kids are really good at beating their parents at these things. But thank you so much, David. We appreciate your work and appreciate you giving the update, of which there'll be many more unfortunately.

David Thiel:

And this is why I'm here to learn what I'm supposed to be working on.

Alex Stamos:

Right. Does anybody have any tasking for David, because he likes to listen to this podcast to find out what SIO is doing next. Anybody? Okay, great. We're looking forward to your next report, David, which will be available in let's just say 13 days. How about that?

David Thiel:

I'm on it.

Alex Stamos:

Okay, great. Let's hear it for David, everybody.

Evelyn Douek:

All right. So for the next segment, if this podcast had a catchphrase, I think it would probably be something like, the only podcast to comprehensively cover trust and safety news and the NCAA Conference realignment.

Alex Stamos:

Yes.

Evelyn Douek:

But a close second would be something along the lines of: India is the most important jurisdiction for the future of freedom of expression online. We talk about it a lot on this podcast, and that's because it's really, really important. And it's not simply because of the sheer number of people in India who are affected by what happens there. It's also, for many platforms, the largest future growth opportunity, because they're saturated in Western markets and locked out of China. And it's also looked to by many other countries as a model for how to regulate the internet, as the government takes an increasingly authoritarian turn. And it is geopolitically important to other governments.

And so in the middle of this diplomatic uproar that we're having at the moment around India and Canada's accusation that India was involved in the murder of a Canadian citizen on Canadian soil, the Washington Post had a timely report, I think, or a series of reports actually, really, really good reporting about what's happening on the internet. And in particular, a story about Meta's reluctance to take down one of my favorite things, a coordinated inauthentic behavior network, because of fears of the Indian government. So I think it's a good story that's a reminder of the geopolitical and business pressures hampering the fight for free expression in the country. But I'm curious, Alex, this is your bread and butter. This is something you worked on a lot at the company. So what struck you about this reporting?

Alex Stamos:

Yeah. So it's not shocking that a year ago, almost exactly a year ago, our team, led by Dr. Shelby Grossman over here. Shelby hates waving when I point at her. Hey, Shelby. Hey, Shelby. Shelby? Hi. Hi, Shelby. Hi, Shelby. Shelby-

Evelyn Douek:

The listeners can hear the embarrassment and fury coming from there.

Alex Stamos:

Yeah. Shelby led a team that looked into a bunch of activity that was pro-Modi, pro-BJP from coordinated inauthentic accounts, which you love for us to talk about. And one of the fascinating things is the data we used came from the platforms, and they said absolutely nothing about it publicly, right? And so this was a situation in which Twitter and Facebook both took down networks, but unlike almost every other situation (there was just a big one where Facebook screamed from the rooftops that they had taken down Chinese activity), they said absolutely nothing about India. And so, like you said, India is incredibly important for lots of reasons.

But unfortunately, this last month, on the democracy-to-authoritarian scale, India has gone from a backsliding, mostly free democracy to Saudi Arabia with elections, right? If you're in a situation where you have this level of suppression and disinformation, and you're assassinating your political dissidents in other countries, that's pretty incredible. I think it's going to be interesting to see how the companies act from here on out. Because it's clear that one of the pressures here is that all the major Western platforms are locked out of the People's Republic of China, and they will never be let in, right? There's no situation in which it makes sense for the Chinese Communist Party, no matter what promises they extract, to let an American company into the Chinese market.

So if you're already locked out there, to be locked out of China and India means being locked out of a third of the populace of the planet, and that is not something that Mark Zuckerberg gets to go say to his shareholders, right? China's position here gives Modi a huge amount of leverage. They have used that leverage against TikTok in the past. And it's pretty clear that that leverage has been used both to bend the decisions that are made around content moderation and also to bend the kind of public statements that could possibly be made, statements that are basically automatic when you deal with almost any other state but for India become highly problematic.

For our report, I should just point out, yes, we provided attribution to the Indian government based upon stuff we had. It wasn't that hard, because you ended up with a member, an officer in the Indian military, complaining to the Indian press that their accounts had been taken down by Facebook. And how angry they were: "Oh, they took down our fake accounts, we're so upset about it." So it's effectively an open secret in India that both the disinformation campaigns are happening and the companies are being pressured to take down Indian speech.

Evelyn Douek:

Yeah. And I mean, I don't have a lot of sympathy for the companies in many instances, but this is an area where I do feel like they are being asked to solve geopolitical problems with very little backing from the US government, for example, which has been completely silent on this matter. And I think it is incumbent on democracies to champion free speech values.

Alex Stamos:

What would that look like? I mean, we've talked about this before, that unlike the privacy situation, where under GDPR or ECPA in the US, if you turn over the data of individuals that might be controlled by Europe or the US to an authoritarian state, there is a conflict-of-law scenario. How would you create a conflict-of-law scenario in this case?

Evelyn Douek:

I mean, yeah. So I would start at a much lower bar and just ask for some rhetoric from the government when these companies are pushing back, like we saw the old Twitter take a stand against the Modi government and go to court. I'd like to see more rhetoric from the government, or support, in saying we support our companies.

Alex Stamos:

Except, I mean, this is the realpolitik here-

Evelyn Douek:

Right. Absolutely. Of course.

Alex Stamos:

Is that we're in the middle of the Western alliance, of the Five Eyes, trying to recruit India into a coalition. People have even talked about India entering into an official coalition with the Anglophone Pacific countries against China. And so you have a situation where, Joe Biden... India assassinates somebody, apparently, in Canada, our closest, most important ally, a country that hasn't really done anything objectionable since the War of 1812, other than the fact that they make kind of crappy beer. Oh God, is Jeff Hancock here? I'm going to be in real trouble. They kill somebody on Canadian soil and the US barely has a peep about it. So to think that, in a situation where India is the massive prize in trying to counterbalance this new Cold War with China, the United States government is going to say something about a content moderation decision is just, I think, completely unrealistic.

Evelyn Douek:

Naive, is the-

Alex Stamos:

No, I'm not... I did not say naive.

Evelyn Douek:

And then when I raise this, the other answer I get is, "You don't know what's going on behind the scenes. You don't know the back channeling that's going on."

Alex Stamos:

I'm sure it's better than what we see.

Evelyn Douek:

Well, it can't be worse.

Alex Stamos:

Behind the scenes. My experience is everything that's behind the scenes is so much better than what's in public.

Evelyn Douek:

Yeah, that's right. It's true, it could be worse. Yeah. Okay. So anyway, it's a depressing segment.

Alex Stamos:

It's a depressing segment, and India is having elections in 2024. It's going to be a huge deal. I mean, 2024 is turning out to be, I think we're going to talk about this a bit, the Olympics of election disinformation, because there are so many important elections going on. But, man, Modi's crackdown, the fact that there's been very little pushback on either of these stories. If I were the BJP, I'd be feeling pretty empowered right now.

Evelyn Douek:

Right. And the other little tidbit from The Post reporting was that Twitter, which has historically been more forceful in pushing back against the government, didn't take any action when this network was-

Alex Stamos:

What?

Evelyn Douek:

Yeah, I know.

Alex Stamos:

What?

Evelyn Douek:

Yeah.

Alex Stamos:

Oh my God, the crowd is shocked that the world's largest... one of the largest possible buyers of electric vehicles, where there are lots and lots of pictures of Elon Musk shaking the hand of Narendra Modi trying to get Teslas into India, that Twitter is not taking a stand here. I am shocked, the crowd is shocked. It is amazing.

Evelyn Douek:

Free speech absolutist Elon Musk. So this is a perfect segue to our Twitter corner.

Alex Stamos:

That was an actual trombone player this time, right? Everybody can confirm that.

Evelyn Douek:

That's why they're laughing.

Alex Stamos:

Thank you to the Leland Stanford Junior University Marching Band for playing that.

Evelyn Douek:

So, speaking of 2024 and depressing stories: in our Twitter corner this week, after introducing a feature in 2022 for users to report misleading political content or election misinformation, X has in the past week removed that possibility, that affordance, from its platform for every country but the European Union. I'm sure the European Union is doing a nice little victory lap about that. And it has also in the past few days done deep cuts into its disinformation and election integrity team, despite what "the CEO" Linda Yaccarino said on stage at the Code Conference.

Alex Stamos:

I feel like there are some air quotes in how you're saying CEO there, Evelyn?

Evelyn Douek:

Yeah. Actually, I can do them, and they actually do something with a live audience. "The CEO," Linda Yaccarino. So yes, it's not looking good for the platform with these... I mean, they're making it pretty clear. The writing has always been on the wall that this is the direction they're going in. But they're saying it pretty explicitly, with both removing the affordance to even report it, not that it was going to do very much anyway, and then removing the people that would've enforced that [inaudible 00:19:26].

Alex Stamos:

I mean, it's kind of a nice level of honesty here, right?

Evelyn Douek:

Right.

Alex Stamos:

That you're like, "We were going to take your reports and immediately send them to /dev/null," is what they were doing before. And now they don't even take your reports, right? They show you the respect of basically signaling that if you care about election disinformation, they'll do nothing about it.

Evelyn Douek:

It's such a positive spin. I love it.

Alex Stamos:

Yeah. And this has been interesting-

Evelyn Douek:

It's a platform where you get respect.

Alex Stamos:

Yeah, exactly. It has been an interesting week, because effectively there's an individual who was on the election disinformation team in Ireland who was attacked by a number of people and was being investigated. And it looks super complicated, but Ireland, unlike the United States, actually has labor laws. And so it turns out your CEO is not allowed to just defame you publicly and then fire you for no reason. So this will be interesting to see how this plays out in Ireland, which, as I'm sure most of the people here know, is the headquarters for most of the tech companies in Europe for a variety of reasons, mostly tax related. As a result, I saw this incredible graph of the per capita GDP of Ireland, and it's completely ridiculous.

Evelyn Douek:

It's the Guinness.

Alex Stamos:

It's the Guinness, yeah. They're shipping a lot of Guinness to get to whatever, 80,000 euros or something. But yeah, so it'll be interesting to see how the Irish courts handle that. But I mean, the big net-net is, and it is not shocking anybody, but Twitter has given up on this. Which is unfortunate, because Twitter really was... Twitter was never, as we've talked about, I mean, I apologize to any X/Twitter people here. I don't think we have any current X or Twitter employees at the Trust and Safety Research Conference, but there are a lot of good people who used to work there. Twitter was never the most well-resourced company in this space.

Mark Zuckerberg's statement that Twitter's a clown car that drove into a gold mine was always reasonably appropriate. But they were the leaders in a lot of the thought processes around how we are going to deal with these things, from a transparency perspective, from Birdwatch and the ability to tag stuff as disinformation, which is still going strong. These ideas came out of Twitter and were then adopted by other folks, because the Twitter folks, while not having all the people and all the researchers and all the data and all the money that maybe a Meta or a YouTube did, were really open about doing that public experimentation. And so to see them just completely openly say that election disinformation is not something that exists is really depressing.

That being said, the only upside here is that Twitter has become much less important in the United States, right? It is no longer the most relevant political platform in the US because of this kind of great dispersion of people across multiple platforms. So while it is unfortunate, I think it is not as important as this decision would've been in 2016 to 2020.

Evelyn Douek:

Right. Okay. So the question then is heading into 2024, is Twitter providing cover for the other platforms and being a leader and creating a trend, or is it going to be an outlier? And so I guess just more broadly. I mean, this is something that my non-law, non-internet policy people are asking me about, "Should I be really panicking about 2024 and how should I be feeling?" And so Alex, how are you feeling about 2024?

Alex Stamos:

About 2024?

Evelyn Douek:

Yeah.

Alex Stamos:

Well, let's see.

Evelyn Douek:

Let's confine it to-

Alex Stamos:

Let's look at Alex's browser history. Let's see. Okay, freeze-dried food that will last for five years. How much money does it take to buy a visa in Portugal? Yeah.

Evelyn Douek:

Portugal. Nice.

Alex Stamos:

Yeah. Ways to sneak out of the United States. Yeah, so no, I'm feeling great. I'm super excited about the future for American democracy.

Evelyn Douek:

Okay, great.

Alex Stamos:

I mean, there's a bunch of negative things happening, right? Obviously, like you said, Musk has in all these areas, set the bar so incredibly low that instead of a competition, if you are not personally amplifying white supremacists, then any other CEO-

Evelyn Douek:

You're a responsible CEO.

Alex Stamos:

You're a responsible CEO, right?

Evelyn Douek:

Great.

Alex Stamos:

So if you're Mark Zuckerberg, you're like, "Well, that's not that hard." Right? He's sitting at home like, "Oh man, I, the descendant of Holocaust survivors, all I have to do is not retweet Nazis and I'm doing okay," which is a much lower bar for Mark Zuckerberg than has existed in the past, right? So I do think he has created a permission structure for people to take it. I think the political attacks against this have been a huge problem. And I think in the long run it's incredibly shortsighted of the people, mostly on the right, who are attacking any kind of election disinformation work.

Because what we're seeing is we're stuck in this discussion where everything's about 2016, right? Where it's about Russia and Trump, but the world has moved on. The largest networks that are being run online for foreign interference are by the People's Republic of China, by the Chinese Communist Party. They are not fans of Republicans. If you look at CCP disinformation, a lot of it pushes on the left. It does create fractures in the Democratic Party, but a lot of it intentionally attacks Republicans directly. Yet we're stuck in this 2016, 2017 mindset of thinking that this should be a completely partisan issue, and it's really, really frustrating. So I think that has become a huge problem.

I think the disassembly of these teams at the companies has been a problem. I mean, X has completely destroyed it, but there have been decent-sized layoffs at places like Meta. Unfortunately, I have a lot of resumes from people who are looking for other jobs, just through the layoffs and kind of the de-emphasis of this. And then what are we seeing in the backdrop? We're seeing the politicization of the work. You're seeing the government kind of totally give up on doing any of this. With a lot of the political fights that are going on, I mean, the kind of work that you've seen out of NSA, Cyber Command, and the FBI to go after foreign influence campaigns is probably not going to be effective this year. And I am really afraid of LLMs here.

I think in the past there was way too much overstatement, way too much focus on actual deepfakes, the idea that you're going to create an artifact that is completely artificial and is a lie. Well, that can be debunked. My biggest problem with AI is that it totally changes the economics of running troll farms, in that the effort to create 80,000 pieces of content in the 2016 election by the Russians required them to staff a building full of 22- to 25-year-old Russians who spoke English well enough to pretend to be Americans. That's expensive. That causes lots of OPSEC problems.

Those Internet Research Agency employees kept on giving interviews to the independent press, right? It's really not cheap or easy to do that, but now three or four people with an open source LLM, who just read English well enough to edit it, can do the exact same thing, right? And can generate all of that content themselves. So the lowering of the bar for the creation of lots of what looks like legitimate content is actually really terrifying for me. So yes, I think 2024 is looking to be really messy right now.

Evelyn Douek:

And of course, the other trend that we've been seeing, and it's been a theme of today and we could have talked about it in the first segment about the report on CSAM as well, is the locking down of transparency on all of these platforms. Twitter turning off the API and just making research a lot harder for academics. And so visibility into even what's going on on these platforms, whether it is better or worse, is much worse.

Alex Stamos:

And even if X hadn't done that, had turned off the API and started threatening researchers who bypass it, we would've ended up in this situation because of this massive fracturing of the US social media ecosystem post-January 6th, right? That the moves that Twitter and Facebook made post-January 6th pushed a bunch of people into alternative platforms.

And so now it's very possible that in 2024 the most important platform on the right will be Telegram, right? A platform that has become one of the most important platforms because of the Ukraine-Russia conflict. It is a totally incredible hotbed of open source intelligence from both Ukrainians and Russians, documenting in real time the Russian invasion of Ukraine. But it has also become a really core part of right-wing networks in the United States.

Evelyn Douek:

And as David said earlier, has basically no content policy.

Alex Stamos:

Effectively a negative content policy. I mean, one of our findings in the original SG-CSAM report was, "You should not trade CSAM in public channels," wink wink, right? I have never seen a platform say CSAM should not be in public places but then intentionally ignore private spaces, right? So yes, their lack of trust and safety work is an advertisement for the platform.

And so Telegram, their encryption sucks. There are no real good protections there, other than the fact that that ecosystem is fragmented into many, many, many channels. And so seeing what is popular and what is going on becomes much more technically difficult. So yes, even understanding what's happening is quite difficult. If something bad happened in 2020 and was driven off of Twitter, somebody would notice. Now we could have a riot, or we could have an election worker killed, or some horrible thing happen in 2024 that was planned on Telegram, and we'd have absolutely no idea what the genesis was. And that's my real fear here: these kinds of stochastic events happening and us having no explainability of what was driving them.

Evelyn Douek:

Right. So talking about transparency, we have someone who thinks a lot about, talks a lot about, writes about, and is generally very good at transparency: our own Daphne Keller, who directs the program on platform regulation at Stanford's Cyber Policy Center.

Alex Stamos:

Daphne just appeared magically in here.

Evelyn Douek:

Whoa, where did you come from? All right. So Daphne, the big "innovation" in transparency this week, and I saw you tweeting, posting, X-ing about this, is the DSA transparency database that was launched a couple of days ago. So in theory, this should include every statement of reasons sent by a platform explaining its content moderation decisions. And it was launched, I think, on the 27th, a couple of days ago. And as of a few hours ago, or if I just refresh it now, how many statements of reasons does it have? It has 12,264,433 statements of reasons in this database. Is this incredibly useful to you, Daphne?

Daphne Keller:

It's really cool.

Evelyn Douek:

It is quite cool.

Daphne Keller:

So I've been a heckler ever since they said they were going to do this, because it's really hard to build a database that tracks every single content moderation action of all of these platforms. But they built it and it's cool and I'm learning things from it. So either I need to retract my heckling, or my heckling worked and caused them to put sufficient resources into it.

Evelyn Douek:

Yes, [inaudible 00:29:48].

Alex Stamos:

I feel like everything good the European Commission has done was because of you, Daphne.

Daphne Keller:

It's all me. So, something that you and I have talked about before: when I first looked at it, which was day one or two, TikTok had put in 1.5 million statements of reasons. So they'd done 1.5 million content moderation actions of some sort, demotion, demonetization, labeling, removal, and reported those to the commission. And X had reported two.

Evelyn Douek:

Yeah. So it's slightly better two days later, although not much. The discrepancy is still there. So a couple of hours ago, TikTok had reported 5,999,264 content moderation decisions, whereas Twitter or X had reported 23,087. Over a couple of days, that seems low, I think. So are these files interesting? What do you see when you look up one of these statements of reasons?

Daphne Keller:

Well, they're really interesting if you just want to see what it is that different platforms are telling people about what they did and why. So the commission put together an API for platforms to send their... they're CCing the European Commission, basically, when they send these statements of reasons to people. Except they have to redact all the personally identifying information, so you don't know what the content was, which might be relevant.

Evelyn Douek:

It seems relevant for understanding the reasons.

Alex Stamos:

It was bad, just trust them. Yeah.

Daphne Keller:

Yeah. So it's interesting to see, within the fields mandated by this API: what was the reason you took this down? If it was a contract, which term in the contract? Blah, blah, blah. Different platforms have taken really different approaches. So TikTok is providing pretty good, comprehensible rationales, I think. Apple's App Store had, when I checked, submitted, I think it was 464 notices, and they are incomprehensible. They say, "This was removed because it was beyond the scope of service. See our TOS. Also, see our TOS." And that's kind of the end of it. So seeing the diversity is interesting.

I think it's quite evident looking at it that, for the fields the platforms have to fill in for the API, they don't all understand what they mean or agree on the meaning. And so it'll be really hard to compare what different platforms are doing. And I don't think that's the platforms' fault, actually. I think this just was rolled out very, very fast. And it's something... We have platform researchers here; you guys understand how complicated the data on this stuff can be. Doing it slowly and iteratively to get it right would've been a better approach, but move fast, break things.

Evelyn Douek:

Right.

Alex Stamos:

I feel like that's a poster I've seen in Brussels. They're really into that.

Daphne Keller:

It's everywhere.

Alex Stamos:

It's not shocking, in that this is a problem a lot of people run into: you imagine that if you define an API, that is your data schema. But lacking a data dictionary of what you actually put in there, and normalization across different entities, even if you're all using the same schema, it can be effectively useless. Would you say it's useful?

Daphne Keller:

It is useful. It is absolutely useful. I mean, you can get a sense of... It has pretty complex advanced search options. So you can search just for things that were demoted, or just for things, I'm not using the right terms from their schema, but just for things that were hate speech, or just for things that affected a particular country or were on a particular platform. So you can kind of narrow down the slice that you want to look at. And you don't know if you're getting the real slice, because you don't know if the labels were all used the same way, et cetera. But it's something. It's something we didn't have before.
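
[Editor's note: a small sketch of the comparability problem Daphne and Alex are describing. The field names, category labels, and mapping below are invented for illustration, not the DSA transparency database's actual schema; the point is that a shared API schema without a shared data dictionary makes naive cross-platform slices misleading.]

```python
# Two hypothetical statements of reasons for what may be the same kind of action.
records = [
    {"platform": "Platform A", "category": "HATE_SPEECH",
     "decision": "REMOVAL", "ground": "TERMS_OF_SERVICE"},
    {"platform": "Platform B", "category": "ILLEGAL_OR_HARMFUL_SPEECH",
     "decision": "REMOVAL", "ground": "ILLEGAL_CONTENT"},
]

# A naive slice by category undercounts, because the two platforms labeled
# the same behavior differently.
print(len([r for r in records if r["category"] == "HATE_SPEECH"]))  # 1

# Any cross-platform comparison needs an explicit mapping, which is itself
# a researcher's judgment call rather than ground truth.
CATEGORY_MAP = {"ILLEGAL_OR_HARMFUL_SPEECH": "HATE_SPEECH"}
normalized = [dict(r, category=CATEGORY_MAP.get(r["category"], r["category"]))
              for r in records]
print(len([r for r in normalized if r["category"] == "HATE_SPEECH"]))  # 2
```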

Evelyn Douek:

Right. Yeah, the numbers really do tell a story. We have this wide spectrum: X, which is just, I don't know what it's doing, phoning it in and not really trying. And then we have TikTok at the other end of the spectrum, which is the kid in class with their hand up, being like, "We are complying. We are doing all of our homework."

Alex Stamos:

"Don't ban us. Don't ban us, please. Please."

Evelyn Douek:

Well, they're the good kid, really desperate for that gold star. And then Facebook and YouTube, I looked at those, they are way lower than TikTok. They're sort of 600,000 or 400,000.

Alex Stamos:

They're the rich kids who know that they're going to get legacy admissions somewhere, so they could get-

Evelyn Douek:

They just showed up and passed.

Alex Stamos:

A gentleman's B.

Daphne Keller:

Ouch.

Alex Stamos:

Which has never happened at the Leland Stanford Junior University.

Evelyn Douek:

Of course not. Wow, that was loud laughter.

Alex Stamos:

It was a little awkward.

Evelyn Douek:

So this is what's happening in Europe. This is this huge regulatory scheme that's been spun up in Europe, and it's getting results. There are these statements of reasons coming in. So let's fly back over here, or Apparate, as you can do. There are all of these legislatures in this country that are trying to maybe get something like this, force companies to give more statements of reasons for what they're doing, or more numbers and data about their content moderation activities. And this is going to be, I think, a really big year for transparency and the First Amendment, and these very strong debates about whether that's even something that legislatures can do. So why don't you give us an overview of the landscape in the United States.

Daphne Keller:

Yeah. So there are at least four state transparency laws. Now, I've been up for a lot of hours; I feel like there might be a fifth one I'm forgetting. But the two that are probably most important, because they're probably about to go to the Supreme Court, are the ones in Texas and Florida, which were enacted by those legislatures as part of the so-called must-carry laws that require platforms to carry content against their will, including hate speech and disinformation and things that there might be reasons to want them to take down.

The transparency rules... I'm somebody who's asked for transparency for a long time. I think that transparency mandates are appropriate. I think that they can be constitutional. I don't think these ones are constitutional. They were drafted with the same extreme lack of care as the rest of the laws, and they just kind of fail in a lot of the details of how you might want transparency to really operate. And they do so in a way that will be phenomenally burdensome, just lots and lots of work for platforms, which they can avoid if they just moderate content less, like Texas and Florida wanted them to do in the first place. So there's this kind of burden-related problem why these are badly designed laws.

And then there's also the problem that we already know, like Texas Attorney General Ken Paxton went after Twitter, demanding disclosures of all of their internal communications about content moderation in express retaliation for de-platforming President Trump. There is a clear history of his requiring disclosures as a means of trying to strong arm platforms, Twitter in that case, into adopting the speech policies that he wants them to adopt. There are examples of this on both sides of the aisle, by the way. It is not solely a Republican problem.

And both of those issues, the risk of state abuse and the burden issue, you could redraft these rules to make it a lot better and make it so that depending what First Amendment standard you're applying, maybe they would survive. But that's not what has happened here. These are not good transparency laws.

Alex Stamos:

And you have to take off the T-shirt that says, "I am doing this in retaliation for First Amendment protected decisions."

Daphne Keller:

Yeah. That kind of gives away the game. And then there are also transparency laws in New York and California. The New York one has been struck down on First Amendment grounds. The California one is the subject of a First Amendment challenge brought by X and Elon Musk, and they have this venerated First Amendment lawyer, Floyd Abrams representing them. I suspect the California one will be struck down on the same grounds as the New York one, which is they have a different issue than the one I've described. I mean, they have the issues I've described also. But they specifically say, "Platforms you have to explain your policies on the following kinds of speech that we don't like." So the New York one is you have to explain your hate speech policies and the California one is you have to explain, it's like five listed things. But they're specifically making it so that if you are a platform that hosts a kind of speech that the lawmakers disapprove of, then you take on more burdens. And that's why the New York one got struck down.

Evelyn Douek:

Yeah. So I mean, you and I have talked about transparency a lot over the years, and it's been fascinating to watch, I think. We were talking about this on the podcast last week or whenever it was, about roll back the clock five years ago and I think transparency was the buzzword. It was what everyone wanted. It was the easy thing to call for because we had so little of it and it would advance knowledge and the ability to hold these companies to account. And then you have all of these legislatures, mostly on the right, but not exclusively, like we have the New York example as well, weaponizing transparency laws and showing how they can be tools of government abuse as well.

And so when you're thinking about transparency laws, I sort of think of it as the Ken Paxton problem: how can you draft this law such that, if the enforcer were Ken Paxton, it's still going to be a useful or a safe government tool? I mean, that's a really, really tricky problem. The Paxton problem is a tricky problem. But I guess my fear is that we're going to be so scared off by the Paxton problem that we're going to forsake the idea of transparency laws altogether. And we're going to get bad constitutional law from the Supreme Court, whether it's in these NetChoice cases out of Texas and Florida, or in this California case, X v. Bonta, which, like you said, may well be struck down and then go up to the Supreme Court as well.

And I guess I really worry that we're going to create constitutional law that says the First Amendment prohibits transparency mandates in all cases. Which is going to end up with a very deregulatory First Amendment and not allow, I think, important tools to hold these companies to account. But am I just being too sort of pessimistic or what do you think?

Daphne Keller:

Oh, no. Oh, no. Your pessimism is always warranted. Yeah. There's a funny thing in the Texas and Florida cases, which is that those state AGs, who are usually aligned with business interests, are advocating a standard that businesses generally hate. So if you're ExxonMobil, you don't want the state to be able to compel you to turn over a whole bunch of information about your operations. But the arguments that Texas and Florida are using are exactly the kinds of arguments... they're using them to make platforms do this, but they're exactly the arguments that you would use to make ExxonMobil do it.

Alex Stamos:

Thank you so much, Daphne.

Daphne Keller:

Thanks for having me.

Evelyn Douek:

Thank you so much. Okay. Wow. It's amazing how she just disappeared like that.

Alex Stamos:

Where'd she go?

Evelyn Douek:

Yeah. Okay. And so next we have the wonderful Alison Boden joining us, who is the executive director of the Free Speech Coalition. Thank you very much, Alison, for coming on.

Alison Boden:

Thanks for having me. Big fan.

Alex Stamos:

Long time listener, first time caller.

Alison Boden:

Well, all 12 of us.

Evelyn Douek:

Yeah, that's right.

Alex Stamos:

Wow. Wow. Okay. And that's all the time we have for Alison.

Evelyn Douek:

That's right. Thank you. Your input has been wonderful. All right, so let's start with the basics. What is the Free Speech Coalition?

Alison Boden:

So the Free Speech Coalition is the trade association for the adult industry. So we cover not only content creation, but also toys and other adult material.

Evelyn Douek:

Great. And we've asked you on because you are playing a very important role in a lot of these... There's been a wave of legislation coming out from various states around age verification laws. So I wonder if you could just sort of talk us through the landscape that you're seeing. What are these laws and where are they coming from?

Alison Boden:

Sure. So in 2021, Louisiana passed a law requiring any website that has more than 33 and a third percent material harmful to minors to do age verification. And they specified a bunch of ways that you can do that, including their own LA Wallet app. And that went into effect January 1, 2023. In the wake of that, 33 other laws have been proposed in 26 states as of today (last week it was a few fewer), and seven states have passed those laws, soon to be joined by North Carolina as soon as the governor either signs the law or declines to veto it.

And so what they are mandating is that... And it's very much directed at adult sites, right? There are maybe some sites out there that have more than 33 and a third percent harmful-to-minors material that aren't really being targeted. But the lawmakers themselves have come out and said, "Well, this is about porn. This is the Pornhub law." And very much they're being targeted; our adult companies are the ones in the crosshairs. So we're fighting back. We are suing Louisiana, Utah, and Texas on the grounds that these are unconstitutional laws, because these laws have been found unconstitutional in the past.

Evelyn Douek:

Yeah. Yeah, we'll come back to that. I'm curious for your thoughts, that's an extraordinary wave of legislation. That is a lot of states in a very short period of time. And just what is causing that wave of legislation right now? What's causing that panic?

Alison Boden:

It's a good question. I think it's not something that's happening in isolation. Right now we're seeing these laws, and we're also seeing a lot of laws targeting trans folks. We're seeing a lot of book bans. We're seeing even anti-LGBT laws being either considered or passed. And having lost the right to legal abortion, I think it's not a coincidence. So I think that it's part of an overall kind of cultural attack, essentially. And as a wise person put it, "Porn is the tip of the spear," so to speak.

Alex Stamos:

I want to note that I did not laugh at that. It's true. Not awkward. Thank you. So I'm interested in... First, Alison, if you could talk a little bit about, functionally, what do you have to do to comply with this law? And Evelyn, where is the constitutional bar here of how far you can push? We have all kinds of laws that have some kind of burden, but what level of burden do you think hits an unconstitutional level? And is this something that's not actually well-defined enough for lower courts to make decisions right now?

Alison Boden:

So depending on the state, we can use Louisiana as the example. They dictate that you have to verify someone's age using their government ID or essentially data broker information. So I think it has something to do with financial information, including things like mortgages or loans.

Alex Stamos:

So I mean, they're talking about, for people here who have gotten a credit card or something recently, these really creepy questions that you get asked. When you apply, you put in your social security number and they come back with, "Which of these people is your sibling," or, "Which of these houses have you lived in," or, "Which of these were your employers?" So that would be good enough if you didn't have a government ID.

Alison Boden:

Exactly. Which I mean, I'm sure we know that kids don't know anything about their parents or where they live or any of that.

Alex Stamos:

Yeah. No, I'm pretty sure my kids could figure out 80% of those questions pretty quickly.

Alison Boden:

Well, I mean, the data brokers have it, so I imagine it's out there someplace. So checking government ID isn't that easy to do on the internet. So Louisiana is, I think, the only state so far that has passed one of these laws that actually has a mobile driver's license with an API that you can hit to check it online. Otherwise, it's really just useful in a convenience store. So with that app, in order to verify your age, they say it takes just 30 seconds.
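
[Editor's note: a purely hypothetical sketch of what an online age check against a mobile driver's license API might look like. The endpoint, token, request fields, and response fields are all invented for illustration; this is not the real LA Wallet API or any state's actual interface.]

```python
import requests

def is_over_18(wallet_token: str) -> bool:
    """Ask a hypothetical state mobile-ID service whether the token holder is 18+."""
    resp = requests.post(
        "https://mdl.example.gov/age-check",  # invented endpoint
        json={"token": wallet_token, "claim": "age_over_18"},
        timeout=10,
    )
    resp.raise_for_status()
    # A privacy-conscious design would return only the boolean claim,
    # not a birthdate or identity details, so the site learns nothing else.
    return bool(resp.json().get("age_over_18", False))
```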

Alex Stamos:

And what I'm sure you've found in these states is that nobody looks at porn anymore, right? Is that the outcome here? Is that...

Alison Boden:

Porn's gone? No. No, of course not. But what's happening is folks, maybe understandably, are going to websites and going, "Whoa, you want me to show my ID to look at porn? I would prefer not to do that." And then where are they going? Places that aren't complying with the laws.

Evelyn Douek:

Right. So I mean, that's a good segue, because I guess I'm going to phrase my answer as a question to you, Alison. For the people who say, "What's wrong with having age verification on adult websites? I don't want my kid looking at porn." Why is that a First Amendment issue? What's the problem? What's your answer to that?

Alison Boden:

Well, first I would say I don't want your kid looking at porn either, for a variety of reasons. But the problem is that it's not technologically a very feasible thing to do right now, to protect someone's privacy completely and also verify exactly how old they are. And not only that, part of our argument is that these laws are unconstitutional and represent a prior restraint, because that 30 seconds it takes you to figure out the app and put it into Pornhub and have it come back, that's not permissible when we're talking about accessing protected speech. Evelyn, I don't know if you'd agree with that as a First Amendment lawyer.

Evelyn Douek:

Yeah, no, absolutely.

Alex Stamos:

Gosh, if only we had a First Amendment professor here.

Evelyn Douek:

That's right. Don't I get a theme song or a sound effect or anything?

Alex Stamos:

When you get Tenure you get it.

Evelyn Douek:

All right. It better be a really good sound effect though. I'm getting excited. Now I want Tenure. Yeah, I mean, so Alison mentioned this and you asked, "Is the standard well-defined?" And the standard is really well-defined. This is something that was settled a long time ago by the Supreme Court. And it's really relevant because Alison, the Free Speech Coalition, I don't need to inform her, informing the listeners, won an injunction against the Texas law in this case. And the district court order in that case was basically like, "The Supreme Court solved this. We talked about this in Reno in 1997 and Ashcroft v. ACLU in 2002. And we, as a district court, cannot just go and disregard this binding Supreme Court precedent that says that..."

The problem is not that you are burdening children's access to porn. The problem is that you are burdening adults' access to porn, and that's constitutionally protected speech. Unprotected obscenity is an extremely narrow category. And so that's something that the First Amendment doesn't allow. And so it is amazing to watch these laws, which are basically, like the district court said in this case, the laws from 20 years ago redux; they're not raising new issues. And I think that could be another reason why we're seeing a wave of these laws now: there are legislatures and lawmakers who think they might have a better shot of creating some new law and overturning some precedent [inaudible 00:49:19].

Alex Stamos:

The privacy issues, are those important in that analysis or is it just the existence of the burden?

Evelyn Douek:

Well, the privacy issue is part of the burden and the chilling effect that that creates.

Alex Stamos:

So let's say, I mean, this doesn't exist, but there are all these companies out there that are basically advertising, "We can do verification without violating your privacy," which is BS. But let's imagine they actually could come up with something that you could get somebody who actually understood computers, cryptography, math, or humans to say is actually privacy protecting. Would that change, do you think, the analysis here?

Evelyn Douek:

Yeah, I mean, I think it would, because I think that's exactly the line of attack for people who say, "Well, we can change the law," because Reno, the case, the leading precedent in this area, talked about the state of technology and the fact that it wasn't possible to do this with the technology of the time. And so the argument being made is, "Oh, but now we have better age verification tools." And so I guess throwing the question back to you, and you answered it in your question, but lots of people are saying this: people are saying the world has changed, the technology has changed, and this is much easier and safer and less friction these days. Is that true?

Alex Stamos:

First, for the Stanford undergraduates in the room, Janet Reno was the Attorney General of the United States back before you were born. No. I hate saying this, because we're going to get email, or I'll get a student come up to me with a startup idea, and somebody's going to say, "I've got a blockchain solution to this." But no, there's actually no really good way to verify somebody's identity in any kind of reasonable way while being privacy-preserving. Just based upon the technologies we have, our lack of any kind of cryptographic component in the identification cards we are given, and just the reality that having physical access to either the information about somebody, a special code that's tied to them, or even the physical ID doesn't mean that that's actually the right person.

So short of where the People's Republic of China comes out on this, which is effectively that you have to show government ID in person to buy a SIM card that gets permanently tied into a database, and all kinds of other incredibly intrusive authoritarian steps, I don't think you can get there in any kind of reasonably technically supportable way.

Evelyn Douek:

Right. But these laws are still being passed, and they are still on the books and going to go into effect unless something happens. And that something, I guess, is you, Alison. But it must feel like a DDoS attack to have all of these laws coming at you, and then you have to go around filing First Amendment lawsuits and saying, "Hold on, let's hold up some old precedents." Is that how it is? And how do you navigate... We talked about the challenge you filed in Texas, but where else are you litigating at the moment? And how do you navigate which cases to bring?

Alison Boden:

And like I mentioned, there are seven of these laws that have already been passed, and we're only litigating in three states. That's Utah, where we are appealing a dismissal by the district court; Louisiana, where we have a preliminary injunction hearing next Wednesday, on 10/4; and Texas, where we won the preliminary injunction to keep the law from going into effect. Then the Fifth Circuit stayed that, so the law did go into effect, but we have another hearing coming up very quickly before the Fifth Circuit, also next Wednesday, 10/4. So we are a very tiny organization. There are more of these laws than there are people who work for the Free Speech Coalition, freespeechcoalition.com/donate. And we have to be pretty strategic, right? We have to choose our battles. There's no sense in going after two states in the same circuit, that sort of thing.

But ultimately, we are the ones fighting these laws. We are the only organization that really focuses on the rights of adults to adult expression. And so we're out there, and we do have allies. We got some really, really good amicus support the other day. But yeah, it's really a question of, "Okay, do we have enough folks on our tiny legal team?" We can max out at three lawsuits right now. We're considering a couple more, thinking about the states that have passed these: we've got Louisiana, Utah, Virginia, Mississippi, Arkansas, and Montana. So if you have that circuit map in your head, you can think about what might be the most advantageous place to go. But it's definitely difficult, and we're really buoyed by the fact that this is settled law. We're on the side of right here. And we hope that, despite any changes to the courts in recent years, it's very clear that precedent means something.

Alex Stamos:

Right. Because that's really been the lesson of the last several years, that well-defined civil liberties...

Evelyn Douek:

I walk into the classroom and say, "Precedent." And everyone's like, "Oh, got it."

Alex Stamos:

"Oh, good." [inaudible 00:54:13]

Alison Boden:

[inaudible 00:54:12]. It's not for suckers. No, no. And so of course we're worried. But you've got to get out there and fight the good fight.

Evelyn Douek:

Well, Godspeed in bringing those cases. And you may well be creating the law that I teach. I teach Ashcroft v. Free Speech Coalition from 2002, which was a challenge that the Free Speech Coalition brought against a ban on virtual child pornography. So who knows, in 20 years these Free Speech Coalition cases may be the captions we're teaching in our First Amendment classes, as a result of all the law you may well be making in the next few years.

Alison Boden:

Looking forward to it. Thank you.

Alex Stamos:

Thanks, Alison.

Evelyn Douek:

All right, Alex. The part of the podcast that everyone really came for.

Alex Stamos:

Oh, yes. Now it's time, ladies and gentlemen, for the Moderated Content sports update.

Evelyn Douek:

Yes.

Alex Stamos:

Well, so we have some more legal sports news.

Evelyn Douek:

Oh, really?

Alex Stamos:

Which is that the battle to control the Pac-12 continues. So in realignment land, as everybody who listens to the podcast knows, the Pacific-12 Conference, which used to be the Pacific-10 and the Pacific-8 before that, which grew and grew, is now dying. And 10 of the 12 members have pledged to move to different conferences next year. What does that mean? The two schools that are left, Washington State University and Oregon State University, have sued in federal court, saying that under the bylaws of the Pac-12, everybody who's announced that they're leaving can no longer sit on the board. And they are so far winning in court. There's another hearing on November 14th. But it looks like the conference that all of these teams play in is actually going to be taken over by the two incredibly pissed-off state schools who have been abandoned, first by UCLA and USC, but now by all of their compatriots in the Pac-12. Which means they can probably loot and pull the wiring out of the walls to try to make up for the fact that they've gotten really screwed here. So it's kind of amazing.

And people are talking about how it's a really big day for Twitter sports lawyers, because people are looking into all these crazy clauses, like that the Pac-12 is effectively guaranteed a playoff berth. They're supposed to have six or eight teams, but they have three years to fix that. So for several years, one of the berths in the college football playoff might go to either Oregon State or Washington State every single year, whoever wins between them. But apparently they would have to have a conference, so they would have to play six games against each other, back and forth, back and forth. The whole thing turns out to be incredibly cuckoo and amazing. So there's that. And then, what does everybody here know is the most important sporting news from this weekend? Who went to a football game this weekend?

Audience:

Taylor Swift.

Alex Stamos:

Taylor Swift.

Evelyn Douek:

It's a piece of sporting news that even I knew.

Alex Stamos:

Oh, amazing. I want to hear, what do you know about this?

Evelyn Douek:

Oh, no, not the follow-up questions. This is not fair. That's basically the extent of my knowledge.

Alex Stamos:

I'm going Socratic. So Taylor Swift-

Evelyn Douek:

So Taylor Swift is now probably, possibly, rumored, it seems, to be dating some sports dude.

Alex Stamos:

Sports guy. Travis Kelce is his name.

Evelyn Douek:

Plays with the ball.

Alex Stamos:

Plays with a ball.

Evelyn Douek:

Yeah. And he's good at it. I think he's pretty good at it.

Alex Stamos:

He's pretty good. Do you know what he does with that ball?

Evelyn Douek:

Does he throw it and sometimes kick it?

Alex Stamos:

Yes. Good job, Evelyn.

Evelyn Douek:

All right. Good.

Alex Stamos:

Despite it being called football, he uses his hand.

Evelyn Douek:

Yeah. Yes, yes.

Alex Stamos:

Yes.

Evelyn Douek:

That's it. I'm tapped out.

Alex Stamos:

And she went to a game and people lost their minds. And on the internet, if you look at the search trends, the Google search results for both Taylor Swift and Travis Kelce have gone through the roof. It's the most incredible crossover in history: Taylor Swift fans getting interested in football and vice versa.

Evelyn Douek:

I did see that the NFL changed its @NFL account to "NFL (Taylor's Version)," which is excellent.

Alex Stamos:

Yeah. So anyway, yes. I'm so glad that that happened, because now we'll have something we can talk about in every sports update coming up. "Did Taylor go to the football game this weekend?" will become the big thing. And it's actually interesting, because there are a bunch of people who are really angry about the fact that a huge amount of the coverage was, like, in the middle of important plays in this football game, they're cutting to, "What's Taylor doing right now? Taylor is eating a fully loaded potato wing." Yeah.

Evelyn Douek:

Yeah. And also, apparently she's emasculating him somehow, or he's becoming woke and it's terrible. It is also the stupidest possible timeline, that this is the level of the debate.

Alex Stamos:

I mean, this is how life works. Every possible thing that happens in our universe has to boil down to a bipolar culture war, right? Yes.

Evelyn Douek:

Exactly. Nothing is safe.

Alex Stamos:

Nothing is safe. All right. How are the Matildas doing? They're off.

Evelyn Douek:

Oh, that was not... Oh, come on.

Alex Stamos:

Oh, I'm sorry. We can edit that out.

Evelyn Douek:

All right, well, with that, this has been...

Alex Stamos:

Oh, I'm sorry. I'm sorry.

Evelyn Douek:

Moderated Content weekly update live show. It's available in all the usual places, including Apple Podcasts and Spotify. And show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino over there.

Alex Stamos:

Woo, John.

Evelyn Douek:

Always super helpful.

Alex Stamos:

He's real. It's not just a name we make up.

Evelyn Douek:

Yay. Really appreciate his analysis and briefing every week. And it is produced by the wonderful Brian Pelletier. A special thanks also this week to Ben Rosenthal.

Alex Stamos:

Hey, Ben.

Evelyn Douek:

For helping set all of this up. And to Justin Fu and Rob Huffman, as always, for helping get it into people's ears. And thank you, everyone, for coming. This has been super fun. And we will talk to you next week.

Alex Stamos:

And I think you pointed out that there are more people in this room than have ever rated the podcast.

Evelyn Douek:

Oh, that's true.

Alex Stamos:

So let's get some coordinated and authentic behavior going, people.

Evelyn Douek:

Lock the doors. Lock the doors.

Alex Stamos:

Lock the doors. I want to say, to get out, you have to show me a five-star rating on your Apple Podcasts app. Lock it.

Evelyn Douek:

All right. See you, everyone.