Alex and Evelyn discuss the TikTok ban bill, the EU's investigation into Meta for violations of the DSA, the passage of the REPORT Act, and the Supreme Court's denial of a stay of Texas' age verification law.
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
TikTok Tick-Tock
Netzwerkdurchsetzungsgesetz (EU Policy Corner)
Legal Corner
Sports Corner
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Alex Stamos: I feel like we've got to do the legal corner sound at the top here, because it's like this whole thing is legal corner. There's nothing substantive here.
Evelyn Douek: Whoa!
Alex Stamos: It's just people arguing about what other people do.
Evelyn Douek: What? Not substantive? Did you just imply that my entire life's work is, quote unquote, not substantive?
Alex Stamos: I don't think I implied. I think I just said it.
Evelyn Douek: This really hurts, man. It's rough. This is the most substantive podcast we're going to have done in a while, in my opinion.
Alex Stamos: Somebody's got to argue about other people doing real work. So it's important, it's an important job. And then somebody's got to teach the people who argue about other people doing stuff.
Evelyn Douek: So you're saying I'm like four things removed from substantive work is where I'm at. Just sitting out here doing my-
Alex Stamos: Oh, no, no, no. You teach people to argue about substantive work.
Evelyn Douek: It's only three, three degrees of separation between me and substantive work. Got you.
Alex Stamos: Actually, you do a podcast about people teaching people to argue about substantive work. So you're like three and a half.
Evelyn Douek: Three and a half. Yeah. Well, ouch.
Alex Stamos: But good luck on tenure.
Evelyn Douek: Good thing lawyers are all about procedure anyway. Welcome to Moderated Content's stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. Before we get started, we run a pretty loose ship over here, but our professional producer, Brian, makes the excellent point that I should remind you to please follow this feed so you never miss an episode. Our irregular scheduling means that if you subscribe, at any moment you could be pleasantly surprised to see an episode drop. So please go ahead and do that. That would actually make a big difference. And rate us while you're there. Why not? All right. So to start with: Alex, you have a correction for us this week.
Alex Stamos: Yes. A correction corner because here, we are very serious journalists here at Moderated Content. So I think we have to do our NPR voices.
Evelyn Douek: Right. The highest standards.
Alex Stamos: The highest standards. Here on Moderated Content today, we have a correction, and that correction is: several episodes ago, I mentioned that the lawsuit, X Corporation versus the Center for Countering Digital Hate, the nonprofit that had criticized Mr. Musk and was therefore sued by him for breach of contract, violation of the Computer Fraud and Abuse Act, and a variety of other things, was being fought by White & Case LLP, the large law firm. And we had a very nice person from White & Case write in, first to ask for a correction, because the partner who filed that, Mr. Jonathan Hawk, has since left White & Case, and then to ask for us actually to remove the audio from the podcast. And so while we're not going to do that, I am going to say it's very important for us to be correct. So it is absolutely correct.
Evelyn Douek: Absolutely, highest standards. Highest standards on this podcast.
Alex Stamos: The person who lost this case when it was dismissed was not White & Case. However, White & Case was the law firm that filed the lawsuit against the Center for Countering Digital Hate to try to suppress their First Amendment protected speech, utilizing the Computer Fraud and Abuse Act. So they filed it, but they did not lose it. So there is your correction for the day, and I'm sure that this has brought no more attention to this issue than if you had not asked us to edit our podcast in the past. That is your correction corner. And if you have any other corrections, please make sure to email Evelyn, not me, but then I will be the one who passive-aggressively reads them out in the next episode. Thank you very much. That's the end of the correction corner.
Evelyn Douek: Absolutely no comment on that one. Not touching it. Moving right along over here. Let's go to our-
Alex Stamos: There goes my expert witness work from at least one firm.
Evelyn Douek: Right. Let's go to our TikTok Tick-Tock for the week. Obviously, I think that's where we have to start because what else? It is hitting the headlines as listeners are no doubt aware. Last week, Congress somehow passed, and the president signed into law, the Protecting Americans from Foreign Adversary Controlled Applications Act, or PAFACA, as literally no one likes to call it, but as I was calling it with my students last week.
Alex Stamos: What are congressional staffers doing if they're not coming up with 20 different acronyms, right?
Evelyn Douek: They're really phoning it in this week. It's terrible.
Alex Stamos: Okay. Yeah. I'm just giving a thumbs down, no tip of my hat. This is a wag of my finger at the congressional staff who were not able to come up with some kind of incredibly patriotic acronym for this. I mean, that's just ridiculous. They knew this act was probably going to pass. If you're going to phone it in on something that has no chance, sure. But something that is actually going to pass, you really need to work on that. So let's run a little competition. If anybody else has a better acronym, perhaps we can propose it for any future legislation. So go ahead and write in again, Evelyn, not myself.
Evelyn Douek: Thank you. Yeah, it's amazing that they're going to lose the PR war with the youth over this act with catchy phrases and names like PAFACA. Another name for it, of course, is the TikTok ban bill. Now, you'll hear some people say that it's not a ban because it's really just a forced divestiture; of course, we've talked about this on the podcast before. The bill gives TikTok, in this new version slightly longer than before, 270 days to find a new and specifically non-Chinese owner before the provisions penalizing app stores for hosting it kick in. Now, to my mind, it's really cute to argue that it's not a ban because all they're doing is trying to force the app to sell, when it's still pretty obviously a ban.
The way that I like to think about it is if you said, for example, the New York Times has to shut down unless it finds new owners in nine months. And you say that after months and months of criticizing the New York Times' editorial positions on things. I think that no one would say that this doesn't implicate the First Amendment or isn't trying to shut down the New York Times' speech. So this is, to my mind, just basically a prettier version of the Montana bill. That was the stupid version, which basically went and outright banned the app, and in the preamble included a bunch of stuff that said we're banning this app because of the content, including protected speech like milk crate challenges and NyQuil chicken. So Montana did the stupid version, and a court in November held that it was likely unconstitutional.
Now the question is, does prettying it up in this way mean that it's more likely to survive? But to my mind, the fundamental issues are still the same. The constitutional issues that the bill raises are really still essentially the same as the Montana bill, and the answer should still be no. But before we jump into all of that, Alex, I'm curious to get your high level thoughts on this. Also, it's wild to me that this is how you pass laws in this country. There was this whole debate around the bill and then it gets shoved into this must-pass foreign aid package so that it doesn't even really get proper debate. I mean, Biden, when he signed the bill, didn't even mention the TikTok parts of the law. Is this democracy working? Just not a loaded question or anything, but is this how it's supposed to work?
Alex Stamos: This is exactly... Have you not read? It's in Federalist 17. They talk about how the way you should do legislation is to have three bills, the omnibus bill, the NDAA, social security reauthorization, debt ceiling, four or five bills, and then you just attach everything to those. It is exactly what the founders wanted. I actually think there's an entire song in Hamilton about omnibus bills. It got cut for time, but Miranda wrote it.
Evelyn Douek: Yeah, not because it's not catchy or anything. Because omnibus bills just rolls off the tongue.
Alex Stamos: Real easy to rhyme with the word, omnibus. Yeah, it is interesting. There's actually a great, she needs our help, so we should give her some podcast listenership, but this little podcaster named Kara Swisher actually had a fascinating interview with Maria Cantwell, a Democrat of Microsoft, I mean Democrat from Washington, and it was a fascinating discussion about her privacy bill, but then also the work that the Senate tried to do to fix up the TikTok bill on constitutionality issues. And effectively, she admitted that they tried a bunch of things that got rejected and so they're like, "Okay. The house passed this, so let's just see what happens." So I think there were some really serious people who were trying to look at the things we were talking about of, could you make this a privacy bill instead of about speech? Could you make it not just about TikTok and make it much broader?
I still would've much preferred a privacy bill that had good controls around PRC access to data beyond just this company, or beyond just social media companies with X million users. That's not what we got. And so the Senate had this process where they're trying to do better. That being said, there is also a privacy bill. So I think an interesting way for the Senate to act now would be to push forward Cantwell's privacy bill, to go through the process here and then maybe to strengthen the China provisions, so that if this initial bill doesn't pass constitutional muster, then you've got a backup of effectively a much more defensible bill that isn't explicitly about speech platforms. That being said, the problem with a privacy bill for Congress is it affects every company. I think this is what people miss: you hear all the time that we've got to regulate tech companies, so we need a privacy bill.
The truth is, an American GDPR would have minimal impact on the big tech companies because they already have to comply with GDPR. I was there, and Facebook spent hundreds of millions of dollars on all the work that had to happen to comply with GDPR. The problem there is the rest of the Chamber of Commerce, because an American privacy bill affects all of the companies that only have American customers. So every domestic telephone company, all of a sudden, their costs go through the roof. Every domestic insurance company, car company, everybody. And so the pushback against privacy bills is not from the US tech industry. They're fine with it because they already have to live by those rules globally. It's from all of these big industries that have 99% to 100% of their customers inside of North America, where all of a sudden they have to do everything tech did 10 years ago for GDPR compliance.
Evelyn Douek: Yeah. So I actually want to really amplify that point, because I think this part about the companies that it applies to is really, really important, and I see it getting lost in a lot of the conversation. I haven't seen a lot of people talking about this, and I think it's really, really important. So the issue here is, the argument is we need to do this because we're concerned about data privacy and about the Chinese government getting access to users' data. And so this is the only way to address that, and the effect on speech is incidental to that aim. So this isn't about the speech, this is about the data privacy. And if that's true, if that's what the bill is doing, then it's a lower level of scrutiny. It'll be subjected to what's called intermediate scrutiny rather than strict scrutiny, and so it's more likely to survive.
The problem with that argument is the sections of the bill that are not about TikTok but that say, look, we're going to create this general process for companies that the president can designate as covered by this, a covered company, and therefore subject to the same restrictions as TikTok. The definition only applies to applications that permit users to create an account or profile to generate, share, and view text, images, videos, real-time communications, or similar content. And so this bill will only ever apply to social media companies, or companies that are about content creation, that are about speech. And so all these other Chinese companies that raise potentially the same data privacy concerns, like the shopping companies, like Shein and Temu and things like that, they're not going to be caught by this bill because they don't meet that definition.
Alex Stamos: I mean, they're definitely going to get rid of user reviews. I think it'll be fascinating to see all of the companies, all of the Chinese companies that are not explicitly social media companies, they're just going to get rid of anything that could possibly make them related here including if you're a game company, you're going to get rid of as much interactivity as possible as we've talked about. This clearly catches Fortnite, it catches anybody who has voice chat or the equivalent. And so you're just going to see them pivot of like, "Oh, our core mission is not user generated content." So yes, some of our product managers are going to complain, but because of this bill, we could completely survive it if we just cut out the chat box, which is ridiculous because that does nothing for the actual privacy issues involved with that product. It just means that their product design is now all about getting around American law. Oh, well.
Evelyn Douek: So it gives the lie to the idea that this bill is about privacy, because it's clearly targeting speech platforms. And then even if the court accepts, okay, it's about privacy, this sort of under-inclusiveness is going to be a problem at the tailoring stage. The court is going to say, we're suspicious about this because it doesn't actually achieve the goal that you say you're pursuing. If your concern is data privacy, why are you tailoring it in this way? And so that then leads us to the other argument, because I think that means the data privacy argument isn't so convincing for this bill. That leads to the other argument, which is much more directly, we are concerned about foreign propaganda. We are concerned about the Chinese government surreptitiously shaping political debate, public debate in this country, influencing people's views. And then this bill is extremely narrowly tailored to that goal.
If you are concerned about the Chinese government interfering with a speech platform and shaping people's views, this is the only way you can really meet that aim, I guess, unless maybe we talk about transparency provisions. Things like that might be another way of actually meeting that aim, which is often what the court will say. They can shape views, but they need to label it. They need to be much more transparent. The problem with that is that there is a constitutional right to receive foreign propaganda, and this is pretty well established in the law. There's a 1965 case called Lamont v. Postmaster General, where it was literally Chinese propaganda. This was the Peking Review, and there was a scheme where people had to fill out a little form. It didn't even ban the Peking Review, actually. All it did was say, if you want to receive the Peking Review, you just have to fill out a little card and let the post office know, yes, I want to receive the Peking Review, which obviously-
Alex Stamos: I'm a communist.
Evelyn Douek: Right, exactly. Please watch me.
Alex Stamos: The card just happened to be red because that's the paper that we had.
Evelyn Douek: Right. Yeah. Nothing chilling or scary about that at all.
Alex Stamos: In the 60s, that would've been super neutral. Yeah.
Evelyn Douek: 100%. So in the 60s, the court said-
Alex Stamos: So that's the reason why The Epoch Times just delivers a newspaper that I've never ordered, and they're allowed to just drop it off. You're like this is totally a legitimate newspaper with a legitimate business structure if they can afford to print it on paper and have somebody bring it to my house every day. Yeah.
Evelyn Douek: Way more persuasive I'm sure. You definitely read every page for the pictures. So the court said no, even that chilling effect of having to fill out the form and give it to the government, that's too much of a chilling effect on First Amendment freedoms. People have a right to receive this if they want to. And then there's this lovely statement by Justice Brennan in the concurrence of that case that says, I'm going to paraphrase here, but essentially the fact that other countries, the countries that originate this propaganda don't have the same press freedoms is no reason for us to abandon that freedom. It makes us special and we should hold onto that.
Alex Stamos: Right. Look at the shirt you're wearing. It says, we're the good guys. And so if you're the good guys, it's easy to say you're the good guys. It's hard to be the good guys. I feel like that's what the justices are trying to say. It's not just enough to say we're for freedom. Freedom is hard when it's other people's speech.
Evelyn Douek: Exactly. And if it wasn't hard, we wouldn't need rights. If it was easy...
Alex Stamos: It would be special.
Evelyn Douek: You don't need the First Amendment. Everyone just does it because it's easy and good and the right thing to do. The point is that rights are there when it is difficult, and even when the government comes in and waves national security around. And I just really want to underline this. National security concerns are legitimate, and obviously that is an area where courts might be more deferential, or they have said that national security concerns are a compelling interest that the government can pursue, but also, the government really has to demonstrate, it has to show its cards, has to demonstrate the harm, and it can't be merely hypothetical.
And of course, that makes sense, because otherwise, what is the First Amendment? What is the right that's secured, if anytime the government wants to, it can come in and wave a piece of paper around that says, we have national security concerns, we're not going to tell you what they are, but they're really very serious, and then it gets to ban a whole bunch of speech or a speech platform? So it'll be interesting to see whether the federal government can, in this litigation, show more evidence of the national security concerns, evidence that they aren't just hypothetical at some moment in the future but are real and happening now, than Montana could in its litigation. But that's one of the things that we'll have to wait and see.
Alex Stamos: So you say the court, but that requires it to be a legal fight, and so far, nobody has filed anything, which is, I assume, just everyone getting ready. Obviously, this is going to be a humongous, incredibly expensive lawsuit. So I'm going to guess that there have been some billable hours in the last two weeks. What do you think?
Evelyn Douek: Yeah, 100% right. This is a full employment program for lawyers and for academics that teach about lawyers who make these challenges and for podcasters.
Alex Stamos: Which is a totally legitimate job. Those are all legitimate jobs.
Evelyn Douek: Yeah. I mean, it's definitely substantive for sure. It's not substanceless to podcast about being an academic that studies these lawyers.
Alex Stamos: No. So here's a substantive question. Where do they file this? Obviously, this affects every single district in the country. What happens there?
Evelyn Douek: Yeah, great. So actually it's one of the unusual provisions of this act. It says that any disputes arising under it have to be filed in the DC Circuit, and that's going to have exclusive jurisdiction. So my guess is that any minute now, essentially maybe by the time you're listening to this podcast, there will be a lawsuit or maybe multiple lawsuits challenging this bill. In Montana, of course, we had TikTok that sued on its own behalf, and then it also had this bunch of plaintiffs, users who used TikTok, and it was this lovely case where they went and found the perfect plaintiffs, like a woman who was a small business owner who sold moccasins, whose only income was from TikTok sales, and a vet who used TikTok to reach out to other vets struggling with mental health issues.
I can't remember the specifics, but it was like you pick your plaintiffs really well to have the most sympathetic case, of this is why this is a really important speech platform that isn't all about Chinese propaganda or NyQuil chicken, but where actually the vast majority of it is pretty wholesome content and clearly protected speech. So that's the thing I think to be watching. But honestly, that first legal challenge, I mean, it'll be nice, but it's kind of, I think, just a sort of necessary evil, because I think this is on a fast track to the Supreme Court. I think it's going to be heading up there pretty quickly.
Alex Stamos: So the Constitution doesn't say that they have primary jurisdiction over lawsuits with Chinese companies? That was not in Federalist 17 or in the Constitution? The Supreme Court gets the case initially?
Evelyn Douek: No. Yeah, unfortunately, we're going to have to wait for the DC Circuit to work this one through. I mean, I guess there are many benefits to that. But yes, it's going to get real when we get to the Supreme Court, because this is raising all of these new issues and these fundamental problems that honestly aren't just about TikTok. And I think that, again, we shouldn't forget this. This isn't just about TikTok. This is about how we think about these issues in a globalized world with a globalized internet where we're going to have foreign platforms. This is not going to be the last successful Chinese platform. And so are we just going to allow it?
Alex Stamos: It's the only one now as we keep on talking about.
Evelyn Douek: Right. So are we just going to allow these platforms to be banned anytime the president wants to or are we going to have some other kind of system?
Alex Stamos: Yeah. I mean, I am a little afraid there's a bad-defendants-make-bad-law situation here too. So we're looking at, what, at least two years, two terms from now, three terms from now, before it's argued in front of the Supreme Court, because I mean, there'll be a stay-
Evelyn Douek: It could be quicker than that. It could be quicker than that. Yeah, because it's also, I guess, not a particularly factually complicated litigation. It's not like there's going to need to be massive discovery or lots of evidence to sort through. And if we look at the jawboning case, for example, that was much more complicated and that still got through pretty quickly. So I imagine this could be pretty fast.
Alex Stamos: Oh, no. That's a good point. It's like, what are you going to ask a witness? Is ByteDance a Chinese company? Yes. Okay, great. Do you have this many users? Yes. I mean, they're not going to dispute that. I think in their lawsuit, they probably will. I mean, we'll see what happens. It'll be interesting to see if TikTok tries to fight whether they're Chinese or not, to fight the ownership provisions based upon the ownership by a variety of different parties, and their Cayman structure and such. But yeah, it's interesting. I mean, I'm just a little afraid because who knows where we'll be vis-a-vis China, but it's going to be very hard. The constitutional issues seem pretty clear from how you explained it, but I also think it's going to be very hard to sit in DC in the middle of really heightened tensions with the PRC and come down on TikTok's side. This is going to be fascinating.
Evelyn Douek: And that's exactly why when I talk to reporters about this, I say something along the lines of, I'm 150% sure of what the precedents say here and what the outcome should be, but I'm nowhere near 100% sure about what the outcome will be. And that's because especially in this national security area, judges get worried. They get scared and they can be deferential to the government coming in and making these claims even when-
Alex Stamos: You could have another Korematsu, right?
Evelyn Douek: Right.
Alex Stamos: Is there a First Amendment equivalent of Korematsu? Is there a First Amendment case that was during World War II? I mean, that would be the last equivalent, because obviously the Vietnam cases all go the other way. They're all very pro-speech.
Evelyn Douek: So the equivalent here is an infamous case called Holder v. Humanitarian Law Project, which is actually a quite recent case, I think 2010. It was about designated foreign terrorist organizations and Americans wanting to communicate with those terrorist organizations, including giving them advice about how to make international law claims so that they could pursue their objectives peacefully rather than using violence. And so that's the kind of core political speech that ordinarily should be protected.
But it is, I think, a pretty hard decision to reconcile with the rest of the First Amendment structure, because the court said, "Well, no. This could help them, even if not in pursuing their own law claims, by making them more legitimate and letting them free up their resources in other ways," so we are going to allow the government to criminalize this speech. And it's, to my mind, a pretty shocking decision. But it is an instance where you can quite clearly see the judges saying, "This is national security. This is foreign relations. We're going to defer to the government when it comes in and makes these claims," even on pretty, I think, flimsy evidence. And so that's a really recent example of the kind of thing that we might see here too.
Alex Stamos: There you go. None of you have to go take Evelyn's class or get a degree from Stanford Law because you just got it all on a podcast.
Evelyn Douek: Right, there you go. Yeah, there's nothing else to it. It's just those two cases. Okay. So speaking of governments wanting to constrain foreign platforms, let's go over to Europe now. Okay, so this week the European Commission announced that it has opened formal proceedings into whether Meta, through Facebook and Instagram, has breached the Digital Services Act, which is this sweeping new legislative package of content moderation laws that came into force for these platforms last year. They've announced a bunch of investigations actually recently, but this is the first into Facebook and Instagram and it's going to be watched closely. There's four areas of concern that the commission has raised about suspicions that Meta is not complying with the Digital Services Act. So the first is around deceptive advertising and disinformation campaigns. So the idea here is that maybe Meta is not doing enough to clean up its platforms in terms of especially Russian disinformation and especially in the lead up to the European elections in June.
And related to that, there's also a complaint about the mechanism to flag illegal content. The commission is worried that the notice and action mechanism isn't compliant with the DSA and that it's not easy enough to access or not user-friendly. So there's an investigation around that. So that's two. The third, and our listeners will know a lot about this, is about the shuttering of CrowdTangle. For people who haven't heard it, a couple of weeks ago we spoke to the former CEO of CrowdTangle, Brandon Silverman, about Meta closing down CrowdTangle and its new content library that it's opening up, and why Brandon thinks that's nowhere near sufficient or substitutable at this stage. It might be in the future, but for now there's going to be this pretty big gap for researchers in terms of following what's happening on the platform. And so the commission obviously listened to our podcast, heard these concerns, and thought, we need to investigate this. And so again, this is, especially in the lead-up to the elections in June, something that they're watching.
Alex Stamos: So I mean, a couple interesting things here. The CrowdTangle stuff, you and I totally agree, CrowdTangle is one of the best transparency tools, and I think it is great that the Europeans are going to push for transparency. But I think there are two things here that Meta has legitimate complaints about. One, on the transparency side, even if they get rid of CrowdTangle, they're way better than almost anybody else. So their inferior APIs are still better than what you get from TikTok or YouTube, or certainly Truth Social or Gab or Parler, or now X, which is filing lawsuits, filed by White & Case but not lost by White & Case, to try to suppress NGO research into their work. So if you're Meta, you're like, "You're starting with us?" They have a little bit of a legitimate complaint there.
Evelyn Douek: Yeah, they're not starting. They do have other investigations. They have also opened an investigation into X, I think.
Alex Stamos: They do have a number open, so it'll be interesting to see which one makes it to court first, effectively, if court is even where it goes. I mean, what is the enforcement? They would have to fine them under the DSA and then there would be an appeal. So it doesn't make it into court for the initial fine.
Evelyn Douek: So I think that's right. I mean, I'm not an expert on the procedures here, but I'm pretty sure that's right, that the commission has the power to fine these platforms. Quite significant fines, there's some pretty heavy penalties if it finds that there is a breach of the DSA and then presumably a platform would challenge that.
Alex Stamos: Yeah. Okay, so it'll be interesting to see. Maybe they try to line them all up so it doesn't look like they're just picking on one company that European regulators don't like, because it would be completely unfair to punish Meta but then nobody else. Because honestly, again, I think CrowdTangle should stay up. I think CrowdTangle should be available to European researchers and researchers around the world. This is public data. There are no real privacy issues involved with CrowdTangle. It's just cost. And I think Meta does not like that transparency hurts them, especially when their competitors are not punished for not being transparent. And so it would be, I think, the wrong signal to punish them for being the best and then going to being not quite as good. That's the wrong signal to send, because what you're really telling companies is never try in the first place, because they're going to punish you if you ever change your APIs.
Evelyn Douek: So the fourth complaint that the commission has, or is investigating, against Meta is a perfect example of this point, I think. So the fourth complaint is about the visibility of political content on Meta. The commission suspects that, and this is from the announcement, Meta's policy linked to the political content approach, which demotes political content in the recommender systems of Instagram and Facebook, is not compliant with the DSA obligations, and that this is not compatible with transparency and user redress obligations, as well as the requirements to assess and mitigate risks to civic discourse and electoral processes.
And I just think this is incredible. This is one of those situations where I'm no fan of this policy. I think this is a bad policy. I don't like that Meta is downranking political content. I think that it's harmful to public discourse. There's been some great reporting in the past few weeks about some of the unintended consequences of this. We'll link to an article in the Washington Post in the show notes that goes through some of these. It's hurting not just politicians and not just political ads, but people like influencers that produce political content, or LGBTQ+ influencers whose content is judged to be political, or people talking about human rights issues that are judged to be, quote unquote, political. All of them are having their content downranked as a result of this policy, and there are concerns about the vagueness and the breadth and the lack of transparency.
So I think this is a bad policy. At the same time, Meta was really transparent about doing this. It announced this decision. There's an opt-out process on its platforms for parts of it, so that users, if they want to see political content, can do so. And in a sense, again, this is a good example of how a platform could do this without announcing it. It could just start downranking political content and we might never know. And would the commission open an investigation as a result of that? I mean, probably not. But as a result of Meta announcing the decision, the commission gets alerted and starts looking into it.
Alex Stamos: Right. This is the European bloc equivalent of "I've been shadowbanned." It's just, it's what they asked for. So if you're working at Meta, if you're Nick Clegg and you're talking to these people in Brussels, you're like, I had to listen for years to Europeans complaining that Facebook is too influential in European politics. So Facebook decides to be less influential in European politics, and then you get sued for that. This is what they asked for, and this is what the media asked for. I keep on having this argument with folks: roll back to 2016-2017, what were you saying then? That you did not want Facebook to have political ads. You did not want Facebook to have political content. You did not want people talking about politics on Facebook. And so they're doing all that, and then all of those have negative consequences.
Again, I don't think it's a good policy. I don't like the negative consequences, but they were completely foreseeable, and they're exactly what civil society in general was asking for, and now they're turning around and complaining. So between the CrowdTangle thing and this, I think Meta has a good argument that this is specifically targeted at a company that is not well-liked in Brussels, on topics for which you're not punishing anybody else. Are they punishing Pinterest for not having political content? If somebody is doing a bunch of comparisons of bridal dresses and they've got their pin board of all the wedding dresses they love, does Pinterest have to shove in there, like, here's an update on neo-Nazis in France and how that relates to the European election? No. And so like you said, it's, wait, what is the signal they're sending here?
The signal they're sending is never carry political content in the first place. If you make a change to your ranking algorithms, never be transparent. And never provide APIs, because if you ever change your APIs, you can have the Europeans investigate you. It's pretty ridiculous. And the only complaint in this whole thing that's possibly valid is the one about a bunch of disinformation that is going on. And what we're seeing, and our colleagues at SIO have written a number of things about this, is there is a serious AI spam problem on Facebook now. Generative AI has become a humongous problem, and it's obvious when it's the economically motivated spam. It's a lot more subtle when it's politically motivated. And so the possibility that the European elections are being really messed with from an AI-generated political spam perspective is actually quite high. But they really weaken their argument on that really legitimate topic when they include this really stupid stuff.
Evelyn Douek: Yeah. And so it'll be interesting to see what happens. The process from here is a little unclear. The commission is going to investigate. Unclear on timeline, they'll make demands for information. It could be that they come back and they say none of this violates the DSA and you're good to go, thanks very much. But we'll see. I think that launching these investigations so soon after the DSA has come into force, and obviously in the lead-up to the elections, sends a pretty clear signal that they intend to pursue enforcement actions and are trying to get platforms to shape up. So it'll be very interesting to see how this all plays out, and whether the commission is transparent about it, whether the commission is open about what information it gets, how it sets its standards, what metrics it's using, all of that, or whether it's going to make these announcements, get these headlines, put informal pressure on the platforms behind the scenes, and none of us really get a lot of insight into what exactly is going on.
Alex Stamos: Yeah, I would like an API to see all of their interaction with the platforms. That would be fascinating.
Evelyn Douek: Yeah, absolutely. Okay, so in another completely unsubstantive update that we have this week, Congress did something completely unsubstantive and passed the Revising Existing Procedures On Reporting via Technology Act, or the REPORT Act. So Alex, I assume that you think this is a big nothing burger, right?
Alex Stamos: So first, kids, that's how you do a background right there. That's an excellent one right there. Yes, the REPORT Act has been languishing. It had passed out of the Senate unanimously and had been languishing in the House for quite a long time. On a previous podcast, Shelby Grossman visited us and we talked about our big SIO report, in which we said Congress really needs to fix several things, and several of the things we talked about were in the REPORT Act. It is necessary, but it's not sufficient. So there are a bunch more recommendations in our report for Congress, but this is a good start. It changes some of the technical stuff around holding deadlines. A key thing it does is it creates liability protections for cloud providers so that they are allowed to hold CSAM if they are a vendor of NCMEC.
So it creates a model by which NCMEC can use AWS, they can use Google Cloud, they can use Microsoft Azure, which they never could before. That's one of the things we talked about a lot in our report, the fact that NCMEC has to buy their servers and rack and stack them in a data center. It's not how anybody does anything except the hyperscalers themselves. And so this could possibly lead to modernization. Now, there are a couple of real challenges there. NCMEC now needs the people to be able to design a cloud architecture. They need the money to pay a cloud provider, or they need a cloud provider to donate all of that. They need to be able to hire a bunch of engineers to do it. They need the funding to pay those people competitive salaries. So this is not a magic solution. Like I said, it's a good first step.
I hope that Congress stays involved here with this problem, because as we talked about in that last podcast, the AI wave of CSAM is coming, and it's going to take a huge amount of work by NCMEC over the next year to get up and ready for it, and we don't have time to lose. So I hope Congress follows up, now that NCMEC can start the process of building cloud infrastructure. We need DOJ. This is where it gets a little technical, but effectively, DOJ has to approve what NCMEC does here. And so DOJ is going to have to be flexible and thoughtful about their application of rules in the cloud, and we're going to need Congress to probably follow up with a significant amount of funding so that NCMEC can actually hire people and get this done. But anyway, great step. Glad to see it passed.
Evelyn Douek: Right. Yeah, so it's like a tale of two bills. The first bill is a blunderbuss approach, probably unconstitutional, doesn't even really fix the problems that it's purporting to fix, and has no acronym. And then on the other hand, you've got this bill with this beautiful acronym that is a measured step taking a targeted, sensible approach to fixing actual problems. So, on the one hand and on the other hand.
Alex Stamos: When you put it like that, it's clearly all in the name.
Evelyn Douek: That's right. It's the measure of quality. It turns out you can judge a bill by its cover. That's the only way to judge a bill. That's what you learn at law school. Yeah.
Alex Stamos: It's great. I don't have to go anymore. Perfect. Thank you.
Evelyn Douek: Exactly. This is a very cheap way to get a legal education here. But speaking of that, let's head over to the legal corner for the week. Okay, great. So Georgia became the latest state in the union to pass an age verification law, as Governor Brian Kemp signed a bill into law this week. These are so common now that there's basically no mainstream news coverage of these bills. It's another bill that requires parental consent for minors, that is, people under 16, to get a social media account. There was, like I said, no mainstream news coverage. When I was Googling this, the top Google hits came from sites like the Christian Voice and catholicvote.com. So that's the extent of coverage and interest in this bill. I would've said a while ago that these are almost certainly unconstitutional laws, despite the fact that there are many, many of them around the country.
But as we've talked about on this podcast before, one of these laws, a similar law in Texas that was an age verification law for adult sites, was upheld by the 5th Circuit Court of Appeals. And just this week, the Supreme Court denied an application for a stay of the law pending the cert petition for the appeal from this 5th Circuit decision. Now, this is a fuzzy signal. It's not really clear exactly what it means that the Supreme Court didn't grant a stay of the law, so it's going to allow it to remain in force even while the appeal is ongoing. There were no reasons given. And so it doesn't necessarily indicate anything about how the court views the merits of the appeal or whether it will grant the petition. On the other hand, I was surprised that it didn't grant a stay, given that the 5th Circuit decision is explicitly at odds with Supreme Court precedent.
And when I say explicitly, I don't mean, oh, I know the Supreme Court precedent and I read the 5th Circuit decision and I can see that they're at odds. I mean the 5th Circuit literally said, yes, we see this decision, Ashcroft v. ACLU, on this point, and that should mean that this law is unconstitutional, but we think the Supreme Court didn't really mean it, because reasons, and those reasons are pretty uncompelling. So the 5th Circuit explicitly disregarded Supreme Court precedent, and First Amendment rights are at stake here if the law goes into effect. That would normally be a situation where you would see the Supreme Court stay the law, at least pending its adjudication of the cert petition and then ultimately the merits. And so it's not a great sign that it didn't grant the stay, and it's just sort of a real sign of the times about how you have this 5th Circuit, again, just sending up all of these cases to the Supreme Court and really testing the limits of First Amendment and constitutional law in this area.
Alex Stamos: It's interesting. The interesting question is what effect this is going to have. This is going to be by far the largest state that has been affected by one of these bans. If all the major porn providers end up cutting off all of Texas, which is the likely next step, it'll be fascinating to see what the impact of that is. Texas, I mean, they've got all their don't-mess-with-Texas, land-of-the-free kind of stuff. I'm not sure every single adult in Texas going to a porn site and getting a banner saying, you're not allowed to see this because, people of Texas, your legislators are idiots, is going to play well there. So I actually am kind of fascinated. I wonder if that will have an effect, either on the law being changed before there's a final decision here, or on the personal side, with widespread VPN downloads to just bypass it, or if you end up having political pushback, because, again, it's not a state that's famous for people doing what they're told.
Evelyn Douek: Well, so Pornhub did pull out of Texas as a result of this law in March, and of course, there's the classic Google Trends chart that shows a big spike in Googling for VPNs at exactly the same time. And it is interesting that I don't know that I've seen massive outcry or political pressure. I wonder if porn consumption has, in any way, gone down as a result of either this law or Pornhub pulling out of Texas, given the easy access to VPNs and the many other sites that aren't within the reach of Texas' jurisdiction that people can access this content on. So it could just be a sign of the futility.
Alex Stamos: Yeah. I haven't seen anything good, but I'd like to see some real empirical data here. I think this is an opportunity for somebody to get their PhD on this. I can't imagine that actual porn consumption has dropped any measurable amount. I think, like you said, this is a massively unregulated business in which there are 10,000 small competitors to the handful of big companies that have geo-blocked. But yeah, it'll be interesting to see whether this becomes a political issue. In the UK, their equivalent law was basically killed by an upswelling from the grassroots. And so it'll be interesting to see if, at some point, that actually happens somewhere in the US. It's surprising to me that it hasn't yet.
Evelyn Douek: Yeah. And in the meantime, the 5th Circuit will just keep on teeing up these cases for the Supreme Court. They're blatantly defying precedent, and I guess their assumption is something along the lines of, the court can't grant cert on them all, so maybe one is going to slip through and this law will survive. I don't know.
Alex Stamos: Right. It's the SCOTUS DDoS attack that we talked about.
Evelyn Douek: Exactly. But yeah, and it's great. I mean, again, it's very helpful for people that podcast and teach about unsubstantive stuff like this, because it's keeping my syllabus nice and interesting with all of these Supreme Court cases. So next term, we could have both this Free Speech Coalition case and the TikTok case, and so it continues to be interesting times in content moderation and Moderated Content land. So, any sports updates for the week? Alex, what's the news?
Alex Stamos: We're in the middle of the NBA playoffs, but I don't care because my beautiful Sacramento Kings were eliminated in the play-in round, unfortunately, and people are enjoying the Knicks. The Knicks have been bad for a long time, so it is fun to see people enjoy basketball in Madison Square Garden. The flip side is you have to root for New York. New York sports fans are the absolute worst. They're total bandwagoners. And then when the Yankees are good, you never freaking hear the end of it. And because the entire sports media is headquartered in New York, they get massively overplayed. So I'm happy for the Knicks players themselves and for the real fans. For most New Yorkers, I don't really care so much. So anyway, yeah, the NBA playoffs are happening.
Evelyn Douek: Feeling a little bitter.
Alex Stamos: Feeling a little bitter. It is possible that Sacramento is a little sad about this year. Next year, next year is the year. We keep on saying that every year.
Evelyn Douek: That's right. I seem to recall you saying that last year, but I'm sure next year really is the year.
Alex Stamos: It is. Next year is the year. Tough question.
Evelyn Douek: Excellent.
Alex Stamos: Yeah, and things are getting interesting in college sports. There's a bunch of crazy legal stuff going on. Florida State has sued the ACC, the conference that they're part of. So they're suing the organization that they're a part of. They basically want out, but this lawsuit is about transparency. So if you want, we could actually make a crossover episode here if we want to talk about it. But the ACC is where Cal and Stanford went for safety after the Pac-12 fell apart, and so now the ACC may be falling apart. So it's a great time for Bay Area sports, because we're going to end up playing in the Premier League or something. They'll go, how did this happen? Okay, great. As long as it's stable and it's not falling apart, we'll just fly to Manchester.
Evelyn Douek: This is incredible. I don't follow this at all except for you giving me updates and it always sounds like some sort of sitcom or like a Parks and Rec kind of situation of the way that this is all playing out. It seems so ridiculous.
Alex Stamos: And there actually is a Supreme Court connection here, around interpretation of the Supreme Court precedent about whether or not, effectively, we're moving towards a world where college athletes are getting paid like professionals, and that's creating all these problems. It's great for the students. I understand why it's totally appropriate to do that, but it is also blowing up college sports. And at the elite universities, it might really blow it up, because you're going to end up in a situation where having semi-pro athletes pretend to be students is not going to be acceptable to universities like Stanford and Cal and the Ivies and Chicago and schools like that.
And so it is fascinating that we might end up in a situation where a bunch of schools effectively opt out of the highest end of college sports and go back to real student athletes, which is what people have talked about. The problem is they all have these humongous budgets, and those are based upon the idea that they are competitive at the highest levels. And so downsizing your athletic programs is going to be incredibly hard for them. Anyway, it is a fascinating challenge for American academia: how do you handle these effectively semi-pro sports teams that happen to share a campus with your chemistry department?
Evelyn Douek: Yeah. Wow, that's a surprisingly optimistic take I think about the academic integrity or aspirations of these institutions. So it'll be interesting to see.
Alex Stamos: It's been discussed, and it's been discussed for schools like Stanford, at least, to get out of football, because football is what's driving all the weird stuff. But then a number of these schools, like Cal and Stanford, are among the top contributors of Olympians; I think Cal, Stanford, USC, and UCLA are by far the biggest contributors to the US Olympic team across all the different Olympic sports, the swimming and water polo and all of those. And how much do you want to punish these poor Olympic sports kids because of football and the deal with Fox Sports? It is already ridiculous, the idea that our volleyball team is going to have to fly to the East Coast on Tuesdays and still go to class because ESPN and Fox did not want to give a deal to the Pac-12. I mean, that's just a ridiculous statement, but that's how things are happening.
Evelyn Douek: Well, I'm glad we got some substantive coverage in this sports corner right at the end.
Alex Stamos: At least we're talking about something real like sports.
Evelyn Douek: Exactly. With real substance and heft, and so glad we could slip that in right at the end. And with that, this has been your Moderated Content weekly update. This show is available in all the usual places, and show notes and transcripts are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory. It is produced by the wonderful Brian Pelletier. And special thanks also to Justin Fu and Rob Huffman. Talk to you next week.