Alex and Evelyn are joined by Moderated Content's Supreme Court correspondent Daphne Keller to talk about the oral argument in the NetChoice cases this week and what the Supreme Court justices seem to be thinking about whether and how states can regulate internet platforms.
Daphne Keller:
Alex, have you been to the Supreme Court before?
Alex Stamos:
Not inside, no.
Daphne Keller:
So, one of the things they do is they get a whole line of people who are on their way in to stand in this marble stairwell that's beautiful and has amazing acoustics. And this guard with a very deep resonant voice tells you that if you do any free speech activity, you'll spend the night in jail. There's this beautifully delivered lecture about how you better not even wear a pin that says anything on it. We are not messing around. You will go to jail tonight if you speak out in the courtroom.
Alex Stamos:
Wow. So the one place the First Amendment does not apply in the United States is within the Supreme Court.
Evelyn Douek:
Yeah. Time, place, and manner regulations will blow your mind, Alex. Turns out the government can do this.
Welcome to Moderated Content's stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek and Alex Stamos. Today we have a special episode where we're going to talk about what else but the oral arguments in the NetChoice cases at the Supreme Court on Monday. And to do so, we are joined by the wonderful Daphne Keller. Daphne is well-known to our listeners of this feed, the director of the program on platform regulation at Stanford Cyber Policy Center, formerly Associate General Counsel for Google, and most importantly, Moderated Content's Supreme Court correspondent. Thank you very much for joining us again, Daphne.
Daphne Keller:
Happy to be here.
Evelyn Douek:
And for your commitment in being live on the scene for this podcast in order to get the firsthand account of what it was like in the room.
Alex Stamos:
What it was like. What did the Supreme Court justices smell like, Daphne? That's what people are looking forward to.
Daphne Keller:
Dignity.
Alex Stamos:
They smelled like dignity, perfect, and freedom.
Evelyn Douek:
See, that would've been a great B-roll ad read. Dignity, the sense of the Supreme Court justices. All right, so I assume most of our listeners are going to know, but we should do some background and some primer and some setup for the case. So Daphne, walk us through, in a nutshell, what do the laws do and what was the argument essentially about on Monday?
Daphne Keller:
Yeah, so maybe I'll start this story one step before the laws get passed, which is in the, I think, probably early spring of 2021, Clarence Thomas wrote an opinion basically saying, "Maybe it would be okay to force platforms to carry content they don't want to. Wouldn't it be interesting if we had a case about that?" And spelling out what he thought the legal theories might be for why it would be okay to force platforms to carry content against their will. Basically because they're so big, and they're essential to communication, and that line of reasoning.
Not too long thereafter, both Texas and Florida took him up on that. Or perhaps they were not inspired by him, but from where we sit, this looks like chronologically ordered events with causation. So they both passed laws that in slightly different ways take away platforms' discretion over what content they're going to carry.
It's not that the platforms have to carry everything, but there are some new rules, including so-called viewpoint neutrality rules in Texas, imposed on them. So that's one piece of the laws, the "must carry" rules, as lawyers have started calling them. And then the other piece is that both of the laws have some transparency mandates. Those are different in Texas and Florida, but the part that wound up before the Court was a requirement to notify any user who's affected by content moderation, and in the case of Texas, to allow them to appeal.
So what went to the Supreme Court was the must carry rules and the notice and appeal rules, and the legal challenge it was looking at was the platforms saying this violates our First Amendment rights to set editorial policies.
Evelyn Douek:
Great. Actually, can we take those in the reverse order? Because we can dispense with the transparency provisions pretty quickly, since they were barely talked about on Monday. I thought it was fascinating. I mean, I think that these provisions are really important. You've written at length about how important it is to know the First Amendment standards that govern when governments can mandate that these private companies, these enormously powerful companies, disclose what they're doing on their platforms. Hugely important issue, and it barely got talked about during the argument on Monday. We always knew it wouldn't be the focus, but I was surprised it got basically zero air time. Did you have the same reaction, Daphne?
Daphne Keller:
Absolutely. It was amazing. There were maybe three references to that part of the case, and they passed by in a sentence or so. And what that says to me, or what I would guess based on that, is that the Court is willing to buy into the framing that the Justice Department gave it, and to some extent that the platforms gave, which is that the transparency rules, or at least the notice and appeal rules, should rise and fall with the must carry rules. Whatever the answer is about must carry, maybe it's the same answer about notice and appeal. And the way to sort of munge them together is to treat them as if they were the same issue, which they're not, actually; there are a whole bunch of really important differences.
But the way to munge them together is to say platforms have an editorial right to set and enforce their own policies, and if they have to bear this cost of notice and appeal every single time they enforce their policies, that's a burden that deters them from exercising their constitutionally protected editorial rights.
Evelyn Douek:
Right. Yeah, I mean it seems likely that they're not going to do anything particularly innovative or striking in this direction given that they didn't talk about it or didn't ventilate any issues. And these issues haven't really been ventilated in the lower courts either. It's not like the lower courts sort of dealt with these at length.
Daphne Keller:
Yeah. Not at all. I mean, my favorite stat on this is that in the 11th Circuit, the platforms' brief was 67 pages and they spent one page on the transparency issues. That's how big a priority it is.
Evelyn Douek:
Right. So there's at least one bucket of issues that I think is going to be back before the Supreme Court in not too long in another form because this case surely isn't going to decide the law governing transparency.
Daphne Keller:
Yeah. And maybe listeners to this podcast might be interested in knowing that there are cases coming up out of California and New York, or rather in response to transparency mandates that were enacted in California and in New York, and both of those have First Amendment challenges. Those laws are kind of different, like the exact questions they would bring to the Court are a little different, but that would be my guess about what would bring this back to the Court relatively quickly.
Evelyn Douek:
Yeah, completely agree. And it seems like NetChoice is not going to be a case that says much in either direction on that. So then we go back to the must-carry issues and the ways in which these laws regulate how platforms do content moderation. And the setup here is basically that we have the parties coming in with two very extreme positions and that these two very extreme positions have been reflected in the lower court judgments, as well, on this issue.
So to briefly summarize, it's something like this: the platforms say they can never be regulated in any way, that these laws are unconstitutional in all of their forms, and the 11th Circuit agreed with that. And the states come in and say, "No, no, no, there's no speech or First Amendment-relevant activity here at all. This is all conduct. You can be regulated like any other business," and the Fifth Circuit basically agreed with that.
And I think the most striking thing for me in listening to the arguments on Monday was that there were a significant number of justices, I think, who were trying to find a middle path somewhere between these two extreme positions, where they didn't want to necessarily be talking about this as either constitutional in all its applications or unconstitutional in all its applications. But I'm curious what you made of that, Daphne, if that was your reaction too.
Daphne Keller:
That's interesting. I think you might be right. I had not been looking at it that way. So I mean, just to kind of flesh this out, the way that these two extreme positions play out in a lot of the arguments is by way of analogy where the platforms are saying we're basically newspapers and the states are saying, "You're basically telephone companies." And both of those things are absurd on their face. Platforms are very different from newspapers, and they're very different from telephone companies. And we have a lot of areas of law, including liability law under things like the DMCA or the CDA 230, that recognize that platforms are this in-between thing that gets distinct rules.
So in the oral argument, I think that the thing you're pointing to where the justices wanted to do something more complicated and nuanced, what I heard was a number of them, probably five of them, which is the magic number, saying, "Yeah, we think Facebook and YouTube and X and these sort of most prominent social media platforms, we think they are engaging in editorial activity when they set and enforce their content moderation rules. But we also think that these laws, and especially Florida's law, might also apply to things like direct messaging, email, marketplaces, whether Uber can stop somebody from being a rider," and they were worried about issuing a ruling that would preclude regulation for those kinds of services, not because I think they had a super strong theory about what the right rule should be, but just because nobody had briefed that, like nothing in the case had been about those other applications.
So I'm not sure that they were trying to do some nuanced thing about Facebook. It's more that they were recognizing the universe is bigger than Facebook.
Evelyn Douek:
Okay, so it's interesting. Because I mean, I totally agree with that. I think that there was this concern in Florida's law about the fact that it might apply to the Ubers of the world or the Amazon Web Services or the Dropboxes and no one had talked about this and the question was, "Well, if it's constitutional as applied to them, can we really strike it down facially in all its forms?" Which just sidebar on this, it's so bizarre to be talking about these laws in this way when Florida was not subtle about the reason why it passed this law in saying it wanted to regulate Facebook and the Facebook newsfeed and it was upset that Donald Trump had been kicked off Facebook. And it wanted to regulate Facebook, but they did it kind of incompetently with a quickly drafted law. They're like the dog that caught the car, they have to defend it, and then suddenly it might be saved because their drafting means it accidentally also applies to Uber. It's a weird situation.
Daphne Keller:
Right. By drafting really sloppily, they made it harder to strike the law down.
Evelyn Douek:
Which should not be how the First Amendment works.
Alex Stamos:
We talk about this all the time with these state laws. While they're saying, "We're passing what's supposed to be a neutral law," they're wearing a T-shirt that says, "This is because of First Amendment protected speech." Right? Like, "I am angry."
Evelyn Douek:
That's right.
Daphne Keller:
Yeah. They were saying that in interviews, and the governor's signing statement is explicit about it. It's not subtle in any way, as Evelyn said.
Evelyn Douek:
Like they're saying, "We can regulate them because the platforms aren't engaged in expressive activity," while also saying, "We're really angry at the platforms for expressing these views about this content by moderating it in this way."
Alex Stamos:
Right. "Their views are bad, and we think their views are bad because I disagree with them, therefore I want to control their speech." My T-shirt saying all those things is raising many questions about my T-shirt saying I want to control the speech of others.
Evelyn Douek:
Yes. So everyone ignored the T-shirts in the room on the day because this is the Supreme Court.
Daphne Keller:
Well, but also in the cert petitions, there was a question about this that the Court expressly rejected and said it wasn't going to rule on, which is whether the motivations of the legislators rendered the laws unconstitutional. I mean, inevitably, no matter how you frame the question, the motivations of the legislators seem kind of relevant, but they refused to look squarely at that question.
Evelyn Douek:
Yeah. And this is one of those things I'm teaching my students, where the Court just doesn't look at these statements that the politicians make in public debate to interpret the laws. They just look at the face of the law and try to work it out from there. So it's not unusual for the Court not to be looking at that, but it is somewhat bizarre. It's this cognitive dissonance.
But I want to go back to the question of whether the justices were looking for a nuanced path through, because I thought, especially with respect to Texas's law, I heard a number of justices... And I completely agree, I think Kagan, Barrett, Jackson, Sotomayor, at least, were all saying something along the lines of, "Let's assume we agree that this is unconstitutional as applied to the Facebook newsfeed or the YouTube recommendation algorithm or the paradigmatic social media platforms." There were enough people saying, "Let's assume we agree that that's unconstitutional," which I'm sure made the platforms breathe a big sigh of relief.
But I heard a number of questions about, "Well, what about other parts of the platform, though, where they're not necessarily engaged in the same kind of curation function?" So DMs came up a lot, and someone mentioned things like Facebook Marketplace, or email, or WhatsApp messages: areas of platforms that don't look the same as the newsfeed, where they're not engaged in that curatorial function. And they're saying, "Well, maybe we actually need to," and Justice Jackson was talking about this quite a lot, "evaluate this function by function rather than just platform by platform," which I think actually was really, really welcome. Because it makes total sense to me that you can't just lump a whole bunch of activities under the same banner... like as long as you have the same logo at the top of the page and you shove everything into an everything app, suddenly you can hide all of it behind the First Amendment and say you can't regulate any of it. I think we do need to get to a place where you engage in that function by function analysis. And I was really pleasantly surprised.
Daphne Keller:
I agree with that. Yeah, no, I didn't mean to imply that anything under the Facebook branding was all okay. I meant the newsfeed, and DMs were definitely an example they raised repeatedly as something that might get different analysis.
Evelyn Douek:
Yeah. So I just thought that was super interesting and a real pleasant surprise, given it's not so long ago that Justice Kagan was making the quip that "we're not the nine greatest experts on the internet." There seemed to be quite a lot of nuanced conversation going on about understanding that different parts of apps are different, and that it's not just that this might also accidentally apply to Uber, but that we also might want to think about how it applies within Facebook to its various parts, and that they don't want to say anything in this case that prevents non-discrimination norms from potentially being applied to parts of platforms that are engaged in this much more common carrier-like activity.
I mean, Paul Clement for the platforms obviously pushed it to the nth degree and said they can viewpoint discriminate at any part of this, but I wasn't necessarily hearing the justices buying that.
Daphne Keller:
Yeah. So I share your generally positive response to the policy instinct behind their questions, that they wanted to slow down and say, "Hey, different features might get different treatment, and maybe there are some things where a common carrier-like treatment would make sense." But I overlay that with some pessimism about what that means procedurally, because what we heard Justice Gorsuch, for example, suggesting was, "Well, maybe this goes back to the fact-finding first-instance court," and they look at every single thing any possible platform could possibly ever do and every possible thing this law could ever mean, and once they've done a thousand years of discovery and analysis, then we can answer the question. But in the meantime, maybe the law can go into effect. Which would be a wild, wild outcome. I mean, he literally asked at one point, Alex, I think you'll find this funny, "Where in the record can I look for a list of every feature of every covered platform?"
Alex Stamos:
Oh yeah, no problem.
Daphne Keller:
Yeah, no, yeah. We'll get right on that, Your Honor.
Alex Stamos:
And it seems like this just comes back to the argument we always have, which is: what is a platform? I mean, we've just picked this term, platform, for... you have these Delaware corporations that own a bunch of products that use computers to do stuff, and often there are two human beings on either side of a bunch of computers in the middle, both the computers in the hands of the people and then computers up in the cloud. And the incredibly wide range of things that you can do when you describe that, we just capture it all in "platform." And it just seems like, I think, even our field, let's say academics who study this, we have not been careful enough, because we have not created the record of what we mean when we use the term. Evelyn and I use the term platform every single week, and I don't even know what it means. It just makes me sound smart.
So just like for most of the things I say, I just kind of throw it out there because I feel like I'll get some kind of social benefit from it. But is Facebook a platform? Is Meta a platform? It's a really hard, interesting question. What does that mean?
Daphne Keller:
It creates this artificial ability to communicate with people. We'll all agree to this placeholder, and we all kind of know we don't agree on what it means or even have our own internal definitions, but then we can have a conversation that would otherwise be impossible.
Alex Stamos:
Right, which I guess is just a real problem because, as you guys keep pointing out, the Supreme Court does everything by metaphor, right? Like everything, the entire thing is, "What 18th century invention can we map this thing against?" Right, it's like...
Evelyn Douek:
Well, okay, so here is the one that Solicitor General Prelogar offered in this vein, which, just sidebar again, she's amazing and did an amazing job, and wow, when I grow up, I want to be just like her. She did an incredible job, I thought. But the analogy that she offered here I thought was actually quite good, though I'm interested in whether there's pushback. She gave the analogy of Amtrak, which is a common carrier, and you can impose all these non-discrimination norms on it in terms of who it sells train tickets to, but if Amtrak produced a newsletter or a magazine, it would have First Amendment rights in how it produces that magazine. But just because Amtrak might produce a magazine and has First Amendment rights when it produces a magazine doesn't necessarily mean it has First Amendment rights to disallow people from catching the train.
And so this idea of disaggregating, just because an entity is a common carrier in certain functions doesn't necessarily mean that it's going to be a common carrier or be able to be regulated based on its conduct in all of the functions that it carries out. And I thought that that was useful.
Daphne Keller:
Although I think that the Solicitor General's position is that, following this metaphor, the platforms can keep certain people from catching the train. They can exclude from the newsfeed or from YouTube recommendations or from-
Evelyn Douek:
Oh, I think the newsfeed is the magazine. I think the newsfeed is the magazine in this analogy and that maybe DMs are catching the train, potentially, is how I understood it. But maybe the 18th century analogy breaks down as Alex was suggesting.
Daphne Keller:
Well, I mean I'm pushing back mostly because I think Texas and Florida's position is, "Sure they can say whatever they want in their own pages that they write or in their own posts," and that feels more like the magazine to me.
Evelyn Douek:
Yeah, no. So I understood her to be saying, "Look, if they've got an expressive product in certain areas that you can't regulate, like the magazine or the newsfeed, that's fine." But the biggest way in which she departed from Paul Clement's argument, I thought, in terms of the positions they staked out, was whether you can viewpoint discriminate in the provision of direct messages or emails or those kinds of much more common carrier-like, neutral conduit services.
Paul Clement said, "Yes, we can do whatever we like with respect to any of it." And she wanted to differ from him there and suggest that there might be the possibility of passing laws that could constrain platforms in that area.
Daphne Keller:
Yeah, I think she was as surprised as anyone else that the Court spent so much time on that. I think she wound up saying, "Look, we don't have a dog in this fight. We're here to talk to you about this other thing."
Evelyn Douek:
Totally. I think everyone was confused, surprised, and having to think and react on their feet as to, "What do we do if..." Because the parties came in with these two really absolute positions, and the way it had been litigated below hadn't been to talk about any of these possible nuances, and then suddenly the Justices come in and are like, "What about Uber and Amazon Web Services and DMs?" And they're like, "Oh, I don't even know if that issue is properly before you." And then there was lots of scrambling to work out what that means procedurally. And Paul Clement's talking about the possible long trial that's going to take place and all of these issues being ventilated, which has just not been how this has been litigated below.
That's one reality that was playing out in the court on Monday. But as with Congressional hearings about these matters, there's this split-screen reality going on, where you have a group of people in one reality having one conversation, and then you have another group of people in another reality having a completely different conversation. And so we should talk about this briefly.
Daphne Keller:
Oh, do you mean Justices Kavanaugh and Roberts?
Evelyn Douek:
Well, yeah. I mean, they were sort of separate. They were a little bit removed from the other reality. But no, I am talking about the people in Topsy Turvy land who were throwing out these Section 230 arguments and asking if content moderation is just a euphemism for censorship, and if what the US government did during World War I by locking up dissidents was just content moderation. To which, I'm sure that's a great analogy. Yes, Eugene Debs was content moderated into jail for his resistance to World War I.
Anyway, it was, I mean, not at all unexpected because as you said, Daphne, Justice Thomas had been the one to float this theory in favor of the states early on. He's really been playing the long game here. He also planted a number of seeds in his decision in Taamneh last term, which we can talk about in a second. And then here we have them making these arguments on Monday.
I guess the place to start with this is Section 230, because a lot of these arguments centered around Section 230 and the Justices' assertion to the advocates: "Aren't you platforms being totally inconsistent by disclaiming all responsibility for the content on your sites when you want Section 230 protection, and now coming in here today and saying, no, no, no, our curation, everything on our sites, is our speech, and this is editorial discretion?"
So Daphne, can you give us, I mean Paul Clement and General Prelogar deserve a medal for their patience in how they responded to these many, many questions. Why aren't these positions totally inconsistent?
Daphne Keller:
So maybe I'll just start with the background that this case was litigated with three challenges that the platforms brought. Number one, the First Amendment precludes these laws. Number two, Section 230 precludes these laws. And number three, the Dormant Commerce Clause bars these laws. And only one of those questions got accepted for review at the Supreme Court, and that was the question about the First Amendment. So you might think this would be a day in court when Section 230 does not get mentioned, but if you thought that, of course, you would be wrong.
And it came up for two very different reasons. One is more big picture, conceptual, and the other is more statutory and in the weeds. The big picture conceptual one is kind of more interesting and is what you were getting to. So let's start there, but I'd love to talk about both.
Evelyn Douek:
Yeah, please.
Daphne Keller:
So the claim, which I think is appealing to a lot of people, is, "How can it be that you are immunized for your content moderation choices under 230, and those exact same choices are your own protected speech, not your users' speech but your own protected speech, when it comes time to litigate these Texas and Florida cases? Isn't that inconsistent?" And the answer is no. Those things are definitely not inconsistent, because as you guys know, the purpose of Section 230 was to encourage platforms to exercise editorial discretion. The goal was to have them simultaneously make editorial choices and also be immunized.
So it's not inconsistent, and I think the assumption that it is inconsistent often comes from people whose view of communications and media was formed pre-internet, in this kind of circa-1980 world where, if you're a regular person, you can only ever talk on a common carrier like a telephone, and you can only talk to the people who already know you, et cetera, and if you want to broadcast to the whole world, the only way to do that is if you're a very, very privileged speaker, you're Tom Brokaw or something, and you have a network behind you and their lawyers are going to vet you, and there is no in-between. In a world where every carrier of speech has to either be a passive conduit on the one hand or a legally liable editor on the other, the only two kinds of media that make sense are phone companies on the one hand and broadcasters on the other.
And that's what the internet got us away from and opened up the possibility for regular people to be able to speak with much wider reach and ability to talk to the rest of the world, but the idea that a number of the Justices were voicing, that it would be inconsistent to immunize carriers and also allow them to assert the First Amendment, I think, is rooted in that vision of how media is supposed to work.
The other thing that's kind of bonkers about that framing is if you have a First Amendment right, Congress can't take it away from you by giving you an immunity. That would be an amazing tool for them if it were indeed a tool at their disposal.
Evelyn Douek:
One neat trick.
Daphne Keller:
Yeah. So that was problematic. And then can I shift to the doctrinal thing?
Evelyn Douek:
Please, yeah.
Daphne Keller:
So Florida's law says this law only applies to the extent consistent with CDA 230, and Texas's law has something a little vaguer, but that probably means the same thing. And so Florida said to the Court, "Look, you cannot interpret what our law means without first interpreting Section 230. You can't even get to the stage of doing First Amendment analysis without this multi-step process where you have to interpret 230 to get there."
And the part of 230 they want to have interpreted is the part that has been a bone of contention between the kind of MAGA Right, for the most part, on the one hand and a lot of other 230 wonks on the other for a number of years, and that's the idea, advanced by the Trump administration and advanced in this case by Florida and Texas, that platforms are only immunized from must carry claims like the ones created by these laws if they moderated content for an enumerated list of reasons, using some but not all of the words that are in Section 230(c)(2), and those are words like lewd, lascivious, excessively violent, harassing.
What that boils down to, if the Court were to buy into Florida's interpretation of the statute, is that platforms are immunized when they take down sexual content, violent content, or harassing content, but they're not immunized if they're taking down non-harassing hate speech, if they're taking down disinformation, if they're taking down pro-suicide material. There's just this gamut of really awful stuff that all the big platforms moderate, and that most users want them to moderate, but that wouldn't be part of their immunized right to moderate under Section 230 if the Court were to adopt this in-the-weeds doctrinal statutory interpretation that Florida's advancing.
Evelyn Douek:
Yeah, I mean this is a very minor point, but it was of immense frustration to me, and I assume to you, Daphne, too: the way the discussion of terrorism content tried to get brushed away. I think it was Texas at this point that was sort of suggesting, "Oh no, no, no, this won't make platforms carry terrorist content, because all terrorist content, we all know, is illegal content," and so they can take it down because it's illegal. And I think it was Justice Kavanaugh who said, "No, no, there is a whole world of terrorist content that is nowhere near going to meet the Brandenburg incitement standard or be unprotected speech. You can't get out of it that easily."
Daphne Keller:
And then eventually he got the Texas advocate to say, "Well, they can take down pro-Al-Qaeda content as long as they just take down everything anyone says about Al-Qaeda." And Justice Kavanaugh at that point was just like, "All right, you've dug your own grave, dude." He didn't even follow up on that one because the problem with it is so obvious.
Alex Stamos:
Is Kavanaugh the hero here, as the Justice who seems to actually understand that there's a ton of First Amendment-protected speech that none of them want to see online? It just seems to me, listening to the argument and then listening to the two of you recapping it, that from the perspective of those who are on the operational, technical side, there is still a humongous disconnect from the actual operational reality that did not really get plumbed here.
Daphne Keller:
So I think he is a hero for a certain position. It's not exactly the pro-content moderation, let's-protect-users position. It's more the "these are private companies and they have speech rights; they can do what they want with their private property" kind of old-school, property-driven conservative perspective. And that's a position that he has held for a long time and that he was outspoken about on the DC Circuit. It wasn't a big surprise. Indeed, it's a position Clarence Thomas seems to have held 20 years ago and has drifted away from.
So, actually, there's an amazing piece by Corbin Barthold, in a publication I can't remember, describing Justice Kavanaugh metaphorically as shotgunning a beer, crushing the can against his head, ripping off his robe, and shouting an epithet about the First Amendment. That very much frames him as the hero in that mold, and in that mold he is indeed the hero.
Evelyn Douek:
Yeah, I think it was in the Daily Beast, and it's certainly extremely vividly written and a great read. I wanted to resist that framing, though, of Kavanaugh being the hero, at least in my story. Kavanaugh's job on Monday seemed to be chiming in every half an hour or so to say, "But the First Amendment's about the government, right?" He's just like, "Oh, censorship is by the government, right? If it's not the government, it's not a problem."
And I want to resist that framing. I think he's in this camp where it is all very easy. Like as long as the regulator is a private company, there's no real threat to free expression and we can move on. Credit to him at least, as you say, Daphne, for being very consistent. This is a position that he has held for a long time and does not appear to be budging from it in the least in the face of the culture wars and the politics around this.
But Justice Kavanaugh's idea of an ideal speech environment is not my idea of an ideal speech environment, where it's just, "Government bad, corporation fine." And while he was far out on the one side, sort of thinking, "What are we even doing here? I've read the First Amendment. It says Congress shall make no law. Why are we here?", I'm definitely much more in the camp of the people in the middle going, "Okay, that's certainly true with respect to some expressive activity, but it can't be the whole story." It's about being nuanced about this, and not being so formalistic as to say that as long as it's a private actor acting, there's no threat to free expression or to people's speech. If we're going to end up in a world with a healthier online speech environment, that's the approach I'm much more interested in, myself.
Daphne Keller:
Yeah, I mean I do think Kavanaugh is capable of a little more nuance on this, but the States were being so absurd on the other side that there was never really a reason to get into the nuance. But as you know, in his dissent in the USTelecom net neutrality case in the DC Circuit, he had this very weird gloss on the Turner case, which is an old case about making cable companies carry local broadcast TV. He said that in that case the cable companies, the analogs of the platforms today, had First Amendment rights, but the government had a good enough reason to override them and force them to carry stuff they didn't want to carry. And that reason was competition and market failure.
So he has this model where if you have enough of a competition problem, you can go ahead and compel platforms to carry things they don't want to, or at least some things, if you have a good enough state purpose and it's sufficiently tailored, etc. But what's weird about that, of course, is that if you look at Turner, there wasn't just one government reason to override the First Amendment rights of the carriers. There were three, and one of them was diversity of voices, what might be called media pluralism in some contexts, and that's sort of absent from Kavanaugh's version of things, at least in that case.
And the diversity of voices media pluralism goal is the goal that Texas and Florida were asserting some of the time. And of course, that's not what their laws achieved, right? Like we shouldn't pretend that these laws actually carried out that goal, but an interesting thread in the case and different Justices' position and advocates' positions on it is, "Who is and isn't on board with the idea that the state might have an interest in ensuring that a diversity of voices get an airing in the public?"
Evelyn Douek:
Yeah. One of the other parts of the argument that I found really interesting was, "Well, what would the laws achieve?" There was some discussion about what platforms would actually do if these laws went into practice. And we already talked about the idea that they might just remove whole subject areas of content. So it's this bizarre situation where the states have passed these laws ostensibly to further free speech and to make sure that platforms are much more speech-protective, and then they're getting up in oral argument and saying that it's not going to allow terrorist content because what the platforms can do is just take down anything that ever mentions terrorism, whether positively or negatively.
And then NetChoice's lawyer, Paul Clement, was also at one point saying that what this might incentivize platforms to do is say, "Let's do only puppy dogs, at least in Florida, until we can get this straightened out, because we're getting hammered by people saying we're not doing enough to keep material that's harmful to children off our sites. And so in order to consistently, and not in any viewpoint-discriminatory way, make sure that we're getting this harmful material off our sites, we'll just nuke entire categories of content and only have puppy dogs." Which, honestly, is not a terrible world, but it's definitely not a pro-free speech world, and not one that's great for democratic discourse, as excellent as puppies are. So yeah, I thought it was really interesting to hear the discussion of how platforms would possibly comply if these laws were to go into effect.
Daphne Keller:
I agree. And it was interesting also that so much of the discussion was focused on what it would mean to be viewpoint neutral or what it would mean to be consistent, which is where this idea comes in of, well, just take down everything in a certain category, take down all the non-puppy content.
Evelyn Douek:
Right.
Daphne Keller:
But both of these laws actually have some content-based rules dictated by the government about what platforms have to carry or what platforms are free to take down, regardless of its viewpoint, because the state disapproves of it. And it was interesting to me that along with the transparency rules, the content-based parts of these laws got almost no discussion in the case. I think because that part's so obviously problematic that it wouldn't have been that interesting to talk about.
Evelyn Douek:
Yeah, I guess that's right, too. They didn't talk about the rule that you can't take political candidates off in Florida, or the exceptions to the rules in Texas, which of course had been why the platforms came in with a facial challenge in the first place and said that these laws are unconstitutional in all of their applications, because they are content-based discriminatory laws, and that makes them a violation of the First Amendment. But yes, the Justices weren't so interested in talking about that and were instead trying to work out, "Well, how actually would this apply to other things?" in interesting ways.
Daphne Keller:
The other thing that the platforms have really leaned on the whole time, and that Paul Clement leaned on in oral arguments, is the idea that the laws are facially unconstitutional and you can just strike them down with no further discussion because they target platforms based on their size. And this is a part of the argument that, frankly, I just haven't paid that much attention to because I'm not invested in it. I kind of like the idea that we might have different rules for the biggest platforms, and certainly we have different rules for bigger economic players in other non-speech areas. But I think the platforms would have loved to get a sort of clean-sweep ruling where targeting big platforms at all automatically made this a First Amendment violation, and it didn't seem like they were getting a ton of traction with that with the justices; at some point, Paul Clement started focusing on other arguments.
Evelyn Douek:
Yeah. I mean, Justice Kagan was asking about it at one point, and her voice was dripping with incredulity when she was asking about this. I don't know, I assume her face was also baffled, but I could hear her bafflement coming through the internet. She was like, "It seems an impossible argument for you to make that this is unconstitutional just because it regulates really big platforms. It's totally reasonable for a state to conclude that it should focus on the big platforms." And Paul Clement made the argument, "Well, but it's focusing on these because it doesn't like these ones in particular," which, if true, would be a problem. But she said, "Let's take that out of it. Let's say that the state is focusing on these platforms not because it thinks they're the liberal platforms, but because it thinks they're the big platforms, and the government is entitled to regulate the biggest players," and she clearly thought that was totally fine.
Daphne Keller:
She was making the face that... It wasn't bafflement.
Evelyn Douek:
[inaudible 00:39:21].
Daphne Keller:
It was something a lot less kind than bafflement.
Alex Stamos:
Is this an example of that? I saw a number of people pointing out that Clement, being a famously good Supreme Court advocate, was still hampered by the fact that he was representing a coalition: this is NetChoice, it's not Meta and it's not Google. Did you get any feeling for that, Daphne? That if Clement were representing a specific company, where he could tie it to that specific circumstance, he'd be in better shape?
Daphne Keller:
I am positive that his hands were tied by that. I'm trying to think if there were specific moments, and Evelyn jump in if you have any in mind, where it seemed like he could have given a crisper answer but for that. I mean, I imagine he just couldn't commit on anything that hadn't already been put to a very large committee.
Alex Stamos:
Right. I mean, imagine getting sign-off from the general counsels of those companies, all of whom sue each other all the time. That's one of the amazing things about NetChoice: these people hate each other, but they're all bound to this one goal. When you talk about the "what is a platform" argument, it seems it would be a lot better for him if he represented, say, just Meta, and he could say, "Okay, well, which of our platforms is not a speech platform?" Which gets really hard when you have a bunch of folks for whom there are things that are direct editorial, written by the employees, as well as search and other things that are not actual social media products.
Daphne Keller:
Right. I mean, I guess one thing he could have done, if he didn't have a coalition, is throw some other companies under the bus and say, "Oh, of course ride-sharing. Ride-sharing's not in scope or..." Yeah.
Alex Stamos:
Yeah, no, that's actually a great example of a company whose membership in NetChoice is kind of bizarre, other than that they pay their dues.
Evelyn Douek:
Well, and the other thing, I mean, I don't know how much this was influenced by the fact that it was a coalition so much as... I think the main thing that had him scrambling and sort of flat-footed on the day was the litigation choice made below to bring only facial challenges, to argue only that this is unconstitutional root and branch in every application. The Justices weren't having that, and the problem was that NetChoice hadn't brought an as-applied claim below that would have let them say, "Well, okay, it doesn't matter if it applies to other companies; it's unconstitutional as applied to us." They just hadn't made that choice below, because they wanted to go big or go home, I guess.
Daphne Keller:
They did argue that the laws were unconstitutionally vague, though. And it seems like all of these questions of, "Well, couldn't these laws mean a thousand different things?", that's what happens when you're unconstitutionally vague. Nobody has fair notice of where the laws actually apply, and that's a ground for striking them down on that basis.
But when they argued that, Florida responded by saying, "Oh, there's no speech issue here at all. Content moderation is not speech. End of analysis. You don't need to answer any more First Amendment questions after that," and the litigation from there on kind of proceeded with a focus on that question and didn't get back to, among other things, the vagueness issue.
And so one of the things that the Court was going back and forth on is like, "Well, whose fault is that? Platforms or Florida? Who has the burden?" It is supposed to be the platforms that have the burden, but as Justice Sotomayor said just a few minutes in, at some point if the State drafts a broad enough, unclear enough law, shouldn't the burden shift to the State to justify why it did it that way? So there was a lot of legal procedural wonkery around that, but I think it's a pretty deep and important question. It goes back to what we were saying before, can you draft a sloppy law and prevail for that reason?
Alex Stamos:
It seems like this raises another meta-issue that I read about, which is the idea that there's just too much here: despite the hours and hours of argument, there are so many interesting, crazy questions that there's no possibility the Supreme Court... Now, obviously, they're going to sit for months and have their clerks read every single thing that's possibly related, so it's not like they can't work in parallel. But it's a funny disconnect, it seems to me, between the oral argument, where you're time-limited, and the fact that thousands of hours of research work now have to go into the fact that there are 12 different legal questions here.
What do you think? I mean, we're getting to the end. We should probably talk about predictions, but do you think that they will make a decision based upon just a couple of narrow... Or is this going to be a complete rewrite of First Amendment jurisprudence for the internet? Are they going to tackle all 12 relevant issues, or is the whole thing going to disappear on one? What do you think is going to happen here, Daphne?
Daphne Keller:
I don't think they can tackle all the relevant issues because they're just like fractally expansive. There's no way they could even list them all, much less actually analyze them all, in part because the laws have so many ambiguous provisions, and in part because there are so many different product features to multiply that by to get the number of questions to answer.
Alex Stamos:
The list of legal questions is as long as the list of product features that they're looking for.
Daphne Keller:
Yeah.
Alex Stamos:
Okay, great.
Daphne Keller:
Yeah, and in part it's because there are some genuinely novel, interesting questions that haven't been litigated before, and in part it's because the laws, especially Florida's, are so badly drafted, so they could mean so many different things. And because the states insist that every single provision in the laws is severable, meaning that the Court could strike down sentences 0 through 999, leave standing sentence 1000, and then strike down the next thousand sentences, which, again, fractally expands the number of questions to answer.
So I don't think they have the option of issuing an as-applied ruling in response to a facial challenge. And the Justices were literally asking the advocates, "What are our procedural options? How can we write an opinion here?" One scenario I can imagine is that formally the states win, because formally the facial challenge gets rejected, but the Court rejects it with analysis that clearly tells the lower courts to immediately enjoin enforcement as to the Facebook newsfeed. That sort of tells us what the answer is without being the formal moment when it gets resolved.
Evelyn Douek:
I think there is no way that they are going to try and resolve all of these issues, and I think that's fantastic. I think we are in for a long road ahead. It's a full employment program for academics, lawyers, and podcast hosts who are interested in content moderation issues. I heard a number of justices being really nervous about deciding issues that hadn't been properly argued, that they didn't have all the facts for, that they hadn't really thought about. Justice Barrett at one point was talking about landmines, how she gets nervous that in deciding one issue you're accidentally planting a landmine for another issue later on. And Justice Sotomayor was saying, "I get really anxious with these really broad laws, about how to think about applying them." I heard this from many, many of the Justices, Justice Jackson also.
I think this is fantastic that there is real clarity on the need to be narrow here and not decide things that they hadn't thought about properly and at length. So that's very good because it means that there won't be too many votes to enshrine something way too rigid or way too broad that hampers the development of the doctrine in the coming years. But it does mean that we're going to be going back down, ventilating the issues in these particular cases. We're going to be seeing lots of other issues bubbling up in the other cases.
Daphne Keller:
And under these terrible laws.
Evelyn Douek:
Right. Yes. I mean, these are terrible laws, and that was one of the things that made me nervous about these cases, though: because they are so terrible, and because they were passed with such obvious animus and bad motives that made them so anathema to the First Amendment, there might have been some really broad ruling, because it looks like an easy case. And the fact that it looks like an easy case might lead the Justices to be complacent about all of the really tricky issues that are lurking just below the surface once you poke at them a little bit.
So yes, it'll be with these terrible laws, but also with the many other laws that we're seeing around the country, and I think there were a number of times where the Justices suggested that these very cases will be back up before the Supreme Court in not too long. So yeah.
Daphne Keller:
Although, I mean, if things do play out that way and the First Amendment issues remain unresolved and there's more stuff going on on remand, that provides a big reason for either NetChoice or some other regulated platform acting independently to say, "All right, let's litigate the 230 question a little faster now, or let's litigate the Dormant Commerce Clause question a little faster now." And particularly that latter one, I would love to see more judicial attention to it sooner. Because that's the question of whether we can have a 50-state patchwork of different speech rules for platforms and expect them to somehow implement that and expect their users to live in a world where they never know if their speech is going to make it across state lines to their friends or to their customers because of these differences in state rules.
So to me, it would be useful to move a little faster to get some attention to that question so that we don't all waste our time arguing about laws that maybe states can't actually pass in the first place.
Evelyn Douek:
Right. We all thought it was the First Amendment that was going to prevent states from passing these laws, and then it turns out, coming in from the other side, it's the Dormant Commerce Clause, and the states can't pass them for that reason instead. And you're right, this has not been an issue that has been well-ventilated, and it's extremely important as we have basically every state legislature in the country passing at least one if not many more of these bills in the coming months.
Alex Stamos:
Is there some way you can also pull in Chevron? Is there a way that they can rule that both takes away speech powers of platforms but also means that you can't regulate fracking? Is that...
Daphne Keller:
Oh, yes. I think we dodged that. But the transparency and notice-and-action fight is over a case called Zauderer, and if the anti-administrative state Justices wanted to rule on Zauderer in a way that further dismantled the administrative state, this case gives them ample opportunity to do so. So the fact that we didn't hear that discussed in the oral arguments was actually a big relief to me.
There's this analogous fight going on right now, sorry to digress at the last minute, but I actually think this is really important, about whether the Securities and Exchange Commission can require companies to disclose climate change-related risks. And the reason they would have the authority to compel that disclosure is the same thing that's being thought about in NetChoice, about Texas and Florida having the authority to compel disclosure about content moderation.
And so over in the SEC case about climate disclosures, you got things like the Attorney General of West Virginia weighing in and saying, "This is compelled speech on a specific topic, and it should get strict scrutiny, and definitely the SEC should never be able to compel this kind of disclosure." And meanwhile, his usually allied Attorneys General in Texas and Florida are going to the Supreme Court and saying, "Oh, regulators can make companies disclose anything. No problem." So there's a deep administrative state question that hopefully we're just going to dodge.
Alex Stamos:
Okay, well, let's not start down that road because I actually have a lot of bones to pick with the SEC and how they're handling cyber stuff right now, but I don't think... Unless you guys want to go into hour two now? Is that what we should do?
Evelyn Douek:
Yeah, Alex's thoughts on the administrative state. Let's go.
Alex Stamos:
We'll make this Daphne part one. Daphne part two can come out later and be even better. It'll be The Empire Strikes Back. I'm referencing... I saw Dune 2 last night, and I'm still shaken to my core. It was incredible.
Evelyn Douek:
But not as incredible as this podcast, Daphne. You are...
Alex Stamos:
No. Daphne part two will blow... We just need Hans Zimmer to score Daphne part two. That's all we need is an electronic trombone right now.
Evelyn Douek:
I was just, the whole time I was thinking that's what's been missing, the electronic trombone. This is really interesting what you're saying, but where's the electronic trombone?
No, this has been great, and I think we're going to have part two, part three, part four. As long as you are willing to come back, there is going to be so much going on. Like I said, it's a full employment program. But it's good there were some dashes of optimism throughout this conversation, I think-
Daphne Keller:
There were several, yeah.
Evelyn Douek:
Which is welcome. Yeah.
Alex Stamos:
We don't do that a lot on this podcast. It's weird.
Evelyn Douek:
Yeah, we're not really sure what to do with that. I feel uncomfortable. I'm going to have to go stand outside in the rain for a while now.
Daphne Keller:
Yes, it's not on brand for me either. Maybe it's negatives canceling each other out.
Evelyn Douek:
Well, let's not ruin it. Let's jump off before anyone says anything too sad. Thank you so much, Daphne, for joining us, and we'll talk to you again soon.
Daphne Keller:
Talk to you later.
Evelyn Douek:
This has been your Moderated Content weekly update. This show is available in all the usual places, including Apple Podcasts and Spotify. Show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perino, policy analyst extraordinaire at the Stanford Internet Observatory, and it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman. See you all shortly.