Moderated Content

MC 1/19: Casey Newton On His Holiday Reading List

Episode Summary

Alex and Evelyn are joined by Platformer's Casey Newton to talk about his decision to move his newsletter off Substack, and how to think about difficult content moderation decisions at different levels of the internet stack.

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos are joined by Casey Newton of Platformer and Hard Fork to talk about his decision to move his newsletter off of Substack. 

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Evelyn Douek: All of us use platforms every day that have terrible content on them, absolutely. I don't see it, but I'm sure that it's there. I'm sure that every single one of our listeners uses platforms every day to probably listen to this podcast that has terrible content on it. Nazis use email, I use email. This podcast recording platform, I'm sure, is also used for terrible content as well. You can't have a purity test. There's no moral choices under capitalism.

Alex Stamos: There's stuff I hate on law.stanford.edu.

Evelyn Douek: Yeah, that's right. I'm sorry about that. I know you disagree, Alex, but come on.

Casey Newton: The reviews that are written for Evelyn's latest paper.

Evelyn Douek: Exactly. My stunning review of my work.

Hello and welcome to Moderated Content's stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. We've got something a little different and special for you today to kick off the new year. Today we're joined by Casey Newton, founder and author of a newsletter about social networks called Platformer, and co-host of a minor competitor podcast named Hard Fork that comes out through a small outlet you might have heard of called The New York Times.

Alex Stamos: Never heard of it. Is that like the New York Post? Do they have great headlines like the New York Post?

Casey Newton: Yeah, it's very similar. It's very similar.

Alex Stamos: Okay.

Evelyn Douek: Need to work on their SEO. I'm sure you can find it though, if you're very good at googling.

Anyway, Casey is basically the best in the biz on this beat and always a must-read for me. And while Casey is someone I always follow for his reporting on platforms, we've got him on today to talk about a story that he has not just been reporting out, but has been at the center of, the Substack content moderation, or lack thereof, backlash. So last week, as many of our listeners no doubt will know, Casey made the decision to move his newsletter off the platform after not being satisfied with answers he got from the team there about how they're going to handle content moderation going forward.

Thanks very much for joining us, Casey, to talk about this. Was it as fun as it looked being at the center of this cluster beep that made everyone angry on all sides?

Alex Stamos: Yeah. Tell me, Casey, what is it like to have a platform that's targeted from all sides? I have no idea what that feels like and I have no empathy for somebody like that.

Casey Newton: First of all, hello and thank you for having me here. I am not lying when I say this is my favorite tech podcast. I truly look forward to every episode. When I got the message asking me to come on, it was a dream come true. And so I am thrilled to be here in the presence of people who I consider to be luminaries of the content-

Alex Stamos: Aww.

Casey Newton: ... moderation universe. So thanks for having me.

Alex Stamos: You mean the center of the censorship-industrial complex?

Casey Newton: That's right.

Alex Stamos: [inaudible 00:02:52].

Casey Newton: The enemies of progress. Yeah, it's nice to be here among the enemies of progress.

Alex Stamos: Casey, it's the enemies of progress.

Casey Newton: The enemies of progress. That's right. Well, thanks for having me.

And no, it hasn't been fun. It has been painful for reasons I'll talk about, but it has also been a very clarifying time. And I think when you are in journalism, it is good every once in a while to just remind yourself, what are your values, and then to try to live up to them. So it's been one of those kind of moments for me.

Alex Stamos: That's great, man.

So we're super happy to have you here as both a commentator and as the person at the center of this controversy, which we're going to talk about. We're going to do a little recap of exactly what happened. But most of today, I think we want to talk about what this means for content moderation going forward and some of the interesting questions it raises.

Before we get there, first I'm just going to recommend everybody listen to the Hard Fork episode with Casey and Kevin Roose, an employee of this small outlet called The New York Times, as Evelyn talked about. You guys do a very thorough tick-tock of everything that's happened. So I think we can assume, if you're listening to this podcast, you know a little bit. But why don't you give just a five-minute recap-

Casey Newton: Sure.

Alex Stamos: ... of what's happening, Casey, so we're all on the same factual basis?

Casey Newton: Sure.

Alex Stamos: And then we'll dive into what it means.

Casey Newton: Sure. You know what? I don't want to brag, but I think I can do it even faster than that.

Alex Stamos: Okay, great.

Casey Newton: So it all starts in November when a college classmate of mine, named Jonathan M. Katz, writes an article in The Atlantic saying Substack has a Nazi problem. And then he goes on to identify a lot of far-right extremists who are using the platform, some literal Nazis, others who are just writing about the great replacement theory and other violent ideologies. And I write a lot about content moderation. I wasn't particularly surprised to see Jonathan's article, but I did wonder what Substack would say about it. They said very little.

Then a few weeks later, a group of about 250 Substackers wrote a letter to the management, essentially saying, "Hey, what are you going to do about all these Nazis on the platform?" And so I waited for a response. And then it came on December 21st, when Substack co-founder, Hamish McKenzie, put out a blog post saying in effect, "We don't like Nazis around here, but we're not going to cover up the fact that they exist and we're not going to demonetize them. And that is our policy and Merry Christmas." And that was the point where I thought, well-

Alex Stamos: It's like the Christmas card I always want to get, is the, "We will continue to revenue split with Nazis. Have a Merry Christmas and Happy Hanukkah."

Casey Newton: Yes, it was a, we wish you a Nazi Christmas from the Substack Company. And so that was when I thought, well, I may actually have a problem here. And of course, at that point I did start to get more messages from readers. People were calling me out on social media. I started talking with my business partner, my managing editor, Zoe Schiffer. And we started putting our heads together and thought, what do we do about this?

Evelyn Douek: I'm actually curious, why now? What made this cycle different? We've been talking about Substack and its approach to content moderation for ages. I was just looking: in January 2021, my colleague at Lawfare, Jacob Schulz, wrote a post about Substack's curious views on content moderation. That was when they first came out with their big blog post about how they believe in the marketplace of ideas, and they'd been reading Justice Holmes, and they're in the best tradition of free speech.

And then around April last year, we talked about this on the podcast as well, CEO Chris Best had poor answers to Nilay Patel's questions on his podcast, Decoder, when he was asking specific questions about what they would and would not remove. And he'd basically said, "I'm not going to play content moderation gotcha." And that was like, why won't you even just tell us what your rules are? And so what made it different for you this time when it was in the news again?

Casey Newton: I think there were two things. One was, I was and I remain unaware of any major U.S. internet platform that does not have a policy of removing actual 1930s Nazis, or praise for those Nazis, support for those Nazis. To me, that was just a baseline. Look, there's a lot of other really objectionable stuff. There are some anti-vaxxers on Substack making a lot of money and I have never liked it. But I just held my nose because I take it as the price of doing business on the internet: you're occasionally going to be surrounded by people that you not only don't agree with, but actively dislike.

The second thing though, in addition to that, was that Substack had really evolved over the past year. Last April, they had unveiled this thing called Notes, which is just basically a Twitter clone. They had also increased the number of surfaces where they were recommending content. And so all of a sudden the idea that you might see a literal 1930s Nazi blog post next to Platformer in some sort of feed had gone from impossible to potentially quite likely. And so for those reasons I thought, it feels different now than it did before.

Alex Stamos: I want to take a pause here to make sure we're all on the same definition of Nazi. It is one of those terms that gets thrown around; anybody who's part of internet culture knows about Godwin's law, that whenever you bring up the term, "You're a Nazi," or you compare somebody to Nazis, the conversation is over, the argument has been lost. But when we talk about Nazis here, we're talking about the way our grandparents used the term Nazi. We're not talking about the Gen Z Nazis. Probably all three of us, to certain 17-year-olds, are all Nazis. We all have political positions that are somewhere to the right of Bernie Sanders in one way or another and therefore we're all Nazis. But you're not talking about Gen Z Nazis. You're talking about greatest generation Nazis here.

Casey Newton: That's right. But to pick the story back up, on December 21st when Hamish published that blog post, I didn't quite know what kind of Nazis we were talking about. I had read Katz's story and it appeared that he was talking about some literal Nazis, but he was also talking about Richard Spencer, who is a neo-Nazi. Whose Substack, by the way, does not have pro-Nazi content on it. I wouldn't call it innocuous, but if you didn't know who he was, it doesn't immediately scan as a content moderation emergency that he's on the platform.

That's when Zoe and I got together and we said, we need to figure out what we're dealing with. What kind of Nazis are on this platform? And so we worked with some journalists and some extremism researchers and just basically said, "Hey, show us the worst of the worst." And they gave us 40 sites or so, and we spent the next few days over our winter break, over our precious winter break, reading literally the worst stuff on Substack, and tried to categorize it.

Alex Stamos: Well, the weather outside is frightful.

Casey Newton: For real, for real.

Alex Stamos: But the Nazi content's delightful.

Casey Newton: So yes, as the stockings were hung by the chimney with care, we're sorting through these blogs and categorizing them. It's like, okay, this one is a great replacement blog. This one is a Western chauvinism blog. And at the end of it, we have seven things that we think are just 1930s Nazi. One memorable one had a love letter to Himmler, the head of the SS.

Alex Stamos: Oh, wow.

Casey Newton: Another one, this was the only one that Substack actually removed before we sent it to them, but there was one that said basically why the last genocide was good and why we should do another one. And then there was just a lot of praise for Hitler, republishing parts of Mein Kampf, explaining why Hitler was actually a good Christian and lashing out at people who said that Hitler was not a good Christian. It was this kind of stuff.

Alex Stamos: In the Twitter discussion of this, the assumption is that when you say Nazi, you don't mean it. It's one of those hilarious back and forths where you're like, I see Nazi content. And then you have all these people jump in, they're like, "Oh, those aren't Nazis." And then the people you're talking about are saying, "No, I am a supporter of the National Socialist German Workers' Party."

Casey Newton: Right, yeah. Exactly. And why does this matter to me? I think it's important to say that I was using this as a proxy for, how does this company moderate content? Because if it would not remove the literal 1930s Nazi content, I had no reason to expect that they would remove any other kind of hateful or violent ideology. And when you think about how many times over the past few years we have seen right-wing extremists use social platforms to post their manifestos, to raise funds to build their movements, that just started to scare me a lot.

Anyway, that's why I was interested in the 1930s Nazis. We took our list of six, we sent it in to Substack a few days later. They came back, they said, "We think that five of the six actually violate our policy." But then, instead of saying affirmatively, we will remove content going forward that expresses support for literal 1930s Nazis, they said, "Essentially, we will continue to review things on a case-by-case basis. We hear your concerns." And so I very much felt like Substack was saying, thank you for your volunteer content moderation. If you would like to continue to see Nazis removed from Substack, flag them to us as they come into your awareness. And I just thought, doesn't work for me.

Alex Stamos: You're trying to run a business here on a platform. Your goal is not to also do their content moderation for them; they shouldn't outsource that to you. Because also, the issue here is that, just like with any other company that has recommendation algorithms, you don't see where your content is showing up next to other recommendations. You have no idea until somebody sends you a screenshot or they put a screenshot up on Twitter saying, "Oh, look. Casey's right next to the Nazis-are-good Substack."

Casey Newton: Yes.

Now, it is also true that we found six blogs. That's not a lot of blogs. There probably are not a lot of consumer internet platforms that have fewer than six really bad things on them. And a lot of the criticism that we would get in the wake of this decision was essentially that we were panicking over nothing, making a mountain out of a molehill.

And I just want to acknowledge that this was an actual tension in my decision-making process. And I would really like to get both of your thoughts specifically on this because these were the options as I saw them. One option is we go, we make a clean break. We go to a platform that has what is in our view a more robust policy against putting us next to horrible things, that has way fewer social features that we feel like are going to get us into trouble. Or I say, "It's only six blogs. Everybody relax. We will keep an eye on this." And then when it gets to some threshold, then potentially we will make another decision. Which in some ways would have been an easier decision because it would have required me to do nothing. And in some ways would've been a much harder decision because I write about content moderation so much.

I think Platformer probably has the savviest audience about content moderation of any audience of any publication in the world, except for maybe the Stanford trust and safety journal that comes out quarterly. But except for that one academic journal, I'd put the Platformer audience up there with anybody, right?

Alex Stamos: Absolutely.

Casey Newton: And so they're emailing me, they're calling me out on social media. When they hear some tech bro say, "Free speech is good, let's fight it out in the marketplace of ideas," they are not fooled. They had that argument before when they were freshmen in Evelyn's law school class, right?

Alex Stamos: Right. Or when they're in a conference room at YouTube, right?

Casey Newton: Yes.

Alex Stamos: Arguing it out with engineer bros.

Casey Newton: Completely. They've had these fights. They are not swayed by any of that. But again, I also have readers that are just generally interested in tech and don't understand why I am making such a big deal out of six blogs. So this is where I really do want to turn the microphones to you and say, you're in my position. You've just completed this review. Substack has just told you this thing. What do you guys do in that situation?

Evelyn Douek: Well, actually, if I can just answer your question with another question. I think knowing what the alternatives are is pretty important to this question as well. Clearly, the alternative is not, you just take your blog off the internet, you're done, because that would be a huge loss ... the social cost of that would outweigh the problem here. But you move to Ghost, and I have looked at Ghost's content moderation guidelines and their acceptable use policies, and they're pretty sparse too. They're not extensive, they don't talk a lot about what they're doing. I can't see a lot of transparency around this. They definitely do prohibit calls to violence. But on the other hand, so does Substack. If you go and read its acceptable use guidelines, they also have an incitement policy. And so you suggested that this is going to be a more robust ecosystem for you and I'm just curious why you think that.

Casey Newton: Great question. First thing was just basically, I get on the Zoom with the CEO of Ghost.

Well, actually, let me take a step back. There are two flavors of Ghost. Ghost is operated by a nonprofit. It is open source software. You can use it without permission to build things on the internet and Ghost will not make any money off of that. That is one flavor of Ghost. It's not dissimilar from WordPress. And then, like WordPress, Ghost has a hosted service, called Ghost Pro, where they do a bunch of stuff on your behalf, including sending out a lot of emails if you're something like Platformer. And then there is a slightly elevated set of policies that they're going to apply.

So when I get on the Zoom with the CEO of Ghost, whose name is John O'Nolan, I say, "Hey, what is your policy about Nazis on Ghost Pro?" He says, "Nazis are not allowed. If we see Nazis, we remove them." I just want to say, that is more than I could get out of Substack. So it was, if nothing else, Ghost was going to be a better intermediate home for Platformer than some of the alternatives that we looked at.

The second thing, to go back to this, where are you in the stack conversation, is that Ghost doesn't have a social network. They do have a recommendation feature, so you can recommend any blog. But at least for the moment, Ghost doesn't really have a big network in the way that Substack does. Maybe they will eventually. Maybe this does become an issue eventually, but I just feel like at the moment, nothing I write on Ghost is going to show up next to a Nazi blog. And also, the Nazi blogs aren't allowed. That was enough for me to say, this will be the next right home. There are options that would be even further into self-hosting, but I felt comfortable stopping one step short.

Alex Stamos: I'm looking at the Ghost content policy right now. Like Evelyn said, it's not incredibly specific. I made this comment, actually, about Substack in the early days. You always see startups with content moderation policies that have these really broad, be-good-to-each-other rules. In Ghost's case it's, "Don't be a dick. You shall be judged to have been a dick when a group of your peers have deemed that you were being a dick. If after being warned you continue to be a dick, you will be banned." And they do have these examples, including sexist, heterosexist, racist, otherwise hateful remarks, which is clearly where the Nazi stuff falls, under hateful speech.

This is not that far from what Substack's original was. At the time, I said, this is the kind of cute thing you get to do when you're a startup. And then you look at Twitter or Facebook's policies and you can see the thousand-yard stare of the content moderation people who had to write them; they have been through hell. And if you look through Twitter's policies, it used to be, at least before Musk, specific things like, you shall not sell human body parts via advertisements, you know? And that's a real content moderation policy. For people who have done it for a decade it's like, wow, you're sitting around a conference room table and you're all drinking and you're like, "Hmm, didn't think we were going to have to write the don't-sell-human-body-parts rule, but I guess we're going to have to," right?

Casey Newton: Totally.

Alex Stamos: And Ghost isn't quite there yet. But at least you're saying that from your conversations with them and their actions so far, at least directionally they're headed towards eventually having the don't-sell-human-body-parts rule. Not that they're going to say, "Oh, we're a free speech platform. We're also a free body part platform," right?

Casey Newton: Yeah.

Let me make this conversation a little bit more difficult. Let's say Elon hadn't taken over Twitter, Substack didn't see the need to build a Twitter clone, and so it hadn't built the social features that it has over the past year or so. Let's say it really was just email infrastructure and there was no chance that any Platformer post shows up next to any Nazi posts anywhere. And they have the same policy, and people create pro-Nazi blogs and they mostly don't remove them, and my, or your customers, if you want to put yourself in my shoes, start emailing you and calling you out on social media saying, "Why are you on this Nazi platform?" All of a sudden, I can't fall back to the argument of, I could potentially end up next to Nazi content on the social network. It's just kind of like there are distasteful things that are using the same infrastructure that I am, and you want me to leave and you're unsubscribing because you want me to leave, but I don't feel like I have as principled a reason.

So in a way, Substack made this much easier for me by building all of this other stuff. And in the likely event that Nazi blogs show up on Ghost, if for no other reason than to troll me and make my life miserable, the argument that I plan to fall back on is, this is actually infrastructure. And in these kinds of cases, I'm going to be less likely to move. But I do wonder if you have a better argument that I can fall back on when that happens.

Alex Stamos: So this brings up, I think, the core of what I wanted to discuss today which is, is Substack infrastructure? And what is infrastructure? To be frank, I've talked about this for years. I've talked about this in 2018 YouTube videos, that the idea of common carrier status/net neutrality was going to run into pressure on content moderation. That somewhere those two things meet and where is that?

Now, there are a couple of things that, to me, lean Substack towards being a platform that has responsibilities. One, they do a revenue share. This is something that I think has been under-discussed in this whole thing: Substack takes money and they just say ... What percentage do they take, Casey? Can you remind us?


Casey Newton: 10%.

Alex Stamos: So they take 10%. They let you have 90%, but they take 10%. So they are directly motivated, unlike some other infrastructure providers. You talk about AWS, AWS is not rev share. You pay them per hour for what you use, or you pay them per byte moved. The business model is up to you. But for Substack, they are highly motivated to get people to subscribe and to give credit card numbers because they're keeping 10% of the money.

So first off, to me, that is a humongous one. And Evelyn and I have talked about this. For YouTube, for any situation in which you're cutting a check to a content creator, that, to me, puts you in the same place as The New York Times and some other ... Maybe not quite at that level, but you're much closer to The New York Times than you are to Amazon Web Services to me, right?

Evelyn Douek: Yeah. Well, and the other thing that it raises, Alex, I think that's really important, is that booting these people off the platform was not the only option that Substack had. You can talk about free speech and the importance of, we don't burn books here and if people want to have Mein Kampf and read it or whatever, we believe in these free speech principles, but this is not a binary decision. There's a whole bunch of tools that platforms can use. We talk about this all the time as well. And preventing them from accessing revenue sharing is one option that they had; removing them from recommendations is another.

This is the lifecycle we've seen all of our platforms go through over the last half decade, where we're getting more nuanced conversations about this. And we're realizing that this is not just a take down, leave up conversation. And it's the lack of maturity of even thinking about those other options, and pretending that it's all a free speech question, I think, that was really galling to me.

Casey Newton: I really appreciate that. If I could just say too, while I do think that the Substack founders acted from principles, and maybe I'm projecting because this is more vibes-based, but I really do just get the sense that everything that you just described, Evelyn, which I would have loved to have seen them do and which might have created a world where we could have stayed on Substack, just seemed exhausting to them, you know? And you see this at the core of so much resistance to content moderation: people just understand that it will be tedious and unsatisfying, and so they choose not to engage in it at all until they absolutely have to. But as Substack found out, that comes back to bite you.

Evelyn Douek: They're not wrong. It is exhausting. It is exhausting-

Casey Newton: Exactly, it is. It is.

Evelyn Douek: ... and tedious. But the whole vibe, as you say, was, they just wanted this to be simple. And I teach the First Amendment, I study free speech. There is no simple rule. There is no simple answer here. We'll argue about this for the rest of time.

Casey Newton: Totally.

Evelyn Douek: We've argued about it for centuries. We're going to keep arguing about it. You don't just come up with the perfect rule and be like, "I'm going to go home now. I've solved this problem." There is no simple answer. And the fact that here in the year of our Lord 2024, we still have platforms wanting this to be a simple question is crazy to me because every platform thinks it could be a simple question, or we're just going to be the ones that don't get our hands dirty on this and don't really think about it very hard. And every platform learns, oh, I'm sorry, it turns out free speech is hard.

Alex Stamos: But I think, not to psychoanalyze these guys too much, it seems to me from my perspective that a lot of their feeling that they are different comes back to the founding of the company and their basic theory that advertising is the core of all evil. That has been the thing that Substack has talked about over and over again. And this is just a total bullshit theory, I'm just going to say. It has been proven wrong and wrong again. Even if there's a bestselling book from one of Evelyn's old colleagues at Harvard that turns out to have a technically incorrect statement every third page ... I've got to be careful. I've got enough trouble as it is already without picking too many fights. But people at Harvard hate me, so it's totally fine.

But anyway, this theory honestly came out of the left-leaning media, out of the post-2016, New York Times-led techlash: advertising is the root of all evil, and if we didn't have advertising online, everything would be better. It was just never true. There was never any empirical evidence for it, nor did it ever make sense, because we have had Nazis for way longer than we've had online advertising. You always had crazy people doing stuff. And this idea that algorithms plus advertising, if you mix it all together, is the only reason that there are bad people was never going to be true.

But Substack bought into that and they bought into this theory. And I think it's really hard for them personally to step off of this circa 2017, 2018 theory that empowered their platform because it really means questioning some of the fundamental assumptions they had when creating this product.

Casey Newton: Yeah, it's true. And they did share with me in our conversations, they really feel like one reason the right-wing extremism problem never got that bad on Substack is because Substack is different, because it doesn't have advertising. That is an article of faith for them that I do not understand at all.

Alex Stamos: Which goes also back to those original decisions from Substack.

Another reason I feel like they have much more responsibility than Amazon Web Services is that when they started Substack, they explicitly picked out people that they wanted to attract to Substack and they gave them money. That is The New York Times. When you find somebody and you email them and you say, "I'm going to pay you several hundred thousand dollars a year to write a column," that is called a newspaper, whether or not you actually have a paper copy. And that set the tone.

The people they chose, including an independent journalist who has personally said things about me that aren't true, that some people would claim are defamatory, they chose those people and paid them a bunch of money. And that set the standard of, who do you want to attract to this platform? And years later, even though they have not done that for years, they're still living with the consequences of the kinds of people they picked, because these are conspiratorial-minded voices. These are people who believe that any kind of content moderation is censorship. And if you attract those as the core of your audience, then those will be the kinds of people that are attracted on an ongoing basis.

Casey Newton: Let me say one thing about that. So Chris Best is the CEO and Hamish is his co-founder, whose title, I believe, is now chief writing officer. These were the two that I was interacting with during this whole thing. And I do believe, because it is observable from everything that you've just said, Alex, and it's observable from the way that they handled this Nazi situation, I must believe that on some level they are thrilled to be fighting a culture war. I do believe that, because if you didn't want to fight the culture war, you could deescalate this in so many ways.

And one of the reasons I felt like I had to leave was that I felt like I was being drafted into a culture war where I was forced to take Chris and Hamish's position about content moderation, which is a position I do not share. But if I did not share it, it did not make any sense for me to be on the platform. So that just really confused me because it is a recipe for the smallest version of Substack's business. One where anybody who is even a little bit left of center I think is now just going to be asking themselves, what am I doing here?

Alex Stamos: It does feel consistent with the overall, the creation of a reactionary MAGA-ish, right wing venture capital/Silicon Valley, that these guys are at least adjacent to that world.

Again, I don't want to ... you're talking about a small number of really big voices, so we don't want to ascribe too much. But it used to be just Peter Thiel, and now you've got venture capitalists from very well-known VC firms talking about, maybe elections were stolen and mRNA vaccines rewrite your DNA. Stuff that would make me, if I were the person managing funds for the public employee retirement fund of Utah, probably not give these guys a billion dollars as an LP if they believe in these kinds of things. But that's not my job.

So the people I want to contrast this with, and I think the other problem that Substack has here, though maybe, like you said, they're excited about it, is that when you take a stand like this, you attract those people to your platform. And this is a problem that Cloudflare had to deal with. Cloudflare made a principled decision that we're not going to take down The Daily Stormer. But because they publicly said, "We're not taking it down, we are allowing this Nazi newspaper to exist," all the Nazi newspapers flocked to them. And this is the classic Nazi bar problem: if you don't take a stand on some of these groups, they will flood your platform. And so Cloudflare actually ended up reversing their decision, because they did not want to be in this culture war, in a way that Substack seems to be excited about attracting the fringes to their platform.

Casey Newton: I'm glad you brought it up because again, this also went into the calculus to leave, which is, now that they have said that there are at least some cases where we're going to let pro-Nazi content stand on the platform, what are the odds that every Nazi who wants to write a blog isn't going to at least show up and try? And for journalists, we're always looking for an easy story. It's like, oh, why don't I just use the Substack search bar and see how many Nazis I can find today? That just becomes a beat for anybody who wants to take it. And then every single time it happens, it's a fresh cycle and they're asking all of the non-Nazis on the platform, what are you doing there? So it's like, I absolutely was going to opt out of those news cycles.

Alex Stamos: So you heard it here first. If you want to be the next Casey Newton, if you're a 22-year-old, you've come out of journalism school, you could start todayinnazis.substack.com. You could talk about Substack Nazis on Substack because Substack, by definition, will not take your blog down for having Nazi content.

Casey Newton: If there's one thing you two know well, it's that there's always going to be jobs for volunteer content moderators on the internet.

Alex Stamos: Totally.

Evelyn Douek: I'm glad you brought up the example, Alex, of Cloudflare, though, because I think it's important. And I don't want to be at risk of oversimplifying this conversation, as Casey was urging us before to make this more complicated and go back to this question of infrastructure and the difference between different platforms. I genuinely don't want to be on the record, or misunderstood, as being in a world where I say, every platform in every part of the stack should be removing Nazi content, should be removing Mein Kampf. I think Mein Kampf has important historical value. I don't want all copies of it burnt or removed from libraries. And so I have to think about what that means for the internet as well, and what's the equivalent of that on the internet.

But Cloudflare's handling of this has been pretty different in many ways from the way that Substack dealt with it. And I think one of the things that's coming through here is there's been a breach of trust, I think, with the users. And a lot of the thinking that I do around content moderation is about this relationship of trust between users and platforms and the idea that there should be transparency and due process and things like that. And there was just this feeling with the Substack guys that I just don't know what they're doing. I don't know what their rules mean. Suddenly this content was outside their rules, but now it's inside their rules. They're not changing the rules, but the content's gone. There's no transparency report. I don't really understand what they're going to do in the future. Cloudflare, by contrast, does release a transparency report. I don't know how often, but it releases transparency reports. It tells us how much content it's removing, how it's responding to government requests, that kind of thing.

And so I think there's also this question of, regardless of what you think of the underlying substantive question, the idea that there's just a complete lack of transparency. And the fact that the Cloudflare, sorry, the Substack guys wouldn't own their decisions. This is what came up in the Nilay interview as well: they just wouldn't tell us what their rules were. They just wouldn't tell us whether they would take this kind of racist content down. And I think that that lack of trust between the platform and its users is pretty damaging. So I can totally understand from that perspective, Casey, why that would be a problem for you.

Casey Newton: Totally.

So for what it's worth, I actually gave them the advice to do a transparency report. And they were more open to it than I suspected because they actually think it would help them with the intellectual dark web too. Because everyone loves transparency, whether you're a Republican or a Democrat.

Alex Stamos: It's interesting, they are a relatively small company. I think the other thing we have to consider is, from a cultural conversation perspective, they are way more important than their revenue size. And so that is the other thing, to give them a little bit of credit here: they only have a hundred-something employees, I think, and are probably losing money hand over fist. Which is not a good ethical reason, but it helps you understand, from a business decision perspective, why they do not want to be in the position of creating a content moderation council or doing all the work that Facebook had to do when Facebook had hundreds of millions of users and was on its way to being one of the world's largest companies.

Casey Newton: Totally. And I just want to say again, my expectations of them were so low relative to my expectations for Facebook. It was like, I was trying to find the literal floor of their content moderation guidelines.

To bring up an example, I think Richard Spencer on Substack is an interesting case because the reason that you would ban him from Substack is essentially for off-platform behavior. Substack, being young and small, does not have an off-platform behavior policy. Most companies don't. They're becoming more popular now. These policies are a rich subject to explore for any PhDs who are listening to this podcast, I think. It would not have felt very principled to me to leave Substack because Richard Spencer had one, even though I think there's probably a really good reason to remove him. But it's like, well, at this stage of their lifecycle, most of these platforms just aren't taking into account off-platform behavior. Now, the flip side of that argument is, it turns out everyone has an off-platform behavior policy. You go and shoot up a grocery store, Facebook is going to take down your account.

So anyway, I don't know.

Evelyn Douek: Can we talk about Stripe? I think this is an interesting-

Casey Newton: Yeah.

Alex Stamos: Yeah.

Evelyn Douek: ... part of this problem too. So how does Stripe come into this story, Casey, and tell us your thoughts about it.

Casey Newton: I acknowledged this on Hard Fork. This was where I was a little bit edgy because, for the most part, I know that it's ... well, it's not considered good etiquette to demand that the payment processor intervene. But I did have this legitimate journalistic question, which was: so Stripe, you have a policy against supporting violent movements. One of your customers has just said that they are not going to remove literal Nazis from their platform, and Substack is now going to be collecting 10% of this Nazi revenue. Does that square with your policies, Stripe?

I think that was a valid question. I didn't write the post that was, shame on Stripe, or whatever, but I thought it was worth asking. So that's why I dragged Stripe into it: because Substack's policy was so far beyond the pale, so outside the mainstream, that I was like, can you even do that? Their most important relationship outside their relationship with their writers is with their payment processor. And as far as I could tell, they were basically giving their payment processor the middle finger.

Alex Stamos: This is an interesting question. I think from my perspective, Stripe is much closer to infrastructure, right?

Casey Newton: Absolutely.

Alex Stamos: I don't think there's a hard boundary here, infrastructure or platform. It is a continuum. As of 2024, all these things are a continuum. But to me, absolute infrastructure is the credit card networks; those are basically highly regulated oligopolies. Visa, MasterCard, American Express, you can't get around those folks. They have special exemptions from monopoly rules, through the PCI Security Standards Council, for their security standards and such. And the flip side of that is, something that's an oligopoly like that, I think, has to be a common carrier to a massive extent. And the way people have used Visa and such to go after porn companies, I'm actually against, because you're striking at a situation in which you don't have any alternatives.

Stripe is right there in the middle, and there's a pretty decent number of payment processors. The difference is, one, from their perspective, the content is not in band. Everything other than direct fraud and money laundering is off-platform for them. Now, that doesn't mean they should have zero off-platform rules. I don't think you should be able to buy human beings. I don't think you should be able to buy body parts. I don't think you should be able to buy child sexual abuse material using Stripe. But for this level of content moderation, I think it would be reasonable for Stripe to say, this is so far out of band of what we do. You are asking us to opine on something where, in this case, you are bringing us the data, but it is normally impossible for us to know what's going on. Our focus is going to be on the things that are either in band, or off-platform things that are so incredibly heinous that everybody agrees that they are. And generally, that means that they're illegal.

Casey Newton: That makes sense to me. In the event that ... well, I guess in the event that we wound up in, which was Substack removing some, but not saying affirmatively that it was going to ban all Nazi material from now on, I think a good outcome for me would have been someone at Stripe reaching out to someone at Substack and just having a conversation. Just saying, "Just as a reminder, here's what our policies are. And as you use your viral social network machinery that you have built to build monetization infrastructure for literal Nazis, when we start reading ... if there is a Nazi who gains a hundred thousand readers and starts making half a million dollars a year using Stripe, we are going to have a conversation about that. That actually is illegal." That's what I wanted to happen, and what I think still could and should happen if things play out as I fear they might here.

Alex Stamos: I think, though, back to Evelyn's point, the appropriate move here would have been Substack saying ... I think the appropriate middle ground, if I were there, my recommendation to them, would be to have a demonetization policy. That would take Stripe out of the question. Because you just say, "Okay, great, we believe that these people should be allowed speech, but we're not going to recommend you and we're not going to allow you to be monetized, because we disagree with the speech. Us taking a cut, even if it doesn't say that we're endorsing the speech, means that we at least think that speech is within the Overton window if we're going to get 10% of the revenue it generates." Which would have taken Stripe out of it.

This is also where I think it's different than an AWS, in that one of the questions you have to ask here, on the platform-infrastructure dimension, is how much knowledge they have about what's going on, just from a totally practical perspective. And for AWS, if it's just, you've got 10 instances, they theoretically can look in and figure it out. But it's not like the normal business. Whereas Substack is taking the content and mailing it to people. They're taking the content, they're looking at it in algorithms, and they're using it for the recommendations. And that's where I'd put AWS and Stripe, and to a certain extent, Cloudflare. That's where it puts Cloudflare in a weird place here, just based upon where they sit as an intermediary sharing the content. But I would put those three as quite different from Substack. Looking at the content and using what the content is for a monetization strategy is not how those infrastructure providers work.

Casey Newton: That's a good point.

Alex Stamos: I think you made the right decision based upon the feedback you got here. And also just directionally, as somebody who knows what's going to happen from looking at their decisions: this is not going to be sustainable.

Now, whether Substack is sustainable as an ongoing business is an open question beyond what's going on with you. But if they exist five years from now, I can't imagine that this is what their platform policy looks like, because they're about to live through the Nazi bar problem of attracting the worst of the worst and that becoming a significant percentage of their revenue.

Casey Newton: I find this so validating to hear because I do feel like in leaving I had to make a bet, essentially. And the bet was, I think this is going to get significantly worse. But again, the problem is small enough now that it feels, in some ways, like a long-odds bet. But what you just said, Alex, I do agree with. Which again is, well, if you're a Nazi and you want a blog, why wouldn't you try Substack first?

Alex Stamos: Absolutely.

Well, and now people are reading this stuff because, and this is the weird Heisenberg uncertainty problem here, now that we're paying attention, it changes the situation. Now that you're reading about it, you've got people who are Richard Spencer fans who are like, "Oh, I didn't know I could go subscribe to his Substack." Like, "Oh, now I know where I can go for this kind of content."

And maybe they will continue to kick off the 1930s Nazis. But the entire ecosystem of people who are adjacent to that, clearly they're not going to take any stand on them. And if you want to subscribe to that kind of content or you want to monetize that kind of content, this is going to be your platform.

Casey Newton: All right. So when Nazis build blogs on Ghost and people start calling me out on social media saying, "Oh, well, well, well. If it isn't Mr. High Horse back on another Nazi platform, are you going to leave that one too?" How do I approach that question-

Alex Stamos: So you're talking about Ghost Pro, right?

Casey Newton: ... mentors?

Yeah.

Alex Stamos: Because I think-

Casey Newton: [inaudible 00:41:27].

Alex Stamos: ... already you see people calling you out on Twitter and in a certain Substack publication. So one of the things you didn't talk about here, that you talked about on Hard Fork, is the other thing the Substack guys did: they tried to front-run you by basically giving a bunch of information to a Substack. Let's not name it. I don't like naming it because they are obsessed with a colleague of ours, to the point where they've run thousands of words about her and done hours of YouTube videos about our colleague. And so I don't want to promote people who are trying to drive harassment and death threats to our colleague. But that same outlet was also the one they turned to, to try to get ahead of you by giving them information to undermine you before they even spoke to you. Is that correct?

Casey Newton: Yeah, that's right. It was a super shady thing to do. For whatever reason, I haven't even gotten that mad about ... maybe it was just because by the time that happened, I was already one foot out the door. But yeah, that is another aspect of the story.

Alex Stamos: But just from a customer relations perspective, that's a totally reasonable reason for you to leave. Oh, you're trying to get people to attack me by using ... You were following the confidentiality rules. You were very careful in everything you wrote. I don't see you directly quoting them. Obviously you had off-the-record or on-background discussions with them, and you were following the rules, and they were breaking those rules. From a customer service perspective, why the hell would you share 10% of your revenue with people who did that to you?

Casey Newton: It didn't feel good. When I started, 10% of my revenue was not very much. But it's now at a point where we can absolutely just hire a junior staffer with the money we are going to save by not being on Substack.

Alex Stamos: That's awesome.

So you're asking about Ghost. I just want to point out, that outlet was then very disingenuously attacking you, because Ghost is an open-source project that could be used for anything. So again, I use the Linux kernel, and I also know Nazis use the Linux kernel, and that's just the life of open source and that's fine. But for Ghost Pro, if they make the same decisions, I think the interesting question here is, one, do they make recommendations? Do they have an editorial voice by choosing who is on their platform? So if Ghost goes out and decides, we're going to get these really edgy writers and we're going to give them hundreds of thousands of dollars to bootstrap our platform, and then you disagree with them, then leaving seems totally reasonable to me. Just as it would be reasonable to leave The Atlantic or The New Yorker or The New York Times for the same reason.

Casey Newton: I don't think that they're going to do that. They don't really recommend. They do have a homepage that has a variety of publications on it. And some I'm politically aligned with, some, I'm not. But right now it doesn't feel like the thing that was really worrying me, which was, I guess, everything we've been discussing today.

Evelyn Douek: Obviously I'm the lawyer on the podcast, I have to give the lawyer's answer of-

Casey Newton: Please.

Evelyn Douek: ... well, it depends on the situation and the content and everything else. But just starting from a baseline of, you can't have a purity test. Every platform, as we were saying before, has bad content on it. All of us use platforms every day that have terrible content on them, absolutely. I don't see it, but I'm sure that it's there. I'm sure that every single one of our listeners uses platforms every day, probably to listen to this podcast, that have terrible content on them. Nazis use email, I use email. This podcast recording platform, I'm sure, is also used for terrible content as well. So you can't have a purity test. There's no moral choices under capitalism.

Alex Stamos: There's stuff I hate on law.stanford.edu.

Evelyn Douek: That's right.

I'm sorry about that. I know you disagree, Alex, but come on.

Casey Newton: The reviews that are written for Evelyn's latest paper.

Evelyn Douek: Exactly. My stunning review of my work.

So you can't have this purity test. That's not possible. And I do think we should try to be more nuanced about this, try to have these actual conversations about what the features are, and also, what Ghost's engagement with this is. One of the things that was really tough about this with Substack is they have these rules, but I don't know what they mean. Is it clear from Ghost what it's doing? Is it actually applying its rules, those kinds of things? That's what I would want to know when making those decisions. But I don't think that we can live in a world where we all just keep moving from platform to platform until we find some utopia where we all just exist in this lovely little bubble where there is no bad content.

Casey Newton: No, I appreciate that. That makes sense to me. I will say, I also stopped posting on X for a quite similar reason. The antisemitism got to a point where I was like, it feels awful to be here. And so I have abandoned two platforms in the past year. And I hate doing it for exactly the reasons that you raise, Evelyn. These purity tests, they can get out of control. But one of the things I've just been trying to figure out over the past year is, what are the red lines? And for, really, the first time in my life as an internet user, there have now been two platforms where I was like, I actually just have to get out of here. So it's been a really unusual period.

Alex Stamos: Well, for me, I left X. And I hate doing it because I hate ceding the field to the bad guys, because it does feel like letting them win. But I felt I had to do it because the most milquetoast cybersecurity post I would make on X would have 50 blue-checkmark fake accounts attacking me, because they have gotten rid of any kind of protections against organized influence operations. I have spent the last seven years shutting down Russian, Chinese, Iranian, and other troll farms, and those guys are going wild. And certainly I, and some of my other colleagues at Stanford, are on a list where, no matter what we say, we're going to get death threats, we're going to get all this stuff. So it became unusable to me. I think it's different than Substack. Substack was usable for you. It's not like you were getting a bunch of death threats via Substack, right?

Casey Newton: Yeah, absolutely. It's just, as you know, it feels like with every passing year, the idea of content moderation grows more and more controversial. And we're now starting to see some platforms lean into the idea of, "Well, what if we just didn't moderate at all? Let's see what happens. Let's run the experiment and, whatever, free country. You can do it." But it's also leading, I think, thoughtful people who care about governance and civic life to make other choices.

Alex Stamos: And I should point out, because this podcast will probably get mischaracterized by that Substack that we're not mentioning, that not a single person on this podcast has said that there should be any legal repercussions for Substack. We all believe that Substack has a First Amendment right to decide. If they decided to host Richard Spencer and to revenue share, they can do it. My personal position is, Stripe should let them do that too. I don't think Stripe should do content moderation at that level. But this is your First Amendment decision of what platform you're going to publish on and who is going to benefit economically from your output. And it is very American for you to have the freedom to choose whether you're going to publish in the Baltimore Sun versus The New York Times versus the Los Angeles Times; this is the modern equivalent of that.

Casey Newton: It's interesting, I haven't read all of the criticism of me. One of the things that you learn about these people in particular is that nobody writes more, just, words by volume than they do.

Alex Stamos: It's amazing.

Casey Newton: It's just amazing what they can put out. So it's like, I'm not going to put myself through that.

Alex Stamos: Well, and as a number of people have pointed out, there are a number of Substackers who would have benefited from having editors. Some people who used to have editors, who used to be part of larger publications, where you're now, oh my God, I feel incredibly bad for their editors, because it must have been an incredibly terrible job to cut down their 8,000 insane words.

The other person we have to give a shout-out to here is Bill Ackman, who will not stop writing 5,000-word tweets about how his wife is not a public figure and how her ties to Jeffrey Epstein and her potential plagiarism are things that people should not be discussing. Which he follows up with another 5,000 words about how people should not be talking about his wife's ties to Jeffrey Epstein or having this discussion. It's the same kind of group of people, where somehow they think they're getting paid per word, or the amount of verbiage makes them look better. I don't understand what the whole theory is.

Casey Newton: It is a huge amount of words. The thing that I just want to say, about what I've been interested in, is that very few people are actually arguing on the question of, what should Substack's policies be with literal 1930s Nazis? And I think that's because maybe on some level they find it indefensible, or at least they don't want to try to make the argument. So instead, they just try to ridicule me as the weakling who ran away from six Nazi blogs. And I guess it actually heartens me that that is the best argument available to them, because I think over time it will not actually be just six Nazi blogs. And if my bet is right and a year from now we're having a conversation about 600 Nazi blogs, I will have felt very good about leaving when I did.

Alex Stamos: I think one of the issues here, just from a timing perspective, from a culture war perspective, is that your controversies happened at the exact same time that the exact same people are saying that the presidents of three universities should be fired for having nuanced positions on what is antisemitism. So it is very hard to say that Claudine Gay should be fired for saying that it depends on the context of calls for genocide, and that that's compatible with the same kind of position around Substack. And so they can't defend at the same ... even these people who have very little shame, it is hard for them to make both those arguments at the same time. So it's better to tease you about it when you're making a completely reasonable, logical choice.

And I think, honestly, it's going to be ... I'd love to hear back from you in six months, either on Hard Fork or here. I'd love for you to do a publication on what it meant economically for you to make the swap. Just from a business perspective, it'll be interesting to see the numbers there.

Casey Newton: Totally. Every September I do an anniversary post. For the first three years it's like, what I learned in year one on Substack. Now I guess it'll just be, what I learned in year four of independence. But I do think we'll go into the economics. I think the immediate economics are great, we're saving 10%. The question is just, what does this mean for our long-term growth prospects? And we don't know. But look, there's a reason why I'm going on any podcast that'll have me.

Evelyn Douek: That's right.

Alex Stamos: But you're still paying. Are you doing Stripe directly? How do you handle-

Casey Newton: Yeah. So-

Alex Stamos: Can you talk a little bit just about what you did so that people understand what the transition looks like?

Casey Newton: Yeah, totally. And so we had it so easy. Ghost has a concierge team. The website makes it seem like it's free to everyone. So if you want to try this, give it a shot. I think it's just ghost.org/concierge. We got in touch with them, had a call with them, told them what we wanted to do. They walked us through it. There were a few things: we had to disconnect Stripe from Substack, we had to reconnect Stripe to Ghost, we had to export all of our emails, our free subs, our paid subs. They did a little bit of magic on their end. They have templates; Platformer's using a very basic template on the website right now. We're hoping we can spiff that up sometime in the next couple months. But it was very easy. It mostly just felt like we snapped our fingers and it was done, and now I just type in a different box and that's it.

Alex Stamos: And who's handling payments then for you?

Casey Newton: Stripe. Stripe. Stripe is still handling our payments.

Alex Stamos: And so I'm a subscriber. Nothing happened. It's just that Stripe no longer sends 10% to Substack. So somehow you register that you're no longer on Substack and your Stripe account converts to something that's completely independent, is that the idea?

Casey Newton: That's right. I've heard two versions of this. One is that you can email Substack and they will disconnect it on their end. Ghost also shared with us that after they import the customer relationships from Stripe, they essentially recreate that relationship in a way that severs the tie to Substack. So we did it that way. But I had talked to people, and this was years ago, I'm sure it's very much different now, but I talked to somebody who moved off of Substack and Substack was still taking a cut. He had to hire an engineer to figure this out for him. Compared to other platforms, it's very easy to leave Substack, but there are some steps you have to take.

Alex Stamos: Interesting.

As a cyber guy, hearing, "Oh, you just send an email and where the money goes changes," terrifies me. But I'll leave that for another podcast for somebody to red team.

Evelyn Douek: Well, Alex is a cyber guy, I'm the law professor. And I'll say, it warms my heart to hear Alex describe your move as your First Amendment right. Because, of course, that's absolutely correct. And I think a theme of this conversation has been, these decisions are really hard and this is not an end state that you arrive at. It's a conversation that you keep having. Free speech and a healthy ecosystem is something that we all do every day through our continued actions and our continued conversations. So I look forward to continuing to have those conversations, whether it's in six months or a year's time when you have to make ... exercise your First Amendment rights to either stay or go or do whatever as this continues to unfold. It's been a real pleasure, Casey. Thanks so much for coming to talk to us.

Casey Newton: It has been a thrill to talk with you guys. I truly love this podcast so much, so thanks for having me on. And call anytime.

Alex Stamos: Subscribe to Platformer, folks. Listen to Hard Fork.

Let's give Casey a little help here, Evelyn, with our-

Evelyn Douek: It's rough out there for these little startups.

Alex Stamos: Our podcast listened to by 17 law professors and content moderation experts at Facebook. Let's give Casey a boost.

Evelyn Douek: That's it.

And with that, this has been your Moderated Content episode for the week. The show is available in all the usual places, including Apple Podcasts and Spotify. Show notes and transcripts are available at law.stanford.edu/moderatedcontent.