Moderated Content

Meta Reinstates Trump's Accounts

Episode Summary

Evelyn sits down with Nate Persily, Professor at Stanford Law School, and Alex Stamos, director of the Stanford Internet Observatory, to discuss Meta's decision to reinstate former President Trump's accounts. Nate is pragmatic, Alex is cynical, and Evelyn is a naive little formalist about it all. Here are their quick takes.

Episode Transcription

Evelyn:

Hello and welcome to Moderated Content, Emergency Edition. Alex, do you have your soundboard there? Can you play any dramatic sounds for our podcast? Okay.

Alex Stamos:

Is that dramatic enough?

Evelyn:

Yeah. Do you have anything else? No. All right, that'll do.

Hello and welcome to Moderated Content, Emergency Edition. We're recording live as it happens on January 25th at around 2:30 PM Pacific Time. I have Professor Nate Persily, my colleague at Stanford Law School, here, and Alex Stamos from the Stanford Internet Observatory, and we are recording in the immediate wake of Meta's decision about what to do with former President Trump's account.

Now this decision was long awaited. Two years ago in response to the oversight board's decision, Meta placed the question on ice for two years and the self-imposed deadline for a new decision was January 7th. So if anything, this is a little bit late. And they released a blog post this afternoon saying that, "After the two year period has expired, our determination is that the risk has sufficiently receded and that we should therefore adhere to the two year timeline we set out, and Trump will be getting his Facebook and Instagram accounts back in the coming weeks."

They have said though, that they are doing this with increased guardrails and that in light of his prior violations, he now faces heightened penalties for repeat offenses. These are penalties that will apply to other public figures too, and that if Mr. Trump posts further violating content, it will be removed and he'll be suspended for between one month and two years depending on the severity of the violation.

So we're going to dig into more sort of specifics here as we go on, but I mean, just quick high level take, Nate, does any of this surprise you?

Nate Persily:

Well, I don't think the final result surprises me. I think most of us thought that he would be reinstated. The question is what these conditions are, whether they're real conditions, and how they're going to act on any new content that he posts. For me, there are a few significant little nuggets if you're trying to do the kind of Kremlinology of Facebook here as to what's going on. The explicit mention of advertising in here is actually, I think, a bit of a shot across the bow, that that might be one of the punishments that they see. Because frankly, as much as we think of Facebook in the same breath as Twitter, for Donald Trump they're very different platforms, and the advertising opportunities on Facebook are, in the short term, the most important sort of affordance of that platform for a political campaign. And I would think, as much as he actually hasn't posted on Twitter yet, for instance, that he could end up selling hats and NFTs or trading cards, whatever, as well as other campaign-related stuff through the advertising.

So I got very specific very quickly because that's something that jumped out at me. I think there are the larger issues here about whether this is a kind of ticket for this day and train only, or whether this is a larger policy to deal with world leaders, to deal with incitement, to deal with suspended accounts. I think that it's not, at least not yet. And so the most significant thing is that they're letting him back on and they suggest that there are additional guardrails that they're putting in there and the devil will be in the details.

One thing I would note, and sorry to monopolize at the beginning, is that they talked about him de-legitimizing future elections, not past elections, so that the guardrail that they put in place has to do with what he might say about 2024, not what he might say about 2020.

Evelyn:

So let me just read the language that you're talking about there because I think it's actually really interesting. You're saying they say that the updated protocol that they're making this decision under, so it's a crisis protocol, also addresses content that "does not violate our community standards, but that contributes to the sort of risk that materialized on January 6th, such as content that de-legitimizes an upcoming election or is related to QAnon. We may limit the distribution of such posts and for repeated instances may temporarily restrict access to our advertising tools."

So this is really interesting language because it's saying, "This is not in violation of our rules. We're not even really going to specify exactly what this corpus of material is. It's this idea of borderline content, but we're not going to specify exactly what it is. And then we may do some other sort of action with respect to this." And as you say, Nate, the reference to advertising tools is extremely interesting there. But of course the converse is that Trump does have unrestricted access to the advertising tools in the absence of these restrictions, which is a huge, huge thing for him and the main reason why he would be interested in getting his Facebook accounts back. So Alex, what's your top line reaction to this?

Alex Stamos:

I totally agree with everything Nate said. I think the advertising mention is a huge shot across the bow because Trump was a prolific advertiser on Facebook to create small dollar donations, and it's not really the output of his immediate id. That was always Twitter, although we'll see now that everything's kind of reset, where is his primary platform versus his secondary platform? What places does he post himself, versus his professional staff doing it? I totally see a world in which he's taking Twitter into the bathroom again while people try to nail down the door and he says crazy stuff, but then his Facebook account is controlled by the campaign and mostly used for advertising, pushing videos and stuff like that.

I have kind of a really cynical view of why I think Nate's right on the Kremlinology. You can't just say Meta is doing something or Facebook's doing something. It's like a government: you have all these different groups internally that have very different equities, and a rise and fall of different powers. And something I've said for years is that one of the really concerning things, structurally, about first Facebook and now Meta is that the content policy, the product policy, and the government affairs people all report up to the same person. That happens to be Nick Clegg, the guy who made the announcement here. And as a result, you end up with people like Clegg, and some of the people underneath him, having a dual role: both deciding what is the "right thing" for us to do, the policy of what can happen on our platform, and also trying to keep governments from regulating them.

And from a completely cynical place, this is how I think Facebook is looking at the US. Post-2016, in 2017, and I was part of this, there was a big investigation of Russia. What happened? A big pivot towards integrity, a pivot towards "we are not going to be shy about having our own editorial policy as to content that we think is harmful." And there was a huge, kind of monotonic growth from 2017 on until about 2021, 2022, in the size of the integrity teams and the aggressiveness of policy. They added policy, added policy, added policy. They almost never took anything away.

What's happened since is, in the 2020 election, Facebook did more for election integrity than any company has ever done in this space. They were really aggressive about policies. They were not perfect. But in the end there was still a January 6th insurrection, driven by political actors who were able to use a variety of platforms, including the mass media, including Fox News, to drive people to invade the Capitol. And who gets blamed? It's tech companies again.

So if you're them, you're feeling, from 2016 to 2020, "Democrats hated us. They said that we got Trump elected. We've responded super aggressively." And they did respond very aggressively, I think overly aggressively, mostly on Covid. We could have a whole discussion of that. But they went very far in making decisions about what is and is not allowed during Covid. There were missteps on the Hunter Biden laptop in the run-up to 2020, really aggressive stuff, and it bought them nothing from the Democratic side. Nothing. Joe Biden is President, and you still have Democratic members of Congress blaming tech companies for everything. You have the FTC shutting down ridiculous little sales of stupid little VR apps. The DOJ is going after both Facebook and Google for different things. So you still have a massive regulatory push from the Democrats, but on top of it, because of the actions they took around Covid in 2020, you're also hated by the Republicans.

And I think, just from a completely cynical perspective, they're pivoting back to neutrality because they're like, "If the Democrats are going to hate us no matter what, if we're going to get anti-business stuff from the Democratic side no matter what, then why are we going to do the things they want around doing the 'right thing'? We are going to pivot back to a neutral place where at least you don't have both major political parties in the United States trying to destroy the entire industry." Which is effectively where they've ended up now.

And so the fact that this is happening as the House of Representatives changes hands: there's going to be a big committee that's looking into deep state groomers and tech. You're going to have, effectively, a committee with some QAnon-believing House members on it. So what they're trying to do, I think, is shore up their right side a little bit and then just throw up their hands and say, "We enforce these rules neutrally." The Hunter Biden laptop thing lasted for 48 hours and we're still talking about it. There's nothing more they can do there. The big signaling thing they can do is let Trump back on. So I think this has very little to do with what they think is wrong or right. I think it's completely about domestic politics in the US, and a different faction inside of Facebook winning the war over where they should position themselves in American politics.

Nate Persily:

I disagree with Alex a little bit on this. I think that there was a battle inside the company, and it's no coincidence that Nick Clegg, a former elected politician, is the one announcing this. And I think that any of us who've dealt with them over the last few years knows that they were very uncomfortable with the decision to take down Trump, but they're probably largely pleased with the results, because we've reached a new kind of equilibrium. And I think that they really do mean it when they worry about these elected leaders: for Facebook to allow one party to have a place on the platform and not another? Now that Trump is actually an announced candidate, I think they were genuinely worried about what that would say. Now, maybe that's not completely inconsistent with what you're saying, Alex, because there's a real politics story to that, but I think there's a principle there that they're worried about in taking down these political candidates.

Alex Stamos:

From my perspective, I was there when the Trump transition happened. And what happens inside the government affairs offices of all major companies, really, not just tech companies, is just like the rest of K Street: they fire all the outgoing people, or they demote them, and they promote other ones. So I saw during that Trump period that if you were a Republican, after Trump took power, all of a sudden you were on the rise inside of Facebook public policy. And I think that's what's happening here: now that we have divided government, most of the news over the next year is going to be made by the House GOP. Good news, bad news, they are controlling the narrative and they're going to control the conventional wisdom.

I expect the people who have been fighting this entire time, saying "We've got to let Trump back on," have been newly empowered by the fact that there's also probably a huge team at Facebook getting ready for hearings. And this is one of the cleanest ways for them to prepare. I think from their perspective, there are two things that reduce the risk. One, Elon Musk gives great cover for Facebook to do everything. Short of taking content moderation cues from catturd2, Mark Zuckerberg cannot look worse than Elon Musk.

The smartest phrase here, I think, was Casey Newton's: "letting the inmates out of Arkham Asylum." Musk is like, "Hey, everybody's coming back." Nick Fuentes, a horrible white supremacist, came back for 24 hours. So when Clegg comes out with this very complicated, nuanced statement, you're going to look much more mature. And second, Facebook was never Trump's personal platform.

Nate Persily:

Right, right.

Alex Stamos:

And so if you're letting his people back, it'd be interesting to see if there's any back channeling here that's effectively like, "Look, if your people want to run ads that are generic GOP ads, that maybe some people on this call disagree with but are just normal political ads, that's fine. Do not let Trump just have Facebook on his personal phone." And that might be, if not the deal, at least the assumption between Truth Social and all the other platforms: that most of the stuff accrues to Twitter, where Musk is not going to handle it well, I think we can predict.

Nate Persily:

I actually think Facebook has placed themselves in a box here, because the worst of all worlds is if they take him down within a month or two. Now that they've announced this, I think that they've sort of upset the equilibrium, and it's possible they're going to be accused of even more bias. Because assuming Trump actually uses the platform, which is not a given, they're going to have to decide: how much does he need to de-legitimize the electoral process before they kick him off? How close does he have to nuzzle up to what he did on January 6th before they take action? And how severe will the actions be? I just think that they've thrown themselves into the hornet's nest here a little bit.

Evelyn:

My first reaction on reading this, Nate, is that I think Meta's made a terrible mistake in the way that they've done this, the way that they've written this blog post. I'm a naive little formalist, so unlike both of you, I wasn't engaging in Kremlinology. I was reading the blog post and trying to understand the rationale that they were putting forward for the decision and how they were thinking about this, like the good little Australian lawyer that I am. And I can't see the rationale for this decision really anywhere on the face of it. You, Nate, were talking earlier about whether this is a ticket that's good for all seasons or just for today. They're trying to point to these general principles, this new crisis policy protocol that applies to all other political leaders as well.

They're trying to make it out like, "This is a general decision. We have these new blog posts on how we're going to treat political leaders." But I think you're totally right. None of this in any way establishes general principles. This general statement that "the risk has sufficiently receded" doesn't describe what the risk is. It doesn't describe what factors they are looking at when they assess the risk. Is there a risk in Brazil right now?

Nate Persily:

Right.

Evelyn:

Like how risky do they think India is today? None of these things are clear from the face of the decision, and so they haven't bound their hands or given themselves a clear decision pathway in any way for all of these future decisions that are inevitably going to come up when Trump starts advertising, starts posting. And so every time he posts, we're going to be having the same old arguments that we had two years ago, saying, "Does this one cross the line? Has this been de-amplified? Should his advertising be restricted because of this?" And it's just going to be interminable. I mean, full employment program for us, and expect lots of podcasts, but otherwise relatively boring.

Alex Stamos:

Get the sound board loaded up. Get ready.

Nate Persily:

Yeah. But here is the way the decision appears to have been made: they did consult with outside groups, let's say groups dealing with political violence. They took note of the fact that the 2022 election was basically a peaceful affair, which suggests that we're not in the same world that we were in in 2020. None of these things are going to be like, "Oh, well we think there's a 28% chance of violence as opposed to a 45% chance of violence, so therefore we're letting him on." It's always going to be some kind of expert judgment. One thing I've sort of wondered is, suppose they had made this decision right after the assault on Paul Pelosi? Would it have gone a completely different direction, let alone if the 2022 election had had some violence? It just sort of depends where we are in time and what events might convince them one way or the other.

Alex Stamos:

Well, you pointed out something important, Nate, which was that they talked about new guardrails. And something that wasn't in the original announcement, but is in the Axios interview afterwards, is that Clegg said, "The reinstatement is not immediate because Facebook's engineers have to build new guardrails."

Nate Persily:

Yeah, yeah.

Alex Stamos:

So they're building new reach reduction mechanisms that we've never heard of. I think the important thing here is they still have an opportunity to provide some predictability. That's what has been missing from this entire process: they've never said, "If X happens, we will do Y." And that means Trump and his team can't really predict what's going to happen, but it also means the rest of us and every other world leader have no visibility into where the actual line is that they're drawing. And you have to be careful how bright those lines are. But I think you could say, "If you de-legitimize an election without calling for violence, we will reduce your reach in this way. If you de-legitimize and call for violence, we'll take your content down for the first strike, and with a second or third strike, your account will be suspended for X days." That's something that we should be seeing right now.

Nate Persily:

One thing on the strikes point is that he does not have legacy strikes; he's actually being wiped clean with this new policy. So while he is subject to greater supervision and these special rules, these special guardrails, his status in terms of strikes is the same as you and me, assuming you haven't had any strikes lately. But then the other thing that-

Alex Stamos:

I have special strikes.

Nate Persily:

Yeah, right.

Alex Stamos:

Double super special secret strikes, Nate.

Nate Persily:

Or Lucky Strike, as the case may be. But on these guardrails, I do think it will be interesting to see how they respond, because as we know, he will push up against the line, and the question is will they take action. What I think they need to build, by the way, is the capacity to remove the re-share button and some of those other things. That's not a function that I think they currently have. I don't think this will take more than a month, but we should expect them technically to be able to-

Alex Stamos:

Usually not. Those are the kinds of estimates where, if you're an engineering manager inside of Facebook, you're like, "I can whip this out in two weeks." And then you start to dig into it and you're like, "Oh, this is functionality some Stanford, or I'm sorry, some Harvard junior wrote, that now runs on a million computers concurrently." Writing it is more like a computer science class exercise, but doing in-memory data structures across a million machines in real time, across the world, turns out to be complicated.

But yes, I totally agree. The re-share one is a great one. Facebook has never really had the level of isolation that Twitter would apply in some of these cases, where something will show up on somebody's page if you intentionally go to it, but will never show up in your feed unless you are intentionally looking for it. And I do think that's appropriate here: if he says something that is not an immediate call to violence, you say, "Okay, well, for the historical record, it is now on this page, the official Donald Trump page, but we are not going to push it out to people's newsfeeds unless they go ask for it specifically."

Evelyn:

Yeah, I mean the fact that they're just building these tools now does of course raise the question, why now? This has been two years in the making. If these are the kinds of tools that they think should generally be available to deal with world leaders, and aren't just responsive to Trump, then maybe they could have started a while ago. I think the answer is some version of: yes, they were waiting to see how everything panned out, they were waiting to see what their decision would be in this particular case, and then they're going to build tools that deal with this particular case, and if they happen to be able to use them in other cases, then all the better, I think.

You mentioned Paul Pelosi and the elections, Nate. I also wonder, in a different world, is Twitter the canary in the coal mine here? If Elon hadn't purchased Twitter and reinstated Trump, would we have a different decision today? Was it that Meta can draft in Elon's wake on this, both making it less controversial, but also seeing that so far, at least, Trump hasn't incited a riot again on Twitter? And so maybe they feel safer making this decision. I don't know.

Nate Persily:

I'll say one last thing here, which is that YouTube putatively had the same criteria, which is that they were going to reevaluate every week as to whether there was a likelihood of violence. So now the ball is, as always, in their court. This is a good controlled experiment as to whether you think anything that's happening inside Facebook, the politics of this, bleeds over into these other platforms. It's also one of the misfortunes of not having greater detail as to who they talked to and how they assessed the risk, because that's not something another platform can piggyback onto, but presumably they're all doing that at the same time.

Alex Stamos:

Well, that's an interesting question is how do you assess risk? Is it only on platform or if he says something crazy on Fox News or on Newsmax, does that count?

Nate Persily:

If you notice, they mentioned QAnon in Clegg's post, but that's because he has cited QAnon people on his Truth Social account, right?

Alex Stamos:

Right.

Nate Persily:

And so they are looking, at least, at the stuff that didn't happen on Facebook that could implicate this decision.

Evelyn:

I mean, these were the questions we were asking two years ago, saying, "How much will you take into account off-platform behavior? How much will you take into account what other platforms are doing?" And here we are, two years later, we don't have any answers, and this blog post doesn't answer them. But we are seeing a kind of reverse domino effect. Twitter banned him, Facebook banned him, YouTube eventually banned him. Now Twitter's put him back, Facebook's putting him back, and maybe we'll see YouTube putting him back as well.

Couple of things to note. Nate and Alex and I were all talking about wanting more transparency around the tools and the increased penalties for violations. The oversight board has weighed in today and responded to Meta's decision as well, and that's basically exactly what they've said: they welcome the decision, but they want Meta to provide additional details of the assessments so that the board can review the implementation. One thing that's really interesting here is that Meta has not specified any role for the oversight board going forward with respect to Trump's account, which again, I think means we're going to be engaged in this back and forth as things escalate, saying, "Should they be sending this to the oversight board? Should they not?" It would've been useful, I think, for them to tie their hands in that respect as well and say, "In this case, we will seek particular guidance from the oversight board." But it seems-

Alex Stamos:

And what are they for, if not for, whether you de-platform a world leader?

Evelyn:

Yeah, I mean exactly.

Alex Stamos:

Yeah, and for deciding things like: if the rationale is that he's a candidate, well, at this point anybody can say they're a candidate for President for a major party, because we've had no primaries, we haven't even had people qualify for primaries yet. None of the official mechanisms of the state parties have kicked in. You could have a standard that you have to qualify for a caucus, or for New Hampshire, to officially count, but none of that would work right now. So does this work? Can anybody who's been de-platformed say that they're running for President and all of a sudden get back on? Those are the kinds of details people are talking about inside of Facebook. That is a discussion I guarantee is happening; we're just not seeing it. And they have not taken this opportunity to do better on transparency in this area, unfortunately.

Evelyn:

No. They've released this reasonably long blog post that is a lot of words to say basically nothing. It is entirely unclear how they're thinking about this going forward, and they have left themselves as much wiggle room as possible. Which is interesting, because it also means that they're going to be called on to do things and leave themselves open to criticism again and again as we go forward.

I completely agree about the oversight board. I wrote about this when Trump was banned two years ago. I wrote a post saying, "Meta needs to refer this decision to the oversight board," because currently that is the only mechanism for an account suspension to get to the oversight board. The oversight board can't just pick up the case and decide it itself, which shows a real limit on their authority here as well. So anyway, I guess we're going to be having this conversation a lot more. We talked just a couple of days ago, Alex, about how it seems like Trump is going to stay off these platforms until June, when his exclusivity agreement with Truth Social expires; that's what some of the reporting is suggesting.

Alex Stamos:

Yeah.

Evelyn:

But we'll see if that remains the case.

Alex Stamos:

So Trump has now made an official response on Truth Social?

Evelyn:

Live, it's happening live.

Alex Stamos:

Right. I will not be doing a Trump impression. There's a reason why I do not work on Saturday Night Live. "Facebook," all caps, "which has lost billions of dollars in value since de-platforming your favorite President, me, has just announced that they're reinstating my account. Such a thing should never again happen to a sitting President or anybody else who is not deserving of retribution." Everything past this point isn't all caps. "Thank you to Truth Social for doing such an incredible job. Your growth is outstanding and future unlimited!!!"

So he is endorsing Truth, and he's not saying, "I'm not coming back to Facebook," but he is saying that Truth is his primary platform. Again, it'll be interesting to see how he interprets this. I wonder exactly what that contract says: if he stays on Truth, but the campaign to reelect or whatever is back on Facebook as long as he is not personally posting, that would get him all the value that he really wants out of Facebook without technically violating his contract. So it would be interesting. I don't think we've seen what that contract looks like with Truth, have we?

Evelyn:

No. So, as we come back to where we started, this all could be about the advertising. Trump could just sit on Truth Social and post, that was actually a very good reading, thank you, Alex, a dramatic reading, and send out his campaign ads. Anyway, I guess we know what we'll be talking about for the next few years. Are there any other big takeaways or comments you want to make before we wrap this up?

Alex Stamos:

It's a double emergency podcast, but I think I'm going to give ourselves credit. We talked about Modi being able to censor Twitter and YouTube in India, and now it has finally made it as a big story. So I'm going to say that we are the ones that pulled that from obscurity and turned it into a big deal. But yes, I'm glad that discussion is happening. Like Nate was saying, it's impossible to have the Trump discussion without having Modi in the back of your head.

Evelyn:

Absolutely. Do you want to hear it first? Listen in to Moderated Content, weekly updates with Alex and Evelyn, to know what's coming in the week ahead.

And that is all for this episode. The show's available in all the usual places, and show notes and transcripts are available at law.stanford.edu/moderatedcontent. Really grateful to Brian Pelletier, our producer, who's going to turn this around quickly for us today, and also to Alyssa Ashdown, Justin Fu, and Rob Huffman for doing double time on this. Thanks very much. Bye.