Moderated Content

MC Weekly Update 11/14: Elections and Elon, again

Episode Summary

Evelyn and Alex talk about what the Election Integrity Partnership saw online in terms of mis- and dis-information around the midterms, and what the results might mean for tech policy. And... Elon. Sigh. What a week. Twitter's security team resigned -- what does this mean for compliance with an FTC consent order and... what does it mean for Twitter's security? Elon says they're turning off "microservices." That can't be good. What does it mean? And other exciting developments in the Musk/Twitter debacle.

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Alex Stamos:

The odds of Twitter both being running and owned by Elon Musk, by the time NetChoice comes to SCOTUS is-

Evelyn Douek:

Yeah. Well, we should buy a lettuce basically at this point to see whether Twitter still exists or Elon still owns it while the lettuce survives.

Welcome to the weekly news episode of Moderated Content with myself, Evelyn Douek, and Alex Stamos. I would just like to note that we are recording at 10:43 AM Pacific time on Monday, November 14, and as of the time of recording, Twitter.com is still up and running, but we make no assurances that that will still be the case by the time you're listening to this.

Alex Stamos:

Right. Up to the moment, we'll keep on updating you through the podcast as to what's happening.

Evelyn Douek:

Exactly. Which lawmaker will Elon offend in the next 30 minutes? Okay. I think we have to start with the midterms, because I do think that's a better, more consequential place to start. So Alex, you ran this war room during the midterms with the Election Integrity Partnership, following what you saw online. So tell us the general play.

Alex Stamos:

Yeah. Just as in 2020, our team at the Stanford Internet Observatory worked with the University of Washington Center for an Informed Public, and we ran the Election Integrity Partnership. We had a war room in the basement of a building here at Stanford with students eating a huge amount of simple carbs. So one of the things I learned was that I do not have the pancreas of a 21-year-old. It turns out I felt horrible the next day without a drop of alcohol, just eating like an undergrad for a day. So it was a busy day. The amount of content we saw that you could call some kind of election disinformation, I think, was actually larger than what we saw in 2020.

But overall, the effects were much more muted. A couple of things we saw: one of our theses coming into 2022 was that it would be a much more spread out environment. Post-January 6th, there's been kind of a fracturing, especially on the right, of the platforms that people are hanging out on. And that turned out to be true. There were things that people were saying that were not true on Twitter and Facebook, but for the most part, on the big platforms, you didn't see things like calls to violence. You didn't see people threatening individuals or doxing individuals. We saw a lot of that on the alternative sites: the [inaudible 00:02:13] Parler, Truth Social, and especially Telegram. For me, the big platform of the night was Telegram.

There are hundreds of channels that become these very effective echo chambers where everybody is kind of riling the other folks up, in which you saw individual poll workers being doxed. You saw calls for protest, including bring your guns, defend your First and Second Amendment rights, and kind of coded calls for potential violence. And while those things ended up not happening, that is definitely where you have to be paying attention now if you want to see what's going to go on: Telegram.

Evelyn Douek:

Great. Okay, so a bunch of follow-ups there. The first thing is you said you didn't see a lot of this stuff on the major platforms, and I'm wondering if you have a thesis as to why that is. Was it because of better content moderation? Is it because people were deplatformed the last time around? Or is it because the audience isn't there? The audience is on these other platforms now?

Alex Stamos:

Yeah, I would say it's those first two things leading to the third, right? In that the aggressive rules that were created in 2020 and the aggressive moderation, especially after January 6th... Up until election day 2020, there was definitely more enforcement of rules against straight up lying about elections, and certainly against threats against individuals. That really softened between election day and January 6th, and then the companies really aggressively went after folks who were spreading the big lie after January 6th, 2021. That impact, especially the deplatforming of Trump, which drove a lot of people off of Twitter and Facebook, then led to that third thing you're talking about: the audiences have changed. So I think yes, there is more enforcement than there was pre-2020, but also people just aren't trying.

The other big platform is Rumble. So we actually saw a lot less on YouTube, because all of the really big live shows, the people who are live streaming about how it's been stolen continuously, who just honestly lie their butts off about every little piece of evidence as part of a grand conspiracy against their side, those people are on Rumble now. They've moved off of YouTube. Yeah, I think part of it is just that the audiences have changed to those platforms where there's effectively no possibility of them being moderated in any way. Because you have to remember, for a lot of these people, it's a grift. Their business is making money, and so being on platforms where they know that they are able to still sell and where their economic lifeline is not going to be cut off is really important for them.

And Rumble is directly paying some of these folks. We don't know what those deals look like on the back end. There's much less visibility to Rumble economics than say YouTube, where there's lots of people that share their YouTube earning numbers. But those numbers might be significant. So you might have the Charlie Kirks and such making a decent amount of money for taking their shows to Rumble.

Evelyn Douek:

There's been this big debate over the last few years about the effects of deplatforming, because on the one hand, if you kick someone off one of the major platforms, you really reduce their possibility of reaching a mainstream audience. But on the other hand, if you kick them into these echo chambers, there's a possibility of them really radicalizing without the countervailing counter speech that might help at least stop the radicalization of some people. We saw a lot less of this stuff sticking. You said there was an increase in volume, but maybe less of it sticking. Does this suggest one way or the other an answer to that debate about deplatforming?

Alex Stamos:

I don't think we have a general thesis on deplatforming that comes out of this, because there are so many things happening. One of the differences here, too, is you don't really have any candidates who are effectively pushing a "stop the steal, it was stolen from me" line. The only person left is Kari Lake, who is running for governor of Arizona and since election day has been neck and neck with her opponent, Katie Hobbs, the Democrat. Because it is very close, you end up with this situation where, as the vote totals go up and down and it looks like Lake's going to win or not, the dial of how much election disinformation there is gets turned up and down. So if Lake is losing, it's being stolen. If Lake looks like she might win, then it's totally fine. So I think it's very hard when that is the reality you're dealing with.

The other difference this year is that Fox News, on election day, election night, and the day after, was generally pretty responsible in a way they were not in 2020. We saw that Fox News was responsible on election day in 2020. They were the first ones to call Arizona for Biden. They called Biden the president-elect the next morning. And then when the primetime hosts came on, they just went nuts. And there are absolutely no standards for Tucker and Sean Hannity and such. You always have to remember that a lot of this stuff that you see on social media is downstream of people kind of taking their cues from people like Tucker and Sean Hannity, who are setting the stage, and then all of the local AM talk show hosts. That has been much more mixed this year in terms of who people are blaming. There's still some stop the steal.

Tucker's doing a lot of just asking questions. He said a bunch of things that are not factually true on election night about who built the machines in Maricopa County, what the root cause of the issues in Maricopa County is, and whether or not votes are being counted. The truth is those votes that aren't scanning in the precinct-based scanners get scanned later in centralized scanners. So he's saying things that aren't true, but you also see the whole Rupert Murdoch empire turning against Trump. And so you don't have kind of the unified Fox News, OANN, Newsmax trifecta pushing a stop the steal narrative. And so it's really hard to figure out what is going on here, because it's just such a complicated ecosystem of major media figures as well as people on social media.

Evelyn Douek:

So bottom line, something along the lines of maybe companies have made progress, they have better rules, they're maybe doing better enforcement, but it's hard to tell the causative impact of that given that all of these other factors are so important.

Alex Stamos:

That's right. And for me, the thing that I really always want from the companies is for them to stop threats of violence against individuals. If people are going to say this election is not being run in the right way, to me that is a totally reasonable free speech issue. There are lots of legitimate issues around election administration we can talk about. Where I find it less compelling on the big platforms is, one, when people are just straight up taking facts and then lying about them, saying this is a conspiracy driven by George Soros money or Mark Zuckerberg money or whatever. And then especially the direct calls for violence and the doxing of individuals. That's somewhere the rules have been effective. The amount of "who's this individual person, let's dox them" on Twitter or Facebook has been extremely small, whereas that is rampant on Telegram. Partially that's because of enforcement, and partially because people have just moved to those platforms. I think they think it's much more secret on Telegram, that they're not going to get caught. I don't think there's a lot of good evidence for that, but that is their thesis.

So the things that are pushing the bounds of legality around what could be actually seen as potential legitimate threats against individuals, that's the kind of stuff that has moved off the major platforms for sure. To that extent, I wouldn't call it de platforming, but the aggressive enforcement on doxing and threats to individuals definitely has driven stuff off of the big platforms.

Evelyn Douek:

And I guess that suggests something interesting about this move towards decentralization, if we are in fact seeing a big move towards Mastodon or those kinds of models for moderation. There is some benefit in having centralized gatekeepers in certain circumstances.

Alex Stamos:

Certainly. I've been running this Mastodon instance, cybervillains.com, specifically as a test instance, and Mastodon has nowhere near the tools necessary, even if you wanted to be an aggressive moderator of a specific Mastodon instance because that's what you use to attract people to it. The tooling is not great at all. It's quite poor. So there's a lot of work ahead for the Fediverse if people want to be able to do this, especially with kind of part-time moderators and without anybody making money. There's a lot of work that's going to have to get done on just the technical side.

Evelyn Douek:

Well, that'll be something to watch then, depending on what happens in the next two years. I believe Twitter is still up and running six minutes into this podcast.

Alex Stamos:

Right. Twitter is still up and running.

Evelyn Douek:

All right, excellent. I guess the other thing to talk about in the takeaway from the midterms is what this might mean for tech policy. The big thing is that there's not as much change as we expected. If the red wave had materialized, we might have seen a lot more platform bashing, a lot more concerns about conservative censorship on social media. As it is, we're roughly in the same situation, where it is really hard to find bipartisan consensus to get a lot of things through. So there might not be a lot of change.

Alex Stamos:

I mean, there is a bipartisan consensus still that tech companies are bad.

Evelyn Douek:

Right, exactly. We could expect many more hearings bashing platform executives around the head. Further evidence in support of that is Musk getting into a Twitter war with Senator Ed Markey over the last few days, where Markey tweeted something about a Washington Post story to do with verification of a spoof account in his name. Musk tweeted some very mature critiques back, like, "Why does your pp (in your profile pic) have a mask?" So that is the quality of public discourse and policy conversations at the moment. And then Markey said, "You're spending your time picking fights online. Fix your companies, or Congress will." Which suggests this big public rhetoric around platform bashing is going to continue.

Alex Stamos:

I personally have not liked it when elected politicians in the United States have threatened regulation to get really specific content decisions. I really don't like that. As a First Amendment scholar, what is your immediate opinion of how Markey is acting here? Does it make you at all uncomfortable to have these politicians trying to get very individual things done? You're not talking about "I'm going to pass an American DSA," you're saying "I want this specific account taken down," or "I want this specific account left up." How do you see that interacting with the First Amendment overall?

Evelyn Douek:

I think that people don't show enough concern about what we call jawboning of platform executives by American lawmakers. You see Democratic senators sending Facebook a letter saying, "Here's the 'disinformation dozen.' What are you doing about these people?" I do think that's really problematic. That's trying to do an end run around the First Amendment, because the lawmakers can't censor it directly, so they're trying to get the platforms to do it by threatening them with legislation. So there are legitimate First Amendment issues here. But I think Markey is making a more general "get your house in order" point, and that concerns me a lot less. There is a situation where it is appropriate for Congress to be saying to platform executives, "You're not running this critically important platform in a responsible way." So that concerns me far, far less.

Alex Stamos:

Right. Yeah, I really disagreed with that letter, partially because, as people who do research, we'll have our list of here are the top spreaders of a certain kind of disinformation. We are not building that list so that a politician can turn around, cite it, and then try to suppress those people. We're doing it so people can understand what's going on. Because what you do about it is actually, I think, complicated... Just taking down 20 accounts is probably not the right response if you're dealing with something like vaccine disinformation. So, Musk's response here. Yes. One of his responses: Markey was complaining that there are impersonators who are able now to get blue check marks. The rollout of allowing people to pay for a check mark, as everybody predicted, including hundreds of people inside of Twitter I'm sure, has turned out to be a complete disaster for Twitter on many levels.

Evelyn Douek:

I mean hilarious though.

Alex Stamos:

It's hilarious. Yeah. We'll talk about some of the specific economic disasters. Markey was specifically talking about how people are able to create checkmarked Ed Markey accounts and that he doesn't like that. Musk's response: "Perhaps it is because your real account sounds like a parody." So I know that Musk doesn't have comms people or media or PR people, but I guarantee that Tesla and SpaceX have government relations people. Tesla is a company that is partially built upon American tax law. Tax credits for both cars and solar panels are a big deal. SpaceX, something like 90-some percent of their revenue comes from the US government. They are a massive government contractor. I'm guessing Gwynne Shotwell, the person who really runs SpaceX every day, is losing her mind, because you've got Musk turning himself into a political figure and insulting the people who have to actively vote for a budget in which there is a line item through which hundreds of millions, maybe eventually billions, of dollars is going to go directly to SpaceX.

Evelyn Douek:

I mean, I guess it's the flip side of the thing we talked about right at the very beginning when Musk acquired Twitter, about how his other investments provide leverage for governments. We're not just talking about the Chinese government, the Indian government, the Brazilian government. Hey, it turns out the US government also has that leverage, which, again, we should have real questions about them using because of dissatisfaction with the way a platform's being run. But it exists.

Alex Stamos:

Right. I mean, a lot of people say the US government has too much influence over Mark Zuckerberg. Imagine if all of Mark Zuckerberg's money was in Lockheed stock. Then they'd have a lot more leverage.

Evelyn Douek:

I will say though, just going back to the effects of the midterms, to quickly close that out before we move on to the Musk saga, our weekly segment on Elon Musk. Yeah, exactly. Cue theme music. Markey's tweet was, I think, largely performative in many ways. Get your house in order. The idea that Congress is actually going to pass some meaningful tech legislation... I would not put the probability very high. But that does not mean that Musk shouldn't worry about the law. There is a lot of law that he should be very worried about. We saw the FTC perking up its ears, which we'll talk about in just a second. But we also have, and I think we've talked about this before, the Supreme Court going to weigh in on a bunch of content moderation cases, and other legal levers that he really should be showing much more concern about than he is.

I just don't think that legislation from Congress... And also we should talk about the states just really quickly to say the states are going to be really interested in this area, particularly because Congress is vacating the field and showing so little action.

Alex Stamos:

As a Supreme Court watcher, if you are one of the people who believes that Clarence Thomas and Sam Alito and some other folks are more politically driven than legally driven, the fact that there is now, if not a conservative, an edge lord who owns Twitter, do you think that actually changes how SCOTUS rules?

Evelyn Douek:

It's such an interesting question, because you're right, we seem to have been seeing a conservative carve-out for social media platforms from their general position that corporations should have free speech rights: protect the money, this will all sort itself out in the marketplace. We're seeing them sort of break from this decades-long position of conservative lawyers and justices because of concerns about social media platforms. Do I think that this one platform showing more of a free speech ethos will change that? I'm not convinced. I do think that there's just generally going to be this deep suspicion of the libs in Silicon Valley, but I don't know. I really don't have a strong prediction of what's going to happen in these cases. The politics around the First Amendment are really weird right now, and so I'm not going to put my money down either way.

Alex Stamos:

And there are also kind of weird cases that haven't gone direct to that level yet, like the Texas issues and such, the NetChoice case.

Evelyn Douek:

Yeah. So they will come up next year sometime. Early next year, probably. Again, we're speculating, but it seems like the court has been very, very hungry for those. So what happens with Twitter between now and then, who knows? So we'll see.

Alex Stamos:

The odds of Twitter both being running and owned by Elon Musk by the time NetChoice comes to SCOTUS is, yeah-

Evelyn Douek:

Yeah. Well, we should buy a lettuce basically at this point to see whether Twitter still exists or Elon still owns it while the lettuce survives. Okay, so let's turn then probably to Musk. Tell us about the security developments this week.

Alex Stamos:

Okay. Twitter made no changes to their rules or enforcement before the midterms. That was clearly something a number of people internally were fighting for. Yoel Roth, I think, I'm going to give the most credit for. I've said to you, I think I'm going to nominate Yoel for a Nobel Peace Prize, because for at least a short period of time, he convinced Elon Musk that there was a reason why platforms like Twitter have to have some kind of content policy, that they can't just be free speech free-for-alls, because they become completely unusable and the kind of places that people don't want to hang out. So for a short period of time, Yoel was able to hold on. But then after the midterms, we saw first the synchronized resignation of the chief information security officer, the chief privacy officer, and the chief compliance officer of the company, all at the exact same time. Apparently, coincidentally, at the same time that they were supposed to respond to the FTC and give the FTC an update on their progress under the FTC consent decree.

Evelyn Douek:

What a coincidence.

Alex Stamos:

Yes. Almost as if those are people who... We haven't talked a lot about the Joe Sullivan case, but my predecessor at Facebook, Joe Sullivan, went on to [inaudible 00:18:40] Uber and was just convicted of hiding evidence of a breach from the FTC. The facts here are really complicated. We could probably have a very long discussion, and I have some disagreements with the US attorney's office on this, but whatever you believe about that, clearly the Department of Justice has thrown an elbow, saying, "We will arrest and hold liable lower-level executives for decisions that were made by the CEO." Decisions that were made by Travis at Uber are now being criminally attached to Joe. And if you're looking at that, and you're looking at what Musk is probably asking you to do, being the chief security officer at Twitter sounds like a horrible job. In fact, I said on Twitter, "You'd have to be insane to take that job right now."

I understand why Lea stepped up into that position after being asked to do so, but to be able to take that job right now, you'd have to convince yourself that you're not going to be asked to do something that would put you in the same kind of personal criminal level of responsibility. I think that that would be a hard thing to make yourself confident of with the way that Musk is acting.

Evelyn Douek:

Right. There was reporting this week that Musk's lawyer, Alex Spiro, said, "Elon puts rockets into space. He's not afraid of the FTC." Well, that's all good and well for him, but maybe the rest of the people at the company are not so bullish about just evading.

Alex Stamos:

Right. I mean, tell me if this is correct: if Twitter falls out of compliance, they're not going to just start arresting random people. I think the problem that I would have is that generally, as somebody who had to sign these letters, as a chief security officer you are personally putting your signature on a number of letters that you are filing under penalty of perjury. On that letter, it says, "To my best belief and knowledge, the things in this letter are true." I had to do that with the New York Department of Financial Services, had to do it with the credit card industry, the PCI DSS, had to do it with the FTC. You sit there and you get some indigestion, because no matter how hard you worked and how much you trust your team, one human being has no ability to understand what is happening at a humongous company of thousands of people and hundreds of thousands or millions of computers. So you're just hoping that everybody's doing their job well and that there's not a force that's pushing them to lie.

But in a situation like what Musk is doing inside of Twitter, half of those people are gone. In some cases, it sounds like more than half of the people who actually implemented things that were required by the FTC are gone. So saying, "To the best of my knowledge, we're compliant with this," becomes impossible. Because if you have built, for example, an internal security team whose job it is to look at internal data access, and all those people are fired, there's absolutely no way you can send that letter anymore.

Evelyn Douek:

Right. So we're going to have a longer podcast dropping into this feed later in the week with Riana Pfefferkorn of Stanford's own Internet Observatory, and Whitney Merrill, a longtime privacy lawyer, including as an FTC attorney. So we're going to dig into the law later in the week on that. But Alex, substantively, should we be concerned that all of these privacy people and security people have left Twitter? What's going to happen?

Alex Stamos:

Yeah, I'm very much looking forward to your podcast. Whitney and Riana are fantastic, and I can't wait for the three of you to talk about the legal side. From the substantive side, I am terrified about data at Twitter right now. Twitter has a long history of data breaches due to internal data access controls. I know for a fact that's something that Lea and team really focused on over this last year and made significant progress on, and a lot of those people have been fired or they've quit. So now there's no security team watching. This is not something where you just build it and walk away. Security is something you have to do every... It's adversarial. You have to do it every single day. It's like playing chess. You can't write a book on chess and then say, "I'm the world's greatest chess master." You have to actually play the games. And there's nobody playing the game anymore at Twitter. Those are the high-profile departures, but my Signal and WhatsApp are full of resumes of people who are leaving less publicly. Yeah, I'm really worried about it.

Evelyn Douek:

Then on Twitter, DMing it to you.

Alex Stamos:

Right. I'm really worried about that. I'm worried about Twitter going down, because Musk is making some plainly incorrect statements about how Twitter's backend services work, statements that are clearly wrong to anybody who's actually worked on this stuff. He considers himself a genius, and he could be a genius in every single kind of engineering, but it turns out that building web-scale backend services is a difficult, interesting job for which people get paid a lot of money, because there are not a lot of people who do it well. Those people are correcting him, and he seems to be firing the people who are saying that he's wrong.

Evelyn Douek:

He tweeted recently he's going to be turning off a bunch of microservices. What does that mean?

Alex Stamos:

Yeah. That would be a bad idea. You know that part of Ghostbusters where the EPA guy shuts down the containment grid and lets all the ghosts out? That is how I'm envisioning it. Somebody who has better Photoshop skills needs to put Elon Musk's face over the EPA guy in Ghostbusters, because that is what's going to happen. So microservices: this is just a backend architecture for how you build a system like Twitter. What is Twitter in the end? It is a system by which you can put data into it, a tweet or an image, and then that data has to be replicated at scale for millions or hundreds of millions of people to see. This requires multiple layers of services, where when you are on your phone, you are talking to a web service that is sitting in a data center. But behind it, after you tell the web service, "Here's my tweet," it has to spread that out across tens of thousands of computers on the back end so that it can be properly replicated and show up.

So if you're Elon Musk and you have a hundred million people, the job of taking one tweet from Elon Musk and copying a hundred million times turns out not to be trivial. One of the complicated things for Twitter, and this is true for Facebook as well, is that everybody has a different view. So on a normal webpage, you can build more monolithic systems because the view... If you go to io.stanford.edu right now, and you want to look at all of our blog posts, every single person who logs in gets the same blog post. So if we had a billion people who wanted to read a document, we could just copy that into a content distribution network around the world. And it's the same copy.

The challenge for Twitter is they can't just do that, because every single time somebody reloads their feed, every single Twitter feed on the planet is different. A hundred million people are getting totally customized feeds. So to build a service that does that, one of the ways you can do it is with microservices, where you're talking to a frontend machine. That frontend machine is talking to dozens and dozens of services behind it, and each of those services talks to a dozen services behind them.

So one thing he was talking about that he didn't really understand is that if you reload a Twitter timeline, it quite possibly touches a thousand different systems inside of Twitter. That is by design. That is faster than trying to build some kind of big monolithic system. That is the exact same thing that happens at Facebook. Just putting somebody's Facebook newsfeed together is an incredibly complex piece of computation that requires all of these different machines to be running perfectly. That's the only way you can do that kind of stuff at scale.

So just going and turning off microservices is a statement that doesn't make any sense, because if you just go shut off these services, Twitter's just going to stop working. Which is effectively almost what they've been doing, because they keep on firing people. Every big company has an on-call schedule: for each backend service, there's a list of which engineer is on call. What I was hearing was that when they started laying people off, they were laying off and locking the accounts of people who were currently on call for backend services, so that there's no way to actually get to the person whose job it was to fix something if it broke. So if they keep on acting in this way, where they just keep on randomly firing people on the backend and randomly turning off services, Twitter's going to stop working at some point. It's just a totally crazy way to take over this incredibly complex system that's been built over years, and just assume that he knows how to rearchitect it overnight when he has no experience at all with any kind of web-scale services.
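To make the fan-out Alex describes concrete, here is a minimal toy sketch in Python. All of the service names and data are hypothetical illustrations, not Twitter's real architecture: the point is only that one timeline load composes the results of many small services, so "turning off" any one of them breaks the whole page.

```python
# Toy sketch of microservice composition. Each "service" is just a
# function here; in a real system each would be its own fleet of
# machines behind an RPC layer.

class ServiceDown(Exception):
    """Raised when a request reaches a service that has been turned off."""
    pass

# Hypothetical backend services with canned data for illustration.
SERVICES = {
    "social_graph": lambda user: {"alice": ["bob", "carol"]}.get(user, []),
    "tweet_store": lambda author: {"bob": ["tweet 1"], "carol": ["tweet 2"]}.get(author, []),
    "ranking": lambda tweets: sorted(tweets),
}

def call(name, *args):
    svc = SERVICES.get(name)
    if svc is None:
        # Turning off a microservice doesn't degrade gracefully by
        # default; everything downstream of it fails.
        raise ServiceDown(name)
    return svc(*args)

def load_timeline(user):
    # A single page load fans out: one graph lookup, one fetch per
    # followed account, then ranking. A real timeline adds ads, media,
    # safety checks, and so on, which is how one reload can touch
    # hundreds of systems.
    following = call("social_graph", user)
    tweets = [t for author in following for t in call("tweet_store", author)]
    return call("ranking", tweets)

print(load_timeline("alice"))  # ['tweet 1', 'tweet 2']

# "Turn off a microservice" and the whole timeline stops working:
del SERVICES["ranking"]
try:
    load_timeline("alice")
except ServiceDown as missing:
    print("timeline failed, missing service:", missing)
```

The design point is that this layering is deliberate: each small service can be scaled and cached independently, which is faster at scale than one monolith, but it also means there is no single service you can switch off without consequences somewhere downstream.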

Evelyn Douek:

Right. And if that's on the back end, on the front end, Casey Newton was reporting this week that part of those layoffs have been thousands of the contractors that Twitter hires to do content moderation. Even if Musk hasn't shown great propensity for actually changing the rules like he said he was going to, and the rules are pretty much all still in place, firing the people who actually apply those rules is as good as getting rid of them in many ways.

Alex Stamos:

And anecdotally, I have seen an uptick in spam and such. The other thing we've seen is all of the impersonation, right? The thesis was that people would not spend $8 just to make a joke or to manipulate the stock price of a publicly traded company. That turns out to be an incorrect thesis. We're seeing over and over again that people are totally willing to pay the eight bucks, even if the accounts are only up for a couple of hours. The best example of this was Eli Lilly: the pharmaceutical company lost billions of dollars in market cap thanks to an impersonation on Twitter, which is amazing.

Evelyn Douek:

I've seen some tweets and analysis saying that was generally in line with a stock market drop across the industry at that point.

Alex Stamos:

I've heard that, and I've also seen that a bunch of other stocks have recovered and they did not. So yes, it is very hard to always predict what's going on, but the coincidence is definitely up there. And a bunch of companies after that have decided to stop advertising on Twitter because it's not a safe place. You've got PepsiCo saying Coke is better. You've got Coke saying if we get enough retweets, we're putting cocaine back into our Coke bottles. There's no way for people to tell whether these are legitimate accounts or not. It has become this crazy free-for-all, and it's funny to watch, but if you consider that up until this moment Twitter made 5 billion a year from advertising, that's got to be down to half, maybe a quarter, of what it used to be.

I think we're going to see a cash crunch, because my understanding is there was less than a billion dollars on hand at the time the handover happened. Twitter spends about 5 billion a year. One of the reasons he's cutting all these people and contractors is that he just doesn't have the cash to pay them anymore if the cash flow goes to effectively zero.

Evelyn Douek:

So we have the lettuce, we have Twitter's technical systems, and we have Twitter's commercial viability. Which one's going to go down first, I guess, is the question. There are also all these interesting issues, very briefly, with the verification badge and all of these impersonations. Once Twitter is adding a verification badge to something, that quite likely becomes its own speech rather than the speech of the user in question. Section 230 only provides platforms immunity for what users say on their services. If Twitter goes out and defames a bunch of people, it can't plead Section 230 immunity. So there are all these really interesting legal questions about whether these people could sue Twitter for verifying accounts and helping fraud, defamation, and all these other claims, as a result of Twitter independently making people think an account is real. It's unclear at this stage what a blue check actually means on Twitter, whether people actually think it means an account is real at this point, but I would expect to see... Again, a full employment program for lawyers. This is the Section 230 subject unit in the Elon Musk JD program that we're standing up.

Alex Stamos:

Well, and I think there is an interesting question here. Generally these Section 230 test cases aren't class action securities cases. So now you have this huge bar: a huge number of lawyers who, every time a stock drops, end up suing everybody involved. So the fact that those people are now getting involved in Section 230 issues, I think, is going to be fascinating.

Evelyn Douek:

Okay. But Elon doesn't have time to worry about this because he's acquainting himself with Brazilian politics in some tweets you pointed out to me this morning. What's going on?

Alex Stamos:

So there was an election in Brazil. Before the election, the Brazilian Supreme Court kind of granted itself the power to censor social media in Brazil. And my understanding is that for the most part, American companies are fulfilling those requests. They're at least geoblocking in Brazil, even if they're not taking content completely down. Lula, the socialist candidate, won. Bolsonaro lost. From my perspective, I'm not super happy. If I had to choose between a fascist and a socialist to vote for, that kind of sucks as a choice, but that's how it seems to have worked out. And a bunch of right wing supporters of Bolsonaro are saying the election was rigged, partially because the Brazilian Supreme Court was able to take down this content.

So today you have a prominent conservative journalist tweeting at Elon that he should be looking at censorship and election rigging in Brazil, and Elon taking it seriously and having interactions with this Bolsonaro supporter. Now, from my perspective, what to do about the Brazilian Supreme Court ordering you to take tweets down is not an easy issue. But if you're going to have a discussion at Twitter about how to handle this complicated issue and how to handle the Brazilian election, it's probably not via your CEO chilling with a super right wing journalist on Twitter and doing it live in the replies.

Evelyn Douek:

Right. "It's on my list to review," he says. How urgent is this? I've got to review the microservices first.

Alex Stamos:

Right. First, he has to turn off all the backend services and see what happens. We'll just throw this big switch.

Evelyn Douek:

Before I wade into these really complicated normative issues around international human rights law, respecting local law, and which body we should defer to in judging whether speech is illegal or not.

Alex Stamos:

Well, at least you can go talk to his head of... Oh no.

Evelyn Douek:

Yeah, that's right.

Alex Stamos:

At least you can talk to his general counsel. Oh. Or compliance officer. No.

Evelyn Douek:

His advisory board, his trust and safety advisory board.

Alex Stamos:

Oh no. Yeah.

Evelyn Douek:

Yeah, his ethics. Oh man.

Alex Stamos:

Chief Privacy Officer. Oh, right.

Evelyn Douek:

Okay. But at least he has 115 million Twitter followers that he can bounce ideas off.

Alex Stamos:

Right. He can call his little kitchen cabinet of venture capitalists who have never had a real job, who have spent their entire lives writing spreadsheets and go ask them, "What should I do in Brazil?" Elon, if you want some revenue, I will pay for Twitter blue if you allow me to silently listen into the phone calls you have with Mike Solana and such asking them what you should do in Brazil. That would be awesome.

Evelyn Douek:

He may be that desperate by the time we record next week. I have to say, I was enjoying the impersonation tweets though. It felt like what Twitter does best, so I'm glad Twitter is going down doing what it loved. My personal favorite was a Roblox impersonation account that tweeted, "We're adding sex to the game now." Which is fantastic. There's always a content moderation angle to these things: you can't hold hands on Roblox, but you can get it on. Yeah. That's great. Anything else to add before we sign out?

Alex Stamos:

Right now, Twitter is still up. And hopefully it'll be up on Saturday. Do you know what is Saturday? You now work for a school that actually cares a little bit about football?

Evelyn Douek:

Oh yeah.

Alex Stamos:

Do you know what is coming up on this Saturday?

Evelyn Douek:

I do not know.

Alex Stamos:

It is the big game.

Evelyn Douek:

Oh yeah, the big game. Right.

Alex Stamos:

The University of California versus Leland Stanford Junior University at football. Two teams that are horrible at the actual playing of football, which is always the best big game when both teams are really bad because it becomes a really hard fought, fun battle.

Evelyn Douek:

Well, my emotional investment in football may be like the right wing media ecosystem with Kari Lake's vote count where when we're up, I'm like, "Yes, I love football. I'm getting into this." When we're down, it's like, "Never cared. Don't even know what these games are."

Alex Stamos:

Did you ever go to a Harvard Yale game?

Evelyn Douek:

I did not. Was I missing some spectacular skilled football?

Alex Stamos:

No, you did not. But I do recommend at the University of California Memorial Stadium, it'll be a beautiful day to go watch Cal hopefully beat Stanford and to retain the Stanford Axe, which is the traditional prize of that victory.

Evelyn Douek:

Well, we will have to have an important update on this critical matter next Monday. That has been your Moderated Content news roundup for this week. The show is available in all the usual places, including Apple Podcasts and Spotify. Show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't have been possible without the research and editorial assistance of John Perrino, Policy Analyst at the Stanford Internet Observatory. It is produced by Brian [inaudible 00:33:48]. Special thanks to Alyssa Ashdown, Justin Fu and Rob Hoffman. See you next week for more Moderated Content.

Alex Stamos:

Go Bears.