Moderated Content

MC Weekly Update 12/19: Twitter's Thursday Night Massacre

Episode Summary

Evelyn and Alex talk about news happening in other corners of the trust and safety and platform regulation world: bills are introduced to ban TikTok; Meta released its annual adversarial threats report; a tech industry trade body filed a legal challenge to the California Age-Appropriate Design Code Act and asked SCOTUS to review the 5th Circuit ruling upholding Texas' social media law; Trump had thoughts about defunding Stanford. And then, yes, they discuss the ongoing Twitter death spiral.

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

A bill that would ban TikTok in the U.S. and could be extended to other social media companies with ties to “foreign adversaries” was introduced in the House and Senate, but lacks Democratic co-sponsors in the upper chamber. - Lauren Feiner/ CNBC, Rebecca Shabad/ NBC News

Meta released its annual report on “Coordinated Inauthentic Behavior Enforcements,” noting the milestone of 200 takedowns. - Ben Nimmo, David Agranovich/ Meta, Alexander Martin/ The Record by Recorded Future, @DavidAgranovich, @benimmo

Tech trade association NetChoice sued the state of California in an attempt to block the California Age-Appropriate Design Code Act over First Amendment protections for content moderation. The law would go into effect next year with broad online privacy and safety components for children. - Natasha Singer/ The New York Times, Cat Zakrzewski/ The Washington Post, Rebecca Klar/ The Hill, Lauren Feiner/ CNBC, Rebecca Kern/ Politico Pro

The Supreme Court schedule is set for hearings on Gonzalez v. Google and Twitter v. Taamneh on February 21 and February 22. The cases are focused on content moderation and recommendation algorithms. - Adi Robertson/ The Verge, @GregStohr

"Former President Trump said Thursday that he’d ban the U.S. government from labeling any domestic speech as ‘misinformation’ or ‘disinformation’ if he returns to the White House.” - Julia Mueller/ The Hill

Matt Taibbi named the Election Integrity Partnership in a Friday afternoon version of the Twitter Files. - @mtaibbi

Twitter suspended over 25 accounts that track private planes and nine journalists — including CNN’s Donie O’Sullivan, Ryan Mac of the New York Times, and Drew Harwell of The Washington Post — who shared links about the @elonjet account which posts public information about the location of Musk’s private jet. Most reporter accounts have since been reinstated after Musk conducted a Twitter poll on whether to enforce his new policy against sharing flight trackers and similar information. - Jason Abbruzzese, Kevin Collier, Phil Helsel/ NBC News, Ashley Capoot/ CNBC, Ryan Mac/ The New York Times, Paul Farhi/ The Washington Post, Jordan Pearson/ Vice

Musk banned linking out to other platforms… and then conducted a Twitter poll in which 87% of voters opposed the policy, subsequently reversing the decision and taking down the tweet announcement and blog page on the policy. Some users are still unable to post links to Mastodon and other social media sites in tweets. - Mack DeGeurin/ Gizmodo, @JuddLegum

Musk conducted a scientific Twitter poll asking if he should step down as CEO. Nearly 58% of the more than 17 million respondents voted for him to step down. - Alexa Corse/ The Wall Street Journal

It was coincidentally just after he was at the World Cup with Jared Kushner and... a bunch of Emiratis. Eurasia Group President Ian Bremmer quipped that Twitter’s content moderation panel looks different these days. - @ianbremmer

Sports balls were kicked and a team scored more points than the other team after time was added, and then stopped, and then added, and then people lined up to kick more balls into the net than the other team. Congratulations to Argentina! - Ben Church/ CNN

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Evelyn Douek: One of our friends told us this week that they're not listening to this because I speak too fast, and personally...

Alex Stamos: Oh wow.

Evelyn Douek: Yeah. I'm offended, because I thought we were doing a service by preemptively putting this podcast on the 1.5x setting and helping people out. But there you go. I'm going to try and work on that.

Alex Stamos: I will too. Certainly I am as well.

Evelyn Douek: We should have hand signals for slow down. Welcome to Moderated Content's weekly news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. Alex, I think you joked this week that the only way to keep this podcast current was to do a 24/7 livestream. This is probably dated already, and we're only 30 seconds in.

Alex Stamos: Right. So we're moving to launching an AM radio network across the country.

Evelyn Douek: I've dreamed of this since I was a little girl, and it's finally coming true. Okay. There are other things happening in the world of social media and trust and safety this week, and I think we'll leave the exciting stuff to the end to keep our audience. But let's start further afield. TikTok: there was a bill introduced this week to ban it around the country. I love slash detest the US practice of coming up with fantastic or terrible acronyms for bills. So this one is the Averting the National Threat of Internet Surveillance, Oppressive Censorship and Influence, and Algorithmic Learning by the Chinese Communist Party Act, or the ANTI-SOCIAL CCP Act. 'Cause if there's one problem with the CCP, it's that it's antisocial.

Alex Stamos: Right. Yeah, exactly. Some congressional intern definitely earned their 12 bucks an hour for that one. 

Evelyn Douek: Yeah. They were probably up late eating pizza, and then at some point at three AM, "Got it." Okay, so this was introduced by Rubio in the Senate, and Mike Gallagher and Representative Raja Krishnamoorthi in the House. And it would ban all transactions within the US with certain foreign-owned social media companies, and it specifically designates TikTok and its parent, ByteDance, as one of those companies. Although interestingly, it doesn't define social media company, but it does leave it broad enough that by my reading this could also encompass WeChat, which is potentially more of a concern. I don't know if you want to talk about that a little bit.

Alex Stamos: Yeah, I've always thought that there are legitimate concerns around TikTok, but it's not the top of my list when I think of Chinese companies. First, TikTok kind of won fair and square. TikTok just out-implemented Snap and Instagram. They built a product that young people like to use. They created a community of creators that people like to tune into. And they did so fairly, without any overt help from the Chinese government. When you look at WeChat, WeChat's popularity around the world is driven by the Great Firewall, because if you are part of the Chinese diaspora and you have any family back in the People's Republic, the only way you can communicate with that family is via WeChat. And that is something that's driven by the Chinese Communist Party. WeChat is not end-to-end encrypted. I expect effectively everything on it is archived somewhere in Beijing, and that WeChat is probably one of the most effective open source intelligence gathering mechanisms for the Chinese Communist Party in the world.

And I know for a fact it is used to keep an eye on people who have immigrated from China to be students or employees in the US. If you are a student at a US university, and you're at some kind of event where you say something that is anti-CCP, you could get a message on WeChat from somebody. So that's a much bigger concern for me. TikTok is just popular, and it's trendy, and people want to talk about it, but very little important stuff happens on TikTok. It's still stupid dances. And one of the reasons I think people enjoyed it during the pandemic is that it was just super lightweight. There's not much discussion of vaccines (there's some, but it's not dominated by it), and there's not a lot of talk about politics. And the amount of information they have is important.

But compared to a bunch of other products that have had things like insecure ad libraries, I don't see it as a huge information-gathering risk. So I think we have to pay attention to TikTok. But what I'd prefer is for these laws to be aimed at certain kinds of PII not being accessible to people who are physically in the PRC or to PRC organizations. And they do a poor job of defining that. They just basically say these companies are completely banned. Whereas if they looked at the data, and they built a privacy law that explicitly addresses US adversaries, I think you'd catch both a broader set of companies, and you'd also disincentivize the kind of joint ventures and such that have been a real problem for intellectual property being drained out of the US.

Evelyn Douek: And that kind of targeted legislation is much more likely to be successful and constitutional. If you cast your mind back six million years ago in social media time, to September 2020, you might remember Trump also tried to ban TikTok and WeChat, and there was actually a case about this in a district court in California where the court found that banning an entire platform is not proportionate and is a violation of the First Amendment, because you are banning so much more speech than is necessary. And the court talks about in that judgment how this is a really important platform for many Chinese people in the United States who use it to talk to their family. So there's a lot of stuff going on on that platform, and some of it's really important. And the First Amendment requires laws to be narrowly tailored to their purpose.

And the government sort of just waltzed in, waved its hands, said "National security threat," banned the entire thing, and that didn't pass muster. So that case was eventually mooted and didn't work its way through the courts, but you would absolutely see that kind of litigation and that kind of argument being run again.

Alex Stamos: That's a good point, because there is a bit of: do we beat the Chinese by copying them here? The United States does not have a great firewall. We do not prevent our citizens from voluntarily using products created in other countries. But I would like to see, I mean, I'd like us just to have a federal privacy law. And if we had a federal privacy law, what we could do is fill in a gap that GDPR has, which is that GDPR does not foresee that there are countries in which you don't want to process data. I think one of the things we could do is explicitly say, as this bill does, "These are countries that are adversaries of the United States. There's nothing you can do to make it legal to transfer certain types of information to those countries."

Evelyn Douek: So something to watch there. In other social media news this week, Meta released its big annual roundup of the influence operations and adversarial operations that it's been taking down over the course of the year, highlighting trends that it's seen. What were your big takeaways from the report this year?

Alex Stamos: This is a very interesting report, because the focus here was not on traditional influence campaigns, or even government hacking, but on what's called the surveillance-for-hire industry. While I was at Facebook, we started a lot of this work focused on NSO Group, which is a private company in Israel that sells what you might call cyber weapons: services and tools sold to lots of folks, including authoritarian governments, apparently including Saudi Arabia, the UAE, folks like that. NSO Group has been targeted for sanctions in a variety of legal ways. And this report expanded on that, talking about a bunch of other entities that are selling these weapons to countries that can't build them themselves. Now interestingly enough, my usual assumption here is that these tools are used by countries that don't have the in-house capability, but it looks like China and some other big countries are also using them for their less important stuff, right?

In the US, we have thousands and thousands of employees who work for Cyber Command and NSA; they can build these toolchains and do this kind of surveillance work themselves, whereas if you're, say, Mexico or the UAE, doing so inside your government is prohibitively expensive. And so buying the capabilities from these companies is really useful. And it turns out these larger countries that have the capabilities will use them on the lower end. So you might reserve your best Ministry of State Security hacking for your biggest, hardest targets, and then buy this kind of stuff for normal journalists. So Meta revealed a bunch of information. They released a huge list of IOCs, that is, Indicators of Compromise. They released the command-and-control domains that are used. So that is a big deal. That is a big FU to these guys, because it effectively burns down the infrastructure these companies have spent years creating: fake domains that can be used either to spearphish people or for command and control that will perhaps get past people's filters.

And so right now, the entire information security industry is taking that list of IOCs and looking into the past. I expect that because of this action, we'll see more and more information as people pull the strings on all of those campaigns. And Meta sent cease-and-desist letters, so they are basically trying to ban these firms from their platforms. I think you can talk more about the legal issues involved, from [inaudible 00:08:56] and LinkedIn and the [inaudible 00:08:57] and such, but they're clearly trying to lay down a fact pattern under the CFAA that allows them to try to prosecute these people if they come back and use any Facebook products. So yeah, really cool report. Obviously a lot of work from this team, and they should be very proud of it, because this is a huge issue that we should not take for granted. We talk about the high-end stuff of Russia and China and the US, but every country in the world wants to spy on certain people.

And you might consider some of that legitimate. But for me, when they're spying on their own citizens, or activists, or journalists, that should be stopped. And one of the best ways to stop it is going after the private companies that sell the capability.
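The retro-hunting Alex describes, matching a published list of IOC domains against historical DNS or proxy logs, can be sketched in a few lines. The log format and domain names below are invented placeholders for illustration, not actual indicators from Meta's report.

```python
# Sketch: retro-hunting historical DNS/proxy logs against a published list
# of IOC (Indicator of Compromise) domains. All domains and the toy log
# format are made up for illustration.

def domain_matches_ioc(domain, ioc_domains):
    """True if `domain` equals an IOC domain or is a subdomain of one."""
    labels = domain.lower().rstrip(".").split(".")
    # Check the full name and every parent suffix: for a.b.example.com
    # we test a.b.example.com, b.example.com, example.com, com.
    return any(".".join(labels[i:]) in ioc_domains for i in range(len(labels)))

def retro_hunt(log_lines, ioc_domains):
    """Return log lines whose queried domain matches the IOC list."""
    iocs = {d.lower().rstrip(".") for d in ioc_domains}
    hits = []
    for line in log_lines:
        # Toy log format: "<timestamp> <client-ip> <queried-domain>"
        _, _, domain = line.split()
        if domain_matches_ioc(domain, iocs):
            hits.append(line)
    return hits

logs = [
    "2022-11-02T10:15:00 10.0.0.5 mail.example-lure.com",
    "2022-11-02T10:16:00 10.0.0.7 www.wikipedia.org",
]
print(retro_hunt(logs, {"example-lure.com"}))
```

In practice the "looking into the past" step runs this kind of suffix match over months of retained logs, which is why publishing command-and-control domains burns the attacker's infrastructure retroactively, not just going forward.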

Evelyn Douek: Right. Yeah, I mean, it is an impressive report. One of the things that's striking is the diversity. Like you said, we have stereotypes here in terms of adversarial threats, but if you look at even the influence operations, they highlight that yes, Russia had 34 networks taken down, but Iran had 29, and Mexico makes an appearance with 13. So this is not just a "The Russians are coming" story. I can't help but tell my favorite coordinated inauthentic behavior anecdote, which you told me once: that you were sitting around in 2016, 2017, trying to come up with a name for this thing that you'd found on your platform when you were working at Facebook, saying, "What do we call it?" And "coordinated inauthentic activity" was something you were throwing around, really happy with it, until you decided that the acronym, CIA, was possibly not the best thing to go with. So a lot of progress has been made since those early days, I guess.

Alex Stamos: Yeah, true story.

Evelyn Douek: All right, so a couple of updates from the legal world. I like to joke that we could make an Elon Musk JD degree, and I'm not sure how much of a joke that is at this point, but I also think that we could do a NetChoice and the Platforms' First Amendment course. NetChoice is the industry trade body that represents most of the platforms that you see in the news. And it has filed a couple of important legal documents this week. It challenged the California Age-Appropriate Design Code Act, saying it violated the First Amendment. This is a law that may go into effect next year, purportedly an online privacy and safety bill for children. It's extremely, extremely broad.

A lot of the terms aren't defined, but children, for example, are defined as people under 18, so a lot of 17-year-olds would be caught by that definition of children. And that's basically the argument that NetChoice is making in that case: that this is substantially overbroad and would require platforms to err on the side of caution and basically offer very vanilla services for fear of breaching the act.

Alex Stamos: Yeah, and we're going to get involved in this a little bit at SIO. This law is plainly unconstitutional. A UK baroness pushed it, and it is based upon a UK law. I'm not a history major, but I remember us fighting some wars so that we don't have to follow UK law anymore, and we have this thing in the United States called the Bill of Rights, which the UK does not. So it's clearly not going to survive judicial scrutiny, but I do think this creates an opportunity to try to replace it with a law that actually makes things better and that is constitutional in California. And so our group at SIO is going to be hosting a series of meetings next year to try to get child safety experts, civil libertarians, companies, civil society, and academics together, to get something passed in California to replace it before the January 1st, 2024 deadline.

Evelyn Douek: Right. Yeah. Americans will never miss an opportunity to throw shade at the UK if they can. Chris Marchese, NetChoice's counsel, was quoted in, I think it was the Times this week, saying, "Although the UK has a similar law on the books, it has neither a First Amendment nor a long tradition of protecting online speech." Just take that backhander.

Alex Stamos: Yeah.

Evelyn Douek: Fair enough. So that's something I'll be watching for sure. It feeds into this general milieu of NetChoice cases around the country. And NetChoice this week formally filed its cert petition at the Supreme Court, asking the court to review the Fifth Circuit decision upholding Texas' social media law, known as HB 20, which was upheld a couple of months ago. That law would require social media platforms not to take down content based on viewpoint (that's technically the wording), and that would disable quite a significant amount of content moderation, which is based on viewpoint.

So it's not a surprise at all that that's been formally filed. First Amendment experts broadly expect that the court is going to take this up along with the cases coming out of Florida and the Florida social media bill. They raise similar issues, and similar arguments are being made in California. These are blockbuster cases; I cannot emphasize enough how dramatic this could be in transforming the First Amendment landscape. My NetChoice seminar syllabus is writing itself at this point. The Supreme Court also announced that it's going to hear Gonzalez v. Google and Twitter v. Taamneh, the two Section 230 cases, on February 21 and 22. So for all of you out there waiting with bated breath, mark your diaries. We can have a watching, or listening, party. These are, again, hugely important cases that could transform Section 230 and online liability for social media companies.

This is really great for me. I am teaching a seminar on the First Amendment and platform regulation in April, and this is justifying my procrastination. I cannot write that seminar with all of this activity happening in the next few months. So stay tuned.

Alex Stamos: Awesome.

Evelyn Douek: All right. The former guy, former President Trump, was laying down some truths this week. He truthed that, if he returns to the White House, he would ban the US government from labeling any domestic speech as misinformation or disinformation. And in my favorite truth, if any US university, just hypothetically ...

Alex Stamos: Hypothetically.

Evelyn Douek: ... is discovered to have engaged in censorship activities or election interference in the past, such as flagging social media content for removal or blacklisting, those universities should lose federal research dollars and federal student loan support for a period of five years and maybe more. Alex, are you concerned that you're endangering all of Stanford's federal research dollars and federal student loan support for the next five years? Throwing shade at you here.

Alex Stamos: Yeah, so clearly he's referencing us. Unfortunately, our work at the Stanford Internet Observatory has been tied into a bunch of conspiracy theories that we're helping the government censor. Now, of course, the person in charge of the government during the 2020 election was Donald Trump himself. And the work we did around election security was approved by lawyers working for the Trump administration. But as is normal, he kind of forgets who was actually in charge during these things, and has spun a bunch of lies, following some really bad journalism and some amplification from Fox News and the like, about things that are totally untrue, and is now threatening the funding for Stanford based upon our First Amendment-protected academic research. So one, universities generally don't have the ability to censor speech on big platforms.

We have the ability to publish what we think is going on and what should happen, but my postdocs don't have any power to actually change things like content moderation rules. So anyway, I'm glad that Donald Trump read our report, which you can read at eipartnership.net. You can actually order a paper copy. Maybe I'll send him a paper copy at Mar-a-Lago, and he can put it in one of the boxes next to all of his TS/SCI classified documents.

Evelyn Douek: It's a great plan. I mean, if you're looking for Christmas gifts, here you go. All right, well thank you, SIO, for providing a nice segue to our weekly Twitter and Musk segment. So yeah, SIO and the Election Integrity Partnership were mentioned and featured in Matt Taibbi's latest installment of the Twitter Files.

Alex Stamos: The Twitter files.

Evelyn Douek: How you feeling, Alex? You're a bit of a celebrity. I think you were on Fox News last night.

Alex Stamos: I was on Fox News last night. Yes. So it's kind of funny to do this work where we are doing daily livestreams, we have a YouTube channel full of videos discussing all the work we're doing, we're doing blog posts and updates and wrote a 270-page report, and then people publish pieces of that report, which we've been flogging for years and trying to get anybody to read, as if it were some kind of secret document. So yes, Taibbi found some emails inside Twitter where they were discussing a report about some issues that we passed through from state and local election officials. The way this worked is that state and local election officials could send election disinformation to the Election Integrity ISAC, which we did not run; that was run by a nonprofit called CIS. And then the Election Integrity ISAC could pass it to the platforms.

But we also got access to it, because one of the things we do is look at whether something is going wide on other platforms, whether it's being spread in different places. And we could then provide that to the platforms themselves. And the examples in this case were straight-up lies about how elections work. In one case, somebody lied about the day you can vote, which is actually illegal, right? It is illegal to tell people you can vote on Wednesday when it's Tuesday. People have gone to jail for that. So I actually feel pretty good about Taibbi. If the worst thing he found is that we looked into claims that were sent to us by secretaries of state, then that's correct. And we did get a bunch of those, and they applied to both Democrats and Republicans.

In fact, well, I'm going to save the best story for the inevitable House hearing, but there are a bunch of situations in which locals and secretaries of state reported to us stuff from Democrats, which we then reported to the platforms: Democrats saying things that were wrong or misrepresenting processes and such. We have the ability to research this stuff and say, "Hey, we think this is true or not. We think this is violating or not." But in the end, these platforms are protected by the First Amendment, as well as Section 230, in making moderation decisions on their own. We have no power to enforce anything. So yeah, we showed up in the Twitter Files, and everybody wants to flog the idea that there's some kind of conspiracy here.

But if an academic institution using private funding wants to do academic research into disinformation online, I think that's an appropriate thing and something that is compatible with the best of the First Amendment. We have our opinion on this speech, and we will continue to publish it. And we will not be intimidated into not doing so in 2024 and beyond.

Evelyn Douek: Great. Yeah, you're doing something right when you don't want to give spoilers for your future congressional testimony.

Alex Stamos: Yeah, no, it's definitely coming, and that's going to be fine. I mean, we volunteered effectively to testify on this stuff before, and people weren't interested. So if Jim Jordan is the guy who wants to give us the ability to talk to Congress about what we found, and about the number of people in the United States who are, as I see it, very slowly poisoning democracy by telling their ideological supporters that voting doesn't matter, then we will go talk about that. And we will have, as is our right, lots of empirical evidence to back us up, which I think will be a lot of fun. Great opportunity to check a lot of stuff into the congressional record.

Evelyn Douek: Excellent. Somehow the Twitter Files-

Alex Stamos: The Twitter Files.

Evelyn Douek: ... got buried this week. They were nowhere near the top Twitter story; Musk was stepping on his own big releases. The big story from Twitter this week, I think, was the banning and subsequent unbanning, the Thursday night massacre as I saw you call it at some point, of a whole bunch of journalists who had been covering Elon Musk and his companies. It was purportedly on the basis of them doxxing him by covering stories about the @ElonJet account, which we mentioned jokingly last episode: how hilarious it is that Musk is taking down the @ElonJet account, which provides updates on the location of Musk's private plane, which by the way is based on public information. And this joke that I found slightly amusing actually turned out, I think, to be a real turning point in the Twitter saga.

And you know, autocrats first come for the free press. When all of these people were full of adulation for the First Amendment, they must not have read the free press clause, which is also in there. So I think this was really significant, and a significant step back from all of his free speech rhetoric.

Alex Stamos: Yeah, I mean, I think this is definitely the emperor-has-no-clothes moment for Musk, in that he isn't actually a free speech advocate. He wants Twitter to make decisions that directly correspond to what speech he wants to see and does not want to see, specifically for him, his friends, and people of his economic stratum. So first off, the tracking of private jets isn't just about rich people who fly on private jets. You have to be so rich for this to work that you own your own jet, right? For somebody who occasionally rents a NetJets plane or something, this kind of tracking doesn't work, because you end up with different tail numbers. This is only for people who own their own jets. In this case, these are jets owned by SpaceX, which he is a majority owner of. Okay, so a lot to unpack here.
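The tail-number point above is worth making concrete. ADS-B transponders broadcast a fixed 24-bit ICAO address tied to the airframe, so an owned jet is trivially trackable by filtering the public feed on that one address, while a charter customer rotates across airframes and addresses. The addresses, callsigns, and coordinates below are invented for illustration; real trackers consume live receiver-network feeds rather than a static list.

```python
# Sketch: why jet tracking only works well for owned airframes. Every
# ADS-B report carries the airframe's fixed 24-bit ICAO address, so one
# address = one jet's movement history. All values here are made up.

from dataclasses import dataclass

@dataclass
class AdsbReport:
    icao24: str    # 24-bit airframe address, as a hex string
    callsign: str
    lat: float
    lon: float

def track_airframe(feed, icao24):
    """Return every position report broadcast by a single airframe."""
    wanted = icao24.lower()
    return [r for r in feed if r.icao24.lower() == wanted]

feed = [
    AdsbReport("a1b2c3", "PRIV1", 33.94, -118.40),   # the tracked jet
    AdsbReport("d4e5f6", "UAL123", 40.64, -73.78),   # unrelated airliner
    AdsbReport("a1b2c3", "PRIV1", 34.20, -118.35),   # same jet, later
]
print(track_airframe(feed, "A1B2C3"))
```

A charter flyer would appear under a different `icao24` each trip, which is why this filter yields nothing useful against them; only a consistently owned airframe produces a coherent track.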

Once again, on a lot of these issues he has a good point, which is: how do you deal with doxxing on these platforms, with people providing information on someone's location, where they live, how to contact them? How do you deal with that and reduce risk while also allowing criticism and political speech? It is a huge problem for the companies trying to figure that out. And I believe that Twitter has not done enough around doxxing. As somebody who has been doxxed, whose kids have been doxxed, who has been sent pictures of my kids and gotten threats from ISIS and all kinds of fun stuff in my past, I would like to see these platforms have more aggressive doxxing policies. And you could see that if you thoughtfully went through doxxing policies, one of the things you would catch would be certain kinds of real-time tracking of people's location.

That being said, when you're tracking planes, and it's with public information that's being broadcast all over the country, then that's the kind of thing you might not catch. But I think there's a legitimate argument to have there. If you really cared about it, what you could do is kick off a process by which you come up with a new policy, come up with the interpretations of the policy, announce it, and then enforce it going forward. The way you don't do it is all of a sudden ban a bunch of people at night because you're pissed at them, and then come up with a story that turns out to be not true about this being related to a risk to his son. Apparently something scary happened to his son, but it was dozens of miles away from the closest airport in LA. It probably had to do with somebody knowing where his mom, the musician Grimes, lives, which is scary and is about doxxing, and I can understand him being scared, but it was just obviously a post hoc rationalization here.

And so he is clearly just kind of out of control on this. He says, "This is something I want gone," and then he asks the Twitter trust and safety team to come up with a post hoc rationalization. There are some people with legitimate LinkedIn histories who work at Twitter trust and safety, and they are burning down their reputations now. I actually feel really bad for those people, because the folks implementing these decisions have to go out there and defend it, defend it, defend it, and then he completely reverses himself. There's a very Trumpish kind of cycle here, and people are not going to be able to save their reputations doing this kind of stuff at Twitter. But yeah, he's kind of out of control, and doesn't know what he's doing, and is clearly not having a good time.

Evelyn Douek: He is [inaudible 00:23:58]. I completely agree that there is a temptation to overreact to all of Musk's points and dismiss them out of hand because it's Elon Musk, and he [inaudible 00:24:07], and is making no sense, and he is not making these choices for rational reasons. But doxxing is a legitimate issue. And you were commenting that there are certain laws against election misinformation when it's really blatant, about voting times and things like that. It turns out there are certain kinds of speech that are unprotected by the First Amendment. That's one of them: a kind of false speech that, by the way, can be banned consistent with the First Amendment. And doxxing is another one. There are certain laws against doxxing, and there are legitimate debates about their breadth, definitions, things like that. David Sacks pointed to the California version this week.

It's really clear that the reporters reporting on @ElonJet are nowhere near this definition of doxxing. But as you say, that doesn't mean that there isn't an underlying issue that needs to be discussed. And this policy that they released was ridiculous. It was clearly not thought through. It was so broad that it would catch me saying, "Hey, I'm at Stanford today talking about my paper, blah, blah, blah, come down and check it out," which would be really sad for me because those tweets normally get the hordes storming the building to come see my talks.

Alex Stamos: Right. During this time, Musk was posting about how he was at the World Cup. He was posting pictures from which you could figure out exactly where he was sitting, which it turns out was with Jared Kushner and a bunch of, perhaps, Emiratis. So he had good seats, as you might expect, at the World Cup final. These policies are quite hard, but if you are going to strengthen the policies, one of the accounts that would really get caught up in this is Libs of TikTok, which drives a huge amount of real-world abuse at people they disagree with. Clearly any expanded doxxing policy would catch that account. Instead, he is positively interacting with that account and talking about it, and has removed whatever small limits Twitter placed on it in the past. So yeah, it's clear there's no consistency here. There's just his fanboys who are out there trying to come up with these rationalizations.

Evelyn Douek: And we've talked before about other legal issues, and one of the jurisdictions that might rein him in here is Europe. It turns out that a lot of this behavior, this completely ridiculous post hoc rationalization, would be unacceptable once the Digital Services Act comes into force very soon. And Twitter could be subject to huge fines. Article 15 of the DSA requires a statement of reasons providing a clear and specific statement of the reasons for that decision, which basically no one was provided this week as this whack-a-mole was going on. So that's something to watch. The other potential policy that could have gotten him in trouble with the EU, we had an EU policy expert, a competition policy expert, lined up to comment on this, but then we let them off the hook because this policy was reversed.

But briefly, Musk was banning links to all, or a bunch of, other social media platforms. Interestingly, not all other social media platforms. It was a weird selection of social media platforms, which I'm curious for your thoughts on. Notably, TikTok, which we were talking about when we started the show, was not on it. Facebook, Instagram, Mastodon, [inaudible 00:26:56], just a random handful there. Yeah. What are your thoughts on this, Alex?

Alex Stamos: So I mean, on TikTok, Twitter is clearly downstream of TikTok from a youth culture perspective. There are lots of tweets that get a lot of play on Twitter that are really just reposted TikTok videos. So they didn't want to take that down, because they get a lot of engagement from exposing TikTok to a broader audience of middle-aged normies who otherwise wouldn't find it on TikTok. But overall, yeah, this was, as I tweeted, an incredible demonstration of weakness, right? Musk himself had a tweet a while ago referencing the Berlin Wall, about how economic systems that keep people in are not a demonstration of strength. And I agree, right? This is effectively a Berlin Wall for Twitter to keep people from leaving. And it's not a demonstration of strength. It's a demonstration of weakness.

Evelyn Douek: Speaking of the Berlin Wall, points to Mastodon, whose official account tweeted, "As a company from Eastern Germany, we know that building a wall to try and keep people from leaving isn't a good idea." So just some social media manager, there's-

Alex Stamos: Moral authority behind that one. Yeah.

Evelyn Douek: Exactly. Yeah. And so this ended up being extremely controversial, and had quite a large fallout, obviously, as you would expect in the Valley. Do you want to talk about that?

Alex Stamos: Yeah, so what happened is they just started randomly shutting down people who were linking to different Mastodon instances. And one of the people who was shut down was Paul Graham. He's the founder of Y Combinator, a very well-respected venture capitalist here in Silicon Valley, and somebody who has supported Musk. He has a bunch of tweets kind of saying, "Musk has built cars and gone to space. How can you question him, all these losers questioning him." And then he started turning a little bit against Musk, and then said, "Well, this is the final straw. I'll be on Mastodon," when Musk started banning journalists and announced this policy about linking out. And then his account was taken down, which created a-

Evelyn Douek: Can you explain for people outside of this world why that's significant, or how significant that is? Because I saw a lot of tech people being like, "Whoa, Paul Graham." Why is it such a big deal?

Alex Stamos: Yeah, it's a big deal because it has been trendy since Musk's takeover to be on Team Musk, right? There's a whole kind of culture of John Galt builders. "We go it alone. You are a loser unless you've built something." And there is some accuracy to that attitude. I certainly have felt in the past, when I read something in the New York Times that criticizes tech companies, on both sides of it, that it's easy to be a critic. It's harder to actually do things, because you have to make real decisions and be responsible for real decisions. But no matter what, a critic can always find something to criticize. So I understand that impulse, but that impulse has driven them to backing Musk, right? They've totally pivoted from "Journalists are sometimes unfair about tech" to "Musk is our new God king, and is setting a new stage for Silicon Valley," and it's become very trendy to back him.

And Graham was one of the most prominent people doing that, a thoughtful guy who's done lots of investment, who's not an edgelord, who is not a white nationalist, but was supporting Musk. And the fact that he was banned, and that he still hasn't tweeted even though his account's been turned back on, and it looks like he's permanently moved to Mastodon, is a big deal. It was a big signaling effect for something I've been talking about for a while now: I think the bubble has popped on Musk. And now we're going to have a deflation, where people are going to pretend that they never supported him so aggressively. They're going to kind of whitewash their history, just like Sequoia has tried to do with FTX and such.

Evelyn Douek: Okay, so in keeping with that, we are recording at 11:00 AM Pacific Time. As far as we know at the moment, Musk is still CEO of Twitter. I don't know if that's because he hasn't yet woken up and seen the results of the poll that he put up last night asking Twitter users to vote on whether he should step down as CEO of Twitter, a poll he promised to abide by the results of. Now, as with all of these polls, I expect it to be a very legitimate democratic process whose results he is taking very seriously [inaudible 00:30:56]. What prompted this sudden turnaround, Alex? Who could say?

Alex Stamos: Well, I mean, it's pretty clear that his weakness in this whole thing has been capitalism. Going and buying Twitter, and selling a bunch of Tesla stock to do so, and then running it as his own personal fiefdom, destroying his personal brand, and therefore destroying the brand of Tesla, was going to be his weakness. A number of us have been saying this the entire time, and I think that finally came home to roost. Year to date, Tesla has lost over 60% of its stock value, and in the last week, that drumbeat became very loud. The third largest individual owner of Tesla stock, so the biggest owners are Vanguard and Fidelity and such, but the third largest individual owner of Tesla stock, personally said that the board needs to do something and possibly fire Musk. There was somebody, another investor, who put in to run for the board.

So we were starting to see the investment community turn against him and say that he needs to focus on Tesla and stop destroying the brand of Tesla. And it is interesting. He flew to Qatar. He met with Jared Kushner, and then possibly some of his investors in Tesla and elsewhere, who are probably not so super happy about their sovereign wealth funds having all their money burned in this little experiment, and right afterwards he suddenly decides that he's not having fun doing this anymore. So I do think it is the threat to his CEO position at Tesla that caused this. And after he announced this, Tesla stock was actually up in pre-market trading. Now it's about even. Just him having the poll has stopped the downward slide of Tesla. And so we'll see what happens, whether he follows it or not. Clearly, finding a CEO is going to be incredibly hard.

Who would possibly want ... He's still going to own Twitter. Whoever the CEO is, they're going to get random texts in the middle of the night that Musk wants certain content taken down or put back up. And he has destroyed the economics of Twitter, right? Twitter used to make five billion a year. They're probably down to one, one and a half in revenue. In an alternate universe, you could have had Twitter do 15, 20, 30% staff cuts thoughtfully, based upon performance ratings, and become profitable, survived the macro conditions, and still been a going concern. But he has cut 70 to 80% of the staff in a way that drove away all the best engineers and product managers and such. And so it is a nightmare of a company for somebody to take over now. So I don't know if he's going to be able to get out, 'cause I'm not sure anybody qualified will take the CEO role.

Evelyn Douek: Yeah, it's definitely a poisoned chalice. I think there's a big question of whether a new CEO could turn it around at this point. I think this week was a real turning point. I think a lot of people just quit. The Thursday night massacre saw a massive exodus, which may have prompted the link ban policy. I think both you and I decided we were pretty much vacating our Twitter handles at this point and moving to Mastodon, which I'm doing reluctantly. So I don't know if a new CEO could turn it around quickly enough. There are still a lot of people that haven't jumped ship, but that's because the friction and the startup costs of a new social media platform are just too much, not because they're so invested in Twitter. So it's possible. Would you be optimistic? I mean, there's the economic side, but there's also the user experience side, and Twitter is becoming a bit of a wasteland of content now.

Alex Stamos: It is. So it's becoming a wasteland, and the spam and abuse is becoming a huge problem, as you and I both predicted. If you get rid of most of your engineers, you're not going to be able to fight those things. In fact, today our team at Stanford Internet Observatory, led by David Thiel, published our analysis of Chinese spam. There were these huge amounts of spam during the protests in Chinese cities. And it looks like this is less about a coordinated effort by the Chinese Communist Party and more that the anti-spam capabilities at Twitter, especially outside of English, have collapsed, such that Twitter is basically becoming useless for certain hashtags: if a hashtag is at all popular, the spammers can run rampant for weeks and weeks and weeks. So if you go to [inaudible 00:34:53] you can read that report with some of the numbers behind it.

But yeah, Twitter is becoming a bit of a wasteland. Anything I post now, I get some death threats and people saying all kinds of nasty stuff, and then if I put something on Mastodon, I get intelligent conversation, right? And there's some abuse on Mastodon. I think there will be much more as it grows. But the curve is pretty amazing. There have been 300,000 Mastodon accounts created in the last week, 76,000 created just yesterday with all this stuff going on. So certainly there's an exodus, and it is not just random people. It is your elite tweeters, the journalists, the academics, the politicians. What made Twitter interesting was that you could have kind of normal folks rubbing elbows intellectually with people like that, the folks who run the world. And you have really high-profile sports figures, and activists, and people who are normally not super political, like actors and stuff, making the jump.

George Takei is now the biggest, the most followed Mastodon account. So that's kind of a nature-is-healing thing. So yeah, this might be an irreversible decline for Twitter. I think the longer Musk owns it, the worse it's going to get for Twitter. Probably the best thing he could do would be to allow the banks to take it over. Twitter is not going to be able to make its payments without Musk subsidizing it. And so one option would be some kind of controlled bankruptcy where the banks take it over and he steps aside. And then you're going to see maybe a private equity firm buy it for two, three, four billion dollars and then try to rescue it, which will be incredibly difficult, but it'll be easier to rescue it with Musk completely out of the picture than in a situation where, while he's not running it day-to-day, he's continuing to meddle all the time.

Evelyn Douek: So Mastodon, come join us. And then I look forward to the future episodes of the show where, instead of having a Twitter Musk segment, we'll need a sound effect for our Mastodon content moderation dumpster fire segment.

Alex Stamos: It needs to be like the trombone, but it should sound like an elephant, right? So we'll have to go find that.

Evelyn Douek: We'll do some research. If you have such a sound just sitting in your back pocket, please send it in. Sports segment, Alex. It's a real shame that there were no major and exciting sporting events for our sports segment to cover. I just can't think what it might be.

Alex Stamos: Well, yeah, so it does tie into one of the people we're watching. Elon Musk was at the final, perhaps the most exciting and thrilling World Cup final in a very, very long time, if ever. Argentina defeated France in penalty kicks. I hate penalty kicks. It is the worst way to end any kind of ... you know what? My dad actually had a good idea here. My dad was a soccer ref for a long time, and reffed at very elite levels. His idea: you play 10-minute extra periods, and every 10 minutes you remove a player, until eventually you've just got one offensive guy and one goalie on each side.

Evelyn Douek: Messi against Mbappe-

Alex Stamos: Yeah.

Evelyn Douek: ... just running up and down the field-

Alex Stamos: Oh my God. Wouldn't you love that?

Evelyn Douek: They just collapsed in exhaustion. What a great game. 

Alex Stamos: Yeah, it would. Actually, I mean, it'd be way better than penalty kicks, right? But yes.

Evelyn Douek: It's true. 

Alex Stamos: A totally thrilling-

Evelyn Douek: Hunger Games World Cup final. I love it.

Alex Stamos: Argentina was leading two-nil until the last 15 minutes, and then two scores from France out of nowhere. I liked watching Emmanuel Macron. He was there, and you could see that as it got crazier and crazier, he was just clapping in his suit, and then his jacket was off, his tie was off. It was like he was losing control. I expected them to cut to him and he was going to have the blue and red face paint. Anyway, it was a great game, incredible. And we're going to have to find another sporting event for us to cover.

Evelyn Douek: Yeah, for anyone newly thinking about taking up soccer fandom because it looks like a great game: it's always like that. It is always that exciting.

Alex Stamos: Oh yeah, sure.

Evelyn Douek: Definitely it is. It's a great sport. Join us. And so, with that, this has been your Moderated Content weekly update. This show is available in all the usual places, including Apple Podcasts and Spotify. Show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't have been possible without the research and editorial assistance of John Perino, policy analyst at the Stanford Internet Observatory. It is produced by Brian Pelletier. Special thanks also to Alyssa Ashdown, Justin Fu and Rob Huffman. And if you happen to celebrate a holiday with a gift-giving tradition, consider giving the gift of an excellent podcast recommendation to your loved ones, or a podcast rating and review to your favorite podcast host, whoever they may be. Happy holidays. See you next week. And we will see if Twitter still exists or has a CEO.