Alex and Evelyn discuss what was a ridiculous week for Twitter, even by its new standards: Germany is threatening to fine it over poor content moderation of illegal content; it made a hash of state-affiliated media labelling; and got into a fight with Substack. Meanwhile, India continues its forward march in cracking down on the internet; everything is a content moderation story, including the recently leaked US intelligence documents; and Arkansas is the latest state to join the "won't you think of the children" bandwagon.
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Alex Stamos:
So how was your first week teaching, Evelyn, as a Stanford professor, as a real professor?
Evelyn Douek:
Well, delighted to report that one week later and no one has dropped the course, so I think that that's a good sign.
Welcome to Moderated Content's weekly news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. Alex, this week was a ridiculous week for Twitter, even by Twitter standards, so I'd say we should double sad trombone. But I worry that as this gets stupider and stupider, it's going to lead us down a slippery slope, where in future episodes we're just going to have 20 minutes of sad tromboning before we can even start. But I think we should really enjoy this sad trombone because it is well deserved this week.
Alex Stamos:
It is very sad this week, the trombone.
Evelyn Douek:
Right. That trombone needs to go to therapy. So in the Twitter corner, Musk is still CEO doing really important work, like last night painting over the W on the Twitter sign outside San Francisco headquarters in a style of comedic genius that I haven't seen since my co-ed primary school days back in Randwick Public School.
Alex Stamos:
I am literally facepalming.
Evelyn Douek:
He is. Listener, I can tell you it was repeated facepalming.
Alex Stamos:
I'm just, I am slapping myself that he has renamed the company. I'm going to say it. He got rid of the W just for a joke.
Evelyn Douek:
It's amazing. It can always get stupider, listener, it can always get stupider. Meanwhile, actual stuff is happening at the company, it turns out. So basically, everyone that is involved in making sure Twitter isn't in the FTC's crosshairs resigns sooner or later. And this week, a senior lawyer on the team involved in Twitter's negotiations with the Federal Trade Commission resigned. And it's unclear if anyone at Twitter is picking up the phone from other regulators at all.
So in sort of relatively big news, Germany's Federal Office of Justice has launched proceedings against Twitter, claiming that the company has failed to adequately deal with illegal content on its service. This is an enforcement action under the country's Network Enforcement Act or NetzDG, or Alex, can you say the German version of this at all?
Alex Stamos:
Netzwerkdurchsetzungsgesetz.
Evelyn Douek:
That's right.
Alex Stamos:
Netzwerkdurchsetzungsgesetz. There you go. That's how I pronounce it.
Evelyn Douek:
So it's a wonderful German word, but it would sound like I needed a [German 00:02:21] afterwards, so I'm not going to even try. But this is-
Alex Stamos:
Amazing. It's an amazing word. I've heard Germans say it and you're like, "Wow."
Evelyn Douek:
Yeah, that sounds scary.
Alex Stamos:
Yeah.
Evelyn Douek:
So the NetzDG, as I say it, so this is a law that was passed in 2017 and it requires companies to take down manifestly illegal content within 24 hours. And obviously, what's illegal in Germany is broader than what's illegal in the United States and includes things like hate speech, abuse and threats, and Holocaust denial. And so the BfJ, as the justice office is known, has said that there are significant indications that Twitter is no longer dealing with complaints about illegal content on its service adequately, and that over a four-month period, in relation to one individual, similar unjustified, defamatory statements of opinion have been made against this person and been flagged, and the company isn't taking them down. And someone posted about this on Twitter and Musk replied, "First I've heard about this. Did they identify what content they're concerned about?"
I think this press release might just be Germany trying to get Musk to answer the phone, threatening fines of up to 50 million euros. They're not about individual pieces of content. They're supposed to be about systemic failures to do content moderation in a meaningful way. I think this is really interesting actually. These would be the first fines for content moderation failures under this act, even though it's been around for over five years now. There have been a couple of fines about transparency reporting and things like that. So I'm curious to see what actually happens here, whether there actually is evidence that Twitter is worse than other platforms or has been getting worse in Germany. I don't know.
I mean, sure, there's sort of some reason to think that it might be, but it's also fair to say that there are many platforms where potentially defamatory statements might have been posted against a single user over four months, and not all of them were taken down. So we need to see what happens here.
Alex Stamos:
I mean, I think this does demonstrate why we need good quantitative measurement, as has been talked about with the DSA, and which exists in some situations. Specifically in Europe, there's a group that does measurement of pro-terrorist content where they actually create accounts, post standardized content, and see what the reaction time looks like for the companies. But this is just so broad, it's very hard to make a judgment. Although just based upon the chaos at Twitter, it is not surprising that perhaps operationally, their ability to take lawful requests from the German government has gone down.
Evelyn Douek:
Yeah. I mean, one of the things that makes me sad about the NetzDG in general is that there isn't a lot of quantitative analysis of its effects. This law was passed, it was going to break the internet. Clearly, the internet hasn't broken, but we also don't really understand its effects. We do know that a lot of companies hired a lot more German content moderators as a result of the law, so it probably had some effect, but it would be good to see some more detailed actual analysis, although it's obviously a very difficult question. So yeah, curious to see what happens with this.
Alex Stamos:
Yeah. There are a lot of anecdotes of comedians having their comedy taken down and stuff. So there are negative anecdotes, but there is no consistent data being gathered here, for sure.
Evelyn Douek:
Right. Musk probably didn't know about the German enforcement action because it's been a busy week for him over here making and reversing policies. So first, he chucked a tantrum about NPR and gave the public broadcaster a state-affiliated media label. And this is amazing because NPR was literally previously an example of what would not be state-affiliated media under Twitter's state-affiliated media policy.
Alex Stamos:
And what is still their policy, right?
Evelyn Douek:
Right.
Alex Stamos:
The policy itself didn't change. He just decided, apparently, that there was an NPR story he didn't like. And so now they're state-affiliated media.
Evelyn Douek:
Right. They did get rid of the wording that said NPR would be an example of what does not fall under this policy, but the policy's still the same. And there was reporting this week about how Musk didn't really seem to understand the difference between public media and state-controlled media. And after being informed that NPR only gets about 1% of its funding from the government, Musk said something like, "Oh, it sounds like this label might not be accurate here," and then flipped around. And now NPR has a label saying that it is government-funded media. Any other companies that might be government funded that might deserve this label, Alex?
Alex Stamos:
Yeah. So a number of people have pointed out that Musk's companies, in particular Tesla and SpaceX, absolutely only exist because of US government funding, Tesla with somewhat indirect, well, pretty direct I guess, subsidies in the form of tax credits for people who buy cars and solar panels and such. I think those tax credits are great. If we want to move the country to a post-carbon economy, we're going to need tax credits to motivate it. But American industrial policy has been very good for Tesla. And then SpaceX is straight up, most of their money is just a check from the US government for services provided to NASA, as well as things like the almost a billion dollars in rural broadband money they have to subsidize the creation of Starlink.
The idea that the 1% of NPR's funding that comes directly from the federal government makes it state-sponsored, when a huge amount of Musk's fortune really only exists because the US government has given him funding, is amazing. And this also goes back, people need to remember, the reason these state media labels were created is somewhat around Russia Today and such. But the truth is, most people, well, I guess not Matt Gaetz, don't know that the Global Times is the official outlet of the Chinese Communist Party, but a lot of people understand that Globaltimes.cn or Russia Today are state media.
What happened in the 2017, 2018 timeframe, when there was a crackdown by the companies on covert propaganda, is you had a move towards this kind of gray propaganda, with the creation of subsidiaries of subsidiaries of subsidiaries, where they might pay somebody who is an American face in America, a Spanish-speaking face in a Spanish-speaking country, an African face in Sub-Saharan Africa. And those people would be working through five or six different corporate shell companies back for Russia Today, or CGTN, or some other state media outlet. And so the creation of this label is less about letting people know that these official outlets are official outlets and more about labeling the things whose origin was intentionally obfuscated. Everybody knows who NPR is. It is clear that they have editorial independence, just like the BBC, in that both NPR and the BBC crap on the American administration all the time in a way you will not see from Russia Today against Putin, or the Global Times or CGTN against Xi.
And so the companies had to come up with these somewhat nuanced policies, because it is clear that the BBC, the ABC in Australia, all of these public broadcasters get a small amount of money, perhaps in some cases it's larger, but still have editorial independence. And they had to differentiate that from Maffick Media, which is one of these subsidiaries, The Red Zone being a show that nobody knew was actually being paid for by the Russian government. So yes, he clearly does not understand the history here and why these labels exist, because he's also taken some actions around Russia and China that demonstrate that he doesn't really care about their propaganda output.
Evelyn Douek:
So there's reporting from Semafor that picked up on this. Twitter is still labeling RT and Global Times as state-affiliated media, but one of the things that it used to do was, if a user tweeted out a link to stories from these outlets, there would be a little notice that said, "Stay informed," telling you the source of this information. And in tests conducted by Semafor, they found that Twitter was no longer doing this. And these are measures that are in the best tradition of free speech.
The idea is we're not going to censor this content. People have a right to see what these outlets are saying and what they're reporting, but let's have more transparency around the source of this content so people can make an informed, understanding choice of how to intake this information. And it's replicated at the federal level with the Foreign Agents Registration Act, FARA, that the idea is we don't restrict this stuff, but we do make sure that people understand where it's coming from.
Alex Stamos:
Well, okay. So Twitter and Facebook did something that some people would say was unfair, in that they made decisions around Chinese and Russian state media that they perhaps didn't make elsewhere. They did so for very good reasons. Russian state media in the invasion of Ukraine has been massively... has been calling for, effectively, war crimes and covering up for war crimes. Russian state media talks about all of the poor orphans that need to be adopted by Russia and how it's a good thing that these children are being stolen from Ukraine and moved into Russia and trained to be Russian, as part of what has led to a war crimes indictment against Putin as well as his minister who oversees this.
Twitter does not... They used to say, "Okay, well, we're going to let this stuff exist out of the kind of free expression beliefs of Americans, but we are not going to let people just amplify it openly. And we're going to tag these things, because you are supporting genocide. You're supporting genocide in Ukraine, you're supporting genocide in Xinjiang when you talk about the Chinese outlets." That was a pretty reasonable position to have. And Musk's little temper tantrum here is kind of shocking. I mean, is a little bit shocking. Just considering his personal history and the fact that he comes from South Africa, you'd probably want to be cleaner than clean about where a lot of his money comes from, with things like slave labor in Western China and the kidnapping of children, and the fact that he's getting billions of dollars a year from the US government.
I mean, just the fact that he's effectively picking the Chinese and Russian side here is kind of amazing for a man who has made his entire fortune in the United States and who owes the United States a huge amount. If I was at SpaceX right now, I'd be really worried. I'd be really worried about having him affiliated with a company that is a massive US government contractor, because he is slowly making himself radioactive, going from one of the most celebrated American entrepreneurs to being seen as somebody who intentionally decides to give aid and comfort to not just America's adversaries, but media outlets that are helping support warlords.
Evelyn Douek:
It's also this question of how these policies and these labels came from years of debate, of wrestling with how best to adopt these policies, what nuance to apply, and where exactly to draw the line in this gray area: how much editorial independence do you need to show, and what evidence do you need to have? And there were all of these controversies around it, but they were sort of... And they're hard calls, but they were well thought out. These companies were wrestling with them, and there was this kind of explanation, whereas Musk is just tweeting through it and making up his mind on the go and then reversing it based on whoever tweeted at him the loudest and has the most followers. And so whatever you think of the substantive decisions, this process of just total whiplash and not even knowing what's going to happen tomorrow, I think is, yeah, it's very problematic.
Alex Stamos:
Well, and even his MAGA supporters have seen this, because of this next story that we're about to talk about. You're starting to see people turn to the idea that, oh, maybe having a council of people whose opinions you ask for, and who feel that they're able to give their opinions freely without being fired, is a smart thing.
Evelyn Douek:
Yes. This is the, I never thought the leopards would eat my face portion of the Twitter corner this week. Yeah, exactly.
Alex Stamos:
Oh, my face.
Evelyn Douek:
Do you have a leopards eating faces sound effect, Alex?
Alex Stamos:
We're going to have to work on that.
Evelyn Douek:
Yeah, we'll probably need it again, so we'll dig one up.
Alex Stamos:
Okay. We need the sex panther monologue from Anchorman, I think is what we'll use for that.
Evelyn Douek:
Perfect. Okay. So Musk got into a fight with Substack this week. Substack announced a new product, the Substack Notes feature, and the screenshots that it showed of this new product look a lot like Twitter. It has a lot of Twitter-like elements. It is clearly intended in some way to mimic Twitter. So this set off a cascading series of events. So first, Twitter started blocking Substack authors from embedding tweets into their stories. Then it started to block engagement on tweets containing links to Substack on Twitter. So users weren't able to like or retweet them, but you could quote them. And then it started marking links to Substack as unsafe. So if you clicked on a Substack link on Twitter, this big screen popped up saying, "The link you're trying to access has been identified as potentially spammy or unsafe."
And if you searched for Substack in the search bar, you got generic results for the word newsletter. And so Musk, I mean this is clearly, this is not the first time that this has happened. Remember, we talked about this in December when Musk cracked down on links to Instagram, Mastodon and Facebook as people were fleeing the site to those platforms. This has subsequently been reversed. Musk claimed that the links were never fully blocked, which is true, but they were blocked in a way that made engagement with and spreading of them entirely impractical. And he alleged that the reason why this was happening was because Substack was trying to download a massive portion of the Twitter database to bootstrap their Twitter clone, so the IP was obviously untrusted. Thoughts on this one?
Alex Stamos:
Okay, lots of thoughts. So some of the context here for me personally is that there have been a handful of very grifty, dishonest Substackers who have been attacking our group at Stanford and the work we've done about both elections and vaccines. I've tried not to talk about it on the podcast because I've decided that I'm not going to let grifter Substackers control my life. But in this case, there is an intersection here with a real issue. And so one of those grifter Substackers is Matt Taibbi, who I'm not a big fan of, especially since he said things under oath in Congress that are not true about a project that I ran, and has written all kinds of stuff about us that is not actually his own ideas. He's got a guy who he's trying to amplify, who I'm not even going to mention, who is being paid by a Super PAC. I think there'll be more reporting coming out about that in the future.
But anyway, Taibbi's amplified lies about us, and this week faced very aggressive questioning by Mehdi Hasan on MSNBC. And then there were some really good stories written after that in both Techdirt and emptywheel.net. We can link to both of those, but a lot of good pushback on the factual errors that Taibbi's made in his reporting. And then the next day, this whole Substack thing happens, and there's a possibility that there's some link here, in that the Twitter Files that Musk has supported, that he has pushed for, have been a big deal. And Taibbi on MSNBC had the chance to... The whole MSNBC interview happened because Mehdi Hasan criticized Taibbi for not caring about censorship in India. And Taibbi basically invited himself onto Hasan's show to talk about it. And then when confronted with it, had no opinion on Modi and Indian censorship, and then explicitly refused to criticize Elon Musk, said, "I don't feel like doing it."
Then the next day, Musk starts doing this crazy stuff. He starts with Substack, but then it becomes personally about Taibbi, because Taibbi starts writing about Twitter and Substack, like, "I can't believe this is happening." He wasn't even that aggressive about Musk. And Musk immediately does this response saying, "Taibbi is a Substack employee," which is not true. He talks about the scraping, but Substack does not have only one IP address. I think he's completely misinformed on Substack "scraping Twitter." I don't think there's any good evidence of that. And certainly, nobody else has demonstrated any evidence of that. And then he specifically targets Taibbi for what you can only call a shadow-ban.
The great irony here, just so everybody understands, is that a lot of the Twitter Files is complaining about the decisions that Twitter made: that when Twitter wanted to carry controversial speech, or speech that Twitter did not want to endorse and amplify, they had come up with ways to reduce the spread of those things or to add contextual labels. All of those different things that they invented have been called shadow-bans and they've been called censorship. The claim was that if Twitter did any of these things to reduce the spread of ideas that they themselves had a First Amendment right to disagree with, that was censorship, and perhaps state-sponsored censorship, based upon a number of misunderstandings/lies about the relationship between the US government and Twitter.
Musk used all of those tools against Taibbi, to the point where if you searched for Taibbi's name, none of his tweets came up. Only other people mentioning him, but none of his actual tweets. That's one of the most aggressive ways that you can isolate an account. And so yes, it's just an amazing leopards-eating-faces moment, in that this group of Substackers make all of their money on Substack and use Twitter as a place where they get access to these documents, which have given them hundreds of thousands of new followers and tens of thousands of Substack subscribers. These Substackers, some of them are making six figures a month. They're probably making seven figures a year, millions of dollars a year, on their Substack subscriptions, and they finish all of their articles with, please join, please like and subscribe so that we can do this work.
They make all that money on Substack, but their relationship with Musk was critical because it was effectively a subsidy from Musk, of Musk giving them a selection of documents to support these articles. That relationship has very much broken down, both with Taibbi and then with Bari Weiss, who very gently criticized Musk for something and then immediately was totally cut off. So there's only really one person, and he's the worst of this group, who still has access and who has decided, even though he is a Substacker, to not criticize Musk. So clearly this is not so much about free speech, but more about, can you find captured Substackers who will support Musk's political point of view, who will support him in every way, who will never, ever criticize him? If so, then you can have access to a certain subset. And I would not call that journalism.
So Taibbi is still operating on Substack, it seems, and he has not said anything on Twitter. And so it'll be interesting. I would love to see if somebody wanted to leak the emails or the text messages going on the backend here because a bunch of the MAGA, conservative, neo-reactionary folks who supported Musk are starting to criticize him a little bit because he is cutting off these people and he is creating this problem.
The other interesting thing here is that Twitter had a product to try to compete against Substack, and Musk canceled it and then fired the team. Yes, it is bad for him that Notes is being created, but maybe 10 people saw Substack's announcement that this thing was going to exist. And it is now front-page news that Substack's going to compete against Twitter. So one, he has-
Evelyn Douek:
Joined the resistance. Yeah, exactly.
Alex Stamos:
Right. He's created this huge Streisand effect where he's massively advertised for Substack. But the other thing he has done is demonstrate that you cannot build an economic model on Twitter anymore, because your entire life will fall apart if Musk personally dislikes you. So just a completely nutty week for the future of Twitter financially. I think again, we keep on talking about the beginning of the end, but we're definitely at the end of the beginning of the beginning of the end here, because I think this is going to turn out to be a huge-
Evelyn Douek:
This time, we mean it. This time for sure, it's really, really the end of the beginning of the end of the beginning. The shadow-ban was delicious. I mean, the idea that you search Taibbi on Twitter and can't find him, I mean, it's just the most poetic ending to that particular story. It reminded me of people being disappeared, like Trotsky being erased from the photos. It's just the free speech people, everybody. And in a nice segue, one of the things that Mehdi Hasan was criticizing Taibbi about, as you mentioned, was Musk's position on India and his increasing bowing to the Indian government's demands for censorship. And Taibbi refused to comment. So in an ongoing trend, going over to India now so that we keep tabs on this story, because, as we always say, this is one of the most important stories for the future of free expression online.
So, this week the Indian government added to its arsenal for cracking down on online speech. The Ministry of Electronics and IT made amendments to the information technology rules, which created a new procedure where any government-related content that is fact checked as false or misleading by the government's own fact check unit needs to be taken down by intermediaries, which includes platforms and ISPs, or they lose their safe harbor immunity. So I mean, sadly, this isn't that shocking. This is just sort of increasing the powers that it already had. They already had powers to order take-downs, but this is sort of a shortcut around some of the procedural requirements that they had to comply with before.
It's also true that intermediaries already had an obligation not to carry misleading information or misinformation before this. But the thing about this amendment is that it means that the government can directly say what it classes as false and misleading, which is not so great, not so great. So another sad day in India.
Alex Stamos:
Yeah, big deal. And it does demonstrate, when you have a parliamentary system and you control it, and you don't have a written constitution with lots of protections for freedom of speech, that countries like this can move so much more quickly than the US or the huge, ponderous EU bureaucracy. And it makes India, just based upon their size, but also the speed at which they have the ability to pass laws and to change their interpretations, incredibly powerful.
Evelyn Douek:
Yeah. And I mean, it's not just India. We've seen that in Australia as well, where they've passed tech legislation overnight or in a couple of days, and it's never good. It's never a great way of making policy, even if it is good to be able to make policy. So somewhere in the middle, somewhere in the middle would be nice.
Alex Stamos:
Right. Somewhere between Congress, like us, possibly having the entire economic system fall down because we can't just reauthorize issuing Treasury bills, and censoring the entire internet in a couple of days. Maybe there's a middle ground here of what a functional democracy looks like.
Evelyn Douek:
Surely not. Okay, so one of the big stories this week was the leak of intelligence documents from the US intelligence community. And because everything is a content moderation problem, this obviously has a platform link and tie in. This is a pretty crazy story. So Alex, do you want to walk us through it?
Alex Stamos:
This is a crazy story, so we can link to it. There's a very good explainer on bellingcat.com where Bellingcat tries to do some of the history here of where these documents come from. So if you look, the documents are photos taken of paper documents such as maps of the frontline, intelligence reports from inside Russia, battle damage reports, some very sensitive stuff marked TS/SCI, and a bunch of specially compartmented information that really demonstrates that the United States is both very much involved in the Ukraine-Russia conflict, which is not shocking, but has also very deeply penetrated Russian intelligence. So the release of these documents is a pretty big deal. It also demonstrated the US spying on allies, which is not a huge surprise. We've known that for quite a while and those allies often spy back. But whenever that comes out, it's not a great thing for the intelligence community.
Bellingcat tries to figure out where these come from, because there's been a bunch: there's been some posted to 4chan, and a lot of them are on Telegram. Telegram is the most important platform for the Russia-Ukraine conflict because Ukrainians and Russians are both using it all the time to talk amongst themselves as well as to slam each other. Eventually, the book should be written on Telegram during this war, because it is kind of an amazing battlefield of its own. But they trace it back to a number of Discord servers, including what might be one called Thug Shaker Central. Now, when you say a Discord server, it's not really... That's kind of logically what these look like: you can create a Discord community, but all of these servers exist in Discord's cloud. So this is not a situation in which you're actually running software separately.
Discord has all the logs of this. They should have the history, and it looks like this relatively small 20-to-30-person Discord server, focused on Orthodox Christianity, gaming, YouTube and some other things, might be the first place these documents were posted. So it looks like this is not some kind of big intelligence project. This is not a Snowden-like leaker who has some kind of big moral crusade. This might have been somebody who was just trying to prove himself right and nerd sniping, kind of showing off, I have access to these documents, and then trying to convince other people that he is right and they're wrong. Effectively, somebody else is wrong on the internet seems to have been the big motivator here. Just kind of amazing, and a real reflection of what these intelligence problems look like in the 21st century.
Evelyn Douek:
Yeah. It's always sillier than you think is basically the upshot of this one.
Alex Stamos:
Yeah. Well, and there is some precedent for this, in that there have been a number of cases of people leaking classified information, not just from the United States, but at least in one case from the Chinese military, of people who play video games, realistic military simulations, where they're saying, "Oh, this tank needs to have better armor." And the developers are like, "What do you mean? We're trying to do the best we can from things like Wikipedia." And they're like, "Well, here's the classified document saying how well the armor works," or in one case, the performance of certain artillery weapons in the Chinese military, where you apparently have members of the People's Liberation Army who love this game so much that they're willing to leak classified documents.
It's a very interesting issue in that we've moved on from the Snowdens, from the Cold War era of people being paid off by the Russians, of the Hanssens and such, to a world where people, just because they want invisible internet points on Discord, are willing to commit pretty serious felonies. And one of the things they said was, "Thoughts and prayers to the Discord law enforcement team," because this is probably their first real run-in with the national security establishment, and this is going to be the number one counterintelligence investigation at the FBI. And the fact that a huge amount of the evidence is on Discord means that they are getting a lot of FAA 702 requests right now and are going to get picked apart pretty aggressively.
So I mean, Discord did nothing wrong here, but it is interesting that the fact that they have so many young people, and those young people apparently have clearances and either work for the IC or work for the military, is going to make Discord a really interesting target for this kind of stuff.
Evelyn Douek:
Yeah, it's going to be a great movie one day, exactly how this happened. And keeping tabs on our won't-someone-think-of-the-children watch in this country, heading over to Arkansas. So Arkansas passed the Social Media Safety Act this week, which will require platforms to verify the age of people during signup and restrict minors from creating profiles without parental consent. Again, this is people under 18. It hasn't been signed into law, but the governor, Sarah Huckabee Sanders, has previously expressed support for this bill. So it's most likely going to be signed into law very soon, and companies will face fines of up to $2,500 for each violation.
The bill sponsor, Senator Dees, assures us, "This is not a social media ban, this is not a First Amendment issue. We are not censoring any sort of free speech at all." So I like the chances of that argument when this bill is inevitably challenged in court and just-
Alex Stamos:
Can you just say, "This isn't a speech issue," and just wave your hands and all of a sudden when you pass a law massively controlling this speech, it doesn't count?
Evelyn Douek:
I mean, you didn't used to be able to, but let's see what happens. And for anyone keeping tabs, you can get a driver's license at 16 in Arkansas, but you're going to have to wait until 18 to get a social media account if this bill becomes law. And that is what I had for the week. Looking forward to another one. It's finally sunny here in the Bay Area.
Alex Stamos:
Yes. I know, we kind of lied to you when we recruited you to move here from Boston about what the weather would be like.
Evelyn Douek:
"Come to the Bay Area," they said, "it's lovely here." No, I've never heard of an atmospheric river. What are you talking about?
Alex Stamos:
And then you're like, "Why is there a boat full of giraffes floating down the street from me?"
Evelyn Douek:
The law school lost power the other day because of the wind, and half of the people were in the building because they'd lost power at home. It is apocalyptic out here, but it's sunny today, so that's nice.
Alex Stamos:
Yes, and we're heading into the good period, so now you're going to be happy, and it's not going to be as nasty as it is in Boston in the summer. It's going to be great, and enjoy your teaching. I'm glad that you're settled in. Nobody's dropped. Hopefully, that'll continue.
Evelyn Douek:
Yeah. Those are famous last words. Evelyn's going to start the next episode with, "I have no students left." Anyway, we will find out. But for this week, that has been your Moderated Content Weekly Update. This show is available in all the usual places, including Apple Podcasts and Spotify, and show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino, and it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman.