Moderated Content

MC Weekly News Roundup 11/7: The Elon Musk JD Program

Episode Summary

Evelyn and Alex discuss Musk's "just tweet through it" approach to Twitter ownership; the pros and cons of the great Mastodon Migration; Rumble pulling out of France over demands that it block RT; The Intercept's (poor) reporting on DHS-platform collaboration; what to expect with the midterms (GO VOTE!); and check in on legislative developments in India and the UK.

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments.

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Alex Stamos:

One of the reasons I'm tired is it's really a lot of work to coordinate between the globalists, the Illuminati, the Deep State. George Soros is blowing up my phone every day: "Alex, why aren't you censoring social media better?"

Evelyn Douek:

Welcome to the Weekly News Hit episode of Moderated Content with myself, Evelyn Douek and Alex Stamos. We are now 11 days into the magnificent reign of supreme leader, Elon Musk, who yesterday announced that he would permanently ban any Twitter handles engaging in impersonation without clearly specifying that it was parody after Twitter as a community had been having a lot of fun changing handles to Musk's name. So, I am really enjoying this new free speech platform. Alex, blink three times if you have jokes that you want to make about Musk but are too scared to make them lest you lose your Twitter account.

Alex Stamos:

Yes, his reign of terror has begun. It's amazing. I honestly did not think it would be this bad. He is the world's richest man because he has actually built and operated companies. I am actually a Musk fan. So, I bought a Tesla Model S in the first couple of months it was available, because I got to do some security testing for Tesla as a consultant and was so impressed by their work. I have a Tesla roof and I have Powerwalls sitting behind me that have kept us ... Here in California, they like to flicker the power all summer as forest fires rage around us. Not at our house, thanks to Tesla.

So, I think he has done incredible work. I am shocked, I am honestly shocked at how incredibly bad he's been at this. Because my expectation was he was preening for the crowd online, but once he took over, once he went in, once he had $44 billion at stake, that he would talk to his professionals and he would mature a little bit. And we saw that for like three hours in the beginning, him saying, "We're doing a content moderation council." He was retweeting Yoel, who's a very serious person who thinks very deeply about these issues, and he is now in a spiral. In fact, it's at the point where you're almost at a Ye level where you're like, "Is he having a mental health issue?" If I was a family member, I would be calling him because just today, in the last couple of hours, he is spiraling around the drain.

Evelyn Douek:

And he is adopting the just tweet through it approach, which is ...

Alex Stamos:

Which has never solved anything for anybody ever, let's just say.

Evelyn Douek:

But this could be the one time. All right, so I'm totally with you. I was in the camp of, "We're all saying the sky is falling, but that's just being a little hysterical. Let's wait and see. Probably nothing will materially change, because you couldn't run a viable business if you set too much on fire."

And now, after the last week, which has felt like an eternity, I'm definitely in your camp as well. Surprised at what's going on, and, not to mention, devastated at the human cost of this. The employee layoffs are terrible.

Alex Stamos:

Right.

Evelyn Douek:

Okay, so what has most surprised you?

Alex Stamos:

Yeah, let's talk about it. So you're right, he did the layoffs. The layoffs, I had heard that these were coming, a number of my friends in there had been forced to create lists of people that they could fire, but tried very aggressively to protect the people who were ... And apparently those lists were not totally listened to. We know that for a fact because Twitter turned around and then tried to rehire people. I don't know if any of them took it. But they got rid of critical folks.

I heard madness around things like on-call schedules and such. So, one of the things at a big company like that is you'll have dozens and dozens of people on call. You'll have different DevOps engineers for each of your internal microservices, and that's the kind of thing that wasn't even taken into account. So, people's accounts were being turned off while they were on call for critical services. It's amazing that it stayed up, honestly.

A bunch of things got weird. AOC was complaining that her mentions weren't working and stuff, which is not a conspiracy; that just means some internal service died. Some [inaudible 00:03:52] or something internally in front of the mentions died. And it might have been because the person who's supposed to keep it up had their accounts turned off. So, one, that's how people found out. There was no communication. He had done no all-hands communication. Companies have to lay people off. It's just part of capitalism; you have these cycles, and we are due for this cycle in Silicon Valley. It has been only good times since the 2008 crisis, and a lot of these companies have gotten really big. It is very hard for tech companies to fire people during normal times, and so this always happens.

But if you're a successful company that has cash, if you're not going out of business, and Twitter's not ... They weren't going out of business before Elon bought them. He was the one that stuck them with a billion dollars a year in interest. Then you treat people humanely. And a better example of this: at the same time Twitter was doing this, Stripe was doing a layoff. What did Stripe do? Patrick Collison spoke to the entire team. He talked about how sorry he was that this happened. They provided 14 weeks of severance to people. They provided healthcare, they provided accelerated vesting of their RSUs. So, they took care of people financially and they took care of them emotionally, which is really important in these situations.

In this case, people's emails are just getting shut off. And so you had Twitter employees live-tweeting, "Oh, my email's shut off. I guess I'm laid off, because nobody's told me yet." That's just a crazy, inhumane way to do it. And it's really stupid for a man who now owns three companies for which he has to hire top-end engineers, because, yes, we're going into a downturn and it will not be hard for them to hire for the next couple years, and then things are going to get good again and people are going to remember that Elon Musk treats his employees like crap.

That's not a stink you can wash off yourself. And I think that's really going to hurt. I tweeted this: it was a bad week for all those employees. It was also a bad week if you're a recruiter for Tesla or SpaceX, because it's getting really hard to sell to the kind of people who you need to be functional.

And so the way he handled that was so bad, incredibly bad. Incredibly bad for a man who's a CEO of several companies. It's just HR 101 that he totally ignored and it just got worse from there. At least layoffs are something that companies do and he did it poorly. Later in the week, he's threatening his advertisers. So, like you and I predicted, a bunch of advertisers have paused spending because of his bizarre, erratic behavior and he tweets that he's going to name and shame advertisers that aren't spending money with Twitter.

That is ... "If you spend money with me and then stop, I will send my horde after you" is the worst sales pitch I have ever heard from an advertising platform in my life. That is unbelievably bad. How does this man have so much money if he is this bad at business? He has counterparties at Tesla and SpaceX. Does he go around threatening those people if he doesn't get what he wants? I don't understand. It's just mind-blowing to me.

Evelyn Douek:

But he's got hurt feelings, Alex, and it's the way to maturely deal with this. The relationship with advertisers is truly baffling. That's what we thought would be the constraining factor on how much damage he would do to the platform, and the constraining factor on the content moderation changes that he would make. And it looks like that's not necessarily going to operate in any way. This morning he was tweeting that he thinks he has tortious interference claims against so-called activists pressuring advertisers.

Alex Stamos:

So, you're a law professor. If you had this hypo in class, what would you tell them? If the hypo was, "Somebody complained about my service and they said that people should not spend money with me. Do I have a tortious interference claim?" If one of your students said yes, how would you grade them, Professor Douek?

Evelyn Douek:

Basically the answer to this question is "Beep no." There is no way that this stands up in court. The courts are doing crazy things, but not that crazy. This is fundamentally First Amendment-protected speech. The idea that you are criticizing a product, you are criticizing a business, is just the way that speech works. And so I'm really enjoying this new free speech version of Twitter, in which both parody and consumer boycotts are dead. I mean, it's hilarious. At this stage, you could probably teach a full JD's worth of courses by working through all of the stages in the Elon Musk saga.

Alex Stamos:

You should offer that class.

Evelyn Douek:

Yeah, exactly.

Alex Stamos:

You should, right?

Evelyn Douek:

Elon Musk Law 101. You start with corporations, you move through employment law, you've got your First Amendment in there, we've got national security and [inaudible 00:08:17] issues. It would be a lot of fun. And who knows what's coming up next.

Alex Stamos:

Well, it's kind of shocking too, because one of the guys in his little kitchen cabinet that he's really close to is Alex Spiro, who seems to be a real lawyer. You're not talking about a release-the-Kraken lawyer, but an actual litigator. Does he just not listen to his lawyers? Maybe he likes paying them $1,500 an hour for every time he screws up and breaks the law or comes up with one of these crazy theories? I don't know.

Evelyn Douek:

Yeah, I don't know either. But it's a full employment program for lawyers, even if it's not such a great employment program for Twitter employees, obviously. So, in the midst of all of this, there's the "go to Mastodon" movement. I have not yet jumped on that bandwagon. I'm a late adopter while I let other people work out all the kinks. Where are you, Alex?

Alex Stamos:

Well, so this is interesting. A lot of people who are in my circles have created Mastodon accounts. They don't seem super active. It's people reserving their spot in the namespace and then doing a couple of little things. What I launched today, and I'll put it out on Twitter if people want to follow, is cybervillains.com: last night I was up till 2:00 AM, because I'm a dork, and I set up a Mastodon instance, a little bit over-engineered in Google Cloud using a bunch of enterprise Google Cloud stuff, because I think there are some really interesting privacy, security, and safety issues with the whole federated idea and specifically with Mastodon. And the only way you can understand that is if you have an instance specifically to explore it. So, if people want to be on an instance that is intentional for folks hacking each other and testing out the trust and safety tools, then cybervillains.com is probably your choice. And-

Evelyn Douek:

What a sales pitch, Alex. Sign me up.

Alex Stamos:

Right. Well, you could probably tell me how I'm violating GDPR, but the privacy policy says you have no expectation of privacy, because it's expected that somebody's going to dump the database every once in a while. But unfortunately that turns out to be true for all of Mastodon. So, the whole architecture of Mastodon is like, "Huh, this Cambridge Analytica scandal, I would love to live that as a default part of my system architecture over and over and over again." The Mastodon instances have a huge amount of trust with one another. Anybody can set up an instance. I just set it up last night; all I needed was a domain and a couple of cloud machines, and I was magically joined to the rest of the Fediverse. There were no authorization steps, I didn't have to fill out a form, I just joined up. And that's really powerful, and that's why there are thousands of Mastodon servers and now, it looks like, millions of people collectively on those Mastodon servers. It is growing very quickly.

But the flip side is those servers can see everything that you communicate back and forth. So, on stuff that is 100% public, that might not be a huge, huge deal. Although a lot of the stuff that people complained about with Cambridge Analytica was stuff that was not actually private. But for private information, like DMs and such, it's a real security disaster. And so I'm really interested in that. I'm also interested in, from what I've seen so far, the moderation tools are extremely simplistic.

And so if you're running something with 500 people, all of whom you know, then they're probably fine. If you are running a million-person or two-million-person Mastodon server, and I think Mastodon.social was approaching about a million users before they shut down ... they had so many incoming, they shut down new signups. Which is just going to make people want to sign up, right? Then if you're trying to moderate a million people, the tools are nowhere near there yet. So, anyway, we'll be running the server and I'm sure we'll have a chance to talk about it. I'll be writing up some of my thoughts based upon being the owner of a social network now.
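The openness Alex describes is visible in Mastodon's own API: an instance's public timeline can be read with a plain, unauthenticated GET request, with no API key or signup (unless the instance restricts it). A minimal sketch in Python; the endpoint path is the real Mastodon v1 API, but the instance name and the sample status object below are fabricated for illustration:

```python
# Sketch of how open Mastodon's read API is. The /api/v1/timelines/public
# path is the real Mastodon endpoint; "example.social" and the sample
# status dict are made up, shaped like the real JSON response.

def public_timeline_url(instance: str, limit: int = 5) -> str:
    """Build the unauthenticated public-timeline endpoint for an instance."""
    return f"https://{instance}/api/v1/timelines/public?limit={limit}"

def summarize_status(status: dict) -> str:
    """Show the fields any federated peer (or anyone at all) can read."""
    return f'{status["account"]["acct"]}: visibility={status["visibility"]}'

# Fabricated status object mimicking the shape of the real API response:
sample = {"account": {"acct": "alice@example.social"}, "visibility": "public"}

print(public_timeline_url("example.social", limit=1))
# -> https://example.social/api/v1/timelines/public?limit=1
print(summarize_status(sample))
# -> alice@example.social: visibility=public
```

DMs go through the same server-to-server machinery as a status with "direct" visibility, stored in plaintext on both ends, which is the privacy problem with private messages that Alex raises.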

Evelyn Douek:

I love it, Alex. All of this is going on and your first thought is, "How can I do an experiment in this and get a blog post out of it?" It's great. So, I mean-

Alex Stamos:

So, Google Cloud gives you $400 in free credits if you sign up. So, it's like, "Hey, I may as well burn through those." I'm down three bucks. I've got $397 left in this experiment.

Evelyn Douek:

It is funny, because we go through these cycles with centralization and decentralization, and decentralization has been all the rage the past few years as a way to deal with these gatekeepers and our tech overlords. And decentralization does solve some problems, including the problem of too much centralization of power. But one of the reasons why we centralized power was because there were calls for more gatekeeping. And so while this decentralization solves some content moderation problems, it will create a whole bunch of new ones.

And part of the problem with content moderation is that users are creating bad content that needs gatekeeping. And so putting the power back in the hands of the users isn't necessarily going to solve all of those problems. So yes, the people leaving Twitter because of its new content moderation policies are in for a surprise, I think, on Mastodon.

Alex Stamos:

Right, and I think you brought up a good point, which is there's a lot of ... The kind of people we hang out with, with EFF hoodies, the Cory Doctorows and such, people who I like and who I respect, who are big on decentralization and talk all about giving people the ability to moderate themselves. And what they're always thinking about is something like porn or spam: "Hey, if you've got legal porn on a site and you don't want to see it, then if I give you the tools so that you don't see it, then it doesn't hurt you."

But that is a really shallow view of what trust and safety is, because, yes, there are trust and safety issues where the damage accrues to the person who is viewing the content. But a lot of trust and safety issues are damage that accrues external to the platform entirely. So, if a group of 10 people organize a violent attack on your platform, blocking does nothing.

Evelyn Douek:

They're not going to be reporting each other.

Alex Stamos:

They're not going to be reporting each other. Right, right, exactly. If people are trading [inaudible 00:13:44] then blocking tools don't help you. And so yes, I think there are a lot of advocates for this kind of decentralization who are running into a real ... There's just always this assumption that social media would be better if good people worked there: "And I think of myself as a good person, so I can do a better job." It's like, okay, well, good luck, bro. Have fun.

Evelyn Douek:

Okay, so an under-covered story, I think, in the free speech platforms world is Rumble, which is, I think, the most successful alt-tech platform. It's the alt-tech version of YouTube. It's marketing itself as the free speech alternative, and it's big. It sees 10 times the traffic of Truth Social, 100 times the traffic of Parler.

This week it announced that it was moving out of France because of demands by the French government to block RT and other Russian news sources. It candidly admitted that France is not a huge market for it. So, it's unclear whether this is a real commitment to principle or just performative. But it is interesting. I think it says something interesting about what we might see as different governments impose different laws and different platforms deal with that. There was this idea that you can't nail Jell-O to the wall, so governments won't be able to regulate the internet. Well, then we found governments could regulate the internet. It turns out platforms can respond to that as well, and not necessarily just keep operating in a country. So, I thought that was an interesting thing that I hadn't seen much coverage of.

Alex Stamos:

Yeah, you're right. Rumble is a totally under-discussed platform and I think it's going to be very influential. One, because it is video. That just makes it a much more effective ... The kids aren't on Twitter, they're not on Facebook; they're on YouTube and they're on TikTok. So, I think especially among young people, Rumble is much more interesting than a Gab or a Parler. It also has some really big backers and it's got some really big names, like the Glenn Greenwalds and such, that they have signed up, people from the horseshoe theory ... I want to say alt-right. Greenwald is just Greenwald. He went so far left that he wrapped around. And so it's got a bunch of those people with big audiences who are pulling folks in, in a way where Truth only really has Trump. And so I do think Rumble's going to be a big deal, and they are, because they have real money in a way that a Kiwi Farms or such doesn't.

They actually do care about regulation because they have stuff to lose. They do want to eventually have a business plan. They don't seem to be making money now, they're just in the growth phase. So yeah, I do think it's interesting and I'm shocked it took this long. It does feel like with the DSA that you're probably going to end up in a situation where a bunch of the alt platforms block all of Europe and then if you want to access them, you'll be using the VPN.

Which then raises a question again for the law professor. If the Europeans are trying to apply their laws to a company that has no European foothold at all, does not make money in Europe, and actually blocks IPs, but then they end up with hundreds of thousands or millions of their citizens using VPNs, just like the Chinese do to get around Chinese blocks, are they still going to try to apply their laws extraterritorially?

Evelyn Douek:

Yeah, and the answer to that is no one knows. The law in this area is really vague, and obviously cases that are from a couple of years ago don't really apply as the technology keeps changing and the level of leakage and how VPNs work keeps changing as well. And this could be an issue in the United States as well, where different states are passing different laws and people are getting really excited about the capacity for states to regulate the internet. We may see either companies cutting off Texas, for example, or we may find that, if that's not possible and Texas's laws have too much of an extraterritorial effect, their laws could be declared unconstitutional due to this fun doctrine, the Dormant Commerce Clause. So, all of this is fun to watch.

Alex Stamos:

Lots of stuff for you to teach. So, that's good.

Evelyn Douek:

Yeah, exactly. Again, this is in year three of the Elon Musk JD program. One other note, and I don't know your thoughts on this, Alex, about Rumble and one of the reasons why we should be paying more attention to it: it has branched out into cloud services as well. So, it's trying to build robustness through the stack, so that for all of the issues that Parler ran into with AWS cutting off services in the wake of January 6th, it could have a much more robust solution and infrastructure for things like that. So, yeah, I think it's an important platform.

Alex Stamos:

Yeah, absolutely.

Evelyn Douek:

Okay, let's move to the midterms, or loosely, starting with this Intercept story about the DHS this week. So, there was this big story that got people aflutter on Twitter about the level of collaboration or cooperation or conspiracy between DHS and the social media platforms.

I'm the number one audience for this. I wrote a paper about content cartels, about the level of opaque collaboration between governments and platforms and the reasons why we should be worried about the free speech impacts and the threats of that. This story had nothing. I was getting out my popcorn and rubbing my hands, getting ready to get into it, and it had nothing for me. None of this was secret. The companies were literally tweeting this stuff out, with statements that they were having these meetings in the lead-up to the election, as an example of how responsible they were being.

And there is absolutely nothing in the story that indicates that the government had power over these platforms or was being coercive in any way that even those of us who are really sensitive to this would think was particularly problematic. Alex, curious for your reactions to this story that touches close to home.

Alex Stamos:

Yeah. So, as you alluded to, I am not neutral on this, because I am a key part of this crazy conspiracy that ... One of the reasons I'm tired is it's really a lot of work to coordinate between the globalists, the Illuminati, the deep state. George Soros is blowing up my phone every day: "Alex, why aren't you censoring social media better?"

Yeah, The Intercept is following reporting from a kind of ultra-right, honestly garbage, outlet called Just the News that tried to turn the Election Integrity Partnership that I help run into a scandal for DHS. DHS put together this disinformation board that blew up spectacularly because it was poorly defined. It was poorly rolled out.

Evelyn Douek:

Poorly named, just to be fair.

Alex Stamos:

Yeah. The people picked for it were very colorful, you could say. DHS made a mistake here, it had no actual impact on anybody, and it became like a little ... And they're trying to find their next scandal like that. And so the Election Integrity Partnership, which is so secret that we launched it in the summer of 2020 with eipartnership.net, with a Twitter account, with weekly blog posts, and then daily public YouTube web streams of what we were finding, and then we published a 300-page report and now, I think, six or seven peer-reviewed journal articles. That secret conspiracy, which I guess you could call secret because nobody reads journal articles.

Evelyn Douek:

Right. 300-page reports and academic journal articles. The well-known way to get your name out there, Alex.

Alex Stamos:

Right. And so we ran this thing that was supported by private philanthropists, that represents the First Amendment-protected academic speech of researchers working for Stanford and the University of Washington, and our opinion of what we thought was going on. And in doing that, we would send referrals. We would tell the platforms, "We think this falls under your disinformation policy."

Now, our focus was, and it continues only to be, on election disinformation. So, these crazy attempts to pull us into the Hunter Biden laptop story, that was explicitly out of scope for us. We had nothing to do with, "This person is bad, or this is what's going on here, or that." It was only about, "Don't vote because they're checking warrants." Or, "There's a bomb threat." Or people lying about back doors in Dominion systems. That is what we did and what we are continuing to do. But first Just the News, and now The Intercept, is trying to create this vast conspiracy where the government is using us to censor things, and that's just not true.

We did get referrals from local and state governments. A bunch of them went different ways. One of the referrals we got was actually an Instagram post by Hillary Clinton that was sent to us by multiple secretaries of state's offices, because she reshared on Instagram an image that was created by the Pod Save the World or Pod Save America people that just had wrong dates for early voting and such. It was just wrong. And so a couple of states said, "Hey, this is wrong." We referred it to Instagram, we also sent it to the DNC, and the DNC, I think, tagged her and they changed the image. That's the kind of stuff we did. It was not some kind of grand conspiracy.

So anyway, The Intercept article was BS. They took stuff that was mostly public and then they wrote it up and it just demonstrates that journalists can make anything sound like a scandal if they're smart enough.

And I think part of the goal here is to make it so that ... I don't know what The Intercept's goal is. I understand that people on the far right, who are generally the purveyors of a lot of election disinformation, want to make it so that people feel under siege. And that is somewhat working. Kate Starbird, our colleague, got lots of threats, got doxxed, got lots of horrible things going on. But we're not stopping. We're doing this tomorrow. I will be working with the rest of our team and we'll be doing real-time disinformation monitoring. We'll continue to be doing it and publishing publicly what we find, and using that as our basis of research into what's going on from a disinformation perspective around elections, because I do think it's critical, and we're not going to be intimidated out of doing this kind of work by The Intercept or anybody else.

Evelyn Douek:

It is just classic chilling effects: threatening people to try and stop them doing the work, and raising the cost of doing this work.

Alex Stamos:

Right. If The Intercept wants to write journalistic stuff and they're wrong, that's one thing. But they need to also understand that what they are doing feeds into an ecosystem of tens of thousands of people who are willing to use threats of violence, and then perhaps even stochastic violence, against individuals for their political goals. And so The Intercept really needs to think about, if they're going to consider themselves legitimate journalists, what the context is in which they're making really outlandish claims that are not supported by the evidence.

Evelyn Douek:

Yeah, and I think one of the most disappointing things here is it totally undermines what are legitimate concerns in this area about government collaboration in general, just not in this particular story. But this is something that we need to look at very carefully. There are real problems around these platforms' relationships with governments. It was what sparked the original The Wire story, which was about a perhaps improper relationship between Facebook and the Indian government. There are all sorts of questions to be raised around Facebook's relationship, for example, with the Israeli government. There's an Israeli constitutional court case about that, about special reporting channels for the Israeli government and how responsive platforms are, including in the UK, where there are these internet referral units with special reporting channels, and platforms very often, for self-interested reasons, might just go, "Well, this is not worth the hassle of disputing, and so we're going to take things down."

But that doesn't mean that no information flow between the government and platforms can be beneficial. And a lot of this story was about enabling the government to do counter-speech, which is classic First Amendment values: "Here we have this story, this is disinformation; you might consider putting out a statement correcting this information, government official, because you're the one on the ground that has the proper information."

Alex Stamos:

Which we have, multiple times, and we've documented a bunch of these. We had a bunch of situations in which we enabled governments to say, "Hey, this is just not true." Right? Because they can't see everything that's going on, they can't see what's trending. And part of our goal was to make it so that the government doesn't have to monitor social media.

I agree. I think it would be very creepy to have a group in DHS that is sitting there just watching everything Americans say online. There's a natural possibility of that being abused. And so part of our goal was, well, if we have a bunch of powerless undergraduates doing it and saying this is something that's trending, then that can be sent to the Pennsylvania Secretary of State, it can be sent to the Maricopa County registrar, it can be sent to CISA, and governments can do counter-messaging.

And that is completely different than "censorship." Anyway, it's frustrating. But you judge a man by the enemies he has. We can find, I think, a more modern version of that for 2022. Obviously it's probably better to be hated if you're going to work to defend democracy than to be ignored and we're not being ignored anymore, that's for sure.

Anyway, if The Intercept wants paper copies, we do have a bunch of paper copies of our report. Happy to send those over.

Evelyn Douek:

It's currently the only book on my office bookshelves because I've been too slow moving things in. But you kindly gave me a copy. So, thank you, Alex.

Alex Stamos:

Who would need everything else? It's got drama, it's got romance.

Evelyn Douek:

It's everything you could ever want to know. Speaking of your amazing expertise in this area, the midterms are tomorrow. Do you want to just say a little bit about what you're seeing, what to expect, and what you'll be doing?

Alex Stamos:

So, the Election Integrity Partnership will be going live all day. You can watch on our blog at eipartnership.net or our Twitter account, @ei_partnership, to see what we're doing. We'll have real-time monitoring starting, I believe, at 7:00 AM Pacific Time all the way to late, and then we will continue after election day, watching the kinds of things people are saying.

As I expected, things are much more chaotic this year. There's not one unifying message, like just Trump in a presidential year. The fact that we don't have a presidential election means that there are 435 House races, 32 or 33 Senate races, a bunch of gubernatorial races. And so it's a much more dispersed situation.

But it is pretty clear that Pandora's box has been opened. Lots and lots of people, and not just Republicans this year, are making unsupportable claims about the election being unfair to them.

And just a couple hours ago, our colleague Kate was tweeting about one of the things we've been on, which is that there were rumors of a bomb threat to keep people from early voting in New York. It turned out not to be true, and it was being pushed by a bunch of people on the left.

So, the thing that really worries me is that it looks like, just based on the polls and so on, it will be a good night for Republicans. And because the seal has been broken on nobody being punished for having crazy thoughts like "they didn't really lose," we will see equally fervent election disinformation on the Democratic side this year too.

And it's bad enough to have one major party consumed by the idea that they can't actually lose elections. If you end up with everybody believing it, I think that is the start of the end of American democracy. We just cannot have a system in which nobody believes that they legitimately lost an election. And I do think we're on the down slope of that. Stacey Abrams has already been saying a bunch of stuff that I don't think is supportable by the evidence, and if she continues down that path, it's really going to open the door for all the local House races, local state assembly and state senate races, for people to say things were rigged against them.

Evelyn Douek:

Well, the best thing that you can do on that cheery note is get out and vote. I can't. And so it would be great if all of our listeners could on behalf of those that have no say.

Alex Stamos:

I will illegally vote on behalf of you.

Evelyn Douek:

Thank you, Alex.

Alex Stamos:

I'll be filling out a ballot. And as I mule thousands of ballots to the local ballot box, thankfully protected by Patriots with their M-16s, I will make sure that your vote is counted [inaudible 00:29:06]

Evelyn Douek:

Fantastic. Exactly what I'm wanting to hear. We're coming up against time. So, a couple of quick dispatches from overseas. India has, as was expected, amended its IT law to create these grievance appellate committees, each with three government-appointed members, which will sit above social media companies and allow users to appeal to those committees and have the companies' decisions overturned.

Obviously, it's like a government-run version of the Facebook Oversight Board. Lots of open questions about how they could possibly hear all of the appeals and how they'll pick their cases, but it'll be something to watch in what we often say is the most important jurisdiction for freedom of speech online.

Alex Stamos:

Yep. Big one. Yep. More India.

Evelyn Douek:

Yeah, more India. Dispatch from the UK is that the online safety bill, which has been this huge legislative package that's been years in the making but was shelved with the other things going on in the UK recently, is going to be back. It's back on the agenda. But it has had its controversial legal but harmful provisions taken out of it.

So, it had all these provisions requiring companies to take certain steps with respect to legal but harmful content. That's obviously very concerning, because the government shouldn't really be legislating about legal but harmful content, which is by definition legal, and it shouldn't be using pressure on companies as a workaround to get legal content taken down. If companies want to take it down voluntarily, that's fine, but the government shouldn't really have a role in that.

There are still a number of concerns about this bill, including threats to free speech in how it defines illegal content and in requirements for automated removals in certain cases. But that is at least progress. And as always with legislative packages, we have to wait and see what actually comes out in the wash. But it's another big one to watch.

Alex Stamos:

Yeah. It is a big one. The interesting thing is I've seen polls saying that a bunch of people in the UK want to rejoin the EU. I don't know if that's at all possible, but it would be fascinating to watch whether, in two or three years, they rejoin the EU and then all of these laws that were passed need to be fixed and harmonized, and they have to do the whole process over again.

Evelyn Douek:

Yeah, well as the holder of a British passport that became substantially less useful recently, that would be very exciting for me because you get free admission to European museums and things like that. So, I'm in favor. I can vote in that one. So, that'd be great. Anything else you want to cover before we sign out, Alex?

Alex Stamos:

No. Like you said, I hope everybody goes out and votes and then if things don't go the way they like, that they accept that that's how democracy works. That sometimes you win, sometimes you lose, and that you don't go out there and amplify disinformation about how the election process ... Hopefully this is the peak of this and I'm not hopeful. I don't think that's likely, but let's be hopeful that tomorrow is going to be kind of crazy and that that's the worst it gets.

Evelyn Douek:

As Gandhi said, "Be the Twitter that you want to see in the world and don't amplify misinformation."

That has been your episode of Moderated Content for the week. This show is available in all the usual places, including Apple Podcasts and Spotify. Show notes are available at law.stanford.edu/moderatedcontent. Alex, can you believe that no one at Apple has contacted me to notify me of a suspiciously large number of positive reviews on this podcast? It's like our listeners aren't even trying. So, right after you've posted your ballot in the most important election tomorrow, please get your voice heard in the second most important election tomorrow and give us a rating on Apple.

This episode of Moderated Content wouldn't be possible without the research and editorial assistance of John Perrino, a policy analyst at the Stanford Internet Observatory. It is produced by the brilliant Brian Pelater. Thanks also to Alyssa Ashdown, Justin Fu and Rob Huffman. See you next week for more Moderated Content.