Play the sad trombone five times for this week's Twitter Corner: Musk censors political content at the behest of the Turkish government in the final days of a close and historically important election; Linda Yaccarino is announced as the new CEO; Tucker Carlson announces he's going to stream his new show on Twitter; the platform announces not-so-encrypted messaging and continues its ad hominem content moderation practices. Also: Singapore, Pakistan, and Russia all crack down on internet freedom, and the European Court of Human Rights releases a wild ruling holding politicians responsible for third-party comments on their Facebook pages.
Evelyn Douek:
I've worked out why podcast hosts ask for reviews, because it turns out if you ask for them, people give them to you, and it's a really nice feeling.
Alex Stamos:
Oh, I haven't checked. How are we doing?
Evelyn Douek:
Yeah, we had a nice little influx this week of people-
Alex Stamos:
Helping dilute our two-star review?
Evelyn Douek:
Exactly. You know what? You can still have a real impact. We don't have that many reviews, so you really can make a difference.
Welcome to Moderated Content's weekly, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. Alex, I think we're just going to have to play the sad trombone five times this morning before we head to our Twitter corner.
Alex Stamos:
Yeah, just imagine that you've heard this five times. We need an entire sad orchestra actually, Evelyn, or a sad Canadian brass.
Evelyn Douek:
Sad tiny violins. Sad double basses. Yes, we need it all.
Alex Stamos:
Because there's so much sadness to go around this time.
Evelyn Douek:
Yeah, so I mean, obviously, the big story leading the Twitter news this week, we thought it would be the new CEO announcement, but we're going to have to hold that for a second and head to Turkey, because that is the big story in Twitter news this week. As our listeners probably know, Turkey went to the polls yesterday in one of the most important elections in its history. In an announcement posted on Friday evening, around 6:00 AM Saturday in Turkish time, Twitter's official Global Government Affairs account announced that the platform had started taking steps to restrict access to some content in Turkey in response to legal requests in the run-up to this extremely important election.
Some stories have come out. Twitter didn't announce which accounts it had restricted; Musk promised in subsequent tweets that he would, but we're still waiting at the time we go to air on Monday morning. Some reporting has been done around it, though, and unsurprisingly, it's political content that's critical of the government: investigative journalists and businesspeople who have previously been critical of the president. When journalist Matthew Yglesias criticized this, Musk replied with the great zinger, "Did your brain fall out of your head, Yglesias?"
Alex Stamos:
Right. What is this? Some kind of 50s standup comedy? "Did your brain fall out of your head, sir?"
Evelyn Douek:
Exactly. With all the seriousness that this issue demands, Musk weighed in saying, "The choice is to have Twitter throttled in its entirety or limit access to some tweets. Which one do you want?" Which I mean, I think pretty clearly explains what his thinking is around this. It seems like a pretty easy choice for him. "Oh, we can't lose access to the country. We're going to comply with these legal demands." Now, Alex, I just wish someone had thought that this issue might arise when thinking about how to run a global platform. I really couldn't have seen this coming.
Alex Stamos:
If only anybody had written or podcasted or given TV interviews or told every member of their family, or had random people on the street ask them, about what was going to happen with Twitter, and said that the biggest risk was foreign states asking for censorship, because this is a constant problem. It's not a problem that Musk invented. What Musk invented is that all of his net worth comes from companies where he sells physical stuff, and he sells stuff in Turkey.
Also, SpaceX is a vendor to the Turkish government. There are great photos of him shaking hands with Erdogan. That's the other thing you get if you're an industrialist who actually sells stuff, is every autocrat in the world has a picture of you shaking his hand at some economic summit where you're asking for raw minerals or you're trying to open up the space to sell your cars. Yes.
Evelyn Douek:
Yeah. One of the great things about having a podcast, it turns out, I didn't know this, is that you can say "We told you so" a lot. So I went and looked up our podcast from October, right after Musk acquired the platform. Here's you saying it:
Alex Stamos:
When he talks about we will follow the law, that is the crux, is that you want to follow the law, but you also want to protect human rights and you want to protect free expression. So, there is no way that everything he has said about free speech and everything he has said about following the law, those things are not compatible. It's just not compatible around the world. So, he is going to be in for a really rough ride as he learns the challenges.
Evelyn Douek:
Here is Musk's learning curve, where Turkey is the playground or his classroom, unfortunately. So, yes. As you-
Alex Stamos:
But has your brain fallen out of your head, sir?
Evelyn Douek:
Yeah, that's right. Yeah, I mean, it's extremely dispiriting. This is one of the toughest things that platforms have to deal with, and I'm preaching to the choir here, how to deal with autocratic governments when you're operating around the world. But this is a situation where, previously, Twitter has pushed back on these kinds of things. In fact, in Turkey itself, it's previously taken the Turkish government to court to challenge some of the legal orders it's been given.
It's previously had access to it blocked in the country to try and push back against authoritarian overreach, whereas here, it appears that Musk isn't even trying to do the bare minimum of saying, "Here is a line, we have a line and we will not go beyond it." Because one of the hardest things for these companies is to work out where your line is, and at what point you will stop. It seems that Twitter's just saying, "Well, we don't have a line, and send us an email if you want us to manage your content for you."
Alex Stamos:
Right. Right. Musk@twitter.com if you'd like us to censor your political opponents. Yeah, so like you said, this is not a new thing. This has been a problem for decades, of tech companies dealing with censorship demands around the world. In fact, it is such a big problem that they have created a group of companies that have set aside a number of principles that they try to live up to, and that group advocates when companies are pushed outside of those principles. It's called the Global Network Initiative. You can go to globalnetworkinitiative.org. Basically, every major tech company except Twitter is part of it. I don't know exactly the history of why Twitter... I don't think it ever was part of it, so it's not like Musk pulled them out, but Twitter's never been part of it.
I think this group writes a lot about exactly the trade-off Musk was talking about, which he presented like he's the first person ever to think of it: "Oh, do you want us blocked or not?" We went through this over and over again when I was at Facebook, and if you want to be consistent with the GNI principles, or just consistent with human rights standards, the way you handle this is you resist as hard as possible, especially if the request is a political request, if it's about an election. You have to hold the line. There are different levels of limits. People threw out all these strawmen: Germany disallows Nazi content, and obviously nudity rules are different around the world, as are rules about violence and pictures of guns and smoking. There are all kinds of rules that differ by country and get enforced.
But this is core political speech. This is the political opponents to the ruling, democratically elected autocrat of Turkey who are being censored by Turkey. That kind of core political speech is very different than disallowing pro-Nazi videos, and you have to hold your line as much as possible if you're a company. You have to resist everywhere possible. You have to appeal it within the court system there. In the end, the companies that really care about this just take the block. You push the country in question to block you, because it makes it clear to all of the citizens that censorship is happening.
The gift Musk gave Erdogan here is he just made this stuff disappear, so there's no pushback on him. Now it came out, so people are talking about it, but you have to create the situation where you force them into that. What companies figure out is that 80, 90% of the time, the countries back off in these situations. They end up not doing the block, because they understand that blocking the entire platform will be seen as heavy-handed.
So they'll give it a shot, they'll send you letters, they might even send you court orders, and you fight it as much as possible. It looks like Musk did not fight at all, unlike Twitter in the past. He did not push it through the Turkish court system. Twitter won in the Turkish court system back in the day, when Erdogan tried this before. They didn't try any of that. They just gave in, and in his texts, in his messages about this, Musk basically advertises, "I am open for business for any autocrat who wants to try this." That is a really bad decision.
He has put himself in a horrible, horrible place, for a couple of reasons. One, because now everybody's going to try it. The other is that this is going to a runoff. This is effectively a 50/50 election; we don't know who's going to win. One of the things you talk about inside these companies, if you're trying to be completely cynical, if you don't care about the Universal Declaration of Human Rights, you don't care about the GNI principles, you don't care about free expression, and you're just looking at what's good for the company, is, "Oh, wait, so we're picking a side by doing this, and the side we're picking might lose." And if the winners that you helped oppress come to power, they are not going to forget that you helped the autocrat.
I think that is the real risk also Musk is running here with Turkey, is that if Erdogan loses the runoff, then the people of Turkey are not going to forget who helped Erdogan censor their internet. That also creates I think this really bad cycle where now Twitter really wants Erdogan to win. So we should see what Musk does over the next couple of weeks, because he has thrown in his lot with Erdogan, and if Erdogan loses, he's now going to be in real trouble among the people of Turkey and the ruling government of Turkey. So, he's going to have to support Erdogan in the next couple of weeks.
Evelyn Douek:
Right. It's hard to imagine higher-stakes political free expression than in the dying moments of the closest election in the country's history, for a leader that's been in power for the past two decades and is now going to the first runoff in the country's history. This, I guess, is what free speech is for: these moments. We don't know the extent of what Twitter's taking down. We don't know what kind of stuff it is.
We have some reporting around it, but who knows how dramatic it is. I mean, yes, there are times in which access to the internet and communications devices is a human rights good in and of itself, but having access to a platform which is only showing one side of the debate in the dying moments of an extremely important and contentious election? It's not a hard call.
Alex Stamos:
Right, and also, it's a 50/50 election. If you're going to back an autocrat, don't be dumb. Back one that never loses. So, Apple has done a huge amount to support the Communist Party of China, but that's a good bet. There's no evidence that China's going to be a democracy, and therefore is going to punish Apple for helping support the Communist Party.
If you're going to pick an autocrat, don't pick the guy who's polling at like 48%. Even if it turns out you don't care at all about freedom of expression, even though you said you were a huge free speech guy, if you're just going to care about money, then be careful about taking sides. The idea that he is neutral by following the rules here is just completely wrong. I think that is the fundamental problem: he does not understand that just following the law locally cannot be compatible with human rights guarantees, exactly what we talked about in October.
Evelyn Douek:
Yeah. One thing I want to know more about, and I haven't seen much reporting on, is what's happening with other platforms in the country. This didn't leak or anything; it was posted by an official Twitter account, which said it was complying with some orders from the Turkish government. So, I guess that's some level of transparency, although it is really the bare minimum. It seems very unlikely that Twitter is the only platform being pressured by the Turkish government at this moment. I'm sure others are probably receiving... I mean, who knows? I just haven't seen any reporting or any transparency about this either, and it's something I'd like to know a lot more about.
Alex Stamos:
Yeah, I've been looking. Facebook is full of anti-Erdogan stuff, but it's hard to tell whether it's available in Turkey. That is the other thing that happens here: if companies have to block, they'll block only the very, very specific content at issue, but I don't see any evidence of Facebook doing that. It would be nice to hear from the Turkish press, from people in-country, about what's being blocked right now.
Evelyn Douek:
Yes, we have another two weeks of this to go with the runoff election happening in two weeks, so this story is far from over, I think. Let's see how it plays out over the next couple of weeks.
Alex Stamos:
Right. Moving on from that, something else happened at Twitter. Twitter's getting a new CEO.
Evelyn Douek:
Is there a timeline? Do we know exactly when she's scheduled to step up?
Alex Stamos:
We don't have the exact date, but he said six weeks.
Evelyn Douek:
Okay. Well, I mean, if we know anything, we should definitely trust Musk's timeline for announcements with respect to his platform.
Alex Stamos:
If he tweets something, you could take it to the bank. You could absolutely trade on stocks based upon what Elon Musk says on Twitter. It is totally safe. Yes.
Evelyn Douek:
Linda Yaccarino will be Twitter CEO, no ifs or buts about it, in six weeks. So he has-
Alex Stamos:
I mean, he wouldn't want the SEC to get mad at him. He hears a lot about that.
Evelyn Douek:
That's right. Exactly. So, this is something I'm sure he'll move hell or high water to make happen. He has tapped the former NBC ad executive for the role. Back in December, Musk promised that he would resign as CEO as soon as he found someone foolish enough to take the job. Well, it seems that he has found her. The plan is that she will focus primarily on business operations while he handles product design and technology, which seems like a recipe for a totally harmonious working relationship in which I can see no possible friction or problems. What do you think, Alex?
Alex Stamos:
Yeah, so this is exactly what happened with Mark Zuckerberg, except the titles were a little different. When it was time for Facebook to actually make money, Zuck realized he had no idea how to sell ads, no idea how to make money, no idea how to run a big company. So, he hired Sheryl Sandberg to be COO, and they split the responsibilities along the same lines as here. So whatever the titles happen to be, you're looking at a Mark/Sheryl model. The difference is that Mark knew what he didn't know. One of the reasons it worked, at least in the early days, is that she had her world and he had his. The reason that relationship fell apart, I think, was partially because Mark created the problems and Sheryl was responsible for fixing them, and she got blamed for all of these things.
The general assumption externally was that Sheryl was the second most powerful person at Facebook, but the truth is there were seven or eight mostly white dudes between Mark and Sheryl: product VPs, and some of them weren't even VPs or even direct reports to Mark, just very powerful people deciding how the product gets built. In the end, that is the ground truth of where your problems are going to come from. Sheryl had legal and comms and my team, and all the people who clean up on aisle five, but the people pushing the jars off the shelves and making the mess were the product execs.
So, we definitely can't expect this to be some kind of adult supervision, because as long as he is keeping the product responsibility, decisions like the disastrous Twitter Blue decision are still his. It'll also be interesting to watch the policy decisions. They haven't really discussed who decides what to do about something like Turkey, whether that's her or not. If Musk is keeping all of that, then she's got an incredibly hard job in front of her.
Evelyn Douek:
Yeah, I mean, we've talked many times about how it's a bit of a poisoned chalice to take this role, because you're going to be the face of the company and it's unclear how much power she's going to have. So, that'll be interesting to watch. He has already assured his MAGA fans, who are freaking out that the WEF executive chair would bring back the shadowbans, that that's not going to be the case.
So if her job is to focus on brand safety, well, she has disparaged the platforms on exactly that front. Semafor had some great reporting this morning about her, in her previous role, talking about why TV ads are a much better bet for companies, because you just can't trust your brand with those brand-unsafe social media platforms. Good luck to her selling brand safety in her new role. Meanwhile, Tucker Carlson is announcing that his new show is going to be streaming on Twitter from now on. So, yes.
Alex Stamos:
Yeah, [foreign language 00:16:00], Linda, because yeah, buckle up. It's going to be crazy. It is interesting, because every time there's some kind of scandal in Silicon Valley, the TV folks use it as an excuse for why advertisers should come back to television, as if television has never done anything bad in America. I do appreciate thinking back to those days of being lectured by NBCUniversal folks about the responsibility social media has to the truth, as they cash the residual checks from The Apprentice.
NBC and CNN created Donald Trump: NBC created the media figure, and then CNN elevated him above every single other candidate for the Republican nomination. Trump is a real creature of television, as much as we talk about Truth Social and his Twitter account and such. So anyway, it is kind of interesting that she's taking over at the same time that they're becoming, effectively, an OAN or Newsmax outlet. It's also interesting to see what Musk said about Tucker Carlson when the announcement happened.
Evelyn Douek:
Right. He actually tried to put distance between them saying, "We have not signed any kind of deal. Carlson is going to get exactly the same treatment as everyone else," which is really comforting I think, because we know how predictable and consistent Twitter's content moderation is these days. So the fact that he's just going to get the standard content moderation bundle is I think very comforting to everyone that's worried about this.
Alex Stamos:
Right. I did see this as fascinating. I think this is also explicitly Musk not only trying to look like he's totally right-wing, but also trying to preserve his Section 230 protection here. He's basically saying, "Oh, Tucker is just another user of my product. This is not a situation in which we're paying him for a show." He is not signing up for uncapped legal liability for what Tucker says. If Rupert Murdoch will no longer be responsible for defamation suits based upon Tucker Carlson, Elon Musk does not want to step into that role.
Evelyn Douek:
So he's very explicitly saying, "We are not taking responsibility for this dumpster fire," and if Tucker pulls another Dominion, Section 230 will be there. So yeah, I thought that was really interesting as well.
Alex Stamos:
And then they'll pivot back to Community Notes, which is not a feature that really works on video. So, that is kind of funny. Maybe he's got some vision for Community Notes being relevant to a livestreaming platform, but right now, if somebody has an hour-long video, attaching a Community Note about something in minute 48 is a bit of a bizarre response. And that's not just a Twitter problem: the entire misinformation-labeling system really doesn't fit livestreamed shows, and that has been a challenge for a number of companies.
Evelyn Douek:
Right. Okay, so that's one to watch. We're still getting through what we thought were some of the biggest Twitter stories this week, and we're still not done. Twitter announced somewhat encrypted messaging, or, well, it didn't really say that. Alex, what's going on here, and what do you have to say about it?
Alex Stamos:
So, the technical name for what they're doing here is "encraption." You can encrypt anything; encrypting something is easy. The hard part is managing the keys in a way where only the people you want to decrypt it have access to those keys, and where you are sure that the keys you are using belong to the person you're communicating with. Just encrypting something happens all the time. You can say lots of messaging is encrypted, because any major service you use is using transport-layer encryption to move it across the internet safely.
It is possibly encrypted at rest on the actual hard drives on the other side, but it is decrypted automatically for you before it gets displayed in the webpage you're looking at, and it could be decrypted by the company. So they've been very careful not to say "end-to-end encryption," which has gotten other people in trouble; they've been very careful to just say "encrypted." But even then, I think they're really playing with fire, because they're advertising some kind of increase in privacy without any kind of key management, without verification of who people are, and without the ability to add multiple devices. Doing multi-device encrypted messaging is actually very difficult to do securely; both Signal and WhatsApp took a long time to figure out how to do it.
These kinds of issues mean that if somebody forces Twitter to, it will be able to decrypt messages, because of the way they're doing key management and the like. So any promises of privacy are actually invalid. I think it was incredibly dangerous and stupid for them to announce this half-done. You don't roll out end-to-end encryption half-assed, before it's ready. You can do some beta testing or whatever, but you don't announce it to people and say, "This is running," until it's actually ready, because you're making promises to folks that you can't live up to.
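To make the distinction Alex is drawing concrete, here is a toy Python sketch of the two models. Everything in it is illustrative: the parameters are deliberately demo-sized and the cipher is a stand-in, not real cryptography (real systems use X25519 key agreement, AEAD ciphers, and the Signal protocol). The point is only to show who holds the keys in each model.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters: a Mersenne prime, far too small for
# real use, chosen only so the protocol shape is visible.
P = 2**127 - 1
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    # Both endpoints compute the same secret; it never crosses the server.
    s = pow(their_pub, my_priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def toy_cipher(key, data):
    # XOR with a hash-derived keystream; symmetric, so the same call
    # both encrypts and decrypts. Never use anything like this for real.
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

# Model 1: "encrypted", but the provider holds the key. Anyone who can
# compel the provider (a court order, a gun to the head) gets plaintext.
server_key = secrets.token_bytes(32)
ct = toy_cipher(server_key, b"meet at noon")
assert toy_cipher(server_key, ct) == b"meet at noon"  # provider decrypts

# Model 2: end-to-end. Only the public halves transit the server.
alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob                 # same key derived at both ends
ct = toy_cipher(k_alice, b"meet at noon")
assert toy_cipher(k_bob, ct) == b"meet at noon"
# The server only ever saw alice_pub and bob_pub; without a private key
# it cannot derive the shared key, whatever it is ordered to do.
```

The remaining hard problems are exactly the ones Alex lists: verifying that a public key really belongs to the person you think it does (otherwise the server relaying keys can substitute its own and sit in the middle), and extending this to multiple devices per user.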
Evelyn Douek:
Right. In the Help Center post announcing what's going on, Twitter says, "As Elon Musk said, when it comes to direct messages, the standard should be: if someone puts a gun to our heads, we still can't access your messages. We're not quite there yet, but we're working on it," which is exactly what you want to hear. They're working on it. What it also means is that if someone puts a gun to their head, they do have access to your messages, and yep, that is not encryption. It's been helpful to learn the technical term for what it is.
So, yes. They're not saying it's end-to-end encrypted, trying to avoid legal liability. Obviously, the FTC is watching them like a hawk at the moment, particularly around their privacy and security practices. So, trying not to be misleading in any way, but the vibe of the thing is they're offering this thing that just isn't ready for primetime.
And then, just because it's a clown show over there, one of my more amusing content moderation stories of the week: Twitter appeared to limit the visibility of Bellingcat's account after it had been reporting on the neo-Nazi content on the account of the Allen, Texas shooter, which Musk called a psyop. He was clearly unhappy with the reporting and thought that Bellingcat was spreading misinformation, so they just limited the account out of personal vindictiveness.
This morning, Trust and Safety head Ella Irwin tweeted that it was because Bellingcat had posted a video of the Allen shooting, and that whenever accounts post graphic footage like that, they get a sensitive-content label that demotes them in the feed. One small problem with that: Bellingcat hadn't actually posted a graphic video from the Allen shooting, and once Bellingcat founder Eliot Higgins pushed back, it turned out they were no longer marked as containing sensitive content.
So, normally I like to say don't assume malice where incompetence is a viable explanation, but I have to say Twitter 2.0 is really burning through a lot of the goodwill that attends that assumption, and it really just seems like Musk chucked a hissy fit about not liking this particular reporting that Bellingcat was doing.
Alex Stamos:
Yeah. So, let's not blow past the fact that the richest man in the world, the owner of Twitter, believes that this horrible shooting in Texas was, as he called it, a psyop. He was playing into the idea that the whole thing was a setup to take people's guns or whatever, which is completely inconsistent with the fact that we are the only country in the world that has shootings of this type on a weekly basis. Another horrible, horrible event. You can have all kinds of legitimate debates about whether or not the Second Amendment is worth it, whether we can have better gun laws within the context of the Second Amendment.
Is this a mental health crisis? You can have those debates. What you can't say is that this is all fake. That is a completely ridiculous and non-factual idea. So, first, let's just not ignore to the fact that-
Evelyn Douek:
Very fair. Very good point.
Alex Stamos:
... Musk came out and straight up said... No, I'm not saying you're doing that, but I just want to take a moment to focus on the fundamental problem here, which is that he is like a 15-year-old 8chan-er in his credulousness whenever anybody says anything somewhat countercultural. If you tell him the World Economic Forum wants you to believe this, or the Illuminati wants you to believe that, he'll instantly believe the opposite, which is amazing for the world's richest man, someone who has benefited from capitalism and democracy, the economic system that put him on top, and who is now some kind of bizarre populist who believes every conspiracy theory.
But yes, on the actual decision, it does look like they just did it because he was pissed at them. There's going to be a paper trail for this. Again, there are some good people left at Twitter who are trying to keep the lights on, and I think it is getting harder and harder for them to survive with this on their resume. You have people burning their reputations for him, being part of an organization where he says this is a psyop and then punishes a totally legitimate open-source intelligence group.
The other connection here is the Russian connection. Bellingcat is hated by the Russians. They do a lot of great research in Russia; they've unveiled huge Russian campaigns, and they helped investigate the poisoning of dissidents in the UK and such. So once again, it's bizarre that he's aligning himself with global autocrats in attacking Bellingcat. I mean, I don't know what to say, other than he's been melting down. But it's been like this since the beginning; it's not that it's getting worse, it's just that the mask is falling off. Once again, [foreign language 00:25:14] Linda. Good luck with this.
Evelyn Douek:
Exactly what I was going to say. Good luck selling brand safety in a world where your... I don't know what his title's going to be, but basically boss is making individual content moderation decisions based on conspiracy theories and vindictiveness. It seems like a great sell.
Alex Stamos:
Which always lines up with autocrats around the world, just FYI. So, it just happens to be if he flips a coin, that's a conspiracy theory he believes. Even though it always just happens to line up with what Vladimir Putin believes. So, it's cool. Don't worry about it. Nothing bad going on. It's great.
Evelyn Douek:
That's right.
Alex Stamos:
It's great.
Evelyn Douek:
It's totally worth it.
Alex Stamos:
Good luck, Linda. No problems there.
Evelyn Douek:
Good segue to talk about what's going on elsewhere in the world, in the context of what's going on in Turkey and the challenges these platforms face when they operate in these global environments. Let's look at a couple of other jurisdictions that have been busy this week. Singapore has proposed, and in Singapore, once a law gets proposed, it's pretty much set to pass, an Online Criminal Harms Bill-
Alex Stamos:
Democracy's easy when you don't have it.
Evelyn Douek:
The opposition said... Oh, wait, no, sorry, never mind. The Online Criminal Harms Bill would grant the government broad powers to restrict content online, from blocking the communication of certain material or web addresses, to removing apps from app stores in Singapore, to restricting particular accounts. This is basically a dramatic expansion of the powers the Singaporean government already has under its already-controversial POFMA law, the Protection from Online Falsehoods and Manipulation Act, which it's had for a number of years now and has used pretty extensively and controversially.
I've actually been told by one of my students that it's common parlance in Singapore to say, "Oh, you've been POFMA-ed," meaning you've been hit with an order from the Singaporean government, which gives you a sense of how extensively they use these laws and this control over online content. So, that law was specifically about fake news and false content, and this bill broadens things out to a whole bunch of other kinds of harms.
Alex Stamos:
Yeah, so it is a great example of why you have to be incredibly careful about which countries you back in any efforts they have against disinformation. I went to Singapore a couple of years ago. Full disclosure: I had a student who worked for the Singaporean government, and after he was in my class, he arranged for me to go over there and give talks to a variety of ministries about disinformation actors. So when I prepared for this, I was thinking, what is the disinformation risk to Singapore?
You really think about China. The People's Republic of China has been leaning on Singapore; they want Singapore in their sphere instead of the US sphere. Singapore controls the Malacca Strait, which is incredibly important in any conflict between the US and China for oil deliveries from the Middle East and such. So, Singapore is incredibly strategically important, and it is seventy-some percent ethnically Chinese. Now, these are Chinese people who have usually lived there for quite a long time, with roots in the south of China.
The ethnic history of Singapore is fascinating, and part of the idea of Singapore is how they justify autocracy. I heard a great story from Larry Diamond, who we work with at Stanford, about him talking to the founder of Singapore about this. Effectively, they said, "If we had a real democracy, we'd instantly end up with ethnic parties, with ethnic division. That's what democracy brings you. We are an autocracy to try to have Chinese people, Malay people, Indians, all the different melting pot of folks, work well together." Well, the PRC believes that ethnic Chinese people have a responsibility to the Chinese Communist Party, no matter where they live.
When I think of disinformation in Singapore, what worries me is attacks by the PRC, the Ministry of State Security, the People's Liberation Army, trying to create ethnic divisions, to manipulate people based upon their ethnicity, to utilize the variety of languages and the fact that there are lots of people who can read Chinese, who use WeChat, and who have family members in the mainland, to try to manipulate them. So, I go there and I talk to them, and you get all these questions about, "Well, how about this vlogger? How about this thing? How about this newspaper?" So, their idea of disinformation is really news stories they don't like.
I expected a bit of that, but it was amazing to try to sell them on, yes, I understand that people are writing stuff about the country that you don't like, but what you really should worry about is the MSS utilizing WeChat as a tool to manipulate large portions of the population, and that was not on their radar at all. It was just amazing to me.
Evelyn Douek:
So, good luck, Linda, with the Singaporean Online Criminal Harms Bill and elsewhere.
Alex Stamos:
No problem.
Evelyn Douek:
Yeah, exactly. Another one on the list, I hope you're reading up. Pakistan also shut down the internet in the last week in response to protests over Imran Khan's arrest. Russia fined Google over "LGBT propaganda" and false information on YouTube, for failing to delete videos that it said promoted LGBT lifestyles and spread false reporting about Russia's military campaign in Ukraine. So, when we talk about Turkey, we are talking about Turkey, and of course we have these next two weeks coming up, but we're also talking about the fact that this is not just a Turkey problem. This is a problem that the platforms face in many jurisdictions around the world.
Alex Stamos:
Yep. It's going to keep happening. If you set the standard that all you have to do is send an email to us, it's not going to work out well. It will be interesting to see what they do next. I feel like the pushback on Musk and maybe his people are going to get to him and be like, "Sir, we get 50 of these a week, and you just opened the door to all of them," that they're going to have to reset now and they might need to create a situation where they publicly push back on a country just to make it so that the floodgates don't open.
Evelyn Douek:
Well, that's the optimistic take. Maybe a country that doesn't have a contract with SpaceX, but we will see.
Alex Stamos:
Yes. Well, let's look at the map of autocracies that do not allow Teslas to be sold, are not actually a large marketplace and do not use SpaceX. So North Korea, effectively. The DPRK-
Evelyn Douek:
That's right.
Alex Stamos:
... is now the only country in which Twitter will stand up to censorship requests.
Evelyn Douek:
Making a stand next week, you heard it here first, folks. Okay, heading over to Europe.
Speaker 3:
[foreign language 00:31:17].
Evelyn Douek:
So in breaking news this morning, the European Court of Human Rights upheld the prosecution of a local councillor for failure to delete comments posted by third parties on his Facebook wall. The case is Sanchez v. France, and it involved the criminal conviction of a local councillor, Sanchez, who was standing for election to parliament and was convicted of incitement to hatred or violence against a group.
So, what had happened was that in 2014, he wrote a post about a certain MEP, and two people commented certain hateful things under it, including saying that the MEP was allowing the town to be run by Muslims, the kind of delightful commentary that you'll find on many posts by politicians. The MEP's partner didn't like this. They [inaudible 00:32:00] the posts and lodged a criminal complaint, and the French courts upheld a prosecution of the local councillor for failure to delete someone else's speech.
And then the European Court of Human Rights this morning upheld that prosecution. So, we're going to be in this fantastic situation, I guess, where in France, you could be prosecuted as a politician for failure to delete something on your posts. Whereas we have these cases going to the Supreme Court this next term about the idea that in the United States, politicians can't delete things on their posts because this is a limited purpose public forum. People come to discuss local issues. They can't viewpoint discriminate against certain kinds of statements. So, I think it just shows the total lack of coherence of how people think about these spaces and what these spaces are and what happens when a politician opens a Facebook page as part of their campaign.
Alex Stamos:
Yeah, and just the limits of all of the metaphors to physical political spaces are falling apart. Every one of these countries is trying to figure out, wait, is this actually a public square outside the building, or is this inside the courthouse where you have free speech but within rules? Is it like you're heckling or do you have a microphone? Or is it like posting a flyer? No, it's none of those things. It's somebody posting on this guy's Facebook account.
So, I do think we're going to need active shaping of this, because the courts always pivot back to what kind of metaphor do you have of how somebody would disrupt a community meeting in 1914? That's just not going to be relevant to a politician's Twitter account.
Evelyn Douek:
Right. I have to say, just on the merits, I think this is a pretty concerning ruling because of what it incentivizes. It incentivizes politicians to delete comments, or turn comments off entirely when they make posts, which is hardly conducive to a flourishing of democratic discourse and different viewpoints, if you face possible exposure for what any crazy person might say on your Facebook post. So, great.
Alex Stamos:
Right, but you also don't want people to be bombarded with death threats and such and then have to just live with it because they happen to be the local county commissioner or dog catcher, or whatever. It's super hard. I don't think there's any good answer. The other thing is that you can't have a troll farm from 5,000 miles away invade your school board meeting.
So, I think the virtual aspects of this, the amplification, the anonymity, and the fact that people can manipulate these things intentionally from far away, even if they don't actually live in that community, all mean that we need different metaphors than being in a public space. It's pretty silly to say that you have an actual duty to maintain your own Facebook page, but that is where a number of European countries have ended up.
Evelyn Douek:
Yeah, and I think Australia as well has had a similar ruling, which I have to take the heat for, where-
Alex Stamos:
Right, you personally.
Evelyn Douek:
Yeah, exactly. I'm sorry, I apologize again. That ruling held media outlets liable for defamatory comments on their posts as well, and I just think... I take all your points, but I do think this creates very dangerous and bad incentives. So, I think you're totally right about the limits of the analogy. We are not people wearing wigs and hats back at the town hall meetings in a little Massachusetts county. This is a completely different beast, and the courts have not caught up.
Alex Stamos:
If Australia keeps on going this way, and you want to be a First Amendment professor, I think we're going to have to get you a dialect coach and you're just going to end up sounding like a newscaster. "Hi, I'm..."
Evelyn Douek:
Australia? I've never heard of it, I don't know what you're talking about. British is actually not a good way to go.
Alex Stamos:
No, no, no.
Evelyn Douek:
All right, I will work on my American accent and just defend everyone. Although having an Australian accent means I can mispronounce words and just blame it on the Australians, which is very handy to be able to have when you do public speaking. So, there is that.
Alex Stamos:
Right. We should just warn our listeners that our cold opens from here on out can be accent-offs, of us trying to... I'll try to do a Tasmanian.
Evelyn Douek:
To be sure, to be sure. People are going to love that. All right, so with that, this has been your Moderated Content weekly update, and this show is available in all the usual places, including Apple Podcasts and Spotify. Show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino, Policy Analyst at the Stanford Internet Observatory, and it is produced by the wonderful Brian Pelletier. Special thanks also to Alyssa Ashdown and Rob Huffman. See you next week!
Alex Stamos:
Oh, top of the morning.