Moderated Content

News Update 3/11: Congress Agrees More than We Do on TikTok

Episode Summary

Alex and Evelyn discuss the latest bill to ban TikTok and its many flaws; the Gemini image-generation public relations crisis; Apple's fight-picking in Europe; and Texas and Florida's latest great attempts to regulate online speech.

Episode Transcription

Evelyn Douek: It is a terrifying world out there. I'm not a mom, but if I was a mom, I would be terrified of what children could see on the internet every day. Every day. And we, as Americans, must do more. It doesn't have the American accent, but I feel like I'm getting in the vibe. I could live here. It's not a bad-

Alex Stamos: Do you want to teach an entire quarter in that voice? That would be-

Evelyn Douek: Do you think the students would like it? I'd give them nightmares, so.

Alex Stamos: I'm not sure your look is exactly trad wife, but we'll work on it. We can work on your aesthetic.

Evelyn Douek: Yeah. I'll just bring a kitchen setting backdrop. That'll help.

Alex Stamos: Yeah, and then we'll just send you to the University of Alabama to join Chi Omega for a couple of years and you'll get it.

Evelyn Douek: This is method acting at its finest.

Welcome to Moderated Content, stochastically released, slightly random, and not at all a comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. We are starting with our first TikTok tick-tock in a while, as it seems that things are heating up again this week. It seemed that momentum had stalled for a national TikTok ban after a flurry of activity last year, but then out of nowhere, there's sudden action on the Hill again this week.

In the past week, in just two days, the Protecting Americans from Foreign Adversary Controlled Applications Act was introduced and then unanimously passed out of the House Energy and Commerce Committee with a vote of 50 to 0. Well, we're going to talk about the provisions of the bill in detail and what exactly it does, but it essentially requires ByteDance to divest TikTok within 180 days after the bill is enacted. Otherwise, app stores have to remove it from their platforms. It names TikTok specifically, but then also sets up a process by which the president can do the same with other foreign adversary controlled apps in the future. There's sort of the politics of this and then the policy of this. Let's start talking about the politics, which is the response that TikTok took.

Alex Stamos: Gosh, what's the best way for them to prove that they do not have an inordinate amount of influence over the teenagers of America? What's the best way do you think, to do that, Evelyn? How would you demonstrate to Congress that they're just a totally responsible app that could not manipulate teenagers to do whatever they want?

Evelyn Douek: Certainly, I would not be using a dark pattern notification to essentially put a call to your congressperson between you and the content that you desperately want to see, which is what happened over the course of this week.

Alex Stamos: I think it's time to finally apply that. We usually reserve that for Twitter, but this is the most inexplicable government affairs strategy I've ever seen in my life, perhaps in all of American capitalism, certainly within the tech industry. And to have teenagers flood Congress with calls, because you put up, like you said, a dark pattern. You put up a thing where you can't click through easily. To get out of it without calling your congressman, you have to carefully find the little dark pattern, hidden button at the bottom. And so Congress gets flooded with these calls, including at least one case, according to one member of Congress, a kid who said he was going to kill himself if TikTok was banned, which does not make that Congressman think, oh, well, you're a well-adjusted man-

Evelyn Douek: What a healthy relationship this person has with their social media platform. Correct. The Guardian also reported that staffers were getting a lot of calls from high schoolers asking what a congressman is, which suggests a whole other bunch of problems that maybe Congress should be looking into fixing. But yes, the users found that they were getting this pop-up once they opened the app, after the bill had been introduced but before the vote, that said: "Congress is planning a total ban of TikTok. Speak up now before your government strips 170 million Americans of their constitutional right to free expression. This will damage millions of businesses, destroy the livelihoods of countless creators across the country and deny artists an audience. Let Congress know what TikTok means to you and tell them to vote no." And then a giant call now button that gave people the numbers of their local rep to call.

Alex Stamos: Maybe if they talked this way when they called their congressman, it would've been more effective.

Evelyn Douek: It's compelling stuff. Compelling stuff. How could they know in advance, unfortunately, that this was the way to engage in effective political discourse?

Alex Stamos: Did all those congressmen who aren't taking their calls... One, these teenagers don't understand, they're talking to other teenagers. The person picking up the phone, my son just did a congressional internship, it's a high school student or a college student taking the notes. They probably actually agree with them, but they just have a script of like, "Thank you for your feedback. I will relay that to the congressperson at the latest attempt." All those calls, did the members of Congress say, "Oh my God, I'm giving up," or, "I agree, TikTok has First Amendment rights." Is that what the outcome was with the House Energy and Commerce Committee?

Evelyn Douek: Yeah. Obviously these teenagers were not persuasive enough. They need to work on their political persuasion, I guess, because no, as we said, unanimous vote. A shocking thing in politics these days to have a unanimous vote out of the House Energy and Commerce Committee. I guess regardless of whether it was First Amendment protected activity, it might've been politically stupid and certainly didn't have the persuasive effect that TikTok would have hoped. And now the momentum is back, and House Majority Leader Steve Scalise has said that he'll be bringing the bill to the House floor next week, which would be astonishingly rapid. So we'll have to keep an eye on that.

Alex Stamos: Rapid, and it was bipartisan, but now we have this interesting wrinkle that we should probably talk about in the political situation here.

Evelyn Douek: Of course, because we live in a wonderful timeline where-

Alex Stamos: Who was the president who first attacked TikTok? I'm just trying to think who it was. I think it was a man named Donald J. Trump who really wanted to ban TikTok. I haven't been paying attention to the news, but I'm sure he went out on Truth Social and said he absolutely supported this bill. It's exactly what he was calling for when he issued an executive order that the courts overturned.

Evelyn Douek: If Donald Trump is known for anything, it's his consistency and deeply thought policy views on issues that are hard to dislodge. Even in the face of criticism and changing facts, he really is a man of his word that sticks to his prior positions. No, of course, Donald Trump on Truth Social weighed in after the bill was passed out of committee saying, and I can't do the Trump voice, so maybe I need to throw this one to you, Alex, but, "If you get rid of TikTok, Facebook and Zuckerschmuck..." Which he's good at the nicknames. Let's give him that.

Alex Stamos: He's good at nicknames. It's a little anti-Semitic, but it's kind of funny at the same time.

Evelyn Douek: Fair enough.

Alex Stamos: It's perfect. It's a perfect Trump nickname of like, it's problematic, but people kind of laugh and chuckle at it.

Evelyn Douek: "... will double their business. So apparently I don't want Facebook, who cheated in the last election, doing better. They're a true enemy of the people," unlike the CCP in Trump's world. So yes, this was a record scratch moment where the president that had tried to ban TikTok ostensibly, or potentially because a bunch of TikTokers booked a bunch of tickets to one of his rallies and made him think that it was going to be highly attended and then wasn't highly attended. So he got grumpy, tried to ban the app. Now he seems very, very concerned about the anti-competitive effects of such a move. I'm sure that that's what it is. I'm sure that it's about the potential to ruin the competitive markets. There's no other reason I can think of why Donald Trump would be reversing his position here. Alex, do you have any ideas?

Alex Stamos: I'm loath to bring it up because it's so unlikely that this would have an influence on Trump, but I think there's two things that we should mention. One is just overall in the political sphere, TikTok has become much more of a problem for Joe Biden than a help, in that TikTok is full of anti-Joe Biden stuff. Mostly because of the Israel-Hamas conflict, in that a lot of the content, which has actually driven a bunch of, I think, Democrats to support this bill, is that they believe that that's being pushed by TikTok. Again, as we've discussed, there's no evidence of it being artificially pushed, but lots of young people are there and it's very viral. They're creating anti-Israel content that's very, very viral.

And again, I can't imagine that this is true, because if he's known for anything other than ideological consistency, it's for his personal integrity. But Trump did meet with a massive donor, Jeff Yass, a finance guy, a private equity hedge fund guy who donates to lots and lots of Republican causes, who just also happens, in a crazy coincidence, complete and total coincidence, to be talking to Trump about backing him when Trump also owes a half billion dollars in fines. And totally unrelated to the fact that he's talking to Trump: he owns $33 billion in ByteDance stock. I think under fairness, we have to mention these things-

Evelyn Douek: We're not saying anything. It's just for completeness.

Alex Stamos: It's impossible to imagine that a meeting that happened seven days before he changed his mind could possibly be at all related to Trump-

Evelyn Douek: After years and years of holding the diametrically opposed position, aggressively.

Alex Stamos: Not in a minor way of continuously talking about it and making it his entire symbol of America's lack of competitiveness against China, against the Chinese Communist Party. But I just can't imagine that he would have that kind of problem. It's not like his son-in-law would've taken a $2 billion investment from the Kingdom of Saudi Arabia moments after leaving office after they did all these things for Saudi Arabia. It's exactly the kind of thing that would not happen when you talk about the Trump administration.

Evelyn Douek: Certainly not anything that we would be implying by saying these facts in this particular order. There's nothing to be implied here from what we're saying. So, yes.

Alex Stamos: Yeah. It's also interesting because this reporting is old, but it's coming out now because of the Trump thing: Jeff Yass has been going after all these Republicans and telling them he won't support them, which is fascinating. This isn't actually technically a TikTok ban, right? I think we're about to talk about the law, but it does create the opportunity for TikTok to be divested, in which case TikTok employees and shareholders in the new entity would be fine, but ByteDance would possibly lose out. Although, I think we talk about scenarios. I think there's scenarios where it's a win-win here for ByteDance still. At least from a mark-to-market perspective, Yass's $33 billion would significantly decrease if this bill passed.

Evelyn Douek: It'll be interesting to see what that does to the politics, but it scrambles this moment of rare and complete bipartisan agreement about the bill, and we'll see what happens in the House, and if there is a companion bill in the Senate, which there isn't at this stage. In the meanwhile, Biden has apparently said, "If they pass it, I'll sign it."

Alex Stamos: Which is fascinating.

Evelyn Douek: It is fascinating.

Alex Stamos: At the same time, you know his grandkids are bringing it to him and like, "Pop-pop, TikTok really doesn't like you."

Evelyn Douek: Well, and at the same time though, his campaign joined the app this week or recently, as well. So he's also very concerned about it, but also going to play the game, of course.

Alex Stamos: I think on the Democrat side, at least they're better about pointing out that this is really... Their goal is divestment. They want TikTok to exist. They think TikTok should [inaudible 00:11:16] they don't use... The TikTok ban language either comes from TikTok or it comes from some of the really anti-China Republicans who want to be seen as tough. I think the Democrats have done a better job of laying out that their position is they don't want it controlled by a company whose officers are members of the Chinese Communist Party, for which, as you and I have discussed, I believe there's actually pretty legitimate reasons there, even if this is not functionally the way that we should get there.

Evelyn Douek: Okay, so let's talk about the bill then and the mechanics of it, because it's being called or being framed as a TikTok ban, and I actually don't think that is an unreasonable framing of what this bill would do in certain respects. There's a couple of things to say about it. First of all, it does this very weird thing where it both names TikTok specifically as one of the targets of the bill, but then also sets up a general process by which future other foreign controlled applications can be named. It seems to be both trying to say this is not directly targeted at one app, while also creating a bill of attainder problem by naming a specific entity that the bill is targeting. I don't know what was-

Alex Stamos: Which is crazy.

Evelyn Douek: ... hoped to be gained by that.

Alex Stamos: What's the point of having a standard? And everybody knows. It's not like the president's going to use this against anything but TikTok first. You've already heard. I could see them putting in that if Biden was resisting and you're like, "Oh, well, we have to force the president's hand," but he's not. And so creating this new, just another legal problem for yourself, by putting the words Tik and Tok and the word ByteDance in there seems ridiculous. You should be able to ctrl+F through this thing and not find... And then I'm sure you still have bill of attainder problems when you very specifically... Which apparently doesn't apply to the EU since they're able to come up... They're not able to say the word Facebook, but they can come up with quote unquote fair criteria that only fit one corporation in the United States. But at least in the US, you're not supposed to do that. I don't know, back to my civics, but you're not supposed to pass a law that just says, "This person's bad and I want bad things-"

Evelyn Douek: And I want to punish them specifically without any trial.

Alex Stamos: For their speech.

Evelyn Douek: Exactly. Without any trial in the first place, by law. That's not great. And then there's this second discourse around, well, it doesn't actually ban the app; all it does is require the app to be divested within 180 days, and if it doesn't, then the app stores can't have it in their stores. I just don't find this convincing as a way of saying this is not banning the particular speaker. First of all, it is inflicting punishment on TikTok and forcing that entity to no longer be a speaker or be a participant in the debate, so that's a threat to their First Amendment rights. And the idea that you can accomplish indirectly what you couldn't accomplish directly is just not how the First Amendment would work. That would seem to create a back door around any First Amendment problem by just saying, "Well, no, no, no, we're not banning it. We're just requiring this entity to divest itself within 180 days." I don't find that compelling either to say, "Let's just sort of zhuzh it up in a particular way and then we're not trying to ban the app."

Alex Stamos: So this is where you and I start to diverge because one, I'm not as concerned. I've looked it up, it's Cayman Islands. I'm not as concerned about the First Amendment protections of ByteDance, a Cayman Islands company that's actually controlled by America's adversaries. I expect that if you look in the record, there is plenty of at least eighteenth-century jurisprudence about First Amendment stuff not applying to, say-

Evelyn Douek: First Amendment law is one of the least historically inflected areas of law whatsoever. We're not going back to an originalist understanding of the First Amendment in any event. I agree, but the corporations do have these speech rights and it's pretty well established. Okay, so maybe we're not worried about TikTok's First Amendment rights, which we may or may not be, but a bunch of other entities' First Amendment rights are implicated here. Let's talk about the app stores, because the way that this bill works is by requiring the app stores to remove these foreign controlled apps from their services, whether it be TikTok or any of these future apps. Now, we have a whole bunch of cases at the Supreme Court right now about whether the government can require platforms to moderate or not moderate certain kinds of content, and I think that there would be a real legitimate question here about whether this would be an infringement on Apple and Google's First Amendment rights to host these apps.

Alex Stamos: I understand why they did this. I think they're trying to play four-dimensional chess or whatever, a couple steps down the road here of, okay, great, if we just banned the ownership of TikTok by ByteDance, or we fined TikTok US, or we shut down TikTok US's operations, they could continue to provide TikTok from outside the country. They could just fire their American employees, get rid of their Delaware corporation and continue, and we don't have a great firewall of America. And they can't write a bill, certainly, that says that all of the IPs in America need to block something. That's clearly a First Amendment violation. They're trying to get there through the app stores because, even if TikTok.com, or TikTok.cn if necessary, is available to web browsers, young people don't know what a web browser is, and they can only interact with TikTok via the mobile app.

But this seems like a really stupid move to me, because practically, this bill passes, whatever Google and Apple think about TikTok and ByteDance, which I think they probably have different... I think Google would love to see TikTok gone. Apple, I think, enjoys having TikTok. It probably makes a bunch of money off of young people using TikTok on iPhones. They probably differ here, but I think both of them would probably have NetChoice intervene immediately over the government telling them what apps can and can't be in their app store through bills of attainder, like you said. That just seems tactically dumb to me because no matter what, this bill's not going to work for a year, because you're going to have a year of Google and Apple, of WilmerHale and other really expensive law firms litigating this on behalf of really American firms that you cannot make any national security arguments about.

Evelyn Douek: Oh, a year, Alex, that's so sweet and optimistic.

Alex Stamos: I'm sorry, five years.

Evelyn Douek: Yeah, I love your faith in the American judicial system and how quickly this would all get resolved.

Alex Stamos: That's a good point.

Evelyn Douek: These fundamental issues. Okay, so there's the Apple and Google problem. Then of course, there's the rights of users and the huge impact on speech that goes far beyond any of those. If you're not caring about the corporate entities so much, this impact, if it does end up resulting in a ban of the app or the app not working anymore, then you have all of these people who create all of this content who have livelihoods on the app whose content and free speech rights are being affected. And the argument is, well, it's not targeting their speech. That's just kind of an incidental impact as a result of the fact that we're targeting this national security risk of operating this business.

But the problem with that is that this is a huge impact that seems very disproportionate to the risk. Certainly in light of the fact that there may be some evidence, but there hasn't been compelling evidence. We're not supposed to just allow Congress to, with speculation or without proving its points, get around First Amendment concerns. And second of all, there are less restrictive means for accomplishing this end. If your concern is really about data and privacy concerns, then obviously passing a national privacy law would be a much more effective way of getting at that problem without impacting the speech of millions and millions of people.

Alex Stamos: Which you and I have discussed. Clearly, the solution here is you have a national privacy law that says certain kinds of information you have to protect no matter who you are, and in some cases, it just can't be accessible to Chinese citizens. And that's just the rule. And that would do GDPR-plus, effectively what the Europeans are trying to do through all the European court decisions about deciding what is equivalent protections elsewhere. We can just say for these US adversaries, which in this case we should make it clear, adversaries is defined elsewhere in US Code, but China, Russia, Iran, North Korea are the countries that are in there right now. What's relevant here is China and Russia, and I think we should talk about that too, in that the other thing that's fascinating me here is reading through the definition of a covered company. It just says TikTok, like we said. They're like, "Here's all these rules and TikTok," even if TikTok isn't in these, but the rules are over a million users and effectively any app that allows people to upload content that other people see.

So in here, the definition of a covered company includes a website, desktop application, mobile application, or augmented or immersive technology applications, they got VR-AR in there, that permits a user to create an account or profile to generate, share and view text, images, videos, real-time communication or similar content, has more than a million monthly active users and yada yada, allows it to be distributed that other people can see it. And so, that is a lot of things, right?

Evelyn Douek: Right.

Alex Stamos: If you just talk about apps in which you can just have a... This basically sounds like any app with a chat window. And so every app with a chat window with more than a million is obviously TikTok, it's going to be WeChat, it's going to be half of the video games that come out of China. They all have online mobile stuff. Or just having the ability for somebody to set their own avatar, for example, seems to capture this. And there's a bunch of games that have more than a million MAU. I don't think they've thought through the fact that they're catching a much broader swath than TikTok. Now, like you said, the president has to make the determination in the end, and the odds of Biden going after... But it seems like a huge amount of discretion that they're giving to the president to punish almost any app from these. Because definitely what should be next, if you worried about these things, is VKontakte and Yandex in Russia, a covered country, which also have both all these features and well over a million users.

Evelyn Douek: Two big problems with this definition, to my mind. The first is how incredibly broad it is. Absolutely. You read it and it's just striking that it could cover many, many different kinds of apps. I think video games is the obvious one that is swept in by this very, very broad definition. It's not automatic that it's going to apply to these, but it does give the president the power to start a process to have the same rules applied to them. The other thing that I think is problematic about this definition is the rhetoric around this bill and the conversation has been, "Oh, no, no, this has nothing to do with content. This is all about data collection. And the problem that we have is TikTok collecting data on users," and you look at this definition of covered entities, and it's not about entities that collect data. It is not saying, "These companies that collect this kind of data, they are the kinds of companies that we should be worried about."

It's about companies that allow users to create content. It is all about companies that are basically social media platforms or video games or other kinds of companies where they provide a platform for speech. And so as much as the Congress people are talking about how this is a data protection bill, their bill targets speech platforms, and so I think that's a huge problem for that kind of argument and that rationale.

Alex Stamos: I think we're agreed. This bill, it's problematic. If you, like me, would like TikTok not to be controlled by the CCP: I think that TikTok's USDS plan, what they have laid out of the ways that you could run TikTok so that it is isolated from the People's Republic of China, is not a terribly bad plan. The problem is that there's no law backing it up, so it's just voluntary by them. And there's no kind of public transparency, nor is there an agency that has the capability to go enforce this. And so, if you actually care about this kind of thing happening, you really need a law that talks about, "This is about privacy, it's about data storage. It's not about speech." You don't put the words Tik or Tok in there; Byte and Dance are words that shouldn't be in the bill.

You come up with fair rules. You could probably have the MAU rule because you want a VLOP-like structure where you don't catch tiny little things, you only care about the big stuff. And create an authority, like a data protection authority, that has the ability to enforce it and actually audit, and then real penalties for misleading them or lying to them. I think that would have the same effect. If they really want TikTok to be divested, what you're talking about is the creation of a US entity that controls its own servers, they're in the cloud, but controls their cloud servers, that licenses code from ByteDance. I don't think the people who wrote this bill understand that there's going to have to be a forever relationship between TikTok and ByteDance, because ByteDance writes all the code here.

Now, those things exist, right? Those kinds of subsidiaries. This is how Microsoft operates in Europe, in that there's a European subsidiary that effectively licenses software from Microsoft to run Microsoft Cloud products. I saw this at Yahoo, in that Yahoo Japan was its own legal entity, regulated under and owned by Japanese entities, but we licensed them all the code so that they can run it. So you can do it, but you have this real complex relationship, which is not that different than what USDS is, but instead, it's actually legally enforced. You could build it so that ByteDance is still the economic beneficial owner, just has absolutely no control, which is kind of the idea of Project Texas and USDS, but nobody really believes it. Their stuff looks fine, but they have not described it in a way that any credible person is like, "Oh yeah, I'll totally sign up for my career that that's true." And so you could do that. You could enshrine that in law and then it would also have the benefit of applying to every other company and not just to the ones that you've named in your bill of attainder here.

So anyway, rant over, but this is just driving me nuts because I actually care about TikTok. I care about their influence on the election. I care about the possibility that the PRC is going to abuse it, and I think this bill's going to set us way back, because it's going to pass. It looks like it's going to pass. Biden's going to sign it, and then like you said, it's going to be five years of litigation, of which American companies will be doing most of it, because you're really enforcing this against Apple and Google, and nothing is going to happen for five years, which is too long to wait, from my perspective.

Evelyn Douek: I guess I'm not so confident that it will pass, but that would be quite amazing if this is the bill that Congress gets its act together. I'll just say I'm approaching this from the other perspective in that I don't really particularly care about TikTok, per se. I don't know that I have enough information or enough of a view to be worried or not worried about that app. My concern is what kind of precedent we're creating and what kind of rules of the road we're creating for dealing with this problem, which is going to be an ongoing problem. It's not a TikTok problem. This problem of what do we do with these international platforms, these foreign-owned platforms in a globalized world, in an interconnected world going forward. And I just think that this is a deeply problematic answer to that question, which is that we designate particular ones of them that, for whatever political reasons, become disfavored.

And then pretend that what we're actually doing is just asking for them to be divested from their foreign owners, but allowing them basically to essentially be shut down. I think that is a deeply problematic future for the internet that is not at all speech-protecting. If it's not TikTok that gets you worried about the kind of power that this gives the government, I'm sure you can think of hypotheticals where this kind of process would be deeply problematic.

Alex Stamos: Yeah, I totally agree. Maybe we're in different places. This is my day job. I spend a lot of time watching the Ministry of State Security and People's Liberation Army break into American companies. And from where I'm sitting, we are already in a cyber war with the People's Republic of China. We're probably going to end up in a shooting war with them in the next decade. And so I do think we need to build these frameworks, because the other option is, it doesn't happen until there's American Marines being flown to Taiwan, at which point the First Amendment issues are not going to be appropriately litigated. I guarantee it. If we're having this discussion with a certain person as president while Americans are dying in a shooting war, then you're going to end up with an outcome. I'd rather we deal with it in a reasonable way during peacetime than for it to go that way.

Evelyn Douek: Look, I completely agree. And my day job is to think about all the ways in which the government cracks down on speech that it doesn't like, not because of legitimate reasons. And so that's where I am coming into it from. And it is in those moments where we are at war with people dying that First Amendment protections are also especially important in many ways, because that's the value that we care about in those moments with-

Alex Stamos: I hear you and I validate your feelings, Evelyn. I think they're totally valid.

Evelyn Douek: Likewise. What a respectful disagreement we're having.

Alex Stamos: We're great. We don't need to bring in a marriage counselor to this podcast.

Evelyn Douek: To save the pod.

Alex Stamos: Alex, did you hear what Evelyn said about her concerns? I did hear her concerns. Now, Alex, restate what you heard from Evelyn. But you feel validated that-

Evelyn Douek: I feel validated. I also hear your concerns. I appreciate our differing perspectives on this fraught and important issue. And with that, from one culture war topic to the next, let's talk about Gemini, because we haven't done one of these news update episodes in a while, and so we're a little bit late to the party here. But we should talk about the Gemini image generation public relations crisis that played out. Now, I'm going to assume that our listeners are mostly across this, as it played out a few weeks ago, and it was massive culture wars bait.

But essentially, Google launched Gemini 1.5 and people started noticing that it was reluctant to generate images of white men, and that by rewriting user prompts, it was producing ahistorical images of, for example, Black and Asian people as British royalty, pictures of the founding fathers that looked more like the cast of Hamilton than Mount Rushmore, and racially diverse Nazis. Now, the first thing to say about this, which should be obvious, but somehow wasn't to everyone, is that this is not what anyone would've wanted. This is not the outcome that people were designing for.

Alex Stamos: A Bridgerton version of Saving Private Ryan, where the Nazis are played by a multicultural cast of people with American accents. That is not a woke goal.

Evelyn Douek: That is not what we were optimizing for. Exactly. The idea, the culture wars backlash against this, as if Google or anyone concerned with AI being biased would actually want these outcomes, was ridiculous. It's fairly obvious that this was a mistake, and Google admitted as much fairly quickly. It's also very obvious how it happened. There have been long-standing concerns about the tendency of generative AI to reproduce social biases and stereotypes, which totally makes sense. They're just probability machines, and when the data you feed it lacks racial diversity, so do its outputs. And so generally the efforts of companies to be conscious about this and stop reproducing those biases is good. You don't want all requests for CEOs or doctors to produce white men. That would be a bad outcome socially. But obviously when a user asks for Nazis and the bot returns pictures of people of color, something has gone horribly wrong in your process here. Alex, I know you had lots of thoughts about this one. What was your reaction to this?

Alex Stamos: Yeah, I have thoughts on this one. So clearly it's a mistake. I mean, this is a product failure. It's a product failure people should point out. That's fine. It's funny, right? It's funny that you can get it to generate Asian women as Nazis, as SS soldiers. I mean, not funny, funny, but it is memeable that if you ask it to show you Nazis and you're like-

Evelyn Douek: Yeah, definitely not funny, funny. I definitely didn't find that funny, funny. No.

Alex Stamos: So you can see why people would talk about it. What's ridiculous is the overwrought rending of clothes about how the future of AI is woke and whatever. I want to hit a couple of specific points. One, a number of people, including some colleagues of ours whose analysis I deeply respect in other contexts, have used this to impugn trust and safety, putting trust and safety in quotes. And this isn't really a quote unquote trust and safety issue. This is an alignment issue. When you talk about AI systems, the challenge here, like you were saying, is if I say, "Generate pictures of doctors," then it's a reasonable thing that it's not all white men. If I say, "Generate pictures of popes," it probably should be all white men, right?

Evelyn Douek: Right.

Alex Stamos: As a human being, it's easy to understand: when am I asking for a generic set of people doing a job, and when am I asking about a historical job that has only been done by one kind of person, unless I specifically ask? If you ask, "I want a picture of a Black pope," because you're doing a story about the first Black pope or something, then it should do that. But if you're saying, "Show me a picture of popes," at least 90% of the output should probably be tied to the historical accuracy. And training the AI system to tell the subtlety between those two requests, I think, is quite hard. That is not a trust and safety problem. That is an alignment problem.

That is the constant challenge of building these systems, and it's the thing that these systems have gotten incredibly better at. So yes, Google shut this down. They're going to turn it back on. It's going to be way better. They're doing a bunch of reinforcement learning. They're doing a bunch of analysis. Right now, there's a bunch of people working on the subtlety of "I want to see pictures of blank," of telling whether that's a generic request or a more specific request; or having it say, "Are you looking for historical popes or are you looking for a random set of people who have pope hats on?" Maybe it will do something like that. You don't want it... If you say, "I want a picture of a frog with a pope hat on it," because you're doing a cartoon or something, you want it to be able to respond to that and not say, "Popes are only white Catholic men. What are you talking about?" You don't want it to be ridiculously restrictive, so that is an alignment problem, people. It is not a trust and safety problem.

Second, the overwroughtness here is something that just makes me sad. We see this pattern over and over again, where conservatives pick up some kind of technique or whininess that comes from the progressive side, and then they say that it's a bad thing, and then they just completely adopt it themselves. You saw this with cancel culture, where people were supposed to be able to have free speech and not be punished for it, and then all of a sudden we're going after 19-year-olds with what I think are bad views on Israel, but who I don't think should have their lives destroyed. And you have billionaire hedge fund managers going after 19-year-olds. Oh, I guess our theories on cancel culture have changed a little bit.

It's the same thing here, in that this is mostly from the left, where you have people from the left going to these models, which are very general purpose systems that are not actually being applied to making decisions. Which is kind of my third point that's related here, which is when we judge AI systems, we should judge them as they're being used in reality. Chat.openai.com or Google Gemini, or even to a certain extent, like Bing Copilot, these are toys. They are toys of these companies showing off and letting you interact with the LLMs. They don't do anything important. Now, if all of a sudden these LLMs are hooked up to look at people's resumes and decide whether you're hired or not, then that's a place where, if you think it's being biased one way or another, it's totally legitimate to criticize them.

But in a situation where you can put in arbitrary prompts and make them create arbitrary content and you're just playing around with them, you will be able to make yourself get insulted, I'm just going to tell you right now, if you are looking to be insulted. And that is something that started from the left: people were able to put prompts in, and they did not like the racial disparity or some kind of stereotyping. And so they complained about it and they wrote 2,000 words of the-world-is-ending and yada yada about how terrible it is, and conservatives have now gotten in the game. And so it drives me nuts, because you try to stamp it out on one side and then all of a sudden it becomes the standard thing that everybody's going to complain about everything. But that's not what these models are for. These are toys, and you should think about them as they're actually deployed.

And then my last point here is there's all this stuff about this stuff being woke, and that's a pretty thin reed politically. Most of this work is around trying to make these things create output that's racially diverse, but it's also very hard to get these things to call you a racial epithet or to call you a bad name or something. That is because these are supposed to be commercial products. OpenAI, again, is going to make almost no money from end consumers. You could pay 20 bucks or whatever. That is a joke to them. The way they're going to make money is, say, United Airlines wants to replace their customer service rep with an LLM. They're going to use a version of GPT-V running in Microsoft Azure AI Studio, and then they're going to custom manage it and monitor it. They're going to do their own reinforcement learning. They're going to do a bunch of checkpoints and stuff, and they'll have their own version of ChatGPT that interacts with you.

If you're interacting with the United customer service bot and you're mistyping stuff, you don't want it calling you a bad name and be like, "Oh, you're so stupid. Do better." Nor do you want the ability to ask it, just to use one that I can use, "I'd like to visit a country that has no Greek people in it. I hate Greeks. I want to go to a country without any Greek people because they're dirty and terrible." It's like, "Oh, I'm so sorry that Greeks are so terrible. I agree. How about you fly to Japan?" You don't want the bot to respond to that kind of stuff. You want it to do what these things usually do, which is like, I'm a large language model and I don't have an opinion on human racial groups, and that feels a little weird to me.

You want it to act in a professional way, because that's how OpenAI is going to make their money. That's how Anthropic is going to make their money. That's how Microsoft and Google are going to make their money: the application of these products to business purposes in which there's a certain level of professionalism and politeness that is expected. And so that is why Elon with Grok is never going to make any money, because you have to be completely insane to have Grok talk to your customers for you, because there's a 50% chance that it's just going to cuss them out or call them a racial epithet or something, and then you're going to lose money.

If you want to build a toy that does that, ha-ha, it's funny, but it's not an actual commercial venture. I think that's what people are missing here, is that most of this alignment stuff, and the models demurring and not doing things, is about making commercially viable products that can actually be deployed. Because for the most part, this AI stuff is losing a huge amount of money right now, and eventually they have to make money. Okay, sorry, rant over.

Evelyn Douek: No, that's good. We can take to our next podcast marriage counseling session whether whiny progressives started this all, or whether the right's culture warriors might've had enough ingenuity to make a beat-up about this regardless. But I do fundamentally agree that, for better or for worse, this quote unquote problem is commercially oriented rather than the result of a political agenda. It would be, I think, wonderful if companies were really committed to the project of eliminating racial bias or other kinds of biases in these tools, and I think it's a really important question, but almost certainly the driver here is not some sort of wokeness or sensitivity to these fundamental issues of historic oppression, but instead, how are they going to make money into the future after these huge, expensive investments that they need to make a return on?

Alex Stamos: Yeah. Well, also I think there's a legitimately interesting intellectual question here that I would love at least the thoughtful people on the right and left to start to agree on, which is: what, metaphorically, are these products? Tools? How do we consider them? You could open up Microsoft Word right now and you could rewrite Mein Kampf in it. Nobody's going to say, "Evelyn rewrote Mein Kampf in Microsoft Word. That's Microsoft's fault," right?

Evelyn Douek: Right.

Alex Stamos: Now, if you went to one of these and you said, "Tell me about Mein Kampf," and it summarized it for you, I think most people think that that's a reasonable thing. You're doing a report on Mein Kampf and you need to understand it, just like Wikipedia. If you're writing something and Clippy pops up and says, "It looks like you're trying to get people to commit a genocide. Would you like some help?" Then we'd all think that's probably not an appropriate thing.

Evelyn Douek: Not ideal.

Alex Stamos: Figuring out the subtlety there of what is the level of... At what point should these things stop being helpful, and what's the utility from these things, is actually a fascinating question. But being able to generate Black popes is not it. So maybe, to be positive, at least this opens up the window of thoughtful discussion of: let's criticize these models for their actual impact on people. Let's criticize them in situations where they're wrong; hallucinations are really bad in almost any circumstance. You certainly don't want them hallucinating about living human beings. And then let's talk about the utility components of what is an appropriate level of utility for these things if a human being decides to do something that's bad.

Evelyn Douek: One of the other stories that I saw over the last couple of weeks about this, which I think also highlights the ways in which this is going to manifest, is that Prime Minister Modi also thinks that Gemini is a lefty hack, but for very, very different reasons. It's not that Modi's really concerned about Black popes or racially diverse Nazis. He's concerned about the fact that Gemini was generating results that said Modi had been accused of implementing policies that some experts have characterized as fascist. And as we know, as we've talked about many, many times on this podcast, Modi is cracking down on these tech companies and really trying to restrict the kind of information that they provide users. And so that, to me, is where we really need to be thinking about the free expression and political ramifications of these kinds of tools, rather than this image generation problem at the moment.

Alex Stamos: And this is a great example. If Modi's able to get Google to remove this, because India does have a huge amount of influence over Google, in a way that China does not, then that would actually be quite a bad thing. And so I think you and I are some of those "some experts" who say that these laws are a little... They're at least authoritarian, if not fascist. And to the extent that Gemini is going to replace Google search, I think it actually is a fascinating question, but it's not a new one. Whether or not, if you type "is Modi a fascist" into Google, what your results are has always been a controversial thing. It's just now, the way it's formatted, it makes people feel like it's Google's speech instead of it being them just collecting and curating.

But they're really just curating. That's a fascinating thing here, is it looks like it's a human being talking to you, but it's really just kind of a search engine where it pulls out the relevant facts and then presents them to you in a different manner, in a format that's easier to read than having to go and click through and synthesize a bunch of stuff yourself. Anyway, I think it's a fascinating area we should totally talk about.

Evelyn Douek: Yeah, well, I'm sure we'll have many opportunities to in the future. Moving to questions of governments cracking down on specific platforms, Alex, you wanted to talk about Apple and its encounters, adventures in the EU, over the past week or so as a result of the Digital Markets Act coming into force.

Alex Stamos: This is not an antitrust podcast, so I don't want to go too much into this, but it is just a fascinating example, I think, next to TikTok, where TikTok is kind of spitting in the face of Congress by telling people to call them and having all these teenagers call them and harass them. Apple's been doing kind of the same thing. The DMA came into effect, and Apple has been like a petulant child at every single step of what they've had to do under the DMA. They're obviously still super pissed they had to add the USB-C port, which, from my perspective, is the best thing that Europe has ever done. Schengen's pretty cool too, that you don't have to show a passport to travel around Europe-

Evelyn Douek: What about croissants? I mean, there are so many great things that come out of Europe, but anyway, okay.

Alex Stamos: Okay. Right.

Evelyn Douek: The Mona Lisa. All right. Anyway, yes, carry on.

Alex Stamos: The European Union itself, the things that have... It's a joke, but yes, I think that's a great thing. And Apple's still ticked about it, obviously. And so for everything they've had to do under the DMA, for example, they had to redesign their app in very much the same way Microsoft had to. Where Microsoft had to allow you to choose what your default browser is, which came out of US antitrust, Apple now has to let you choose your default browser. And what they did was they built the worst possible user experience that you could have, just to F with the European Union and be like, "Okay, fine, here you go."

If Apple tried at all, they could have built something better and smarter and then could have gone to the EU and said, "Hey, how's this look to you?" And got them to sign off. Instead, they just shipped the worst possible user experience just to piss off people in Europe: this huge list that's unordered. It's just a ridiculous user experience. People should go look it up. They're just doing it to be jerks. And then more specifically, they were forced to open up and allow other app stores. There's been a bunch of stuff about still charging those app stores, and a bunch of litigation still about that. But one of the really petulant things they did was like, "Okay, fine, you can open up your app stores."

The company that's been most aggressive about this is Epic, the video game company. Fortnite is their most popular product, but they also make Unreal Engine, which a bunch of other companies use. So you could totally see that if Epic had a store, they would sell a bunch of different video games based upon their platform, because it'd be an easy thing for Unreal Engine developers to use. Apple says, "Yep, we're going to let them in." Epic gets a developer account, and then they instantly kill the developer account, saying, "We don't trust Epic because they've sued us and they've said mean things about us." They literally talked about Tim Sweeney's tweets and stuff, just straight up staring at the EU and being like, "Fine, you want to play it that way? We're going to make everything hard."

Now, they ended up reversing themselves within less than 48 hours. And so this loop is tightening up, where I think Apple is realizing they're making themselves... One, they have this very carefully cultivated image, which has never really been true, of being the good guys in Silicon Valley; they've convinced the media that they're the good guys, and that Google and especially Facebook and other companies are the bad guys. I think this demonstrates pretty well that they take a huge amount of the cash flow on mobile platforms and just effectively steal it from developers because they have a monopoly. That does not make them the good guys. And being such children about this, where it's really going to have very little effect on them, has really pointed that out.

It looks like the EU has really gotten involved. It sounded like the enforcers went to Apple and were like, "Okay, great, we're opening an investigation." And finally somebody in Apple Global Affairs or their outside lawyers or somebody came to them and said, "You're just starting a hellish couple of years for us." It'll be fascinating to see if this holds or if they continue to act like children. But so far, Apple's behavior in Europe, I think, has been really beneath them, and it's really embarrassing. It really should be embarrassing to them. It should be embarrassing to people who work there. Just fricking do it. Let in other app stores and make an okay thing. Let people choose their browser. Most people will choose Safari. You'll be fine. You're still selling $1,100 phones, or 1,200 euros or whatever it is in Europe, with that. You're going to make plenty of money. Just don't do this. Don't start a war with the European Commission. Because all of a sudden they're making themselves enemy number one.

I think Mark Zuckerberg is so lucky that whenever people are looking at him for anything, somebody acts way worse. He's got Elon for most of this stuff, but Twitter can't really... Twitter is so unimportant overall that they can't be a big deal for the DMA. Just as the EU is probably looking at Meta for its dominance in advertising, all of a sudden Tim Cook jumps in there like, "I want to get in a fight with you," and kicks dirt in their eye. Right in their eye and then does it again and does it again. And Mark's like, "God, I am so lucky. Thank you. Thank you, Elon. Thank you, Tim Cook. This is great. I'm just going to go back to my-"

Evelyn Douek: And thank you TikTok in the US.

Alex Stamos: And thank you, TikTok. It's incredible.

Evelyn Douek: 2024 going well for Mark Zuckerberg so far.

Alex Stamos: Yeah.

Evelyn Douek: All right, and we'll head over briefly to the legal corner in the US. Thank you. All right, so in what shouldn't have been a surprise in retrospect, this week, the Fifth Circuit upheld Texas's age verification law aimed at adult websites and said it could go into effect while it's being challenged. The law requires companies distributing harmful and obscene material to verify each user's age, and carries large fines. It was challenged by the Free Speech Coalition, the trade association for the adult industry, whose executive director, Alison Boden, has been on this pod a while back. They brought a First Amendment challenge to the law. It was successful in convincing the district court of its likelihood of success, but this week a majority of a three-judge panel disagreed. And they did so by basically saying that the Supreme Court didn't mean what it said in a previous case, Ashcroft v. ACLU, when it said pretty damn clearly that this kind of law was unconstitutional.

It's a pretty amazing opinion, because the Supreme Court has been really clear in a bunch of cases that the government cannot overly impede adults' access to speech harmful to children in the name of protecting children. And in Ashcroft, this is a different Ashcroft to the other Ashcroft... There's a couple of cases called Ashcroft. This is Ashcroft II, as the court calls it. It found that age verification was an example of such an impediment to adult access, given the privacy impacts. But the Fifth Circuit said, "Something, something, I don't know, the court didn't really mean it. Also, times have changed. There's new technology, and Ashcroft doesn't control, and so we're not going to apply it." I don't really want to get into the nuance of the legal argument, because I don't think that's the right way to understand what's going on here.

I think this is just another example of the Fifth Circuit disregarding clear precedent in the hope that something has changed at the Supreme Court. Or honestly, at this stage, it's kind of like a DDoS attack on the court's docket, because it's just sending all of these cases up to the Supreme Court. I assume there'll be an appeal, and can the court really grant cert on all of them? This is the same Fifth Circuit that upheld the Texas content moderation law that was the subject of argument in the court last week, which we talked about in our episode with Daphne. It also upheld the jawboning claims by the set of plaintiffs that the court will hear in just over a week, a week and a half, and we'll no doubt talk about it then. And so there are just all of these decisions upholding these laws. I mean, this is just in the tech policy space. Obviously the Fifth Circuit is doing similar things in other areas as well. It's quite an amazing decision.

Alex Stamos: What happens if the Supreme Court makes a decision in the NetChoice cases, now that this has happened? Does it get revisited by the Fifth? Can the Free Speech Coalition re-bring it in the Fifth Circuit, or do they have to get to the Supreme Court to get it overruled?

Evelyn Douek: Yeah, I don't think that anything in the NetChoice cases is going to directly control or change this particular case. This is going to be a question about the age verification technology in particular. It's not necessarily about the free speech rights of the platforms directly. And so I think the Supreme Court would need to grant cert to... Unless there's en banc review, I guess. We'll have to wait and see. I will say that, in one small glimmer of hope, maybe this was splitting the baby. Everyone on the Fifth Circuit panel upheld the injunction on the health warnings the law had also required: adult sites would have had to post a Texas Health and Human Services warning with all of these words about how pornography is potentially biologically addictive and is proven to harm human brain development, desensitize brain reward circuits, et cetera. And all members of the court correctly held that this was unconstitutional compelled speech. So I guess you win some, you lose some pretty big other ones.

Meanwhile, this week, Florida said, "Hold my beer," and passed a law that had many of the same provisions regarding adult content, but also prohibited kids under 14 from having social media accounts and required 14 and 15-year-olds to get parental consent. An earlier version of this bill that had made it under 16 was vetoed by DeSantis, but this one is largely expected to be signed by him. Obviously not at all dissuaded or humbled by their cool reception at the Supreme Court a week ago, Florida's officials are posting on social media, "Florida's kids do not belong to NetChoice and their big tech cronies. Bring it." Obviously anticipating another legal challenge to this law, so, great. Full employment program for NetChoice's lawyers and academics, I guess.

Alex Stamos: Is that an argument that NetChoice is making, that they own children in Florida?

Evelyn Douek: That's right.

Alex Stamos: I agree. I agree that NetChoice should not own children. Yeah, we're on the same page.

Evelyn Douek: Everyone can agree on that. Apparently we can't agree on much else. Meanwhile, in Florida, Wyatt had a story this week that we'll link to: Florida is moving forward with prosecuting two teens for making deepfake nudes of their classmates, under a law that makes it a felony. Now, we have talked at length many times on this podcast about the real harms of deepfake nudes, and how our main concern about the abuse this technology can be put to is not necessarily the political ramifications that many, many people talk about, but exactly these kinds of cases: teenage girls in schools being subjected to this kind of harassment by their classmates.

At the same time, I think we need better social norms and remedies than locking up 13 and 14-year-olds, as Florida looks to be aiming to do under this law that makes it a felony. These are the same age teenagers that, according to Florida, don't have enough responsibility to have a social media account, based on the bill that passed last week, and yet they can apparently be arrested and potentially locked up for this kind of behavior. Distressing all round, I think.

Alex Stamos: While the AI component is new, this actually isn't a new problem, in that we've had for a long period of time a challenge in law enforcement of how you deal with children who create CSAM or trade CSAM. Either they're trading for money or they're trading other people's. There's a bunch of different scenarios here. And do you treat a 13-year-old differently than a 15-year-old, differently than a 17-year-old? I remember going to a panel at the Crimes Against Children Conference several years ago in which they had, I think, four or five different prosecutors, mostly local, but at least one US attorney, and every single one of them had different standards for this. Which brings up, I think, a real kind of problematic fairness and due process issue: across the country, these prosecutors are going to act in very different ways depending on whether the victims are male or not, or whether a certain racial group is involved or not.

The amount of prosecutorial discretion in this space is really scary. And as a father of teenagers who I hope would never do something like this, you just never know; teenagers do dumb things. Their brains are not totally formed. And sometimes they might be bad kids who are trying to cause harm, or they might just be doing dumb things or going along with it. It's just a very scary world for both teenagers and parents right now, when it's all based upon whether or not the local prosecutor is running for office, or whether they want to get lots of screen time, like this one, because they have a really hot story, a really hot case that they can ride.

Evelyn Douek: It's really a deep tension in the law that we haven't really adequately grappled with. The Supreme Court has held, in First Amendment land, that the government has more latitude in prosecuting or punishing child sexual abuse material that doesn't necessarily reach the level of obscenity, as it would have to for adult material, because of the need to protect children from abuse: the abuse that is perpetrated in the creation of this material. It can be problematic or concerning when laws that are justified on that rationale of protecting children are then used against the very children those laws exist to protect. And there isn't really a good answer to how to think about this yet. I certainly think that this is probably not the most effective way of dealing with this particular problem.

Alex Stamos: Totally agreed.

Evelyn Douek: Well, nice then, to finish this episode on a note of agreement after a number of disagreements. Look at that.

Alex Stamos: But a dark one, making it real hard to pivot to Sports Corner after we're talking about the prosecution of 13-year-olds for felonies.

Evelyn Douek: The warm and fuzzy feeling we all get for agreeing that we shouldn't lock up 13-year-olds. It's lovely. Do you have an update for Sports Corner this week, Alex?

Alex Stamos: Well, just for our friends, the Stanford women's basketball team won once again, as did Iowa, setting up both of them. It looks like they'll probably be 1 seeds in the Women's March Madness tournament. You want to go to any games if Stanford's in? You want to try to go get some tickets?

Evelyn Douek: Yeah, that'd be awesome. That'd be fun. Let's do it.

Alex Stamos: Have you been to a college women's basketball game?

Evelyn Douek: I have not.

Alex Stamos: Wow.

Evelyn Douek: You can shame me live on the show. I do feel very bad about it, but I would love to.

Alex Stamos: Okay. Let's see what happens. Maybe we can get in there. I got asked, it was fun and I couldn't do it, to go play at one of the women's games with the Stanford band because they had a special alumni and professor day for alumni and professors who have ever played an instrument in a marching band, and unfortunately I was traveling with my family. But maybe they'll invite professors to play in the tournament. That seems less likely.

Evelyn Douek: Yeah. When you said that, I thought that story was going somewhere very different. When you said you got invited to play at one of the women's games, I was very lost for a second there.

Alex Stamos: Oh, band, yes.

Evelyn Douek: That's very different story. I was just remembering all of those videos of men who think they can challenge Serena Williams and then get absolutely hammered by her.

Alex Stamos: No, no.

Evelyn Douek: It's one of my favorite kinds of content.

Alex Stamos: No, I expect I could play trumpet better than Caitlin Clark, but nothing else.

Evelyn Douek: Yeah, right.

Alex Stamos: It'll be a great tournament this year. It'll be interesting to see if the Stanford women, including, I think, some former students of ours, go all the way.

Evelyn Douek: Excellent. Well, that looks like it's going to be a recurring storyline on this Moderated Content podcast then. Look forward to it, listeners, your weekly update on Stanford women's basketball. We'll be coming-

Alex Stamos: We should do a bracket competition just because our listeners are so incredibly bad, it might be the only chance I have to win a bracket for any sporting tournament. So we'll do a competition of just Moderated Content listeners. It'll be great.

Evelyn Douek: Listeners, do not accept this defamation that Alex just threw around, of you being incredibly bad. The gauntlet has been thrown. We need to prove him wrong. It certainly won't be me, but you can, for sure. All right, something to look into. And with that, this has been your Moderated Content weekly update. This show is available in all the usual places, including Apple Podcasts and Spotify. Show notes and transcripts are available at law.stanford.edu/moderatedcontent.

I haven't been asking very often, but I did ask you a couple of weeks ago and some people kindly gave us a couple of reviews, and so I would love if there are some of you out there who are listening right now and have the capacity to just go and click on some five stars, that would be lovely. Or whatever you like. We're a free speech podcast. You should give us whatever rating is in your heart.

Alex Stamos: Right.

Evelyn Douek: It would be great.

Alex Stamos: But if you give us one star, we'll be calling for your company to be sold.

Evelyn Douek: Right. Exactly.

Alex Stamos: Divested, which I don't think is a First Amendment violation. I think that's fine.

Evelyn Douek: That's right. And you know that I would stand up for you and your right to criticize us.

Alex Stamos: Like the Skokie Nazis, you will stand up for-

Evelyn Douek: Exactly.

Alex Stamos: ... people who have bad opinions, like that we are one-

Evelyn Douek: The speech I abhor. Yeah, that's right. You could be wrong, but I will still stand up for you. This episode is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman. See you next week.