Alex and Evelyn discuss why the Congressional hearing with TikTok's CEO was bad news for the app, AOC's opposition to a ban, and why the First Amendment hurdles to getting the ban enacted remain high. They also discuss India's continued crackdown on online speech and Twitter's continued acquiescence; whether the long-anticipated deepfake apocalypse is finally here (spoiler: no); Utah's "think of the children" social media law being signed into law; and Twitter's helpful move to give you more information about who not to pay attention to on the platform.
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Alex Stamos:
Ms. Douek. Does Congress have the ability to ban TikTok? Yes or no? Yes or no?
Evelyn Douek:
Thank you. Thank you for the question. I appreciate...
Alex Stamos:
Yes or no?
Evelyn Douek:
Thank you.
Alex Stamos:
Okay. Yes or no? Yes or no?
Evelyn Douek:
No. I mean...
Alex Stamos:
Okay. Clearly, you don't want to answer my question.
Evelyn Douek:
It's just ... If I could have a moment?
Alex Stamos:
Professor Douek.
Evelyn Douek:
Yes. Yes. Yes, sir.
Alex Stamos:
Do people have rights under the First Amendment? Yes or no?
Evelyn Douek:
Yes, sir. Yes. Yes. Yes. I've been training for this moment. I knew that one.
Alex Stamos:
Finally.
Evelyn Douek:
Excellent. Welcome to this week's Moderated Content, the weekly news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos.
Quite clearly, as you may have guessed, we are starting with the show hearing in Washington. Alex, I never know whether my cultural references will carry across. Do Americans use the term show trial? Is that something that Americans know?
Alex Stamos:
Absolutely. Yes.
Evelyn Douek:
A show trial is a trial that's held in public, and the main reason is to put on a show and influence public opinion rather than get real justice.
Alex Stamos:
But we also use ... Is this a term in Australia? We use the term Kangaroo Court. Is that insulting to you as an Aussie?
Evelyn Douek:
No. Absolutely. We love it.
Alex Stamos:
You love it? Okay.
Evelyn Douek:
Yeah.
Alex Stamos:
Because it's not like you can't insult kangaroos. Is that actually a law under...
Evelyn Douek:
Exactly. Criminal defamation of kangaroos is written into the Constitution. But apart from that ... Unsurprisingly, last week's hearing in DC with Shou Chew, the CEO of TikTok, did not apparently manage to update anyone's priors. It seems that everyone went in thinking exactly what they thought and left thinking exactly the same things.
The consensus seems to be that Shou Chew was fine and didn't have any major stuff-ups, but also that he really didn't demonstrate that he was the boss or could stand up to China in any meaningful way. The wildest moments to me were the ones where we saw US lawmakers envying China's content moderation. We had a series of question-and-answer exchanges where lawmakers were asking Shou why Douyin, the version of TikTok that's available in China, had so much better content moderation than TikTok.
Because you couldn't find drugs and all of this terrible content and misinformation. It just never occurred to me that here, in the land of the First Amendment, we would have lawmakers appreciating China's heavy-handed approach to content moderation and censorship. Because folks, it turns out that it's a lot easier to get rid of bad content if you don't care about getting rid of a whole bunch of good content along with it. You just use blunt-force keywords and things like that. That was pretty mind-blowing to me.
Alex Stamos:
Because one of the questions was basically, "Why does China have less of a drug problem than America?" Shou had to back his way into, "Well, they execute everybody." It's easier to get rid of drugs if you just execute anybody who's a drug smuggler almost immediately with no due process.
Evelyn Douek:
Next week on Capitol Hill, a new law reform initiative.
Alex Stamos:
Right.
Evelyn Douek:
Alex, what was your take on the hearing?
Alex Stamos:
I think a couple of things can be true. I think Chew needs credit for showing up, and it was also a disaster for TikTok. It was always going to be a disaster for TikTok. I was shocked when they said that they were going to do this, because this is a situation in which Democrats and Republicans are mostly aligned. It is also election season, and we are always in election season in America. Neither party is going to allow the other to get to their right on China.
And so, it was definitely, I think, a bad idea to show up. But specifically, like you said, he was not prepared to look like he was the one that was really in charge. They're trying to play this little game where they send the CEO, who's Singaporean and speaks great English and has a good personal history, to try to say, "We are an independent company. We're a Singaporean company," yada, yada. To try to distance themselves from ByteDance.
But, one, there were a number of members who were quite well-prepared to peel back the ways that ByteDance has control over TikTok and the access that ByteDance employees have to TikTok's data. I think those questions were effective. I think the fact that Project Texas is only half done was a real problem for him, because everything he had to say was in the future tense: "We will protect American data." It was pretty clear that, right now, ByteDance employees can still get to almost anything.
I wrote an op-ed about this for CNN. Folks, we can link to it on cnn.com. I did a bunch of media about it after the hearing, but I do think the focus just on TikTok is inappropriate. You and I have discussed this many times, that there really should be a federal privacy law. And as part of the federal privacy law, you can come up with data flow restrictions.
But we also want to cover not just TikTok but American companies and other Chinese companies. We can't spend three years debating every Chinese company and then killing it. That's not the way to do an overall privacy framework. I think there were two moments that stood out to me as really damaging. One was when he was effectively asked to criticize the PRC. He was asked whether the Uyghurs were being oppressed.
Debbie Lesko:
Mr. Chew, do you agree that the Chinese government has persecuted the Uyghur population?
Shou Chew:
Congresswoman, if you use our app and you open it, you'll find our users who give all sorts of content on their phone.
Debbie Lesko:
That's not my question. My question is, do you agree that the Chinese government has persecuted the Uyghur population?
Shou Chew:
Well, it's deeply concerning to hear about all accounts of human rights abuse. My role here is to explain what our platform does on this ...
Debbie Lesko:
I think you're being pretty evasive. It's a pretty easy question. Do you agree that the Chinese government has persecuted the Uyghur population?
Shou Chew:
Congresswoman, I'm here to discuss TikTok and what we do as a platform. And as a platform, we allow our users to freely express their views on this issue and any other issue that matters to them.
Debbie Lesko:
All right. Earlier, today .... Well, you didn't answer the question.
Alex Stamos:
The fact that he cannot answer this question is a real problem, because it does demonstrate a fundamental truth, which is TikTok does not want to be an arm of the Chinese Communist Party. They do not want to be part of the PRC intelligence services. I think these people just want to make apps and make money and feel like they're making good products that people like to use.
I don't think there's any intention there. But the reality is the PRC has massively cracked down on tech companies over the last couple of years. Jack Ma was one of the most famous, visible Chinese entrepreneurs. He slightly criticized the party and got disappeared, and is now effectively living outside of China and never saying anything publicly. It is clear.
Right now, as we speak, one of the most important tech investors in China has been disappeared. Nobody knows where he is. Chew, pretty clearly, would not be safe if he answered that question honestly. It would not be safe for him to go to the PRC. I think that was really effective. I think the other place that hurt is when he was asked whether or not they spy on Americans, and this was his response.
Shou Chew:
I don't think that, "Spying," is the right way to describe it.
Alex Stamos:
I don't think that was a good one. Overall, he did okay. He stayed calm. For all our little joking there, I hate it, I absolutely hate it when members of Congress, which is a bipartisan thing, say, "Answer this incredibly complex, nuanced issue. Yes or no." I think he did well maintaining his composure under that, but he did make some slip-ups. In the end, it did not convince anybody that TikTok is independent.
Evelyn Douek:
I also hate it when they ask really stupid questions and say, "Yes or no?" "Does TikTok access the WiFi? Yes or no?" "Yes. I mean ..." "Oh. Gotcha."
Alex Stamos:
Right.
Evelyn Douek:
Exactly. I agree with all of that. I think, as you said, we need to think about this more broad-based across the industry. I spent all of the other tech CEO hearings that we've had over the last few years saying, "What about TikTok? Why are we focusing on Facebook and Twitter, when we've got this app growing out there that all of the youth are on and no one's really focusing on it?"
And then, I spent this hearing going, "What about all the other social media platforms?" We can't just deal with this in isolation as only a TikTok problem. Of course, I spent all of those hearings going, "Why doesn't anyone care about YouTube?" Or, "Why can't we talk about YouTube," which is also an extremely large platform that no one ever really seems to ask questions about.
Alex Stamos:
You've never talked about YouTube, Evelyn. What are you talking about? That is not part of your personal brand at all.
Evelyn Douek:
Exactly. We'll do a briefing on YouTube. You may not have heard of it. A small platform, but for listeners that want to know, we'll drop that into the feed sometime. I want to spend a little bit of a moment on the First Amendment debates around this as they heat up. Because originally, no one was really talking about the First Amendment.
People were talking about banning TikTok, and then there were a few of us off in the background going, "Hey. There's this little thing called the First Amendment that might cause a problem." Now, the First Amendment concerns are live and are here, but there are some objections that I've seen cropping up to them. Or just ways to dismiss them that I think it's worth addressing.
One of them that I see quite a bit is, "Well, no one's going to be silenced if you ban TikTok, because people can go and speak somewhere else." There are, as we were just saying, a number of other platforms. While that's true, that's not a First Amendment argument that's going to hold much water. For example, you can't say, "It's okay to close down the New York Times, because there's your local newspaper or the Washington Post." Or, "We can shut down this podcast, because there's a million other podcasts." There's no other that's quite like it, listeners. Unfortunately, that argument isn't going to hold water.
The other one is the argument that, "Well, this isn't about speech. This is about data. And so, this isn't a First Amendment concern, because what we're concerned about is data flows." The first problem with that is that, unfortunately, the hearing and all of the rhetoric have blown people's cover on that one. It is quite clear that this is, at least in part, about speech concerns.
One of the key concerns is Chinese propaganda or China pushing certain TikToks to people to convince them of certain things. Or getting rid of content that they don't want people to see. There's that. The other thing is that incidental regulations of speech are still subject to First Amendment constraints. You can't say, "I'm getting rid of all of this speech for another reason," and get around the First Amendment.
Alex Stamos:
Well, I thought the most emotionally effective part of the hearing was when a member, I believe it was Gus Bilirakis, brought in the parents of somebody who had committed suicide. Now, one, this is a horrible issue. There are lots of teenage suicides. There's lots of back and forth on whether social media has made this problem worse. And so, it was very effective. But also, you cannot claim that that is not a speech discussion. It was clearly about content.
And so, they wanted to pivot away from the hard national security stuff, for which you might have a credible claim that does not affect the First Amendment, and over and over again they wanted to pivot back to content that they saw and didn't like. Which of course, with suicide and self-harm, there are legitimate arguments there. This is actually a really hard problem for any company that services teenagers.
So it's a legitimate thing, but it's not a legitimate thing you should discuss in the context of banning the entire platform if you're trying to make a record for the Supreme Court. Is all this stuff going to be considered by the appellate courts, if there's an appeal? Almost certainly, TikTok would appeal ... I'm sorry. Would sue to challenge any law that outlawed them.
Evelyn Douek:
TikTok will and users will and things like that. There are issues about what exactly can be admitted into the record, but it's certainly clear ... And it depends as well on what form the ban takes. We still don't know through what legal mechanism the ban will take place, but it's sufficiently clear that there's a wide record that speech is a key concern here.
The other thing to say is this is framed as a national security issue. It is true that courts are generally fairly deferential to the government, to the executive, and to Congress, when it comes to national security concerns. But it is also very clear in the First Amendment context that the harms have to be demonstrated and established. They can't be merely conjectural. They can't just be, "We think this could happen." The lawmakers need to show the actual security harm and that the harm can't be mitigated through other measures.
Now, it is also true that this long CFIUS process that's been going on for a couple of years is no doubt, at least in part, trying to establish that record to show, "Hey, look. We really tried to find a less restrictive way of dealing with these data concerns, these security concerns, but we just couldn't get there. And so, we need to ban the platform."
But first, as we've talked about repeatedly, leaving untouched all of the other platforms that equally have these same problems undermines the idea that banning TikTok adequately addresses the national security risk that Congress says it's trying to get at. It's also really interesting. AOC joined TikTok this week to TikTok about TikTok, in which she was very clear that she doesn't think that TikTok should be banned.
One of the things that she said in that video, which I thought was really interesting, was that Congress hasn't been given a classified briefing about the concerns that people have around TikTok. She said that was really unusual: normally, if there are national security concerns about something, lawmakers will be given a classified briefing. All of those lawmakers that you saw yelling about TikTok are operating on basically the same information that you and I are. It's not like they've been given some classified briefing that really makes a compelling national security case about this.
Alex Stamos:
At least, not in general. Now, AOC does not sit on HPSCI. It is quite possible ... There are classified briefings that they give to everybody in Congress, but my understanding is their briefings are pretty watered down, because the executive branch does not trust Congress.
Evelyn Douek:
I can't imagine why.
Alex Stamos:
They generally trust HPSCI. Even HPSCI, especially recently, since the Trump years, has become much more political, versus the Senate side, SSCI, which I think is still considered a reasonably bipartisan group. AOC's video is going big. As of right now, she has 19,000 re-shares and 648,000 likes. This was really big. I think this is a bit of a political tactical mistake, because the Democrats have been careful to come out and say, "Hey. These are things that we should be thinking about that are broader."
Her video is close to an endorsement of TikTok. Now, she's going to have anything else that comes out about TikTok hung around her neck. The truth is that TikTok had Chinese employees spying on an American journalist. They denied it. They denied it. They were finally forced to admit it. There could be dozens and dozens of other stories like that. And so, I think this was an interesting choice that I expect she'll regret, because she doesn't make a good argument for, "This is a law we should pass that deals with the issue."
It's more of a whataboutism. Also, the whataboutism with the US platforms. Yes, US platforms have had privacy issues. No US platform allows the Ministry of State Security to search through a petabyte data warehouse. That is not something they're allowed to do. There have been security problems. KSA famously was able to get access to Twitter's data by planting spies, and there have been hacks and such, but those companies have teams to prevent that.
The challenge here is that it's exhausting when people just say, "Cambridge Analytica. Cambridge Analytica." One, nobody who says that could ever explain to me what actually happened with Cambridge Analytica. They don't really understand what API issues are. Or the research issues. Or the fact that there's actually a tension here between transparency, API access, antitrust issues, and data security.
None of them understand that. They just repeat it as a shibboleth to try to shut down debate. She does that in this video. And so, it's not very nuanced. It's not very smart. I think she made a huge mistake here. I'm just putting a stake in the ground. I think her endorsing TikTok ... Because the other Democrats who have come out are very low profile, and she has now put herself in a place where the Republicans are going to make her the face of TikTok. Anything that comes out now is going to be attached to her.
Evelyn Douek:
I'm sure they were very happy to see that and to have that opportunity. Politically, maybe a bad move. But as a First Amendment scholar, I will say, I did really appreciate her saying that if Congress is going to make this move, they need to make the case to the public, in public, about why exactly this is the only national security measure that can be taken. I think that that's an important First Amendment value and something that I certainly endorse.
Alex Stamos:
I think that's right, but what I'd feel better about is if she said, "Great. Here's a comprehensive data privacy law that has a bunch of controls over data flow to China." From my perspective, there shouldn't be a First Amendment issue of saying, "This is data that cannot be processed in China." That should be neutral as long as you're saying it across all companies.
Evelyn Douek:
You're probably right. Although, it's also almost definitely true that it would be subject to a First Amendment challenge and we would go through the courts. Because everything is subject to a First Amendment challenge these days.
Alex Stamos:
Of course. But I could feel like, with this SCOTUS, them striking down a law that says you can't process the personal information of Americans in China would be unlikely.
Evelyn Douek:
I'm with you there. Most likely. Depending, of course, on how broadly it's drafted and on exactly the mechanisms. But again, the court is generally going to be deferential to the government on national security interests. I will say they are likely to be less deferential in cases where, for example, you're talking about a platform that is widely used by 150 million users.
Which is a completely different situation to one-to-one communication with the Tamil Tigers, or something along those lines, which are the other kinds of contexts we've talked about before.
Alex Stamos:
Right.
Evelyn Douek:
Trying to put a little bit of meat on the bones as this ramps up and we are starting to talk more seriously about the First Amendment issues, but we'll see what comes of it. I am sure that this will not be the last time we are talking about this. Moving to a country that has banned TikTok. We're going to go overseas to India now. Just to note that India continues to crack down on online speech and platforms. In particular, Twitter continues to acquiesce.
This last week, Twitter blocked 122 accounts belonging to journalists, authors, and politicians in India in response to legal requests from the government. A number of these accounts, which include prominent Sikh voices in the diaspora, were putting out credible information about the current turmoil and political unrest going on in Punjab, where there have also been rolling internet blackouts over three days, affecting 27 million people. This is the kind of company we might keep if TikTok is banned in this country, but Elon Musk's free speech credentials continue to be unparalleled in this area.
Alex Stamos:
I don't know what to say. We called it. The company gets bought by a guy who has industrial sites all over the world and who wants to break into the Indian market. Twitter has completely ... Once again, whatever arguments are made that they're free speech absolutists, that he believes in free speech and stuff. That's clearly not true from a fundamental perspective.
Evelyn Douek:
To be clear, it's a tricky issue when companies have staff in countries about what to do with these formal legal demands. But Twitter's prior practice before the Musk era was to really push back on these. Try and challenge them in court to the extent possible.
Alex Stamos:
And to notify people. None of these people were notified. They found out from the press. They found out from the media or from people telling them they were blocked. Part of that is probably they just fired everybody. There's nobody left to go do the notifications. Or they don't know where the button is that they used to use.
Part of it is probably not intentional, but clearly a decision has been made. Effectively, the model here is, "We're free speech absolutists unless we're asked by a country that has a law." That effectively makes them only free speech absolutists in the United States. It's a vast reduction of the amount of speech that you'll carry around the world.
Evelyn Douek:
In some lighter fare this week, the generative AI apocalypse is supposedly here. The internet, or at least Twitter, was aflutter this week with images of Donald Trump getting arrested. These were Midjourney images created by journalist Eliot Higgins. They're pretty awesome images, actually. But everyone was saying, "Oh no. Here it is. Here's how we're going to have ... The deepfake apocalypse is here."
I do not know that I saw a single person saying, "We must go out into the streets. Our former president has been arrested." But I did see plenty of people saying, "Oh my God. Look at these fake images. People are really going to be convinced by these." I don't know. I don't know what to make of this. Do you think that this is a new frontier of misinformation, Alex? Is this really going to be materially different to the previous stuff we've seen before?
Alex Stamos:
I've always thought that generative AI, from a disinformation perspective, has been massively overblown. People talk about it and talk about it. The practical point here is that I see 50 times as many people saying, "Oh my God, this is a disaster," as I see people who have actually been convinced by it.
In fact, it's hard to find a legitimate person who's not a troll who actually believes that these are photos of Donald Trump being arrested. Clearly, the discussion is overblown, but it's true that Midjourney V5 was just released. It's really good and the quality is getting a lot better. My favorite one is that somebody this week generated a set of what looked like Polaroid pictures of the members of Hogwarts going to a rave in 1998. It's perfect. It's really good. Other than the fact that obviously these people were way too young to do it.
But I think that's more ... I don't think the big political stuff is where you're going to see it, but we are going to continue to see attempts to harass individuals and create rumors about individuals and stuff. Maybe more of the tabloid stuff: "Here's a leaked document," or a leaked photo of a celebrity or a famous person. I think that's way more likely than in the political sphere, where something like Donald Trump being arrested would be so widely reported that the amount of time somebody would fall for one of those images should be about 30 seconds.
But in the stories that get less coverage, for which there's less information, there are fewer journalists, and that are more specific ... I think we'll continue to see that kind of abuse. Midjourney has protections against nudity and some other things, so it itself should not be able to be used for sextortion and the creation of fake NCII, but that is absolutely a concern as these generative models always make their way to people's individual graphics cards. Once people run them offline, they immediately patch out whatever protections are in place.
Evelyn Douek:
One protection they have in place is banning people that use it for such ends. Midjourney did ban Eliot Higgins for having a good time, because everything is a content moderation issue. He lost his account over these wonderful images of Donald Trump.
Alex Stamos:
It was just silly.
Evelyn Douek:
Did you see the one of the Pope circulating this week? I will say ...
Alex Stamos:
That was pretty funny.
Evelyn Douek:
I did get fooled by this image of the Pope in this fantastically big white puffy coat. It looked pretty good. But on the other hand, I'm not becoming Catholic now as a result. It's not like this is going to change my behavior in any way, but it was funny.
Alex Stamos:
No. That's a good point. And that's a demonstration of that kind of fun story for which you're not going to immediately have people fact checking and such. It's a much higher area of risk.
Evelyn Douek:
A small update is that the Utah social media bills that we talked about with Riana a couple of weeks ago have been signed into law. These are the laws that are basically a parody of the "Won't you think of the children" laws, which require parents to approve a child's account before they turn 18. Again, a reminder that you can get a driver's license at 16 in Utah. There's also a curfew barring users under 18 from using social media between 10:30 PM and 6:30 AM. And parents get full access to their children's DMs.
This is a pretty wild law. It is expected to go ... Or it's said that it will go into effect by March 2024. I cannot see this law ever going into effect. Because right now, somewhere in DC, probably the new officers of the industry body NetChoice, which has launched its litigation hub to track and respond to lawsuits, are writing a filing to challenge this on First Amendment grounds.
I think it's going to be a very strong lawsuit. Utah lawmakers and the governor have said, "Bring it on," but I doubt they'll be saying that when these challenges actually come. I think the Utah teens are safe, but the lawyers are celebrating. That's for sure.
Heading over to our Twitter Corner for the week. A series of non-updates. Musk is still the CEO. There have still been no changes to the API, and the algorithm is still not open source. Although, one week to go. Maybe listeners, by this time next week, we will finally have the secret sauce.
Alex Stamos:
And Generalissimo Francisco Franco is still dead.
Evelyn Douek:
That's right. The interesting update. This is kind of hilarious. I don't know if it's a joke. Twitter is getting rid of legacy verification on April Fool's Day. April 1st. All of those ...
Alex Stamos:
I'm sorry. I've got to do it again. This is a double trombone day for Twitter.
Evelyn Douek:
Amazing. It's so good. I'm very grateful that at least on April 2nd, I will have more information about who not to pay attention to on the platform. But that's basically all I've got on this one.
Alex Stamos:
I am not neutral here. I've had a blue check mark for quite a while. I'm one of the horrible blue check mark people that folks like to rant against. But at least, when I got it ... I completely agree that democratizing the blue check mark was a smart thing. Offering it to more people. The problem is they got rid of the verification. They're taking the money without spending the money on the backend to actually verify that you are who you say you are. It means nothing on that front.
The other thing they've done is if you have a blue check mark, you clearly get a huge amount of juice in your posts. We've seen a pretty significant divergence between people who are Twitter Blue verified and everybody else on the amount of reach they get. Twitter is moving toward a TikTok-like algorithmic experience, where they're much more aggressive about using the algorithm for discovery than the old, somewhat light algorithmic ranking of who you follow.
And so, the blue check mark is now going to become a MAGA hat. When you get rid of the legacy blue check marks, there's no longer ambiguity about whether this person is verified, or whether they were previously considered notable. It is 100% people who want to pay Elon Musk. Which is probably also why a number of researchers figured out that Twitter seems to have code for a new service where you can pay for the blue check mark but have the blue check mark turned off. Effectively, you're just paying to have your content ranked up, and I think it will be fascinating to see who does that.
But if you're a Democratic politician, for example, you absolutely don't want people pretending to be you. You absolutely want to be ranked up, but you don't want to be seen as effectively wearing the MAGA hat of having a blue check mark. That means it is going to become an endorsement of Musk's policies and his politics. And so, it'll be fascinating to see who's secretly paying to get the up-ranking while not having the check mark next to their name.
Evelyn Douek:
I love this detail. You can pay and then hide your thirst for clicks. No one knows. I don't know. Like platform shoes that don't look like they're platform shoes. You get that boost in height, but you don't have to look like you're trying too hard. I don't know. It's going to be fun.
Alex Stamos:
The other related thing is that a number of people have been doing experiments where they create new Twitter accounts, follow nobody, and they see who shows up in the For You. As you can imagine, Musk is a significant percentage. Effectively, he is pushing himself out to people who have done absolutely nothing to indicate that they have any desire to see his content.
Evelyn Douek:
Again, therapy is just so much cheaper than buying a social media platform and running it into the ground.
Alex Stamos:
Because another data point here is that this week Musk told his remaining employees that there would be an equity offering for them in the new holding company for Twitter. He valued that at $20 billion. He has to file a form with the IRS to say what the stock is valued at when doing that kind of thing.
That is both less than half of what he paid for it, and also at least twice as much as what external observers consider the value of Twitter now. It should be well less than $10 billion at their current revenue. I don't see the verification game making up for that problem, making up for that revenue drop.
Evelyn Douek:
But seven dollars a month for a signal of shame that you could wear around your neck. Okay. I think that's it. Anything else that you want to add before we close out for the week? Any sports segments? Sports news? Stanford gossip? I don't know.
Alex Stamos:
In sports news, there's a bunch of stuff happening in college football in this thing called the Transfer Portal. We don't have to get into this, but I am very sad about the professionalization of college football. Now, if you are a smaller team or a less winning team and you take a bet on a kid, you give them a scholarship, you give them opportunity to get a degree, they're immediately going to jump the moment they have an opportunity to do it. We're watching the slow death of college sports, I think. Starting with football. It's going to move to basketball. We'll see how far that goes.
Evelyn Douek:
That's a bit of a bummer to end on.
Alex Stamos:
I'm sorry. But the tournament's still going on.
Evelyn Douek:
Excellent.
Alex Stamos:
It's Final Four time. Stanford is not in either of them. Neither is Cal.
Evelyn Douek:
He was so close to having a happy ending, but just couldn't manage it this week. All right.
Alex Stamos:
Sorry. We shouldn't record on Mondays. I feel like it's just ... I'm not in the mood to have a happy sports anecdote. Sorry.
Evelyn Douek:
That's okay. All right.
Alex Stamos:
Sacramento Kings are doing great. I grew up in Sacramento. I went to a gazillion Sacramento Kings games. I was actually at the first game in Sacramento as a kid. The Sacramento Kings are doing great. There you go. We'll turn this into a Sacramento Kings update. Go Kings.
Evelyn Douek:
Fantastic. All right. Let's close it out before we ruin this beautiful equilibrium that we've arrived at. This has been your Moderated Content weekly update. The show is available in all the usual places, including Apple Podcasts and Spotify, and show notes are available at law.stanford.edu/moderatedcontent.
This episode wouldn't be possible without the research and editorial assistance of John Perrino, policy analyst at the Stanford Internet Observatory, and it's produced by Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman.