Moderated Content

MC Weekly Update 9/19: The Lawyers Always Win

Episode Summary

Alex and Evelyn discuss reporting on a proposed deal between TikTok and the US government for it to continue to operate in the country, and the broader geopolitical context of US-China relations; how to think about search-term blocking; YouTube preventing Russell Brand from monetizing his videos on its platform; the Musk stories from the week that matter; and the enjoining of the California Age-Appropriate Design Code by a federal judge in California.

Episode Transcription

Evelyn Douek:

How are you doing, Alex?

Alex Stamos:

Oh, okay. It's just that college admissions in America are crazy, and the pressure we put on students is totally ridiculous. Is it this bad in Australia?

Evelyn Douek:

It's differently bad, I think. It's bad, but not in the same way. You sit some exams.

There's no holistic evaluation of you as a person and whether you're inspirational. There's just a mark that you get.

Alex Stamos:

Oh, so it's not like with Stanford undergrads where every single one of them is a total gunner that had to invent five new clubs that they're the president of, even to be considered by Stanford?

Evelyn Douek:

Right, exactly. No, nothing like that. Although this has pros and cons. Because, fun fact, one of my favorite facts about me as a professor at Stanford Law School is that I actually didn't get into law school the first time I tried, out of high school. So there you go.

Alex Stamos:

Okay. It sounds weird to me that you have to apply to law school when you're 18. It sounds a little odd, but it has obviously held you back significantly.

Evelyn Douek:

Yeah.

Alex Stamos:

If you're from the Australian legal establishment, well, you guys messed that one up pretty hard. Congrats.

Evelyn Douek:

Yeah. Somehow, I escaped. Welcome to Moderated Content's weekly, slightly random and not at all comprehensive news update from the world of trust and safety, with myself, Evelyn Douek, and Alex Stamos. Alex, it felt like there was a time when all we would talk about on this podcast was TikTok, and what was going on with the efforts to ban TikTok, constrain TikTok, make TikTok safe.

It actually feels like we haven't talked about TikTok in quite a while now. The occasion to talk about it again today is that The Washington Post had reporting this week that TikTok's parent company, ByteDance, and the US government are back at the negotiating table after a break, to try to renegotiate a deal for the app to continue operating in the US.

The teams from ByteDance and the Committee on Foreign Investment in the United States, CFIUS, sat down to talk about a deal that had been proposed by ByteDance more than a year ago. The Washington Post had details of this deal, and it was confirming reporting that had originally been in Forbes by the fantastic Emily Baker-White about what the deal would allow.

It's a pretty incredible deal in terms of the extent of control that the US government would be given over TikTok. The document itself isn't public; we just have summaries of it from the reporters who have seen the deal. But it gives the US government veto power over content moderation and policy changes, and all sorts of supervisory authority over TikTok.

It just feels like we're in a parallel universe, where the First Amendment isn't really a thing when we're talking about this.

Alex Stamos:

Yeah. It's a fascinating document to read in the context of the excellent episode you and Genevieve did last week talking about the Fifth Circuit decision, which, as an update, looks like it might go to the Supreme Court. We might get some interesting frameworks from the Supreme Court on what inappropriate influence is.

In this case, it's TikTok saying, "We will give up our First Amendment rights," which I guess they have the ability to do contractually. Or do they? Are there examples of human citizens, not corporate persons, surrendering First Amendment rights like this?

Evelyn Douek:

You can surrender rights but, of course, TikTok's rights aren't the only rights that are at stake here.

We're also talking about all of the users' rights that would be impacted by the government having this kind of control.

Alex Stamos:

Right, which they can't waive.

Evelyn Douek:

Right.

Alex Stamos:

Or at least you and I would assume that, but there's really no precedent for this, for an intermediary body to say, "We will waive the right." You couldn't imagine a book publisher saying, "We agree that we will edit the books that we publish, or we will not publish certain kinds of books," and then that affecting the rights of authors. Yeah, it's a fascinating question. Again, the focus on TikTok in a bunch of these content moderation debates, I think, is a little silly.

Because to me, and we've talked about this multiple times, the most interesting risks here are all data privacy risks. The simple solution, instead of creating some crazy end run around the First Amendment that only applies to one company ever, doing all this stuff for one company, is that we should have a federal privacy law that, unlike GDPR, explicitly handles data transfers to China.

A law that explicitly handles which countries have appropriate protections and which have intelligence services that we don't trust, and then applies to every company. TikTok would be at the top of the list, but not the only company on it. Think about the amount of time that the American taxpayer has paid for these folks to come up with rules that only apply to one company ever. It is an important company, but it's not the only Chinese company that does important things.

It's not the only foreign company that does important things. This just feels like the weirdest, slowest way to do appropriate regulation.

Evelyn Douek:

Yeah. It's, of course, important to note as well that the reason these negotiations have restarted is that we went through this whole news cycle where Congress was like, "I know what we'll do about this problem. We will ban the app." Then we had to go through this entire thing where a bunch of people were like, "Excuse me, sorry, problem: a pesky little thing called the First Amendment. That's not exactly kosher either."

We went through that whole thing. Talk about all the taxpayer money that was spent, and the reporters' time, and our time, our special time, that was taken discussing that proposal, which was always just a little bananas, really. Then they're like, "Okay. Well, let's go back to the negotiating table." I'm glad you brought up the jawboning conversation that we had last week, because it does feel like whiplash.

On the one hand, we're having this important, nuanced conversation about how much pressure or encouragement or cooperation the government can engage in with platforms on content moderation, and where the line is. Then meanwhile, you have this company and the government having this back and forth that just seems to be in a completely parallel universe. But I guess there is this fear of the foreign, this China exception, in a lot of cases.

Alex Stamos:

Right. We're seeing a back and forth here. We need to understand TikTok in the context of the overall decoupling between the United States of America and the People's Republic of China, some of which is being driven by individual commercial and corporate decisions, and a lot of which is being driven by each government. So there are two other things that we could talk about in this context.

One, and this is a little bit out of bounds for us, but I think it comes in bounds because of this topic: the US has effectively created outbound CFIUS requirements. Traditionally, CFIUS, the Committee on Foreign Investment in the United States, rules were preventative: Iran comes in and buys General Electric or something, some critical defense contractor or critical American company.

CFIUS therefore, and this makes sense, has control in the TikTok situation, because the creation of TikTok goes back to an acquisition that included American assets. But now what the Biden administration has done is create rules for outbound investment from the United States into China that specifically prevent investing in anybody who's developing or exploiting sensitive or advanced technologies,

or products critical for military, intelligence, surveillance, or cyber-enabled capabilities. This could very much hit almost anything in the semiconductor world: optics, satellite technology, anything that's ITAR-controlled. It's fascinating because it is a limitation on American private equity firms and especially venture capital firms. We've got that going on, and then the Chinese are responding.

To quote Mike Tyson, "Everybody's got a plan until you're punched in the face." Well, it turns out your adversary gets to make decisions too. We're seeing some interesting posturing from the PRC, threatening American champions. The target right now looks to be Apple; the PRC has made a number of feints in this direction. Now, it's not clear exactly what the rules are going to be, but there was a rumor that they were going to ban iPhones at any state-backed companies.

The Chinese denied it, but there's been some confirmation, so it's not clear exactly what's going on. But when they denied it, they also said, "But there are lots of security problems with iPhones," which is true. iPhones have a lot of security patches, but so does every other major consumer device that runs a software stack. I've got my problems with how Apple does some security stuff, but to use that as an excuse to ban them versus some Huawei phone is just ridiculous.

It's clear that this is retaliation, and Apple stock dropped significantly on that news. This is obviously super complicated for the Chinese, because the Apple iPhone is the world's most important consumer product. It is probably one of the most complicated things that is assembled and shipped out of China. It is a huge, crowning achievement for the PRC that they are behind the manufacturing of this incredibly important product.

Chasing Apple out of the country would be a humongous mistake. It looks like they're trying to calibrate a little bit, not on the supply side, but all on the demand side. But it is interesting to see all these things play out.

Evelyn Douek:

Absolutely. All right. Using TikTok to segue to a segment about search-term blocking, and some of the conversation that's been happening about it over the past week, and whether potentially it's lacking some nuance. There was a post by Media Matters for America earlier in the week that was freaking out because TikTok was blocking the search term WGA during the Writers Guild strike, suggesting that something nefarious was afoot.

It just turned out that this was an inadvertent mistake, because WWG1WGA, "where we go one, we go all," is a common QAnon phrase. WGA just got picked up in the content moderation of that. So don't jump to conclusions. Content moderation is hard and companies are generally very bad at it; don't assume malice where incompetence is often an equally available explanation.
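To illustrate that failure mode, here's a minimal sketch, not TikTok's actual system, of how a matcher that blocks any query appearing inside a blocked phrase sweeps up an innocent term like WGA. The blocklist entry and the substring rule here are assumptions for illustration only:

```python
# Hypothetical illustration of over-broad blocklist matching.
# Not TikTok's actual implementation.

BLOCKED_PHRASES = {"wwg1wga"}  # QAnon slogan: "where we go one, we go all"

def is_blocked(query: str) -> bool:
    """Block a query if it appears inside any blocked phrase.

    Matching substrings of blocked phrases, rather than exact terms,
    is one way a rule aimed at WWG1WGA could also catch WGA.
    """
    q = query.strip().lower()
    return any(q in phrase for phrase in BLOCKED_PHRASES)

print(is_blocked("WWG1WGA"))  # True: the intended target
print(is_blocked("WGA"))      # True: false positive, the Writers Guild
print(is_blocked("strike"))   # False
```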

On search-term-related topics this week, though, there was also some conversation going around about Threads, because, of course, Threads has recently enabled search on the platform. This is Meta's new Twitter competitor. The Washington Post had a story about how it was blocking searches related to COVID and long COVID. If you searched for those terms, you wouldn't get anything at all.

It was a critical story about how this was impeding access to information about COVID at a time when we are still suffering the effects of COVID, obviously, and people are looking for reliable information about it. Alex, what's your take on this story and this kerfuffle?

Alex Stamos:

It's interesting, because you and I have talked a little bit about the fact that there are companies that make trust and safety much easier for themselves because they're limited platforms. The examples are Etsy or Pinterest. If you're a platform for selling handmade goods, there will be trust and safety issues, obviously, but they're a lot more restricted. What your anti-vax policies are is much simpler for an Etsy.

If you're Pinterest, you're like, "We're for people to put together look books for weddings or fun vacation places. We're not here for political speech." By reducing the expectation of freedom of speech people have, you reduce the need for you to be the referee on every single complicated topic of the day. It looks like what's going on here is that both TikTok and Threads are trying to get themselves partially into that place retroactively.

Effectively, TikTok was never meant to be a political platform. It became one because it was important. They're unraveling that a little bit: "Hey, how about you guys go back to eating Tide PODS and doing fun dances?" That's way better than talking about vaccines or strikes, in this case. Threads, being the newest entrant to this space, is like, "Well, maybe we're not going to be a Twitter."

You constantly see this thing in product development and product management for Threads. There are people inside of Facebook who are obviously like, "Let's just copy Twitter and crush Twitter, what used to be Twitter." Some people are like, "Oh, this is an extension of Instagram," and this clearly seems to be the extension-of-Instagram model. Instagram is a fun place: pictures of your food, bikini shots on the beach.

But it's not going to be a place where we talk a lot about deep topics. A great way to reduce your risk is to get rid of the ability for people to do these very broad searches like COVID or sex or porn. It eliminates entire types of discovery. That doesn't mean the content doesn't exist, but the ability for that content to spread is obviously massively reduced if you think about it.

Evelyn Douek:

Yeah. The Washington Post also said that it ran some other searches and found that sex, nude, gore, porn, vaccine, and vaccination were also blocked from search. I think there's a conversation to be had here about the kinds of terms that are more likely to pull up violating content, or conspiracy content, or something along those lines. It might be that COVID or long COVID is a search that's important for getting people information about healthcare and what's going on.

But it also might be a term where the risk of allowing those searches is really high. That's an error-cost calculation: which side of the line do you want to err on? Do you want to err on the side where you allow people access to information, with the risk that it might be bad information, or the side where they can't get to good information?

It does raise questions for me, and we've talked about this a little bit before, about how quickly Threads is scaling up its content moderation teams to handle these kinds of trade-offs and these kinds of risks. Is this something that they're permanently leaning into, where it's like, "We are going to try to be a place for fun and happiness, and try not to let people access this content as easily"?

Or is this just a case of building the platform as they're flying it, where they will allow these kinds of things once they have the teams to make sure the results are safer? But this is Meta. I don't shed a lot of tears for Meta having to scale up its content moderation teams quickly. I think it's interesting that Threads isn't on top of this more quickly.

Alex Stamos:

It also is an interesting indicator for the long-promised integration of Threads into the Fediverse. It's a little weird to say, "We're going to plug into all these tens of thousands of uncontrolled servers."

That seems like a different kind of model than "we block the word COVID for search," right? It's like, if you don't like people talking about COVID, then you're really going to love longCOVIDtalkers.com, an entire server devoted to people who believe they have long COVID.

Evelyn Douek:

Okay. Speaking of platforms using content moderation as, I guess, scandal control, or being concerned about their image, let's talk about Russell Brand and YouTube. YouTube, I think it was this morning or yesterday, suspended comedian and actor Russell Brand from making money from his videos on YouTube. He's no longer allowed to monetize on the platform, as a result of the investigation and reporting over the last few days about four credible accounts of sexual assault.

There is apparently a policy that, "If a creator's off-platform behavior causes harm to our users, employees or ecosystem, we may take action to protect the community," is what a spokesperson said. This is a remarkably broad policy that is kind of incredible. It does seem like the actual name of this policy, in this particular case, is: we are not the other platform, where the CEO is just defending Brand. We're not that guy.

But I think it's really interesting, and we've talked about this before, that there's a need to distinguish between monetization on a platform, the way in which a platform allows users to make money, versus allowing users to post content. It's still a strikingly broad policy, though, and it's not clear to me it's ever going to be able to be enforced consistently.

Alex Stamos:

This brings up the thought experiment: an alien lands on the planet, and the question they ask you, of all the things they could ask a human being, is what is the difference between Tucker Carlson and Russell Brand, and Fox News and YouTube? This is a very specific alien. This is an alien who listens to Moderated Content. You throw out how Section 230 is written and all this history.

And you just have to think, "Okay. These are two guys who sit in front of cameras, which creates a stream of audio and video that is sent to a centralized location, which then modifies it, puts it on a CDN, turns it into IP packets, and delivers it to people's televisions." Now, there are some differences, obviously, but in the end, especially in the post-COVID era, for a period of time lots of TV hosts were doing it from their houses.

It wasn't necessarily Tucker going into the midtown studio. And YouTube has all these services and tools for creators that are way above and beyond; they run conferences. They treat these people as talent, right? Creators have talent agencies. Just like Tucker Carlson has a talent agency that represents him, PewDiePie and these guys have agents who negotiate with YouTube on their behalf. So what is the difference between a Russell Brand and a Tucker Carlson?

Famously, Fox News was held responsible for things Tucker Carlson said. I think YouTube is, one, trying to distinguish themselves. I think they're also looking around the corner here, understanding that in the modern, over-the-top era, it is really hard to make the argument that they are not a news network in the same way that Fox News or CNN or MSNBC are, for anybody who doesn't have a terrestrial broadcast license, for all these people.

If you explain it to an alien, it's really hard to distinguish between those two. The fact that the law treats them separately seems like something that can't survive, because there's a significant mismatch between the law and the reality. If you are going to show ads, keep a lot of that ad money, and give a bit of that money to the talent, to the person who's on air, which is exactly what Fox News does and what YouTube does, then you're probably, eventually, going to be held responsible for what comes out of it.

Evelyn Douek:

Yeah. Just to recall, there was the big Supreme Court case last year, Gonzalez v. Google, where there was discussion about whether a platform is liable if it amplifies content. The Supreme Court ducked that question, but there were these Easter eggs in the opinion, this constant flagging of monetization and suggestion.

The flagging suggested that monetization was something they'd be interested in as a potential way to pierce Section 230. Regardless of the legal question and the current status of the law, and whether this is something that could be changed, normatively that feels right to me. It does feel right to me that when you are making money from someone and helping them make money,

your relationship with that person, like Spotify's relationship with Joe Rogan or whatever it is, is very different from your relationship with the average user who's just uploading something to YouTube.

Alex Stamos:

Right. There are these weird little things that used to be the difference. The difference used to be that Fox News went out and chose Tucker Carlson, and then they had a wet-ink signature on paper, which he probably signed before going to dinner with Rupert Murdoch and his bosses. Russell Brand, in theory, did a clickthrough agreement.

That stuff starts to really fall apart when you look at the actual relationship YouTube has with its creators, where they're not doing wet-ink signatures, but they are treating them in a personalized, talent-management way. If you go onto YouTube, you can find videos of their creator summits and stuff.

They are picking people out and making editorial decisions: this person is very important to us, and this one is not. I think that's also reflected on the site itself. Anybody who opens up YouTube in an anonymous window will see editorial decisions that are clearly being made by human beings, putting front and center certain creators who they believe will entice folks to stay on the platform.

Which is not that different from putting people in your primetime lineup so that, if viewers are flipping through channels, they stop, and then they watch your ads for gold bars or house insurance or reverse mortgages.

Evelyn Douek:

Okay. Speaking of the platform that everyone's trying not to be, over to our Twitter corner. We are trying not to get as caught up in all of the Elon Musk fluff that's going around, and boy, is there a lot of it in the last few weeks, especially around the release of Elon's new biography by Walter Isaacson.

Focusing on some of the actually important stories, let's start with the one part of Isaacson's book that I think is far more important than a lot of the other stuff we're seeing reported on, and that is not getting anywhere near as much attention.

That is the reporting that Elon Musk told Bari Weiss last year that Twitter would indeed have to be very careful about the words it used regarding China, because Tesla's business would be threatened. China's repression of the Uyghurs, he said, "had two sides."

Alex, congratulations, you called it right from the very first podcast about this: that Elon Musk's exposure in China was going to have a pernicious effect on content moderation on his platform.

Alex Stamos:

Yep. I don't know what to say. The People's Republic of China is the most important authoritarian state in the world. They have a massive censorship regime. They are trying to influence people's beliefs about them. They were just caught by Facebook running a massive propaganda campaign across social media. Interestingly enough, Twitter did not announce any equivalent investigation, which is the kind of thing you would've seen in the past.

I don't think it is a shock that they are going to be very aggressive in using whatever levers they have. If you have a CEO who already has a taste for authoritarianism and a respect for authoritarians and strongmen, which he expresses multiple times when he talks publicly, and who also has lots of money tied up in one country, they're going to use it. Yeah, this is a huge deal. He says it right there.

It's right there in black and white, and nobody's talking about it. I think the Ukraine and Starlink issues are actually quite important, but there's a little bit of static there about exactly what decisions he made, and whether they were positive or negative decisions, whether it was declining to create capability or taking away capability that existed. In this one, he just lays it all out.

Evelyn Douek:

You're totally right about the Starlink issues. I hadn't actually been thinking about that in the Isaacson context, because there's been some great reporting about it recently in The New York Times and The New Yorker, including by Ronan Farrow.

I totally agree that there's just this underlying question of how much power we want to give to a single, mercurial man. Is this really how we want to run a public sphere or a country? But this is the situation we're in.

Alex Stamos:

Well, and there are two totally different things here. To go back to government influence, I think it'd be totally inappropriate for the US government to try to influence Twitter's decisions in the opposite direction of the Chinese. I think you and I would agree on that. SpaceX, though, is a company that only exists because of US taxpayer dollars.

If you have a company that is a massive federal contractor, and actually a defense contractor, then I think the rules are a little bit different in how much the American public gets to have influence over the foreign policy decisions there. Whereas no federal money goes to Twitter, and obviously the First Amendment holds there in a way it doesn't for satellite decisions.

Evelyn Douek:

Right. And it's not just the Chinese that Twitter or X is susceptible to pressure from. The Washington Post, again, had good reporting this week about how Twitter is complying with German data requests much more under Musk than it had been previously. There were some pretty shocking quotes in here about how Germany has really ramped up enforcement of its online hate speech laws, is quite aggressive in that enforcement, and Twitter is basically enabling it; the story says hundreds of new cases are being pursued as a result.

There's a quote there that, "Before Musk, we almost never got data for digital hate crime cases. After the acquisition, we almost always did." There's an example in the article: "In one recent case, prosecutors in the German state of Bavaria used data to identify a suspect who mocked Markus Söder, the leader of the conservative CSU party." I'm sure I'm saying this wrong, so apologies to our German listeners.

The suspect called him Södolf, a play on his last name and the first name of Adolf Hitler. To the American ear, in terms of the hate speech that we would be concerned about or think justifies prosecution, that is pretty shocking; that's pretty core political speech. This is the kind of thing where Musk is now complying with data requests in order to enable these kinds of prosecutions. The concerns about that should be pretty obvious.

Alex Stamos:

But fortunately, Twitter has continued with all of their transparency, right? We're able to just download and see what's going on, because that's what Musk said, that he wanted transparency in all these decisions.

That's one of the reasons he took over Twitter, because he didn't want governments doing these things in the dark. Right, Evelyn?

Evelyn Douek:

Yeah. No, definitely.

Alex Stamos:

You can go right now and see all these decisions that have been made, or at least statistics.

Evelyn Douek:

The trends in compliance with these kinds of takedown orders or data requests, before and after.

Twitter, famously, is really pro-transparency. That was what the Twitter Files were all about, as we well know.

Alex Stamos:

Okay, let's look. I'm going to pull it up, and the numbers only go through December 2021. Well, I'm sure they'll get right on that.

Evelyn Douek:

Oh, bummer. Did something big happen there?

Alex Stamos:

Yeah.

Evelyn Douek:

Yeah, exactly. Okay. Speaking of transparency, actually, this is an interesting one. Musk has also sued in California over a new law that California passed, which would require transparency from social media companies about the content moderation actions they take. In this age of the turn against voluntary platform disclosures, we are seeing more and more legislative mandates for transparency.

This is a lawsuit challenging that law in California. We've also seen transparency as an important part of the NetChoice cases coming out of Texas and Florida, so this is not just a Democratic-states thing. And this is a serious lawsuit. It is not just trolling or trying to own the libs or anything.

It has one of the most famed members of the First Amendment bar, Floyd Abrams, on the brief, who is also famous for being Clearview AI's lawyer and for having a very absolutist idea of what the First Amendment should allow. I think we should talk about the transparency mandate issues, and we will talk about them a lot more on this podcast, because it's a really tricky area.

Suffice it to say that I think there are genuine constitutional concerns with some of the ways these transparency laws have been drafted or passed. But I do think that some of the rhetoric around these transparency mandates, in the commentary and in these lawsuits, could transform the First Amendment into a pretty scary, all-purpose deregulatory device that prevents governments from passing important kinds of laws and could endanger large parts of the administrative state.

I think there are meaningful, difficult issues here. I think this is a good thing that needs to be litigated, but I do worry that we're going to create some bad law that could be overbroad and result in some pretty libertarian outcomes, in terms of hampering government power.

Alex Stamos:

Yeah. Like you pointed out, this doesn't line up with clean partisan goals, because you have rules out of Texas and the like that are much more aggressive in saying what platforms can and can't do from a content moderation perspective. It's very hard to think of a framework where it would be unconstitutional to require transparency, but totally constitutional to say you're not allowed to do these things, or to create legal liability for First Amendment-protected decisions around content moderation.

Yeah. I think this is actually a great topic for you to cover with one of our law professor friends, because I don't have a lot of confidence in my ability to speak to it intelligently. Personally, just as a non-First Amendment lawyer, I see requiring transparency as a reasonable compromise: these companies can make these decisions, they have the right to make the decisions.

But they have to provide some knowledge to the people the decisions impact, so that there are, not legal responsibilities, but at least some commercial impact based on what individuals choose. It's, from my perspective, an informed-consumer route: if you don't know what the policies say and what decisions are made under a policy, you can't decide which of the dozens of different platforms you might want to carry your speech.

I see the transparency requirements as closer to requiring labels: you can put whatever you want into this product, but you have to tell people. We have a long history in the United States of the Supreme Court allowing the government to say, "Okay, you can sell cigarettes, but you have to be honest about what the impacts are." This is not saying X is possibly bad for your health. It's making Twitter say, "These are the kinds of enforcement decisions we made," and then giving individual consumers some choice.

Evelyn Douek:

Yeah. It's been fascinating to watch, because the position you just described, I think, would've been the mainstream, almost the orthodoxy, even five years ago on transparency mandates: of course, we can't have the government regulate the substance, but let's have transparency and informed debate. Over the past few years, there's been a marked turn against transparency.

In part because it has been weaponized by government actors in ways that show, I guess, the perils of government enforcement of even transparency mandates. If you look at the way that Texas and Florida have talked about transparency mandates, it's clear that they have pretty partisan ends that they're trying to achieve by having these laws.

Even New York has a law requiring disclosure of hateful conduct policies that was also trying to achieve a substantive end through a transparency mandate. The idea is less, "Oh, let's trick them; it's a transparency mandate, so it's going to pass muster," and more that it's actually trying to achieve substantive goals.

As a result, we've seen a lot more skepticism, a lot more hesitancy, a lot more nervousness about transparency mandates and their potential for abuse. But for all of the reasons that you laid out, I get very worried about giving up on transparency entirely, or on the idea that the government can mandate transparency at all.

Because then we're in a situation where you can't regulate these enormously powerful corporations substantively, and you can't even get transparency from them about what they're doing, to understand what's going on in our public sphere.

Alex Stamos:

Yeah. Just from a practical perspective, there are a ton of transparency requirements in the Digital Services Act, and no reasonable legal challenges that I know of to stop its enforcement in Europe.

We'd end up in a situation where, if I were European, I would have decisions about my content explained to me, but as an American citizen, I'm kept in the dark.

Evelyn Douek:

Right. It does also undermine the arguments being made, like, "Oh no, we can't have these transparency mandates. We could never possibly comply with them. They're impractical."

Meanwhile, over in Europe, they're complying with them, or they will be shortly.

Alex Stamos:

Now, I don't know enough here; I probably should dive into the California law and whether it goes way beyond what the Europeans require. But if you calibrate the California law to basically say, "Hey, if you're doing it for Europe, just turn it on for America," that seems like a reasonable level. Again, just requiring transparency about this gives people the ability to make decisions.

For the folks on the political side pushing the idea that these companies are big censorship machines, transparency should be something we can all agree on, because then at least you can make a public argument about whether the decisions the platforms are making are intelligent or not, appropriate or not, or unduly influenced by the government or not.

Evelyn Douek:

Yeah. Well, we will return to this a number of times, and I will bring our professor friends, as you call them, on to talk about it, because there will be this case that Twitter is pursuing in California.

We have a New York case, we have the Texas and Florida cases, so it's going to be a big year for transparency mandates.

Alex Stamos:

Preferably professors who got into law school the first time they tried.

Evelyn Douek:

I'll add that as a screener question to my guest requests.

Alex Stamos:

Who didn't go through the back door of Harvard Law. Yeah.

Evelyn Douek:

That's right, exactly. Okay. Speaking of the NetChoice cases, over to the legal corner for the week. This one is for the NetChoice restatement-of-the-law files: a federal judge in California has granted NetChoice's request for a preliminary injunction enjoining California's Age-Appropriate Design Code. I'm not sure we've talked about the Age-Appropriate Design Code on the podcast before.

I don't think today's the day to get into the nitty-gritty of it, because I'm sure we're going to have opportunities to do that in the future; this lawsuit is absolutely not going to end here. Suffice it to say, I think the law is problematic in many respects and very poorly drafted. It's being framed and sold as a law about privacy and about minimizing the data that platforms collect about children.

But it's also pretty clearly aimed at constitutionally protected speech that is "harmful to children," which raises a whole bunch of concerns that we've talked about in other contexts before. I was very ready to read a decision that I completely agreed with in terms of enjoining this law. For example, I agreed with large parts of the Arkansas and Texas rulings in the last few weeks that also struck down age verification laws.

But this ruling from the California judge is actually pretty wild and quite broad, and it includes some language that, if allowed to stand, would imperil almost any kind of privacy legislation in a way that's simply not mandated by the precedents. It also seems to require an unusual degree of evidentiary proof from the state about why it wants certain things.

For example, there's a mandate that platforms write their policies in age-appropriate language for the children likely to be using the platform, so that those children can actually understand the rules. The court's response is, "Well, that's unconstitutional, because there's no evidence that that actually protects children from harm," which seems to me pretty crazy.

I think we can understand why children being able to understand the policies on a platform might be helpful for them as users of the platform. And there's a lot more in there. Suffice it to say, it's a very libertarian and, in my opinion, pretty sloppy ruling. I would be really shocked if the California AG doesn't appeal, and I think they should, not because I think they will win on appeal.

As I said, I think this law is pretty vulnerable to constitutional attack. But I think they should try to get some better law here that narrows the unnecessary breadth of some of this language for future cases. That is the podcast for today. Any pressing sports news we need to know about, Alex, for your favorite part of the podcast this week?

Alex Stamos:

Well, there's actually a sports-and-legal corner, which brings ours together: there's now a huge legal battle over college realignment. We've discussed the death of the Pac-12 ad nauseam. One, the Pac-12, which still exists as of right now, is kicking total butt. It actually looks like it's going to have the best record against out-of-conference opponents of all the Power 5 football conferences.

Great. Thank you, Fox Sports and ESPN, for killing the conference that is winning more against its out-of-conference opponents than anybody else. I'm so glad that happened; makes total sense. That's one thing that's happening. But what's also happened is that 10 of these teams have announced that they're going somewhere else, and there are only two teams left, Washington State and Oregon State.

They have now sued to basically say that, under the rules of the Pac-12, all the teams that have announced they're leaving are not allowed to sit on the board anymore and make decisions. You might end up in a situation where all of these teams are playing in a conference in which Oregon State and Washington State, who are both extremely angry about being left behind, are the only people with any voting rights.

They could pass all kinds of crazy things, like the Pac-12 champion has to be from a state that grows apples, or has to be east of some arbitrary line that just happens to catch Pullman and Corvallis. Really, what it's about is the money that's left, the assets that belong to the Pac-12. They're like, "Well, you guys are all leaving, so thank you for leaving behind all that money, which the two of us can now take to help offset the fact that we are currently not part of major conferences."

It's a fascinating thing. There was a lawsuit, and the judge actually put a temporary restraining order in place to keep the Pac-12 from having a board meeting. We should probably do some research here; we could do an entire legal corner on it. I don't think there are any First Amendment arguments, but there's some very interesting contractual stuff.

Evelyn Douek:

There are always First Amendment arguments. Alex, you can't prevent us from having a board meeting. Are you kidding?

It is my First Amendment right to speak. You're just not thinking hard enough if you can't find a First Amendment argument.

Alex Stamos:

Yeah. Once again, the lawyers win, right? The big winners of all of this are the television networks and the lawyers.

Great. Congratulations, college football, that's exactly what college sports are supposed to be about. They're supposed to be about lawsuits.

Evelyn Douek:

The lawyers always win, that's for sure. That's why I tried a second time to get into law school. With that, this has been your Moderated Content Weekly update. This show is available in all the usual places, including Apple Podcasts and Spotify.

Show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory. It is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman. Talk to you next week.