Moderated Content

MC 3/29: It's the Best of Times, It's the Worst of Times, in Platform Transparency

Episode Summary

Alex and Evelyn talk about the dismissal this week of X's lawsuit against CCDH over its reporting on hate speech on X, plus online rumors about Kate Middleton and the Francis Scott Key Bridge collapse, and what they say about the health of our online information ecosystem. Then Brandon Silverman, co-founder and former CEO of CrowdTangle, joins to talk about the state of platform transparency tools in the wake of Meta's announcement that it is shutting the tool down.

Episode Notes

SHOW NOTES

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments.

Join the conversation and connect with Evelyn and Alex on your favorite social media platform that doesn’t start with “X.”

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

CORRECTION: White & Case is not representing X in its lawsuit against the Center for Countering Digital Hate. The lawsuit was filed by former White & Case partner Jonathan Hawk, and he is representing X in this case with his current firm, McDermott Will & Emery.

 

Evelyn Douek: ... slightly more popular podcast, I think with someone, some tech commentator that our listeners might have-

Alex Stamos: Kate Smisher?

Evelyn Douek: Yes. Stisher? Stitcher?

Alex Stamos: Stitcher?

Evelyn Douek: Something like that.

Alex Stamos: It was interesting. Yeah, yeah. For those of you who don't listen, I got to be on Kara Swisher's podcast with Taylor Lorenz and that was... This is what I'm going to say, I really like having you as my podcast partner, Evelyn.

Evelyn Douek: It's very sweet, Alex. Well, thank you. I guess I'll take that as a compliment.

Alex Stamos: I mean, her accent's impossible to understand.

Evelyn Douek: All right.

Alex Stamos: She's not a Sheila, so it's just the-

Evelyn Douek: Exactly.

Welcome to Moderated Content's stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. Before we get started, remember to follow this feed so you never miss an episode, and we're going to jump right in. So can we please get... I think we need a special sound effect this time, the sad Twitter sound combined with the legal corner sound. So let's get one of those.

Okay. And the special occasion for that sound effect is that X this week had its lawsuit against the Center for Countering Digital Hate thrown out by a California district court. We talked about these lawsuits at the time X brought them. Essentially, CCDH publishes reports about hate speech and misinformation on X, and it had published a series about how, in light of Musk's policy changes and the firing of all of the trust and safety team, there was apparently a spike in hate speech on the platform. And Musk, through X, brought these lawsuits, which, as we said at the time, were essentially a defamation claim dressed up as breach of contract claims and claims about scraping. The problem was that the only damages X was alleging were that CCDH was saying these bad things about X, that this was causing advertisers to leave, and that X was upset about it.

And essentially, you can't punish CCDH for engaging in that kind of protected speech. And that's exactly what the judge said in his opening paragraph in this lawsuit this week. I'm just going to read from it a little bit because I can't put it better than this. "Sometimes it is unclear what is driving a litigation, and only by reading between the lines of a complaint can someone surmise a plaintiff's true purpose. Other times, the complaint is so unabashedly and vociferously about one thing, that there can be no mistaking that purpose. This case represents the latter circumstance. This case is about punishing the defendants for their speech." And as we know, this is Evelyn now editorializing, America doesn't really like punishing people just for things that they say. There's a provision in the Constitution about this. And so the judge threw it out, which is a great win for free speech.

Alex Stamos: Yeah. I mean, this whole opinion is kind of amazing. You got to love that the judge just said it upfront. I'd put him up there with our friends in the Delaware Chancery Court for clear legal writing. There's no obfuscation here behind a lot of pseudo-intellectual phrasing; he just said, this is about punishing defendants for their speech. And like you said, they're hiding it behind a bunch of scraping claims.

The scraping stuff is interesting to me because I think scraping has always been an interesting thing for researchers, but it's become a really big area of law thanks to AI. And so I am interested to see how this progresses, because the only way you can reasonably do studies of certain behaviors online is through some kind of scraping. But there's also, I think, a legitimate need for sites to be able to control which of their content ends up being included in commercial AI models that make billions and billions of dollars. And how the courts balance the socially positive uses versus the commercial uses will be interesting. But in this case, it's not even close.

We've discussed the CCDH study multiple times on the podcast. I have methodological issues with it. I'm not a big fan of CCDH's work, to be frank, but they have an absolute First Amendment right to be wrong, and Elon Musk should not be able to punish them. So a legal question, Evelyn: do you know, under California's anti-SLAPP statute, is CCDH going to be able to move for penalties here?

Evelyn Douek: Yeah. So I don't know if they get penalties, but they will get their costs back. So this is a really, really important part of the ruling. These are called strategic lawsuits against public participation. If you have well-resourced but vindictive plaintiffs, Musk being the classic example, they can bring pretty specious lawsuits against these kinds of entities to basically make the process the punishment: you shut these people up because they just don't want to face a lawsuit, regardless of whether it gets thrown out in the end, because you have to pay all of your attorney fees defending these lawsuits.

And so California has this anti-SLAPP statute that allows defendants in these kinds of lawsuits, the ones brought purely to shut them up, to get their attorney fees back, so that the process doesn't become the punishment. And so this is why it's a really, really important win for free speech that the judge found that this was a SLAPP and that CCDH will be able to get those costs back.

And it also shows... This is one of those things that, as an Australian, I think is crazy: that anti-SLAPP statutes aren't just the norm in every jurisdiction, and that there isn't a federal anti-SLAPP statute, for example. It just seems obvious to me that you shouldn't be able to use the courts and the process of the courts to shut people up vindictively like this. And so it's just a really important provision.

Alex Stamos: It's kind of shocking to me that they filed this in the Northern District of California. I mean, that is the appropriate jurisdiction, but judge shopping has become such a normal part of oppressing people's speech. I'm not neutral in this. I'm currently a defendant in a lawsuit that is trying to punish me for my first amendment protected speech and for the speech of my students. And it was filed in a jurisdiction I've never been to, that I've never visited, that we did not target, by a person who is not mentioned in our research, as clear, clear, clear judge shopping, because there's only one judge in that district and his proclivities are pretty clear. I am a little afraid that Musk, despite his claims, will continue to be Mr. Anti-Free-Speech and will try to use the courts to punish people who criticize him or criticize his companies.

But he will likely not make the mistake of filing it in the Northern District like he should. He's going to somehow shop it into a jurisdiction where... The crazy part for me on the jurisdictional stuff is that the same judge that you shop into is the judge that gets to decide whether they have the right jurisdiction, which is nutty to me. It seems to me that should have to be a neutral determination, because all you have to do is file with the right judge, and if they want to keep the case, they will find some excuse to keep it.

Evelyn Douek: Right. Yeah, I mean, this is a bigger conversation, but a lot of our judicial system and legal system is premised on the assumption of good-faith judicial officers who are faithfully interpreting the law. And I don't know what you do once that starts to fall down, but-

Alex Stamos: That's definitely started to fall down over the last couple of years, unfortunately.

Evelyn Douek: Right, exactly. The scraping thing, just to go back to that, is a super interesting point. I mean, this is a huge issue: how to think about scraping of these websites and protections for them. But the important point here is that this was nowhere close, exactly as you said. The, quote-unquote, "scraping" in this case was CCDH using Twitter's search tool to collect public tweets from 10 Twitter users and then doing an analysis of them. That was the kind of scraping at issue.

So we're not talking about the kind of mass automated scraping that we talk about in terms of the development of AI tools and things like that. And so this just got thrown out, which again is a really, really important win for free speech and an important win for this kind of research, which is becoming increasingly difficult. We'll talk more later about the turn against transparency by these platforms. But if you can't do this kind of normal interaction with a platform and analysis of what's going on on the platform, we're not going to have any visibility at all into what's happening online.

Alex Stamos: Yeah. And it also, again, surprised me that they filed in the Northern District of California, because the most important scraping cases are in the Ninth Circuit. And the Ninth Circuit has gone in this direction. The Nosal cases and the hiQ case have been going in this direction, saying that scraping is not something you can punish under the Computer Fraud and Abuse Act. Do you know what law firm filed this? Because this is just frankly bad litigating. I'm not a professional litigator, but it's just bad litigating from my perspective to make these claims when you know there's a huge amount of case law that already directly addresses this, from exactly this district court.

Evelyn Douek: Yeah. I don't actually have the name of the law firm in front of me, which is maybe a good thing, not to get personal, but also just... I mean, they knew. They realized at some point; there was a motion to amend their complaint because they'd made all of these tactical errors in bringing this lawsuit, which suggests, and I don't think I'm going out on a limb here, that maybe this wasn't the most well-thought-through, strategically planned lawsuit that was ever brought. I think that's part of these SLAPP lawsuits: they were just bringing it to try and scare CCDH, to shut them up and impose this kind of cost. So yeah, very, very comforting to see that the judicial system, at least in this case, worked as it's supposed to and protected CCDH. Like we've said-

Alex Stamos: It turns out, I found it's White & Case, which is a big, well-respected white shoe law firm, their LA office. It makes me feel that maybe Elon Musk is not the best client to have when you're a litigator. Oh...

Evelyn Douek: No.

Alex Stamos: ... at least if he pays his bills, that's great. I hope they charged up front, like, "Oh, you want us to file a lawsuit that under..." This feels like Musk comes into the office and slams down and says, "They're hacking us. I want you to sue them for hacking." And no matter what the lawyers said, Musk, the world's expert, the expert in everything, had to have his way. So I hope they charged up front for this and were not taking contingency fees later.

Evelyn Douek: I mean, who knows? Maybe he was that specific. Maybe he just came in and said, "CCDH is saying words. CCDH is saying things. I don't like it. Let's sue them. Find me a cause of action." And they did the best that they could, but fortunately it wasn't good enough. So yeah, important protection for researchers, and a nice win to see this week.

Okay. So as is one of our favorite sayings on this podcast, everything is a content moderation story. And one of the biggest stories in the last few weeks has been the rumors about Her Royal Highness, Kate Middleton. I am sure our listeners are devoted followers of this particular storyline, and so we don't need to catch you up on all of the ins and outs of what has been happening with Princess Kate. But the angle that we want to focus on and talk about is a New York Times story, which we'll link to in the show notes, covering research out of Cardiff University in Wales suggesting that Russia had played a part in spreading some of the many conspiracy theories that were percolating around what was going on with Kate, the reason for her mysterious absence, and the distorted photos.

And the researchers say that this is calculated to inflame divisions, deepen a sense of chaos in society, and erode trust in institutions, in this case, the British royalty and the news media. To which I say, "Oh, no. Eroding trust in the British royal family. How will democracy survive? What a terrible fate." That's me being facetious, but Alex, you thought this was an important thing to cover. So what is Russia doing here and why does it matter?

Alex Stamos: Yeah, so first, just on the Kate Middleton story. I have to agree with Ari Voukydis, a comedy writer who tweeted, "I literally couldn't have picked Kate Middleton out of a lineup. Now I know how to recognize Pippa's silhouette in a poorly lit car, and how to pronounce Marchioness of Cholmondeley." So this has been the story that has pulled in all of the husbands and boyfriends of the Kate watchers, of the royal family watchers, including myself: all of a sudden, my wife and I were reading TMZ together wondering, where's Kate?

So first off, as somebody whose family's dealing with some of these same kind of issues, our hearts obviously go out to the princess and everything she's dealing with. I know how hard this must be with little kids, and all of this crap that they're getting around the PR side. If you're caring about how your kids take it and how to frame it up and also worried about your health or the health of your spouse, then clearly all the PR stuff is secondary even if you're a member of the royal family.

So I just want to say that before we dive into the internet side of it. But the content moderation side here is, like Martin Innes was talking about, it does look like Russian accounts are pushing it. I think the meta commentary about the fact that you have Russian disinformation actors talking about Kate Middleton is missing the point, which is that the goal of these actors is to build accounts that blend in with everybody else. And so if you've got the hot story that, like I've said, has captured not just the normal people who hang out on reddit.com/royals or read TMZ every day or read the Daily Mail just for the royal updates, but has caught the attention of lots and lots of people, then it's a topic you're going to want to talk about, because you're going to pull in audience members, you're going to build an audience, and you're going to seem like you are within the kind of normal conspiratorial bounds of their personal Overton window.

And so this is what the Russian accounts have done for a long time, going back to and including 2016. I think this is where people get confused: when I was at Facebook and we released the first data about what the Russians were pushing, the idea was, oh, Russians really care about this, really care about that. No. What they're trying to do is look like this is a legitimate account, that this is actually an American sitting where they say they're sitting, who believes the same things as you. Out of the 10 pieces of content they post, nine are going to be normie things, like Princess Kate and this and that, and then the 10th is going to be Ukraine is against your state and should not be supported by NATO. And so that is the goal here: 90%, 95% of it is filler to make you believe that this is a legitimate voice. And then 5% of it is content that supports the geostrategic interests of the Russian Federation.

And so that's how we should look at the Kate stuff: not that they're behind the Kate rumors or pushing the Kate rumors, but that the overall conspiratorial tone of the internet, the fact that this has become a normal way that people talk and operate, the Russians can take some credit for that, but they're also taking advantage of it, and it allows them to insert themselves into the conspiratorial style. And then the thing they actually care about these days is most likely going to be about Ukraine.

Evelyn Douek: I think it's such an important point to make the distinction because obviously the mainstream media and many high profile people did a perfectly fine job themselves coming up with lots of conspiracy theories, if you like, or just misinformation or rumors or speculation that turned out to be totally false when it came to the Kate story. So we didn't need any help from the Russians in coming up with a lot of crazy... the stories that were saying, "Well, this is just a Photoshop from Vogue transplanted onto this image," and all of those sorts of things. We don't need the Russians to help with pushing that kind of rumor-mongering in these kinds of events.

And so the point is not to focus on the idea of this sort of Machiavellian puppet master pulling all of the strings and making us all believe certain things that we wouldn't otherwise, but on this kind of strategic use by Russia of the openings that exist in this kind of information environment. I think it's important to be particular about what the harm is and what the tactic is, lest we just further that idea that everything is a Russian conspiracy theory, which obviously is not the case.

Alex Stamos: Yeah, absolutely. And so nobody should read this as the Russians created the Kate... And this is what I hate about the media discussion of these issues: you can't point out, as a normal thing, that the Russians are amplifying the Kate stuff without it then becoming Russia's fault. Russia's doing plenty to mess with people on the internet, but it is not because they just want to screw with the UK. It's because they have real geostrategic purposes, and it's working. Here in the United States, you have a significant portion of one party that believes in cutting off Ukraine and allowing Russia to take over all Ukrainian land.

It's crazy to say it, but the Russian disinformation campaign has worked on a significant number of people. And that might actually mean that if Ukraine is not an independent state in five years, a big chunk of that probably goes to Russia's disinformation efforts online, unfortunately. That's the fact. But that is the goal, it is not the Kate stuff, and that's really important for people to keep their eye on the ball.

Evelyn Douek: Right. Yeah, that's a much worse outcome than eroded trust in the British royal family, which although my aunties in New Zealand would be very upset if people turned their back on the royal family, I myself am not super invested in their continuing-

Alex Stamos: You don't have the Kate and William plates. You don't eat...

Evelyn Douek: That's right.

Alex Stamos: Your dining ware every day is not... Yes.

Evelyn Douek: Yeah, commemorative plates framed and hung on my wall behind specialty glass that never shall be damaged. No, that's not me. But I did grow up listening to lots and lots of stories about the royal family. There's a certain generation of people in the Antipodes who are very invested in that institution. So Russia's really scaring and harming them.

Alex Stamos: Well, apparently it's more than just that generation.

Evelyn Douek: It's true.

Alex Stamos: I mean, we fought really two revolutions to throw off the royal family, but it seems to have not worked from the amount of airplay that-

Evelyn Douek: Right. Yeah. I don't know.

Alex Stamos: We don't send them tax money, but they control the mind space of a huge number of Americans.

Evelyn Douek: Right. Yes. Draining our resources by pure attention grabbing, from which we cannot look away.

Alex Stamos: The original clickbaiters, the royal family.

Evelyn Douek: Right, exactly. Okay. So from one story about the health of our online information environment to another, again, because everything is in some form a content moderation story. We want to talk about the tragedy that I'm sure all of our listeners have been watching unfold and saw the footage of this week, which was the collapse of the Francis Scott Key Bridge in Baltimore early Tuesday morning after a cargo ship ran into one of the support pylons. And the footage is just horrifying to watch.

And of course, in the aftermath of this kind of event, which is mind-boggling to watch, and your immediate question is, how did this happen? How does something like this happen? Well, online, many people jumped in with their preferred answers to that question of varying degrees of veracity. Of course, in the immediate aftermath, no one knows anything. And so it's a very, very common situation.

Our colleague Kate Starbird has long written about the process of collective sense-making and rumoring that happens in the aftermath of these kinds of crisis events, as people try to make sense of something that seems impossible to make sense of. But something's different now, which is, again, the online information environment and in particular the situation on X, where things are just kind of out of control and it's impossible to know what's going on. And so this is something that you were watching, Alex, so what did you see?

Alex Stamos: Yeah, so first off, again, our sympathies to the families of the workers who died. I mean, amazingly, thank God: the Port of Baltimore is one of the busiest ports in the country, and this could have happened at any time. If this had happened during rush hour, my God, it could have been hundreds and hundreds of cars in the water. And as everybody who's listening to this knows, this ship lost power, and they were able to call the Coast Guard and say, "We're drifting towards the pier." The Coast Guard was able to get the cops to close the bridge, and if you watch the video, you see the last cars leaving the bridge seconds before the ship strikes. It's kind of amazing, and a real shout out to the Maryland troopers who were on station and able to close it so fast. But unfortunately, there were some workers on the bridge who did not get the message, who perished.

Yeah. So this is actually kind of something I've been afraid of for a while. We've talked about... I'm a big sailor and spend a lot of time on the San Francisco Bay, and after 9/11, one of the interesting things that happened, which got a lot less coverage, has to do with the normal way ships come into ports, which lots of people are learning about now: the captain of the ship is not actually in charge. They pick up these people called pilots, and the pilots go out in these orange boats that say PILOT, with these big rubber bumpers. And they go out, in the San Francisco case, to the seven-mile marker, to what's called the lightship, which is like the territorial waters barrier. And they pick up the pilot in the huge heaving seas.

The ship comes up, they open up a door in the side of the big ship, the little ship pulls alongside, and a guy jumps across. Videos of it are ridiculous. It's straight-up action-movie crazy. And it's an incredibly dangerous job. Fortunately, these folks are very well paid, but it's a very experienced mariner who knows everything about that specific port, because these international shipping lines have captains from all over the world. In this case there's an Indian crew; there are lots of Greeks, lots of Filipinos, lots of people from around the world who might never have been to San Francisco, might never have been to Baltimore. And so the pilot is actually in charge. The crew runs the ship, but there's somebody who's a local mariner who's actually giving orders and is legally responsible for what happens to the ship during that period of time.

And after 9/11, there was a change in San Francisco. The rule went out that you would be boarded not just by a pilot, but by an armed squad of Coast Guard or Marines. And if you did not stop at the seven-mile marker, you would be sunk. Here in the Bay, they were running drills where planes would take off from Fairfield with the goal of seeing whether you could sink a boat between the point where it failed to slow down at seven miles and the Golden Gate Bridge. And it turns out that's an interesting race, whether you can get an F-16 scrambled fast enough given the speed at which these boats go.

So clearly somebody after 9/11 was doing the math on a Panamax container ship with 100,000 metric tons deadweight at 20 knots going up against a pier of a bridge. When people do the math, you're talking about billions of joules of energy stored in there. There's nothing you can build to stop it. So clearly people have been worried about this. People have also been worried about the cyber side, which is where a lot of these concerns come in. CISA has actually run a number of simulations, as well as issued warnings about marine transport systems, because modern ships are incredibly computerized. And in theory, I don't think it would be trivial, but in theory, those ships could be infected by malware that would affect their systems or shut them down. Directly steering them is not so much the worry as affecting the ship. And so there are legitimate concerns here.
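(A rough back-of-envelope check of the figure Alex cites, treating his 100,000 metric tons and 20 knots as approximations rather than official numbers: 20 knots is roughly 10.3 m/s, so

$$E_k = \tfrac{1}{2} m v^2 \approx \tfrac{1}{2}\,(1.0 \times 10^{8}\ \mathrm{kg})\,(10.3\ \mathrm{m/s})^2 \approx 5 \times 10^{9}\ \mathrm{J},$$

which is indeed on the order of billions of joules.)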

There's no evidence for that now. And so moments after this tragedy happens, X in particular becomes completely flooded with conspiracy theories, both that this is an intentional cyber attack and that the Biden administration is covering it up. Now, I'm not sure what kind of cover-up you can do in eight hours. A cover-up would come weeks later. This will be one of the most intense NTSB investigations in all of history. You're going to have NTSB, you're going to have the Coast Guard. If there is a cyber component, you'll have CISA and NSA and FBI. Everybody's going to be involved. This is going to be incredibly large. Thousands of people are going to work on this investigation. Covering something up would be spectacularly difficult, but if it's going to happen, it's going to happen six months or a year from now. It's not going to happen within eight hours. So first, there's that kind of ridiculousness. X becomes completely full of this. And some of it's just normal, weird racist stuff like, "Oh, this is what you get for DEI or immigration."

The person in charge was the local harbor pilot, who belongs to a union. And it looks like... I mean, I've seen some analysis of it... But it looks like there's not much they could have done, that they just lost power at the absolute worst time possible, and everything they did... And then DEI for what... Do you think DEI is why Greeks control shipping? I'll just tell you right now, no. It's actually a vast conspiracy going back thousands of years. Or that you have an Indian crew on a Singapore-flagged vessel? No, it's completely ridiculous.

So there's that kind of stuff, which actually is just full of random racist stuff, just racist. And that stuff goes big. But those are at least... Those are horrible opinions, but they're clearly opinions. But then people are making specific falsifiable claims of cyber attacks, of conspiracies, of terrorist attacks. And this stuff went big on X.

And I think there's a couple of things going on here. One, this is really just the model for X now: selling the blue check marks, getting rid of any verification behind blue check marks, and creating an economic model where a blue check mark can get some money from their content has created a grifter industrial complex. There is now a big incentive to have 400,000, 500,000, 600,000 followers, because you can make money every single month. And because you don't have to get verification, you can create a bunch of blue check mark accounts that are the Liberal Patriot and this and that, just pseudonymous names. I expect for the biggest ones, a bunch of them are being run by a small group of professionals in Macedonia, Nigeria, somewhere where it's low cost for them to do content creation.

Or maybe they're in the West, but they could also be using AI, and they could be using other ways to create lots of content. It's a full business that you could be running on X. And so saying something crazy, sounding like you're an expert in something, is really a huge deal. And then the second thing is that there's no verification telling you who's a real expert or not. This used to be the wonderful thing about Twitter: it was a way that normal people could interact with journalists. They could interact with the presidents of universities and Nobel Prize-winning economists and people who had expertise in all these areas. And so if Twitter was like it used to be, you could have had people with blue check marks who were like, "Oh, well, I'm actually... I teach at the California Maritime Academy, and I am a professor in maritime safety, and here's what I say."

And now it's impossible to tell who anybody is. And so X has basically created the perfect platform for this kind of BS, and created an economic incentive for it. And in the long run, it has destroyed itself, because this is kind of the first major real-time event, I think, where it's become clear that you cannot get your news from X anymore. So it was a great day for Bluesky and Threads and Mastodon and other folks, because one of the last things people liked about Twitter was that it was much more lively in real time when events were happening. And now it is extremely lively, but 90% of the stuff you see is BS. And so it is no longer a place that you can go. It was never a great place to get breaking news.

And now, to most normies who are like, "Huh, is it a DEI cyber attack? I'm not sure how that works. Is this a racially diverse virus? But look at this," they're like, this is insane. I just wanted to hear the news of what's going on. And you can't get that anymore from X. So it's an unfortunate, another indication that Twitter is truly dead, and I think we should stop calling it... We shouldn't use the term Twitter anymore. We should call it X, just like Elon wants us to, because it is a very, very different information environment than the Twitter that existed just a couple of years ago.

Evelyn Douek: Yeah, a true branding success story. He has fundamentally... A rebranding success story. He's fundamentally changed the product, and it is unrecognizable from what it was in the past. And I mean, the competitive advantage that Twitter had as well was the deep bench. Something like this would happen and you would have experts weighing in. I was still seeing this kind of tweet in the aftermath: bridge engineer here, here's my analysis. And in the past, I would've clicked on that and been super interested to see the professor of some engineering at blah, blah, blah, blah, blah university telling me all of this stuff I didn't know. Whereas now you see that and you go, "I have no idea if that's true." There's no way to verify or trust anything. And so it means you just kind of... I, at least, switch off from a lot of consuming information, because it's more damaging for me to consume information that I think is true and turns out to be false than to just sort of wait and try and learn things as they come out.

And instead, what you've got is conspiracy theory mad libs of, "What is the latest bugbear of people?" Oh, it's DEI. Let's insert DEI into this story somehow, in a very tortured way that makes absolutely no sense, but you know that it's going to get the likes and retweets because that's the topic du jour, the bugbear du jour, and it's going to attract that attention and then get monetized.

So in terms of... We've never had the best incentives in the public sphere in terms of what kinds of information and news get attention, but at least when it was attached to particular media brands and things like that, there was this interest in longevity and trust that still needed to be maintained, even if it was all based on eyeballs and attention grabbing and clicks. And all of that has just been totally destroyed. And yeah, I also feel really sad about the loss of what Twitter was in these kinds of moments.

And the other thing to say is that it's so much harder to even know what's going on on these platforms in these moments than it was. And with the turning off of the API and API access for researchers, we don't necessarily know what the dominant narratives are or what people are saying or what's getting attention. And that's why, again, great that CCDH won its lawsuit, and that kind of research at least is protected. But it does raise this question of how are we going to get visibility going forward?

Alex Stamos: But fortunately, X is the only platform that's turned off a really useful tool for researchers. Isn't that right, Evelyn? Nobody else would possibly do that in the middle of a campaign for the most important election of our lifetimes.

Evelyn Douek: Thank you, Alex, for that layup. I'm going to grab that ball and I don't know, I don't know the sports terminology, dunk it through the hoop or something.

Alex Stamos: Wow.

Evelyn Douek: Nailed it, right?

Alex Stamos: This is a great tryout to do the play-by-play [inaudible 00:29:27].

Evelyn Douek: Exactly. She dunked it through the hoop. What a dunk.

Alex Stamos: She grabbed the layup and dunked it, through the hoop, through the hoop, the dunk.

Evelyn Douek: Nailing it. It's totally seamless.

Alex Stamos: Grabbing the layup.

Evelyn Douek: None of our listeners... they're just going to be... their jaws are going to drop in awe at how seamlessly I picked up this layup to talk about the other major platform that has recently turned off API access. Alex can barely control himself, listeners. He's losing it. I have successfully tortured him enough.

So we've got a special interview with Brandon Silverman to talk about the other platform that is taking this step, and what a wonderful time to talk about the backing away from transparency, not just at X but across the industry, and Meta's announcement this week that it is finally... I mean, it's not a surprise, but it is the nail in the coffin: turning off CrowdTangle in August this year.

Brandon Silverman, thank you so much for joining us to talk about this important topic. So you are the co-founder and former CEO of CrowdTangle, and you continued to work on it after it was acquired by then-Facebook in 2016. And so I guess the most important place to start for our listeners, or the unacquainted, is: what is, or was, I guess, CrowdTangle?

Brandon Silverman: Well, great to be here. Thank you so much for inviting me. I'm a longtime listener, so just excited to be here. So CrowdTangle was a social analytics tool that was acquired by Facebook at the end of 2016. And we joined there and began mostly working with the news industry to give them a better window into what was happening on the platform. But within a few years, we had [inaudible 00:30:58] our scope to become one of the main ways Facebook was transparent with the entire outside world about what was happening. And so that included not just the news industry, but we began working with academics, researchers, civil rights groups, human rights organizations, et cetera. And our mission was to make Facebook the most transparent and collaborative platform in the world.

Evelyn Douek: Yeah. And I mean, I think it really was industry best practice for a long, long time for transparency into platforms. And if listeners know CrowdTangle, they probably know it from New York Times reporter Kevin Roose's Twitter account that tweeted out the top 10 links and got a lot of heat because those links were often right-wing content or inflammatory content, hardly a list of the most edifying material that one would want to find on the internet. But I suppose it's also important to talk about the other kinds of work that CrowdTangle enabled and the kinds of things that transparency can bring. So what are the benefits of this kind of tool?

Brandon Silverman: Yeah, we definitely have one of those dynamics. What is it? What do they say about the CIA? You only hear about it when they have failures, not their successes. So yeah, I mean, the vast, vast majority of our work had nothing to do with national reporters at national news outlets. Probably our single biggest cohort was local news organizations, and not just in the US, but in Europe and other places, because basically, between the years of probably 2013 and 2020, Facebook was one of the single most important ways to distribute content if you were in the news industry. And it was very hard to understand the flow of information inside of it.

And so what our tool did was make it really easy for, say, local news organizations to see, one, what was happening in your community. So we would have local news organizations who were following all the local public schools, the local nonprofits, the local politicians just to get a window into what was happening in their community, but then secondly, also to understand how to have their own content do well on the platform. In some cases, literally just to compare themselves to competitors.

So one, at the highest level, is just [inaudible 00:33:08] on the overall flow of information across the system. And it turned out that that was useful for a lot of different communities, including local news. Another really huge cohort was fact checking organizations. We worked with a lot of election protection groups, especially in the global majority, in places where there weren't necessarily a lot of social media staff to help support various content moderation roles. So there were actually a lot of use cases. I can give more examples, but they ended up being pretty diverse, which also was part of the problem at the end of the day.

Evelyn Douek: Right. Okay. So the problem, so the big news, the reason why we're talking to you now is that Meta has announced that it is shutting off CrowdTangle in August this year, which is great timing. Now, I guess I just want to know, as co-founder of this tool, how are you feeling? What's going on for you? How are you feeling about that? What does it mean for you?

Brandon Silverman: Yeah, I mean, listen, I think this is not the most surprising news in the world. I think the writing was on the wall for this for a long time. And also, just to pull back the [inaudible 00:34:13], also for myself, when you sell a company, I think one of the things you go through is a little bit of an emotional detachment from the thing you built and having complete ownership. And one of the really unique things that happened to us, which I actually give Facebook a ton of credit for, is we got to keep running it and building it and pursuing a lot of things we cared about inside Facebook for almost four years. And in fact, Meta gave us a ton of resources to go do that work. And I think one of the reasons there's been so much pushback since the announcement is partly because they actually invested in us and gave us so many resources for so long.

So all that's to say though, I think the thing that surprised me the most was that almost every month, almost every week that we went by, that we got farther into 2024, I kind of assumed that the announcement for shutting it down and moving to something else wouldn't happen until 2025. And so the fact that they got through essentially the entire primary process here in the US and waited until the general began, but also just with so many elections happening around the world, shutting it down just in the middle of the year around that, that genuinely surprised me.

Evelyn Douek: So just to home in on that a little bit, because there has been a bunch of outcry around this decision, especially just a few months before the election, the US election. But of course this is a huge year for elections around the world globally; some phenomenal number of people are going to the polls this year. And we'll talk about the alternative in a second, the reason why Meta says it isn't turning its back on transparency entirely. But what is it that these tools provide? What is the kind of information, what is it that the public gets from having this kind of access, that is important during an election and that we might miss out on?

Brandon Silverman: Yeah. So there's a handful of them, and I'll just go through a few use cases, and I'll base these on other elections I saw. As we began to shift our mission towards more transparency, our election work actually became one of the most important parts of our work. And so in the final two years we were at Meta, we got very in the weeds and on the ground using CrowdTangle to help support elections in Nigeria, Sri Lanka, Brazil, Myanmar, and we did a ton around US 2020. And just to go through a few examples of what that looked like: on the one hand, part of what you have is simply helping the news industry understand what the biggest narratives are around an election. And this is not part of any sort of jawboning or government trying to request takedowns, things like that.

This is simply helping the news industry understand what the stories are around an election so that they can cover them, respond to them, provide voting citizens with more information about that topic, et cetera. And there's kind of a fractal version of that. There's both what are the stories at the national level, but getting all the way down into local communities, and understanding what's the discussion around this election, what's the discussion around this candidate, and how can we help make sure [inaudible 00:37:10] an informed citizenry about this? So I'd say one really juicy piece is just having a more informed citizenry given the role of these platforms in hosting so much civic and political discourse.

A second one is also around violating content. And the two big buckets around that that we ran into the most around US 2020 were, one, claims that Meta and others thought might interfere with the voting process or the integrity of the election. And so obviously Trump and others were pushing a lot of narratives around mail-in ballots being some sort of unreliable, fraudulent thing, et cetera. And that was genuinely a threat to getting people to participate in the voting process, and was spreading through a lot of public discourse on the platforms.

And so one of the things we did actually is Meta made CrowdTangle available to every single local election administration official in all 50 states so that they could help track information that might be... And in some cases it wasn't even malicious actors. Sometimes it was somebody who had an influential account and they just put the wrong day, and this could help correct it sooner rather than later. The second bucket was around just general misinformation that violated the policies at some level. And when you have a system as large as Facebook and Instagram, it's very hard to catch everything, especially when it is moving quickly. The actors are changing, the behaviors are changing, content's changing, et cetera. So empowering some of the external community to also help play a role in this.

And I'll give one example. Common Cause, which is one of the longest-running civic advocacy groups in the country, used CrowdTangle, and they recruited several hundred volunteers from their member base. They were all given access to public live CrowdTangle displays from different communities and small towns around the country, and they would keep an eye on what the big stories were. And if there were ones that they thought Common Cause should come in and debunk, they would; in other cases, they actually found violating content and were able to flag it for Facebook, which would obviously decide for itself whether it was or wasn't.

But essentially, in some ways, the way I think about it is like open sourcing some of the content moderation around really important moments in a country's history. And then obviously you have fact-checkers as well. So those are some of the use cases. I could probably go on for even more, but yeah.

Evelyn Douek: Yeah, no, that's super helpful to get really concrete, and I guess it just highlights, with all of these countries going to the polls, that's the kind of thing that you would want visibility into. And I think it also really highlights two things. One is that you need this broad constituency of people. When you have a platform the size of Meta, or any platform, a centralized version of content moderation is only going to do so much. You need that "with many eyes, all bugs are shallow" kind of thing: if you have lots and lots of people looking at things, you're going to have a much healthier ecosystem. But it also depends on that infrastructure, on having those people. I guess this is a problem with the decline of local news and also civil society: you need to have those people with resources to do this kind of work. It can be really resource intensive.

And then also to have access to the tools. It's all well and good to have the transparency tool set up, but if they don't have access and if it's not broadly available, then it's not going to have the impacts that we want. Which I guess brings us to my next question, which is that there has been this outcry in the last week after Meta's announcement that it's turning off CrowdTangle, especially in such an election year, in August, with the US election just a few months later. And Meta's pushback has been, "No, no, no, you're misunderstanding. We're turning off CrowdTangle because we're devoting all of our resources into this new thing called the Meta Content Library, which is just going to be a much more powerful tool. It's going to do much, much more. So it's not that we're reducing transparency, we're just redirecting it to this different tool, and it's going to be even better." And so I'm curious for your thoughts on that. Is that true? What is the Meta Content Library going to do? How is it different, and should we all breathe a sigh of relief?

Brandon Silverman: Yeah, I mean, I have many thoughts. I think the highest-level one is that there are really promising things about the Content Library, but to make the claim that right now it's as good as or better than CrowdTangle, or, in my opinion, even to say that it will be by August 14th, feels very hard to defend, except maybe on a technicality. And I'm happy to go into all the reasons why, but I think a big part of it is exactly what you just said.

There is a huge difference between simply technically saying data is available over here and actually making data useful in a way where it drives impact and collaboration. And that might've been my single biggest learning from 10 years of doing this at CrowdTangle. It is not just about dumping data on people, which, by the way, they're also [inaudible 00:42:08] not exactly even what they're doing. But you have to make this stuff useful, especially if you're also trying to collaborate and work with civil society. So at the moment, there is a universe where this Content Library could get there, but it is not there right now. And I worry a lot that it won't be by August 14th either. And so that's my take on it at the moment.

Evelyn Douek: Okay, so let's get a little bit more specific then. So let's start with the ways in which it might be better. So what is it that Meta says it's going to do with the Content Library that will offer more than CrowdTangle did?

Brandon Silverman: Yeah, and I'll actually bucket this into three things. The way I think about this is there's data, there's usability, and there's access. Those are the three buckets I have been breaking this into. So on the data side, they have mostly been saying, "Hey, we have some additional data, therefore it's better and we expect it to be as useful to all these partners as CrowdTangle." And there are four areas where that is true. One is that CrowdTangle didn't have all public content. We had a subset of public content. We tried to grab the most important stuff, but it was not everything. This, according to them, is going to have everything. That's a huge deal.

Secondly, they have a few data points which I would've imagined we would've added by this point, but which weren't in there when we left two years ago. They're also great. So one is reach. That's a huge deal, especially for social scientists who want to be able to understand actual consumption patterns, et cetera. The other one is they have more formats than we had. So for instance, they have Reels and Events. Again, I would think we probably would've added that, but [inaudible 00:43:41] CrowdTangle, that's awesome and a huge deal.

And they have also publicly said they're going to add comments. That had been something we were working on but also hadn't gotten over the finish line. So that's another... That's awesome, and I am super excited for so many of our partners to be able to have that. I think the challenge is that while they added some of those things, in some cases they also took things away; it does not have some of the data that's in CrowdTangle. And then there are some real trade-offs to, I think, some of the decisions they made around the architecture of adding all that data. By the way, I feel a little bit like... I feel like Mark doing his analysis of the Apple Vision Pro and comparing it to the Oculus Quest.

Evelyn Douek: No, you're being very generous.

Brandon Silverman: Oh, yeah. Great. [inaudible 00:44:27]. One is they're being super, super restrictive on what you can do with the data and how you access it, way more than we were at CrowdTangle. And that has really important implications for how much civil society can use this, and in some cases, how much even academics can get out of it. So for instance, there's a 30-day data retention policy, which means any queries you set up, you essentially have to delete every 30 days and start rerunning again. That really knocks out a huge amount of research people might be able to do.

Second, even for civil society groups, the search query inside their interface is essentially non-deterministic. Essentially, when they can't go get everything, they sample the results. And so if you do a search for voter fraud, you can run it once, and if you run it again, you might get a different set of results. That was never true in CrowdTangle. We searched our entire database. We ranked every single result by the exact set of parameters you gave us, and that was it. That's a huge deal.

There's also... And I keep going. One of our biggest use cases was what we called a URL search. You could put a URL in the search box and we would show you all of the pages and accounts that had posted that URL, and it's basically this really powerful way to understand how links are moving through the information system. I don't think that's available in this system.

And I'll give one last one, just to really hammer home this idea that there's some more data but in other cases there is less data. CrowdTangle's fundamental architecture was a little bit like the Internet Archive's Wayback Machine. It was an archival approach: we would snapshot things, save the snapshot, et cetera. And so one of the things you could do is, if there was some new page or new public group that was suddenly starting to show up in all your results, you could go look up the history of that account and see essentially when it was started and how its follower count grew over time.

And that was really fascinating information, because one of the things we saw, just to go back to your example of how this actually plays out, is that some malicious actors, what they did in 2020, is they would start pages full of totally anodyne meme content, cats or whatever, cute dogs, and they would grow a huge fan base. Then, 30 days out from an election, they would switch and start pushing a bunch of political content, and they would sometimes change their name, et cetera.

There was a whole Harry Potter one that built a whole network off of Harry Potter and then switched. So misinformation and disinformation researchers could go in and look at the history of these pages, and it was really fascinating. Something else you could see was huge abnormal spikes, which meant they had bought all their followers. Anyway, that's another whole data point that is no longer going to be available in this. So I think them saying there's more data, therefore it's better, is a little bit of a disingenuous way to describe it.

So the other two... And I'll go a little quicker on these. On usability, there's probably an even bigger delta, and that's probably the single biggest delta and, I think, in some ways the one that matters the most: there's just much less of a user-friendly interface for a lot of partners. And I'll just go through a few concrete examples. We had a whole interface called Intelligence where you could go look up accounts, you could go look up groups of accounts. So if you wanted to compare every single Democratic member of Congress to every single Republican member of Congress, you could compare their entire activity over time, their growth over time.

And we actually had a Ukrainian academic researcher at MEMO 98 who was beginning to pick up on changing political dynamics in Ukraine and was using that exact functionality for years, and produced really great research. That entire functionality they haven't even attempted to recreate. So that's just gone, and I think it was a really important tool.

Second, I mean, just to get really specific: if you do a post search, let's say you look up voter fraud or Donald Trump or Joe Biden, it'll give you, say, 10,000 posts. You can't click on a post and go visit it.

Evelyn Douek: What do you get then?

Brandon Silverman: You can see the post, but if you want to go visit the actual one...

Evelyn Douek: Okay, I see.

Brandon Silverman: ... there's no link to the actual post. And the reason, I think, is that they consider the URL a privacy-sensitive thing, and so... There are literally no links to the posts, which can be very problematic for lots of reasons. And another one, just to go through this interface: there's no ability to collaborate with team members inside these interfaces. So let's say you build up a bunch of queries, a bunch of lists you want to track. In our system, you would sometimes have five, six, a dozen. I mean, we had some accounts with hundreds of people who were all using the same shared team interface to monitor the things that are important to their work. Here, it's all individual. And that's a big deal.

And I'll say the very last one, and I could go on about all of these, is the access piece. And the strength on this side is that Meta is taking a much better approach than some other companies: instead of limiting this entirely to academically affiliated organizations, which I think is where they were in the beginning days of the Content Library, they have now begun to expand it to fact-checking organizations, nonprofit news outlets, and also some election protection groups. That's awesome, and I think they deserve credit. I think the challenge, if they're going to say it's as good right now, is that the elections are happening right now. So it's not just who can get access; it's who has access right now, and how quickly are they going to onboard all of these organizations?

There were almost 200 election-related organizations who signed this letter that Mozilla wrote, worried about this. And my question is how many of them are going to get access, and when? Also, my experience is that onboarding people onto tools like this is not just about sending a link; it's about them learning it, getting familiar with it, customizing the setup. And so if they all get on there by August of 2025, we'll have missed the point. So for me, it's not about technical specifications, it's about access, and when are they actually going to get it? And I think that's the area where, if they move quickly, it can get there. I just hope they do. It's also probably worth noting, the Mozilla letter didn't focus on this, but they're not letting any for-profit news outlets into this.

Evelyn Douek: I was about to ask you, that was an obvious omission from the list you just gave us that no-

Brandon Silverm...: Yeah, and I think this one hasn't [inaudible 00:51:08] attention, but yeah. I mean, they're essentially... Again, if they are the ones publicly out there saying, we think this is better, what they're also doing is telling the entire for-profit news industry, we're not going to make this available to you. And I understand. I mean, I was in the weeds on some of the complications around that, but for me, I think of the fourth estate as a critical part of liberal democracies. And yes, they get political sometimes, and there are complications to it, but shutting them off from being able to understand what information ecosystems look like, how to report on them, how to shape and guide them, that feels like a really big loss.

Evelyn Douek: Yeah, I mean that's a huge omission. I love academics. I am an academic. Some of my best friends are academics. Like really, really important work.

Brandon Silverm...: [inaudible 00:51:59] academic.

Evelyn Douek: Right, exactly. So no knock on academics, but we are one part of an ecosystem of research, and we often piggyback on the work of journalists who do frontline, real-time breaking news, especially in elections. I mean, really important work about the election comes out months and months later from academics doing in-depth analysis, and that's really important. But as things unfold, the kinds of things that you need come not just from nonprofit news outlets, although those are also very important, but from the media. You need the media. And yes, there are going to be lots of different takes, and there are going to be people that contort things or make claims that can't be supported, but that's free speech. That's how we do this: we have people engaging in that kind of analysis and then we have a conversation about it. So that's huge and extremely disappointing.

Brandon Silverm...: Totally. I mean, one of the intellectual journeys I have been on over the last two years, an area where I've really changed my mind, is that I used to think about transparency and data access as having essentially two main constituencies: academics, and then civil society, including the news industry. And I used to think about it... I'm a nerd, married to an academic. I mostly thought the academics were going to bring 75% of the value, and civil society was going to provide 25%.

I've now essentially flipped that, if not made the ratio even more lopsided. Because, I mean, if you followed the US 2020 research program with Meta, it was a collaboration they did with a bunch of academics to study the US 2020 election and all these questions around it. It had literally some of the best social scientists in the world helping shape these questions. Meta put a ton of resources into it. Everybody had a huge vested interest in making it successful. Four out of what are going to be 18 papers have come out so far. So 14 out of the 18 are probably going to come out after the next election cycle. That's [inaudible 00:54:01].

And I mean, again, I know a lot of the people involved. I respect the hell out of them, but it's just too slow. And even if you cut that in half and it had been two years, that might've also been too slow. And so that's why I love the new Stanford journal that has been trying to speed up some of the peer-review process around this. I think that can help solve it. But I've really come around to this idea that a lot of this data access stuff is about what I call civil society observability, which is making these platforms observable to community and civil society experts, as well as the news industry, to almost play an exploratory role in helping see what's happening, which other mechanisms can then do deeper dives on. But if it's going to take more than a year, the platforms have probably changed enough that I'm not even sure how much the results matter.

Evelyn Douek: Okay. So that's super useful in debunking and being really specific about why the Content Library is not the hoped-for replacement for CrowdTangle yet.

Brandon Silverm...: It has the potential.

Evelyn Douek: Right. But it's not there yet, and we're in the middle of this big election year, even if it will get there in the future once these things are rectified. And part of that is having people criticize these deficiencies and hoping that Meta responds.

But I guess zooming out a little bit and putting this in context, we are seeing this turn against transparency across the industry. Twitter/X famously turned off API access. Where there was this broader trend of platforms offering more and more data, we're now seeing platforms renege on that, in part because they didn't get a lot of reputational benefit from doing it. It allowed a lot of criticism, and it didn't end up being the boon they hoped for from just being good citizens providing this insight.

The reason we are still getting replacements like the Content Library is not necessarily because Meta is a wonderful, beneficent actor in this space, but because we are seeing regulators step into the breach and try to stop this turn against transparency. The Content Library itself is a product of Europe's Digital Services Act, which has a provision that requires data access. And I guess a couple of questions about that. The first one is: are these deficiencies somewhat a product of Meta box-ticking? That is, complying with the formalities of the DSA, which is rather vague and broadly worded, making sure it's not breaching its obligations, but not really going any further. If it doesn't need to provide access to news organizations, then maybe it's not going to. So how much of this is regulatory design flaws, and how much of this is just that we need to see regulation pushing further to make sure the platforms stay open?

Brandon Silverm...: My sense is it's probably 50-50. I think some of it is uncertainty about what exactly is required by the regulation, and I can talk about that in more detail. The other 50, honestly, is that in my experience there is rarely senior-level leadership who really feels ownership and excitement about building these solutions and tools at companies. And so I think part of the reason for the deficiencies is there wasn't anybody who really understood the full breadth of the use cases and the value, and why the interface mattered so much, in a way where they could say, "Hey, we can't ship this until these usability things are up to par." There's a more cynical possible interpretation of that, but I don't think that's true. I generally think they probably didn't fully understand the implications when they made some of these decisions.

And then, yes, I think one of the things we need is for the European Commission to start providing some more guidance, and I think we're roughly on a predictable timeline for this, around what they expect when it comes to Article 40.12 and whether they consider this sort of program in full compliance or not. And I can point to at least one way in which it's not fully compliant that I feel very legally grounded on. But one other thing we're seeing is all of the other industry efforts to come into compliance, how they compare and contrast, and then what the Commission is going to think about that, and where they're going to come down in terms of enforcement, additional guidance, et cetera.

Evelyn Douek: Okay. Well, I can't not take that teaser. What's the way you think that it's obviously deficient in terms of legal compliance?

Brandon Silverm...: Yeah. Well, so there is one prominent data point they have not included in the Content Library. It was a data point I tried for a long time to get into CrowdTangle and couldn't get over the line, but it's the labels that the platform attaches to different sorts of public content. When I was there, it was authoritative voter information as well as fact-checking labels. Now we're also likely going to see watermarking labels. And so if you are doing any sort of research on public data, and that label is on the piece of content you're looking at, you may see it in the interface, though I'm not even 100% sure that's true, but it's not coming through the API. As far as I know, it's not in any of the downloadable data, et cetera. And that's a publicly available piece of content. I think it's very important to a bunch of the systemic risks that are identified in the DSA. And as far as I can tell, there's no defensible reason not to include that data.

So I think that's one. But yeah, I mean, I just think we're in this best of times and worst of times moment, where all of the voluntary efforts are now being wound down. And some of those were really good. Especially on Twitter, that's been incredibly disruptive to a lot of people who used that data. But the best of times part, and I've talked about this a little bit, but I don't think it's been fully appreciated, is that there are a lot of platforms quietly trying to come into compliance with 40.12 and releasing programs that they never have in the history of their companies. So you can now apply for access to look at public content on Alibaba, you can do it on LinkedIn, on TikTok, et cetera. There are over a dozen platforms that have now launched public data research programs.

And by the way, one [inaudible 01:00:17] we understand it is that they rarely announce it. They will just change the terms of service and put up a form. And so literally only if you have a Google alert for terms of service changes do you find out about these things. But they're all starting to come online. And [inaudible 01:00:33] I'm an optimist. I think there's a universe in which we're in the first inning of permanent data infrastructure that is no longer going to rely on the voluntary interest of platforms, but will be around, be enforceable, and will exist for the entire industry, not one or two platforms. And I think that could be a huge deal. I mean, to make that really concrete:

Alphabet, Google, now has a form where you can go and apply to scrape Google for public interest purposes. So the largest scraper in the history of the internet, which has never allowed you to scrape anything it does, is now [inaudible 01:01:21] required to make that available. I mean, that is a huge deal. The challenge, of course, is that right now, as it's written, the platforms have way too much say in deciding all of the details of these programs. And so they're all taking slightly different approaches. And so a lot of the work we have to do in civil society, on behalf of consumers, is to try to figure out how to negotiate some of that power back, or try and create more industry norms, so that these aren't just, as you said, box-ticking in the smallest possible way. Which, by the way, we've already seen X do: they put up a form to apply for this, and no reputable researcher in the world has gotten through it, as far as I can tell so far.
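A minimal sketch of the terms-of-service monitoring Brandon describes as the only practical way to notice these quiet launches. The URL and state file below are hypothetical placeholders, and the approach is just a hash comparison of the page contents, not any platform's actual tooling.

# Minimal sketch: poll a terms-of-service page and flag when it changes.
# TOS_URL and STATE_FILE are placeholders; point TOS_URL at any public ToS page.
import hashlib
import urllib.request

TOS_URL = "https://example.com/legal/terms-of-service"  # hypothetical URL
STATE_FILE = "tos_hash.txt"

def fetch_hash(url: str) -> str:
    """Download the page and return a SHA-256 digest of its raw bytes."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def check_for_change() -> None:
    new_hash = fetch_hash(TOS_URL)
    try:
        with open(STATE_FILE) as f:
            old_hash = f.read().strip()
    except FileNotFoundError:
        old_hash = None  # first run: nothing to compare against yet
    if old_hash is not None and old_hash != new_hash:
        print("Terms of service changed -- check for a new researcher access form.")
    with open(STATE_FILE, "w") as f:
        f.write(new_hash)

if __name__ == "__main__":
    check_for_change()

Run from a daily cron job or scheduled task, a raw hash is crude (it will also fire on trivial markup changes), but it approximates the Google alert workaround described above.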

Evelyn Douek: Right. I'm shocked, I'm shocked. So, so surprised to hear this. Yeah, right. Yes. I mean, that's great. I love an optimistic note, and it is an incredible moment. All of those examples are super, super useful. The proof will be in the pudding, and exactly as you say, with all of these application processes going up, we'll see how many people get access and what that access looks like.

I guess one last question before we close out that I'm curious about: this is a result of the European Union pushing forward and creating this infrastructure and this requirement, but is that sufficient? We are not seeing similar... There have been bills floating around, but there's not, at the moment, progress in getting a similar thing happening in the US. And given that we're talking about global elections and global information, how much can we all just piggyback on Europe's progress here in opening up these avenues, and how much of it is going to be...? Is the data restricted? How much information can global researchers get for their relevant markets as a result of this EU action?

Brandon Silverm...: First, one very specific thing, and we can put this in the show notes: George Washington University has this amazing tracker of all of the application processes, all of the details, and it includes a specific question of whether there is a Brussels effect. For platforms coming into compliance with this European regulation, are they also making it available to other researchers and other topics? Right now, it's mixed. Some are, some aren't. And the way Europe defines it, actually, is you have to be researching something that they consider a systemic risk to Europe. But in a lot of cases, you don't necessarily have to be a Europe-based researcher, is one interpretation of the language, but the platforms are all taking different approaches. So I think if you are in another country, there's no guarantee that this is going to be available where you are and for the topics you care about.

I mean, one fascinating thing is I've probably talked to more regulators in the last two weeks than I have in maybe the last two years, and I've done a lot of that work. So I think there's a lot of interest in this at the moment. But listen, I think for a while we didn't know how to craft some of these transparency laws, to be very specific. And so one of the really big benefits of 40.12, and of watching how platforms come into compliance, is that we're learning how to write these well. I think there's going to be a huge benefit to that. And there are a lot of countries... In the US, somewhere in the middle, there's interest, but I'm not sure how real the interest is among all parties. So I'm not super optimistic about something happening in the US at the federal legislative level, but there's certainly interest at the state level.

And if we had more international industry standards and norms we could point to, versus having states get into all the details themselves, that, I think, starts to open things up. And then there's certainly... I mean, listen, I think New Zealand is very interested in this topic. Canada is very interested in this topic. There are other countries that I think would also have an interest in this. One of the things I've been increasingly working on, and that Rebekah Tromble, Anna Lenhart, and some others have been helping lead, is helping spin up, basically, an international standards body that could build off of what's happening in Europe and create standards that different legislative jurisdictions around the world could essentially build off of.

I'll say two last things. Europe still has a lot of work to do; there's still just a ton of work in the implementation to get this right going forward. But I think they have proven that there's a model that has real impact, and I think there are a lot of legislators paying attention and watching to see how they can potentially build on it.

Evelyn Douek: That's great. I love it. We had you on to talk about the death of CrowdTangle and the real loss that we're feeling, but we're ending on a real optimistic note about how we might be on the cusp potentially maybe. You did a big... For the listeners, Brandon just did a big shrug of, I don't know, we'll see how it works out, but we could be on the cusp of something really exciting...

Brandon Silverm...: Totally.

Evelyn Douek: ... in terms of having visibility into these previously really opaque ecosystems, and that could be great, but a lot of TBD in terms of the details.

All right, so thank you, Brandon. And with that to close out, of course, we should allow Alex to have an actual sports update to redeem ourselves for our sports listeners who are still trembling in pain as a result of my butchering of the lingo earlier. So Alex, what's going on in the sports world this week?

Alex Stamos: Well, one of the teams that grabbed the layup and dunked the ball through the hoop would be our students on the Stanford women's basketball team, who defeated Iowa State and are now on to the next round, the round of eight in the Women's NCAA Championship. They'll be playing North Carolina State tonight. So it's still possible for them to make the finals, and we'll have to do special coverage of that. Maybe we'll do a live one. We can watch it together and do a play-by-play. That could be a special podcast episode for people to play along simultaneously.

Evelyn Douek: That'd be good content. I've been telling people how silly I felt this week because you invited me. You said we should go to a game. And I was super excited, but I'm traveling at the moment. I said, "Oh, I'll be back in April. Can we go then?" To which he replied, "It's called March Madness, Evelyn." So fair point.

Alex Stamos: I mean, technically there are a handful of games in April, but it is the finals, and it would be great. I guess, we could go to the women's championship in Cleveland for your absolute... Your first time in Cleveland and your first basketball game could be going to the women's finals.

Evelyn Douek: I mean...

Alex Stamos: How about let's watch it on TV...

Evelyn Douek: ... okay, sounds good.

Alex Stamos: ... together, and then we can record something? Okay, perfect. Anyway, good luck to our friends on the Cardinal. Remember, it's Stanford. It's not the Cardinals, it's the Cardinal, the color, which is incredibly stupid. But hey, good job. It's not the fault of those wonderful student athletes.

Evelyn Douek: What is that? Oh, and our mascot is a zombified-looking tree thing, right? Is that correct?

Alex Stamos: Well, so that's not actually the mascot of the school. The school does not have an official mascot. That's the mascot of the band. So there's a [inaudible 01:08:08] interesting history here, but Stanford used to be the Stanford Indians.

Evelyn Douek: Can't see any problems with that. That's great. Yeah.

Alex Stamos: And so my understanding is there's actually a vote by the student body of who they wanted their mascot to be, the university being the Leland Stanford Junior University being named after the son of the creator of the Southern Pacific Railroad, and one of the railroad barons wanted to be called the Stanford Robber Barons, which would've been a awesome name. The administration did not agree, and so decided to call the school the Cardinal, which is kind of a play on Harvard Crimson. So it's like back in the '70s, Stanford wanted to be the Harvard of the West, and now Harvard wants to be the Stanford of the East kind of thing. But anyway, it was an unfortunate naming of it is a color, and so the university does not have a mascot. The tree is the band's mascot. It is the mascot of the Leland Stanford Junior University marching Band.

Evelyn Douek: And so you're telling me if I go to a game, I'm not going to see a giant bird jumping around in celebration?

Alex Stamos: No, it's not cardinal like the bird.

Evelyn Douek: That's ridiculous.

Alex Stamos: It's cardinal the color, so it's like [inaudible 01:09:11].

Evelyn Douek: So it's just a red?

Alex Stamos: I guess you could have an anthropomorphized Pantone square, like...

Evelyn Douek: I actually love that.

Alex Stamos: ... when you're going to Home Depot.

Evelyn Douek: I love that for us. That's wonderful.

Alex Stamos: Yeah, we should actually suggest that to the band. That would be really funny. Like a color strip from Home Depot, but humongous, with arms and a face. Yeah.

Evelyn Douek: So much character. There you go. Just throwing out these great ideas to solve the university's branding scandal. Sounds good.

Okay, well with that, this has been your Moderated Content weekly update. This show is available in all the usual places, and show notes and transcripts are available at law.stanford.edu/moderatedcontent. If you're feeling inspired, please leave us a rating or review wherever you happen to listen to us. And as always, this episode is produced by the wonderful Brian Pelletier. Special thanks to Justin Fu and Rob Huffman. Talk to you next week.