This week, Evelyn and Alex went long on one of the weirdest tech reporting stories they've ever heard. Last week, The Wire—an Indian news outlet—released a bombshell report alleging it had seen internal Instagram documents evidencing improper influence by the Indian government over Meta. Meta denied the reports and what followed was a back and forth that got stranger and stranger as the saga went on...
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.
Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.
Like what you heard? Don’t forget to subscribe and share the podcast with friends!
Evelyn Douek:
We both talk too fast, which is not a surprise.
Alex Stamos:
I'm going to work on that.
Evelyn Douek:
Yeah, I think we feed into each other.
Alex Stamos:
But I think we accelerate each other.
Evelyn Douek:
Exactly. It's just a bad combination.
Welcome to the Weekly News Hit episode of Moderated Content with myself, Evelyn Douek, and Alex Stamos, director of the Stanford Internet Observatory. Two people who are temperamentally immoderate, so it's always going to be fun to see where this heads. Alex, we have to talk about one of the craziest stories in tech reporting I think I've ever heard, which is this fiasco over some reporting by the Indian news outlet The Wire and Meta's ongoing rebuttals and the back and forth. It's been a bit of a tennis match at this point. I think this is probably going to take us a fair chunk of this episode, which is breaking the model, which was supposed to be quick hits of the news, but I think it's probably sufficiently important for us to spend some good time on it.
Alex Stamos:
Yeah, I do think it is by far the most interesting story in a long time in this area, but it also indicates some really important things that we've got to talk about.
Evelyn Douek:
Yeah, right. So it started only a week ago on October 10-
Alex Stamos:
Which feels like a lifetime.
Evelyn Douek:
It really does.
Alex Stamos:
[inaudible 00:01:10].
Evelyn Douek:
You are going to be amazed at how much we can fit into this week. So The Wire published a story claiming that it had seen an internal Instagram report that made clear that an official from India's ruling party, the BJP, had special privileges that allowed him to report pieces of content to Instagram and have them taken down automatically without any internal review. Now, the story at this point was already a little weird because it claimed that privilege was part of Meta's cross-check system, which there had been a lot of controversy around. This was part of the Facebook Papers, the Facebook Files as well. But that cross-check system normally means that someone gets additional review before their content is taken down, to avoid false positives on really high profile accounts. So it's not normally the person reporting content who gets special privileges.
But at the same time, as I tweeted about this story, there is some history of Meta having uncomfortable relationships with the Indian government, and it's such an important market that this was a red flag and something to watch at this point.
Alex Stamos:
Yeah. So when the story first came out, I was suspicious about the details, but I agreed with the overall thrust, which is that Facebook/Meta has definitely been manipulated by the current Indian government a number of times. So for those of you who don't follow Indian politics, a party called the BJP is in power. They are a Hindu nationalist party. So they're very much about Hindu rights and are kind of famously not respectful of religious minorities, especially Muslims, in India. And so there has definitely been a push by the BJP over the years to try to get American tech companies to slant their content moderation in a way that is beneficial to the current ruling party. And there's a number of well-documented incidents of that, including the person running Facebook India being very close to the Prime Minister, as well as some scandals where accounts were shielded from being taken down even though they were violating policies.
So when I saw the story, I'm like, "That's interesting." Now, the thing that was originally weird was the account that they're talking about that had its content taken down is not the Instagram account of a Congress party politician or some big journalist. It was a private meme account with a couple hundred followers that was posting spicy memes. And so initially the first thing that hit me was: why would the BJP even care about this? And certainly I don't see Facebook writing an incident report on a single takedown of a spicy meme from a guy with a couple hundred followers. It's just not something the company cares about, to be honest. And so that was my initial thought when the first reporting happened.
Evelyn Douek:
So then later that day, Meta spokesperson Andy Stone denied the report in some tweets, saying that that's not how cross-check worked, and also that the underlying documentation appeared to be fabricated. So that's a pretty explosive thing, and people start watching. What was your reaction to that, Alex? I mean, you already had your alert radar on.
Alex Stamos:
Right. My alert radar was definitely on, and for them to straight up deny it is... Facebook will dissemble and spin a lot, but it is extremely rare for the company to just straight up deny something that is true. For example, the Haugen documents, which were a much bigger deal and had a bunch of content in them, I've been told by multiple reporters that Facebook didn't deny a single one of those documents. So this would be a real change in tactics for them to just straight up lie to the media instead of just trying to spin it. So yes, that immediate counter-push from Meta, instead of either hiding or trying to spin it, I think, was pretty interesting.
Evelyn Douek:
The next day then, on October 11, The Wire published a new story claiming that Stone's public comments were in stark contrast to an internal email that he had sent to a group of Meta employees asking how the hell the document had been leaked. And included in this story was a screenshot of that email. So, again, this is escalating now. We don't only have this explosive internal Instagram report about relationships with the Indian government, but now we have these reports about Meta just lying to the media. And that plays into a lot of people's priors too about how Facebook interacts with the media and spins PR.
Alex Stamos:
Right. Okay, so this is where I got pulled in and this entire thing started taking over my life. My entire life is before and after the Andy Stone email. That's how, unfortunately, I can measure it. Because a reporter sent this to me and said, "Does this look weird to you? Do you think this is right?" And so I ended up replying to that reporter and then also doing a quick tweet, which then got me sucked into this vortex of time suck.
This quote, unquote, "leaked email" that was supposed to be sent by Andy Stone looked really odd to me on a couple of levels. One, Andy Stone is a political operative. He grew up in New Hampshire, he went to George Washington University, and he worked for the Democratic Party for a long period of time, including the Kerry campaign. Somebody like that does not post super incriminating stuff... Facebook is involved in two dozen investigations or litigations at any moment. Andy Stone's email is being pulled out of the e-discovery archive at Meta every single day by a lawyer, by outside counsel. Somebody like that does not put incriminating evidence in corporate email, because it will inevitably end up leaking or being in discovery for a lawsuit.
Evelyn Douek:
Right. 50% of Andy's emails are like, "Let's take this offline," and, "Do you have time for a phone call?" Right?
Alex Stamos:
"Yeah, why don't you call me?" And then you call him and then he screams at you with a bunch of obscenities of why the hell did you put that in email? So for him to say, "Yes, we are bad," in email, was just totally not consistent. That's something you'll see from a junior engineer or something. But for a political operative who now works at Meta comms, there's just no way they make that mistake.
The second is when you read the language, it does not sound like how somebody from New Hampshire uses the English language. "How the hell link got leaked? Who is the reporter not on our watch list? And why didn't anyone of you bother to link me up?" Even in a hastily typed email, this is not how most Americans write emails. And there's a couple of others, like "Send me an activity record of the document for last one month." It just sounded wrong to a lot of people, and I think this is where the wheels started to come off, because even without any kind of technical analysis, just looking at that email, anybody who is conversant with American English, including a bunch of independent Indian reporters, started to ask questions.
Evelyn Douek:
So it wasn't just you jumping to Meta's defense and raising red flags at this point. Meta then released a bunch more denials. Meta's CISO Guy Rosen tweeted, "The TLDR is that all of these stories are fabrications. There's no such report and there's no such email." And one of the issues in the email was the screenshot had Andy's email as @facebook.com, but Guy Rosen said this isn't even Andy's current email, because they now use @meta.com.
Whether this email was real now becomes the centerpiece of the story. And this is still not over, because then The Wire released another story saying that they stood by their reporting and had evidence that the internal email was real. I'm going to hand it over to you, Alex, at this point, because the technical stuff gets a little bit beyond me. So explain it to me, I have no technical knowledge, just hypothetically: what happened next?
Alex Stamos:
So The Wire posted a big article, and this is now on the weekend, that says, "Here is our evidence that all of this is real." And they've got a bunch of different arguments in it. First is about fb.com versus meta.com. I would just rule that whole thing inconclusive, because it is true that you can email anybody at Facebook at fb.com and you randomly get emails back from meta.com and fb.com. Facebook uses Office 365, so Microsoft hosts Facebook's email, and it is clear that their Office 365 instance is configured to have both domains and that different people have been moved over, so their default sending domain differs. So I think we can discard that whole one, because it's actually not a strong argument from Facebook or from The Wire. But then they go into a bunch more detail and they put a bunch of stuff up there.
So when a bunch of this stuff happened, one of the things that I and other people threw out was that in other situations where emails have leaked, journalists have been able to verify that it was a real email by checking what's called the DKIM signature. So DKIM is DomainKeys Identified Mail, and it is an anti-spam feature where a sending email server, sending email across the internet, will create a digital signature over certain parts of the email and put it in the header. And that digital signature can be verified with a key that the sending domain publishes to the entire internet via DNS. So in theory, any receiving email server can look at the DKIM header, say, "Give me the DKIM key for fb.com," and then check to see whether the signature matches. This is not a hundred percent, for a variety of reasons, and we don't have time to talk about all the complexities of DKIM, but it's a pretty strong way to verify email.
A famous use of this was verifying Hunter Biden's emails on his laptop, well after the initial stories: a bunch of those emails were signed with DKIM keys that prove they were really emails he sent at the time, or sent by somebody who had control of his email account. I and a number of other people said, "Well, you could have experts verify the DKIM." To do that, you need the full email with all the headers, so you can't just forward it; you have to export the email from your mail system. If you're using Outlook, you can just drag it to your desktop and that'll have everything in it. Or in Gmail you can say, "View source," and you can see all the content. You have to have all of that to do the verification.
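To give a sense of what that check actually involves once you have the raw message, here is a minimal sketch, not The Wire's or anyone's actual process, using the open-source dkimpy Python library; the file name is just a placeholder for an exported email with its headers intact.

```python
# A minimal sketch: checking a DKIM signature on an exported raw email
# using the open-source dkimpy library (pip install dkimpy).
# "leaked.eml" is a hypothetical file name for a message exported with
# all of its headers intact.
import dkim

with open("leaked.eml", "rb") as f:
    raw_message = f.read()

# dkim.verify() reads the DKIM-Signature header, fetches the signing
# domain's public key from DNS, and checks the signature over the
# signed headers and body.
if dkim.verify(raw_message):
    print("DKIM signature verifies against the domain's published key")
else:
    print("DKIM verification failed or no valid signature present")
```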
So instead of sending that to some kind of third-party security expert to look at, what they do is they release a video of a tech employee at The Wire. I believe I know who it is, but I'm not going to use any names because this stuff's getting really personal in the Indian press right now. But it looks like a tech employee at The Wire talking through his verification of DKIM. And the problem is, one, they blur out a ton of stuff when they do it to, quote, unquote, "protect the source," which is not a great argument already. And then second, the way they do it would be easily fakeable, right?
So if you sent me an email and I verified that it was DKIM signed, I would have high confidence. If you show me a video of you doing it, all I see is you typing on a screen and then it comes back and says, "Yeah, everything's fine." And that is trivial to fake. I pointed this out, and a number of other people pointed out that it would be trivial to fake, and then other people actually just did it. They wrote scripts that faked a DKIM check. It takes 90 seconds to fake that video.
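To make that concrete, a fake "verifier" can be as simple as the sketch below: a script that always reports success, with a made-up file name and domain, whose terminal output on a screen recording looks just like a real check.

```python
# Illustrative only: a fake "verifier" that always reports success.
# Nothing here checks anything, which is exactly why a video of
# terminal output is not evidence that a verification happened.
import time

print("Checking DKIM signature for message.eml ...")
time.sleep(2)  # pretend to do work for the camera
print("DKIM signature: PASS (d=fb.com)")
```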
And then what they did was they said, "Well, we had two outside experts support how we did this." This is where it really starts to get bad for The Wire, because what they do is they then post these emails from these outside experts with their identities blocked out. But the first time The Wire posts these emails, the dates are off by one year, October 2021, while the days of the week are correct: Friday, Thursday, Wednesday, for example. And then, after people point out that the dates on the emails are wrong, they silently change it on the website. Later, when people point out that their change was silent, they add a note that says, "Well, we got the dates wrong on our computer when we received the email; we've redone the screenshots."
So this falls apart in a bunch of different ways. One, the things that look like screenshots don't really look like screenshots. If they are screenshots, they're from Superhuman on macOS. The excuse that the gentleman from The Wire who made the screenshots gave was that he had installed Tails OS, which is a secure Linux-based operating system that uses Tor, and that his clock was wrong when he read the email. Two problems with that. One, if your clock's off by a year, then Tor doesn't work, so Tails would not work at all, and TLS won't work. So basically all the stuff he said he did would've been impossible if he had his local clock off by a year. Second, it doesn't look like Tails. It looks like the window chrome from macOS.
And so he then comes up, and there's this big conversation between him and David Thiel, who's the CTO here at SIO and also a former Facebook employee like me, where David asks these questions and digs deeper and deeper and deeper, and he comes up with this crazy story: he installed Tails, he set the clock wrong on Tails, and then on top of Tails he installed an emulation layer that allows Linux to run macOS software.
And then he installs Mac Chrome, and he installs the Superhuman plugin into Mac Chrome. And all of this is his explanation, which does not explain why the day of the week would be wrong. If your clock is off by a year, your computer shows the day of the week for that wrong date, so it says it's a Thursday instead of a Friday. Instead of this crazy explanation, the most likely explanation is that those screenshots were faked. Now, that wasn't the actual Andy Stone email, it was the verifier's email, but a number of people point this out and he digs himself a deeper and deeper and deeper hole. Eventually they release an email from one of these guys, who turns out to be a solutions engineer at Microsoft Singapore, who then later asked for his name to be taken off the story, probably because he was getting a lot of abuse and because it looks like his friend pulled him into something that he should not have been part of. So, yes, it starts to get really crazy at this point.
Evelyn Douek:
I think I've got about 75% of that, which is good work, Alex. But I can tell from what I did get, and from your increasingly exasperated expression as you talked through it, that this is pretty wild at this point and it's really not good.
Alex Stamos:
Right. And so I wrote a thread saying that if The Wire's telling the truth, about 10 unbelievable things have to be true; otherwise they're lying. And you have to apply Occam's razor at this point: are all of these crazy excuses accurate explanations for why these things are wrong? Because then there's another part of the story, which is this Instagram video. Do you want to describe the video that they showed?
Evelyn Douek:
No, you go for it.
Alex Stamos:
Okay. So another part of this rebuttal story is a video, which they said was a live recording of a Facebook employee's desktop while they log into Workplace. Workplace is the enterprise version of Facebook; it is the intranet that Facebook uses for a lot of stuff. And then they open up the incident report about this Instagram account's content being taken down, which is what they said was the incriminating evidence in the beginning. This is an example of them getting to it.
So it's a video of somebody logging in, going through the real login steps that you would go to log into Facebook's internal Workplace. Except, one, it's missing two factor authentication. Two, it's missing a client certificate, which is something that is required to log in to Facebook's intranet, and it's in a private mode of Safari, which would not send the client certificate. So that all is not accurate.
And then there's a jump. They have a timer up at the top of the video that doesn't jump, but at the moment where the login changes over to the Workplace site, the cursor jumps across the screen. So there's a cut in the video, even though they said there are no cuts in it. And then all of a sudden you're in this Workplace site. And when they get into this Workplace site, it is a real Workplace site, but the person's newsfeed is blank, which means there's no other content that this person can see, which I can tell you is not realistic. If you work at Facebook, you log in at 8:00 AM, you have a thousand stories to read in Workplace. And then they click on the notes section, and here is the list of what they call the incident reports.
But every single one of them is timestamped as being created in the three hours before the video was made. So they were all created at the same time, they were all created by the user who's logged in, who had set their profile picture to be the Instagram logo. So they made it look like it's an official Instagram thing, but they just uploaded the Instagram logo. And none of them had any other viewers. And then they forgot to get rid of the label up at the top that said "Core", which is the tier of the Workplace product that anybody can buy. That is the cheapest tier of the product. It's four bucks per user per month.
And while Facebook is doing pretty badly on stock price and they're trying to cut costs, I don't think they cut costs by using the cheapest tier of their own product internally. So it was pretty clearly a real Workplace site that had been set up by the person who made the video. A number of Facebook people were like, "That is not the real Workplace." Clearly they either got a free trial or they threw down a credit card, they opened up a Workplace site, and then they populated it with these notes. But there's no other content, there's nobody else there, so it doesn't look real at all.
Evelyn Douek:
Right. And at this point it's kind of amateur hour, because all of the things that you just said are readily understandable to someone like me. We're not talking about DKIM signatures or whatever. We're just seeing that this has a bunch of posts created by one person in the last three hours as a mockup of an internal site. So it is truly getting bizarre and wild as to who's responding to what incentives and what they're thinking at this point. We are still not done with the external saga of the back and forth, this tennis...
Alex Stamos:
There's more.
Evelyn Douek:
... rally is not over yet. So then Meta comes back over the net with another blog post responding to the videos shared by The Wire. The Wire had two videos in that long post: the four-minute, forty-second video showing how they verified the email, like you said, and a one-minute, thirteen-second video of the alleged report. This is elaborate, and Meta is saying all of the things that you just said about the Workplace account and why that wasn't real.
And they just keep digging. The Wire then releases a statement at this point, and this is last night now, saying that the reason why Meta keeps denying their reporting is to try to get them to publish more information that will force them to reveal their sources. And I quote, "We are not prepared to play this game any further." So they just quit, it seems. But it's not quite over, because it still gets weirder. At this point, I'm just impressed that this can keep escalating somehow. The statement was edited after the fact: The Wire had said that its relationship with one of the sources for its story was a personal relationship, and then they cut that out.
I mean, it's just the cherry on top of this story at this point.
Alex Stamos:
Yeah.
Evelyn Douek:
I mean, I want to zoom out a little bit: why have we spent such a long time talking about this story? It's not just sort of palace intrigue and incredible details and ridiculousness. I think there are a couple of reasons why this is an important story to talk about. There's the question of who and why and what's going on here, what could possibly be the reason for all of this? And then there's the question of why this really matters. This is not just a crazy tech reporting story; I think it has some pretty serious ramifications.
Alex Stamos:
Right. There's a couple of levels of ramifications. So, one, The Wire up to this point was a very legitimate, respected outlet in India. They've broken a number of critical stories, including a big story on the use of Pegasus malware against Indian activists and Indian journalists, which was a really big deal in India and around the world. They've written smaller stories that I have a lot of respect for. Our team at SIO did a bunch of research on the Indian Army's influence operations in Kashmir, and a very brave reporter from The Wire who lives in Kashmir, so she lives in this conflict zone, ended up very honestly following up on our story with more reporting about influence ops there. So there's a bunch of good reporters there who have nothing to do with this. But there seems to be a reporter, one of the senior editors, and one of the tech people who have now probably destroyed the reputation of the entire outlet.
My initial thought was, "Oh, this is a weird story." Then it was, "Oh, these people are getting played," right? And there's this history in Indian politics of really aggressive operations like this. There's a famous story of an activist academic who was tricked into thinking that she had a fellowship at Harvard and changed her entire life around, and the whole thing turned out to be fake. An incredibly extensive trick played on her. And so I thought it was something like that. Like, "Oh, maybe this is actually the BJP or some right-wing group that's tricking them." But instead of taking 72 hours to investigate and figure it out, they double down, triple down, quadruple down. They're all in. They've pushed all the chips across the table and, I just have to say right now, it looks like The Wire themselves are faking videos. They are creating fake evidence.
This is a big deal because you now have to question all of The Wire's reporting going back, including some really blockbuster stories that have had significant impacts on Indian politics. So I think at the first level, that's a big deal for an entire outlet like this to get blown up because if you look at other places where you've had fabulists, like the Stephen Glass situation and such, you have the institution investigating, finding out that something was bad and then firing that person and retracting and apologizing to maintain their own reputation. Whereas The Wire does not seem to have a separation from the senior editor who keeps on doubling down, doubling down and there doesn't seem to be anybody there that has the ability to rescue them.
Evelyn Douek:
Right. And it's not just ramifications for past stories. I think this has important ramifications for stories going forward. The Wire had been doing all this incredibly important reporting in an environment of increasing authoritarianism, increasing crackdowns on the free press and on civil society. There is an increasing lack of good, important, detailed expert reporting on what's going on in India. And I often say I think India is the most important jurisdiction for the future of free speech in the world. More important than what's going on here, as self-absorbed as we are. And that's not purely because it is a market of 1 billion people. It is also because, as a democracy, it is doing this with all of the trappings of the democratic process, which could then set a precedent for many other democracies, semi-democracies, and quasi-democracies around the world that are looking for ways to crack down on these tech companies and the free expression that they enable.
So this is all in the context of escalating regulation of tech platforms, the Information Technology Act, and this ongoing battle between the Indian government and these tech platforms. So I think it is really important to interrogate Meta's role in India for all of the reasons that you said at the top. There seems to be a lot of capitulation by these companies to the Indian government, and they're in really tricky positions. Twitter is increasingly the most high-profile in standing up to the Indian government's orders for censorship; it is challenging some of these censorship orders in court, and it gets knocks on its offices in India from police officers, and its employees have to answer for that. So it's a scary place for these tech companies right now.
Alex Stamos:
Right. And what The Wire has done here is, one, they have now given the government possibly the ammo they need to shut them down. Maybe you can give a little intro to India's defamation laws, but it is something that is very alien to somebody who is used to the First Amendment.
Evelyn Douek:
Right. I mean, not a lot of intro needed except to say not good. And...
Alex Stamos:
You heard it here.
Evelyn Douek:
Yeah, exactly.
Alex Stamos:
"Not good", according to Stanford professor.
Evelyn Douek:
High-level legal analysis from Stanford law professor Evelyn Douek. Look, I'm sure that we will have time to revisit the story of India in future episodes. And I think it's a really, really important one. I want to make sure that we do, because I think this flies under the radar way too often. But suffice it to say that, yes, this could be a real loss of what was an important voice in the Indian public sphere.
Alex Stamos:
So they've created a way for their enemies in the government to shut them down. The odds of Meta suing them in Indian court are effectively zero, but they named a BJP operative. So you have an Indian citizen who's a political operative about whom, it looks like, lies have been told. And so that's going to be a problem in India. But then second, every time now that there's a story from the neutral press or the political opposition saying, "This is something I don't like, this is a decision that I question that an American tech company made that seems to be pro-BJP," every single time, this story is now going to be brought up as the counterargument of "You're making it up. You're inventing it." A huge own goal for the forces of democracy in India. And like you said, India is famous for trying to create this fourth way of internet governance.
So you have the U.S. and China at two poles, you have the EU, and then India has said, "We want our fourth way. We're not going to copy any of these countries. We're going to build our own way." And the Brazils, the Indonesias, the Turkeys, all of the other emerging democracies or democracies that are trending in authoritarian directions, are taking their cues. And so if India goes in a direction in which it is closer to China than to the U.S. or the EU in how it controls speech online, then I see that as not just a huge impact for the 1.3 billion Indian citizens, but for people who live throughout the global south in developing democracies.
Evelyn Douek:
We've spent a long time on this story, but I think it's worth it, because this could be a key turning point in the timeline, I think, of the battle that's going on between the Indian government and these tech platforms and online speech in the country. So yeah, sad story.
Alex Stamos:
And it's not over yet because, as of Monday morning, we're still waiting for what I think is the next shoe to drop, which is that Meta will probably do a big detailed post with all the evidence. If you're going to fake documents about Meta, don't use Meta's products. There's just a little tip to fabulists of the future, because now Meta is sitting on a huge amount of data and metadata that is probably going to be pretty incriminating.
Evelyn Douek:
Right. So the story was a lot of laughs, but I actually think in the end it's a pretty tragic tale.
Alex Stamos:
Yes.
Evelyn Douek:
Another story that is a lot of laughs, but maybe a bit of a tragic tale: as we're sitting here, Kanye has made a deal to buy Parler this morning. I don't want to spend a lot of time talking about Elon Musk or Kanye on this show, but I am just loving this trend of men with hurt feelings buying up tech platforms because a few of their tweets were moderated.
Alex Stamos:
Oh my god, I don't know what to say.
Evelyn Douek:
Yeah.
Alex Stamos:
I am literally speechless other than yes, you get moderated a little bit and you decide you want to own the means of production. And so obviously a much smaller purchase than Twitter. They have not disclosed terms. But I expect it might be in the seven figures or very, very low eight figures for the value of Parler. I'm not sure Kanye's going to really enjoy being on the other side of this. But also I saw a great tweet that the RNC was tweeting about Elon, Kanye and Trump, and somebody said, "Three people who now really care about Section 230."
Evelyn Douek:
Right. And then my final story for the week is my favorite, it's this week's story in Everything Is Content Moderation. Everyone's favorite annual tournament, The National Park Service Fat Bear Week has been rocked by revelations of voter fraud and coordinated inauthentic behavior. In the final hours of Sunday's semi-final round between roly poly bear 435, nicknamed Holly, and airplane-sized bear 747, Holly received 9,000 votes in a very short period of time. Fortunately, the National Park Service Twitter account confirmed that they were able to identify the fake votes and ensure election integrity just in time, awarding the win to 747. So it was a really heart wrenching tale there. But I'm glad that election integrity has been preserved.
Alex Stamos:
Just in: Holly says that she has a secret email between the National Park Service and bear 747. So tune in next week for me talking about DKIM headers for bear emails.
Evelyn Douek:
Excellent. I'm looking forward to the 10 page SIO report on that. All I want to say is that if people can do this for 435 Holly, then surely they can do it for this podcast. I would love to see 9,000 positive ratings in the next few hours on iTunes. So get to it people.
Alex Stamos:
Once again, we are calling for your coordinated authentic behavior.
Evelyn Douek:
Completely.
Alex Stamos:
And show that you love this podcast and that you want to vote, vote, vote.
Evelyn Douek:
That has been your episode of Moderated Content for the week. This show is available in all the usual places, including Apple Podcasts and Spotify. And show notes are available at law.stanford.edu/moderatedcontent. This episode of Moderated Content wouldn't have been possible without the research and editorial assistance of John Perrino, amazing policy analyst at the Stanford Internet Observatory. It is produced by the marvelous Brian Pelletier. And special thanks to Alyssa Ashdown, Justin Fu and Rob Huffman. See you next week.