Moderated Content

MC Weekly News Roundup 10/24: Fun Facts about Railroads

Episode Summary

This week Evelyn and Alex discuss severed fiber-optic cables in France, Kiwi Farms v2.0, worrying moves to crack down on online content in Turkey and Brazil, and how Republicans are going after our last line of defense against an unusable inbox: spam filters. Also Alex reveals Evelyn’s lack of knowledge about 19th Century Railroad regulation.

Episode Notes

SHOW NOTES

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Alex Stamos:

You're running fiber optic cables between major cities in the United States. Where are you running them through? What are you using to get, b-, say we want them to be between Kansas City and Oklahoma City, and you want to-

Evelyn Douek:

Highways.

Alex Stamos:

... do fiber optic cable. Highways? That's, uh, very close.

Evelyn Douek:

(laughs) You... we're recording this live, Alex, this isn't fair (laughs).

Alex Stamos:

Right. No, this isn't fair, but neither is this the correct method, as you, you deploy it against your own law students.

Evelyn Douek:

Welcome to the weekly news hit episode of Moderated Content with myself, Evelyn Douek and Alex Stamos. The great thing about doing a weekly show about tech policy and trust and safety is that we are never short of material, and it's a really big week this week, Alex, so let's jump right in.

Alex Stamos:

Let's go.

Evelyn Douek:

Let's go. All right. Put on your seat belts. We probably should start with a little coda to our episode last week on the scandal about the Wire and Meta in India. Um, it was, um, almost immediately taken over by events, but I think it's still a good useful episode for, you know, all of the backstory, but also the broader implications, the policy implications, I think, still stand. 

Right after we dropped the episode, the Wire took down its stories and announced that it was launching an internal investigation, and then over the weekend it retracted those stories and said that it was going to conduct a review of its previous reporting, and also took down other stories that had been posted, co-written by the same authors. So, kudos to them for owning this, uh, and hopefully the sort of investigation and, and taking responsibility and accountability helps restore the Wire's credibility in this space, because it would be a huge loss, but, yeah, still a very sad story.

Alex Stamos:

And the big one that they took down that was not related to this latest controversy is the Tek Fog story, which was a story in which they alleged that the BJP has a mobile app that has some pretty spectacular powers, such as the ability to take over, uh, WhatsApp accounts. I had not paid attention to that story when it first came out, but I then read it because of this controversy, and a bunch of it is pretty clearly just not possible technically. Uh, and so, yeah, this is gonna have some pretty significant impacts. I think the other big impact it's gonna have is, a-as we talk about, you know, there's a lot of talk about transparency from the tech companies coming out of the Haugen documents. I think this is going to create a lot of pressure for transparency from journalists, that when they have leaked documents they need to post them, because not just in India, but around the world, it is gonna be less and less likely that critical readers are going to trust the assumptions made by journalists on something that they say is a leaked doc.

Evelyn Douek:

Consistent standards for different parts of the media ecosystem? A wild and controversial recommendation, Alex. Okay, so over the last few days there's been some stories about fiber optic cable cuts in France, and people are freaking out and blaming this on Russia. Tell us about that. 

Alex Stamos:

Yeah. So, as all of our listeners know, uh, there's pretty clearly coordinated attacks against the various Nord Stream pipelines crossing the North Sea carrying, uh, gas from Russia to Europe. Lots of finger-pointing. But, you know, gas pipelines don't just randomly explode. They certainly don't do so at exactly the same time. That has primed people to be on the lookout for other kinds of, uh, of infrastructure sabotage, and there was kind of a blowup on Twitter where people were freaking out that apparently, you know, multiple fiber optic cables, as reported on Twitter, were cut under the ocean serving the South of France. It turns out that these attacks were actually terrestrial, you know, once cables land they then have to pass through the ground to the various data centers, at which they will then spread out to a bunch of different networks, and somebody cut them. It was not accidental in this case. They were intentionally cut, but they were cut on the ground, and the number of people who could do that is actually quite large. 

So, you know, I think one of the things we've just got to b-be aware of is these kinds of infrastructure failures happen all the time. Fiber optic lines are cut every week, and most of the time it's accidental. Uh, in my experience, the most dangerous person, uh, in the world for the, the internet is a, a contractor with a locally rented backhoe. And there's a specific place. So I'm gonna do a little Socratic method here, Evelyn. If you're running fiber optic cables between major cities in the United States, where are you running them through? What are you using to get, b-, say we want them to be between Kansas City and Oklahoma City, and you want to do-

Evelyn Douek:

Highways?

Alex Stamos:

... fiber optic cable. Highways? That's, uh, very close.

Evelyn Douek:

(laughs) You... We're recording this live, Alex. This isn't fair.

Alex Stamos:

Right. 

Evelyn Douek:

(laughs) 

Alex Stamos:

No, this isn't fair, but neither is this a correct method as you, you deploy it against your own law students. Look at the map behind me. The map here in my office is of the Transcontinental Railroad. 

Evelyn Douek:

Ah. 

Alex Stamos:

What did Congress give the railroads when they passed the Transcontinental Railroad Act?

Evelyn Douek:

Again, I don't know (laughs).

Alex Stamos:

Oh, that's right. Well, they gave them right-of-way, right? So, the, the railroads have a huge amount of space to the left and the right of, uh, the, the Transcontinental Railroad lines. It turns out that is where much of the fiber optic cable between cities in the United States is in that right-of-way. So anybody who rides, you know, you ride on a train, look to the right and left. You're gonna see these little posts with a little orange cap. And if you look closely at that cap, it'll say something like, "Before digging here, call this number. Call AT&T. Call Verizon. Call Sprint." And this happens all the time that somebody will be doing construction near a railroad right-of-way, and with one swipe of a backhoe, they will cut a dozen different fiber optic lines and cause a, a huge amount of internet traffic to get re-routed. 

And so this kind of stuff happens all the time. We can't automatically assume it's the Russians. Uh, and so people just need to be a little bit more calm (laughs), and they need to look for real evidence, uh, when these kinds of things happen.

Evelyn Douek:

My fun fact about interstate railroads, and, and this explains why I didn't know the answers to your questions, is Australia is basically a commonwealth, a federation, because the states got sick of having different railroad gauges between them and they're like, "Okay, we've got to stop this." So instead of a grand revolutionary war, we have trains getting sick of having to change, uh, gauge at, at the state borders. So, there you go.

Alex Stamos:

That's fascinating. I did not know that. And thank you for not quizzing me on that history of railroad systems. 

Evelyn Douek:

How could you be so ignorant, Alex, not to know-

Alex Stamos:

I apologize.

Evelyn Douek:

... about the (laughs), our railroad regulation, uh, in Australia? 

Alex Stamos:

Yeah. You, you can tell I'm from Sacramento in that every, every field trip as a kid went to the Sacramento Railroad Museum, in which they talk about the Transcontinental Railroad-

Evelyn Douek:

(laughs)

Alex Stamos:

... which ended in Sacramento. And they have the golden spike and everything, so it's like I'm just gonna quiz you on, on stuff from fourth grade California history [inaudible 00:06:02].

Evelyn Douek:

Excellent. I had not been- 

Alex Stamos:

Next week study up on the missions. Yeah. 

Evelyn Douek:

Sacramento hadn't been on my list of places to go now that I've moved to California, but now that I know that there's a railroad museum (laughs), I'm headed there as soon as possible.

Alex Stamos:

How was it not on your list of places to go?

Evelyn Douek:

Uh, (laughs) I've just offended a lot of people, I'm sure. Lost a lot of listeners with that one. All right, continuing our global affairs segment, uh, I think this is important, to keep track of this sort of growing trend, uh, that we're seeing overseas, um, of authoritarian crackdowns on, on speech in different countries. So, just in the last couple of weeks, uh, Turkey's parliament adopted a law that would jail journalists and social media users for up to three years for spreading disinformation, obviously (laughs). Uh, the meaning of disinformation is not at all clear, and can be sort of determined by the government, um, at its own whim. Publicly disseminating the information is also extremely broad and can apply to things as simple as re-tweeting a story. And if social media companies are requested to remove content, they have to hand over personal details of people involved and take the, the content down, as I said, and they could face up to a 90% slowdown of services, uh, and large fees, if they don't comply. So, Turkey has never been a bastion of free speech, but this is, you know, a terrifying new development. 

Alex Stamos:

Yep, another, you know, and they, they are intentionally copying the exact language you hear from European and American politicians here, um, so I think, once again, this demonstrates why if you are a politician in a large democracy, you have to be very, very careful of the kind of speech you use around the... wanting to make disinformation illegal. 

Evelyn Douek:

Right. This is famously known as like the NetzDG effect, um, after Germany passed a law, uh, very early on, and then we saw a lot of those laws, uh, the l-language of those laws, replicated in many, many countries. And it's hard to say, you know (laughs), look, Germany does it, uh, why, why are you blaming us? And so, uh, so it's, it's an important thing, I think, that democracies need to be considering. And we also saw in Brazil, uh, in the last week the elections chief in that country has been given the power to order the immediate removal of content that he believes, this single person, believes has violated take-down orders, um, and the platforms have to comply within two hours or face the suspension of their services. In Brazil, this is obviously in the lead-up to the election in about a week's time on October 30th, and there's been plenty of reporting about how misinformation and disinformation are sort of this massive issue in this election. 

And so, you know, the, uh (laughs), there's many, many v-, uh, reasonable concerns in these situations, but putting the power in the hands of one person when that person's not Mark Zuckerberg, is very scary.

Alex Stamos:

Absolutely.

Evelyn Douek:

Excellent. All right, so, that concludes our global affairs segment for (laughs), for the week. Uh, returning back home, so Kiwi Farms. We haven't talked about that before, but Kiwi Farms was... uh, CloudFlare pulled services from it earlier this year, I think about a month ago now, after there had been a broad-based public campaign against it. It's a cesspool of a website that has been linked to multiple doxxing and harassment campaigns, uh, some ending in suicide. And Buzzfeed reported that it was back up and running this week. I checked, uh, before recording, and I couldn't, uh, get on, but what's the story there, Alex?

Alex Stamos:

So, Kiwi Farms is in this fight, uh, to try to get back online. Uh, they fell back to a Russian domain for a while. They got kicked out of certain hosting out there, and then they were back up for something like 12, 15 hours, something like that, before another ISP kicked them. So, yes, I, you're gonna kind of continuously see this, just as we've seen with some other de-platformed hate sites. It is, it is very hard, you know, you never want to be the last ISP, the only ISP that will host this kind of stuff, because you're, you're basically opening up this problem that you are... you will attract the worst of the world of customers, and now that they have been kicked from everything, n-nobody wants to be the last one holding the bag. So, yes, they are, as of this moment, down again. It's quite possible they will once again get back up. The place that people used to go for this is Russia. I expect both the political situation, as well as the practical issues around doing things like making payments in Russia and stuff, make that a less practical option for folks at Kiwi Farms, but it is possible we will see them pop up in a Russian, what's called a bulletproof hosting, uh, provider sometime soon.

Evelyn Douek:

I guess a question I have is a lot of the discourse around should they, shouldn't they, CloudFlare, pull service from websites like this is premised on the idea that they're basically a monopoly in all of these services that they provide, and if they, you know, their infrastructure, and if they pull it, these websites don't have other places to go. I'm curious, you know, does this sort of disprove or undermine that argument at all?

Alex Stamos:

I think what this demonstrates is, you know, CloudFlare is far from a monopoly. They're not even really an oligopoly, but that overall once a handful of big providers make the decision to de-platform a site, that has a signaling effect that will be followed by lots of other ones. Um, and so even though there are thousands of companies around the world that could host this website, they don't want to, and they're all making that decision kind of in concert. So I think it is a very complicated issue. For the, the pro-Kiwi Farms people, I think there is a legitimate argument that it is a little scary that sites can be completely wiped off the internet if they're unpopular. That being said, as I've said publicly multiple times, Kiwi Farms, in particular, was, you know, leading to some really horrible real-world behavior, and was inevitably kind of spiraling towards somebody getting killed, and, and so I thought CloudFlare's decision was fine, but I do understand the concern that like once you end up with this kind of decision being made, it ends up getting copied over and over again in a way that eventually makes it impossible for the site to be up. 

That being said, there's plenty of other hate speech sites that are rocking and rolling right now, and I think we can make a prediction what the next one's gonna be. So, uh, do you think our listeners are ready for a preview of the next big CloudFlare blowup? Do we have a CloudFlare music that we can play just for, for CloudFlare controversies? 

Evelyn Douek:

(laughs) W-we'll make one. I don't know. Like do you think this is gonna be a recurring segment on, on the show-

Alex Stamos:

Right.

Evelyn Douek:

... basically? 

Alex Stamos:

If you are a composer and you would like to write our CloudFlare theme, please write us. Yeah, so the next one is a site called GoyimTV. So here in California, uh, this week, um, the last couple of days, actually, we had some really horrible antisemitic ac-activity in Los Angeles. Uh, so some, some Nazis, and when I use the term Nazi, like Nazi is one of those overused terms on Twitter, where everybody to your right is called a Nazi by people of kind of the, you know, more liberal side of Twitter. I mean like actual Nazis, uh, people dressed in all black doing a Hitler salute and hanging antisemitic banners from an overpass on the 405 Freeway, and in doing so they were advertising GoyimTV, which is a, uh, I'm not even gonna call it alt-right, but close to a neo-Nazi, uh, or a neo-Nazi, uh, version of YouTube, which is, of course, hosted on CloudFlare, and uses CloudFlare's streaming capabilities. 

This is going to be the next one, because, you know, it's, right, most of it is just straight-up hate speech, but I, I, you know, in doing kind of generalized hate speech or the kind of stuff that CloudFlare has allowed in the past, the line that was crossed for Kiwi Farms was direct attacks against individuals and, and the possibility of immediate violence. The... Apparently, the same people who did the, the banners on the freeway were also doing things like leaving antisemitic flyers in the mailboxes of individuals in Beverly Hills, and so, you know, very possible that we're gonna see some kind of antisemitic violence tied to this website. And, once again, this will fall on the shoulders of CloudFlare to try to decide whether or not they, that this kind of site should be able to exist and to use their, uh, CDN to, to distribute videos around the world. 

Evelyn Douek:

Yeah. I do hope at some point we develop a better framework for this than a growing public pressure campaign followed by a blog post by CloudFlare or Matthew Prince saying, "I hope we can have a public conversation about this," and, uh, uh, a diagram of how the stack works, because this is just, you know?

Alex Stamos:

Right.

Evelyn Douek:

And it's gonna be happening more and more often, um. 

Alex Stamos:

And then a reversal within 18 hours. 

Evelyn Douek:

Right. Exactly. Um-

Alex Stamos:

We can just script this whole thing up just to run automatically every time.

Evelyn Douek:

All right. Well, thank you for the preview, and, uh, please do get in touch if you have, uh, appropriate theme music. Next is Republicans and spam filters. I have this theory that basically all content moderation issues come down to spam issues, uh, in the end. For some reason we never talk about it, but spam filters are, you know, a big part, uh, perhaps, I mean, definitely, uh, by far the largest amount of content moderation that the platforms do, uh, and it turns out there is no completely objective and, uh, scientific definition of spam, which is always fascinating, because you see lawmakers and things like that saying, "Oh, no, no. We will carve out some must-carry rules for this objective category called spam." 

In the spam wars this week, we have Republicans suing Google for, uh, alleged bias in its spam filters. So, sort of taking a step back, this comes out of, uh, some culture wars that have been going on after a study out of a group from North Carolina State University where they found that Gmail marked around 60% more emails from the right candidates that they studied as spam compared to left candidates. Um, and for Outlook and Yahoo, the figures were much lower. It was about 20% and 14% more emails. Um, can you talk to us about spam filters, Alex? What might be leading to this kind of thing?

Alex Stamos:

Yeah, so, this, as you said, RNC lawsuit is, is based upon some, you know, I think legitimate scientific study coming out of North Carolina on spam delivery. That work, I think, is actually based upon a story that I was very critical of when it came out, which was one of the, the first stories written by The Markup, which is kind of a spin-out from ProPublica, in February of 2020 during the Democratic primary, in which they wrote about delivery rates and classification of campaign emails, specifically from Democrats, and kind of darkly implied that there was something untoward about the fact that Pete Buttigieg had better delivery than, say, Elizabeth Warren. 

And so that started, I think, this whole discussion about delivery, and, unfortunately, that Markup series was not very good. It was clearly written by people who did not know much about how this worked, and there's not a lot of discussion of the, of the technical details. So, first off, one of the things you have to look at, uh, around spam is whether or not the, the sending infrastructure is implementing all of the appropriate anti-spam technologies, SPF, DKIM, DMARC rules and the like, but, more importantly, especially in the Gmail case, is that spam is a very dynamic thing, and the rules that Gmail applies are very reactive to how other people have rated emails. For anybody who's ever donated to a political campaign, i-if you have not done it yet, here's a tip. Do not use your real email. Do not use your real phone number. It is illegal to use a fake address, but I do not think it's illegal to create a spammy Gmail account and give it to ActBlue or the equivalent, because the moment you give $1 to a candidate, your inbox is flooded continuously with every single person running for dogcatcher a couple hundred, you know, 170 miles away from you. It is completely ridiculous. 
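(For readers curious about the sender-authentication checks Alex mentions, here is a minimal, illustrative sketch of how a mail receiver might read SPF and DMARC policies. The records, domains, and IP addresses below are made-up examples; a real receiver fetches these as DNS TXT records and handles many more mechanisms than shown here.)

```python
def parse_dmarc(record: str) -> dict:
    """Parse a DMARC TXT record like 'v=DMARC1; p=reject' into tag/value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

def spf_allows(record: str, sender_ip: str) -> bool:
    """Very naive SPF check: is this exact IP listed via an ip4: mechanism?
    (Real SPF also handles CIDR ranges, include:, a, mx, and more.)"""
    for mech in record.split():
        if mech.startswith("ip4:") and mech[4:] == sender_ip:
            return True
    return False

# Hypothetical records for an imaginary campaign's sending domain.
spf = "v=spf1 ip4:203.0.113.25 include:_spf.example.net -all"
dmarc = "v=DMARC1; p=quarantine; rua=mailto:reports@example.org"

print(spf_allows(spf, "203.0.113.25"))   # True: this sender IP is listed
print(parse_dmarc(dmarc)["p"])           # quarantine: the domain's policy
```

A receiver that sees an SPF failure or a DMARC policy of `reject`/`quarantine` has a strong, sender-controlled signal before any content-based filtering even starts, which is why Alex lists it first.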

The, the political world has gone gaga for crazy, unsolicited emails. You know, politicians wrote themselves an, an exception into the CAN-SPAM Act back in the day, which I found incredibly corrupt, and, honestly, I find this working of the refs to be corrupt again, because generally politicians always want tech companies to do better to protect users and to fulfill users' desires, and are almost always asking for appropriate and thoughtful ways of doing content moderation. And putting unsolicited requests for money into the Promotions tab is exactly what the Promotions tab in Gmail exists for. It's a totally appropriate thing. It is based upon lots and lots of people marking it as spam or dragging it to their Promotions tab, and then Google picks up on that, and that is where it should be.

And so the fact that all these politicians now on, on both sides of the aisle are trying to work the refs here and to get their stuff delivered into the inbox I find incredibly corrupt, uh, and completely ridiculous, and totally inconsistent with almost everything all of them have ever said about how tech companies should operate. And, you know, again, it is about what individuals are doing and the fact that most of this stuff is not solicited means that lots of people hit the, hit the categorization button or they hit the spam button, and Google learns from that. And, unfortunately, it looks like they're gonna be effective, right? Google has announced that they're going to deliver more emails into the inbox, and I think that's a horrible outcome. I expect Google will end up reversing this, because it is going to have such a negative feedback from users who don't... who want their email to stay usable, even if they made the mistake of giving a politician $10.  
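(A toy illustration of the feedback loop Alex describes: the filter has no fixed definition of spam and simply learns from recipients hitting the spam button. The sender addresses and the threshold are hypothetical, and real systems like Gmail's combine many more signals than a per-sender report rate.)

```python
from collections import defaultdict

class ReactiveSpamFilter:
    """Routes mail based on how past recipients treated each sender."""

    def __init__(self, threshold: float = 0.5):
        self.reports = defaultdict(int)     # times users hit "spam" per sender
        self.deliveries = defaultdict(int)  # total mails seen per sender
        self.threshold = threshold

    def deliver(self, sender: str) -> str:
        """Route a new mail using this sender's spam-report rate so far."""
        self.deliveries[sender] += 1
        if self.deliveries[sender] > 1:
            rate = self.reports[sender] / (self.deliveries[sender] - 1)
            if rate >= self.threshold:
                return "spam"
        return "inbox"

    def mark_spam(self, sender: str) -> None:
        """A recipient hit the spam button; the filter learns from it."""
        self.reports[sender] += 1

f = ReactiveSpamFilter()
print(f.deliver("blast@example-pac.org"))   # inbox (no history yet)
f.mark_spam("blast@example-pac.org")        # recipients push back
print(f.deliver("blast@example-pac.org"))   # spam (100% report rate)
```

The point of the sketch is that classification is an output of aggregate user behavior, not a static rule, which is why "deliver everything to the inbox" mandates fight the system's whole design.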

Evelyn Douek:

Just to your point about spam being a dynamic system, the study found that Gmail was extremely responsive to user interactions, uh, more than Outlook and Yahoo-

Alex Stamos:

Right.

Evelyn Douek:

... uh, in terms of what people marked as spam. And in the background of all of this, I mean, this is basically, uh, a lot of working the refs and, and political posturing. Google in response to pressure from the RNC launched a program that would allow campaign committees to opt out of spam filters. Uh, the democrats have signed up for that. The republicans haven't. Unclear why. It could be something to do with some basically reasonable standards that Google imposed, uh, in order to... prerequisites in order to sign up for that program. 

But in the meantime, the RNC has decided instead to go to court. This lawsuit i-is in California, and it's quite remarkable. They're basically throwing spaghetti at the wall here. They have all sorts of claims, non-discrimination, interference in, uh, economic relations, and the big ones are common carrier claims, which are these, uh, claims popping up all over the place, the idea that email and other platforms should be classified as common carriers. This is basically nonsense. Uh, the FCC has made clear that email is not a common carrier under federal law. The lawsuit acknowledges that, but says they're preserving the claim for possible SCOTUS review, uh, and they have, instead, also made claims under the state common carrier statute. But it's really sort of impossible, uh, ridiculous to argue that spam filters are not a reasonable kind of discrimination, uh, because as we all know, and as you were just saying, Alex, email would basically be nonfunctional, uh, without some sort of spam filter. 

So, the question is how far does this get? Does the judge allow discovery? Who knows? Uh, it's something to watch, but, uh, it, you know, we won't be holding our breath for, for success on that one.

Alex Stamos:

Well, and good luck with that discovery, because Google has thousands of people who work on various anti-spam and, and anti-SEO stuff, and, uh, you're talking about millions and millions and millions of pages of documents. The reason why people use Gmail is partially because it has the best spam filtering, right? Like that is one of the reasons why it grew aggressively versus especially Yahoo back in the day. And so I think this is actually a very dangerous place for these politicians to play, because making unusable a product that is beloved by hundreds of millions of Americans seems like a really stupid kind of front to open up against the tech companies. 

Evelyn Douek:

I love the idea that Google's response to the discovery request will be, "You want spam? Here's spam," and just unleash, uh, millions of documents on the RNC (laughs).

Alex Stamos:

Right. Right, right. Here is... Here is an exabyte of spam, uh, that we have been storing for the last 20 years in paper format. Uh, for, uh, yeah. 

Evelyn Douek:

Right, exactly (laughs). Run your own analysis on, uh, on whether these classify as spam. 

Alex Stamos:

Uh, which, which, uh, which football stadium would you like us to deliver all of this paper to? 

Evelyn Douek:

Yeah, right. Pity the clerk at some law firm that's gonna spend their, you know, next year's worth of, of weekends, uh, uh, marking each of those documents.  

Let's talk about TikTok. Uh, there was a story this week in Forbes that TikTok was using... the parent company, ByteDance, was using TikTok to monitor the personal location of some specific American citizens. What do you make of that one?

Alex Stamos:

Right. So this is, this story has a couple of, uh, red flags in it. Uh, one is it is based upon secret documents, so back to our discussion of the Wire and leaked documents. There is very little detail of, of what these documents said, of what this kind of internal, you know, inappropriate use of TikTok was exactly going to be, of, uh, you know, there are no screenshots. There's nothing that could be either authenticated or, uh, or nitpicked by outsiders. You have to completely trust the reporter. 

The thing that raises questions for me is that TikTok, in my understanding, generally does not have access to what we call fine GPS location, right? So, uh, it, it has, as every service that you interact with does, the ability to generally tell where you are from IP address, and that could tell you as much as you are within a city or down to a specific house if the right data exists. Um, but, you know, generally when you're doing the kind of thing that they're alleging here of tracking individuals, you're gonna want their GPS location, and that is not available to TikTok in most circumstances, under my understanding. That being said, this is the kind of thing that gets complicated, of checking all the different versions and all the different operating systems exactly how they're configured, so it is possible, uh, that I'm, I'm incorrect there, but, uh, you know, I think... I, I think there are legitimate concerns around TikTok. 
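(To illustrate the distinction Alex draws between fine GPS location and IP-based location: IP geolocation just maps an address into blocks associated with a region, so its precision depends entirely on the lookup data. A minimal sketch using Python's standard `ipaddress` module, with an entirely made-up block-to-city table:)

```python
import ipaddress

# Hypothetical mapping of address blocks to locations. Real geolocation
# databases hold millions of such entries at varying precision.
IP_BLOCKS = {
    ipaddress.ip_network("203.0.113.0/24"): "Paris (city level)",
    ipaddress.ip_network("198.51.100.0/24"): "Los Angeles (city level)",
}

def locate(ip: str) -> str:
    """Return the coarse location for an IP, or 'unknown' if unmapped."""
    addr = ipaddress.ip_address(ip)
    for block, place in IP_BLOCKS.items():
        if addr in block:
            return place
    return "unknown"

print(locate("203.0.113.77"))   # Paris (city level)
print(locate("192.0.2.1"))      # unknown
```

Unlike device GPS, which requires an OS-level permission and yields coordinates, this technique gives only whatever granularity the block data supports, which is the gap Alex says the reporting needs to explain.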

Um, I think the amount of data that they will be gathering will be very, very useful to the Chinese Communist Party. I know for a fact that there are good people who work there in trust and safety and security, who are very frustrated with the fact that, um, so many decisions are made in Beijing and so much engineering happens in Beijing, and so there are legitimate concerns we have about TikTok. This story is not doing it for me yet, because, uh, there's just not enough detail here, um, and, again, I think they need to explain exactly what kind of location data, um, i-is accessible to TikTok, uh, and, and therefore ByteDance. 

Evelyn Douek:

Well, the TikTok story is definitely not over, um, so we will no doubt be coming back to the related issues in future episodes. Finishing up now with a plea to enjoy your potentially final days on Twitter. The, the deal, uh, with Musk is perhaps, uh, or is scheduled to close on Friday this week. There's been Washington Post reporting this week that he has plans to fire 75% of the workforce, which would, you know, if he wants to get rid of the bots that may not be the b-, the best approach. So, en-enjoy your time, enjoy your followers, um, and, uh, you know, if you have other handles on other websites that you want to drop, Alex, do you have anywhere that the people should follow you if, if, uh, if this is the end times? 

Alex Stamos:

Uh, yeah (laughs). I, I feel like, uh, we're all gonna end up on Substack. I don't know. It's a good question.

Evelyn Douek:

Uh, yeah. 

Alex Stamos:

Because the interesting thing is Twitter does not have any equivalent, it feels like, in kind of elite American discourse. Like the reason why Twitter is powerful is not from the size of its user base, as much as the, the fact that lots of tastemakers and journalists and media personalities and, uh, uh, you know, other folks who have a lot of influence are on it. And so, yeah, that's a good question of like what is the alternative. I think Instagram is the closest, but the problem is Instagram is all image-based. Like I couldn't imagine having to come up with some kind of photo of me in front of a sunset for every single hot take I want to have on a current policy issue.

Evelyn Douek:

Um, well, uh (laughs), I'm gonna go find... Now, with that promise, I'm definitely gonna go find your, your Instagram account. Um, is it feasible at all that some sort of competitor could pop up really quickly? 

Alex Stamos:

Oh, I think so, for sure. I, you know, I... This is always what kind of drives me nuts about like DC discussion of, you know, social media companies having monopolies and such. Y-you drive out here on the 101, and you look left and right, and you see the skeletons of the last generation bleached white by the s-, the California sun, demonstrating the fact that there's always massive turnover in the tech industry, right? Like Facebook... My first visit to what is now the Facebook campus is when I went to Sun Microsystems. My first visit to what is now the Google campus is when I, I interviewed for an internship at Silicon Graphics, right? Like, um, you know, we just kind of recycle the buildings and then they're putting new companies in there, and I think especially if there's a... if there's an immediate event that drives people from a platform with network effects, then I think that is, by far, the best opportunity somebody has to get a massive movement of deeply interconnected people to make a move all at once. So it, it will be interesting to see what happens. 

Evelyn Douek:

Okay. With... so with that haunting image of the crumbling buildings of MySpace, we will leave it there. That has been your Moderated Content news roundup for the week. This show is available in all the usual places, including Apple Podcasts and Spotify. Our show notes and now transcripts are available at law.stanford.edu/moderatedcontent. This episode wouldn't have been possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory, and it is produced by the brilliant Brian Pelletier. Special thanks to Alyssa Ashdown, Justin Fu, and Rob Hoffman. See you next week for more Moderated Content.