Moderated Content

Big Tech's Big Tobacco Moment?

Episode Summary

Alex and Evelyn talk about the Congressional hearing with tech CEOs this week about child exploitation on their services. What did we learn? What are the next steps? And... who wasn't at the hearing?

Episode Notes

Stanford’s Evelyn Douek and Alex Stamos talk about the Senate Judiciary Committee hearing with Tech CEOs about “Big Tech and the Online Child Sexual Exploitation Crisis.” They mention: 

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Alex Stamos: Professor Douek, are you or are you not a citizen of New Zealand?

Evelyn Douek: No. I'm from Australia, I think we've covered.

Alex Stamos: That's a yes or no question.

Evelyn Douek: No. I mean, remember all the talk about the Matildas? I'm Australian.

Alex Stamos: Yes or no?

Evelyn Douek: No. No, not a citizen of New Zealand.

Alex Stamos: Have you ever been a member of the New Zealand Communist Party or the Kiwi Communist Party, some call it?

Evelyn Douek: No, Alex. As I just said, I'm from Australia, so that would be strange.

Alex Stamos: Are you sure you're not from New Zealand, because you look like you're from New Zealand today?

Evelyn Douek: I mean, I don't even know anything about rugby, so definitely not from New Zealand.

Alex Stamos: So you're not denying it? You're not denying that you're a member of the Kiwi Communist Party? I just want for the record to state Professor Douek did not deny, she would not answer the question about being a Kiwi Communist.

Evelyn Douek: I mean, I really did four, five times. I denied...

Alex Stamos: I'm asking the questions here, professor. I ask the questions here.

Evelyn Douek: Gotcha. And with that, welcome to Moderated Content, the stochastically released, slightly random, and not at all comprehensive news update from the world of trust and safety with myself, Evelyn Douek, and Alex Stamos. We want to talk today about the Senate Judiciary Committee hearing earlier this week on big tech and the online child sexual exploitation crisis, which has been billed as this big tech, big tobacco moment, because there's this image of all of these tech CEOs being called into Congress like the big tobacco CEOs to account for the harms of their products.

I want to start by saying this is a tough thing to talk about. The hearing opened with a video of stories from victims of child sexual exploitation and the hearing room was filled with parents and families of people who had committed suicide as a result of their online experiences and trauma. And so it was a charged and very emotional atmosphere, and I think that really was very effective actually in underlining what's really at stake here.

And I also think it's very welcome to have a hearing that is so clearly focused on an issue, on a particular harm that is proven and palpable, in contrast to these wide-ranging hearings that we've had over the past few years that are conspiracy-laden and about political bias and all that grandstanding. So all of that's in the positive column for Congress. On the other hand, of course, there was also still political grandstanding in the four hours.

There was a wide range of questions and answers. And so I'm curious, Alex, what's your top line takeaway from this hearing? How did you think it stood up compared to what we've seen over the past few years?

Alex Stamos: Yeah, so we keep on having big tech, big tobacco moments just like we keep on having cyber Pearl Harbors, right?

Evelyn Douek: Right.

Alex Stamos: It is the metaphor people keep on going back to. I think this was particularly effective because like you said, it was shockingly focused compared to other congressional hearings. I would say the average congressional hearing, you spend 12% of the time on the actual topic, and this was like 60 something percent focused on the sexual exploitation of children online.

Evelyn Douek: It's a record. Yeah.

Alex Stamos: Yeah. And I think part of it is, like you said, with the victims there and their families and the pictures of the victims, but that had a couple of effects. One, it did sharpen the mind and made it less likely for the senators to go off task. I think it also probably had a huge human impact on these CEOs. If you're the CEO of one of these companies, you have trust and safety people who have to deal with the downside of your products every day. I have been in court. I have seen victims. I've been to the sentencing of a bad guy.

I met that bad guy's victims and their parents. I've been to the Crimes Against Children Conference multiple times. I've met victim advocates there or heard victims discuss things that have happened to them online on the platforms I've worked at. That's incredibly hard. Mark Zuckerberg's never had to do that before. So he had to turn around and actually see that there are human beings behind these numbers, which to him look like low-prevalence events compared to the three billion people on Facebook.

Even if these numbers are only in the thousands, each one of those numbers is a horrible tragedy, and I think that's actually a really important thing for all those people to see.

Evelyn Douek: Yeah, I completely agree. It was far removed from just an attempt to get headlines or whatever. It did have this pretty profound emotional impact, and it does make these statements of, "We're investing so much, and we're going to keep investing so much," ring hollow when the harms were so tangible and right in front of them. In terms of talking about harms though, one of the things I think that's important as background to this conversation is to be specific about what we're talking about.

Because when you talk about child safety online, that's a pretty big umbrella term that can mean lots of things and the impact of tech and platforms on kids is varied and enormous in all of these sorts of different ways. And so what are we talking about when we're talking about child safety and what were these hearings about?

Alex Stamos: Yeah, so that's a great question because like you said, when people talk about kids and the internet or kids and mobile phones and social media, the first categorization you want to do is passive versus active harms. There's a whole debate: are phones bad for kids? Is it bad for their study habits? Is it bad for their dopamine levels or their interactions with each other? Even if there's no intentional badness, is it just bad for them day to day? And there's a lot of people studying that.

That's not my area. That's not what we teach, and that fortunately was mostly not what this was about. Actually, one of the funny mismatches you see is that Zuckerberg in the beginning was quoting from a National Academy study. The National Academy study is about that, right? It's just about whether phones overall are bad for kids. It is not about online child sexual exploitation. And so this was about the active harming of kids, and in particular active harming of kids that is mostly sexual in nature.

There was a little side discussion by Klobuchar about fentanyl, which is a bad thing, but a little off-topic. It was mostly about the online exploitation of children and mostly by adults, not by other children. There are things that happen kid to kid that are bad, but the worst outcomes are generally because adults are involved. And I think that was really great. Now, if you're interested in this space, if you're a Stanford student, you can take Trust and Safety, cross-listed in computer science, communication, and international policy, in the spring.

If you're not a Stanford student, there are a couple of things you can read if you're interested in all the different terminology. There's a really important paper by an old colleague of mine named Vic Baines, who was in law enforcement and then worked at Facebook. She wrote a paper called "Online child sexual exploitation: towards an optimal international response," which is a paper I assign in class because it's got a very good taxonomy of all the things that happen online: what's grooming versus extortion versus contact abuse, all that kind of stuff.

And the other is this thing that you can search for called the Luxembourg Guidelines, which is an international standard for how you talk about online child exploitation. And again, shockingly, the Senate mostly stuck to those things with a couple of sideshows, including an incredibly racist questioning of the CEO of TikTok. But for the most part, that doesn't seem to have detracted from the effectiveness, in the way that in previous hearings, "Mr. Zuckerberg, how do you make money? We sell ads, Senator," became the entire thing people remember.

Out of this thing, what people remember is Zuck apologizing to parents and the other kinds of really emotional moments.

Evelyn Douek: I got to say, I know we were making light of it in the B roll about Tom Cotton's questioning, asking TikTok CEO Shou Chew whether he has been a member of the Chinese Communist Party. I mean, I thought members of Congress yelling at people completely without evidence, "Are you a member of the Communist Party," was a period of America's history that you weren't particularly proud of or willing to bring back.

But what do I know? And then Shou Chew is just saying, "No, I'm from Singapore." I mean, I think actually, full credit to him, I thought he handled it extremely well given how offensive it was to be subject to this extremely racist tirade.

Alex Stamos: His last hearing was just about him and just that for the entire time. So the fact that it was like just Tom Cotton made it a pretty light hearing for him, honestly.

Evelyn Douek: Yeah. Okay, well, speaking of who was there and who was not there, I can't go on...

Alex Stamos: Do you have feelings about this, Evelyn?

Evelyn Douek: I have feelings about this because...

Alex Stamos: People can't see it, but Evelyn's about to explode. There's a vein standing out from her New Zealand forehead.

Evelyn Douek: Because I don't know, I can't yell any louder. I feel like if I just yell maybe a couple more decibels, someone will finally hear me on this point. Okay, so we have the CEOs of Meta, X, TikTok, Snap, and Discord. Now, Zuckerberg and Chew appeared voluntarily. The rest were subpoenaed. So it's not like this was a party that whoever wanted to just showed up to. They could have had anyone there because they were subpoenaing people. And you know who wasn't there, Alex? Who wasn't there?

Alex Stamos: Gosh, I've heard that there's actually an app that is the most used app.

Evelyn Douek: It's 93%

Alex Stamos: By time spent by teenagers. It just slips my mind though. SchmoveTube, something like that.

Evelyn Douek: I don't understand how they do it. The YouTube magic dust has clearly been passed on in the CEO desk drawer from Susan Wojcicki to Neal Mohan. It's just like whoever their lobbyists are, I know they're probably being paid a fortune, but it's clearly not enough because they are the best in the business. How YouTube doesn't get called to these hearings is just beyond me. I mean, it is incredible. It's so highly used by kids. Alex, does YouTube have any child safety issues? Is it a platform that might be harmful to kids?

Alex Stamos: It's possible that YouTube has some significant child safety issues both on the passive and active side. It's possible that this week somebody was beheaded live on YouTube and that the video then circulated for hours. That's the context in which YouTube somehow avoids getting called. It is totally amazing. It is shocking.

For Discord, Discord is an important platform for teenagers. I'm glad they were there, but YouTube is probably a thousand times more hours spent than Discord, maybe 100 times. We'll have to find the stats, but it is not even close, the importance of YouTube versus Discord for teenagers. It is unbelievable.

Evelyn Douek: Right. So YouTube gets away. I mean, obviously YouTube is also a subsidiary of another company, which is somewhat important in this space that also somehow didn't have to turn up to this hearing.

Alex Stamos: Right, because everybody still uses Ask Jeeves. If the current CEO of Yahoo was there, that's what would've really made it obvious that there's something going on. Because the idea that Google, the most important company on the internet, has no representation there is freaking amazing. It's completely and totally amazing.

Evelyn Douek: Yeah, just full credit to their lobbyists. I bow down before you. I take my hat off. I am obviously not speechless because I have plenty to say about it, but pretty astounded.

Alex Stamos: Is this going to be the end? Because I'm not sure where we go from here for being shocked on Google's ability to walk magically through walls in Congress.

Evelyn Douek: Well, yeah, I don't know either. Well, let's talk about Google, because one of the things that did come up in the context of solutions to some of these problems, one of the things that got thrown around and that you've talked about and have been talking about since, is this idea that when we're talking about parental control, it should rest with the app stores rather than the particular platforms.

And this is something that Meta has been pushing and now the cynical response is that obviously Meta is pushing that because it offloads the responsibility from them to verify the age of their users to the app stores. But you've been pretty outspoken about this in saying that you actually think it's a good idea. So why is that?

Alex Stamos: Yeah, there are two kinds of controls here. First, you have to decide whether or not an app gets installed on a phone. There are already pretty good controls in both Google's and Apple's ecosystems for parents to say, "My kid has to get approval to install an app." They aren't highly used, but those exist. What doesn't exist is that once you allow a kid to install an app, if they are responsible for setting up the app and setting up the account, then they are the one who has to put a birthdate in.

There is a shocking number of users of Instagram who were born on January 1st, 1900. They look fantastic. From the photos you see, what is your skincare regime, ma'am? Because for 123 years old, you look like a 14-year-old. It's shocking. Really should be in a L'Oreal ad or something. Age verification is a humongous problem. And what we've seen is... We're going to talk about the bills, but a constant you see, in the US and around the world, is bills that have some kind of age verification component.

Whenever you want to do age verification, one option would be to have hundreds and hundreds of different apps each do it, or you could do what I have been pushing for years now, and finally Mark Zuckerberg gave it a shout-out, which was nice. He didn't give me any credit. I got very little credit at this hearing for a number of things that were cited. We'll talk about that, but it's fine.

But what you really should do is use the most highly leveraged moment a parent has with a teenager and a device, which is the moment they hand it over and do initial setup. What I'd like to see is when you hand over an iPad to your kid, you set it up and you say a 13-year-old is using this. You put a birthdate in there. And then that is passed through an API to all of the apps. So if you allow them to install Instagram, Instagram knows they're 13, Snapchat knows they're 13, TikTok knows they're 13, Discord knows they're 13.

And what that allows is, first, the decision of whether or not you're allowed on the app at all, which again is already enforceable by the app stores. But what's more important is that 13, 14, 15, 16, you will treat those ages differently. And when we talk about age-appropriate design codes, which I think are a good idea, I think it's a good idea for these companies to have age-appropriate design and to gate certain features if you're 13 versus 17 years old. But that doesn't do anything if the person says they were born January 1st, 1900.

And so you want some kind of somewhat reliable number, done in a way that doesn't require every single adult to turn over their driver's license to create an account, which is the way the People's Republic of China solves this, and which Tom Cotton is, I think, for as a protection, while also being against the Chinese Communist Party. He doesn't see the cognitive dissonance there. If you don't want every adult to have to show their ID, then you need some kind of mechanism that makes it easy for parents to do this.

And from my perspective, it should be locked to the device, so that changing it requires completely resetting the device, which will either require a parent PIN or throw off a notification to the parent account, stuff like that. There's a bunch of different ways you can make that difficult for kids. And so Zuck said he basically wants that, and I'd love to see that. I'd love to see that as a self-regulatory thing.

I'd love to see Google and Apple do that together with Facebook and Snap and the other companies before a bill passes because it would just be much better for us to do this voluntarily instead of the very challenging First Amendment issues that come up from it being legislated.
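To make the mechanism concrete, here is a minimal sketch of what a device-level age signal could look like, assuming a hypothetical OS API. The names (DeviceAgeProfile, AgeBracket, bracket) are illustrative and are not Apple's or Google's actual interfaces; the point is just that a birthdate entered once at setup, locked behind a parent PIN, gets surfaced to apps as an age bracket instead of being self-reported to every app.

```python
# Hypothetical sketch of a device-level age signal; not an existing Apple or
# Google API. A parent enters a birthdate once at device setup, locks it with
# a PIN, and apps query an age bracket instead of asking for a birthdate.
from dataclasses import dataclass
from datetime import date
from enum import Enum


class AgeBracket(Enum):
    UNDER_13 = "under_13"
    TEEN_13_15 = "13_15"
    TEEN_16_17 = "16_17"
    ADULT = "18_plus"


@dataclass
class DeviceAgeProfile:
    birthdate: date        # entered by the parent during initial device setup
    parent_pin_hash: str   # changing the birthdate requires this PIN or a full reset

    def age_on(self, today: date) -> int:
        years = today.year - self.birthdate.year
        if (today.month, today.day) < (self.birthdate.month, self.birthdate.day):
            years -= 1
        return years

    def bracket(self, today: date) -> AgeBracket:
        """What an app would see through the API: a bracket, not the raw date."""
        age = self.age_on(today)
        if age < 13:
            return AgeBracket.UNDER_13
        if age < 16:
            return AgeBracket.TEEN_13_15
        if age < 18:
            return AgeBracket.TEEN_16_17
        return AgeBracket.ADULT


# An app like Instagram or Snapchat could query this at account creation and
# gate features accordingly, instead of trusting a self-reported "born in 1900".
profile = DeviceAgeProfile(birthdate=date(2011, 6, 1), parent_pin_hash="<hash>")
if profile.bracket(date.today()) != AgeBracket.ADULT:
    print("Apply teen defaults: restrict DMs from unknown adults, hide from search")
```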

Evelyn Douek: Right, yeah. I mean, there's, of course, all of these questions in this area about the contrast between what is effective and what might work and be a good policy measure. And especially when we're talking about age appropriate design codes and all of those sorts of things, that might be exactly the kind of thing that is appropriate for a platform to implement.

But the question of whether it can be legislated and mandated, with various different governmental actors having enforcement authority and the ability to prosecute platforms over these issues, that's a whole different can of worms, which we can come back to. There's going to be all of these First Amendment problems. And age verification, I mean, famously the Supreme Court has repeatedly struck down a whole bunch of age verification requirements, so it's unclear how that would fare now.

But in terms of thinking about voluntary solutions, maybe that's a good one. I think it's also a good one. Because one of the themes of the hearing was there was a lot of emphasis on giving parents control, and I think it was Amy Klobuchar that made this point actually, and I think it was a good point amongst a bunch of stuff about how parents are overwhelmed and parents don't have the capacity and the time and the ability to be micromanaging everything that their kids do.

And so it's nice in theory to say, oh, we don't want to stand in the role of the parent and we'll leave the parent to make these decisions for their kids. That's very nice. But in practice, that breaks down a lot.

Alex Stamos: Just as a parent, the idea that your average parent can out-tech their teenager... Not to be too arrogant, but I have done cybersecurity my entire adult life. In fact, most of my teenage life. This is what I do professionally, and my kids have defeated some of the things I've put in.

Evelyn Douek: Wow, that's impressive.

Alex Stamos: They both get punished while I shed a single tear. Go to your room, but I'm so proud. The idea that your average parent is going to outwit their teenager. Which is why, from my perspective, you have to do it in a way where parents know the feature exists and it's very simple. And so something like, what is the age of the person who will use this device, gets locked in, set a pin, and do not give it to your kid.

Those two steps being two of the first things after you choose your language on an iPhone or an Android device, I think, would be realistically the best you could do. Again, both Apple and Google have incredibly complicated parental safety controls. The studies show they're used by one or two percent of parents on teenagers' devices.

Evelyn Douek: Right. Okay, so that addresses the question of knowledge: at least then the platforms would know the age of their users and they can't plead ignorance. There's obviously a whole bunch of questions about what flows from that. What protections and features do you put in place after you know that you have these children on your platforms in order to keep them safe?

And that's a separate question. You mentioned that you didn't get credit for a bunch of things that got mentioned in the hearing. My next question to you is, you and Ted Cruz tight, have you been hanging out with him and briefing him on your work?

Alex Stamos: Yeah. So of all the people to put up a poster behind himself with a cut-and-paste out of a report that I helped write, Ted Cruz was not high on my list. Ted Cruz has written a very nasty letter to Stanford with a bunch of factually incorrect information about our work on political disinformation. But then he also very heavily cited our work from last year on the sale of child sexual abuse material on Instagram and Twitter and a variety of other platforms. So you can go to io.stanford.edu and you can see it.

It's under SG-CSAM, so self-generated CSAM. We've talked about it on Moderated Content, but the idea is that you have accounts that are at least purportedly teenagers themselves who are running an illicit OnlyFans on a combination of platforms, and the most important platform there is Instagram. And what he showed was a screenshot that David Thiel, our great colleague at SIO, took when he was doing this work, which is: if you search for certain hashtags to find this kind of material, and I'm not going to say any of them, but let's just say they're obvious.

If I said them right now, you'd be like, "There's no possible way that that hashtag has any potential positive use, unless you're really into pedometers measuring the number of steps you take." But for the most part, for these hashtags there's no legitimate possible use. And Instagram will pop up a box that says, "You did a search term that might lead you to child sexual abuse material. Do you want help or do you want to continue?"

And Ted Cruz, to his credit, made a reasonable point: in a situation where you put that box up with a high level of probability that the content's bad, there should be no continue button. It should only be, "Here is the hotline you can talk to if you want to deal with your issues," which I think should exist. I think we should have those interventions. Google has tested those interventions. To their credit, if you search for certain terms on Google, it will pop up, "Hey, you might need some help, dude. Call this number," at least for the high-probability ones.

Now, Zuckerberg's response here was, "Well, we're not sure when we put that up." And that's true for some things, but there's a bunch of the hashtags we looked at where there's no possible way there is anything positive on the other side of that search term. And so in those circumstances, the search should be completely blocked. It should return nothing, and that should be all you get, right?

And if you do enough of it, you should probably get your Instagram account timed out first, like, "Hey, you searched for a lot of pedophilia. We're going to shut you down for 48 hours so you can think about what you did and so we can make it clear that we know what you're doing." And then eventually those accounts should be completely shut down. Because even if they were doing a better job, people would eventually be able to find it. They're not doing that great of a job. Our finding was that it was way, way too easy to find accounts that are selling CSAM.
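As a rough illustration of the escalation policy being described, here is a minimal sketch. The term lists, thresholds, and function names are hypothetical, not anything Instagram actually ships: high-confidence terms return nothing plus a hotline interstitial with no continue option, and repeated flagged searches lead to a timeout and eventually account action.

```python
# Minimal sketch of the search-gating escalation described above. The term
# sets, thresholds, and in-memory counter are hypothetical placeholders, not
# any platform's actual implementation.

HIGH_CONFIDENCE_TERMS = {"<obvious-csam-hashtag>"}   # no plausible legitimate use
AMBIGUOUS_TERMS = {"<borderline-hashtag>"}           # could have benign uses

flagged_search_counts: dict[str, int] = {}  # account_id -> flagged search count


def handle_search(account_id: str, query: str) -> dict:
    if query in HIGH_CONFIDENCE_TERMS:
        count = flagged_search_counts.get(account_id, 0) + 1
        flagged_search_counts[account_id] = count
        if count >= 5:
            return {"results": [], "action": "disable_account_and_report"}
        if count >= 2:
            return {"results": [], "action": "suspend_48_hours"}
        # No "continue anyway" button: return nothing plus a help resource.
        return {"results": [], "action": "show_hotline_interstitial"}
    if query in AMBIGUOUS_TERMS:
        # Lower-probability terms: warn and offer help, but allow proceeding.
        return {"results": [], "action": "show_warning_with_continue"}
    return {"results": ["normal search results"], "action": None}
```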

And then the content doesn't actually move across Instagram, because the thing that Facebook does better than anybody else is scan for the content. So you would never move it across Instagram; they pivot to other platforms like Telegram to actually do it. And as we've discussed, instead of using payments on the major platforms, they'll use things like gift cards to sell the content, which is very hard to track and hard to centrally control. But anyway, Ted Cruz used our report.

He credited it to The Wall Street Journal, which is true as far as it goes. The Wall Street Journal wrote a story about our report and they reproduced that image. I'm not going to say that was an intentional way to not credit the group that he's said nasty things about, but certainly he has talked a lot in the last six months about the importance of academic citations. He's written a number of letters to Harvard, so Ted Cruz really cares about the quality of academic citation work.

And so I just think since he cares so much, he should know obviously that the proper citation on that poster is to the Stanford Internet Observatory.

Evelyn Douek: It's a good burn. Yes, that's absolutely fair. I mean, the good news is that our millions...

Alex Stamos: It's not a burn. The man cares deeply about academic citations. He really is against inappropriate citation practices.

Evelyn Douek: No, the good news is that our millions and millions of listeners obviously now get the correction and are in the know. And so they won't be misled as to the source of this image. So that's great. Well, congratulations on making a difference.

Alex Stamos: Most of the work there was David Thiel and Renee DiResta, but it was a Stanford team that was doing that work. So Renee and David did excellent work there, and David has done follow-ups on the quality of the fixes, which have not been great, by Instagram or any of the other platforms.

Evelyn Douek: So let's talk about fixes then, because, I mean, how do we progress from here? Where do we go? I mean, four hours of hearings later, I don't know how much progress we made. I mean, I don't think a lot of news got broken in terms of new information as a result of this.

Alex Stamos: No, it's kind of shocking. Yaccarino was there.

Evelyn Douek: Oh my God.

Alex Stamos: The actual CEO of X. She's the one in charge. Every time I see a story about X/Twitter, I'm like, man, Linda Yaccarino is the decision maker. There's never been a more powerful CEO in all tech.

Evelyn Douek: Amazing how Musk... I mean, I wonder why Musk didn't get called. Just incredible.

Alex Stamos: So Yaccarino had kind of previewed what she talked about there, which is that they're rehiring for child safety. Our friend Brian Fishman has pointed out...

Evelyn Douek: No, no, no, not rehiring. Didn't you know they're only a 14-month-old company?

Alex Stamos: Oh, I'm sorry.

Evelyn Douek: They're just getting set up. They're new, so give them a break. But they're really trying their best and prioritizing this as a new company.

Alex Stamos: Yeah, it's amazing that they spent $44 billion buying just the domain Twitter.com and everything else is new.

Evelyn Douek: Which they then destroyed and got rid of. That's right.

Alex Stamos: Right. And so Yaccarino had... Our friend Brian Fishman pointed out on Threads that there's actually hard data here thanks to Australia, the country that you're not a citizen of because you're from New Zealand and a member of the Kiwi Communist Party.

Evelyn Douek: I mean, again, just...

Alex Stamos: Oh, you messed up. You admitted it. You admit it. Kiwis, come for your girl because she's not a Sheila. What is the somewhat insulting, misogynistic name for a woman from New Zealand?

Evelyn Douek: That is not a question that's going to result in good things for me if I answer it. So I'm just going to plead ignorance on that one.

Alex Stamos: Back to the point: because of an Australian eSafety Commissioner report, we actually could see the huge dearth of trust and safety employees, how many have been fired. And so rehiring 100 only brings them back up close to the baseline of what they had before, which is going to be absolutely necessary. Twitter is much less important to teenagers, so the direct abuse that they talked a lot about is much less common on Twitter because you just don't have 16-year-olds using it all day.

TikTok and Instagram are the platforms where direct sextortion and direct outreach to kids are a problem. But Twitter has a bunch of CSAM, and one of the things we found in our work last year is that David worked very hard on building a pipeline so that all of the data we intake into the Stanford Internet Observatory gets scanned for CSAM. And if something hits, it gets automatically encrypted and reported to NCMEC.
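For a sense of what such an intake pipeline looks like, here is a minimal sketch under simplifying assumptions: it uses a plain cryptographic hash against a known-hash list, whereas real pipelines use industry perceptual hashes and authorized reporting channels, and all function names are placeholders rather than the actual Stanford Internet Observatory code.

```python
# Sketch of a research-intake scanning pipeline of the kind described above.
# Simplified: uses SHA-256 against a known-hash list, where real pipelines use
# perceptual hashes (e.g., PhotoDNA/PDQ); function names are placeholders.
import hashlib
from pathlib import Path


def load_known_hashes(path: Path) -> set[str]:
    # A real hash list comes from an authorized matching program, not a flat file.
    return set(path.read_text().split())


def encrypt_and_quarantine(image_path: Path) -> None:
    ...  # encrypt the file and move it out of the research corpus


def report_match(image_path: Path, digest: str) -> None:
    ...  # file a report through the appropriate legal reporting channel (NCMEC)


def scan_and_route(image_path: Path, known_hashes: set[str]) -> str:
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    if digest in known_hashes:
        encrypt_and_quarantine(image_path)  # never enters the dataset unencrypted
        report_match(image_path, digest)
        return "reported"
    return "ingested"
```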

Not to preview the numbers too much, but it turns out there's a major phone manufacturer, and the Stanford Internet Observatory is now a better reporter of CSAM than they are. So we'll talk more about this when the numbers come out in a couple of months. So we try to be very careful to never intake that kind of content. And what we found in doing our research is that all of a sudden, we started getting these hits from Twitter because their CSAM scanning just broke.

And so for them, it is less about the direct outreach to kids like you have on Instagram and more about adults creating hashtags and then using them to entice people to buy. Part of Twitter's problem is that they allow nudity. So you end up in a much more difficult situation: in theory, Instagram and Facebook should just kill something if there's a naked person, no matter what their age. If you allow nudity, you end up with the question of whether that person is 17 or 19, which becomes a really interesting challenge that is difficult for companies.

So anyway, there was this hiring, but that was about the only news. There was a lot of repeating what they're already doing. I was kind of shocked, because I expected especially Zuck to come with, "We're doing these five things," and that didn't happen.

Evelyn Douek: I'm glad you mentioned Yaccarino though, because one piece of news that she did break was actually my favorite moment of the hearing. Senator Blumenthal was asking the CEOs if they supported the Kids Online Safety Act, which we can talk about more in a second, and Linda Yaccarino, and it's unclear whether she knew what this piece of legislation was before being asked, tried to give this word salad, which was as non-committal as possible, but seemed to be in support of KOSA.

So: "We support KOSA and we'll continue to make sure that it accelerates and make sure it continues to offer community for teens that are seeking that voice," which I don't know what that means, and twirling, twirling, twirling towards freedom. It's word salad, but X hadn't previously come out in support of KOSA.

I think it was also news to the not-CEO Elon Musk, who then later that day tweeted, or posted on X: "When you hear the names of legislation or anything done by the government, it is worth remembering that the group that sent so many people to the guillotine during the French Revolution was called 'The Committee of Public Safety', not the 'Cut Off Their Heads Committee.'" So he's obviously a big fan of the Kids Online Safety Act.

Alex Stamos: Yeah. I'm getting a real Marc Andreessen "enemies of progress" feeling from Elon. So let's see how long the hiring of trust and safety folks lasts when he finds out he has to actually pay them to do that work.

Evelyn Douek: But the thing is, this is, again, the meme of the worst guy making a decent point. I mean, the Kids Online Safety Act is not a good piece of legislation. It's a deeply flawed piece of legislation that we've talked about somewhat on the podcast before. But in broad terms, it imposes a duty of care on platforms to prevent "harmful content" in certain areas from reaching minors. And certainly a lot of this content would be constitutionally protected speech. And the big problem is that state attorneys general have the power to bring enforcement actions under the act.

The obvious problem this creates is that many state attorneys general have radically different views than I might of what's in the best interest of a child or what's harmful content to a child, and they've been pretty explicit about it. Some of the bill sponsors, for example, Senator Blackburn, have said that one of the goals will be that it helps protect children against "the transgender."

Alex Stamos: So yes. I mean, there are a bunch of acts here, I think. So when we talk about solutions, let's talk about platforms first. There are things platforms can do. They can hire like they're doing. I think Instagram needs to hire. Meta needs to rehire. They've laid off a bunch of safety people. And when you look at the stuff we found, it is not stuff that you find through magical AI. It's not stuff you find through lots of engineering work. You just need to hire investigators whose job it is to ask, if I were a pedophile, how would I find content, and then go do it.

And then go pull the string on who's selling this, who's buying it, investigate, build up the Maltego graph, pull all this data down, wrap it all up, send it to NCMEC in a report, and then hopefully six months later there are arrests. That's what needs to happen for a bunch of this stuff. It is not super high-tech. For the really bad things, you have to do that kind of work. The high-tech part for Instagram, and for Discord and TikTok in particular, is that one of the real problems in the last year is that sextortion has become a big business.

Now, there has always been a dichotomy where sextortion is either content-based, where the bad guy wants it for the content, or it's financial, where their goal is to get a kid, get them to send an image, and then blackmail them for money. That has always existed, but the ratio here has massively changed, in that you now have these really large scam groups that operate out of especially Nigeria and Côte d'Ivoire and a couple of other countries like that, and they have pivoted from your traditional 419 scams and other kinds of things to sextortion.

And unlike a 419 scam, sextortion in a tragic number of cases leads to the suicide of a child. Because unlike an adult, they don't see a way out other than to take their own lives. And so for the sextortion side, the key things here are, one, intelligently looking at outreach by adults. So if somebody creates an account from Nigeria and all of a sudden it messages 10 teenage boys, that should be an alert. That's something that needs to be paused and a human being needs to look at.

The second is age-appropriate design: if your app knows that somebody is young, or can guess that somebody is young, then totally disable certain kinds of messages and don't allow adults to discover them in searches. Things like that are important, which brings us back to the app store. You can build those protections, but as long as people are lying about their age, it's not going to help.

But I think those are the things that platforms need to be doing, whether or not there's legislation, because the toll of financially motivated sextortion right now is absolutely horrendous. It's really, really terrible.
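To make the outreach signal concrete, here is a minimal sketch of that kind of rule. The thresholds, field names, and inputs are made up for illustration; it only shows the idea of pausing a brand-new adult account that suddenly messages many apparent minors so a human can review it.

```python
# Sketch of the outreach-pattern alert described above. Thresholds, field
# names, and the "recent recipients" input are hypothetical illustrations.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Account:
    account_id: str
    created_at: datetime
    is_minor: bool          # from the age signal discussed earlier, if available


def should_pause_for_review(sender: Account, recent_recipients: list[Account],
                            now: datetime) -> bool:
    """recent_recipients: accounts the sender messaged in the last 24 hours."""
    is_new_account = (now - sender.created_at) < timedelta(days=7)
    minors_contacted = sum(1 for r in recent_recipients if r.is_minor)
    # A brand-new adult account messaging many apparent minors in a day is a
    # strong sextortion signal: pause outbound messages and queue for review.
    return (not sender.is_minor) and is_new_account and minors_contacted >= 10
```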

Evelyn Douek: Okay, so that's on the platform side. Let's talk about on the government side, and I know you have a lot of thoughts here because you've been working with NCMEC about what they can do and what the government can do in terms of supporting them. So what needs to be done on that side?

Alex Stamos: Yeah. So on the government side, there are five bills. You've talked about one of them, the Kids Online Safety Act. They vary in how aggressive they are. Some of them are more about the passive versus the active harms. To me, the test here is that there is a bill that passed out of the Senate unanimously last year called The REPORT Act that is not that controversial. I don't think anybody's lobbying against it. It does not seem to have any civil liberties impact. It increases the number of things that have to be reported to NCMEC.

It amends 2258 to include other crimes, but it's still under the standard that companies can look voluntarily. For example, I think sextortion is in there. If a company knows sextortion is happening, it's because they're looking, right? And a company that's looking is probably reporting into NCMEC already. So there's not a lot of change there. It might be interesting from a Fourth Amendment perspective, and we can talk about that, but for the most part, I don't think it has a practical impact there.

The big change is it changes the amount of time that companies have to hold onto evidence of child sexual exploitation and are not supposed to delete it, which is a key thing, because one of the things we have found out through our research is that local police are very rarely able to get search warrants within the 90-day window in which most companies delete the content. And so this extends it to a year, which is good. And the other thing is it allows NCMEC to modernize their systems and to use cloud computing. So we're going to have a lot more talk about this.

This week I actually got to visit the National Center for Missing & Exploited Children. I was there on the day of the hearing, which was an interesting experience. And I was there with Shelby Grossman, Renee DiResta, and Riana Pfefferkorn, and we're working with David Thiel on a big report on everything CyberTipline related. We have a lot of things coming out of it, but the thing that Congress can do right now is The REPORT Act, which is sitting on somebody's desk in the House, and I don't think it even has any sponsors in the House.

I was telling you before, I ran into a member of the Bay Area delegation in the United Lounge on the way back this week. And so maybe I can get a sponsor with some appropriate emails. Funny enough, if United giving me Global Services leads to The REPORT Act getting passed, then I'm going to write the United CEO like a nice little letter. But if you're listening to this and you're a congressional staffer, go look at The REPORT Act because it's not controversial. Nobody's pushing against it.

Everybody's in support of it. It just needs to move. Somebody needs to sponsor it, push it through committee, and then vote. The Senate's already done their work. Because once you do that, it's still going to be a year of work to modernize NCMEC's systems, so let's not wait any longer. But the rest of the bills, I think, are actually much more problematic for the areas that you study, right?

Evelyn Douek: Yeah, for sure. So I mean, we'll link to... You mentioned Riana Pfefferkorn. She's done a good briefing on The REPORT Act, so we'll link to that in the show notes. But exactly as you say, it's a narrow law targeted at a specific problem. And as a result, it doesn't promise sweeping reform and saving all the children forever. And so it doesn't attract as much political attention or political momentum.

Alex Stamos: It just solves problems without hurting anybody. Who would vote for that?

Evelyn Douek: Exactly. So politics, it's great. And by contrast, the bills that do promise to fix everything and solve everything in one big swoop, just by making sure that the bad guys are finally responsible for all of the bad stuff that they're doing, they're the ones that get a lot of the focus and the momentum. But they're poorly drafted and raise a lot of concerns with a lot of civil liberties groups. So we talked about the Kids Online Safety Act and the obvious problems there with state attorneys general and the definition of "harmful content."

We've also talked before on this podcast with Riana, and again, I'll link to her great analysis of both the EARN IT Act and the STOP CSAM Act, both of which raised a lot of concerns with civil liberties groups about threatening end-to-end encryption and also reducing Section 230 protection for certain offenses, but with very low mens rea standards, which creates all of the incentives to remove a whole bunch of protected speech. I mean, a lot of these bills are inevitably going to have First Amendment problems and First Amendment challenges as well.

And so at the end of the day, we end up at the end of this hearing with... we didn't break a lot of news. We maybe have some small ideas for incremental improvement. We have these bills. The most effective bill is probably not moving anywhere, and the other bills are probably not moving anywhere either. But even if they do, they're probably unconstitutional and going to make things worse. So I don't really know at the end of the day whether we made any progress this week. What do you think, Alex?

Alex Stamos: Again, I expect that at least Mark Zuckerberg, because it was very personal for him, went back and is focused on this, although the next day Facebook had earnings. Now, Facebook's stock apparently has gone up today. Facebook added more market capitalization today than any other public company in the history of capitalism. So for a great measure of... He's had a...

Evelyn Douek: Yeah, they really showed him. They cut him down to size.

Alex Stamos: Now, again, it's a little unfair because Facebook does more for child safety than all those other companies up there combined, to be honest, but still has more to do. So it is both unfair to blame him and fair to do so because there is plenty of work. So I expect he goes back. My guess is he's not a bad person. He's a father. He's a good dad. He's going to go back and I think that's probably going to have some effect on how he talks to the child safety team, how he supports trust and safety work.

Maybe he'll read our reports. Hey, Mark, io.stanford.edu, man. Happy to come over. I'll give you a briefing in Menlo Park if you want a read out of what we've found. And for those other companies, I think this probably helped sharpen for their executives how important it's going to be to not become Mark Zuckerberg. So for TikTok, they already have a lot of people working on child safety. I expect they'll continue on that. I think Discord is a company that needs to invest a lot there.

They've become a very important platform. Still not as important as YouTube, but important in its own way. I don't expect Twitter to change anything because Yaccarino actually doesn't have any power to do the things she's talking about. So who knows? But yeah, I think that that will have some impact. God, just pass out The REPORT Act. I mean, prove us wrong, right? Pass out The REPORT Act. And then Ron Wyden has a bill to fix some things and also to fund NCMEC.

This is the other thing: these numbers are getting humongous for the stuff that's being reported by companies. If these companies go back and get really aggressive about finding sextortion and such, then there are going to have to be resources for NCMEC to process those reports and route them to local ICACs, and then the ICACs need support to be able to do it. The ICACs, the Internet Crimes Against Children task forces, are the child safety working groups between law enforcement agencies that are generally regional.

So you'll have the San Francisco Bay Area ICAC. You'll have the Dallas ICAC, which will have a dozen different law enforcement agencies in it. That's where this work happens. Most people think of the FBI as doing this kind of stuff, but the FBI does very little in the way of child safety for the most part. The biggest federal agency here is probably HSI, Homeland Security Investigations, but the vast majority of this work happens at the local level.

And those people, if these companies do their job, then they're going to need the resources to go knock and talk on the victim's door so they can tell the parents, "Hey, your kid's being sextorted. You need to take care of them," and then to go arrest the bad guy. And we're going to need some international action here too, right? As long as this becomes a thriving, thriving business in, again, Nigeria, Côte d'Ivoire, the Philippines, Romania. Romania has cracked down a bit, so it's become less of an issue.

It is going to be very difficult to protect kids if people have an economic interest to defeat whatever protections these companies put up.

Evelyn Douek: I mean, it is just obviously true. To be very clear, relying on the beneficence of the companies to take voluntary action clearly can't be enough. It's great if Mark Zuckerberg does feel personally motivated as a result of the hearings to do more, but we need legislation in productive ways, like The REPORT Act. Lawmakers also need to actually work with the civil society groups and address the problems with their legislation rather than just continually insisting that those problems don't exist and trying to move things forward.

But I won't hold my breath. Is there anything else happening in the next year that might be keeping lawmakers busy and distracted from actually the productive work of legislating? All right, so that's the hearing. Any other takeaways or sports news that you want to cover before we close out, Alex?

Alex Stamos: I mean, we have some crossover sports news, which is that apparently the Super Bowl has been rigged by the Pentagon. It is a PSYOP. And the Kansas City Chiefs will definitely be winning, according to the current conspiracy theories, because they have to win. Taylor Swift will get engaged at midfield during the trophy ceremony, and then she will endorse Joe Biden or perhaps be named as his vice president, depending on which of the theories you hear. And then she'll take over for him and become our ruler forever.

So you hear it first, the Chiefs are definitely going to win apparently because it's been rigged. As I pointed out on Threads, if you believe that, if you really believe it's rigged, then you should go to this magical place called Las Vegas and there's some people there that if you take all of your life savings and you give it to them, they will give you twice as much money back. In fact, if you put it on the Chiefs right now, you'll get a little bit more than twice your money back. And you should totally put your money where your mouth is.

So there's your sports news. It's all rigged. It's all PSYOP. Why do we even have to watch it on February 11th just to find out whether the black helicopters roll that day or whether they wait a week before they make Taylor a dictator? I guess we'll find out.

Evelyn Douek: All right, Biden-Swift 2024, it seems. There you go. All right. And with that, this has been your Moderated Content weekly update. This show is available in all the usual places, including Apple Podcasts and Spotify. Show notes and transcripts are available at law.stanford.edu/moderatedcontent.

This episode wouldn't be possible without the research and editorial assistance of John Perrino. Big thanks to John this week who covered the hearings and did a lot of work in breaking that down for us. Policy analyst extraordinaire at the Stanford Internet Observatory. And it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Hoffman. Talk to you next week.