Moderated Content

MC Weekly Update 3/13: Extremely Persuasive Dance Routines

Episode Summary

SIO's Riana Pfefferkorn joins Evelyn this week to talk warrants for abortion related data and California laws post-Dobbs looking at ways to protect sensitive information; why the FTC asking Twitter for information about its privacy practices is totally unsurprising; an anti-Jawboning bill on the Hill; this week's TikTok tick tock; and a Utah "think of the children" social media bill.

Episode Notes

Stanford’s Evelyn Douek and Riana Pfefferkorn weigh in on the latest online trust and safety news and developments:

Join the conversation and connect with Evelyn on Twitter at @evelyndouek.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Evelyn Douek:

Before we jump in, I don't know if you have any never before heard of sponsors that you might want to mention and thank for the episode. I don't know if there's some random, I don't know, like a mattress company that has particularly good cash storage compartments under the bed. I don't know, anything along those lines.

Riana Pfefferkorn:

Sneakerly, for when you need to literally run to the bank in order to remove all of your money and shove it into dubious cryptocurrencies. You want to be on the top of your game because you don't have to be first, you just have to be faster than the slowest person to try to redeem their deposit.

Evelyn Douek:

That's great.

Riana Pfefferkorn:

Sneakerly sneaky.

Evelyn Douek:

Hello and welcome to Moderated Content's weekly news update from the World of Trust and Safety with myself, Evelyn Douek. And Alex is out this week. I briefly thought about using generative AI to sit in for him, but then I thought for two seconds about what a bot trained on the publicly available statements that Alex has made would say and thought better of that idea. So Plan B actually is much better than that, and instead we are joined by Stanford's very own Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory. Riana, thank you so much for stepping in.

Riana Pfefferkorn:

I'm now kind of disappointed that it's me instead of the Alex ebooks version of what this podcast would've done. But thanks for having me here today anyway. Hopefully I can be at least as entertaining as the AI version of Alex.

Evelyn Douek:

Well, it's definitely at least as entertaining, and absolutely professionally less risky for me at least, so that's good. Now as it happens, this is a great week for you to be on because there are a few things that you are perfectly placed to comment on and give updates about. So the first is an update to the "comprehensive segment" that Alex and I did last week about law enforcement demands for data from tech companies, particularly in relation to abortion prosecutions. And you listened to the episode and wanted to flag a few things that our, again, "comprehensive segment" could have used to maybe be a little bit more comprehensive. So why don't we go to the first, which was a law passed in California last fall to deal with exactly the kind of issue that we were talking about. So tell us about this law.

Riana Pfefferkorn:

Sure. So on last week's episode, you were discussing what the major online service providers could be doing to try to push back against warrants or other legal process that they might receive from anti-abortion states trying to investigate people who have had or sought or assisted in or provided reproductive care. And it turns out that actually the pro-choice states, California included, have been thinking ever since the fall of Roe last summer about how to protect abortion seekers and providers within their states, even if they may be coming from out of state. And so one of the things that California has done is that last fall, it passed and the governor signed a law that is intended to bar California corporations from complying with warrants from out-of-state entities seeking information that they know or should know is related to what the law deems a prohibited violation.

Which basically means that if it's abortion care that would be legal in California but would be illegal in another state, then the in-state electronic communication service provider or remote computing provider is prohibited from complying with that warrant or other form of legal process. This is intended to try to protect private messages, other information, potentially location data, from being disclosed in response to legal process that would otherwise be compulsory. Since, as you guys discussed on last week's episode, the typical practice is for every state to honor legal process coming from other states. We are a federalist country, but in general we try to have rules in place to keep things running fairly smoothly.

And so up until the passage of this law, California had a law that said without exception that if you are a California entity and you receive out-of-state process, then you have to comply with that process. Fine. This is now a carve out that would try to shield people who may not even live in California, but who may seek abortion care here or who may be using services provided in California by electronic communication service providers in order to communicate and seek abortion care.

Evelyn Douek:

So we mentioned that this was passed last fall. Do we know anything about whether this has been particularly effective, the effects of this, how it's been enforced in practice? Anything about... Yeah.

Riana Pfefferkorn:

I don't know yet. It's a great question, and this is one of those things that makes transparency reporting important. The period during which this law was passed is the period for which I think a lot of entities are probably trying to prepare reports that may go out, or, if you are Twitter, may never go out again. So we don't necessarily know yet what impact this has had. One interesting, at least for lawyers, quirk of the language of the law is that it only applies to California corporations and does not apply to "foreign corporations", which in context can include out-of-state corporations. So one question that I have that I don't know the answer to, but that I'm sure a lot of in-house counsel and outside counsel have been trying to figure out since the law got enacted, is if you are headquartered in California, but you are technically organized in Delaware as a lot of tech companies might be, does this law even apply to you?

And if it does not, then how many entities does this actually end up applying to, and how many users in aggregate are actually helped by this law? So I think we may hopefully get a little more information as transparency reports come out, if entities have an incentive to try and explain more about their compliance or non-compliance with this particular category of warrants or legal process. And I think here providers have a very tricky line to walk, because half the country wants them to shield abortion privacy for people who are getting it, and, at least as to the attorneys general and leadership and legislatures of the other half of the country, they may be under significant pressure for anything they're seen to be doing that may make them look more like woke liberal entities that care about bodily autonomy.

You'll note that when Google announced shortly after the Dobbs decision that they were going to start rapidly deleting location history entries for people who have visited sensitive medical facilities, that's not just abortion clinics, it could be fertility centers, it could be addiction treatment facilities or a few other examples. They phrased this carefully: it's not about abortion, it's all of these things that are more sensitive than just the dentist. You might not mind if somebody sees you walking into the dentist, but you might mind if they see you walking into a weight loss clinic. And so that I read as being this careful way of trying to do what hopefully a lot of people within Google think is the right thing without potentially attracting too much blowback. So what the actual practical effect will be of California's law is yet to be seen.

And it's one of a number of different bills that California has been working on since the fall of Roe to try to protect abortion seekers' privacy. Another one that was just recently introduced last month is a bill that would prevent what we think of as reverse location warrants or keyword warrants, which have been used and abused by law enforcement in recent times to try and say, "We just want to find everybody who searched for a particular term. We want to find everybody whose device was within this particular geofenced area," and which have been under a lot of scrutiny for their constitutionality in recent years. And yet we have not yet seen widespread action to try and bar these kinds of unconstitutional warrants.

And as with so many other issues that have been coming up during the proliferating culture wars in recent years, this is what it took to finally get a state to try and, hopefully, pass reverse location warrant and keyword warrant bans. In the same way, a lot of the things that surveillance activists have been advocating for years, such as 702 sunsets for foreign intelligence investigations, wiretap reforms, reining in the FBI, all of these are things that suddenly we're now maybe going to get, or are closer to getting than we have been in a long time, but from unexpected corners. And so it's been a weird, "may you live in interesting times" ancient curse of a moment when it comes to privacy related topics.

Evelyn Douek:

And I mean, it's interesting because it brings up all the issues we were talking about last week as well, about how California is really throwing its weight around here as a huge economy and the home of these corporations and trying to leverage that for power. But it does raise all of these questions that we're talking about that you mentioned about comity and full faith and credit and the idea of being a United States of America. So it's one solution, but it's going to create all of these other problems as well down the road. I don't know.

Riana Pfefferkorn:

And the other thing, when we look at the language: one of the concerns, even with this bar on complying with out-of-state warrants for investigations that you know or should know relate to a prohibited violation, is that one of the things that gets you off the hook if you are a corporation covered by this is if whoever is sending you this legal process, law enforcement from another state, provides an attestation that the investigation doesn't relate to a prohibited violation. And as we've seen even since before the Dobbs opinion issued, there were plenty of investigations and prosecutions against people who had obtained abortions or even had just experienced miscarriage or stillbirth under other headings like child abuse or murder or attempted murder.

And so it is entirely possible that if an out-of-state law enforcement officer signs that affidavit, and if they even say what the investigation is about at all, which as you noted last week is not necessarily common with state level warrants, it might very well just list one of these other types of crimes that we've seen be alleged, even here in California by conservative Central Valley prosecutors in abortion related cases. And so I guess the question there is how deep will a provider actually have the incentive to dig to try and confirm? Or can they just say, well, there's the attestation that puts us in the clear, therefore we can comply with this warrant without any worries. Because I don't think it would be great PR if one of the big tech companies gets a reputation for non-compliance with warrants that have child abuse written across the top. Nobody wants that, or murder or attempted murder.

And so with a lot of these things, these efforts to try and get companies to do the right thing, if you have loopholes in them or easy workarounds, then I think we'll have to see, again, going back to the question about transparency reporting, whether we will even know. Since a lot of these investigations don't necessarily come to light and remain under wraps, will we be able to see whether this is actually having the intended effect, or whether there are ways that both law enforcement agents from out of state and corporations here in California can wiggle around them and continue providing the information? Users might potentially be lulled into a false sense of security, thinking, oh, this law exists. And it turns out we might see down the line whether it's actually as protective as it was intended to be.

Evelyn Douek:

And so those transparency reports, like Google's and Meta's, I think they haven't reported for these periods yet. They're up to June or December 2022. So those transparency reports will be forthcoming, and it'll be interesting to see if we see a massive spike in requests for data and then maybe the percentage of requests for data that are complied with goes down. There will be a real question about whether the data's going to be granular enough for that to actually be useful information. You can look at it at a country level, but that's not necessarily going to give you enough specific information. It also goes back to this: it's all voluntary and it's not verified, and we don't know a lot about the processes behind collating and creating this information, and the incentive structures that you were talking about for companies when they're releasing this information.

So it would be great as well to have more transparency from governments about what's going on and how this is being enforced, but we won't hold our breath. So the next area that it's really great to have you on to do an update for is maybe a little counterprogramming to last week's somewhat critical segment on the FTC, which is a story about how the agency is just trying to do its job. So there's been an outrage fest this week about the FTC's investigation into Twitter's compliance with its consent decree with the agency. The House Judiciary GOP Twitter account tweeted that the Federal Trade Commission sought the identities of JOURNALISTS in communication with Elon Musk, and asked why isn't every journalist on this platform OUTRAGED at such government overreach. This is in letters that have been sent by the agency to Twitter from November 10 last year up until February.

Now as it happens, we did a podcast episode about this on November 17th last year, which we'll link to in the show notes because it is still a great primer on what is going on, because what is going on is exactly what you would expect to be going on, as we talked about in that episode. That episode was titled "Elon Musk Puts Rockets into Space, He's Not Afraid of the FTC," because that's something that Musk's then lawyer said when people suggested that maybe he should be a little bit worried about the fact that he's not complying with the FTC's consent order. And it turns out that if you thumb your nose at a federal agency, they don't just sort of pack up their bags and go home and decide to just not enforce the law. So Riana, tell us a bit about what this story was about, what the outrage fest was about, and whether any of this surprised you or what we should know about what's going on.

Riana Pfefferkorn:

So first, let me start by saying, as I did the last time I was on to talk about Twitter, that I used to be Twitter's outside counsel, but I was their outside counsel after the first FTC consent decree that Twitter entered in 2011, and I left my firm years before the second one in 2022. And so all of my discussion here is on the basis of public information and not on information gleaned during the scope of that representation. With that out of the way, as we talked about on the last episode, Twitter has these consent decrees in place, one from 2011 that was then revised and updated last summer in light of new developments that pertained to its privacy and data security practices. In the 2011 consent decree, one of the major concerns was that too many people internal to Twitter had access to users' accounts, including their direct messages, for example.

More recently, the consent decree from last summer was primarily about the use of phone numbers provided for security purposes that were then being used for targeted advertising purposes instead. Nevertheless, both the first time and more recently, Twitter had to promise to step its privacy and data security game up: to have stricter controls on who can access internal data, to only use information provided for security purposes for that purpose, and to stop misrepresenting whether they were using it for other purposes instead. So now fast forward to March of this year, and we've seen that, as expected, the FTC has been exercising the oversight authority that it is granted as part of the 2022 consent decree, which says that they can do compliance monitoring, and that when they send a written request to Twitter, then Twitter has to send additional compliance reports and requested information sworn under penalty of perjury. They have to appear for depositions, they have to produce records.

The FTC is allowed to demand that representatives be made available for interviews, where they can have counsel present. So there are all of these ways that the FTC now has oversight over Twitter and the ability to investigate whether or not it's actually doing what it said it was going to have to do on the privacy and the security tip. The objections and the outrage that we're seeing now are coming out of the House committee on the weaponization of the federal government, which is a title that is subject to multiple interpretations, let's say. The committee is outraged at the fact that these data demands have been going out pursuant to Twitter's compliance monitoring agreement with the FTC. And the objection seems to be, per the report that the committee issued, that yes, they do have compliance monitoring capabilities under this consent decree, but that a lot of what the FTC is demanding, according to the report from the committee, goes beyond the privacy and data security scope of the consent decree.

It's weird enough as it is that a congressional committee is dragging into the spotlight the ongoing internal investigations within an executive branch agency; disputing them and shaming the agency for sending them does not seem like the kind of staying in your lane that I would ordinarily expect two co-equal branches of government to try to do. But to the extent that the report from the committee is calling out these FTC data demands for how allegedly overbroad they are, how frequently they have been sent, and how much information they're asking for, it is hard to evaluate those claims by the committee report, because while they selectively quote from the FTC's demands and have citations for the dates that they were sent, they don't actually disclose the contents of those demands from the FTC.

And so it's hard for us as outsiders to review them and try and make heads or tails of whether the claims in the report, that these are overbroad, that it's a fishing expedition, that some of them have no logical connection to user privacy, are true or not. It's entirely possible that these demands do go very broadly and are asking for more than what is in the scope of the FTC consent decree over privacy and data security. But without seeing them, it's hard to know. It's not as though overbroad fishing expedition demands from regulatory agencies to private companies are solely one or the other political party's remit. This is an issue that crosses party lines. Weirdly enough, I don't think that any of these members of Congress who are angry about the FTC's demands to Twitter were the ones pushing back or raising the hue and cry any of the many times that Republican state attorneys general have sent fishing expedition data demands to big tech companies.

And they were getting some pushback from those companies and from other observers. I've litigated at least one of those myself in my past role as a lawyer. So absolutely, I can tell you that a lot of the time, if you are outside counsel for a tech company that receives a subpoena or civil investigative demand or whatever, and here this is within the scope of an existing consent decree that says we can send you information demands, you might very well say this is very overbroad and try and negotiate with the agency to try and narrow it, or even litigate and challenge the ability to do it in the first place, if it might be seen as retaliatory or politically motivated, or not really an actual attempt to gain the information that is being requested but rather something punitive.

And even ongoing today, the SEC has a wildly overbroad subpoena to a major law firm trying to learn the identities of all of the firm's clients that may have been affected by a data breach at the law firm. This is a huge issue within legal circles because it seems to contravene basic professional responsibility obligations on the part of firms. Weirdly enough, the committee on the weaponization of the federal government isn't looking into the SEC subpoena. All that's just to say there are all of these issues around this demand from the FTC, or series of demands, to hear the report tell it, about whether or not the FTC is going beyond what it is permitted to do within the scope of its authority and within the scope of this consent decree. But all of this is stuff that we basically could have predicted, especially to the degree that so much of this seems to have to do with-

Evelyn Douek:

The Twitter files.

Riana Pfefferkorn:

I think that's a little more-

Evelyn Douek:

Twitter files.

Riana Pfefferkorn:

... aggressive. The Twitter files.

Evelyn Douek:

Yeah, we need Alex, damn it. That one the bot definitely could have handled.

Riana Pfefferkorn:

We're going to have to get the bot for that one. But some of the characterization is that it's trying to find all the times that Twitter talked to journalists, and the journalists are engaging in First Amendment protected activities and trying to expose big tech's abuses to the public, and it's like, okay, I get what they're trying to say. It's nice to see journalists and the First Amendment being trumpeted by a Congressional committee. But on the other hand, as you and I know, the First Amendment does not protect violating the law. And what the FTC is trying to look for here is not whether these journalists are engaging in their First Amendment protected newsgathering rights.

They're trying to determine whether Twitter has violated the law, whether that is violating its legal obligations under the consent decree with respect to protecting user privacy, or whether it may be violating other underlying laws such as the Stored Communications Act, which regulates whether and under what circumstances entities like Twitter are allowed to release the contents of their users' communications. And there have certainly been some worries, and we talked about this the last time, that that might actually be happening. There's been a lot of concern that Elon Musk himself or Ella Irwin, the head of Trust and Safety, might just be spewing the contents of DMs of, for example, journalists that Elon Musk does not like to the public or to a select list of journalists that they do decide that they like.

Evelyn Douek:

And for listeners who are interested in that, we had Oren [inaudible 00:22:38] do a cameo for our episode on December 13, 2022, in the light of when Bari Weiss had released the Twitter files and there were some screenshots in there where it looked like she may have had access to individual users' DMs. Now that was denied at the time; it actually was a screenshot from Ella Irwin, and the journalist didn't get access to DMs. But there's certainly enough information there that we've had questions about the Stored Communications Act, and it seems unsurprising that the FTC might have them too. And so to rehash: these are consent decrees from 2011 and 2022, before Musk bought the company. Within weeks of Musk owning the company, the chief privacy officer, chief information security officer, and chief compliance officer all resigned on the day that they were supposed to issue a compliance notice to the FTC.

And that, combined with these screenshots and other reporting and things in the air, means it's not a surprise that there were these information requests to Twitter. And it's not to do with Musk's ownership, it's not, as is being suggested here, because everything's part of the culture war or has something to do with the new free speech regime. It almost would be a dereliction of duty for the FTC not to be looking into this kind of thing, which is what we discussed on the episode. So we will see where this heads, and like I said, go back and listen to the other episode if you want more information about that. Meanwhile, House Republicans have also passed a bill called the Protecting Speech from Government Interference Act. It's an anti-jawboning law, which seeks to prohibit any member of the federal government from influencing or coercing, or directing another to influence or coerce, a private platform to engage in any kind of censorship, where censorship is defined as the removal or suppression of lawful speech or the addition of any disclaimer, information or other alert to lawful speech being expressed on a platform.

So basically, this is an extremely broad statute that prohibits most forms of communication with platforms about speech. Now this, again, is political, it's part of the culture wars. This is about Republicans who are concerned about Democrats pressuring social media companies to suppress content, including about Hunter Biden and COVID misinformation. On the other hand, this mirrors a lot of what civil society has been talking about for a long time in terms of concerns about government censorship. And so there are legitimate issues here that are not being properly aired or properly discussed because of the culture war aspect of this.

And there are reporting requirements here to try and make this whole process much more transparent. And so it would be great if we could have a legitimate, measured conversation about jawboning concerns and potential measures that Congress could take to prevent illegitimate jawboning, consistent with that nice thing we mentioned called the First Amendment, to protect the First Amendment, but I don't think that's what's happening here.

Riana Pfefferkorn:

We should at least be grateful that they didn't decide to backronym jawboning into an extremely tortured title for this bill. It's almost refreshing to just see a straightforward label on the tin instead of something that takes a very long message and turns it into a long string of nonsense words.

Evelyn Douek:

Well, speaking of which, here is the TikTok tick tock for the week. So thank you for the segue, because Congress is so sure that government officials should never be in the business of trying to restrict speech on social media, as per the anti-jawboning bill, and yet here we have Congress, with Senators Mark Warner and John Thune introducing the Restricting the Emergence of Security Threats That Risk Information and Communications Technology Act, or the RESTRICT Act. This is one area where I really think chatbots have the potential to be very helpful, which is legislative acronyms. So I don't know if we owe that one to ChatGPT or not, but I think there's room for improvement. We've talked about TikTok a lot on the show, so not to belabor the point, but this is the most momentum that we have seen in this area. It would give the Secretary of Commerce authority to ban technology companies with ties to foreign adversaries, including TikTok.

And the White House applauded this legislation and urged Congress to pass it very quickly. So certainly we can see the snowball effect happening here. It doesn't require the banning of the app, it would just enable it. And so we'll see what happens once it passes. At least Senator Warner did note that this is a popular application and that it's going to be incumbent upon the government to show its cards in terms of how this is a threat, because there will indeed be First Amendment concerns. So I'm grateful that there was at least a mention or awareness of the First Amendment here. But yes, we will see what happens here. Nothing particularly new, except, again, just the momentum every week. It's amazing how every week when we sit down to write this podcast or prep for the podcast, there's always new stuff to add here in the TikTok segment. So Riana, I'm curious if you have any particular thoughts about this or your take on what's going on here.

Riana Pfefferkorn:

Yeah, I mean, the RESTRICT Act is such a weird document, because all of the national security type concerns feel like an attempt to collapse together concerns that might be about data security and data privacy. And you and Alex said something last week that I agree with, which is: why don't we just pass a generally applicable federal data privacy law? This looks like a distraction from Congress's inability to do that, a way to just say, "Let's pick out this one particular entity and use that to raise fears about what the national security impact might be." But it's not quite the same as when we've been worried about Huawei for the last several years and how they might potentially be surveilling Americans' data, and especially the data of people who are in sensitive positions where a particular government might be especially interested in learning or exfiltrating data about that person, either through the app or just off of their device, if they could do that.

It seems we're collapsing together those concerns about the national security of individuals' private data with national security concerns that go more towards driving a wedge within the US population through foreign influence campaigns. And that's where a lot of the First Amendment stuff comes up, and where it seems like the fact that these are foreign corporations, especially from particular countries that are on the list in this act, is the one lever that the federal government has. They know that they could not pass a domestic law to try and say no corporation, of whatever ownership, shall be allowed to promote or algorithmically emphasize content that shows disinformation about an election or casts doubt on the effectiveness of vaccines or what-have-you.

And yet that's the kind of national security concern that I've heard stated by people on the Hill as a rationale for why we need to "do something" about TikTok. And those are just things I'm not sure that we can really regulate consistently with the First Amendment. We've done this before. There have been court decisions about Baidu, about WeChat, about TikTok itself from the last time, under the previous administration, when there were attempts to restrict Chinese companies' operations in the United States. We know that Chinese companies with users in the US have rights under the First Amendment, and that extends both to their users' speech and also to their own editorial choices about how to display content, and what content to display or not to display.

And so to the degree that this is an attempt to lay the groundwork to enable the federal government to potentially take as strong a gesture as banning TikTok or other particular entities, it seems like it wades directly into First Amendment concerns. And to the degree that the concern is not the data exfiltration side of it, but rather, well, what if they platform particular ideas that we don't like? Well, you couldn't do that to a US company. And if you ban TikTok, that content doesn't go away. It's still going to be on YouTube. So I'm not even really sure what the point is of all of this, except that it seems to distract from the sort of two ongoing themes in recent years, which are, number one, gosh, it would sure be nice if we could get our act together to pass a federal data privacy bill, but we can't, so hopefully nobody notices. And number two, man, we sure wish the First Amendment protected less stuff.

Evelyn Douek:

Oh no, what if the Chinese mount a really persuasive argument for communism with 15-second dance routines? Think of the children, someone think of the children. How will we ever be able to counter that with other persuasive speech? I do not know.

Riana Pfefferkorn:

You say that, and yet what I take to be just a flat out piece of propaganda for the Modi government just won best song last night at the Oscars. So we underestimate the power of entertainment as a projection of soft power by our foreign adversaries [inaudible 00:31:25].

Evelyn Douek:

Well, speaking of won't someone think of the children and Congress not passing laws: because of, or maybe irrespective of, Congress's lack of activity, the activity in the states continues. And Utah passed a bill this week, the social media bill SB 152, expected to be signed into law soon, which is basically a parody of a won't-someone-think-of-the-children bill. So it requires all social media accounts in Utah to be age verified, and any account for a minor, meaning someone under 18, needs parental consent. Now, just in case you are wondering, you can get a driver's license at the age of 16 in Utah. And for these accounts for minors, the parent or guardian who has given consent for the minor to have the account also gets access to all of that account's posts and direct messages.

So I don't totally know what to say about this except that, hey, minors have First Amendment and privacy rights, too. And this is a full employment program for lawyers, who are currently drafting the First Amendment challenge to this bill right now in offices not too far from where we sit, to be filed 30 seconds after it passes. It's also very helpful for law professors who are looking for material for their syllabus over the next few years. So yeah, any particular thoughts on this one?

Riana Pfefferkorn:

Just exasperation, I guess. As with so many other proposals, we did this before, we did this in the '90s, and it got struck down as unconstitutional under the First Amendment by the Supreme Court at that time. I know everything is up for grabs with the Supreme Court willing to just revisit settled principles of constitutional law and how it applies. But apparently this is one more thing on the list: age verification and restricting not just what minors can see or how they can access the internet, but what adults can do and how they can access the internet, because measures intended to verify that somebody is a child necessarily apply to adults as well. And so this doesn't just affect children. This also affects adults' ability to use the internet. And as you said, even children have privacy rights, have free speech rights, have free expression and freedom of association rights.

And it is a huge step to go from the age of 13, which is what COPPA does in terms of parental consent for gathering information about children, to the age of 18, as though, like you said, there's just absolutely no developmental difference in there. Of course, even littler kids should have some privacy, should have some ability to speak freely. Notably, we are the only country in the United Nations that is not a signatory to the Convention on the Rights of the Child. And so that's how we can, I think, see some of these bills. The thing that I find most fascinating, I guess, about this bill, which is intended, it seems, both to keep kids from getting abused or being exposed to abusive content on the internet and also to deal with addiction to social media services, is that it applies to some social media services, and then there's a list of carve outs that's like a mile long.

So, somewhat refreshingly for me as somebody who worries constantly about encryption, encrypted messaging services and email are carved out. That's nice. That's rare from what I tend to see. Which then brings up the question of, well, what are you supposed to do if you are Meta and you're trying to integrate all of your messaging functions together? Some of those are attached to social media sites like Instagram or the main Facebook app, whereas others, like WhatsApp, are standalone messaging services that would seem to be carved out from this. Well, okay, if the idea is that you should be able to send a message from Instagram and somebody who's using WhatsApp could receive it, and vice versa, under this plan for integrating these services, how is Meta supposed to comply with this, where some of its services are carved out and some are not?

There are other wild carve outs in here that include cloud storage, as though nobody has been able to leverage cloud storage to do anything abusive. Business-to-business software. So I guess Jira is still going to be able to threaten our children, now that we know that that is a weaponized tool according to the US Congress. Interactive gaming, virtual gaming services: what are you talking about? Why is gaming carved out? This just sounds to me like Roblox's lobbyists did a really good job getting this in here, as though gaming is neither addictive, which is one of the concerns of this bill, nor a vector for potential abuse or exploitation of children, which seems to be another one of the concerns of this bill.

Evelyn Douek:

No, it's Facebook. Don't you know where kids are hanging out these days? It's not on Roblox, it's Facebook obviously.

Riana Pfefferkorn:

[inaudible 00:36:03] So the act also gives a mile-wide carve out that you could drive a truck through to streaming services like Disney Plus and Netflix. So if you're addicted to scrolling through Instagram, that's a problem. But if you're watching all of The Mandalorian in one sitting, apparently that's not a problem. And then I also love, because we-

Evelyn Douek:

That's an achievement. That is not a problem. That is an achievement.

Riana Pfefferkorn:

I can't keep my eyes open that long, but that's because I'm old and I'm not one of the youngs who's being protected by this bill. And then also there are carve outs for sites that enable e-commerce and for looking for a job or getting job training for employment. So in order for you to be a good little worker bee who consumes, we're going to make sure that your ability to spend your allowance on Depop and look for a part-time job does not accidentally get affected by this ridiculous bill.

Evelyn Douek:

That's great. Speaking of part-time jobs, I am going to have to go off to my second job now because Twitter has announced the pricing packages for getting access to its API, and the lowest tier for access to the API is over $500,000 per year, which is extremely pricey. It's going to price out a lot of important academic research. It's actually not...

Riana Pfefferkorn:

Not feasible. Oh, can I mention, before we wrap, my favorite carve out in the Utah bill?

Evelyn Douek:

Oh yeah, sorry.

Riana Pfefferkorn:

My favorite carve out is the carve out for genealogical research sites. The most Utah possible carve out from a list of websites that are no longer allowed to potentially harm children. Genealogical research. Thank you, Utah.

Evelyn Douek:

What do you think the story is behind that? There's just some lawmaker out to lunch with a friend who... I don't know.

Riana Pfefferkorn:

Well, so the story about Utah is that it's full of Mormons. Mormons are huge into genealogical research, in part because they seem to baptize people who are dead into being part of their church. So genealogical research is a huge deal for Mormons and a big pastime. And so for a state that is most famous for having Mormons in it, and I'm not sure what else Utah is known for, rocks, I guess? Bad internet legislation? That is just chef's kiss. The perfect carve out to see in this bill.

Evelyn Douek:

Excellent. Okay. And I think with that, we're going to leave it there for the week. So thank you so much, Riana, for stepping in and saving the day. The show must go on. And that has been your Moderated Content weekly news update for the week. The show is available in all the usual places, including Apple Podcasts and Spotify, and show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't be possible without the research and editorial assistance of John Perrino, policy analyst extraordinaire at the Stanford Internet Observatory, and it is produced by the wonderful Brian Pelletier. Special thanks also to Justin Fu and Rob Huffman.