Moderated Content

MC Weekly News Roundup Halloween Edition

Episode Summary

Evelyn and Alex reluctantly talk about Elon Musk and Twitter, again, before some updates about The Wire in India, the midterms at home, Meta's political ad transparency fine in Washington state, and the publication of the EU's mammoth regulation, the Digital Services Act.

Episode Notes

SHOW NOTES

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

Episode Transcription

Evelyn Douek: And I guess that will only be an acceleration or amplification of a dynamic that already exists, which is that powerful people have generally also been able to secure for themselves, through pressure, maybe different standards in certain cases.

Alex Stamos: Right. Well, and famously at Facebook, there's a program specifically for high-level accounts, called XCheck, which ended up getting them in trouble in India.

Evelyn Douek: Right. Yes, exactly. Okay, well, that brings us to... Thank you for throwing me that one.

Alex Stamos: I am making the soft pitch, underhand pitch motion to you.

Evelyn Douek: Literally. We're in a studio together and I'm catching it. Welcome to the Weekly News Hit episode of Moderated Content with myself, Evelyn Douek and Alex Stamos. It has been around 72 hours since we dropped an episode, so I hope you haven't been missing us too much over your spooky Halloween weekend. Unfortunately, I think we have to start in the same place where we left off, which is Musk. And in 72 hours, he has made a lot of new news.

I don't know, Alex. Where do you want to start? My gripe with Musk over the weekend is actually with the reporters. Everyone's saying he's definitely going to set up a content moderation council because he tweeted once that he was thinking about it. Can we not retweet or report on Musk tweets as if they are more than thought bubbles until we actually see some follow-through?

Alex Stamos: Yeah. He immediately saw the complexity that he was jumping into and said, "No changes have been made," which I think actually continues to be true; there's no evidence that any actual policies have changed at Twitter. But he said there wouldn't be any changes until there was his Content Moderation Council. That lasted several hours, until he started replying again to people complaining to him about content moderation decisions, saying he's going to look into it, which seems inconsistent with the idea that he's going to set up his Blue Ribbon panel to be working on [inaudible].

Evelyn Douek: He is the Content Moderation Council with his diverse viewpoints all contained within his singular head.

Alex Stamos: Right, just like he's now the entire board of directors. He fired the rest of the board of directors, which sometimes happens when a company goes private. Although often, when a company goes private under a private equity firm, you still end up with a functional board; it's just that every member of the board represents a different investor. He's a board of one. He's a content moderation council of one. These are jobs that are normally done by dozens of humans, but apparently he has the ability to do it all while also running a rocket company and a car company and a boring company that is digging holes in LA.

Evelyn Douek: And also apparently having time to tweet more than I find time to tweet while running my one single job. So that's good fun.

Alex Stamos: And being a very involved father to the children he has with, I think, three different women now.

Evelyn Douek: Of course, it's really quite impressive. And of course, the oversight board of Meta jumped into this debate as well with basically a hand wave saying, "Hey Musk, if you're interested in a Content Moderation Council, here's one we set up earlier." Mike Masnick had a great tweet saying, "Oh, someone put the sales department in front of the oversight board's Twitter account," which I thought was very funny, so ...

Alex Stamos: It is an interesting question. I mean, this is exactly what the oversight board, the people on it, want. In theory, the oversight board is an independent nonprofit. It has an independent board. It is not run by Facebook, but its only revenue comes from Facebook right now. Right? It only has one client in Facebook. And I think they are desperately looking for another platform to say, "This huge talent pool that you've built, which is actually a quite impressive pool of academics who have studied speech issues around the world, we're going to pay you for access to them as well."

The idea that Musk would use something that Mark Zuckerberg had already set up and blessed for his own content moderation is, I think, close to zero, right? Whatever. If he sets up a Content Moderation Council, it is going to be VCs who have absolutely no operational experience and a couple of guys he picked off of Twitter. I mean, effectively a clown car of people pouring out of a VW van, smoking some pot, and then deciding what speech looks like for the entire globe.

Evelyn Douek: Excellent. Experimental approaches, laboratories of online governance, who knows what could come up. You did say that there is no evidence that there's been any change in the standards on Twitter, and we talked about this last time in the context of hate speech, where there'd been a huge influx, and there's been an update from Twitter on that in the days since.

Alex Stamos: Right. So there has been more empirical evidence that the resultant prevalence of hate speech has gone up. Right? The number of tweets that are making it through, that people are actually seeing, with hateful slurs, especially racial slurs, it seems, as well as antisemitic content, has gone up. Yoel Roth, who's the head of safety and integrity at Twitter, and a friend of mine, and somebody I think is a really positive actor in this space-

Evelyn Douek: Seconded.

Alex Stamos: ... somebody who has survived so far. Right? Unfortunately, the clock is ticking for Yoel. The odds of him being able to survive the Musk regime I think are extremely low. But right now, he is doing everything he can, clearly, to hold on with his fingernails and to try to salvage something from the mess that's happened so far. And he tweeted that the rules had not changed on hateful conduct, but that there had been an increase in hate speech that came from a relatively small number of accounts. So it does seem, as we discussed over the weekend, that there clearly are groups of people who have been emboldened, who are organizing on Telegram, on 8chan, on places like that, who have come to try to flood Twitter with hate speech, and that is causing the impact. But it's not because Twitter's changed their rules; it's because they just don't have the ability to handle this much inflow up front.

And so we will see if that stabilizes as those people get banned or kicked off and bored, and we'll see whether or not any of these rules actually do change and if the enforcement changes. My expectation is that the day-to-day hate speech enforcement will not change, but what you will see is people get banned under it, and then appeal to Musk, and then get reinstated. And so you're going to have this kind of crazy model where your odds of coming out of Twitter jail, because you're able to appeal, or your friends are able to appeal to Musk to get it overturned, are going to be much higher. And so you'll have a kind of continuous low-level conflict between the day-to-day content moderation at Twitter and Musk's decisions, but it's not clear exactly how that's going to work yet.

Evelyn Douek: Right. And I guess that will only be an acceleration or amplification of a dynamic that already exists, which is that powerful people have generally also been able to secure for themselves, through pressure, maybe different standards in certain cases.

Alex Stamos: Right. Well, and famously at Facebook, there's a program specifically for high-level accounts, called XCheck, which ended up getting them in trouble in India.

Evelyn Douek: Right. Yes, exactly. Okay, well, that brings us to... Thank you for throwing me that one.

Alex Stamos: I am making the soft pitch, underhand pitch motion to [inaudible].

Evelyn Douek: Literally. We're in a studio together and I'm catching it. So it's a good segue to a small update on a story that we covered at length, the other story that we've spent a lot of time on, which is the fiasco in India over the Wire's now-confirmed fabricated stories about Meta's relationship with a BJP official.

In the last few days, that BJP official has filed defamation and other claims against the editors of the Wire. Those are criminal provisions in India. And the Wire itself has also filed claims against its editor. And over the weekend, a number of editors of the Wire had their homes raided by the police, and their phones and laptops seized. So it's a tragic end. There were lots of questions raised about the story at the start, but this is a very dramatic escalation in this story.

Alex Stamos: Yeah. And a sad one, and one that we predicted. You know, you and I have talked about it a couple of times: the fact that they had named actual Indian citizens meant that, despite the fact that there's no way Meta was going to file criminal charges, a political operative for the BJP clearly would. Indian politics is no holds barred. And it looks like the BJP and their allies are going to use this to try to destroy the Wire. And perhaps this is also to send a message to the rest of the opposition press: if you make any little mistake, they can hold onto it.

Now, this wasn't a little mistake, right? This is straight-up faked evidence. But the Wire is going after, I believe, not an editor, but a contributor, Devesh Kumar, who's effectively kind of their technical contributor. He was a full-time employee but was a contractor at this point, so his relationship to them is a little bit complicated. Not so complicated, though, that the evidence he apparently faked couldn't be included as core evidence, and then he was backed over and over again.

So I think it is really dodging responsibility for the Wire to pin this on one guy when they made an intentional editorial decision not to double-check, not to get different sources, and then doubled down over and over and over again. At any moment they could have pulled back. And instead they decided to put these snarky posts out. Their supporters called me and other people names, colonialists and effectively the Indian version of gringo, for having these opinions and for telling the truth instead of going and looking at it. And so for them to now turn around and say it's not their problem, that it was just one guy, I think is totally unfair to that one person.

Evelyn Douek: Yeah, there are good reasons why trust in the Wire's editorial systems has been undermined by all of this, but it is a sad further constriction of the free press and civil society in the country. Going back to the idea that Twitter hasn't changed its standards: we are heading into the final stretch of the midterms, and the big question about whether Twitter would adopt a very different approach in that final period seems to be answered in the negative. Musk has said he doesn't really expect any big changes, as you said. So tell us what you're seeing across the industry there.

Alex Stamos: Right. So our group, the Stanford Internet Observatory, is one of the two academic institutions that run the Election Integrity Partnership, eipartnership.net. And on our blog, we recently posted an analysis of platform policies around election disinformation. I'm not going to go through all the details, except to say they're seriously improved from the 2020 cycle. Twitter rates quite highly in our eyes, as does core Facebook; Instagram turns out to be weaker. TikTok and YouTube are both doing much better, but have some holes in their policies.

I think the interesting question here: it's clear they're not going to change policies between now and the midterms. But there is talk of Musk firing a huge percentage of Twitter employees. It turns out November 1st, tomorrow as we record this, is a big vesting date for Twitter employees. For those of you who don't know, when you work in Silicon Valley, especially for a large public company, a large percentage of your income actually comes from stock, usually things called RSUs, restricted stock units, which go up and down but are generally something people can depend on happening. And tomorrow is a day when a significant percentage of the income for a lot of these people is supposed to be paid out.

Traditionally, what would happen is they'd get stock in a Schwab account, and then, during a window in which employees are allowed to sell and buy stock, they could sell it and turn it into cash. Because Twitter stock doesn't exist anymore, it's all held by Musk, he effectively has to pay straight-up bonuses, right? They're just going to get cash for what that stock was. That's an obligation he bought: Twitter has a contractual obligation to these employees for this income, and that is something he took on when he bought Twitter, just as he bought all of Twitter's other debts. And it looks like he is setting things up to try to get rid of a lot of people and to not pay that, which, one, is going to be a full employment plan for San Francisco-based employment lawyers, because California has all kinds of rules that explicitly don't allow these kinds of shenanigans.

I think for the purpose of this podcast, which is not an employment law podcast, for the midterms, I think the interesting question is, even though they're not going to rewrite any policies, does it affect their ability to actually enforce policies? And I think that's one of the big questions that we'll be looking at, at EIP. Is there a significant change in how quickly tweets are labeled, how quickly things are taken down, whether or not more junk makes it through, just because the people who used to have that as their job have now been fired?

And so when you create this much chaos inside of a company, even the people who survive are spending all day thinking about themselves. They're not actually doing their jobs. This is a really bad week for people at Twitter to not do their jobs. And so what kind of impact that has is going to be interesting.

Evelyn Douek: We're not an employment law podcast yet, but we might have to become one. In other news, Musk has apparently fired the top executives at the company for cause, in order to avoid paying the large bonuses they would be due to receive as part of their packages. And so it's massive layoffs at Twitter, massive hiring sprees at all the law firms across the country, as we go back to court in a number of places. On the midterms front, how much variation is there across the platforms, and how much of a difference does that variation make?

Alex Stamos: So there's, I mean, a pretty decent amount of variation in the policies. They all have policies around, for example, violence against election workers and poll workers. But in some cases, which we rated lower, they don't have specific policies about that; they just kind of roll it up into their general threat policies. We don't believe that's enough, given what we're facing. For folks who have been paying attention: you have people with guns standing outside ballot boxes. You have death threats against election workers. You have crazy theories being spun about elected leaders, employees of elections, as well as volunteers. There's a real threat of physical harm overlaying all of this work. And a significant percentage of volunteers have said they're not going to volunteer to work in elections anymore.

The little old ladies who are nice enough to sit there all day and hand you your ballot, and then make sure that you sign it correctly and it gets dropped in the box, they're doing that out of the goodness of their hearts. They're not going to volunteer to put themselves at physical risk. And so that's one of the areas where I think there has been some variation: how explicit the rules are around that kind of violence. Some things around offers to buy and sell votes; there's some variation there.

There's a big level of variation in what they do if somebody makes a completely unsubstantiated allegation of fraud. In a lot of cases, that stuff gets labeled, but the quality of those labels varies. In some cases, those labels will be specific and say this does not have any evidence. In some cases, like in Facebook's case, a lot of the time it's just a link to "here are the election results," which is kind of useless. Right?

So there is a decent amount of variation. These companies, there are no laws here, there's nothing controlling them. There have been these crazy conspiracy theories about DHS censoring these platforms. But the truth is, if you look at their rules, their rules are very, very different. Their enforcement is very, very different. They're clearly not under the control of the White House. And that's, I think, a normal thing when you have all these different platforms. But it does mean that you will see very different kinds of calls on different platforms.

Evelyn Douek: Speaking of crazy conspiracy theories and the terrifying state of rising political violence in this country, there was obviously the despicable attack against Paul Pelosi in the last few days. And we are somewhat numb to political violence, with the New York Times reporting that below the fold. But of course, because everything is, it is also a content moderation story, as crazy conspiracy theories have spun out across the web, with Elon Musk himself retweeting, and then deleting his retweet of, speculation about the cause of that attack.

Alex Stamos: So Musk has to be everywhere. Any controversy we have, he's going to have to play a part in it. As you said, a horrible attack against Paul Pelosi.

Evelyn Douek: He's the Forrest Gump of awful dumpster fires these days.

Alex Stamos: Yeah. And you have this 80-something-year-old man who called 911 and left the line open while he was being attacked. I think a lot of credit goes to the 911 operator who figured out what was going on and got police there before he was killed. I believe he is still in the hospital with a broken skull here in San Francisco. And the person who carried it out has a personal blog and has always subscribed to crazy beliefs. Those used to look like crazy beliefs on the left, and he's gone to the right, which is a pattern. We have seen a lot of people who have an openness to really populist, everything-is-being-controlled beliefs. Left, right doesn't matter as much as the idea that there's a constant conspiracy out there making things bad for them. And most recently, he has been more of a conspiracist on the right.

But what we then saw was this explosion from all different parts of the right-wing media, from the lowest kind of total fake news sites all the way up to Fox News itself, which is really the pinnacle, the crown jewel, of right-wing media, trying to spread disinformation about this: just asking questions, throwing out lots of garbage. And it's a very Putin-esque strategy, right? A lot of disinformation in Russia is aimed at Russians themselves; there's much more of it than the disinformation we see Russia aim at the rest of the world. Right?

That's one of the things you always have to remember about Russian disinformation campaigns: most of that stuff is aimed at Russians themselves. Whenever anything happens, you hear all these crazy theories, interpretations of the facts that are not true, and nobody has to believe any of them. But the fact that there are 12 different theories thrown out there makes people kind of nihilistic, makes them believe that nothing is true, and that helps Putin stay in control.

And that's exactly what we're seeing, starting with Fox News at the top and then trickling down to all these organizations: throwing out these crazy conspiracies, just asking questions, "we need to know more." The head of the NRCC was on Face the Nation this weekend, and honestly, I'm just going to say it, he sounded like a scumbag, because he was asked about this and said, "Well, we need to know more, but I condemn all violence," which is a really scumbaggy way of implying that there is something unknown, which is a dog whistle to his listeners. And that came from an elected member of the House of Representatives who is the head of that congressional committee.

So Musk fell for the lowest end of this, which is the "Santa Monica Observer," one of these complete fake news sites that puts out crazy stuff to get people to click. And what we have found is that you can create a fake newspaper as long as you take the name of a city and something that sounds like a newspaper name and put them together, and you can run it out of Macedonia or Pakistan or India, where you have low-cost labor creating all this English-language content for you. And if the world's richest man can fall for it, that does demonstrate how compelling it is for lots of people.

So just a really sad day. And like you said, the New York Times, which is supposed to be the pinnacle here, totally downplayed this as well. And it's kind of shocking. Who are we supposed to rely upon if you can't rely on the New York Times to really aggressively cover a story about the attempted assassination of the third person in line for the presidency of the United States?

Evelyn Douek: And it's not clear that we can content moderate our way out of this mess either, which is often people's reaction to a lot of disinformation. But as you're saying, when it's Fox News, when it's Elon Musk, when it's coming from the top, a few extra, stronger rules and getting the stuff at the bottom taken down isn't going to make a lot of difference, which was the finding of [inaudible] et al. in the lead-up to the 2020 election as well. When it's coming from the top, it's not necessarily a social media problem, although of course social media does also amplify those dynamics. So ...

Alex Stamos: I mean, the only upside is Musk was shamed into deleting his tweet, so there is some level of ... At least he got embarrassed that he got taken in by such a scummy, no-quality website. And so perhaps that's a tiny bright spot here: there's a little bit of shame left in some of the people, who don't want to at least look like they're being used.

Evelyn Douek: Take our wins where we can find them. Thanks for that attempted optimism.

Alex Stamos: I'm trying.

Evelyn Douek: So we did mention this over the weekend, but I do think it's a story worth mentioning again, because it is getting lost a little bit, which is the fine in Washington state, where a King County judge ordered Facebook to pay the maximum penalty for its campaign finance violations: $24.6 million for 822 violations of Washington disclosure laws.

When you run political ads in the state, you have to have transparency around them, and Facebook just hadn't done that. $24.6 million, and no one at Facebook is going to lose sleep over that. But I do think that there's this push for greater transparency in political advertising, and not just relying on voluntary transparency from the platforms. And so this could be a sign that other states may act on this as well and get more transparency.

The judge's judgment wasn't particularly detailed. It didn't have any First Amendment analysis. I would be really surprised if Facebook doesn't appeal this and see what it can do in terms of challenging the law's constitutionality, as we're seeing in a bunch of other places. But I do think it's really interesting that this is potentially the start of many more legal transparency obligations on these platforms.

Alex Stamos: Yeah, I mean, it's interesting. A couple things here. One, Facebook has the best ad transparency of any platform, so it is a weird target, other than that people don't like Facebook, so it's always a good target. The other reason it's a target is that, because of these transparency requirements, a number of platforms have dropped political ads entirely, including Twitter.

And so I think it's great to have transparency requirements. We called for that at the federal level in a report we did several years ago. I really don't think they should be state by state. Fifty different transparency requirements is just a silly way for us to do things as a country, a bit like Australian rail gauges. This is the reason why we should have a federal [inaudible].

Evelyn Douek: You've got to have a federation, a unified United States of America, to avoid different political disclosure laws.

Alex Stamos: Political rail gauges.

Evelyn Douek: Right.

Alex Stamos: And so I think it would be great, but this is something that we're just going to have to keep an eye on: you want those requirements to exist, and you want them to be fairly applied. We have almost no transparency on any platform other than social media. You get no transparency from radio, and drive-time radio is full of terrible, terrible political ads; there's no archive of those and no way to see who's paying for them. We don't have it from Fox News or from local TV stations.

I believe local TV stations, correct me if I'm wrong, are required to run certain political ads. They can't actually decide not to. And so there's this interesting situation where it's even on the opposite side: they carry everything with absolutely no transparency, and they have no ability to have any policies about what they do and do not carry.

We don't have any transparency from the newspapers. The New York Times runs a massive first-party advertising network and has absolutely no transparency about the ads that run on newyorktimes.com. So I think it's great to have these rules, but they should be fairly applied and not just used for ... this feels a little bit like a political one-off that a lot of people get to be happy about, and to go do press conferences about, but that doesn't actually make anything better.

Okay. But what I think we're also probably going to see, and it might be possible, is that this is the last cycle where Facebook carries political ads. It is a tiny part of Facebook's revenue, and a huge amount of pain and suffering comes from it. And some people might celebrate that, but you have to decide whether or not you think pushing political ads back onto television instead of online makes it better for people who are not incumbents or who are not backed by huge super PACs, right? Because obviously the minimal cost of both producing and running TV advertising is much, much higher than online.

Evelyn Douek: Right. I mean, the interesting thing about this Washington state law is that it is a broad-based law. It didn't target only online advertisers, which I think is interesting, and maybe made it more likely to survive any challenge, but I have no knowledge of how broadly it's been enforced and applied. And this case was certainly extremely high profile. The AG in the state was making a lot of announcements about it, very proud that this was the largest-ever campaign finance fine. So it definitely seems to be motivated by that anti ... yeah.

Alex Stamos: Great. So let's see this for other forms of media.

Evelyn Douek: Right.

Alex Stamos: I keep seeing the influence peddlers and the lobbyists for certain media companies highlight this while their own members have absolutely no transparency at all. Seeing those kinds of scummy people on Twitter celebrate this, that's fine. But I do think the AG needs to come for not just online advertisers, but all kinds of political advertisers.

Evelyn Douek: Yeah. And I think you're right about Facebook ejecting from political advertising if a lot of this becomes more broad-based. I believe it did actually try to do that in Washington state and stop taking political ads, but it failed in certain instances, which is why this fine applied.

And it reminds me of when, a couple of years ago, Facebook said it would start de-emphasizing political content in people's feeds and everyone celebrated. But it's not clear to me that it's great for democracy if the place where a lot of people go for democratic discussion and political news is de-emphasizing that content. It seems to me to be counterproductive to just not want people to know what's going on in politics.

Alex Stamos: Right. And deciding what is political, right? That de-emphasization really focused on things like down-ranking news sites and such. And so you start up-ranking content created by your uncles and aunts. And there's not great empirical evidence, but there's some anecdotal evidence that this led to some of the QAnon and other ground-up cults and crazy conspiracy theories; instead of coming from FoxNews.com, they're now coming from your crazy uncle, because that stuff's getting ranked up a little bit more.

So yeah, there's no clean answer here. People like to parrot that these platforms should get rid of ads, or they should get rid of this or that. And it turns out that for any change you make, the actual downstream impacts are incredibly complicated.

Evelyn Douek: Right. So anything else you wanted to cover before we close up for the week?

Alex Stamos: Well, I think the one other thing that happened last week is there's a new draft of the Digital Services Act, is that correct? So I guess we're still-

Evelyn Douek: It's finally passed. Yeah.

Alex Stamos: Finally passed. Yeah.

Evelyn Douek: And the final text released. Yeah. And so this is something I think we mentioned last episode, about Musk and Twitter, but it's a massive regulatory package in Europe to regulate online platforms. It has a whole bunch of risk assessment, auditing, and disclosure obligations, due process obligations, and appeals to independent third-party arbiters for the massive platforms known as VLOPs. I think we're going to have to come to a consensus on how to pronounce the very-large-online-platforms acronym they've adopted. And it's going to come into force over the next few years.

I think the big question is: how is any of this going to be enforced? The language is very general, very broad, and so it's going to be a full employment program. As Elon Musk is employing all of the employment lawyers over here, all the internet lawyers are getting jobs over in Europe, as people work out what on earth any of this means and it starts to get implemented. It's going to be a massive scramble.

Alex Stamos: Yeah. As we saw with GDPR, you have that broad language, and so you have to wait for a local ... Is it a data protection commissioner that handles the complaints under this? Or is there going to be a new kind of role for the actual content?

Evelyn Douek: The way enforcement's going to work is that there are country-specific regulators, which are going to be set up to enforce certain country-specific requirements, and then a more centralized body set up in the European Commission to enforce some of those rules against the very largest platforms.

Alex Stamos: So it's very GDPR-like, in that you can have a complaint from Italy that works its way through the Italian court system and then eventually to Luxembourg, and years and years of lawyers writing briefs in multiple languages.

Evelyn Douek: Right. Yeah. They have tried to centralize some more of the largest obligations through this European Commission role, in particular in response to the kerfuffle that was the implementation of the GDPR. But there are still all of these decentralized monitoring and enforcement mechanisms. So yeah, it's going to be fun trying to keep track of all of that.

Alex Stamos: Okay, well, plenty for us to talk about then.

Evelyn Douek: Excellent. Yes. And so that has been your Moderated Content news roundup for the week. This show is available in all the usual places, including Apple Podcasts and Spotify, and show notes are available at law.stanford.edu/moderatedcontent. This episode wouldn't have been possible without the research and editorial assistance of John Perrino, policy analyst at the Stanford Internet Observatory, and is produced by the brilliant Brian Pelletier. Thanks also to Alyssa Ashdown, Justin Fu, and Rob Huffman. See you next week, unless Elon Musk does something truly explosive in the meantime.