What Role Might Elon Musk Play in the Post-Election Period?
Justin Hendrix / Nov 2, 2024
Audio of this conversation is available via your favorite podcast service.
If you’re trying to game out the potential role of technology in the post-election period in the US, there is a significant "X" factor. When he purchased the social media platform formerly known as Twitter, “Elon Musk didn’t just get a social network—he got a political weapon.” So says today’s guest, a journalist who is one of the keenest observers of phenomena on the internet: Charlie Warzel, a staff writer at The Atlantic and the author of its newsletter Galaxy Brain. I caught up with him about what to make of Musk and the broader health of the information environment.
What follows is a lightly edited transcript of the discussion.
Charlie Warzel:
My name's Charlie Warzel. I'm a staff writer at The Atlantic magazine, and I am also the author of its newsletter, Galaxy Brain, which covers technology, media, and politics.
Justin Hendrix:
Charlie, I'm glad to talk to you today. It's been some time since I've seen you. I think the last time I saw you was in New York, and it was kind of in the middle of the pandemic at one of the first conferences I can remember folks coming out to. But I remember first talking to you six, seven years ago now. You were writing a story that ended up with a title, something about the “infocalypse,” if I remember correctly, back at BuzzFeed News. And you were beginning to look at a set of questions about how new technology might change the production of propaganda and misinformation. So I don't know, you've been looking at this stuff for a long time. How do you think it's going?
Charlie Warzel:
Oh man, that story. We were so young then. This is how I think about it.
Justin Hendrix:
We were both younger men. The listener can't tell by looking at the screen, but both of us are quite wizened by the years since then.
Charlie Warzel:
A lot of beards and gray hairs for everyone. So that story, which I think I wrote in 2017, came after that post-2016 period when people were really scrutinizing the role of platforms and information in the rise of Donald Trump, trying to figure out how much blame to assign, if any, what could be done about these systems, and how to moderate them effectively, if at all. And then there was the pushback, obviously, from people on the right and the "free speech maximalists."
And we really weren't arguing over the term misinformation quite as much at that moment. So my thought was, okay, we're having this debate now, but let's future cast it. Let's try to think about, is what we're seeing right now evidence of a problem that's only going to get orders of magnitude worse? And then if so, how?
And it was actually motivated by that wonderful New Yorker story about the big one, the big Pacific Northwest earthquake. It was this great piece of journalism showing this thing that was just out there, hanging over a region's head, right? Something that is going to happen, but are people ignoring it? Do we understand the extent of it? And it was told in this compelling way.
So that was what I was actually trying to do. And I got a bunch of people, yourself and other researchers, to talk about this. And it ended up describing this future in which we go off the rails, with a lot of AI-powered garbage. This was before we had the term generative AI, but it was this idea that soon we'll be able to spin up images, videos, whatever, impersonations, "deepfake"-style stuff, and that it could basically blur the lines of reality entirely.
And I think the story really resonated with lots of people and frankly just scared a lot of people, which I don't know how productive that always is, but it at least raises attention. And if I'm looking back at what that story forecasted versus what actually came true, it's how technology always plays out: a lot of it has come true, but in more banal, mundane ways than we expected. The future never feels like the flying cars, or like AI robots spewing falsehoods while you don't know who's on the other end of the line.
But at the same time, if you look two years into the generative AI era, the internet has been flooded with synthetic information. Not all of it duping people, but this idea that there is essentially pollution, like garbage injected into our ecosystems.
Justin Hendrix:
Slop.
Charlie Warzel:
Slop, yeah. But I saw the other day a joke post on Threads or somewhere. It was like, Quora has been breached. And the example was a Quora post that was clearly just ChatGPT responding to 30,000 different questions. And of course, this is what it looks like in that really banal way: it's not a bunch of people walking around being mind controlled necessarily, but it is an information ecosystem where you do have to scrutinize every single thing really intensely if you want to verify that a human being actually touched it or thought about it.
And from the other side of it, quite obviously, and I've written about this a lot lately, the desire from a certain segment of the country to want to construct their own reality, and live in it, and also be unapologetic about it. I think we saw that and we saw the contours of that. Obviously, we've seen it for a long time.
But where we're at now with this idea of people sharing generative AI images and then saying, "Yeah, I know it's not real, but it feels real, and so I'm still going to share." It is really a break I think from the way that things were before, where if you were caught sharing something fake, you either deleted your account and went away for a while, or you apologized, or whatever. But this sort of unapologetic nature of it, I think, speaks to us now seeing the effects of the last decade on certain groups of people.
Justin Hendrix:
And I can think of some examples of that phenomenon you just talked about that aren't just regular users, but lawmakers who share something false. And when called out will say, "It could have been true."
Charlie Warzel:
To that point, I always remember growing up and watching Colbert and the whole Truthiness thing, but I hadn't revisited the segment and the clips online. It's from 2005, and he is outlining this idea that what your gut tells you and makes you feel is truer than the truth could ever be. And obviously, it was related to Bush and the Bush administration and all that kind of stuff, but it's pretty wild that it has gone from something you were parodying in the extreme to just the baseline ideology of mainstream MAGA Republicans.
Justin Hendrix:
That brings us perhaps to one particular subject that you've been covering very closely over the last few months. I suspect one thing that neither you nor I could have predicted about today's internet or information ecosystem is that one of the major social media platforms would be bought by someone with that same set of goals: that desire to make falsehoods real, to perpetrate various conspiracy theories, to otherwise use the power of the platform to accomplish a set of political goals. And of course, I'm talking about Elon Musk. You say in this latest piece for The Atlantic, which comes under the title "This Is What $44 Billion Buys You," that Elon Musk didn't just get a social network when he got Twitter, he got a political weapon.
Charlie Warzel:
I wanted to really look at X, or Twitter, whatever we want to call it, through the lens of what it is doing, what its political project is, in this sort of end stage of the election. And I think nothing speaks more to that than this election integrity community that he and his America PAC have spun up, which is essentially a feed to which people can post any fear-mongering rumor, any sort of "Just asking questions. Is this real? Hey, I saw this thing my friend texted me that blank is happening at the polling location," an election fraud conspiracy center. In a way, the experience in 2020 especially was that all the social networks filled up with this stuff, but in a filtered manner: you'd see a post on Facebook alleging something, you'd see something on X or Twitter. But this is a very concerted feed, almost a pure, uncut distillation of the worst parts of social media, which is a fake news engine but also an incitement engine, right? You're seeing now that this feed is filling with examples of, "Hey, this person dropped off some ballots here. I think they're trying to meddle in the election. Stop the Steal," and then people will do the Libs of TikTok thing of trying to go find that person and harassing them in public.
And so I was looking at that and thinking to myself that it was always a little bit clear that Elon's purchase of X was going to be a political project to some degree. But it's extremely clear right now what it was all for, or at least what he has realized it can all be for, which is to poison the information ecosystem. If you're trying to do the Steve Bannon job of flooding the zone with shit, usually, as a political influencer or as a party, there's only so much you can do. You can create what you want to create and hope you can bring everyone aboard. But owning one of the communications platforms essentially means you can turn the knobs, or create these communities whose sole purpose is to flood the zone.
And if you go through those feeds, you start to have this thing. I'm inured to a lot of the worst stuff on the internet. I know how to steady myself by saying, "Okay, I know what this is. I know that this is people trying to stir shit, or who are propagandists," or whatever. But the experience of scrolling through that feed for 30 minutes can make even a person like myself go, "Man, oh geez, look at all these examples." And I know that most of them are completely out of context. If it's an irregularity that's actually real, it's usually way downplayed. Someone hit the wrong button on a machine and is confused, and some poll workers help fix it.
But at the same time, the volume of stuff is really disorienting, I think. And there are so many people out there who are going to experience that and see it, and it really builds, this repetition of rumors and falsehoods. You can see how, over time, it can change a person's worldview. It can really make something feel like it is happening, even if you are pretty skeptical that it is.
Justin Hendrix:
Or satisfy their worldview, if that's the view they've already got, or if the political or partisan goal they have is to assert that the election was stolen despite the facts. This is almost as if Mark Zuckerberg himself were running the big Stop the Steal group on Facebook in the post-2020 election period, or perhaps running The_Donald or something like that. You've got this kind of perfect combination of Musk's PAC, into which I think at this stage he's put more than $100 million in order to help elect Donald Trump as president, and the platform itself. Do we have other evidence that, as you say, the knobs on X are being turned in favor of a political outcome?
Charlie Warzel:
Yeah. I want to be clear that I don't know anything about the algorithms. I am not insinuating that there are hands on the dials actually doing this. But, more metaphorically, there's a multi-pronged way in which X has become a really effective broadcast tool for Trump and for the MAGA right.
One of the elements of it is Musk himself and his account. He has the most followed account, so anything that he retweets or any tweet that he replies to gets a massive boost of visibility. He actually hadn't tweeted about this election integrity community when I started working on the story. And I wrote the story, what day is it? On Tuesday afternoon. It went into edits, and I woke up on Wednesday morning and saw that the fact-checker had updated the number of people in the group. It was 13,000 when I wrote it, and it was up to like 33,000.
Justin Hendrix:
It's 54,000 now.
Charlie Warzel:
So that was overnight; that was Musk's contribution to that. So he is this amplifier for anything that he wants to push people towards. And because he's followed not just by right-wing people, hundreds of millions of people on the network follow him because he's an influential figure who was previously less of a propagandist, it's this huge megaphone and amplifier for any idea, anything that he sees fit. And he sees a lot of these conspiracy theories as the things that he wants to talk about and boost into people's feeds. So there's that element. The other element is that, for whatever reason, whether nefarious or not, the recommendation feeds on the platform have trended toward right-wing shock jocks and influencers.
I believe the Wall Street Journal had a story, but I did a similar experiment back in mid-September because I had heard this was happening to people. I started a new account on X and said, "Okay, I'm only interested in sports and technology, nothing else. I don't want to know about politics. I'm not going to follow any politics." And it asks you to follow different feeds or different people. Elon Musk, a bunch of right-wing influencers, and Trump were some of the first accounts it asked me to follow. I said, "No," and only followed ESPN's main account. Just the most anodyne, I'm-just-interested-in-basketball-scores thing.
And finally, when I went through the onboarding process, it populated an algorithmic feed for me because I had only followed one person. The first tweet was Elon Musk. There were tweets from Jack Posobiec, Donald Trump, Charlie Kirk, Libs of TikTok, and a bunch of these influencers. The feed was essentially a recreation of the feed I've kept since 2016 of the fever swamp MAGA influencers. It was almost identical. So X is feeding people political content whether they're asking for it or not.
And then the final part of this is based off some great reporting that the Washington Post did this week, which showed that because of all these other elements, a lot of progressive-leaning people have been driven off the platform. And that has given space for Republican lawmakers, Republican influencers to have more reach. And as a result, it's this sort of self-perpetuating cycle of, "Oh, my tweets are getting seen by more people. I'm going to tweet more often." And so the balance just on the social network, just from a political standpoint, not talking about conspiracies or anything like that, has changed dramatically.
And so all of those things taken together suggest to me that Elon Musk, either just by his presence or by decisions he's made that we don't know anything about, has skewed the network, skewed the platform, towards basically his own far-right ideology. And should Donald Trump win next week, I think it's a reasonable thing to suggest that X will be something almost akin to a state media platform, the closest thing that we have in this country to that.
Because Elon Musk is a businessman, is an entrepreneur, is the CEO of a bunch of different companies, but he is also a right-wing activist and propagandist, just definitionally, by what he does. He's going across the country paying people millions of dollars and campaigning for one presidential candidate.
And it is just absolutely mind-blowing to me when you look back at the outrage in 2016, the outrage from Republicans in 2020, the outrage post-Hunter Biden laptop, all this stuff over what were, by comparison, smaller bits of supposed interference: putting your hands on the moderation dials and trying to cut back misinformation in a way that hurt Ben Shapiro's Facebook page, and the congressional subpoenas that followed.
And meanwhile, now you have Elon Musk, who has essentially transformed a social network around one political ideology, and it's just wild to me that nobody cares about that. It's a testament to the fact that this was never about equality of ideas. This was never about free speech in that way. It was always what a lot of people supposed, which is this cynical notion of working the refs and trying to make sure that far-right propagandists were able to spew absolute and utter lies whenever they wanted.
Justin Hendrix:
One of the things I keep thinking about is what the rest of the world must think watching this situation unfold, that moment when Donald Trump and Elon Musk were on stage together and Musk was leaping into the air trying to raise the crowd's enthusiasm for Trump. What must it be like to look at these two right-wing billionaires on a stage, often spouting falsehoods, so close to potentially leading, once again, one of the world's great superpowers? I suspect if I were a political leader in Europe or some other part of the world, I'd be very afraid of what might be about to happen next.
Charlie Warzel:
Yeah. Truly, it is a wild outcome. I think when Musk was first talking about doing this and buying Twitter, it was very clear, at least to me, and I know to others, that he had not done any of the homework on what it takes to run a social media platform. And I think there were thoughts, based on the way that he had been tweeting and the reactionary politics he was starting to really get involved with or show sympathy towards, that he would run this in a vengeful, dark way.
I suspected that what was probably going to happen was basically turning Twitter in 2022 back into 2016 Twitter, which is to say: we really don't understand how to do content moderation, there's going to be a lot of awful stuff, people are going to get harassed, but whatever.
I didn't see, and this is my own shortcoming, the extent to which he would become a political actor in his own right. I thought that the free speech posturing was the way he wanted to position himself, which is to say, it was very clear what his politics were, but he was never going to take the mask off fully. And he did.
I think it's going to be really interesting, if things go Trump's way next week, to see what level of power Elon Musk has. Because you have these two very singular figures who have boundless egos, but Trump is known for shiving pretty much anyone who pisses him off, no matter how loyal they have been in the past. I think it remains to be seen how that relationship would play out.
But I saw a tweet or two of Musk's today, and he has become arguably the biggest anti-mainstream-media crusader. Unbelievable. And I do think from a global perspective, yeah, it's chaos. It's absolute chaos. And from a domestic perspective, something I'm planning to write about, the connection I've made in my mind, is that there have been a number of lawmakers recently who've mentioned Elon Musk's handling of Twitter as sort of the aspiration for their goal of gutting the federal government, right? Vivek Ramaswamy mentioned it as his aspiration in a podcast that came out yesterday or this week.
And when you pair that idea with what Musk has done with Twitter, and with the fact that Musk is coming out now and saying, "If we take power next year, expect a lot of temporary financial hardship. Expect the markets to tank, because we're going to be doing a lot, and there's going to be a shock to the system. But don't worry, we'll correct it. We'll be good."
Justin Hendrix:
And let's be honest about what those things are. We're talking about deporting 20 million people. We're talking about cutting, they say, $2 trillion off the federal budget. We're talking about massive tariffs that would likely devastate the economy in the short run and drive up massive unemployment, not to mention trade imbalances. Who knows, maybe we get ourselves a war. These are not small shocks.
Charlie Warzel:
No. And also just the simple notion of cutting 80% of the federal bureaucracy. You'd just have agencies, infrastructure projects, things like that, that don't have anyone manning them anymore, truly parts of our government or infrastructure where people will be asleep at the wheel.
But it's such a scary thing to think about in that regard because it's terrifying just to think of the fact that there are so many people who look at what Elon Musk has done with Twitter and see it as this shining success story.
But in reality, if you look at it, what has happened is that he's destroyed the business of Twitter completely. It is propped up simply by a whole bunch of debt and banks who are taking an absolute haircut on all this stuff. And there are no prospects for growth beyond this idea of grievance.
And I think it's a really instructive thing, though, for what a potential Trump administration would do. Because Elon Musk transformed Twitter not into a better product, not into a more successful product; he took the infrastructure of Twitter and turned it into a weapon that worked for them. And I think that's exactly the way to look at what a Trump administration, an administration with Elon Musk's influence in it, would do with the federal government. It would gut a lot of the bureaucracy, but what remained would then be leveraged as a weapon, an ideological tool that benefits a certain ideology, a certain group, and a certain set of values at the expense of others.
So I think that Twitter really is the blueprint, right? It is the blueprint for a Trump administration, and I think that's genuinely terrifying, because running a social network, as we have seen, is not the same as running pretty much anything else. It's its own thing. You can gut 80% of the employees of a software company and hope the servers keep running. I don't think you can do that with every government agency, so it's a bracing thought.
Justin Hendrix:
One of the things that struck me about Musk's own rallies, the ones he did himself in Pennsylvania over the last couple of weeks, were some of the vox pop interviews that journalists did. They came across, I should say, mostly young men who were huge fans of Elon Musk. What happens to Musk if Trump loses? What do you suspect happens? Is he elevated, or does Donald Trump blame him? Is he somehow diminished by losing on this big stage, or do you suspect it somehow works out for him?
Charlie Warzel:
As wary as I am to make predictions, I think that Elon Musk and Donald Trump share a quality that, obviously it's not singular if they share it, but it is incredibly rare. And it is an ability to attract attention, and an ability to warp reality to their own way of seeing things. And an outgrowth of that is that they get in trouble, but it tends not to work in the same way that it does for most people, right?
Donald Trump is a convicted felon, and he's potentially days and a couple million votes away from being president again, right? There's that classic meme of I'd like to see Donald Trump slip out of this one, right? If you've been following him for over a decade, he says the thing you're not supposed to say. He does the thing that's going to get you in trouble. He gets in actual legal trouble, he gets convicted, and yet here he is. He's still here.
And I think that Musk has a bit of that himself. He does get in trouble. He's gotten in trouble with federal agencies for his tweets and things like that, but those things don't seem to dog him in the same way. Even the purchase of Twitter was this terrible financial deal that by now probably should have already come home to roost, right? But he's managed to structure it financially in a way that it's held together with duct tape, and for the very short term, for the time being, it's working. So with him putting all his eggs in this basket, I don't see it as his downfall if Harris should win.
I think we should look at Donald Trump as the analogous figure here, which is to say there may be a period of being chastened, or of changing the strategy a little bit. But if you look at X right now, X and this election integrity community is a platform that is setting the stage for the next Stop the Steal moment, right? All of his tweets, everything that he's doing, is setting him up, should it not go his way, to claim that the election was stolen. So that's just going to mean more advocacy, more of him ingratiating himself with this movement of people who are going to be denying whatever reality happens unless Trump wins.
Should Harris win, I see Musk just going way further down the rabbit hole, becoming far more extreme, turning X into essentially a resistance, alternate-reality media hub.
And I think from the standpoint of his businesses, I really try to look at Tesla and SpaceX as almost apart from him. I know that they're tied to his wealth. I know that his mercurial nature could have massive effects on the stock prices and on the businesses themselves. But SpaceX is run by competent people who are launching and catching rockets while Elon Musk is campaigning for Donald Trump and tweeting like a radicalized grandfather.
So I think the government right now at least needs Elon Musk for some of its space exploration projects for better or for worse, because there's not really another option. Tesla is a business unto itself that is meaningful in a lot of different ways that have nothing to do with Elon Musk.
So I don't see this as the massive reputational bet that is going to come back and bite him. I think the only way this whole political experiment really hurts him is if Donald Trump wins, he puts a lot of stuff behind Donald Trump, and Trump casts him aside, as he does to people like that. That's the situation where Elon Musk is screwed: he put all his eggs in the basket, the basket gets chosen, and then he gets removed from said basket. That's how I'm thinking about it, but I think it could go any way.
Justin Hendrix:
We'll see. My own personal thought about this is that it seems to me Musk may well be a much more durable figure going forward for MAGA, but also perhaps even more generally for the far right globally. If you follow this guy closely, you realize it's not just Donald Trump he's in bed with. Trump is not the only right-wing leader, and the US is not the only nation, that he is working closely with and trying to wield his influence over, all over the world, from Argentina, to Italy, to Hungary, to various other countries. Sometimes it's tied to his interests, his business interests, his need for raw materials, resources, or capital, and sometimes it seems to just be about politics. But I want to ask you-
Charlie Warzel:
And adulation too. People who are willing to see him the way he wants to be seen I think. He's similar to Trump in that way, right? He will glom on to certain people, not really care what they represent or what they do, as long as they're willing to, at least on the surface, treat him the way he believes he ought to be treated.
Justin Hendrix:
So I think all these things are reflexive and participatory, so I don't want to suggest there's a supply side and a demand side when it comes to the types of ideas that Donald Trump and Elon Musk peddle. But certainly, Donald Trump and Elon Musk are elites, and they're selling some of this mis- and disinformation to audiences who are willingly lapping it up, helping to produce more of it, sharing it, etc. We're in what I guess Kate Starbird calls a participatory model these days.
But I want to tie that to something else you wrote earlier this month about the information ecosystem, or issues in the information ecosystem, in the aftermath of the hurricanes in the American Southeast. You wrote, "What we're witnessing online during and in the aftermath of these hurricanes is a group of people desperate to protect the dark fictitious world they built rather than deal with the realities of a warming planet hurling once-in-a-generation storms at them every few weeks. 'They'd rather malign and threaten meteorologists who in their minds are nothing but a trained subversive liar programmed to spew stupid shit to support the global warming,' as one X user put it."
There is a possibility, whether Donald Trump wins or loses, that this is America, that this vision that they're reflecting back to us on X these days is more America than perhaps we'd like to believe.
Charlie Warzel:
Yeah, I want to be careful with that in terms of more or less, but the first line of that piece is that a meaningful percentage of Americans have dissociated from reality. And that's the way that I like to phrase it, because I think the internet is always going to show us a funhouse-mirror version of the world. But there are certainly a lot of folks who are working very hard to protect a very specific type of worldview and inoculate themselves from other viewpoints, or from things that are painful to consider about the nature of reality.
I wrote that piece in part because of what you're saying here, which is wanting to give a little bit of agency to this group. And a lot of the idea around this comes from reading the work of Michael Caulfield at the University of Washington, his work on 'copium addicts' and this idea of making sure that we think about misinformation not only in terms of persuasion. I think we are actually doing a real disservice to people, and honestly insulting the intelligence of a lot of our fellow Americans, by just seeing them as rubes or dupes who don't know anything about the world, who see a tweet or something and say, "Yep, that's the way it is." We should see it more the way that Caulfield talks about it, which is that misinformation is a tool to keep people inside these worldviews, right? It's a tool to help people not have to grapple with what's going on outside.
And I think that's really a smart way to think about it. A lot of people glaze over when I talk about it this way, because a lot of people are really angry and don't want to think of people who are believing and peddling these lies, and living in an alternate reality, as anything other than malicious actors. And I do think that they need to own that, obviously.
But I also think we do ourselves a bit of a disservice by not thinking as much about how hard and painful it is to change your ideology, and the lengths that people will go to in order to protect it. If you grow up and everyone around you has a very clear understanding of the world and how it is, and one of those things is that global warming is not real, then for you to acknowledge that global warming is real, that the climate is changing, that it's doing all of this, would mean having to cut ties with people in your community, having to ostracize yourself, having to be outside of that. It's painful.
So I think it's almost understandable in that sense how people are recruited into these types of ideas, how people would rather say, "Yeah, you know what, it is the government using nuclear weapons on these hurricanes or using weather weapons to manipulate these storms," because it's so much easier for them to believe, or to pretend to believe, that Kamala Harris and the Biden administration are doing this in an election year to win votes or to hurt people in red states than it is to understand that, essentially, they're voting for someone whose climate policies are so retrograde that they're going to accelerate all of these problems, and we're going to just get more and more extreme weather events that adversely impact these communities in some of these places.
So I think it's both really important to recognize their own agency in that, not to think of them as dupes, and also to recognize the reason why this is happening. When people ask, "Okay, yeah, we've heard about the misinformation crisis. We've heard about all these types of things. What do we do about it?", I think there's always this idea that there are quick solutions, that we'll content-moderate ourselves out of this problem, or we'll add a couple of media literacy programs in schools and we'll really get ourselves to a better place.
And these solutions are so massive and structural. When you talk about even the decline in trust in media, you have to address the problems with local journalism and why it's dying, because so much trust in media is built at the local level.
I wanted to add a little bit of complexity in there instead of just pointing and saying, "This information's good. This information's bad. These people are bad for believing it. These people are good." No, the problem is just so huge and that's why we're in this awful position where a lot of people seem to want to dissociate from reality.
Justin Hendrix:
I'm going to ask you one last question, which is what you're looking out for in the post-election period. It strikes me that it's difficult to imagine how things might have been different. Maybe they wouldn't have been different at all, but you think back to the kind of singular role that Twitter seemed to play in kicking off the events of January 6th. You had this tweet from Donald Trump, the 'be there, will be wild' tweet, and then this flurry of activity that took place there. And then of course, shortly after that event, Jack Dorsey, seemingly against his better instincts, decided to go along with the idea that Trump should be removed from the platform. It's very hard for me to imagine a scenario where Jack Dorsey actually said, "Nope, we're all in. The election was stolen, and we're going to use the platform to promote that idea in this moment." I don't know, how are you thinking about the more dire possibilities for the post-election period should Trump lose and challenge the outcome?
Charlie Warzel:
First of all, Jack Dorsey, man, what is he thinking right now about all of this? I would love to really be able to get him on the record about that kind of stuff. I have ideas, but given some of the stuff that he said in 2022, about how "Elon is the only one I trust," it's very fascinating to wonder what he's thinking seeing all this.
But that aside, I want to push back just slightly on the idea that Twitter was that influential. It very well may have been in terms of Trump's megaphone and all of that on the day.
Justin Hendrix:
I'll be very clear, I do not think that social media caused January 6th or that it was necessarily a causal spark. But certainly the platform played a role, and not even just in those last days or weeks, but you could argue in the months and years preceding it with things like allowing the QAnon movement to grow there. That's the way I think about it.
Charlie Warzel:
Yeah. What I was going to say is that I think Facebook actually had a really outsized role in the Stop the Steal stuff. I just remember, I don't think it was the day after the election in 2020, it might've been the Thursday, but I just remember refreshing that main Stop the Steal group on Facebook and seeing 15,000 people joining every two minutes. The community that it was able to create there then clearly splintered off into real factions that showed up on January 6th. So I do think the platforms played some role for sure. I just think Facebook maybe had more influence than Twitter in that moment.
But anyway, what I'm looking for, what I'm thinking about: I'm extremely frustrated in the sense that, and I say this with all irony, the least chaotic path in the short term, for the next six weeks, is that Donald Trump wins decisively, right? Because Democrats, because Kamala Harris, will accept the results of the election, where it seems Donald Trump will not, right? So the most chaos comes from an extremely tight race in which Harris wins. That is where all of this Stop the Steal stuff would probably go the furthest, and this is all very obvious. It's why the Harris campaign is asking people to show up in such force that she wins in no uncertain terms.
One thing I'm thinking about for the coming weeks is for everyone, including journalists, to remember that this is not 2020. The biggest difference between 2020, and what happened after election day then, and whatever will happen now, is that Donald Trump is not the president. If there has to be a transfer of power, Donald Trump is not the one who has to step down. That's really meaningful, and I think we actually aren't talking about that enough. There is this tendency to look at this election strictly through the prism of 2020, and that's just a fundamental difference. So I think there is something there, and the actual process of certification in this moment will be handled by people who I think can be trusted to certify the election however it goes.
I think that a lot of what we will see will be localized, whatever it is. What I'm trying to think of is, would there be an October surprise? Obviously we're recording this on the 1st of November, and there hasn't been a crazy surprise or anything. But if there is an election day misinformation debacle or something like that, I can see things happening really locally, something where someone says, "At this polling place, a pipe broke, it's flooded, you can't go there," directing people away from their local polling place to discourage them from voting. Little stuff like that, I think, is something to keep an eye on. All these things are going to be smaller, but there's a sort of death-by-1,000-cuts problem there.
I think January 6th, the spectacle of that, the violence of that, the centrality of that moment, those things tend to be pretty rare. But it doesn't mean that there won't be localized versions of things happening, right? Looking at all those swing states: Pennsylvania looks like it could be a place, especially with a delayed count in some of these counties, Montgomery County, Delaware County, those types of places. Arizona, and I think Georgia. Looking in those spots and paying attention to those little areas is going to be important and interesting.
But the most depressing thing, I think, about all of this is that I don't know how much news gatherers can do to stem the tide in this moment. When you look at the internet the way that it is right now, it's so primed to accept only a certain set of outcomes. Everyone in the right-wing ecosystem is in their positions and knows how to react should this not go their way.
And I think it's going to be a really intense period in that way, but we all have to keep in mind, too, that Donald Trump is not the president. I just want to ask you what you're looking at and what you're looking towards, because I think it's important for me to see this from multiple perspectives.
Justin Hendrix:
I think you're right that it's likely there won't be a big spectacle like January 6th. Certainly, there's a different security profile around those types of events in Congress. The certification is a national special security event now, so it'll receive the appropriate amount of security. It's unlikely that something like that would occur again.
But yeah, I think places like Pennsylvania, Wisconsin, Michigan, other swing states, it strikes me as a real possibility that if there's going to be some kind of political violence or real effort to dispute the outcome, it'll probably take place in those states.
If you look even back at 2020, I don't know if you remember, there was a man arrested for a plot in Pennsylvania that had to do with claims that the election had been stolen. That one was disrupted. But we'll see if perhaps that is the case again.
But again, even our focus on it right now, as you say, could turn out to be anxious speculation if in fact the outcome is clear one way or the other. We might be primed to be afraid of something that won't come to pass.
But it does strike me, just as you say, for instance, at the end of that article about mis- and disinformation to do with the hurricanes, that no matter what happens next week, the desperation and the sense that something's been taken from them will remain, even if the election isn't one of those things.
Charlie Warzel:
Yeah. As you're talking, it's also helping me clarify what I mean when I say localized. Whether it's the hurricanes or any of these examples of what we see, and especially what we're going to see around the election, election day, and the voting, we're seeing some of it now.
But I think the thing to focus on, or that I hope people focus on, is the impact, the effect, the human toll of all of this. I wrote at the end of that piece that one way to look at this is as a cultural and political assault on reality. And the people who get hurt in that are the people who have to attend to reality: if you're a librarian, a doctor, an election worker, a teacher, somebody who has to be a steward of the way things are. If you're an election worker, all you can do is deal with the votes that come in, the way that they are, and how they're tabulated, right? But if that's counter to the reality that this group wants, those people will be the ones on the front lines. They will be the ones who are assaulted, just like public health officials during Covid, librarians, doctors, teachers, members of the press, whatever. They will be the ones, because they're interacting with reality, because they are just having to attend to the way that things are. They'll be the ones who get harassed, who become the victims, who get put in the spotlight just for trying to do their jobs.
We saw this with public health officials during Covid: they become the "enemy" to these people who are having to deal with the fact that reality is not going their way. So that's what I'm watching for, and what I'm hoping other people do too: looking for these examples of the human toll of this and trying to highlight them.
Because I do think one way to counter this, and it may seem naive, is trying to appeal to whatever decency people have in them and saying, "Yes, you can believe this. Yes, you can believe that. But look at what is happening to this person. Look at this underpaid election administrator who is just trying to do their job and getting death threats, and having police cars outside their house at night." Things like that. I don't know if it's an effective approach, but I do think focusing more on the human toll of this, rather than the broader informational toll, is hopefully a way for people to understand the depth of the problem.
Justin Hendrix:
Thank you very much. I direct my listeners to the Galaxy Brain newsletter on The Atlantic. Charlie, thank you so much.
Charlie Warzel:
Yeah, thank you for having me. I really appreciate it.
Related Reading:
- Are Platforms Prepared for the Post-Election Period?
- Tracking Elon Musk's Political Activities
- Seeing Rising Election Misinformation, Americans Say Social Media Platforms May Bear Responsibility for Political Violence
- Tech Platforms Must Do More to Avoid Contributing to Potential Political Violence
- Evaluating the Role of Media in the January 6 Attack on the US Capitol
- How to Reduce the Danger of Social Media Facilitating Political Intimidation and Violence
- Deplatforming Accounts After the January 6th Insurrection at the US Capitol Reduced Misinformation on Twitter
- The Science of Social Media's Role in January 6
- Read the January 6 Committee Social Media Report