Episode 3: Talking About Garbage
A lot of the time, when we talk about disinformation, it’s like we’re talking about garbage—not what’s in the garbage, or who made the garbage, or why the garbage spreads, just that there is garbage and we have to get rid of it. That’s a mistake.
Host Alice Marwick explores the relationship between disinformation, extremism, and media manipulation. Who’s behind white supremacist disinformation? What are their motivations? Why is this content so sticky, and how does it keep getting pushed into the mainstream?
About our experts
Host: Alice Marwick
Alice E. Marwick is an Associate Professor in the Department of Communication at the University of North Carolina at Chapel Hill, a Faculty Affiliate at the UNC Center for Media Law and Policy, and Faculty Advisor to the Media Manipulation Initiative at the Data & Society Research Institute. She researches the social, political, and cultural implications of popular social media technologies. In 2017, she co-authored Media Manipulation and Disinformation Online, a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of 2017’s Global Thinkers by Foreign Policy magazine. She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene that examines how people seek social status through online visibility, and is co-editor of The Sage Handbook of Social Media (Sage 2017). Her current book project examines how the networked nature of online privacy disproportionately impacts marginalized individuals in terms of gender, race, and socio-economic status.
Guest: Jessie Daniels
Jessie Daniels is a Faculty Associate at the Harvard Berkman Klein Center and a (Full) Professor of Sociology at Hunter College, and affiliate faculty in Africana Studies, Critical Social Psychology, and Sociology at The Graduate Center-CUNY. Daniels is an internationally recognized expert on Internet manifestations of racism. She is the author of several books, including White Lies (Routledge, 1997) and Cyber Racism (Rowman & Littlefield, 2009). She is currently at work on another book in this series, Tweet Storm: The Rise of the Far Right, the Mainstreaming of White Supremacy, and How Tech and Media Helped. Daniels’ latest book, Nice White Ladies: The Truth about White Supremacy, Our Role in It, and How We Can Help Dismantle It, tries to understand what’s behind the “Karen” phenomenon, and so much more about white women’s role in the current political landscape.
Guest: Whitney Phillips
Whitney Phillips is an Assistant Professor in the Department of Communication and Rhetorical Studies at Syracuse University. Phillips’ research explores antagonism and identity-based harassment online; the relationship between vernacular expression, state and corporate influences, and emerging technologies; political memes and other forms of ambivalent civic participation; and digital ethics, including journalistic ethics and the ethics of everyday social media use. She is the author of This is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture (MIT, 2015) and the co-author of The Ambivalent Internet: Mischief, Oddity, and Antagonism Online with Ryan M. Milner of the College of Charleston (Polity, 2017). Phillips and Milner’s new book, You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape explains how to understand a media environment in crisis and how to make things better by approaching information ecologically.
CITAP panelist: Francesca Tripodi, Assistant Professor – UNC Chapel Hill School of Information and Library Science, CITAP Senior Faculty Researcher
CITAP panelist: Becca Lewis, PhD Candidate at Stanford University, CITAP Graduate Research Affiliate
CITAP panelist: Katie Furl, PhD student at UNC – Chapel Hill, CITAP Graduate Research Affiliate
In this episode, we discussed:
- You Are Here: A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape. MIT Press. Whitney Phillips and Ryan M Milner.
- Black Trolls Matter: Racial and Ideological Asymmetries in Social Media Disinformation. Deen Freelon et al.
- The Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators. Data & Society. Whitney Phillips
- This Is Why We Can’t Have Nice Things: The Origins, Evolution and Cultural Embeddedness of Online Trolling. Dissertation, University of Oregon. Whitney Phillips.
- This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. MIT Press. Whitney Phillips
- Nice White Ladies: The Truth about White Supremacy, Our Role in It, and How We Can Help Dismantle It. Seal Press. Jessie Daniels.
- Cloaked websites: propaganda, cyber-racism and epistemology in the digital era. Hunter Publications and Research. Jessie Daniels.
- “Critical Disinformation Studies: A Syllabus.” CITAP. Alice Marwick, Rachel Kuo, Shanice Jones Cameron, and Moira Weigel.
Kathryn Peters 0:04
Welcome to Does Not Compute, a podcast about technology, people, and power by the Center for Information, Technology, and Public Life at the University of North Carolina. This week, our host is Dr. Alice Marwick, an associate professor in the Department of Communication and a principal researcher at CITAP. Our guests are Dr. Whitney Phillips of Syracuse University and Dr. Jessie Daniels of Hunter College. We’ll also hear from CITAP researcher Francesca Tripodi and graduate affiliates Becca Lewis and Katie Furl.
Alice Marwick 0:32
Hi. I’m Alice Marwick. I study social media and since 2016, I’ve been researching online disinformation, especially disinformation that’s spread by fringe, far right, and conspiratorial groups. I’m most interested in white supremacists because I think a lot of disinformation pivots on appeals to white identity and frankly, to racism.
Shannon McGregor 0:57
It’s like the disinformation that sticks, in some ways, sticks because it plays on these tropes, right? Racist tropes, sexist tropes, like, whatever, all sorts of biases that people have, either explicitly, you know, or implicitly, right? That obviously vary across people’s identities themselves, you know, the sort of receiver of some of that, and so it makes it more powerful.
Alice Marwick 1:20
That’s my CITAP colleague, Shannon McGregor. She studies appeals to white people in political communication. Another of my colleagues, Deen Freelon, has looked at how the Russian Internet Research Agency used anti-Black stereotypes to amplify their disinformative content. Race, gender, nationality, and sexuality are often where disinformation campaigns pivot. But despite knowing this, a lot of the time when we talk about disinformation, it’s like we’re talking about garbage. Not what’s in the garbage or who made the garbage or why the garbage spreads, just that there is garbage and we have to get rid of it. I think that’s a mistake. In this episode, I look at the relationship between disinformation and far right extremism, specifically white supremacy. Who’s behind white supremacist disinformation? What are their motivations? Why is this content so sticky? And how does it keep getting pushed into the mainstream? I’m gonna start with a guy named Andrew Anglin. Andrew Anglin is a notorious neo-Nazi. He founded a site called the Daily Stormer in 2013. He ran a previous site that was full of his, like, long essays about fascism, and he realized that had limited appeal. So he founded the Daily Stormer to basically be the Gawker or BuzzFeed of white supremacy, to appeal to millennials through clickbait, memes, and outrageous humor. So I’m going to give you an example. After Charlottesville, the Daily Stormer ran a story with the headline, “Heather Heyer: Woman Killed in Road Rage Incident Was a Fat Childless 32-year-old Slut.” “Despite feigned outrage by the media, most people are glad she is dead as she is the definition of uselessness,” Anglin wrote. “A 32 year old woman without children is a burden on society and has no value.” The site is so over the top that sometimes it’s hard to tell if it’s a parody or not. But this was exactly Anglin’s goal.
He calls the site quote, “non-ironic Nazism masquerading as ironic Nazism.” In 2017, the Daily Stormer style guide was leaked to Ashley Feinberg, who at the time was a journalist at the Huffington Post. Here’s an excerpt.
From the Daily Stormer style guide 3:32
“The tone of the site should be light. Most people are not comfortable with material that comes across as vitriolic, raging, non-ironic hatred. The unindoctrinated should not be able to tell if we are joking or not. There should also be a conscious awareness of mocking stereotypes of hateful racists. I usually think of this as self-deprecating humor. I’m a racist, making fun of stereotypes of racists, because I don’t take myself super seriously. This is obviously a ploy. And I actually do want to gas k****, but that’s neither here nor there.”
Alice Marwick 4:07
This irony is familiar to scholars like Whitney Phillips, who studies digital folklore and trolling culture. Whitney is an Assistant Professor of Communication and Rhetorical Studies at Syracuse University. Her dissertation and her first book were on 4chan, a website famous during the 2000s for spawning memes like Lolcats and Rickrolling, as well as the vigilante activist group Anonymous. The kind of humor that the Daily Stormer uses has its roots in this 4chan subculture.
Whitney Phillips 4:35
So I started looking at what I now refer to as subcultural trolling in 2008. When I say subcultural trolling, I’m referring to this particular group of people on and around 4chan’s /b/ board, the random board. One of the markers of the subcultural trolls, first of all, is that they self-identified. Trolling was something that they actively set out to do. They referred to themselves as trolls. And they engaged in a number of strategies and tactics that were essentially focused on media manipulation. They didn’t call it that, but that’s what they were doing, and the strategies that they employed related to swaying public opinion, trying to drive wedges between groups, weaponizing the news cycle and any sort of hot button issue they could get their hands on, and generally gaslighting people. And the tactics within those sorts of strategies would be, you know, strategic targeted over-the-top provocation, coordinated harassment campaigns, various forms of sock puppeting, so they would pretend to be members of particular groups to drive wedges within those groups, swarming online polls, organizing false boycotts and outrage campaigns. And all of those strategies and tactics were employed specifically to generate what they referred to as lulz, L-U-L-Z, which was a particular kind of antagonistic laughter that sort of pointed to and took great joy in the distress of other people. And it sort of hinged on this idea that lol, nothing matters, nothing should be taken seriously, the only reason to do anything is to try to extract lulz. So they would utilize these strategies in a very self-conscious way to elicit lulz and then, you know, to basically laugh, and that’s what they did.
Alice Marwick 6:27
The lulz allowed people on 4chan to use hateful words and joke around using racist slurs by claiming that nothing mattered because it was just on the internet. You could do or say anything, as long as it was funny.
Whitney Phillips 6:38
On 4chan and in trolling spaces, this was an irony of lol, nothing matters. It was this belief that there were no consequences for anything, and everything just got to be an enormous joke. And so I describe it sometimes as lulz as an aspirational register. It’s not just the kinds of jokes that were being made, but a kind of aspirational quality of “and it is good,” “and it is proper that we don’t actually have to worry about consequences or think about the context. We’re just laughing in this really disconnected, dissociative sort of way.”
Alice Marwick 7:15
This type of irony has a bunch of different effects. It normalizes hateful speech, and allows people who might be curious about the alt right or white nationalism to dip their toe in the water. It creates plausible deniability. You can say, “Oh, I’m not racist. I’m making fun of racists.” But nobody who’s anti-racist is going to stick around in a space where racist speech is common. So it makes sense that a site like the Daily Stormer is comfortable with this irony as well, because being able to joke around about racism or sexism is a sign of privilege.
Whitney Phillips 7:47
So it isn’t surprising that that’s the kind of humor that mapped on to these really, really white and really, really male spaces, which is how, you know, 4chan sort of operated; it was very white, very male. And over time, in particular, sort of seeing how this irony has played out politically, I would argue that that kind of irony, that nihilistic irony, is in some ways kind of synonymous with white supremacy, certainly structural white supremacy, where white people are central, and their experiences are universalized, and their movements are always protected by institutions, and generally they’re shielded from a range of dangers. That’s white supremacy: they get to be ironic, they get to wink when they’re being racist. They get to hide behind this rhetorical out. Because the people who are making those jokes, who are embracing that kind of humor, they’re removed from the world’s horrors; they’re shielded from it. You don’t have to take anything seriously when your body and your spirit are not constantly under attack. You get to detach yourself entirely. You get to pick and choose what you want to care about, and then chuckle darkly about everything else. And that’s what trolling subculture encoded into its early memes and into this general idea of lulz.
Alice Marwick 9:08
It’s precisely this deniability that makes this nihilistic irony so dangerous, because it allows white supremacist ideas to leak into the mainstream by disguising them as ironic humor. In 2018, Whitney wrote a report called “The Oxygen of Amplification,” which looked at how mainstream journalists had covered extremists and the alt right during the 2016 election.
Whitney Phillips 9:32
And what happened in 2016, during the campaign, is that all of this trolling stuff had become pretty recognizable to people who were conversant in certain internet spheres. They recognized what they thought was sort of old school trolling, classic trolling, where lol, nothing mattered. It was all for the lulz. They saw that within these MAGA spaces, and assumed that that meant that everything they were seeing was just trolling as usual, that all the swastikas and all of the white supremacist language, it wasn’t really real. It was just trolling. And so the initial kind of red flags that potentially could have been and should have been raised were not, because a lot of people who were very familiar with trolling culture and meme culture and 4chan, they thought it was all the same kind of joke.
Alice Marwick 10:17
Whitney’s report found that a lot of millennial journalists saw the lulzy troll culture of places like Reddit’s r/The_Donald or 4chan or the Daily Stormer and wrote it off as a joke, rather than seeing it as actual white supremacy. And that’s exactly what Andrew Anglin is trying to accomplish with the Daily Stormer. Ironic humor is one communication strategy of white supremacists. But it’s not the only one. Extremist communities produce a lot of different types of content, and it doesn’t all look like 4chan or the Daily Stormer. To find out more, I talked to another expert. So what is the relationship between extremist communities and disinformation?
Jessie Daniels 10:57
Yeah, it’s a great question, and one I’ve been thinking about for a long time, some of my research has hit upon this. And from where I sit disinformation has long been a tool that’s popular with white supremacists.
Alice Marwick 11:08
That’s Jessie Daniels. She’s a professor of sociology at Hunter College, and she’s studied white supremacists and their online presence since the 1990s.
Jessie Daniels 11:17
So in the early, I guess it was the mid to late 90s, I started getting curious about what white supremacists were doing online. And one of the things that I stumbled across, actually through a student of mine, was a site called MartinLutherKing.org that was online for a really long time. It’s not online now, but you can find it through the Internet Archive. But I was really curious about this site, because it looked like a tribute site to Dr. King. And in some ways, the graphic user interface, the GUI, was really crude. It looked like, you know, an amateur had created it, maybe a young person. But on the other hand, it was actually a very sophisticated form of disinformation, especially for the mid to late 90s. Because, you know, when you scroll all the way to the bottom of that page, you discover that it’s hosted by Stormfront, which is a large white supremacist portal, which is still online, and they invite you to come to Stormfront to discuss Martin Luther King. So how is it disinformation? Part of what this MartinLutherKing.org site did was it understood very early on and took advantage of the innovation and the lack of gatekeepers in that early form of the web. And part of what they were doing there was saying, “we’re going to try and challenge the successes of the civil rights movement” in a way that is what experts refer to as epistemological. And that’s a fancy way of saying it’s challenging knowledge claims. In other words, how do you know what you say you know about Dr. King? How do you know what you say you know about the civil rights movement? How do you know what you say you know about racial equality? And so part of what they were doing with that early site is they were challenging those claims of, you know, racial equality that were won through the civil rights movement by undermining Dr. King’s legacy through that site.
Alice Marwick 13:15
What Jessie is pointing to is another tactic of white supremacists: creating what she calls “cloaked sites” that look legitimate or even scholarly, but actually work to spread white supremacist ideas. The goal of these sites is opening the Overton Window. The Overton Window is the range of acceptable ideas in political discourse.
Jessie Daniels 13:35
One of the other ones I found had this sort of multicultural sounding title, “The Forgotten Voices of the Formerly Enslaved.” And basically what they did was they used historical records from the Library of Congress, which has an archive of WPA recordings of formerly enslaved people. And they took some of those quotes out of context, and re-formed them, repositioned them on this website, to make the claim that slavery wasn’t so bad. I wanted to know what young people thought about these disinformation sites and how they made sense of them. And so I found young people ages 15 to 19, who were going to, you know, elite high schools in New York City, and asked them if they would, you know, use any of these sites for their own research if they had to do a paper in high school. And one of the young people told me, about the one that was about slavery, she said, “I guess I can see their point, there’s two sides to everything.” And that really struck me for a lot of reasons. But one of the things that people were saying at the time was that white supremacists were using the internet to recruit people into their groups. And I thought, this is not what’s happening with this young person. She’s not going to join up, you know, and put on a funny outfit and be one of those kinds of white supremacists, but in some ways, what was going on was much more pernicious. She was having her mind changed about, you know, the evils of slavery and whether it was a good institution or a bad institution, you know. And that seemed to me a much more nefarious use of disinformation than we were seeing, you know, at that point in the internet. And that’s a white supremacist talking point, right, that slavery wasn’t so bad.
Alice Marwick 15:24
The Forgotten Voices site is an attempt to open the Overton Window, to ask, “was slavery that bad?” And I’m just gonna go ahead and answer that. Yes. Yes, it was.
Jessie Daniels 15:36
Part of what I found, and have talked about since then, is this idea that they’re innovation opportunists: they take every innovation in technology or media, whatever it is, and they find the opportunity in that for spreading white supremacist ideology.
Alice Marwick 15:53
Jessie highlights the fact that extremists are really good at modern media. This brings up another tactic that extremists use to spread their ideology, manipulating the media to push their viewpoint into the mainstream. Just like 4chan delighted in spectacular hoaxes and pranks, modern day alt right trolls like Anglin stage ridiculous campaigns that often get picked up and amplified by mainstream media.
Jessie Daniels 16:18
So this is sort of 2015, was around the time there was this kerfuffle about the racial identity of a woman who was then known as Rachel Dolezal. She’s since changed her name. But in case people don’t remember this particular event, or series of events that happened, Rachel Dolezal claimed to be Black, and sometimes African American, when in fact she was raised white. And so there was this, I guess you would say, charge of being an ethnic fraud, is the way some people talked about it. The term that she used for it was that she was transracial. And part of what happened in the controversy following that claim, that she had crossed over, if you will, into a Black identity, is that internet trolls on the far right got hold of this, and sort of started playing with this equivalence, right? If people can be transgender, then why can’t they be transracial? And they started goofing around with that. It was a way to score points. You know, sometimes people talk about “owning the libs”; it was a way to own the libs. And the trolls on the far right started a hashtag called #wrongskin. And there was someone, I don’t actually know who, that started this Twitter account of Abu Shakur, a guy who said he “was born Tom Briggs, my slave name,” but he’s now Abu Shakur, one of the leading Black voices of our generation. And then he had all these hashtags about #transracial, #wrongskin. And in a way, you know, it’s a more contemporary example of the same kind of thing that was going on with the Martin Luther King site, which is playing with how we know what we say we know about race and gender and identity, and it’s trying to subvert it. This is different, though, in the kind of social media internet meme era, because part of it is a joke. Like, even if you found out who the actual person was that started this and tried to hold them accountable somehow for this, they would just play it off as, well, “It’s a joke. It’s a joke, why can’t you take a joke?” And it also works on the level of people who are already inclined to sort of believe in white supremacist ideology. You know, it’s like an inside joke for them. And then for people who are not used to thinking about race, it’s just really confusing, right? It’s like, “Oh, well, on the face of it, that sounds like it makes sense. Like, why couldn’t you be transracial, if you can be transgender?” Right. What it does is it ends up shifting the terms of the debate, you know, so instead of talking about racial equality, and how Rachel Dolezal’s performance hindered, you know, the goals of racial equality, we end up talking about this false debate, you know, this false equivalency between transgender and transracial. So in a way, that #wrongskin hashtag sort of, you know, saw an opportunity in this controversy about race, and then used the affordances of Twitter to sort of create a disinformation campaign.
Alice Marwick 19:24
Media manipulation is like twisted PR mixed with search engine optimization. It takes advantage of the norms and work practices of journalists and manipulates trending topics and keywords to push fringe viewpoints in front of larger audiences. It does seem that there are some media outlets that are better about covering these fringe groups and movements than they were prior to, say, 2016, but we still see them falling for things like hoaxes. Can you give an example of something like that, and how should the mainstream media cover these types of things?
Whitney Phillips 20:04
A few weeks ago now, I guess it was, a reporter reached out to me, because they were looking to fact check a hoax that was circulating on TikTok warning women not to go outside on a specific day, this was in April sometime, because men were planning on a, quote, “National Rape Day.” And in this outreach email, the reporter indicated, of course, that this wasn’t true, and that other outlets had already proven that it wasn’t true. And yet still, apparently, this article about this hoax needed to be written. And this is sort of classic trolling, in a sense, because there could be no greater gift to these kinds of hoaxers, these media manipulators, than putting the name of the hoax in a headline, taking it seriously enough to try to fact check it. It also sort of works as a kind of backdoor terror mechanism, because it is a reminder that sexual assault is something that happens every day. So it just was emphasizing the dangers that women face because of sexual violence. So it was perpetuating fear and violence, and also was sort of this gift to the people who are trying to manipulate journalists. And part of the problem here, and in cases like this, was frankly many reporters’ credulity, or at least their editors’ credulity: this sort of willingness to believe that there are some people who have fallen for this hoax, and because they’ve fallen for it because they don’t have the facts, we then have an obligation to try to report what’s true. And to be very clear, there is no National Rape Day; it’s not a thing. But that doesn’t take into account the fact that, again, this is classic trolling rhetoric. These are classic trolling strategies. It could be the case that some people think that this is real, but it is more likely that the people spreading this kind of hoax, they know it’s a hoax. They’re doing it because it’s a hoax. They’re doing it because it’s going to rile people up. And so this impulse to try to fact check it, all that ends up doing is it spreads the message and brings more people into the story, sort of raising the question: well, if everybody is talking about how seriously people are taking this, maybe there’s something to take seriously here.
Alice Marwick 22:18
What’s really important here is that these extremist groups understand the media really well. They know journalists source stories from Twitter, so they use Twitter to spread rumors and make topics trend. They understand that the media needs constant new content and loves novelty and sensationalism. They recognize that journalists want to present both sides of the story. And they understand that ranking and recommendation algorithms and search engine results can be gamed. But why? What’s the point of all this? Don’t forget, these groups are motivated by ideology. In many cases, they genuinely believe that their racist, antisemitic, misogynist, or hateful beliefs are important, and they want to spread them. But journalists play an important role themselves.
Whitney Phillips 23:04
These groups need to manipulate the media because their numbers are often quite small comparatively. That’s not to say that they can’t do a great deal of damage, or that they’re not potentially individually quite dangerous. But their numbers are often pretty small. Their resources are often pretty limited. And their messages, especially the explicitly hateful messages, are often unpalatable to many people within the mainstream. And mainstream journalists, mainstream media, can help with all three issues, including the fact that they can help launder extremist messages to seem more reasonable.
Alice Marwick 23:40
But this is old wine in a new bottle. These racist ideas are nothing new. The technology may have changed, but the underlying assumptions about people are consistent with American culture and history. Jessie Daniels agrees.
Jessie Daniels 23:53
One of the things that I think people, often journalists, are missing when we’re trying to understand white supremacy is that they’re, you know, like a magpie to something shiny. It’s like, oh, look, there’s a new technology and suddenly there are white supremacists there. And that is not new. If there’s one thing that I would try and get across, it’s that at every innovation of media communication technology, white supremacists are right there. Right there, looking for an angle, a way they can use this new innovation as an opportunity to advance their political ideology. So for example, in an article I wrote, I talk about some movie studios that white supremacists created in the 1920s. Most people know that Birth of a Nation was a white supremacist film. Many people know that it was used by the Klan at the time to recruit people, and they showed it at picnics and church groups and things like that. What many people don’t realize is that the white supremacists of the time, the KKK, created their own movie studios, because they saw such opportunity in this new form of communication and, you know, talking picture films, that they wanted to create their own. So they created, like, dozens of white supremacist films. And so I think we have not learned that lesson of history, that every single time there’s something new, white supremacists are going to be there. And, you know, I mean, one of my other big points in all this is to follow the money. I mean, I think, you know, the development of cryptocurrency and Bitcoin and all that is another place that we have to pay attention to the way that white supremacists are exploiting that. You know, there’s a Twitter account, something like Neo-Nazi Bitcoin, that tracks payments to neo-Nazis through Bitcoin. I just think we have to pay closer attention to that. I mean, my big thing is you have to pay attention to technology and white supremacy at the same time. And most people are not doing that.
Alice Marwick 25:51
In fact, the focus on technology may obscure the fundamental underlying dynamics.
Whitney Phillips 25:56
Generally speaking, when folks are talking about problems online, they’re looking at the sunlight zone of the ocean, where everything is really visible. That’s where all the action is, that’s where we can see people posting stuff that looks like radicalization, and that’s where the conspiracy theories live. That’s where everything is that we can measure and quantify and freak out about. Because the focus is there, on this stuff that looks like radicalization, or lines up with what an idea of radicalization is, the solutions to the problem tend to stay within that sunlight zone layer. So then we’re talking about demonetization, we’re talking about moderation, we’re talking about, you know, the things that are on the platforms, because that’s presumed to be what will fix the problem, because the problem is all of the bad radicalized-looking stuff online. And the problem with that is that, you know, if you look at the layers of the ocean, the further down you go, the murkier things become. And in terms of where people get their ideas and where bigotries manifest, once you dive beneath the zone of awareness, right, people might have stories about bad, different others that they’re sort of aware of, and then it becomes a little bit murkier as they go down. And the further down they go, the more obscured a person’s reasons sometimes are to themselves. And all of that stuff that’s murky or totally obscured is in the midnight zone. That’s actually where all of the answers to our problems are. That’s where beliefs are formed. That’s where narratives are constructed. That’s where deep memetic frames live. Those are all of the actual drivers of the action that we’re seeing. So to the extent that there’s a thing called radicalization, it lives down here, and it’s cultivated over time. It’s not something that you get on Facebook; it’s something that you get from a lifetime of education from various arenas in your life.
And this is the place where we’ve got to direct our solutions. And we’re not going to get there, and we’re not going to be able to have the sort of deeper conversations about, like, where does white supremacy come from? Guess what? It’s not Facebook. Facebook is where we see it. And so then Facebook seems like an especially problematic space, and it is in all kinds of ways, but Facebook doesn’t make white supremacy. I mean, it would be so much easier to deal with this if it were, because then you just fix the shit on Facebook, but we are not there. And so we’ve got to think of ways to sort of get down beneath all of that. But we’re not getting there right now. That’s not where the conversations are taking place. So it’s like the stuff that looks like radicalization, you know, is up here, but all the stuff that causes those actions is down here. And this is where I wish that we were spending more of our time.
Alice Marwick 28:38
Hi, everybody, thanks for agreeing to do this. It’s great to have the opportunity to talk to all of y’all. So this episode is about the relationship between extremist communities, media manipulation, and disinformation. I asked you all here because I think you have important things to say on this topic. One of the things that I’m really interested in is the use of humor and irony in extremist communities. So I’m going to start with Becca, who I know has looked at this: what role do you think irony plays in these sort of far-right or extremist communities?
Becca Lewis 29:15
Yeah, irony is a really important factor in radicalization, and far-right communities know that and explicitly talk about it. And the reason it’s so effective is because there’s kind of a plausible deniability built in. You can reach more people when you’re making jokes that have kind of a racist undertone or a sexist undertone than if you’re just spreading serious propaganda. People may be drawn to that humor because they see it as subversive, transgressive, you know, wanting to shock people. For whatever reason, they can kind of embrace it a little bit more easily and still have that plausible deniability of saying, well, I’m not actually saying these things seriously.
Alice Marwick 30:03
So this episode mostly looks at white supremacist groups, but there are lots of other types of extremist groups as well. What’s the relationship between, like, anti-feminist groups and something like white supremacy?
Katie Furl 30:16
We see similar things in male supremacist groups, like incels, or involuntary celibates, or other adherents to the red pill ideology. There’s work being done to cloak misogyny and anti-feminism in their rhetoric, in a sort of pseudo-intellectual package, using terms and language that originally appeared in academic fields to make their ideologies and worldviews seem more credible or palatable to a broader, potentially recruitable, or at least convincible, audience online. I think a good example of this, and one I know we’ve discussed in the past, would be the co-opting by anti-feminist groups of the term hypergamy. This is something that originated in the social sciences in the late 19th century, and originally it was used to refer to a social phenomenon where women would marry up socioeconomically. Again, this was in a historical context where other opportunities or pathways to upward social mobility were essentially closed off to women. However, when it’s being used by male supremacists and anti-feminists online, the term hypergamy is instead transformed from a necessary strategy used by certain women to achieve financial security in a very specific historical context into something voluntary, a malicious choice characteristic of all women, regardless of financial circumstances: leaving their current partner, who’s always assumed to be a man, for the next best thing at the drop of a hat.
Alice Marwick 31:55
So this points to sort of this other strategy used by some of these extremist groups, which is, I don’t want to say whitewashing, but they’re kind of trying to science-wash their old-school hateful ideologies by, you know, dressing them up in a white lab coat and using fancy words to try to make them seem more legitimate.
Becca Lewis 32:14
Race realism, as they call it, is incredibly popular, and a really powerful radicalization vector. And basically, what it relies on is centuries-old scientific racism. So this has been a historical trend, starting in the 18th century and into the 19th century, where scientists who were driven by racism and a desire to claim that white people were intellectually and morally superior to other races would kind of use the trappings, the terminology, the tools of science to try to make these claims. And so you had things at that time like phrenology, measuring people’s skull sizes to try to show that, you know, white people’s skulls were bigger than black people’s skulls. You can look at the way that eugenics programs in the United States, for example, drew on this scientific racism research. And in fact, Nazi Germany was then inspired by the United States eugenics program. So you can see kind of these links. And yet despite all of that, a lot of this research has continued and persisted in kind of both alternative academic spaces and, at times, within mainstream academic spaces. And a lot of what YouTubers do is draw on that research that has continued to make claims about biological differences between the races. And the powerful thing about YouTube is that people don’t even have to read those books. They don’t have to go back and, you know, go through the entire Bell Curve to get this propaganda; they can listen to someone give a summary of The Bell Curve or, you know, dilute its points and put it into an entertaining video.
Katie Furl 34:06
A lot of the internal logics that are used to justify what might be, on their face, seemingly disparate hatreds among misogynists, anti-feminists, and white supremacists are actually very porous and transferable. So one example that I’m particularly familiar with, since it’s especially common among misogynist incels, or involuntary celibates, is an unwavering preoccupation with the tiniest details of human physical appearance. There’s a lot of quantifying, categorizing, and ranking of just minute aspects of how you look: the angles of people’s jaws, the tilt of their eyes and the distance between them, the distance between the bottom of someone’s nose and the top of their lips. Those are all used to sort people along hierarchies that are, at least on the surface, about romantic and sexual prospects, but they’re also taken either explicitly or implicitly as measures of people’s inherent value or worth. And those hierarchies among male supremacists are not only gendered, they’re often racialized as well. And in that way, male supremacist communities online can serve as a gateway to white supremacy and other prejudices that one might participate in online. Because if you’re willing to accept that the inherent worth of a person is something that’s inextricably linked to the shape of their jaw or their exact height, and you’re able to use that to justify your misogyny and anti-feminism, then potentially you can similarly be persuaded to apply that same logic to race and ethnicity, and use it to justify racism, ethnocentrism, and ultimately white supremacy.
Alice Marwick 35:47
So this brings up the question of, you know, how some of this content that historically circulated in pretty fringe spaces is now more and more accessible, or even actively promoted, and it’s much easier for these ideas to reach a larger audience. How does this happen?
Francesca Tripodi 36:10
I think it’s worth looking at the historical legacy of these tropes, these narrative tropes. So for example, the idea that civil rights activists are communists, you know, this is a strategy that’s been going on since George Wallace and the quote-unquote Red Scare of the civil rights era. And so I think what’s interesting to me is less how it goes from fringe to mainstream, but rather that this was part of the mainstream, and it became increasingly unpopular to say these things in the mainstream. And so there has been a concerted effort by conservative politicians and media pundits to rebrand what was once explicit racism into very mainstream dog whistles, right? Concepts like law and order, or concepts like privatization, or individualism, were actually historical legacies of maintaining segregation in the 1950s and ’60s. And so something that I think is really important is looking at that through line of disinformation campaigns, and that it’s tempting to think of this as only extremist, and yes, we classify white supremacy as extremism, but in so many ways it’s just been interlaced throughout.
Katie Furl 37:26
I think that’s a really good point. I also think it’s important to remember that these prejudices underlie a lot of mainstream society; they have centuries-long histories, and they’re still present.
Becca Lewis 37:41
It’s so much easier to kind of treat these things as fringe moving into the mainstream, and it’s actually incredibly important to grapple with the fact that, you know, they may ebb and flow a bit, but for the most part these underlying structures have been there. But just to delve into it, I think there are some things the internet has provided that are kind of effective tools for people to continue to spread and amplify these ideas. And one thing the internet has provided is an easy way to link and network these genuinely fringe communities, kind of open neo-Nazis and white supremacists, to more mainstream discourse. And so there’s a lot of ways we can see that happening. But one example is YouTube or other mainstream social media platforms acting as amplifiers. So you have conspiracy theories that start in very fringe online spaces. But then a YouTuber will cover the same conspiracy theory and it gets millions of views. And then you start to see the conspiracy theory make its way into Fox and other more mainstream conservative spaces. And that, I think, is a specific mechanism that we’ve seen emerge more recently that helps repeat these older patterns from the past.
Francesca Tripodi 38:54
I think in terms of reach and scope, the internet is huge. But I do think this idea of, like, cross-promotion, and having certain celebrities serve as guests on these different shows and channels, right, is stuff that they were doing with radio broadcasting as well.
Becca Lewis 39:13
Or even seeing Richard Spencer, whose genius was being able to present white supremacy as, you know, this dapper, sophisticated thing. And that’s exactly what David Duke was doing in the 1970s; it was just that he was using daytime talk shows and Richard Spencer was using, kind of, online digital publications, right? So I agree completely: it’s the same goals, and even similar strategies.
Francesca Tripodi 39:37
One thing that I’ve been thinking a lot about is the way that people manipulate how information is ordered, and the relationship that has to the way you see the world. And so I think a lot about how your returns are shaped by your starting points, right? What are the keywords that you’re beginning with when you start seeking information? In this study done by Eszter Hargittai back in 2007, she was looking at information seekers trying to learn more about the morning-after pill. And when they were searching for “morning after pill,” one of their top returns was this website, morningafterpill.org. And because it was a .org, they thought it was more credible. And this website still exists, so if you go to it, you can still see it was actually run by an anti-abortion group whose goal is to get you to not take Plan B, or the morning-after pill. And so what’s fascinating, right, is that if you look at the historical legacy of conservative elites and conservative activists, you see a really deep understanding of how information flows. In the same way, they were creating a bunch of small-scale publications and referencing them on radio networks, and then buying up book publishers and printing out mass copies of books and distributing them at different conferences so that they gamed The New York Times bestseller list.
Becca Lewis 41:06
The mainstream media plays an incredibly big role in a range of different ways, and the far right has been incredibly savvy in how they interact with the mainstream media. Far-right figures have known for a long time that, even if the vast majority of people who read that coverage see them in a negative light, there may be a small fraction of people who see it in a positive light. And at the very least, it gives them a chance to continue to disseminate their message, which helps to frame broader public discussion of various issues. So again, you have a similar dynamic here, once again playing out with the mainstream media, where these well-respected media organizations assume that by providing coverage, it’s necessarily going to kind of help counter these hateful movements, when instead it often ends up lending them a helping hand. Disinformation doesn’t exist in a vacuum; it is used as a tool in service of these broader ideological goals. And I think that so often that gets kind of forgotten in these broader conversations. There’s an assumption that if we simply present people with the right facts, then they will kind of see the light and understand the error of their ways. And that’s not how it works at all. People are incredibly invested in the narratives that disinformation gives them, because those narratives feed into these larger issues that they care about.
Kathryn Peters 42:40
Thanks to Alice for walking us through how these media manipulation campaigns work and why. Thank you also to our guests, Whitney Phillips and Jessie Daniels, for sharing their time and talents with us, and to our CITAP guests Francesca Tripodi, Becca Lewis, and Katie Furl for their insights and additions. Next week, Francesca Tripodi will be hosting Does Not Compute to talk about the role of search engines and doing your own research in how disinformation spreads. Spoiler alert: it’s surprising.
Does Not Compute is a team production, including our researcher hosts and guests as well as executive producer Jacob Kramer-Duffield, senior producer and editor Amarachi Anakaraonye, and CITAP Project Coordinator and Production Assistant Joanna Burke. Our music is by Ketsa.
You can find us on Apple podcasts, Spotify, or your favorite podcast listening platform at Does Not Compute. On the web, visit us at citap.unc.edu or connect with us on Twitter at @unc_citap.
Does Not Compute is supported by a grant from the John S. and James L. Knight Foundation.
Made by: Amarachi Anakaraonye (Senior Producer), Joanna Burke (CITAP project coordinator), Jacob Kramer-Duffield (Executive Producer), and Kathryn Peters (etc)
Music: Ketsa, Parallel Worlds
Art: Logo by Contia’ Janetta Prince