Episode 1: Introducing Does Not Compute

On our first season of Does Not Compute, we’ll be talking about identity and disinformation. How do our communities shape what we search for, share about, and even believe? And how do malicious actors manipulate our identities to promote their ideologies? What role do big tech platforms play in spreading disinformation, and how can they help address the problem?

We won’t quite answer all those questions in just one episode, but we’ll begin digging into these topics and introducing our hosts.

 

Listen

Apple Podcasts

Spotify

Full transcript

Deen Freelon 00:00

What I’m really interested in is the overlap between conspiracy theories, disinformation, and racism. It strikes me that a lot of the conversations about conspiracy theories seem to have overlooked that, at least formally, a lot of the aspects of conspiracy theories have a lot to do with the structure of how racism works psychologically.

Kathryn Peters 00:23

That’s Deen Freelon, a professor at the University of North Carolina’s Hussman School of Journalism and Media. Deen’s work has explored how the Black Lives Matter hashtag and movement took shape on Twitter, how the Russian Internet Research Agency impersonated Black Americans on Twitter as a disinformation tactic, and how the left and right organized differently online. In short, he’s an expert on identity and digital politics, and he’s exploring whether there might be even more connections to discover.

Deen 00:48

Do you think that conspiracy theories and prejudice, in general, share similar underlying psychological risk factors and defense mechanisms? These would be things like a possibly high need for control and prediction, a belief that one’s own group is oppressed, what’s called the need for cognition or critical thinking, believing anecdotes over data, ignoring conflicting evidence, that sort of thing. Are there any underlying psychological mechanisms that seem to be shared across both prejudice and conspiracy ideation?

Kathryn 01:20

Welcome to Does Not Compute, a podcast about technology, people, and power hosted by the Center for Information Technology and Public Life, or CITAP, at the University of North Carolina. At CITAP, we study technology as it’s tangled up in our society: our economics, race, politics, and culture.

Deen 01:37

It’s especially important to understand that the internet specifically, and technology generally, does not act unilaterally. In other words, we all have to– My colleagues will recognize this as the obligatory dismissal of technological determinism. You’ve got to stamp that out before you go any further.

Kathryn 01:57

CITAP’s researchers study how social differences, including race and ethnicity, gender, class, and sexual identity, shape the dynamics of technology and information systems in unequal ways. On our first season of Does Not Compute, we’ll be talking about identity and disinformation: how who we are, and what communities we claim, shape what we search for, share about, and even believe. A lot of the public conversation we’re having about disinformation is incomplete or even wrong. Many stories start with the 2016 US presidential election and define this as a new problem.

Most of the focus is on social media platforms, and plenty of people assume that they’re the cause of the problem. Between fact-checking and post-labeling, we get caught up in what’s true and what’s false in ways that make us lose sight of why these stories get created and how they really spread. Maybe you saw The Social Dilemma.

 

Roger McNamee, The Social Dilemma 02:46

If you want to control the population of your country, there has never been a tool as effective as Facebook.

Tristan Harris, The Social Dilemma 02:53

We have less control over who we are and what we really believe.

Kathryn 02:58

The Social Dilemma suggests that social media platforms have caused polarization, have caused higher rates of anxiety, and that their engagement algorithms are sucking us away from our lives. They even play Nina Simone singing “I Put a Spell on You” to suggest how powerful the tools they’ve built are. But what if we told you that the disinformation problem we’re addressing now isn’t caused by social media? It’s packaged in their shiny new push notifications, but in fact it’s rooted in some of the deepest and most lasting divisions in our history.

The Social Dilemma gets one big thing right: we do need new regulation of big technology platforms if we’re going to protect our democracy and our public discourse. But like many other stories about technology, the movie overlooks the human factors that we also need to understand if we’re going to design meaningful solutions. We need to understand who’s telling these false stories, why they’re choosing to do so, and how they spread them by manipulating the media and our emotions. We need to understand who shares this information and disinformation, and why they do so as part of their online activity.

We want to understand the communities that are targeted with disinformation campaigns, and how community organizers and other leaders are able to put out accurate information and push back on lies that come into their communities. We need to understand what the tech platforms do that helps mis- and disinformation spread, how they contribute to the problem, and therefore what they can contribute to helping solve it. I’m Kathryn Peters, CITAP’s executive director and your guide through this season of Does Not Compute. I came to the University of North Carolina in July of 2020 after spending 10 years doing online voter engagement.

I built tools that helped people register to vote and sent them reminders about when elections were happening. I created data sets that let them know where polling places were and what their hours might be, and in 2016 I had a front-row seat to seeing how disinformation can sow confusion and create doubt in people’s minds. I found myself asking, “What would happen if we ran a free and fair election and no one believed in the results?” That question led me to the researchers at CITAP, who I’m pleased to share are deeply expert in these questions and have been working on them since long before 2016.

Over the next few weeks, each of CITAP’s researchers will host an episode of Does Not Compute, asking and answering these questions with one another and with outside experts. Next week, you’ll hear how Deen Freelon answered his questions about racial bias and conspiracy beliefs. Episode three will be hosted by Alice Marwick, a professor in UNC’s College of Arts & Sciences who wrote Media Manipulation and Disinformation Online. She has studied the manosphere, QAnon, and how and why fringe communities operate online to promote their beliefs.

Alice Marwick 06:01

There’s obviously a whole body of people studying the far-right, men’s rights communities, neo-fascists, et cetera. Then there’s a whole bunch of people studying disinformation, and there’s obviously overlap between those two, but then I think sometimes when we talk about disinformation, it tends to get strangely neutralized and the really big impact that extremist communities have on disinfo I think gets lost.

Alice 06:28

This ability for far-flung people to connect on issues that may be niche in their face-to-face community lives. If you’re living in a place where it’s not acceptable for you to express outright racism – I’m not really sure what places those are – but if you are in that context, you can then feel very comfortable with that.

Kathryn 06:47

If you haven’t considered the dark side of participatory culture yet, Alice knows just how deep the rabbit hole can go. Identity doesn’t just shape what we believe and who we connect with, though. Its influence is deep enough to change what we search for and how we interpret the results. Episode four is hosted by Francesca Tripodi, a professor in the School of Information and Library Science who has studied how conservative communities’ use of scriptural inference, or media literacy skills derived from biblical study, shapes their online search behavior in ways that conservative news sources exploit.

Francesca Tripodi 07:15

Conservative media literacy practices, in particular this concept I call scriptural inference, which is the way in which conservatives lift texts from what they describe as sacred. It started with evangelical teachings of the Bible, but then it’s been transferred to things like the Constitution or even just Trump’s discussions specifically. When the document itself becomes sacred, it’s easy to decontextualize its meaning.

Part of the baked-in conspiratorial logic is saying, “Don’t just believe me, go out there and do the research yourself,” and when they do the research themselves, which in this information environment means going to Google, you seek out these keywords and phrases, and then before you know it, the search results are reconfirming what you came there to seek out.

We saw this actually– Safiya Noble was one of the first people to speak about this in her really amazing Algorithms of Oppression, where she talks about how Dylann Roof going to Google and searching out “black on white crime” returned the Council of Conservative Citizens, which helped him write his manifesto and led to very real-world violence. He was out there doing his own research, but by understanding and knowing the objectives of their audience, these platforms become very exploitable. We’re seeing this use of media literacy to weaponize information in ways that I think are really powerful.

Kathryn 08:47

Disinformation campaigns cause real harm. Rachel Kuo, a postdoctoral researcher at UNC, studies race, social movements, and digital technology. She’s also a co-founder of the Asian American Feminist Collective and has seen firsthand how disinformation campaigns manipulate community identities, and how organizers push back. In episode five, she’ll share those stories.

Rachel Kuo 09:08

It feels like we’re in this never-ending crisis, and really thinking about– Especially with grassroots organizing, we’re constantly put in a rapid-response mode that makes it really difficult for longer-term work or a long view. We’re also thinking about these platforms that are designed both for speed and distraction. There’s this presentist view that often happens, which accounts for this historical amnesia: thinking about, “How did this happen?” but there’s a longer history that undergirds that.

I think when we think about solutions, they can’t just be technological ones, and we can’t rely on the technologies themselves to solve our social and political problems, especially when these corporate platforms lack a deeper relationship-building and accountability to marginalized communities. The ways that corporate platforms are embroiled in this diversity industrial complex also really undermine justice and the ways we think about that playing out. I think one of the things that really struck me is the role of history that comes up again and again. As you mentioned, how people are navigating different sources and also anchoring their political identity here in the US.

Kathryn 10:19

Shannon McGregor, a professor in the School of Journalism and Media who studies the role that social media play in our political processes, hosts episode six. She’ll be thinking about how the deep hooks that disinformation has in our identities and communities inform how we respond.

Shannon McGregor 10:34

It’s like the disinformation that sticks, in some ways, sticks because it plays on these tropes: racist tropes, sexist tropes, whatever, all sorts of biases that people have, either explicitly or implicitly, that obviously vary across people’s identities themselves, like they’re a receiver of some of that, and so it makes it more powerful. I think the point that we’ve been trying to make is that all the truth in the world can’t solve that kind of disinformation problem. This is not a problem that can be solved by bringing better facts to bear.

When your embrace of disinformation is undergirded by white supremacist ideas and by hanging on to power and fighting against a multiracial coalition, you know the facts, and that’s what’s motivating you to engage in this sort of behavior. I think what we’ve been trying to argue, and what I think happened here, is not a culmination of people being fed the wrong information and taking action on that; it’s that that disinformation fueled ideas and actions that they were already primed to take. Disinformation plays a role, but it is not on its own an organic and motivating feature.

Kathryn 11:45

Daniel Kreiss studies platform governance and how tech companies’ policies shape our political communication as a professor at the Hussman School of Journalism and Media. In episode seven, he’ll dig into what these tech platforms can and can’t do to support healthy democracy.

Daniel Kreiss 12:01

We don’t think Facebook is causing this or Google is causing that, but Facebook and Google are interacting with other political forces, in political systems, in political contexts, which shape how they get used, but then also the effects that they might have over what incentives political candidates have to engage in certain forms of speech and not others, et cetera.

Oftentimes, the internet and platforms and social media can provide a convenient excuse and enable many to misdiagnose the fundamental problems. It can provide a convenient narrative of what is wrong in a way that enables many to sidestep what is actually wrong, as well as the history of the US. This includes the fact that the US is a young democracy. It has only been multiracial and multiethnic since the 1960s, and yet we often talk as if the US has been a democracy from time immemorial. Information disorder is not some deraced, apolitical, generalized phenomenon.

It has underlying political and racial roots that we need to focus on. You see lots of concerns over things like social media and polarization, as if polarization only happens through the affordances of platforms. Again, as if it does not have underlying political, group power, and status roots that play out online.

Kathryn 13:18

These are the hosts of Does Not Compute. Along the way, they’ll interview other scholars, community leaders, and experts from a variety of fields. We’ll also pull in the talents of other CITAP affiliates, including Professor Tressie McMillan Cottom, an expert in how we recreate bias across systems, including online.

Tressie McMillan Cottom 13:36

I think it does not matter whether or not people believe what they’re doing is democratic gatekeeping. It is that you are doing democratic gatekeeping, and you can’t do that without at least representative democratic governance. That “democratic” is not about the form of participation, but about the form of control. I don’t think that platforms can be counseled out of understanding their role in the reproduction of some of the uglier impulses of public life unless they are made to do so through regulation, governance, and, probably fundamentally, structuring the economic incentives for them not to do so.

Kathryn 14:21

We’re telling these stories because solving for polarization, or disinformation, or whatever magical spell we think this technology might have cast over us won’t fix things if we don’t also address the social, economic, and political incentives that drive the problem. We’ll just keep innovating new forms of the same old inequalities. After the 2016 election, I worried that disinformation campaigns could undermine the work I’d done to increase voter participation.

CITAP’s research shows how much bigger the problem is than just lies shared online. It also suggests how we might build healthy public lives and a robust democracy in the internet era. We’re excited to share all of this with you on Does Not Compute. You can find us on Apple Podcasts, Spotify, or your favorite listening platform as Does Not Compute. On the web, visit us at citap.unc.edu (that’s C-I-T-A-P), or connect with us on Twitter at @unc_citap.

Does Not Compute is the work of an amazing team, including our faculty hosts; our executive producer, Jacob Kramer-Duffield; senior producer and editor Amarachi Anakaraonye; and CITAP project coordinator and production assistant Joanna Burke. Music is by Ketsa. Thank you so much for joining me for this inaugural episode of Does Not Compute. We look forward to having you join us again next week, when Dr. Deen Freelon discusses conspiracy theory belief and bias.

 

Credits

Does Not Compute is supported by a grant from the John S. and James L. Knight Foundation

Featuring: Tressie McMillan Cottom, Deen Freelon, Daniel Kreiss, Rachel Kuo, Alice Marwick, Shannon McGregor, Kathryn Peters, and Francesca Tripodi

Made by: Amarachi Anakaraonye (Senior Producer), Joanna Burke (CITAP Project Coordinator), Jacob Kramer-Duffield (Executive Producer), and Kathryn Peters (etc)

Music: Ketsa, Parallel Worlds

Art: Concept by Contia’ Janetta Prince