
Tech platforms didn’t create our political divides. They aren’t blameless, either. In this episode, Daniel Kreiss sits down with Katie Harbath & Tressie McMillan Cottom to understand the role of “efficiency machines” in social contexts and imagine the guardrails we need for tech platforms to become stewards of a healthy democracy—because public life is far easier to destroy than rebuild.

Listen

About our experts

Host: Daniel Kreiss

danielkreiss.com | @kreissdaniel

Daniel Kreiss is an Associate Professor in the UNC Hussman School of Journalism and Media and Adjunct Associate Professor in the Department of Communication at the University of North Carolina at Chapel Hill. Kreiss’s research explores the impact of technological change on the public sphere and political practice. In Taking Our Country Back: The Crafting of Networked Politics from Howard Dean to Barack Obama (Oxford University Press, 2012), Kreiss presents the history of new media and Democratic Party political campaigning over the last decade. Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy (Oxford University Press, 2016) charts the emergence of a data-driven, personalized, and socially-embedded form of campaigning and explains differences in technological adoption between the two U.S. political parties. Kreiss is an affiliated fellow of the Information Society Project at Yale Law School and received a PhD in Communication from Stanford University.

Guest: Katie Harbath

@katieharbath

Katie Harbath is a global leader at the intersection of elections, democracy, and technology. As the chief executive of Anchor Change, she helps clients think through their civic engagement online. She is also a nonresident fellow at the Atlantic Council and a fellow at the Bipartisan Policy Center. Previously, Katie spent 10 years at Facebook. As a director of public policy, she built and led global teams that managed elections and helped government and political figures use the social network to connect with their constituents. Before Facebook, Katie held senior digital roles at the Republican National Committee, the National Republican Senatorial Committee, and the DCI Group, as well as on multiple campaigns for office. She is a board member at the National Conference on Citizenship, Democracy Works, and the Center for Journalism Ethics at the University of Wisconsin-Madison. Follow Katie on Twitter at @KatieHarbath and connect with her on LinkedIn.

CITAP expert: Tressie McMillan Cottom

tressiemc.com | @tressiemcphd

Tressie McMillan Cottom is an award-winning author, professor, and sociologist, whose work has earned national and international recognition for the urgency and depth of its incisive critical analysis of technology, higher education, class, race, and gender. She is an Associate Professor at the UNC School of Information and Library Science (SILS), faculty affiliate at Harvard University’s Berkman Klein Center for Internet and Society, and MacArthur Fellow. McMillan Cottom earned her doctorate from Emory University’s Laney Graduate School in sociology in 2015. Her dissertation research formed the foundation for her first book Lower Ed: The Troubling Rise of For-Profit Colleges in the New Economy (The New Press 2016). With hundreds of thousands of readers amassed over years of writing and publishing, McMillan Cottom’s columns have appeared in The Atlantic, The New York Times, The Washington Post, and Dissent Magazine. Her most recent book, THICK: And Other Essays (The New Press 2019), is a critically acclaimed Amazon best-seller that situates Black women’s intellectual tradition at its center. THICK won the Brooklyn Public Library’s 2019 Literary Prize and was shortlisted for the 2019 National Book Award in nonfiction.

Learn More

In this episode, we referenced:

Transcript

Katie Harbath 00:03

Even just a few weeks ago, you had a congressional hearing about algorithmic transparency on like a Tuesday, and on Thursday was earnings calls. If you’re in the tech companies, you’re getting conflicting messaging about where you should be focusing your time.

Tressie McMillan Cottom 00:24

I think the question for us, both as citizens and for you and me, as people who think about these things on behalf of the citizenry, is to try to delineate what they shouldn’t be able to do. In my mind the very big answer to that question is they shouldn’t be able to ruin the framework that made them possible.

Kathryn Peters 00:50

Welcome to Does Not Compute, a podcast about technology, people, and power, brought to you by the Center for Information, Technology, and Public Life. Our host this week is Dr. Daniel Kreiss, a professor in the School of Journalism and Media who studies how political campaigns and parties adopt new technologies and how social media platforms’ policies shape political speech.

He’s thinking about what technology platforms owe democracy and how we hold them accountable. To answer those questions he’ll be talking to former Facebook Public Policy Director, Katie Harbath, and with Tressie McMillan Cottom, a CITAP senior faculty researcher whose current work explores the intersection of platform capitalism and racial capitalism.

Daniel Kreiss 1:36

What should platforms be responsible for when it comes to democracy? I think all of our guests answer that question in various ways, but one thing they all agree on is it’s really complicated. I think the big misconception that exists in the public is that social media is destroying democracy. That platforms are responsible for things like polarization, that they’re responsible for things like no one can agree on what’s true and what’s not. No, Facebook and Twitter are not destroying democracy, but neither are they off the hook. They play really complex roles in very complex societies with many different moving parts.

I think they’re the terrain within which political groups vie for power. I think we need a much more nuanced conversation about what platforms are responsible for, what they can effectively take action against, and what their roles and responsibilities should be. Even though platforms might not be responsible for the democratic crisis we are now in, they are absolutely responsible for creating certain incentives for political actors to do and say things that might weaken democracy in the long-term.

The clearest example of massive platform failure to protect democracy is the delayed action around Donald Trump. All the major platforms let Trump violate their policies for a year running into the election. None took action until January 6th, which was absolutely the right call. In the process, I think what you see is that the platforms are responsible for weak policies, flexibly interpreted, and a reluctance to enforce them against powerful political actors. Platforms themselves have really fallen down on the job when it comes to clearly defining their policies and clearly enforcing them against powerful people who would seek to undermine democracy.

What we discuss in these various conversations are the responsibilities that companies like Facebook have to protect democracy and to defend it against those who seek to undermine it. The big question here is: what should platforms do when they’re responsible for navigating this very complex moment and very fractured moment in American history?

Okay. Hey Katie.

Katie 4:17

How are you?

Daniel 4:19

I’m good. It’s been a little crazy around here.

Katie 4:22

You’ve had a bit going on. I’ve been following.

Daniel 4:26

Yes. I don’t know what it looks like to the outside world, but it certainly seems like it’s attracted some attention.

Katie 4:33

Yes. It also reminds me too, I just feel like workplaces are going through a very big transformation right now. I think Edelman’s trust survey said that employees are 50% more likely to feel like they need to speak up and be activists within their organizations.

Daniel 4:51

I think politics suffuses our daily lives much more than it did 10 or 15 years ago, whether it’s work, whether it’s education, whether it’s platforms and the decisions they’re making. That’s not to say that the world was any less political 10 or 15 years ago. It’s just that I think there are a lot more active political debates, a lot more unsettledness, that leads us to this moment where things are constantly contested in some very fundamental and deeply political ways.

One of the big framing questions that we had for this podcast was, what are platforms responsible for when it comes to politics and broadly, what are they actively shaping versus how are platforms a forum for other sorts of political debates?

Katie 5:52

The million dollar question, right? I look at this in a couple of different buckets. From a tech company perspective, there’s the platform itself, the technology that they can provide and the choices they are making as part of that technology. Then there’s the question of where should the companies be making decisions versus where should people be making decisions and have the freedom of choice?

When you sign up for a platform right now, you initially make the choices: these are the people I want to be friends with, these are the pages I want to follow. But then, because that’s too much content for you alone to sort through, the companies are making the decisions about what’s in your news feed, what you’re recommended, and what ads you’re shown.

A lot of these questions, I think, are around who gets to decide where those boundaries are and what those interventions are. A lot of it has been the tech companies making those decisions on their own, which then brings up the concerns around how much power those companies have over the public discourse. Also, what due process, if any, is there? If somebody had a piece of content removed, or they were themselves deplatformed, what due process do they have to try to get back on these platforms, or even to understand what the rules are and what rule they broke?

Tech companies obviously have a role in this, but I don’t think it’s fair to say to them, “We don’t like how much power you have. We want you to do something, but we don’t like that you have to do it alone. We as the government or others aren’t going to do anything to give you any guardrails around it.” We’ve got to find that right balance of what are the tech companies responsible for versus where is it regulation versus where is it just new societal norms?

Daniel 7:44

It strikes me from what you’re saying that a fair amount of the issue with guardrails is that we as a society haven’t really worked out where those guardrails should and should not be. Part of that relates to the underlying complexity. Tech companies are constantly being asked to step in on one hand, but also to exercise forbearance on the other and let the free play of speech work itself out. It strikes me that exactly what you’re describing relates to a set of fundamental debates that, as a society, I don’t think we’ve completely resolved as of yet. Tech companies seem to be caught in the middle, muddling through.

Katie 8:29

I definitely agree. I’m glad these debates are happening. These are not easy questions. Mis- and disinformation are not necessarily new things. However, the internet as a tool and how fast information can spread certainly are new, so it does require some new rules and new thinking. The only way you’re going to develop societal norms is by actually having those conversations out in the open and seeing how some of these things play out when tech companies do make those decisions or governments pass laws.

We’ve got different models in Europe that they’re exploring. You’ve got the model in India that just passed, which is somewhat troubling, and from the Indian government’s perspective it’s also super politicized there in the same way it is in the United States. I think there’s also a role for the international community, which I would like to see step up a bit more in having some of these conversations around what these norms should be about how we handle speech online, particularly speech from influential people and politicians. But nobody really wants to touch it because it’s right at the core of free expression and the protection of political speech that people are so used to. We’ve got to grapple with that and figure out other ways to hold people accountable for what they say.

Daniel 10:00

I’m curious what you think: what ethical responsibility do platforms have? What is working when it comes to platform companies broadly in terms of making these speech decisions? Are there any particular areas where you think they’re doing well currently? And what ethical obligations do you think they have more broadly? What are they responsible for?

Katie 10:28

I think tech in general needs to be doing a bit more soul searching around the tension between growth, what they’re measured on to be a “successful” company, versus, to your point, the ethical questions and safety: understanding and thinking through early on what impacts their technologies might have and what changes they may need to make at the front end.

If you look even just a few weeks ago, you had a congressional hearing about algorithmic transparency on like a Tuesday and on Thursday was earnings calls. If you’re in the tech companies you’re getting conflicting messaging about where you should be focusing your time, because your shareholders and reporters and others are looking at: are you successful, has your growth slowed, where’s your revenue at, stuff like that. Then you do have elected officials and civil society and others being like, “Whoa, whoa, we need to slow down, we need to change the business model.” They need to have an ethical approach of thinking about what their impact on society is, and it shouldn’t just be about those growth numbers. Silicon Valley hasn’t had that conversation enough yet.

Daniel 11:52

I think one of the things you’re pointing to is that a company like Facebook has multiple stakeholders. On the one hand there are the shareholders of Facebook, who expect a certain degree of return on their investments; that’s one way that not only venture capitalists but business journalists evaluate Facebook’s performance.

On another, there are the vast majority of Facebook users who don’t use the platform for political purposes. They use it to keep up with their friends and family or to consume entertainment media and the like, and they have a different set of expectations about the things they’re going to want to see or not. Or they’re parents concerned about what their kids are viewing on YouTube or might see inadvertently when they go to look at skateboarding videos or something like that. Then there are political users, who prioritize using the expressive capabilities of the platform to mobilize people to their cause and to make persuasive arguments, including deeply controversial ones.

Balancing all those things simultaneously strikes me as a thankless task, at least in the United States, where the government is not really going to set a lot of guardrails. That raises the question of political ads as maybe a case study. To my eyes, political ads encapsulate all of these debates. How much should Facebook or any other platform company be screening political ads for things like falsehoods? How much should they enable things like targeting, which also speaks to how the design and actual technologies of platforms make these problems harder in some ways? What should they facilitate or not? Does Facebook have an obligation to protect democratic discourse, to protect against things like incendiary appeals or polarization?

Katie 13:47

To take a quick step back for folks and put it in context: after 2016 there were a lot of questions around what type of advertising President Trump and his campaign did. They credited a lot of their win to Facebook, so it naturally brought up a lot of questions, and it brought up the debate about how ads on the internet are not as transparent as ads in other mediums. I think you’re right, it’s a really good case study of self-regulation in some ways.

Now, no doubt that some of that came from the controversies coming out of 2016: the ads from the Russian Internet Research Agency and, for Facebook, Cambridge Analytica, all rolled into one to help pressure Facebook, Twitter, Google, and some of the other tech companies to put in these transparency measures. For companies to find the unknown actors, they had to flip campaign finance on its head, because past campaign finance started with the actor: you are a candidate, you are a PAC, you are this type of organization, thus your ads must have this type of disclaimer.

In order to find the unknown actors who might be talking about these topics, you have to start with the advertisement: the advertisement talks about these types of issues, thus the advertiser must go through these steps. That started to include a whole bunch of organizations that would not normally call themselves political under the old definition. I wish there had been more guidance, at both the federal and state level, on how to define what an issue ad is and what sort of IDs would be needed, so companies would be able to comply, but there wasn’t. We did it anyway, and I think there are a lot of lessons to be learned from that process to help make better legislation if you want to.

Then you had companies like Twitter just say, “You know what? We don’t want to deal with this, so we’re going to say no political or issue ads for anyone.” Google said, “We’re going to adjust the targeting a little bit.” What I personally would like to see is us looking at the overall campaign finance rules and, again, at what societal norms we’re okay with.

Daniel 16:17

Yes, completely. I think you make a really strong case for what’s broken, and at least in the US that’s the entire regulatory framework around paid political communications. Different mediums have different rules and different things they have to accept, and you see all the seams in the system: given the federal failure to define what should categorize a political ad on a platform, you get radically different definitions, with Facebook being much more expansive in that definition and Google being much more restrictive.

Then you’ve got companies like Twitter, which I think frankly made a really bad decision, because once you ban political ads what you’re doing is favoring corporate speech, even if that corporate speech has underlying political consequences. If you’re an oil and gas company you’re free to keep advertising for oil and gas, but if you’re an environmental organization trying to run ads to change consumer behavior, to get people to use more energy-efficient cars, let’s say, you’re going to be banned from Twitter. It seems like in that moment you’re tipping the scale in favor of certain interests over others.

Katie 17:28

Yes. I’m also worried, too, that there’s a world where this goes down where a bunch of the platforms say, you know what? We just want to be out of the politics business altogether, and you actually have less transparency into the political conversations. Whether you are a tech company, a citizen, in academia, or the media, anything, I do think being okay with being uncomfortable right now with these conversations, and being willing to have them to figure out what these norms should be, matters. I’d rather have this all out in the open, where we’re openly debating it, versus hiding it and pretending that it doesn’t exist because we don’t want to have the hard conversations.

Daniel 18:12

There’s also this idea that people are just these easy dupes of what they see online, whether that’s in a political advertisement or in a message from a friend, et cetera. That they’re going to radically change their political beliefs and ideas, and that that’s going to destroy democracy. I don’t really think the empirical evidence bears that out. People’s minds are generally really difficult to change. However, there is one important thing I do want to push back on you a little bit, and it’s to go back to your scale question. Yes, everything should be happening out in the open, and I buy that argument. However, there is something qualitatively very different now about the scale of conversations that can be happening around things like Facebook. To me, that’s a big difference here that needs to be really grappled with.

Katie 19:04

No, I 100% agree with you. I think the scrutiny on them and others is correct: to be more consistent in explaining what their policies are, more consistent in enforcing them, and in sharing what their beliefs are. I absolutely believe their feet should be held to that fire, but I also think it’s important for us to look at the broader ecosystem as well, because it’ll be interesting to see. Right now President Trump still has quite a hold on the Republican party even though he’s off these social media platforms. I’ve been thinking a lot about that influencer network and, to your point about what people will believe online, what does it mean who’s sharing it? How is it being amplified for them in different ways? Who would they believe?

Daniel 19:54

Can I ask you one more quick question? Thinking through what comes next, and in particular levels of accountability: we already talked about guardrails from regulatory agencies around the world, and they may or may not come, particularly in a country like the US. I’m wondering, from your perspective, how sensitive is a company like Facebook to public pressure? Help me understand why Facebook makes all the different policy shifts that it does, let’s say, in the course of an election. Is that all driven by various arguments being made externally that they’re then responsive to, or were you all having internal debates and arriving at, oh, this is a better course of action?

Katie 20:33

It’s usually a combination of things. One is the ongoing, changing environment of what is happening, and public pressure is absolutely part of that: what people are comfortable with, what role people are okay with companies having. Then some of it is from testing and seeing how the different tech solutions actually work. In political ads, at first the thought was, “Oh my God, we can never do this because the advertisers, not just political advertisers but regular advertisers, would go through the roof. What impact would that have?” Then it turns out, after you have some conversations, “Oh, maybe it’s not as bad as we thought.”

Then from our first round of what we launched in May of 2018, we learned a lot. Even just having it free-flowing, where you can put in whatever disclaimer you want, didn’t work. When you’re making these policies it’s hard to predict the exact situations you’re going to be facing and the exact context: what’s going to be happening, what’s going to be successful. It’s usually a combination of those internal, ongoing conversations with experts, et cetera, what the media is paying attention to, what policymakers are paying attention to, and what the company is seeing internally.

Daniel 21:48

What is your overall goal? Exactly in that context, when you say like disclaimers weren’t working, like how do you come to that definition of what works and what you’re striving for?

Katie 22:00

I don’t think there’s any true north star of what is ‘right’ but you just keep trying to tweak it to see what’s the best thing at that point in time to do.

Daniel 22:11

Awesome. Well, thank you so much.

Hey, Tressie, it’s good to see you here.

Tressie 22:20

You too, you got a haircut.

Daniel 22:22

I did, yes. I had to let the pandemic hair go. What I was hoping to have a conversation with you about today is: what do platforms owe us in democratic societies? That includes what they are responsible for and what they should be responsible for. You’ve written about many aspects of platforms: the platform economy, social platforms, and how people use them and perform on them. Most of my work is a little more around institutional politics, so campaigns, elections, sometimes movements. I wanted to get your take on what role they play in democratic life and what role they should play in democratic life.

Tressie 23:12

That second question is bigger than the first, because it’s so easy to point and label and see what they’re doing. I think our trick there is to not let them convince us that what we see with our senses isn’t happening. That’s the trick. We’ve got a very sophisticated public narrative about what platforms do, so sophisticated that I think it asks us to distrust our own experience of them, what we can see and touch and feel for ourselves.

Right now they can do just about everything that works for their business model. I think the question for us, both as citizens and for you and me, as people who think about these things on behalf of the citizenry, is to try to delineate what they shouldn’t be able to do. In my mind the very big answer to that question is they shouldn’t be able to ruin the framework that made them possible.

The business model, as most platforms stand now, is to disrupt the environment that made them possible. I really like to frame that for people: well now, you have to ask what they are disrupting and whether that is more valuable than what they offer us. You’re an institutional person. I can’t believe that at this stage in my life I’m having to defend institutions as an idea and as a functional good. Maybe, just maybe, what we have learned is that democracy as a practice and a process relies heavily on healthy institutions. Now, healthy, and what that should mean, is probably up for debate.

They just want to supplant institutions with the idea of an institution, rather than having the infrastructure of one. I think we’ve got to think about how much we rely on healthy institutions for democratic public life. What we’ve got to come together around is what the platforms shouldn’t be able to do if we value that, if we value public life, and I do.

Daniel 25:18

I think you keyed up a lot of the big questions. Here’s something I struggle with. When I look out at the landscape of what platforms do, I think there are two things. One is they make it really easy for people like Donald Trump, who want to undercut the workings of the Republican party, who want to mobilize white backlash, who want to storm the Capitol given their vision of what the country should be. At the same time, and often based on those very same functions, Facebook and other social media platforms have allowed Black Lives Matter to flourish, have allowed Black witnessing, to use Allissa Richardson’s term, and have enabled viral videos of police brutality to go around the world and build mainstream movements. It’s the same architecture that underpins both. One is in the service of democracy and one is not; one is in the service of equality and inclusion and one is not. Where platforms fit in that, and what they should or shouldn’t be solving for, often gets really tricky, right?

Tressie 26:40

Yes. We’ve been talking about that very question. The trade-off is how I describe it. That is the fundamental tension, and, okay, sociologist here, so I’m going to throw out the word: what we’re talking about here is that we’re living a dialectic, where those two things you just described co-create each other and stem from the same basis. That is what makes, I think, bipartisanship so attractive. I think that’s also what makes that confrontational style of ideology-as-identity that we see so much of in internet discourse so attractive.

I think that fundamental tension in what platforms do makes becoming those people really attractive, because you are actually making the same argument whether you are on the white supremacist side of the platform governance argument or you are on the Black Lives Matter side. This is something we’ve got to be honest about, careful but honest: you’re actually making the same argument. They’re coming from the same foundation, which is that the platform facilitates a counter-argument or narrative to the dominant narrative in a way that makes this world better for me and the people that I cast my identity lot with.

If you are of the belief that that’s the part that matters and you want to create an environment where that can happen, then you’re talking about how you make it possible for people to find each other, to make genuine, authentic connections, and, this is participatory media, to have some control over the terms of how that happens. That’s what you’re saying you want.

So then my question becomes, “Okay, is there a way to support that and not support finding your people and connecting with them in an authentic way so you can get together at the white power rally?” That actually turns me to the Black Lives Matter activists, who inspire a lot in me. One of those things is that they became increasingly sophisticated about their selective use and deployment of platforms as their organizing and movement-making matured on the ground.

I actually don’t think that right now, if you went out and asked some of these young people who have been out in the streets for the last 18 months, almost continuously, whether they thought social media, for example, was a net positive, they would say yes. They will actually tell you no, or they will certainly, I think, have a more qualified answer, like, “Well, it depends. Does it allow us to do consciousness raising? Yes. Should we be out here using it when we’re doing on-the-ground action? Absolutely not.” They have a far more sophisticated analysis of what parts of that work. Then I think the question for us is to bring people to the table to say, there’s a benefit to that kind of political mobilization. I’ve got to believe that we can create a governance structure that would allow one and not the other, but that’s the sophistication we need.

Daniel 29:49

Yes. No, and I really love that distinction, because I think so much of the research, as you know, around things like mediated social movements has focused on the public opinion, visibility, and media-practice side of things, but less so on exactly what you’re talking about: there then comes the transitional moment, and Zeynep wrote about this in her book, where you need to engage in the party building. You need to engage in the organizational aspects, and that is not always visible; sometimes visibility kills that work in some ways.

Tressie 30:24

Yes. We conflate what’s good for the platform with what’s good for activism and political life, because the platforms, again, encourage us to think of them that way. They’re not doubling down on their structure because it’s good for democracy. They’re doubling down on it because it’s very profitable, and wherever those two things diverge, we should choose political life over profit. I really think it’s that simple. But yes, you’re exactly right. I think about Zeynep’s work, and I think about, oh gosh, so much of the work that happened around tracing Black Lives Matter through social media content, scholars like Melissa Brown who trace these ideas across social media content and how those strategies changed.

Again, Black Lives Matter activists started doing that work that Zeynep points out too, that we see across Middle Eastern movements, especially the feminist movements that have been happening in that part of the world, where they understand that visibility is not a net positive. But see, the platforms would sell you on the idea that all visibility is good. That’s what I mean: we’ve just got to make a choice about who we want to believe on that, and we’ve got to believe that we can build a structure that would allow one without making it so profitable for everyone to do that no matter what their political aims are.

Daniel 31:46

The argument that you’re making now connects really, really nicely with work by people like Chris Bail, who has demonstrated that there are certain affordances built into platforms that encourage this form of performativity. Being visible is about performing status, performing political identities and political affiliations, which in turn leads people to very skewed ways of understanding the range of opinion that exists, not only within any one group, but also on the other side, or the many sides, so to speak.

Although I did want to ask you about something that follows on from that too, which is that so much of our media narrative especially, but also our academic narrative, puts so much blame on platforms that sometimes I feel like we miss the forest for the trees. Which is to say, a lot of what’s been happening in, let’s say, US politics since the Civil Rights era is this larger sorting of the two parties along lines of race, religion, and geography, so when we point to Facebook and say, you’re responsible for polarization, we’re missing a much broader movement afoot that relates to the two parties and the electorate.

Tressie 33:05

Yes, I would totally agree with that. I don’t think we’re at a point yet where we can credibly blame Facebook more than we would our residential segregation, and we chose that. What Facebook actually was able to do, or yes, Facebook as the stand-in for everybody, you get what we’re saying, is they made it more ruthlessly efficient to maximize our individual choices, which were already trending towards extreme polarization along race, class, and gender.

What platforms do, their underlying rationale, is a mathematical formula that makes more efficient whatever the social environment was already doing. It is an efficiency machine. You drop social media into a polarizing society and guess what it’s going to make more efficient? I have thought for quite some time that its efficiency is part of what we should be critiquing and debating and trying to figure out how to break apart, more than breaking apart Facebook, the company.

Its model of efficiency seems to me to be the more precise line of attack, and to get people to think about the trade-off of efficiency, both in our individual lives and for the health of our institutions and our politics and our political life, because you get rid of Facebook and you’re still going to have schools that are more segregated in 2021 than they were in 1969. That’s a problem. It’s the kind of problem that anybody who wants to profit, to make ridiculous amounts of money in this environment, is going to figure out a way to make more efficient and more routinized.

Daniel 34:52

I loved your metaphor of, what do you do if you drop a platform into an already polarized society and framing it in terms of efficiency. I think what’s so smart about your comment is that it raises the inevitable questions like what are you solving for? So much of our discourse has just not been clear on this basic question of like, what are you solving for, if you say you’re solving for efficiency, particularly like socially harmful efficiencies.

Tressie 35:20

Yes. One of our challenges is that something like Facebook does solve an implicit problem for us. That is, there are lots of people who want it to do exactly what it’s doing. The latent function, we might say, of something like Facebook is that it is solving for an X. We’re just not honest about the X, or overt about what it is solving for.

Listen, I was just in my Facebook group. I live in Pleasantville, and so my Facebook group for my neighborhood is 9/10 about schools, because that’s what you’re here for. 1/10 is about a lost dog somebody saw. And they are concerned about that school. The framing, because it was pitch perfect for our times: I think she called it, she called the school anti-academically gifted, in the interest of equity.

You have to think about it. You almost have to write that down and be like, “Wait. She said what?” That, to me, is what Facebook is solving for: the latent function. So, yes, target the efficiencies, absolutely. Care about the health of public discourse, so you want to control for some of this extreme manifestation so that it doesn’t get oxygen, absolutely, but then you still got the lady who is saying this about her neighborhood, and that is the underlying thing. The problem, when you talk about political polarization, is that politically this is still a really salient message: that we will help you maintain and reproduce that. Facebook and the like just happen to map onto that in a way that makes it politically viable in a polarized electorate.

Daniel 37:12

It’s not really clear, like what does Facebook do with that lady? I think that’s part of the challenge. Then also, you made me think historically, like where would that person then be without social media? They would be organizing, right?

Tressie 37:28

At the PTA, that is very true, but you wouldn’t have this cross section of PTAs that now almost functions like a lobby group. Yes, there’s something about the scale of the counter-movement that is required now to get to that lady, isolate her in her little house, and make her too afraid to seek out her kin on the internet.

Daniel 37:56

By the same token, we had a very political denial of tenure for Nikole Hannah-Jones at our school in the past two weeks. Those same tools enabled us as a faculty to use Google Docs to write a collaborative statement, put it up on Medium, send it out on Twitter and across the world, and do a massive organizing push. To, again, do visibility very well, exactly what you’re saying.

Tressie 38:22

I can imagine that Google Docs could still have been underwritten as a public communication tool. Not unlike the way we make telephone lines available to people in crisis and in communities, it could just be part of a working community. That statement could still have been done collaboratively and put up on public media, and it could have run on UNC-TV and across the PBS network stations and made it to NPR and then made it to CBS, and done all those things.

I got to believe that version is possible, and that the democratic impulse of the institutions underwriting the media would make them, not impervious, but less susceptible to extremism. Now, that’s me sounding again far more like an institutionalist than I’m comfortable with, and I totally get that and accept it and embrace it, but it’s a complicated society.

Daniel 39:23

The strength of what you’re arguing then is that there’s a clear set of values at stake, and they are public values at stake as opposed to what you’ve written about in your other work, as opposed to this very simple capitalistic value that’s being driven all by that efficiency and how to, in essence, create less friction to make more money.

Tressie 39:44

Yes. Here’s the thing, Daniel. I don’t think we get the public back if we lose it. I think those need to be the stakes. Not that, oh, some people are going to lose their jobs in Silicon Valley if this gets restructured, or that somebody won’t be able to find their friends on the internet, as great as that is. I think that public life is far easier destroyed than rebuilt, and we have to say that those are the stakes. I think once you accept that those stakes matter to enough people, that it’s worth defending and defining, you have a really different conversation about what Facebook should be doing and what they should be allowed to do.

Daniel 40:25

That’s an awesome place to wrap. That was a perfect closing. I don’t know if you did that on purpose.

Tressie 40:30

I didn’t but it worked.

Kathryn 40:36

Public life is far easier to destroy than to rebuild. Thank you to Daniel Kreiss, our host, for raising these questions today. Thank you to our guests, Katie Harbath and Tressie McMillan Cottom, for bringing their insights and expertise to these questions. Thank you for listening along with us. I hope you’ll join us again next week as we continue to discuss where we go from here and how we address these challenges. It will be a great conversation.

Does Not Compute is a team production. I’d like to thank our executive producer, Jacob Kramer-Duffield; our senior producer, Amarachi Anakaraonye; and our CITAP project coordinator and production assistant, Joanna Burke. Our music is by Ketsa and our artwork is by Contia’ Prince. You can find Does Not Compute wherever you download podcasts. You can also find us on the web at citap.unc.edu or on Twitter @unc_citap. Till next week.

 

Credits

Does Not Compute is supported by a grant from the John S. and James L. Knight Foundation.

Made by: Amarachi Anakaraonye (Senior Producer), Joanna Burke (CITAP project coordinator), Jacob Kramer-Duffield (Executive Producer), and Kathryn Peters (etc)

Music: Ketsa, Parallel Worlds

Art: Logo by Contia’ Janetta Prince