(Research summary by Katherine Furl)
Technologies provide their users with a wide variety of affordances, that is, the range of actions a given technology makes possible. Though researchers have considered how platform affordances shape users' actions, less research has focused on how technological affordances can be used in interactional, participatory ways. In "'Do Your Own Research': Affordance Activation and Disinformation Spread," Francesca Tripodi, Lauren Garcia, and Alice Marwick consider how the participation and interaction made possible through technological affordances are central to the spread of disinformation.
Tripodi, Garcia, and Marwick reveal how the ability to 'verify' false claims through platform affordances allows users to participate collectively in the spread of disinformation. The authors call these instances of 'affordance activation,' in which users activate the capabilities of affordances while attempting to verify false claims. They present three cases of affordance activation in the spread of disinformation across three very different platforms:
(1) claims on Twitter that conservatives are being silenced
(2) the use of Google Scholar citations to legitimize pseudoscientific, white supremacist journals
(3) QAnon participants’ use of Yandex, a Russian search engine, to validate a conspiracy claiming the US furniture company Wayfair is involved in human trafficking of children
These three cases illustrate how users leverage platform affordances—whether they be Twitter hashtags, citation metrics on Google Scholar, or the lack of content moderation on Yandex compared to other popular search engines—to lend credibility to unsubstantiated claims and white supremacist and conspiratorial logics.
Importantly, Tripodi, Garcia, and Marwick distance affordance activation from users’ intent as well as the overall legitimacy or veracity of their claims. Instead, the authors focus on technological capabilities and their outcomes. As the authors put it,
“Engagement is less about the veracity of the claim and more about interacting with the affordance.”
By actively participating with the unique affordances of different technological platforms, users are able to directly engage in the spread of disinformation in novel ways. As Tripodi, Garcia, and Marwick reveal through presenting three demonstrative cases of affordance activation, the potential implications of affordance activation are wide-reaching and vital to our understanding of how disinformation spreads and false claims are legitimized.
Digital trace data can change the study of misinformation, algorithmic bias, and well-being.
In this coauthored article, Deen Freelon joins with scholars across the United States and the Netherlands to discuss methods of digital trace data collection and measurement. Digital trace data (or DTD) are the tracks we leave in the snow of the digital spaces we occupy (e.g., cookies in your browser after an online shopping spree or the log data tracked by social media platforms). Although we commonly think of these kinds of data as privacy violations, they are also an opportunity for researchers to overcome the known limitations of platform-centric and user-centric approaches. Of course, collecting, organizing, analyzing, and measuring these data in meaningful ways comes with its own challenges. Read on for a deep dive into best practices for using DTD.
The strategies you teach about search engine literacy might be out of date
In a recent paper for the Library Journal, Francesca Tripodi, joined by Jade Angelique Stevenson, Rachel Slama, and Justin Reich, built and tested search literacy interventions for librarians. Using ethnographic methods in libraries in Montana, Tripodi and her coauthors explore the difficulties librarians face on the front lines of the fight against disinformation. The librarians they spoke with argue that improving search engine literacy requires building trusting relationships with patrons. Additionally, Tripodi and her colleagues offer the following search literacy tips for librarians, teachers, and anyone else educating others to combat misinformation:
- Use Wikipedia as a starting point. People who use Wikipedia find better answers with greater accuracy in less time than people who don’t.
- Don’t judge a site based on its URL. Some .com sites are reliable and some .org sites are misleading! In addition, plenty of nonprofits produce untrustworthy information.
- Don’t trust website appearance. Attractive websites don’t always host high-quality information; plain websites are sometimes accurate and helpful.
- Use lateral reading strategies. The best way to check the quality of a website is by leaving that website. Many people verify sources vertically, by reading About pages and checking the links a website provides. In contrast, best practice is to distrust your own ability to verify the quality of a source and instead see what others are saying about it across websites (i.e., “lateral reading”).
In addition, this paper provides the following insights for policy makers and technology professionals:
- Trust ethnographic research methods. People routinely misreport their search practices, so ethnographic methods give instructional designers and UX professionals insight into how technological affordances and platform design shape information-gathering practices.
- Consider mobile first. In rural areas where broadband is limited, people increasingly rely on their phones for information. Mobile designs can shape search practices.
- Create a budget for training resources. Instructional resources for librarians will pay dividends in the fight against mis/disinformation.