Not everyone deserves their own Wikipedia page; that’s why Wikipedia’s notability guidelines exist. But definitions of notability are unevenly applied across race and gender lines:
Wikipedia’s editors are less likely to consider you “notable” if you’re not a white man.
Using a combination of qualitative and statistical analysis of Wikipedia pages nominated for deletion, Mackenzie Emily Lemieux and Rebecca Zhang join with Francesca Tripodi to explore how Wikipedia’s notability considerations are applied to female and BIPOC academics. They examined two key metrics used in the process of establishing notability on Wikipedia: the Search Engine Test and the “Too Soon” metric. The search engine test determines whether a person’s online presence is well covered by reputable, independent sources. In their analysis, Lemieux, Zhang, and Tripodi found that this test predicts whether a white male academic’s page will be kept or deleted. But academics who are women and people of color are more likely to have their Wikipedia pages deleted, even if they have an equivalent or greater online presence than their white male peers.
The second metric, “too soon,” is a label applied to Wikipedia pages when a Wikipedian thinks there aren’t yet enough independent, high-quality news sources about the page’s subject. Women of all races are more likely than men to be considered not yet notable (i.e., “too soon” to be on Wikipedia). The online encyclopedia’s editors were more likely to justify applying this label to women based on their career stage (e.g., “she’s an assistant professor” and therefore not yet notable), yet the tag was applied to women who were, on average, further along in their careers than the men who received it. Individual bias continues to disadvantage women and people of color on Wikipedia, and Wikipedia continues to allow these hidden biases to influence how notability is determined.
The strategies you teach about search engine literacy might be out of date
In a recent paper for the Library Journal, Francesca Tripodi, joined by Jade Angelique Stevenson, Rachel Slama, and Justin Reich, built and tested search literacy interventions for librarians. Using ethnographic methods in libraries in Montana, Tripodi and her coauthors explore the difficulties librarians face on the frontlines of the fight against disinformation. The librarians they spoke with argue that improving search engine literacy requires building trusting relationships with patrons. Additionally, Tripodi and her colleagues offer the following search literacy tips for librarians, teachers, and anyone else educating others to combat misinformation:
- Use Wikipedia as a starting point. People who use Wikipedia find better answers with greater accuracy in less time than people who don’t.
- Don’t judge a site based on its URL. Some .com sites are reliable and some .org sites are misleading! In addition, plenty of nonprofits produce untrustworthy information.
- Don’t trust website appearance. Attractive websites don’t always mean high-quality information; plain websites are sometimes accurate and helpful.
- Use lateral reading strategies. The best way to check the quality of a website is by leaving that website. Many people verify sources vertically, by reading About pages and checking the links provided by that website. In contrast, best practice is to distrust your own ability to verify the quality of a source and instead see what others are saying about it across websites (i.e., “lateral reading”).
In addition, this paper provides the following insights for policy makers and technology professionals:
- Trust ethnographic research methods. People routinely misreport their search practices, so ethnographic methods give instructional designers and UX professionals insight into how technological affordances and platform design shape information-gathering practices.
- Consider mobile first. In rural areas where broadband is limited, people increasingly rely on their phones for information. Mobile designs can shape search practices.
- Create a budget for training resources. Instructional resources for librarians will pay dividends in the fight against mis/disinformation.
You can’t understand privacy without understanding power.
When we think about “privacy,” we often think of an individual’s privacy, her individual right to privacy, and individualized strategies for retaining control over private information. This ahistorical way of thinking ignores how privacy violations are patterned along lines of marginality and domination. Populations identified as “dangerous,” like Muslims after 9/11 or transgender individuals in the current political climate, are disproportionately surveilled, their privacy deemed violable in the interest of other people’s safety.
Unfortunately, as Marwick shows in this paper, such systemic issues are rarely addressed at global privacy conferences like IAPP and SOUPS. Instead, techno-optimism rules the day, complete with sales booths offering the newest products individuals might use to protect themselves. But privacy isn’t an individual problem that individual solutions like fiddling with privacy settings can fix: privacy violations are gendered (most stalkerware is used against women by current or former male partners) and raced (as Black and brown communities in the U.S. are all too aware). Conceptualizing privacy without power is simply inadequate in the current moment.