
Tailoring Communications About AI to Different Audiences

This week’s readings offered a variety of perspectives on the harms and complications of machine learning and AI. The ProPublica article, Machine Bias, by Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner took both a quantitative and a qualitative approach to understanding the potential bias inherent in risk assessment algorithms. Their findings suggest that risk assessment scores, commonly used in courtrooms around the country to estimate someone’s likelihood of recidivism, are more likely to label people of color as high-risk offenders than their white counterparts. The article couples data analysis with interviews with individuals directly affected by this system. With visuals embedded throughout, this piece seems to have the most general audience in mind: the language is very accessible, and the images allow the reader to connect with the story in a more humanistic and emotional way. The authors also break down their data analysis in a way that translates to a broad array of readers, which makes its impact potentially more far-reaching than the other two publications we read this week.

The paper presented at the 2022 ACM Conference on Fairness, Accountability, and Transparency, Accountability in an Algorithmic Society: Relationality, Responsibility, and Transparency, definitely read like an academic paper. I enjoyed this piece because it discussed how we can operationalize accountability, and the authors drew on scholars and theories from several other fields to support their claims. The interdisciplinary nature of the article exemplified how I think we should be discussing public interest technology issues, since they rarely exist in isolation and require analysis from social, political, economic, and technical perspectives. The paper did take me quite a while to read, though, and I suspect its impact is limited mostly to academic and professional audiences.

Lastly, the We are AI comics by Julia Stoyanovich and Falaah Arif Khan offered a creative and refreshing way to educate the public about AI. Using colorful images and relatable examples to explain how AI works and what its implications might be is a great way to make this information accessible and useful to the broader public. The comics are something I would feel comfortable sharing with many people, regardless of their age or education level. The note at the beginning of each comic was also helpful, letting readers know in a simple and direct way how the materials can be shared and how to properly credit the authors. We talked about the importance of acknowledgment and giving credit where it’s due, especially in the tech space, so including that page at the front of each comic actively combats that problem and makes it easy for people to credit the creators’ work.

I think each of these publications is successful in communicating the complexities of machine learning and AI in a different way. The ProPublica article was empirically driven but humanistic. The ACM conference paper was well articulated and well supported, but it is likely to be accessible only to those already familiar with the literature. The AI comics were fun, creative, and simple, but it is difficult to imagine how people would find them if they weren’t already looking at the Data, Responsibly website.

Data Journalism as a Mechanism for Unraveling Systems of Oppression

This week’s readings about data journalism further complicated my understanding of data and how we use it. In the podcast interview with Mimi Onuoha and Lam Thuy Vo, they discussed the ways in which documentation can be a form of violence against already-marginalized communities. Documentation, in itself, is a way of reducing people or stories into something more easily understandable or measurable. It strips data of crucial context that is necessary for understanding the complexity of human systems, and it works to actively disempower people and knowledge. They also note that many of the questions we have as data journalists ultimately center on power: who holds power in this data? Who loses power? Who holds power over this data? These questions are incredibly important to keep at the forefront when collecting and analyzing data, because data does not exist in isolation from human beings and their communities.

Building on this argument, Nikki Stevens talks about the ways in which the concepts of “dirty” and “clean” data can further reinforce harmful stereotypes. She states that “a focus on a data’s cleanliness is a way of controlling which knowledge is ‘valid’ and is directly counter to intersectional aims” (p. 12). This quote struck me, as did our conversation about raw data as an oxymoron last week, because as an academic I have been primed to accept the clean/messy data binary. Consequently, I never understood the ways in which this standardization can be used to eliminate some of the nuance that exists within and between human relationships. Taking an intersectional approach to data collection and analysis is a substantial goal; however, the author notes that the positivist assumptions made in quantification are somewhat incompatible with the broader intersectionality framework. The question this article left me with was: how can we reimagine data journalism within a completely new, Black feminist structure that seeks to flip conventional power dynamics on their head?

During his talk, Alex Howard explained that data + journalism + activism + responsive institutions = social change. The articles previously discussed tackle the first three components of this equation by showing how data journalism can be a form of activism that empowers vulnerable communities. However, I believe responsive institutions are among the most important ingredients for social change. How do we make institutions responsive to us as the public? Understanding this accountability mechanism disrupts the status quo and requires people to take power in a way they previously could not. This, I believe, is the core role of data journalism: to shift narratives away from outdated social and power structures and to center voices that have previously been silenced. Data journalism has the capacity to reinvent how we think about numbers representing people while also highlighting the limitations of that logic.

Beyond the Public, Beyond the Commons

Though it was written nearly a century ago, John Dewey’s The Public and Its Problems remains a relevant piece of literature that aptly describes the relationship between technology, politics, and the public. Dewey notes that “indirect, extensive, enduring and serious consequences of conjoint and interacting behavior call a public into existence having a common interest in controlling these consequences” (126). My interpretation of Dewey’s definition is that a public exists anytime people have a shared interest in cooperating because the consequences of not doing so would be large and experienced by all members of the public. Under this definition, the United States could be considered a public, as we share an interest in working together to preserve the stability of our democracy. On a larger scale, the entire human population could be considered a public, as we each share an interest in avoiding things like extreme climate change, whose consequences would impact every one of us.

The public can only thrive, Dewey argues, once it is converted into a Great Community. The Great Community takes the idea of the public and extends it beyond the incentive framework and into a space centered on human connection. To this point, Dewey says that “association itself is physical and organic, while communal life is moral, that is emotionally, intellectually, and consciously sustained” (151). Humans instinctively seek to form relationships with others, but community requires action and motivation to maintain harmony.

This is a difficult task for societies because man is a selfish creature, and here is where we encounter the tragedy of the commons. Hardin’s tragedy of the commons is a theory that is twofold: on the one hand, when humans share a resource (i.e., the commons), they must constantly weigh the costs and benefits of taking for themselves versus leaving some for others. On the other hand, individuals each play a part in creating problems for the commons that everyone in the public must endure (e.g., pollution, waste, etc.). Hardin contends that there is no technical solution to the problem of the commons, and based on this week’s readings I would argue that the solution is community. Social ties, respect for one another, and respect for the commons are the foundation for overcoming the tragedy. Ostrom et al. support this theory, stating that “although tragedies have undoubtedly occurred, it is also obvious that for thousands of years people have self-organized to manage common-pool resources, and users often do devise long-term, sustainable institutions for governing these resources” (213).

The commons is at once a space for tragedy, a space for solutions, and a space for political and social contest. As with all other human systems and phenomena, the public and the commons that exist around it are constantly in flux. Our needs, priorities, relationships, struggles, and ideologies will continue to change, and how we navigate those changes is also continually in motion. This conclusion was heavily influenced by Harney & Moten’s The Undercommons. Harney & Moten push us to challenge our understanding of the commons and how different classes of people are valued within that framework. The undercommons represents a revolution that does not recognize the commons as a defined space predetermined and managed by colonizers. The commons as we know it, they believe, should be upended to create a new society that is inclusive and beyond politics, because politics is no longer necessary: people engage with governance and with one another without needing politics because they are one community above all else. To end this argument where we began, Dewey makes it clear that “‘we’ and ‘our’ exist only when the consequences of combined action are perceived and become an object of desire and effort” (151).

Technology presents new challenges to the public and serves as yet another extension of the commons. Public interest technology is a field that helps address the issues of the undercommons and the inequalities that marginalized groups face. It centers transparency, equity, and accountability to determine not just how we can use certain technologies, but why and by whom. Though regulation has already lagged far behind innovation, creating something different and something new is always an option the public has for reclaiming the commons.