Originally published in 4S Backchannels on 17th September 2021.
By: Ranjit Singh and Rigoberto Lara Guzmán
The possibilities of leveraging Big Data and AI-based interventions are often framed as innovations that flow from the Global North to the rest of the world. Such flows tend to position the Global North as the active center and the Global South as the passive periphery of these innovations. Problematizing these flows, we are collaborating on a project at Data & Society to map the conceptual vocabulary of AI in/from the Global South, beginning with the argument that the Global South is neither a passive recipient nor the periphery of emerging developments in these data-driven technologies.
Our project builds on the broadest conception of AI. Similarly, we consider the concept of ‘Global South’ to be equally diverse in its meaning, ranging from (1) a site in its own right to study situated technological developments, and (2) a method to understand, analyze, and build developmental, postcolonial, and decolonial computing practices, to (3) a metaphor for varying forms of suffering caused by capitalism and colonialism at a global scale, and (4) an effort to creatively resist and subvert such suffering.
Given the fractured data environments in Global South countries, building, maintaining, and appropriating data-driven technologies are infrastructural problems. We hope to identify concepts, keywords, and ideas that underlie the meaning(s) and future(s) of these infrastructural problems and their solutions. Aligning with this infrastructural orientation, keywords from the Global South often illustrate the ways in which infrastructures configure everyday lives and desires, and showcase experiences with data-driven technologies as emerging ways of life. They include, but are not limited to, keywords and concepts such as postcolonial computing, decolonial computing, data extractivism, human dignity as the bridge between AI and human rights, data colonialism, indigenous data sovereignty, feminist solidarity in design practices, and data justice. Quite predictably, these keywords have a different tone and different priorities from those prominent in the Global North, such as bias and fairness, accountability, transparency, human-centered explainable AI, and responsible AI, which often approach data-driven technologies as tools subject to human ingenuity. One approach does not replace the other; they add further nuance to our ability as analysts to frame the emergent challenges of living with data-driven technologies. Mapping the similarities and differences between the approaches of different countries to these technologies opens a space for future comparative analysis.
Extending this project’s frame of research, we are currently in the process of organizing a storytelling workshop, where we ask: What stories do we tell of a world that has increasingly come to rely on AI-based data-driven interventions to resolve social problems? How do we characterize the differences and similarities between these stories as they emerge from different parts of the world? When do such stories become illustrative parables to theorize the unevenly distributed conditions for and consequences of data and AI in/from the Global South? We see stories as a core resource for building a shared understanding around a research topic and for situating a shared sensibility about how an academic practitioner’s job is to be done. In the context of our project, they are also analytic resources for describing everyday forms of harm and redress that people encounter in living with data and AI.
Stories that have pedagogical value and elicit the nature of the practice of an academic discipline often share three features: (1) they provide the empirical foundation for conceptual work of a discipline; (2) they represent a situated perspective centered on particular forms of critique; and (3) they are inevitably entwined with the emotions they produce. In this workshop, we are inviting stories that provide empirical foundations for the emerging theoretical work on understanding and researching AI in/from the Global South. Our intention is to foreground everyday experiences that capture the challenges of leveraging data-driven technologies for development and infrastructuring data and AI in everyday life. This workshop is an effort to build a diverse collection of such stories on everyday experiences from across the world as illustrative parables to study the unevenly distributed conditions and consequences of data-driven technologies.
Parables are foundational building blocks of academic fields. For example, the story of racist bridges in New York City is central to STS scholarship on the politics of artifacts, and the story of the bicycle grounds diverse conversations on the social construction of technology. These stories are told, taught, and learned; they become readily cited paradigmatic case studies that support the legitimacy of theoretical frameworks and, sometimes, refute that same legitimacy. In short, they underpin a common-sense understanding of academic fields as professions. The power of such parables emerges from several interconnected aspects: they leave room for multiplicity in interpretation, yet preserve a concrete reference to real-life experiences; and they often capture broad theories and concepts integral to the practice of academic fields that resonate with common sense. For example, illustrative parables in the study of algorithmic harms and AI ethics in the United States include, but are not limited to, stories of discrimination in online ad delivery, machine bias in tracking recidivism, automating inequality in accessing welfare programs, the Gender Shades project on bias in facial recognition, and the protests of tenants living in rent-controlled units in Brooklyn against their landlord's use of facial recognition technology. Much like proverbs, these stories are used in an ad hoc manner within a range of situations to which they apply, but are dropped when their immediate relevance is not obvious. These stories have provided a common ground for ongoing conversations about fairness, bias, and accountability in algorithmic systems.
Our hope is that such stories will provide a foundation for developing a conceptual vocabulary grounded in everyday experiences of living with data and AI in/from the Global South. They are opportunities to explore ordinary ethics and the co-constitutive relationship between the collective and the individual, taking everyday stories of particular persons and communities as illustrative parables for building a field of study around AI in/from the Global South. To conclude this post, we argue that STS has a crucial role to play in this emerging field of study. Our collective experience in unpacking the implications of technoscience highlights that the search for illustrative parables is not an individual project, nor does it end with organizing one event. It is a process of understanding how the present moment of living with data and AI is simultaneously a reflection of diverse imagined futures and contested pasts, and how we are all implicated in this process in one way or another. Describing everyday life in a data-driven world requires that we take storytelling seriously as a craft and an intellectual practice. So, what is your story?
This post is a mashup of other writings for their project on Mapping AI in/from the Global South by Postdoctoral Scholar Ranjit Singh and Producer Rigoberto Lara Guzmán from the Data & Society Research Institute. Ranjit (he/him) studies the intersection of data infrastructures, global development, and public policy. His dissertation research advances public understanding of the affordances and limits of Aadhaar, India’s biometrics-based national identification infrastructure, in practically achieving inclusive development and reshaping the nature of citizenship. Rigoberto Lara Guzmán (they/he) is a xicanx producer, organizer, and strategist. They live in the forest as a remote tech worker focused on growing the technology research environment, facilitating critical dialogue, and theorizing decolonial computation at the Data & Society Research Institute.