Between 1996 and 2002, South Africa convened a series of hearings—as part of its recovery from the injustices of apartheid—known as the Truth and Reconciliation Commission (TRC). Having heard from apartheid’s victims and perpetrators, one outcome stood out: there was very little overlap between those seeking restitution and those seeking amnesty. This was a period of disjunction, split between those who came before the Commission and repeatedly said they could not forget the harms done to them, and those who sought amnesty but repeatedly could not remember any details of their acts and complicity. Can’t forget–can’t remember1.
We begin with this TRC story because it is the remains of a fire. A fire burning across centuries, even today, and always near. In history’s ashes we find recurring patterns of people and entire societies left with unforgettable memories of harm and pain, somehow coupled to a larger society that can’t remember those harms, or worse, never knew. For we who are dedicated to developing advanced technologies using machine learning and artificial intelligence (AI), this fire is also burning within our field of study. As history reminds us, technological evolution tends to set ablaze high hopes coupled with cruel capabilities. So as technologists and researchers, this story should lead us to ask: are we today the ones who can’t remember?
That ever-near fire is fueled by colonialism, and its contemporary forms. So this first post tells other stories of the fire, connects colonialism to our work in AI, and describes one way to understand the dynamics of power in the world and in our field, using decolonial theories. This piece, and those that follow, is part of our work and action to put the fire out.
“Mankind barely noticed when the concept of massively organised information quietly emerged to become a means of social control”. You might guess that this is a recent description of algorithmic harms that emerged over the past two decades. No. This is Edwin Black’s description of the world of 1900 and the invention of the punch card and a new wave of automation and data collection2. During this period, the little-known fields of computing and statistics, the pillars of today’s field of AI, became turbo-charged with the ability to sort through large amounts of data. With enhanced statistical ability, we were now able to accurately, and at national scale, identify objects of interest, assess the benefits of competing choices, organise the best sequence of actions, and finally audit the efficiency of chosen actions. This was a boon for almost every sector. But what if those actions and efficiencies were about the murder of Jewish people?
This is how statistics, and the tech industry of the age, entered its dark phase and turned towards evil ends and genocide. For Nazi officials and throughout the second world war, statistics became “invaluable for the Reich”, and the Reich hoped to “give statistics new tasks in peace and war”3. By abstracting away from the context of its work, statistical science uncritically advanced its methods for greater accuracy, efficiency and financial profit—and enabled the death of millions. Importantly, statistics and computing were not alone in this programme of dispossession and dehumanisation. That work was carried out, supported and entangled—in the US, and across Europe and its colonies—by corporations and national structures and incentives and powerful leaders and corrupt ideologies. Do we remember this foundation of our field of AI?
This story is connected to another thread of seemingly unrelated history. In 1810, Saartjie Baartman—22 years old, a member of the Khoi people of Southern Africa, hard-working, fluent in at least !Xam, English, French, and Dutch, going through her life searching for growth and love and meaning—found herself trapped in London, exploited, and on public display in a human menagerie. She was used to generate profit, to indulge the fetishisation of Black women, and to establish the racist science of human difference. And Saartjie was just the first of many other members of the Khoi, including children, put on show in this way. One beginning of a long history of African (and other) peoples, and their diaspora, used for profit and experimentation.
A century later, in 1904 the Herero and Nama peoples of Namibia were subjected to even worse treatment: forced into sterilisation, medical experimentation and eventual genocide. Namibia was treated as a site of medical beta testing, where medical methods (of harm) were first developed, tested and refined; these methods were then later exported and used against the Jewish people by Nazi Germany—already identified and in-waiting due to the efforts of statistical analysis. This mindset of experimentation and mistreatment is not an aberration, and has continued unabated into our present: in the clinical development through forced contraception of MPA in Zimbabwe in the 1970s, one of the world’s most widely-used and, today, safest drugs for contraception; in the ongoing questions of illegal blood exports and data and ownership during the 2014-2016 West Africa Ebola epidemic; or in the suggestion in April 2020 by French scientists that vaccines for COVID-19 should first be tested in Africa. There is a historic continuity from 19th century slavery and exploitation, to 20th century medical beta testing and genocide, to 21st century experimentation and data acquisition—and the computational methods that connect them all4. This continuity has directly affected Africans, but is entangled with our larger global history. Here again, are there those who can’t forget and those who can’t remember?
The fire burns still.
The difference between those who can’t forget and those who can’t remember is one of power. The powerful are those—individuals, organisations and systems—that, in some way, have authority: they project values and standards, shape thinking and discourse, and influence the outcomes of socio-political processes. There are wide-ranging theories of power, but we give a central place to the history and theorisation of colonialism and its views on power, because of its broad view of the technical-social-political world.
Historical colonialism was enacted by territorial appropriation, exploitation of the natural environment, exploitation and enslavement of people, and the direct control of social and economic structures. In this way, colonialism sought to exert power over every part of land and life across the world, fundamentally altering the course of human progress. Although formal colonialism has ended5, its effects endure in the present, and when these colonial characteristics are identified with present-day activities, we speak of the more general concept of coloniality6 7 8: coloniality is what survives colonialism9.
The suffering of Saartjie Baartman and of the Herero people was produced by colonialism. And today, coloniality reproduces colonialism’s mindset and ills, now amplified by predictive systems and AI. One recent example is the operations of the political consulting firm Cambridge Analytica (CA). CA beta-tested tools for political influence during the 2013 Kenyan and 2015 Nigerian elections, with these countries chosen, in part, because their data protection laws were weaker than those of CA’s base of operations in the United Kingdom. CA’s work was found to have actively interfered in these electoral processes and worked against social cohesion10. And repeating a story we’ve heard before, the refined versions of their predictive tools were then exported back to the US and UK during those countries’ own electoral processes. In this story, we see coloniality set ablaze globally: a British company, using data obtained from hackers in Israel or illegally from US-based social media platforms, able to reach into the lives of people across the world, in Kenya and Nigeria and the US and the UK, solely driven by motives of enrichment and profit, and supported by global networks of finance and infrastructure and data and science.
Decolonial theory gives us a framework with which to understand systems of power that, as in this example, are continuities of colonial practices. In decolonial theory, power can be understood by how it exercises control over social structures. Quijano defines four axes of this control, through authority, economy, gender and sexuality, and knowledge and subjectivity; Maldonado-Torres similarly uses three axes, of hierarchies of race, gender, and geopolitics. In addition, this cartography of power can be further elaborated by demarcating the terrain of analysis: by identifying the metropoles—the centres of power—and their peripheries, which hold relatively less power and contest the metropole’s authority, participation and legitimacy in shaping everyday life11.
These decolonial theories can serve as transformative tools for research and impact if infused into our field. By doing so, we can create a decolonial field of AI that takes advantage of history and memory, invigorates our creative and design processes, and expands our understanding of what meaningfully contributing to well-being in the world using technology can be. This new field of AI is not out of reach, and its forms are already emerging. We see it in the widespread commitments to fairness and transparency, in the wide adoption of ethical frameworks and charters, in the deep considerations about misuse in areas like synthetic media and natural language processing, in the major successes preventing future harms from facial recognition by our field’s leaders, in new approaches for auditing predictive systems, and in the openness to engage in hard conversations about the role of colonialism and coloniality in the field of AI.
It is tempting to think that colonialism has no role in science, and that—for us as individuals, collectives and organisations—imperialism is in the far past, over, and of little relevance to our work today. But reflecting on the small things in our daily lives reveals the untenability of any position on coloniality’s irrelevance. You are reading this post in English because of colonialism; your national identity is shaped by borders created by colonialism; the train lines that came to villages in Europe were first paid for by enslavement and indenture; the concept of the passport was created by empire; the production of our phones and devices relies on inequalities created by colonialism; the labelling of data we use to advance our state of the art is mainly done in former colonies; the foundations of our field were advanced by an unquestioning drive for greater empire. If for no other reason, the hindsight we can all claim as an inheritance from colonial history is our advantage, and a powerful tool of foresight into the alternative futures we can create.
Our hope, over the course of this series, is to build a dialogue with readers about what a decolonial field of AI could be, to identify all the points of agreement, disagreement, contention and alterity, and to explore new ways of doing research together that meet our common aspirations to responsibly advance our field in support of greater prosperity, nurture and wellbeing.
We can put out the fire. Here is one more tale of how.
Resurrection, by Koleka Putuma12
The graves are bleeding trauma
The memories say, let me out
The massacres say, remember me
The graves say, it still hurts
The skeletons point to where it does
The blood says, find me on the perpetrators’ hands
The blood says, wash me from the victims’ body
The blood says, do not let the children see the bath
The blood says, do not let the youth wash in it
The blood says, the grave is no place for healing
There is enough blood spilled to map the countries it has flooded
There is enough blood lost for us to die
There is enough blood left for us to live
Their tongues are burning in our mouths
When we talk about our history, we put the fire out
This essay was written safely indoors while listening to We’ve Landed by Tony Allen & Hugh Masekela (Absence dub remix). This essay series is an alternative exploration of the paper Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence. It is one of many to come, and from many others, to experiment with the decolonial.
– Shakir Mohamed
In the Decolonial AI paper, we used a different historical thread of experimentation on African Americans, starting with 19th century gynaecology experiments, to the 20th century Tuskegee syphilis trials, to 21st century algorithmic discrimination.