Last summer I visited my grandparents, who live in a village in the Anatolian part of Turkey. As a personal inquiry, I asked my grandfather what “Europe” represented for him; what the first words, symbols, and images were that came to his mind when I said “Europe”. He immediately answered: “progress, technology, intelligence, hard work…” and added, “we are corrupt and not hardworking enough. That’s why we are left behind.” I felt a cramp in my stomach, but I smiled.
There is nothing more disempowering and invalidating than believing there is something inherently wrong with your culture, upbringing, and ways of knowing, feeling, and making sense of the world. Do Europeans, or the West, have advanced economies because, as my grandfather believes, they know more, have technology on their side, and are more hardworking and morally superior to others?
There was a missing part in my grandfather's story. One that contradicts the globalized narratives of modernity and coloniality. It is a story of global exploitation, extraction, racism, capitalism, dualistic thinking and hegemony. It is the story of this invisible web of natural resources and generations of racialized human labor holding the shiny promises of efficiency marketed through technological advancement.
This is very often missing from the mainstream discourse around Artificial Intelligence (AI) technologies, too. AI exists in a socio-technical system, inevitably interacting with, shaping, and shaped by what is social. It is not only a scientific discipline and a business but also has its own mythology: “‘AI’ is best understood as a political and social ideology rather than as a basket of algorithms. The core of the ideology is that a suite of technologies, designed by a small technical elite, can and should become autonomous from and eventually replace, rather than complement, not just individual humans but much of humanity.”
AI and colonial power
AI is deeply entangled with the coloniality of power in various ways. The coloniality of power shapes not only the political and economic spheres but also the sites of knowledge production, perception, feeling, and imagination. In AI ecosystem(s), scientific knowledge, technological invention, and corporate profit reinforce each other and lead to consolidations of political and economic power. The economic system AI is embedded in is one of extractive capitalism. It molds the goals, modes of production, forms of labor, and the distribution of wealth and power around this technology.
In its current stage, AI is born into economies of scale and digital platforms. Big tech companies accumulate critical mass through huge returns to scale and network effects. This leads to the centralization of digital infrastructures upon which a considerable share of commercial and other activities is built. These major tech companies’ current reach and power are reminiscent of empires, except that now they extract data alongside natural resources and racialized labor from othered parts of the world.
The dominant discourse mystifies AI as a self-reliant, abstract technology. The popular imagination fixates on technology by ignoring all the natural and material resources as well as human labor that make AI possible in the first place. Kate Crawford and Vladan Joler’s Anatomy of an AI system shows all material resources, human labor, and data required for the lifecycle of a single Amazon Echo from production to disposal. This includes mining earth minerals such as lithium to produce the hardware, a huge infrastructure such as the internet, the labor of data labelers and more. Human labor is not only vital for refining, assembling, distributing, and transporting the physical and virtual components of a system, but consumers also continuously perform labor by generating data and ultimately helping to improve the systems.
Data as the ultimate source of knowledge
Increasing reliance on data as the ultimate source of knowledge imposes a new epistemological order based on the datafication/commodification of everything. The main training datasets for machine learning (MNIST, ImageNet, Labeled Faces in the Wild, etc.) originated in corporations, universities, and military agencies of the Global North. Socially constructed categories of race and the gender binary are cemented in data classification systems and taxonomies, reinforcing the social, political, and economic implications such categories carry for the distribution of power. The outcasts, minoritized bodies, and subalterns who don’t fit into these classifications and colonial taxonomies are subjected to algorithmic violence and discrimination.
The account of AI’s coloniality is vast and complex. In “Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence”, Mohamed, Png, and Isaac provide a comprehensive view of the sites of decoloniality in AI. Similarly, Ricaurte develops a theoretical model for understanding the coloniality of power in data in “Data Epistemologies, The Coloniality of Power, and Resistance”. This scholarly activity certainly contributes to what Adolfo Albán calls re-existence: “a strategy of questioning and making visible the practices of racialization, exclusion and marginalization, procuring the redefining and re-signifying of life in conditions of dignity and self-determination, while at the same time confronting the bio-politic that controls, dominates, and commodifies subjects and nature.”
However, decoloniality also has its own contradictions and questions. For instance, the language of decoloniality risks being co-opted: “how to write (produce) without being inscribed (reproduced) in the dominant white structure and how to write without reinscribing and reproducing what we rebel against.” Or: what is left of AI once it is decolonized? Do such questions necessarily lock decoloniality into the seemingly opposite positions of technology refusal and/or inevitable co-option? I hear the desire for purity and the binaries of colonial vs. decolonial that my mind produces in these questions. I try to resist, remember that “there is no proprietor or privileged master plan for decoloniality”, and appreciate the hybridity and complexity of things.
Rather than giving in to the modern mind’s urge for certainty, hierarchy, and ready-made answers, decoloniality can be seen as a continuous process and a praxis that is not only epistemological but also emotional, spiritual, and contextual. It’s about building situated patterns of the otherwise: living, reflecting, analyzing, theorizing, actioning, and leaving what was built (and co-opted) to start over. It is about centering life: “The decolonial option ... starts from the idea that ‘the regeneration of life shall prevail over [the] primacy of recycling the production and reproduction of goods.’”
The wisdom of not knowing
My grandfather is a farmer; his father was an imam and a farmer, too. My grandfather didn’t go to high school; he is not steeped in the type of knowledge most valued by modernity/coloniality. When I explained to him how AI was used in farming for disease detection and seasonal forecasting, he didn’t seem very interested. He knows that whatever he does, if there is harsh hail this year, they might lose an important share of their harvest. The threat of crop loss is always present, but I have never seen him defiantly complaining about the forces of nature (as he does about the forces of capital).
This spiritual humility and my grandparents’ personal relationship to nature and soil taught me there is wisdom in not knowing in the modern/colonial sense. It’s quite counterintuitive for the modern/colonial mind and its technologies, which sell the illusion of control obtained by knowing. At this point, it’s not about knowing more; it’s about respecting life’s constantly changing nature by honoring other types of knowing. In The Left Hand of Darkness, Ursula K. Le Guin writes: “To learn which questions are unanswerable, and not to answer them: this skill is most needful in times of stress and darkness.”
Edit & proofread by Katy McKinney-Bock.
 Crawford, Kate et al. 2014. ‘Critiquing Big Data: Politics, Ethics, Epistemology | Special Section Introduction’. International Journal of Communication.
 Lanier, Jaron. 2021. "AI Is An Ideology, Not A Technology". Wired. https://www.wired.com/story/opinion-ai-is-an-ideology-not-a-technology/.
 Crawford, Kate, and Vladan Joler. 2018. “Anatomy of an AI System: The Amazon Echo As An Anatomical Map of Human Labor, Data and Planetary Resources”. AI Now Institute and Share Lab, September 7, 2018. https://anatomyof.ai
 Pasquinelli, Matteo, and Vladan Joler. n.d. “The Nooscope Manifested: Artificial Intelligence as Instrument of Knowledge Extractivism”, 23.
 Mohamed, Shakir, Marie-Therese Png, and William Isaac. 2020. "Decolonial AI: Decolonial Theory As Sociotechnical Foresight In Artificial Intelligence". Philosophy & Technology 33 (4): 659-684. doi:10.1007/s13347-020-00405-8.
 Albán Achinte, Adolfo. 2008. “¿Interculturalidad sin decolonialidad? Colonialidades circulantes y prácticas de re-existencia”. In Diversidad, interculturalidad y construcción de ciudad, edited by Wilmer Villa and Arturo Grueso, 85–86. Bogotá: Universidad Pedagógica Nacional/Alcaldía Mayor. Retrieved from Mignolo, Walter D., and Catherine E. Walsh. 2018. On Decoloniality. Duke University Press.
 Gloria Anzaldúa, Light in the Dark, Luz en lo Oscuro: Rewriting Identity, Spirituality, Reality, ed. Analouise Keating, 7 (Durham, NC: Duke University Press, 2015). Retrieved from Mignolo, Walter D, and Catherine E Walsh. 2018. On Decoloniality. Duke University Press, 20-21.
 Ibid. 108.
 Bhambra, Gurminder K. "Postcolonial and Decolonial Reconstructions." Connected Sociologies. London: Bloomsbury Academic, 2014. 137. Bloomsbury Collections.