The Coloniality of “Smart Borders”

The European border and asylum regime is becoming increasingly restrictive and violent. Digital technologies play a central role in this. Technologies that are supposed to prevent migration are making border crossings even more risky and deadly, as people are forced to choose more dangerous routes.

Close-up of a key on an old mobile phone; the key is labeled “Talk”

Only an understanding of the history of today's digital border regimes allows us to imagine a different, fairer future of migration.

The European border and asylum regime is intensifying, growing more restrictive, securitized, and violent. Digital technologies play a central role in this violent securitization of borders and asylum. The descriptor “smart borders” gathers together a number of projects intended to police and secure borders, including the prediction of crossings, surveillance and monitoring efforts, and the outsourcing of decision-making in asylum determination. At the border, we see that technologies intended to deter migration in fact make border crossing more dangerous and more deadly, as people are forced to take riskier routes.

Since 2015, the beginning of what media and policymakers refer to as the European refugee crisis,[1] better termed the long summer of migration,[2] the German state specifically has implemented a whole range of media-technological tools in asylum determination processes. These include biometric technologies used to verify the identity of people seeking asylum, such as the analysis of dialects that falsely ties someone’s dialect to citizenship, as well as the expansion of migration databases and their increased interoperability with police and intelligence institutions. Another practice, implemented by a number of European countries, is the extraction of phone data at border crossings or during asylum processes. Authorities claim they can reconstruct flight routes and smuggling networks by examining the data found on people’s phones, which supposedly reveal where both the phone and the person have traveled.

These data-driven tools are often framed as new and innovative. The German authorities present them as efficient and secure solutions for handling higher volumes of asylum cases and streamlining administrative processes. At the EU level, the so-called smart border package is hailed as increasing security and efficiency at border crossings. These rhetorics of newness present digital borders as unprecedented, without history and without continuity. In my research, I look at how so-called smart, digital technologies inform and transform political notions of belonging, citizenship, personhood, rights, recognition, and humanity. By analyzing what changes and what stays the same in the use of digital media technology in migration and border policing over time, I argue that we can better grasp how such technology (re)produces racist discrimination and vulnerability at borders.

 

The Case of Dialect Recognition 

One specific case of a digital and data-driven technology instituted by the German Federal Office for Migration and Refugees (abbreviated as BAMF in German) can be traced across histories of colonialism and the coloniality of our present. Since 2016, BAMF has used dialect recognition software in asylum determination to establish the country of origin of people seeking asylum. The software supposedly recognizes the dialect someone speaks based on speech samples taken from asylum applicants.

As I have written elsewhere, this software has been widely critiqued because it reproduces and increases the vulnerability of those subjected to it. It has a high error rate of 20 percent. It is in many ways black-boxed:[3] we don’t know exactly how it arrives at its conclusions, and information on the training data as well as the technical and algorithmic operations of the software is highly classified. The software has also reportedly been used on people who do not actually speak any of the dialects it can distinguish between.

More fundamentally, the software relies on the flawed assumption that language and dialect can reliably indicate a person’s origin, social background, or even nationality, drawing on commonplace lay assumptions about language as unchanging and people as monolingual. These assumptions produce what I call a linguistic passport: language is expected to serve as a form of official state identification. This belief has made language analysis an appealing method for state identification; unlike passports, which can be misplaced or forged, an accent is perceived as an inherent, stable trait tied to a person’s identity, and thus as harder to fake than documents. As a result, people seeking asylum are increasingly identified through their data, and their testimonies are increasingly replaced by biometric information and measurements.

 

Colonial sound archives in the past, digital speech recognition today 

Despite the praise for German state authorities’ data-driven asylum administration as innovative, the failures of technologies like dialect identification and their fundamentally flawed assumptions do not exist in a historical vacuum. Instead, this technology shows in a number of ways what I call the coloniality of migration infrastructures. By this I mean that histories of colonialism and racism are deeply embedded in the formation of supposedly new and innovative technologies.

The conception of a linguistic passport ties language to a place of origin according to what linguists call a language ideology, mapping linguistic boundaries onto territorial ones. This identification of linguistic with territorial boundaries disregards how dialects are distributed independently of geopolitical borders, especially because those borders are often products of colonial forms of border-making. Colonial powers divided up territories with no regard for linguistic communities. As such, the idea of discrete, distinguishable languages is itself an artifact of colonization and nationalism.

We can also trace dialect recognition technology back to the history of comparative linguistics as it was developed in early 20th-century Germany. While I was researching BAMF’s dialect recognition software, a number of people pointed me toward the Berlin Sound Archive (Berliner Lautarchiv). The archive holds a collection of voice recordings made during the First World War in German prisoner-of-war camps that predominantly incarcerated soldiers from British and French colonies. As I started to learn about this archive and listen to the recordings, I came to consider the colonial classifications of language and dialect that linguists established at the time. One of those POW camps was located in the small town of Wünsdorf in Brandenburg. Linguists saw this camp and its prisoners as an opportunity to record and collect what they called the “voices of the world” without having to travel to the colonies for field research. They attempted to establish a scientific method for collecting and analyzing the languages and dialects spoken in the Wünsdorf camp. The voice recordings were used for the development of fields like phonetics and musicology, in addition to colonization projects.

The voice recordings produced in Wünsdorf were part of the imperial project of the German Kaiserreich. Linguists attempted to establish a classification of non-German languages and dialects as part of a nomenclature of different cultures and races, which contributed to the establishment of “racial” science in the late 19th and early 20th centuries. The linguists tried to achieve this by creating a methodology of comparison and by correlating the voice with other parts of the body and with what they saw as cultural elements of the speakers. To this end, their research was accompanied by anthropological research that took measurements of the prisoners’ bodies to establish racial and racist classifications deeply tied to the development of scientific racism.

 

Prison camp in the past, initial reception facility today

The rationale of language recognition can thus be traced back to the early 20th century: it illuminates a mode of listening to an objectified voice, and a state desire to locate people by means of language and dialect, that was established amidst German imperialism. Not incidentally, the site of Wünsdorf’s former POW camp, also called the Halfmoon camp, now houses an initial reception facility (Erstaufnahmeeinrichtung) for people seeking asylum in Germany. In this sense, the history of the Halfmoon camp functions as an infrastructure that shapes and makes possible the further development of the racial listening of dialect recognition.

The Martinican writer Aimé Césaire famously described European fascism as colonialism’s boomerang effect, explaining how the continuities of colonial exploitation and the tactics of domination used in the colonies came to be employed against populations in European metropoles. Césaire’s indictment of European colonialism and fascism emphasized the continuities and the reproduction of exploitation and domination. The classification of language and dialect and the abstraction of people into data points do not come out of nowhere; they are built on tracks. That is, they work as an infrastructure through which the histories and present of colonialism and racialization form the underpinnings of “smart border” technologies.

The coloniality of smart borders also manifests in how border policing is used to innovate technologies and forms of governing migration. Specifically, the policing, or what governments often call the “management,” of migration becomes a testing site for new technologies and forms of governance. As the lawyer and anthropologist Petra Molnar argues, migration, and the border specifically, works as an incubator central to technological innovation more generally. Such experiments become possible because migration and borders often function as spaces of exception. Similarly, the POW camp functioned as an exception in which incarcerated soldiers were forced to lend their voices to the research activities of linguists and anthropologists. The way in which linguists used the POW camp as a convenient field site to capture “the voices of the world,” whose speakers they produced as culturally and racially different from Europe, reverberates in digital dialect recognition today.

Still, the continuous expansion of smart border technologies, legitimated through rhetorics of crisis, urgency, and racist emergency, is only one possible outcome of our present, one that we can confront and resist. Understanding these histories and continuities is central to confronting the racial exploitation at borders in the present and to articulating alternative, just futures of migration.


 


[1] The term refugee crisis has been widely critiqued for framing human movement as emergency and exception, rather than pointing to the crisis of borders as a global system of apartheid, as activist and writer Harsha Walia points out.

[2] To emphasize the summer of 2015 as a success of the autonomous movement of people across, and despite, the continuously expanding European borders, Yurdakul et al. propose the term “the long summer of migration.”

 

[3] Originally derived from cybernetics and systems theory, the term black box describes a system or device that provides outputs without revealing how its operations produce them.