If you can’t stand the heat, get out of the kitchen


Digital violence, continued practices of exclusion and hate speech remain present online. Sexism, racism, anti-Semitism, ableism, and trans- and homophobia figure prominently in hate speech. Moreover, belonging to more than one group that is targeted online increases the risk of becoming a victim of digital violence. As Amnesty International confirmed in 2018, “women of colour, religious or ethnic minority women, lesbian, bisexual, transgender or intersex (LBTI) women, women with disabilities, or non-binary individuals who do not conform to traditional gender norms of male and female, will often experience abuse that targets them in unique or compounded ways”. This is dangerous: if socially discriminated-against groups experience additional violence in the digital sphere and therefore withdraw from participation, the rationality of socio-political discourse mediated by technology suffers.

Discrimination in digital spaces is not limited to forms of digital violence. Rather, the internet in many ways acts as a mirror of society, reflecting forms of discrimination as diverse as society itself.

The technologies that create, organize and expand the digital sphere are neither neutral nor unbiased; they are social constructions, always tied to existing relations of power, domination and discrimination. They have been shown to connect to colonial practices, in which the collection of social data already served to establish patriarchal power structures.

Data has traditionally been collected for surveillance and monitoring; from the very beginning, individual freedom and the right to privacy have been sacrificed for the sake of the alleged safety of all. In this context, it is particularly significant that surveillance and control have always manifested systems of social exclusion: “After all, surveillance has long functioned as a powerful patriarchal tool to control women’s bodies and sexuality. Online harassment, stalking, and other forms of sexualised violence often directly rely on practices and technologies of surveillance.” (Shephard, 2017a)

Technology is never neutral. Stereotypes of discrimination are manifested in code and carried into deep learning systems through the use of biased training data. The normalization and standardization of human bodies and lifestyles is implicitly inscribed in the code. Biometric facial recognition, for example, has repeatedly been shown to fail at identifying People of Colour because it usually relies on predominantly white training data sets. Similarly, the training data sets of autonomous vehicles tend to disregard non-normative bodies such as wheelchair users.

Such discriminatory systems are increasingly gaining ground: “Take for example full body scanners at international airports and how they disproportionately affect particular bodies, including people with disabilities, genderqueer bodies, racialised groups or religious minorities. To illustrate how algorithms are by no means neutral we can also revisit the discussions of Google image search results for ‘unprofessional hair’ (hint: black women with natural hair), ‘women’ or ‘men’ (hint: normatively pretty white people). Whether we argue that Google’s search algorithm is racist per se, or concede that it merely reflects the racism of wider society – the end result remains far from neutral.” (Shephard, 2016) Digital technology by no means makes us a community of equals; rather, it strengthens existing systems of power and exclusion. For this reason, digital innovation must always be critically questioned.
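To make this mechanism concrete, below is a minimal, synthetic sketch of how a skewed training set produces disparate error rates. The group labels, feature distributions and the 95/5 split are illustrative assumptions, not real facial-recognition data; the point is only that a model trained mostly on one group can look accurate overall while systematically failing another.

```python
# Synthetic illustration (invented data, hypothetical groups): a classifier
# trained on a 95/5 mix of two groups performs well on the majority group
# and poorly on the underrepresented one.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)

def make_group(n, offset):
    """Toy binary task: class 1 sits two units from class 0 along both
    features; `offset` shifts the whole group so that each group's
    classes occupy a different region of feature space."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(loc=offset + 2.0 * y[:, None], scale=0.8, size=(n, 2))
    return X, y

# Group A dominates the training data; group B is underrepresented.
X_a, y_a = make_group(950, offset=0.0)  # 95% of training samples
X_b, y_b = make_group(50, offset=3.0)   # 5% of training samples
model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.hstack([y_a, y_b]))

# Evaluate on balanced held-out samples from each group: the single
# decision boundary fitted mainly to group A misclassifies much of group B.
for name, offset in [("group A (well represented)", 0.0),
                     ("group B (underrepresented)", 3.0)]:
    X_test, y_test = make_group(1000, offset)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.2f}")
```

Rebalancing or reweighting the data, or collecting more representative samples, would narrow this gap, which is exactly why the composition of training sets is a political question and not merely a technical one.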

The sad truth is that the Internet is not a neutral platform for global empowerment. Rather, information and communication technologies mirror the structures of social power and domination in our societies; they are saturated with systems of discrimination and exclusion. Left unchecked, they will marginalize vulnerable groups online as well, and prejudice and discriminatory practices will be digitalized and exacerbated. We need to stop this: we need inclusive discourses liberated from capital and power structures, and we need to question established systems and discuss them openly. We have to admit that equality and justice can only be achieved if we bring everyone to the table, and that the issue of empowerment will continue to matter in the years to come.

This article was first published online on 12 November 2019 via hiig.de and is part of the publication "Critical Voices, Visions and Vectors for Internet Governance". The sources and literature used for the individual statements can be found in the publication.