Existing content moderation practices, both algorithmically driven and human-determined, are rooted in white colonialist culture. Black women's opinions, experiences, and expertise are suppressed, and their online communication streams are removed abruptly and silently. Studying content moderation online has unearthed layers of algorithmic misogynoir, or racist misogyny directed against Black women. Tech companies, legislators, and regulators in the U.S. have long ignored the continual mistreatment, misuse, and abuse of Black women online. This paper explores algorithmic misogynoir in content moderation and makes the case for regular examination of the impact of content moderation tactics on Black women and other minoritized communities.
Francesca Schmidt is drafting a new social contract for the digital sphere. Drawing on two core thematic and discursive areas, "Digital Violence" and "Surveillance versus the Private Sphere," she outlines what a gender-equal digital world might look like. In the process, she provides historical context through references to discussions dating back to the 1980s and 1990s, especially in the context of cyberfeminism.