E-paper

The state of content moderation for the LGBTIQA+ community and the role of the EU Digital Services Act


Social media platforms play a crucial role in supporting freedom of expression in today's digital societies and can empower groups that have previously been silenced. However, platforms also host hateful and illegal content, often targeted at minorities, while legitimate content is prone to unfair removal by algorithmically biased moderation systems. This report analyzes the current environment of content moderation, bringing to light negative effects for the LGBTIQA+ community in particular, and provides policy recommendations for the forthcoming negotiations on the EU Digital Services Act.

See also the study "Algorithmic misogynoir in content moderation practice" by Brandeis Marshall.

Product details
Date of Publication
June 2021
Publisher
Heinrich-Böll-Stiftung European Union and Heinrich-Böll-Stiftung Washington, DC
Number of Pages
23
Licence
Language of publication
English
Table of contents

List of abbreviations
1. Background and objectives of this report
1.1. Power structures in content moderation
1.2. The cases of Salty and PlanetRomeo
2. Different approaches to content moderation
2.1. PlanetRomeo’s community model for moderating content
2.2. Technical solutions and their challenges
3. The regulatory framework in the EU on content moderation
3.1. From the E-Commerce Directive to the Digital Services Act
3.2. A crucial distinction: Harmful vs. illegal content
3.3. Lessons from NetzDG
4. Policy recommendations for the EU Digital Services Act
4.1. Involve users and civil society
4.2. Educate and prevent through EU programs
4.3. Introduce measures for a sustainable digital ecosystem with diverse platforms
4.4. Foster content moderation practices that are more inclusive and accountable
References
