Principal investigators: Prof. Dr. Christian Katzenbach, Dr. Paloma Viejo Otero

Lab: Platform Governance, Media, and Technology

Funding: YUFE Postdoctoral Programme

The project "Equality, Wellbeing and Platform Governance" asks whether social media platforms' policies on hate speech play a significant role in the discrimination, and consequently the wellbeing, of minority groups in Europe, and if so, how input from European minority groups into platform governance could contribute to improving access to equality and social wellbeing in the digital sphere.

Social Wellbeing and Discrimination

The fundamental premise of Social Wellbeing Theory is that a well-lived life is one in which individuals' social value lies in their capacity to contribute to society (Kayes 1998). However, European research points out that not all individuals begin their quest for social wellbeing with the same assets. For instance, Roma populations and homeless people continue to be subject to surveillance and securitization by local vigilante groups who organise online (Vasiuc 2019, Viejo Otero and Siapera 2015); a migratory background continues to negatively affect the life trajectories of migrants' descendants (Gabrielli and Impicciatore 2021); and individuals with diverse functionality continually encounter hard and soft barriers that ultimately determine long-term inequality gaps (Venturiello et al. 2020). Social Wellbeing Theory holds that, for a good life to be possible, it is the duty of social structures to facilitate an individual's ability to contribute to society by guaranteeing or facilitating access to opportunity, freedom, or equality (Adler and Seligman 2016). The question, then, is how social media platforms, understood as social structures, guarantee minorities' equality.

Traditionally, Europe derives from its history a duty of remembrance, vigilance, and combat against discrimination that targets specific groups, e.g. individuals with diverse functionality, Roma, Jews, or women (EU 2015). The European position, however, contrasts with that of the United States, whose understanding of hate speech regulation oscillates between freedom-of-expression absolutism, by which speech has no boundaries (O'Flaherty 2012, White 1996), and neutral viewpoints towards discrimination, which punish the act of discrimination regardless of the historical background of the target or the perpetrator (Altman 1993).

Platform Governance on Hate Speech and Discrimination

These differing approaches to hate speech regulation are reflected in how social media platforms govern content. Popular social media platforms operating in Europe are either based in the United States or follow the success of its business model. Their policies derive from the understanding that users have both the right to freedom of expression and the right to safety. Consequently, popular platforms like Facebook view equality through a neutral lens and operate hate speech policies through a security lens (Viejo Otero 2021).

In treating hate speech as a security matter, most platforms' community standards restrict blatant hate speech and incitement to violence by removing such content in less than 24 hours (European Code of Conduct). However, their policies do not specifically prioritise minority groups, nor the history or context of these groups. Instead, they adopt viewpoint-neutral standards for what constitutes discrimination, resulting in the free circulation of content that especially affects minorities (Matamoros Fernandez 2017). As a result, popular social media platforms neither facilitate minorities' equal recognition nor diminish the effects of the structural barriers that minorities encounter when contributing to society (Siapera and Viejo 2021, Siapera et al. 2018).

Considering all of the above, it is crucial to carefully examine popular and alternative social media platforms operating in Europe and their specific treatment of European minorities. It is important to document the effects that current social media platforms have on minority groups in Europe. And it is equally important to make visible the ideas and contributions of minorities towards improving the governing systems of current and future media platforms.

In light of these needs and the relevance of the present research, the proposed project is organised into three phases:

Phase 1: A systematic analysis of the policies of platforms operating in Europe according to their treatment of minority groups. The analysis will be followed by a categorisation of these platforms according to four approaches to hate speech regulation: the Social Justice Approach, the Neutral Approach, the Freedom of Expression Absolutism Approach, and the European Approach (Viejo Otero 2022).

Phase 2: To document sentiments and thoughts on the effects of popular social media platforms on minority groups through in-depth interviews, and to organise a series of workshops with minority communities to develop ideas for improving social media platform policies.

Phase 3: To inform platform governance policies with the views of European minority groups by editing a graphic book that includes blueprints of social media tools and models of governance that could potentially facilitate minorities' access to equality and wellbeing in the digital sphere.

(The project is currently in Phase 1.)



Prof. Dr. Christian Katzenbach

Institution: Centre for Media, Communication and Information Research of the University of Bremen (ZeMKI)

Building/room: LINZ6 60120
Phone: +49(0) 421 218 676 29
E-Mail: katzenbach@uni-bremen.de

Dr. Paloma Viejo Otero

Institution: Centre for Media, Communication and Information Research of the University of Bremen (ZeMKI)

Building/room: LINZ4 40240
Phone: +49(0)421-218-67656
E-Mail: paloma@uni-bremen.de

Updated by: ZeMKI