As former Facebook employee turned whistleblower Frances Haugen reveals damning evidence of Facebook’s inaction on hate, violence and misinformation shared through its platforms, Claire Thomas, Deputy Director of Minority Rights Group and member of the CREID Steering Committee, describes what it is like to flag hate speech and see little action, or “too little, too late”, from the social media giant. Facebook needs to work with partners to address this endemic problem and regain public trust.
“Frances Haugen’s critique of Facebook is timely and needed. Facebook (and other social media companies) continually say that controlling hate-filled content is a difficult and complex problem. If that were true, one would think that Facebook would reach out to all credible allies to help solve it. That has not been our experience.
Instead, it has proved difficult to even begin an in-depth conversation with Facebook about the findings of our research and their implications for its operations, its business model and the harm that Facebook and others are doing in divided societies,” says Claire Thomas, Deputy Director of Minority Rights Group and member of the CREID Steering Committee.
CREID partner Minority Rights Group has been working to understand and challenge hate speech against religious minorities in India, Iraq, Myanmar and Pakistan. These are contexts in which hate speech results directly in murders, physical attacks, and blasphemy allegations that can also carry the death penalty. As part of the CREID project, led by IDS and funded by FCDO, MRG and partners are not only monitoring hate speech but also discovering how to rebalance content with positive messages that can themselves go viral.
“Our partners routinely flag hate speech, but all of the platforms are slow to react; often, by the time the content is removed, the damage has been done,” reports Claire Thomas. “In our experience, the teams moderating content are understaffed. Much reliance is placed on automated systems, which do not seem to have a sufficiently sophisticated understanding of hate speech to identify it consistently.”
“Facebook is hoping that a repeat of Myanmar in 2018 will not happen, but to operate responsibly it needs to be able to guarantee that it will not, and from the information available to us at this time it seems far from being in a position to do so. We cannot allow a second mass atrocity fed by social media abuse to take place, and Frances Haugen’s intervention should prompt a great deal of honest reflection inside all social media platforms today.”
New podcasts share CREID research on hate speech
Last week, CREID launched four new podcasts sharing research on the extent of, and damage caused by, online hate speech, drawing on evidence from Iraq and Pakistan, and discussing ways to mitigate it. Needless to say, much more than algorithms is required to do this, though training software can help to identify and map the extent of hate speech.
- Incite! What is causing the rapid spread of hate speech against religious minorities and how can this be countered? Podcast interview with Claire Thomas (MRG) and Naumana Suleman (MRG)
- Tracking and monitoring online hate speech: keywords and software. Podcast interview with Haroon Baloch (Bytes for All) and Pshtiwan Faraj (Independent Media Organization in Kurdistan)
- Defining the language of hate speech against religious minorities: compiling a lexicon of key words. Podcast interview with Aila Gill (NCJP) and Pshtiwan Faraj (Independent Media Organization in Kurdistan)
- Countering online hate speech: creating a counter-narrative through good journalism and by working with young people. Podcast interview with Abdul Bari (Bargad), Haroon Baloch (Bytes for All) and Salam Omer (KirkukNow)
As well as featuring the podcast interviewees, the launch event also heard expert commentary from the BBC Disinformation Unit, BBC Urdu language service and the Avast Foundation.