It is not news that Facebook – now Meta – can and does spread disinformation and hate speech, exacerbating division, discord and violence, and that extreme content is not a glitch but integral to its algorithm-driven engagement and profit model. Examples from around the world are numerous. Neither is it new information that this dynamic is playing out in Ethiopia: civil society groups in the region had been urgently highlighting the issue long before October 2021, when whistleblower Frances Haugen made headlines denouncing her former employer, including for its role in fomenting conflict in Ethiopia.
As digital activist Berhan Taye describes in a recent Guardian article, despite ongoing conversations with Facebook since 2018, ‘they’re actively failing every time’. The ‘Facebook Papers’, internal documents leaked by Haugen, reveal that the company has been well aware of its failure to address hate speech and incitement in Ethiopia since at least 2019. As posts that dehumanise other ethnic groups, spread lies, cast doubt on facts and straightforwardly call for violence continue to reach millions daily, Facebook’s inadequate response in Ethiopia is a scandal that can no longer be overlooked.
Internet usage has exploded in Ethiopia in the past decade: the country has gone from having the world’s second-lowest proportion of internet users in 2011 to some 21 million users today, alongside infrastructural improvements, increased mobility and rising literacy levels.
While conducting research with public servants in a rural area in 2014 and 2015, I witnessed how electrification, improved mobile networks and the large-scale deportation of undocumented Ethiopian workers from Saudi Arabia combined to drive a huge increase in smartphone use, especially among young people. Nationally, between 6.4 and 11 million Ethiopians now use Facebook, a number projected to grow threefold in the coming decade. The platform has become part of the language and texture of everyday life. One friend in Addis Ababa, whose family’s city-centre neighbourhood had been razed, grimly joked that their new house in one of the condominium blocks that circle the outskirts of the city was ‘like Facebook for them’, as they re-encountered previous acquaintances from all over the city who had also been forced to relocate.
The rapid growth of Facebook and other social media platforms has carried serious social costs. The role of Facebook in contributing to a ‘horrifying assemblage of disjointed, antagonistic, mutually cancelling communities and political groupings’, in academic Semeneh Ayalew’s words, has been especially catastrophic since the killing of musician Haacaaluu Hundeessa in June 2020 and the start of the conflict in Tigray and beyond in November 2020. Comparisons abound to the situation of the Rohingya in Myanmar, where the UN found Facebook had contributed to genocide. In Ethiopia and among the large global Ethiopian diaspora, hate speech is not the preserve of any one group. As Teddy Workneh, a media studies researcher, told Vice in 2020, ‘there is no doubt’ that hate speech on Facebook is accompanied by attacks on, and the displacement and killing of, ethnic and religious minorities in regional states. Facebook Papers documents seen by the South African publication The Continent show that the company itself entertains no doubt that individuals and networks are using Facebook to ‘coordinate calls for violence… promoting armed conflict, co-ordinated doxxing, recruiting and fund-raising’ for militia groups known to have carried out massacres.
Despite its assertion that lessons were learnt from Myanmar, Facebook has failed to invest in moderators with knowledge of local languages and contextual understanding, which in turn weakens algorithmic moderation, since automated systems depend on human judgements to train and correct them. Ethiopian law, in the form of a proclamation from March 2020, gives social media companies 24 hours to take down disinformation and hate speech. This deadline is very frequently missed: posts reported by users for inciting violence go unremoved for weeks, or are never removed at all, even where the company’s internal investigations have identified the users as coordinating hate speech. Perhaps the most shocking – if not surprising – revelation in the Facebook Papers is that Mark Zuckerberg responded to briefings by the Civic Integrity unit where Haugen worked with the instruction not to proceed with technical changes, such as making it harder to share harmful content, ‘if there was a material trade-off with MSI [Meaningful Social Interaction] impact’, a metric central to Facebook’s business model. As Nanjala Nyabola elegantly puts it, ‘the site was optimised to privilege virality over truth’, so as not to affect profit.
Under pressure as the situation in Ethiopia escalated in late 2021, Meta released a statement on 9 November, authored by Mercy Ndegwa, Public Policy Director for East Africa, and Mark Smith, Director of Global Content Management, conveying concern and sending their ‘thoughts’ to the people of Ethiopia. The details it gives of additional steps the company has taken do not include the actual number of human content moderators working in the four Ethiopian languages Facebook supports (Amharic, Oromo, Somali and Tigrinya), a figure activists have specifically demanded for years. The fact that there is any content moderation at all is portrayed as an act of corporate generosity: investment in local-language moderators, the statement says, comes ‘despite the fact that the country’s lower internet adoption means that less than 10% of the population uses Facebook.’ Fears that the ‘world-leading’ automated content moderation the statement boasts of (available in Amharic and Oromo only) cannot be effective without a nuanced understanding of linguistic and cultural subtleties are underlined by two sources who told The Continent that fewer than a hundred people are employed to moderate content across the four languages: roughly one moderator per 64,000 users. Part of a wider pattern of ‘chronic underinvestment in non-Western countries’, Facebook’s unwillingness to take full responsibility for its role in Ethiopia’s conflict is a scandalous example of corporate greed.