Ethiopia. The complicit silence of Facebook in the spread of ethnic hatred against Tigray and Oromo.

While Twitter has acted firmly to block messages of ethnic hatred and incitement to genocide created by the Eritrean and Ethiopian regimes and aimed at the Tigray and Oromia ethnic groups, Facebook has adopted an incomprehensible passivity. After receiving heavy criticism from the international community for this behaviour, Mark Zuckerberg promised to apply stricter moderation. Those promises appear to have come to nothing.

This is not a question of controlling a few isolated extremists acting online. It is about blocking criminal networks of genocidal and ethnic propaganda, made up of thousands of users and designed to craft effective slogans and messages that push people to act against the two main ethnic groups: those of Tigray and Oromia.

The code of conduct that governs the use of the Facebook platform is very detailed in prohibiting the spread of ethnic hatred, which it divides into three levels of severity.

The first level concerns content aimed at individuals or groups of people that contains statements of incitement to or support for violence, or dehumanizing speech or images. The second level concerns content asserting the cultural, moral, physical or intellectual inferiority of a person or group of people, or expressing hatred, rejection or disgust towards a particular group. Messages containing expressions of contempt or of anti-Semitic, sexist, xenophobic, homophobic, racist or Islamophobic intolerance are also prohibited.

The third and final level concerns content aimed at people or groups of people that explicitly calls for their segregation or their political, economic or social exclusion, threatening statements or wishes in support of violent action against them, and incitement to violence and genocide.

Facebook warns its users that it will act in full autonomy in removing all posts containing messages falling within these three levels, and in removing a user's page if the messages of ethnic hatred and incitement to violence are repeated and amount to a clear racist and genocidal communication strategy.

These rules do not seem to apply to the Ethiopian civil war, where propaganda and the use of social media to convey messages of hatred and incitement to violence and genocide are an integral part of the policy of annihilating the Tigrayan ethnic group and of subjugating by force the Oromo (Ethiopia's largest ethnic group), a policy conceived and promoted by the Eritrean dictator Isaias Afwerki and diligently applied by his Ethiopian associates: Prime Minister Abiy Ahmed Ali and the fascist Amhara leadership.

The regimes of Asmara and Addis Ababa have made communication on social media a priority, transforming it into a weapon of war. Numerous fake profiles have been created, along with propaganda coordination campaigns on social networks. The most active are #EthiopiaPrevail, managed directly by experts in Asmara, and #NoMore, run by a group of Amhara entrepreneurs from the American diaspora lavishly paid by the fascist regime in Addis Ababa.

The most striking example of this communication policy occurred at the end of October. Dejene Assefa, an Addis Ababa extremist known for his appearances on state television in Ethiopia, posted a message to his more than 120,000 followers on Facebook. The post urged his compatriots to rise up across the country and assassinate members of the Tigrinya ethnic group. “The war is against those you grew up with, your neighbour,” he wrote in Amharic. “If you can free your forest of these thorns … the victory will be yours.” The message was shared nearly 900 times and attracted over 2,000 reactions. Many of the responses echoed the call to violence and promised to heed Dejene’s advice.

“The content is some of the most terrifying I’ve ever seen anywhere,” commented Timnit Gebru, a former Google data scientist and a leading expert on bias in artificial intelligence, who is fluent in Amharic. “It was literally a clear and urgent call for genocide. This is reminiscent of what was seen on Radio Mille Collines in Rwanda.” Radio Télévision Libre des Mille Collines, a station created by Hutu extremists in Rwanda, broadcast calls for violence that helped trigger the country’s genocide in 1994.

Despite this blatant incitement to genocide against the Tegaru ethnic group, the post was never removed or hidden by Facebook. Dejene Assefa’s post is just the tip of the iceberg. Faced with this lack of reaction, the communication experts of the Ethiopian and Eritrean regimes concentrated their genocidal propaganda efforts on the Facebook platform, all the more so after the effective crackdown by Twitter, which deactivated several fake accounts and accounts spreading ethnic hatred, and blocked the spread of hashtags used by groups specialized in disinformation and ethnic propaganda such as #EthiopiaPrevail and #NoMore.

Facebook’s management recognizes the risks of disinformation and hate speech coming from Ethiopia and the Ethiopian diaspora but claims to have great difficulty controlling dangerous content, especially content written in Amharic. Mark Zuckerberg acknowledges that moderation capacity is insufficient because of language barriers that prevent Facebook’s moderation service from detecting dangerous messages and prevent users from reporting them.

To try to bridge the gap in its understanding of Ethiopian and Eritrean genocidal messages, the company proposed using “network-based models”, an opaque and experimental mechanism. Various experts have dismissed this decision as a simple and shameful trick, stating that an automatic translation program for Amharic, or programming the artificial intelligence responsible for screening content with the Amharic keywords typically used in messages of ethnic hatred and incitement, would be enough to stop this genocidal communication policy.
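
To make concrete what those experts are describing, the sketch below shows, in very simplified form, what keyword-based screening of (possibly machine-translated) post text could look like. The term list, data structures and matching rule are illustrative assumptions, not Facebook's actual system or any expert's specification; real moderation pipelines are far more elaborate.

```python
# Purely illustrative sketch: keyword-based flagging of (machine-translated) posts.
# The term list, Post structure and matching rule are assumptions for illustration,
# not Facebook's actual moderation system.

from dataclasses import dataclass

# Placeholder terms. In practice this would be a curated list of Amharic words and
# phrases commonly used in incitement (dehumanizing terms such as "weeds" or "cancer").
FLAGGED_TERMS = ["weeds", "cancer", "virus", "thorns", "bury them"]

@dataclass
class Post:
    author: str
    text: str  # post text, possibly machine-translated from Amharic

def is_flagged(post: Post) -> bool:
    """Return True if the post contains any flagged term (case-insensitive)."""
    text = post.text.lower()
    return any(term in text for term in FLAGGED_TERMS)

def review_queue(posts: list[Post]) -> list[Post]:
    """Collect matching posts for human review rather than deleting them outright."""
    return [p for p in posts if is_flagged(p)]

if __name__ == "__main__":
    sample = [
        Post("user_a", "Free your forest of these thorns"),
        Post("user_b", "A lovely morning in Addis Ababa"),
    ]
    for p in review_queue(sample):
        print(f"Flagged for review: {p.author}: {p.text!r}")
```

Even such a crude filter would require ongoing curation of the term list by Amharic speakers, which is precisely the capacity the critics say Facebook has not built.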

A year and a month after the outbreak of the civil war in Ethiopia, Facebook has still not solved the problem. On the contrary, it is now trying to ignore it and to dismiss the complaints of international human rights organizations.

Human rights groups have documented atrocities on both sides, but a joint report by the Office of the United Nations High Commissioner for Human Rights found abuses committed by government forces, including massacres of ethnic Tigrayans and the rape by armed men of what could amount to thousands of women. The US government is considering declaring the campaign a genocide.

In Ethiopia, where journalists have been jailed and state media censor all reports of abuses by state and allied forces, the government’s response has been backed by an army of activists and social media personalities who manufacture consent for the conduct of its forces. Many have a large following on Facebook, which has more than 6 million users in the country.

The NISS political police and the NSA telematic police (created in 2008 by Abiy Ahmed Ali himself) also use Facebook to identify journalists, human rights activists and anyone critical of the regime, labeling them as “traitors” and trying to establish their identities in order to launch punitive actions against them or reprisals against their relatives, friends and loved ones.

The dehumanizing language aimed at ethnic minorities, created by Amhara and Eritrean communication experts, has entered everyday use of the Amharic language and become the pillar of the messages disseminated through Facebook. The platform is used without restraint by the Nobel Peace Prize winner himself. Last July, faced with the TPLF and OLA offensive, a frustrated Abiy Ahmed posted a series of messages on Facebook vowing to crush the “Tigrinya cancer”, calling Tigrayans “mice”, “weeds” and a “virus”. Facebook did not react.

An incomprehensible inertia, given the sensation caused a year earlier by Facebook’s decision to take down a post by US President Donald Trump that incited racial hatred using graphic imagery recalling Nazi symbolism. In January 2021, Facebook went further, announcing the suspension of the former American president’s profile and excluding him from Instagram as well, because of the favourable comments Trump published about the assault on Capitol Hill.

Compared with the posts of the Ethiopian Premier, Trump’s look like the innocent musings of a mythomaniac. Yet Facebook did not send even the mildest invitation to moderation to the Ethiopian Premier. The fascist Amhara regime has interpreted this inaction as tacit complicity, concentrating its communication efforts on the Facebook platform in order to compensate for the restrictions and difficulties encountered on Twitter.

One year after the start of this communication experiment on Facebook, the international community agrees that the messages conveyed are not only a reflection of Ethiopia’s political environment but have contributed to the worsening of ethnic violence. In September, world public opinion learned of the genocidal plan against Tigray, of the ethnic cleansing of Tegaru and Oromo in Addis Ababa, and of the existence of at least four internment camps for Tegaru modeled on the Nazi example: real extermination camps to “uproot the weeds”. Despite this evidence, Facebook’s management continues to take no serious action to stop the campaign promoting genocide on its platform.

In November, Zecharias Zelalem, an Ethiopian freelance journalist who works with Al-Jazeera, Quartz, Addis Standard and Open Democracy, and Peter Guest, an editor at the news platform Rest of World, discovered the existence of two disinformation groups of the Ethiopian diaspora. The first, based in Egypt, is affiliated with the Amhara paramilitary militia FANO; the second, based in Sudan, with an Oromo terrorist group that calls for violence against the federal state and against the Amhara ethnic group. That group is recognized neither by the Oromo community nor by the Oromo Liberation Army, and is suspected of being a creation of the Eritrean dictator Isaias Afwerki used for military propaganda warfare.

Neither group, though devoted to spreading racial hatred and incitement to genocide, was shut down by Facebook. Their activities largely ended only after the intervention of the Egyptian and Sudanese cyber police.

“The content posted on Facebook has had a real impact on civilians in Ethiopia,” says Yohannes Ayalew, a former law professor at Bahir Dar University in Ethiopia and now a PhD candidate at Monash University in Australia. As an example, he pointed to the revenge campaign that swept Facebook in July 2020 following the state assassination of the Oromo activist and singer Hachalu Hundessa, a campaign that generated a wave of brutal violence among the Oromo in which hundreds of Amhara civilians and members of other minorities in the Oromia region were murdered. The Oromo political opposition and the Oromo Liberation Army struggled to stop this wave of Facebook-fuelled violence.

During testimony before a US Senate subcommittee last October, Frances Haugen, a Facebook whistle-blower, said the company’s failures in Ethiopia could match those in Myanmar, where UN officials said Facebook played a leading role in facilitating the military regime’s genocidal violence.

Internal documents show the reasons for the company’s failure. Facebook knows it does not have sufficient coverage of local languages to actively identify hate speech or calls for violence. It also collects few user reports to help identify problematic content, which it attributes to low digital literacy; yet its reporting interfaces remain confusing for Ethiopian users, and the lack of Amharic translation support persists a year after the start of these campaigns of violence and ethnic hatred.

These shortcomings were well known as early as June 2020, when Facebook employees examining the platform’s “signals” (the data collected from users and partners) said they found “significant gaps” in the countries most at risk, particularly Myanmar and Ethiopia. The employees showed that Facebook’s monitoring and moderation system was completely inadequate.

An internal Facebook report from October 2021 reveals that Ethiopia was not listed among the “level 1” risk countries and was an outlier, with the lowest tracking rate compared to other countries known to promote ethnic hatred, messages of violence or support for international terrorism. Level 1 designates countries whose governments actively and knowingly promote disinformation campaigns and encourage ethnic and political violence.

This finding dismantles Facebook’s defense based on language and translation difficulties. What we are looking at is a deliberate company policy aimed at protecting the Ethiopian regime and its Eritrean associates, allowing them to continue using Facebook to convey their extremely dangerous messages of violence.

Facebook, now in the crosshairs of worldwide criticism, has proposed a different approach to the problem: network-based moderation. Rather than using specific words or phrases to directly identify hate speech or disinformation, network-based moderation relies on identifying patterns of behavior consistent with malicious activity.
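
As a rough illustration of the general idea (not of Facebook's actual models, whose details are not public), one simple behavioral signal is coordination: many distinct accounts pushing near-identical content within a short time window. The toy sketch below, with made-up parameters and data, shows what detecting that single signal might look like.

```python
# Illustrative sketch only: flag groups of accounts that post near-identical
# content within a short time window (one crude "coordinated behavior" signal).
# This is not Facebook's actual network-based model, whose details are not public.

from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text: str) -> str:
    """Crude normalization so small edits still map to the same key."""
    return " ".join(text.lower().split())

def find_coordinated_clusters(posts, window=timedelta(minutes=30), min_accounts=5):
    """
    posts: iterable of (author, timestamp, text) tuples.
    Returns clusters of distinct accounts that posted the same normalized text
    within `window` of each other, if at least `min_accounts` are involved.
    """
    by_text = defaultdict(list)
    for author, ts, text in posts:
        by_text[normalize(text)].append((ts, author))

    clusters = []
    for key, items in by_text.items():
        items.sort()
        # slide over the time-sorted items and count distinct authors per window
        for i, (start, _) in enumerate(items):
            in_window = {a for t, a in items[i:] if t - start <= window}
            if len(in_window) >= min_accounts:
                clusters.append((key, sorted(in_window)))
                break
    return clusters

if __name__ == "__main__":
    now = datetime(2021, 11, 1, 12, 0)
    demo = [(f"acct_{i}", now + timedelta(minutes=i), "Bury the junta wherever they go")
            for i in range(6)]
    for text, accounts in find_coordinated_clusters(demo):
        print(f"Possible coordinated push of {text!r} by {len(accounts)} accounts")
```

Even this toy example hints at why the caution quoted below matters: thresholds and behavioral patterns tuned on US data may mean little in a different linguistic and political context.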

“However, network-based moderation is a particularly opaque form of content moderation, about which the company has released few details. This form of moderation is based on research and data from the platform, which it rarely shares with external researchers. Documents published as part of the Facebook Papers show that this approach is still experimental and that it is unclear whether its models work in the context of hate speech even in the United States, where the company is based and where it has the largest volume of data,” explains Evelyn Douek, a lecturer at Harvard Law School and an expert in social media moderation.

Internal Facebook documents fault the company for its unwillingness to understand the communication networks spreading ethnic hatred, whether those favouring white supremacists in the United States or those favouring brutal regimes in Ethiopia and Myanmar. A spokesperson for META, the recently renamed holding company that owns Facebook, said last November that the company had still not used the new network-based moderation protocol to stop malicious networks in Ethiopia. Adding insult to injury…

Even if that protocol goes into action, it is unlikely to work. “You cannot enter a market without language understanding or contextual understanding or political experience and expect this kind of moderation to be sufficient or prevent harm. It is simply not adequate,” Douek says.

Insider leaks reveal that Facebook encountered serious problems with the spread of regime-promoted ethnic hatred against Tigray and Oromia as early as 2019. A Facebook employee revealed that the dangers of ethnic hatred messages and their rapid multiplication on the social network, recorded from 2019 onwards, seem to be of no concern to Facebook, META or Mark Zuckerberg.

This demolishes the defense Zuckerberg attempted in September 2021, namely that his company was not ready for the “sudden” wave of hatred and promotion of violence originating from the regimes of Ethiopia and Eritrea. His promise to remedy the situation has likewise come to nothing.

On the contrary, there is indirect support for these campaigns of pure ethnic hatred and incitement to genocide. The most striking example occurred last September, when Facebook encouraged the dissemination of a post about an alleged terrorist attack attributed to another Ethiopian ethnic minority, the Qemant, which was said to have caused the death of dozens of Amhara civilians. The news was clearly false, and Facebook did not intervene despite the reports of inappropriate content and fake news sent by hundreds of users.

The day after the post was published, a Qemant village was attacked by FANO militias, sacked and set on fire. This time the victims were real. The fake news blaming the Qemant for massacres of civilians had served only to justify the war crime committed against this ethnic minority, which opposes Amhara domination of the country.

The Qemant are a small ethnic group from north-western Ethiopia who live in Gondar, in the Amhara region, and are related to the Agaw people of Ethiopia. The Qemant have been attacked several times by government forces and allied militias, and thousands of Qemant civilians have fled their homes to take refuge in neighbouring Sudan.

The ethnic hate messages broadcast on Facebook are undergoing a dangerous mutation. Beyond messages aimed at reinforcing hatred towards fellow citizens from Tigray and Oromia, Facebook is now used by the Amhara and Eritrean secret services to spread death sentences and to coordinate pogroms and ethnic cleansing.

In August, Miktar Oussman, a commentator on Ethiopian state-run media with over 210,000 followers on Facebook, posted a list of names of Tigrinya-born university professors to be eliminated. Two months later, two of the professors on the list were killed. Their assassination was celebrated by Miktar on Facebook with the terrifying phrase: “Two less”.

Last September, META’s Director of Public Policy for East and Horn of Africa, Mercy Ndegwa, tried to address the growing criticism of Facebook and the suspicions of connivance and silence in favour of the Ethiopian and Eritrean regimes, stating that the company was strengthening its efforts for communication security in Ethiopia, aware that the risk of messages turning into concrete actions on the ground is currently very high. Ndegwa assured the tens of thousands of users scandalized by this passivity, and international public opinion, that META and Facebook would take effective action to meet human rights protection needs.

Ndegwa’s assurances turned out to be red herrings. In October, an Ethiopian university researcher reported a series of genocide-inciting government posts on Twitter and Facebook, calling for their removal. While Twitter intervened immediately, Facebook sent the following communication to the owners of the profiles involved: “Your post was reported as offensive and in violation of the internal code of conduct. After careful analysis we inform you that your post does not violate the standards of the Facebook Community.” Only later did META intervene to delete the offending posts.

Unfortunately, by then the posts had been seen by millions of people and had garnered over 50,000 shares. They contributed to the collaboration between particularly extremist Amhara citizens, the political police and Eritrean troops in the hunt for Tigrayans unleashed in Addis Ababa. They also encouraged a few thousand unemployed people and social outcasts to join the death militias invented by Abiy to defend the capital, modestly named “Urban people’s self-defense committees”.

In early November, Facebook removed a post by Premier Abiy Ahmed for the first time: a post calling on citizens to rise up and “bury” the rebels, which was found to violate Facebook’s rules on inciting violence. Analysts called it an important step, but the day after Abiy’s post was removed, the Mayor of Addis Ababa, Adanech Abiebie, went on Facebook to applaud the volunteers leading the roundups and neighborhood violence across the city, adding: “No doubt the men in the junta [a term used to refer to the Tigray rebels] will be buried wherever they go!” This post has never been removed from the platform.

“I have the distinct feeling that Facebook is paying only lip service to moderation because of the barrage of criticism coming from all sides. In reality, Facebook continues its incomprehensible policy of silence and complicity in favour of the Ethiopian government,” said professor Yohannes Ayalew of Monash University, one of the most prestigious universities in Australia.

In early December, Facebook declared that Ethiopia had been designated a temporary high-risk location for the spread of ethnic hatred, promising to remove all posts promoting violence and disinformation. META immediately moved on to concrete acts: it added the Oromo Liberation Army to an internal blacklist, blocking posts that contain any reference to the OLA.

Strangely, however, it has not yet taken serious action against the criminal genocidal communication networks run by the Eritrean and Ethiopian secret services, nor has it imposed any moderation or limits on the ethnic and violent messages published on the official accounts of the Ethiopian government or on the private accounts of its members, of Prosperity Party cadres, and of Amhara men and women from the entertainment world. A definitive stance by Zuckerberg that clarifies his political orientation in the Ethiopian civil war.

Facebook appeals to its neutrality as the free provider of a social service. Evidently Zuckerberg has never heard the words of the South African Archbishop Desmond Tutu, who passed away this Christmas: “If you are neutral in situations of injustice, you have chosen the side of the oppressor.”

Written by Fulvio Beltrami, freelance journalist, Africa.

The duty of a journalist is to write down the truths which the powerful keep secret. Everything else is propaganda. Italian journalist and economic migrant in Africa.
