Facebook in India: Why is it still allowing hate speech against Muslims?

daytrader

Facebook's algorithm helped fuel the viral spread of hate and violence during Ethiopia's civil war, a legal case alleges.

Abrham Meareg, the son of an Ethiopian academic shot dead after being attacked in Facebook posts, is among those bringing the case against Meta.

They want a $2bn (£1.6bn) fund for victims of hate on Facebook and changes to the platform's algorithm.

Meta said it invested heavily in moderation and tech to remove hate.

A representative said hate speech and incitement to violence were against the platform's rules.

"Our safety-and-integrity work in Ethiopia is guided by feedback from local civil society organisations and international institutions," the representative said.

Famine-like conditions
The case, filed in Kenya's High Court, is supported by campaign group Foxglove.

Meta has a content moderation hub in the Kenyan capital, Nairobi.

Hundreds of thousands of people have died in the conflict between the Ethiopian government and forces in the northern Tigray region, with 400,000 others living in famine-like conditions.

Last month, a surprise peace deal was agreed - but recently, there has been an upsurge in ethnically motivated killings between Amhara- and Oromo-speaking communities.

Last year Mr Meareg's father became one of the casualties of the country's violence.

On 3 November 2021, Prof Meareg Amare Abrha was followed home from his university by armed men on motorbikes and shot at close range as he tried to enter the family home.

Threats from his attackers prevented witnesses from coming to his aid as he lay bleeding, his son says. He died on the ground seven hours later.

Before the attack, Facebook posts slandered and revealed identifying information about him, his son says.

Despite repeated complaints using Facebook's reporting tool, the platform "left these posts up until it was far too late".

One was removed after his father's killing.

Another, which the platform had committed to remove, remained online as of 8 December 2022.

'Woefully inadequate'
"If Facebook had just stopped the spread of hate and moderated posts properly, my father would still be alive," Mr Meareg said.

He wanted to ensure no family suffered as his had, he said, and "a personal apology" from Meta.

In a sworn statement filed with the court, Mr Meareg alleges Facebook's algorithm promotes "hateful and inciting" content because it is likely to draw more interaction from users.

He also claims Facebook's content moderation in Africa is "woefully inadequate", with too few moderators dealing with posts in the key languages Amharic, Oromo and Tigrinya.

Meta - which owns Facebook - told BBC News: "We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya."

It maintains Ethiopia is a high priority - although less than 10% of the population uses Facebook - and says the steps it has taken include:

  • reducing posts' virality
  • expanding violence and incitement policies
  • improving enforcement

This is a significant development - an attempt to take a social-media company to task, in court, over its actions during the conflict in Ethiopia.

Critics say Meta and other social-media companies do too little to prevent the sharing and spread of disinformation and content promoting hate and incitement against various ethnic groups. In some cases, they say, it takes too long for content to be removed - and usually only after people report it.

Others claim the company has been unfair in its crackdown on hateful content, targeting posts written in some languages disproportionately.

Meta has always insisted it does a lot and has heavily invested in capacity to catch hateful and inflammatory content in the languages spoken most widely in the country.

It does have content moderators conversant with the main local languages, but it also relies on artificial intelligence and local partners to flag content. How many of Meta's moderators focus on Ethiopia has never been clear.

But this is not the first time Facebook has been accused of doing too little to stop the spread of content promoting ethnic hate and violence in Ethiopia.

In 2021, whistleblower Frances Haugen, a former employee, told the US Senate the platform's algorithm was "fanning ethnic violence... picking up the extreme sentiments, the division" as those posts attracted high engagement, while Facebook could not adequately identify dangerous content, and lacked sufficient expertise in many local languages - including some spoken in Ethiopia.

Other plaintiffs in the case include the Katiba Institute and Fisseha Tekle, who alleges Facebook's moderation failures made his human-rights reporting on the conflict, for Amnesty International, impossible - and risked his family's lives.

They are asking the court to order Facebook to take steps to remedy the situation, including:

  • creating a restitution fund of about 200bn Kenyan shillings (Ksh) ($1.6bn) for victims of hate and violence incited on Facebook and a further 50bn Ksh for similar harm from sponsored posts
  • preventing its algorithm recommending "inciteful, hateful and dangerous content"
  • employing enough moderators to translate local content, ensuring equity between the moderation in Nairobi and that for US users


Image caption (BBC): Critics accuse Facebook's parent company Meta of not moderating its social media platform effectively - with disastrous consequences


The head priest was clear about what had to be done. “I want to eliminate Muslims and Islam from the face of the Earth,” he declared. His followers listened, enraptured. Delivered in October 2019, the speech by Yati Narsinghanand, head of the Dasna Devi temple in the north Indian state of Uttar Pradesh, was filmed and posted on Facebook. By the time Facebook removed it, the tirade had been viewed more than 32 million times.

The priest spelled out his vision more clearly in a speech posted on Facebook in the same month, which has received more than 59 million views. “As long as I am alive,” he promised, “I will use weapons. I am telling each and every Muslim, Islam will be eradicated from the country one day…”

Three years after it was delivered, the speech can still be viewed on Facebook. Meta, as Facebook’s parent company has been known since October 2021, failed to explain why when asked by Middle East Eye.

This, experts say, is the language of genocide.

In March 2021, controversy erupted in India after a 14-year-old Muslim boy entered Narsinghanand’s temple to drink water and was brutally attacked by his followers.

In the wake of the public furore, The London Story, a Netherlands-based organisation which counters disinformation and hate speech, reported that interactions on the preacher’s fan pages rose sharply.

Narsinghanand has done little to present himself as peaceful and tolerant: violent and chauvinistic imagery is central to his brand, as it is for many who share his views. His targets are never in doubt: India's 200 million-plus Muslims. Facebook photos from December 2019 showed him and his followers brandishing machetes and swords.

Read more on

https://www.middleeasteye.net/big-story/facebook-meta-india-muslims-allow-hate-speech
 