Facebook's 'double standard' on hate speech against Russians

Meta logo is placed on a Russian flag in this illustration taken February 26, 2022. REUTERS/Dado Ruvic/Illustration

What’s the context?

Meta's decision to allow hate speech against Russians is troubling and can impact other conflict areas, say experts.

  • Meta decision on hate speech against Russians "not acceptable"
  • Meta calls it a temporary change in unprecedented circumstances
  • Discrimination by Big Tech is "a feature not a bug"

By Rina Chandran and Maya Gebeily

BANGKOK/BEIRUT - Facebook's decision to allow hate speech against Russians due to the war in Ukraine breaks its own rules on incitement, and shows a "double standard" that could hurt users caught in other conflicts, digital rights experts and activists said.

Facebook owner Meta Platforms will temporarily allow Facebook and Instagram users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion, Reuters reported last week.

It will also allow praise for a right-wing battalion "strictly in the context of defending Ukraine", in a decision that experts say demonstrates the platform's bias.


The move represents a "glaring" double standard when set against Meta's failure to curb hate speech in other war zones, said Marwa Fatafta at digital rights group Access Now.

"The disparity in measures in comparison to Palestine, Syria or any other non-Western conflict reinforces that inequality and discrimination of tech platforms is a feature, not a bug," said Fatafta, policy manager for the Middle East and North Africa.

"Tech platforms have a responsibility to protect their users' safety, uphold free speech, and respect human rights. But this begs the question: whose safety and whose speech? Why were such measures not extended to other users?" she added.

Last year, hundreds of posts by Palestinians protesting evictions from East Jerusalem were removed by Instagram and Twitter, which later blamed technical errors.

Digital rights groups slammed the censorship, urging greater transparency on how moderation policies are set and ultimately enforced.

One policy for all?

Facebook has come under fire for failing to curb incitement in conflicts from Ethiopia to Myanmar, where United Nations investigators say it played a key role in spreading hate speech that fuelled violence against Rohingya Muslims.

"Under no circumstance is promoting violence and hate speech on social media platforms acceptable, as it could hurt innocent people," said Nay San Lwin, co-founder of advocacy group Free Rohingya Coalition, who has faced abuse on Facebook.

"Meta must have a strict policy on hate speech regardless of the country and situation - I don't think deciding whether to allow promoting hate or calls for violence on a case-by-case basis is acceptable," he told the Thomson Reuters Foundation.

Scrutiny of how Meta tackles abuse on its platforms intensified after whistleblower Frances Haugen leaked documents showing the problems Facebook encounters in policing content in countries that pose the greatest risk to users.

In December, Rohingya refugees filed a $150 billion class-action complaint in California, arguing that Facebook's failure to police content and its platform's design contributed to violence against the minority group in 2017.

Meta recently said it would "assess the feasibility" of commissioning an independent human rights assessment into its work in Ethiopia, after its oversight board recommended a review.

Ukraine exception

In a report on Wednesday, Human Rights Watch said tech firms must show that their actions in Ukraine are "procedurally fair," and avoid any "arbitrary, biased, or selective decisions" by basing them on clear, established, and transparent processes.

In the case of Ukraine, Meta said that native Russian and Ukrainian speakers were monitoring the platform round the clock, and that the temporary change in policy was to allow for forms of political expression that would "normally violate" its rules.

"This is a temporary decision taken in extraordinary and unprecedented circumstances," Nick Clegg, president of global affairs at Meta, said in a tweet, adding that the company was focused on "protecting people's rights to speech" in Ukraine.

Russia has blocked Facebook, Instagram and Twitter.

Meta's new tack underlines how hard it is to write rules that work universally, said Michael Caster, Asia digital programme manager at Article 19, a human rights organisation.

"While the policies of a global corporation should be expected to change slightly from country to country, based on ongoing human rights impact assessments, there also needs to be a degree of transparency, consistency and accountability," he said.

"Ultimately, Meta's decisions should be shaped by its expectations under the U.N. Guiding Principles on Business and Human Rights, and not what is most economical or logistically sound for the company," he said in emailed comments.

Unilateral decision

For Wahhab Hassoo, a Yazidi activist who has campaigned to hold social media firms accountable for failing to act against Islamic State (ISIS) members using their platforms to trade Yazidi women and girls, Facebook's moves are deeply troubling.

Hassoo's family had to pay $80,000 to buy the release of his niece from the jihadists, who abducted her in 2014 then offered her "for sale" in a WhatsApp group.

"I am shocked," said Hassoo, 26, of Meta's decision to allow hate speech against Russians.

"When they can make certain decisions unilaterally, they can basically promote propaganda, hate speech, sexual violence, human trafficking, slavery and other forms of human abuse related content - or prevent it," he said.

"The last part is still missing."

Hassoo and fellow Yazidi activists compiled a report that urged the United States and other nations to probe the role social media platforms including Facebook and YouTube played in crimes against their minority Yazidi community.

Meta's actions on Ukraine confirm what their research showed, said Hassoo, who resettled in the Netherlands in 2012.

"They can promote or ban what fits in their interests and what they find important," Hassoo said. "It is not fair that a company can decide on what's good and what's not."


Context is powered by the Thomson Reuters Foundation Newsroom.
