The word, which translates as ‘martyr’, accounts for more content removals on the company’s platforms than any other word.
Meta's oversight board says it will review the moderation of the Arabic word "shaheed", which means "martyr" in English, because it accounts for more content removals on the company's platforms than any other single word or phrase.
Thomas Hughes, director of oversight board administration, said on Thursday that this was “a complex moderation issue” that “impacts how millions of people express themselves online”.
Hughes said the high number of content removals raised questions about “whether Muslim and Arabic-speaking communities are subject to over-enforcement of their content because of Meta’s enforcement practices”.
“Shaheed” has multiple meanings in Arabic, including that of “witness” to an event, and is often used to refer to people who have died in sacrifice to a sacred cause.
Meta policy prohibits praise, support or representation of entities or people designated as dangerous or placed on “terrorism” lists, including a number of Palestinian groups opposing Israel’s decades-long occupation.
Meta, whose services include Facebook and Instagram, has asked the oversight board for advice on whether it should continue to treat "shaheed" as praise, removing posts that use the term to refer to individuals designated as dangerous, or whether it should take a different approach, the board said.
Moderating the word could have an impact on news reporting in Arabic-speaking countries, the board noted, and called for public comments to assist with its deliberations.
The oversight board was created in late 2020 to review Facebook’s and Instagram’s decisions on taking down or retaining certain content and make rulings on whether to uphold or overturn the social media company’s actions.
The company has been criticised for failing to police abusive content in countries where such speech has been likely to cause the most harm, but the board’s latest case suggests overpolicing could also be a problem.
Digital rights of Palestinians
In September, a report produced by an independent consulting firm commissioned by Meta found that over-enforcement had significantly disproportionate consequences for the digital rights of Palestinians and Arabic-speaking users.
The report found that Meta’s practices violated Palestinians’ right to freedom of expression and assembly, political participation and non-discrimination.
Twitter, controlled by Elon Musk, has also come under fire for censoring Palestinian public figures.
The Washington bureau chief for the Jerusalem-based Al-Quds, one of the most widely read Palestinian daily newspapers, had his account suspended.
Asked whether he thought his suspension from Twitter was related to his outspokenness about Palestine, Said Arikat told Al Jazeera: "I believe it does. I can't think of any other reason."
The reasons offered by the platform included violations of community standards, and some accounts were said to have been suspended by mistake or as a result of technical glitches. Some critics believe unspoken reasons include a general increase in hate speech and incitement against Arabs, including Palestinians.