A video link posted on Facebook on June 20 showed a man cooking human body parts in a pot over a wood fire.
In Cameroon, the footage went viral. Some Facebook users said the man was a cannibal and that the video was shot in the country's English-speaking west, where separatist insurgents are fighting to create a breakaway state.
Local websites quickly debunked this notion. The man in the video was not a separatist fighter or cannibal, and the body parts were not real. The clip was taken on a Nigerian film set and uploaded to Instagram on June 17 by make-up artist Hakeem Onilogbo, who uses the platform to showcase his work.
But the video's rapid spread raises questions about Facebook's ability to police millions of posts each day and crack down on hate speech in a country where internet use is rising fast, social media are used for political ends and the company has no permanent physical presence.
The day the link was posted on Facebook, a member of the government brought the video to the attention of international diplomats in the capital, Yaounde, via the WhatsApp messaging service, according to messages seen by Reuters.
Five days later, Cameroon's minister for territorial administration cited it as justification for an army clampdown against the secessionists that was already under way in the Anglophone regions.
The minister, Paul Atanga Nji, compared the rebellion, which stems from decades of perceived marginalisation by the French-speaking majority, to an Islamist insurgency waged by the Nigeria-based militant group Boko Haram which has killed 30,000 people.
"Boko Haram committed atrocities, but they did not cut up humans and cook them in pots," the minister said in comments broadcast on state television and widely reported in Cameroon.
Nji did not respond to requests for comment. Government spokesman Issa Tchiroma Bakary said that in future the government would work to verify information before commenting.
Facebook said the video had not been reported by users and that it could not comment further on the clip. It was no longer available on the site by late October.
A senior Facebook official said tackling misinformation in Cameroon was a priority for the company, which acknowledges more needs to be done.
"We're prioritising countries where we've already seen how quickly online rumours can fuel violence, such as Myanmar and Cameroon," said Ebele Okobi, Director of Africa Public Policy at Facebook.
UNDER FIRE
Facebook is under fire for carrying misleading information, including in the United States and Britain, and over posts against the Muslim Rohingya minority in Myanmar which have had deadly consequences.
Sri Lankan authorities briefly banned Facebook this year because the government said it was fuelling violence between Buddhists and Muslims. In India, messages on Facebook-owned WhatsApp have been linked to attacks on religious minorities.
In Cameroon, Facebook has been used both to incite violence and to make threatening posts.
Simon Munzu, a former United Nations representative, said he was the target of death threats on Facebook after it was announced in July that he would help organise negotiations in the separatist conflict. Afraid, Munzu went to stay with friends.
Facebook removed the posts in October, after it was made aware of them by Reuters, saying they violated company standards.
Esther Omam, who runs a non-governmental organisation (NGO) called Reach Out, told Reuters she hid in a church and then fled to the Francophone region after receiving death threats from separatists over a peace march she led.
"The crisis has destroyed my life and my family," she said. "I cannot work anymore. My family is divided. My husband is elsewhere, my children are elsewhere."
Facebook has no staff operating permanently in Cameroon and says it monitors the country from Britain and the United States. It has an Africa-focused team that frequently visits the region, and has partnered with NGOs and civil society in Cameroon in recent months to combat hate speech.
This included paying several thousand dollars to civil society to help organise training sessions for journalists to spot falsehoods online, representatives from two groups involved told Reuters. Some groups also flag offensive posts to Facebook.
Facebook has removed pages and accounts related to the separatist conflict, and is working to slow the spread of kidnapping videos, the company said.
It declined to say how many people it had helping it in Cameroon, how much money it had so far invested or how many posts it had taken down.
Reuters found dozens of posts from recent months showing graphic images in Cameroon, some of them months old.
One Facebook user on July 18 posted a picture of the decapitated body of a Cameroonian policeman lying in a gutter, and said the image gave him joy.
The same day, separatist spokesman Ivo Tapang applauded the killing of two Cameroonian soldiers and linked to a website raising funds for guns, ammunition and grenade launchers. Tapang did not respond to requests for comment.
A Facebook spokeswoman said the company was unaware of the posts before Reuters pointed them out but that they were both removed after review. It is against Facebook rules to celebrate suffering or crowdfund for arms, she said.
Facebook has artificial intelligence that it uses globally to detect problematic posts. But in Cameroon, it does not have a consortium of fact-checking companies to monitor posts, as it does in the United States.
Leading civil society figures in Cameroon say Facebook needs more resources and faces an increasingly difficult task as internet use grows.
"It is not possible to stop misinformation on Facebook," said Maximilienne Ngo Mbe, executive director of REDHAC, a civil society group that has organised training sessions and flags indecent posts to Facebook.
NO EASY FIX
The number of people with internet access in Cameroon rose from 0.86 million in 2010 to 5.9 million in 2016, about a quarter of the population, according to the International Telecommunication Union, a U.N. agency.
The government shut down the internet in English-speaking regions for three months last year because of the unrest.
After service resumed in April 2017, Facebook was the main outlet for people speaking out against the army crackdown, in which soldiers razed villages and shot dead unarmed civilians.
But misleading and hateful posts have persisted, groups that monitor posts say, echoing issues Facebook sees worldwide.
Facebook is not the only service facing a battle to tackle misinformation and hate speech. Offensive videos and images are posted on Twitter or transmitted by WhatsApp.
WhatsApp cannot view private, encrypted conversations, a WhatsApp spokeswoman said, so detecting hate speech there is harder. A Twitter spokeswoman said it prohibits the promotion of violence and encourages users to flag those posts.