Facebook Extends Fact-Checking Program To 10 More African Countries
In 2018, Facebook partnered with Africa Check, the first independent fact-checking organization on the continent, to begin reviewing content in South Africa, Kenya, Nigeria, Senegal and Cameroon.
The newly expanded fact-checking effort is made possible through partnerships between Facebook and respected media firms.
Facebook’s third-party fact-checking services are now available in the following countries in partnership with these media companies:
- Ethiopia – AFP
- Zambia – AFP
- Somalia – AFP
- Burkina Faso – AFP
- Uganda – PesaCheck
- Tanzania – PesaCheck and AFP
- The Democratic Republic of Congo – France 24 Observers and AFP
- Cote d’Ivoire – France 24 Observers and AFP
- Guinea – France 24 Observers
- Ghana – Dubawa
Facebook says that local posts and articles will be fact-checked and photos and videos verified.
In addition to English, Facebook says it will now review African content in Hausa, Yoruba, and Igbo in Nigeria, Swahili in Kenya, Wolof in Senegal, and Afrikaans, Zulu, Setswana, Sotho, Northern Sotho and Southern Ndebele in South Africa, The Citizen reports.
Fact-checking, and then what?
While Facebook publicizes its efforts to control fake news and misinformation, the policy of the social media giant is to limit the exposure of false news posts rather than to remove them altogether.
Facebook relies on feedback and reporting from users to flag content that should be reviewed.
If one of Facebook’s fact-checking partners identifies a story as false, Facebook will show it lower in the news feed, which it says “significantly reduces its distribution.”
Facebook says it allows people to post fake news content as a form of expression, but it will not show that content at the top of the news feed, according to the BBC.
The social media firm deletes content only if it is deemed to violate Facebook’s rules against graphic violence or nudity.