In the run-up to Kenya's presidential elections, a conflict has flared up between the country's government and a well-known social network. Rising political activism split the population: residents began posting campaign material urging votes for their candidates, and as a result some Kenyan tribes started attacking one another.
While the Kenyan government accuses Facebook of inaction, the social network published an article describing its active fight against inappropriate political content in Kenya. Two human rights organizations, Global Witness and Foxglove, decided to test Facebook by submitting ads in English and Swahili calling for the extermination of Kenyans along ethnic lines. Both were approved by the social network.
The organizations published a report with the results of their test, to which Facebook immediately responded with another article. It said the company uses artificial intelligence and user reports to block unwanted content, but admitted the system is imperfect and claimed it is constantly being improved.
The human rights organizations did not abandon their attempts to post the propaganda, and every submission was approved again. After that, the NCIC stated that Facebook was violating the law and threatened to block the social network in Kenya. Faced with this ultimatum, Meta once again promised to step up its efforts to block inflammatory political content.
In the end, the Kenyan government announced that it would not block the social network in the country. The authorities confirmed that Meta had blocked some 37,000 posts that were fueling political conflict.
Do you use Facebook often? Have you noticed the social network letting violent posts through, or blocking neutral information?