The 5 most important revelations from the ‘Facebook Papers’


A fuller picture of how acutely Facebook was aware of its negative consequences emerged on Monday, during Frances Haugen’s testimony before the British Parliament and through a series of reports based on internal documents, now known as “The Facebook Papers.” During the 2.5-hour question-and-answer session, Haugen said Facebook puts “growth over safety.”

The following are the most shocking revelations from the Oct. 25 testimony and the internal documents.

Facebook fails to moderate harmful content in developing countries

The records, authenticated by various news outlets, show that problems with hate speech and misinformation are far worse in poorer countries, where content moderation is generally weaker. According to one document, the United States receives 87 percent of Facebook’s worldwide budget for time spent classifying misinformation, while the rest of the world receives just 13 percent, even though North American users account for only 10 percent of its daily users.

AI fails to accurately detect dangerous content in non-English languages

According to the Facebook Papers, Facebook’s algorithm wrongly banned a hashtag referencing the Al-Aqsa Mosque in Jerusalem’s Old City, mistaking it for a reference to the militant group Al-Aqsa Martyrs Brigade, an armed affiliate of the secular Fatah party. “Facebook says things like, ‘we support 50 languages,’ when in reality, most of those languages get a tiny fraction of the safety systems that English gets,” Haugen told British lawmakers.

It labels election misinformation as “harmful, non-violating” content

Internal documents confirm that during the 2020 presidential election, Facebook employees raised numerous red flags about misinformation, but corporate officials did little to address the issues. While the documents demonstrate Facebook’s awareness of election misinformation, they do not reveal the company’s decision-making process for labeling it “harmful, non-violating” content.

Facebook was aware of maid selling on its platform

According to internal documents obtained by the Associated Press, Facebook admitted to “under-enforcing on confirmed abusive behavior” when it neglected to take action after Filipina maids complained of being abused and sold on the network. “In our investigation, domestic workers frequently complained to their recruitment agencies of being locked in their homes, starved, forced to extend their contracts indefinitely, unpaid, and repeatedly sold to other employers without their consent,” one Facebook document reads.

It internally debated removing the Like button

According to the documents, the Like button caused “stress and anxiety” among the platform’s youngest users when their posts didn’t receive many likes from friends. Yet when the button was removed in tests, users interacted less with posts and ads, and the change failed to reduce social anxiety as the company had hoped.
