Following parts 1 and 2 of this article, The Facebook Papers: Internal Documents Leaked, Part 3 covers more of what the leaked Facebook papers reveal.
Struggles of Facebook
Facebook has been struggling to overcome the negative effects associated with its “like” button, its share button, and its groups feature. The New York Times reports that, according to internal documents, the company studied several of its key features and the ways they could be harmful:
The researchers’ findings were often not positive. They repeatedly found that users misused key features or that those features amplified toxic content, among other effects. In an internal memo, researchers said that Facebook’s “core product mechanics,” meaning the basic way the platform works, allowed misinformation and hate speech to flourish on the site.
“The mechanics of our platform are not neutral,” the authors concluded.
The Times also notes that, while the internal documents do not show how Facebook responded to the findings, much of its core experience remains unchanged: “Many significant changes to the platform were shelved in the interest of growth and keeping users engaged, current and former executives have said.”
Facebook uses an opaque tiered system to decide which countries receive the most harm-prevention resources
The Verge’s Casey Newton explains that one striking aspect of the leaked documents is “the substantial differences in the content moderation options available to different countries, based on criteria that aren’t publicly available or subject to external scrutiny”:
The United States, India, and Brazil were placed in “tier zero,” the highest priority. Facebook set up “war rooms” to monitor the network continuously and built dashboards that analyzed network activity and alerted local election officials to problems. Germany, Indonesia, Iran, Israel, and Italy were placed in tier one.
Tier-one countries would receive similar resources, minus some resources for enforcing Facebook’s rules and for alerts outside the election period. Facebook added 22 countries to tier two; they would go without war rooms, which Facebook also calls “enhanced operations centers.”
The remaining countries were placed in tier three. Facebook would review election-related material only if content moderators escalated it; otherwise, it would not intervene.
Issues that Facebook didn’t fix
Facebook did not fix its language gaps, leaving harmful content unchecked in many countries
The Associated Press reports that the internal documents show Facebook did not allocate sufficient resources to combat hate speech and incitement to violence in many nations around the globe, and that it knew it:
A review of the files shows that in some of the world’s most volatile regions, terrorist content and hate speech thrive because the company lacks moderators who are fluent in local languages and understand the cultural context. Its platforms have also failed to develop artificial intelligence tools that can detect harmful content in those languages.
Human trafficking on Facebook
The company has been aware that human traffickers use its platforms since at least 2018, the documents reveal.
Facebook documents show women trafficked in this way being subjected to physical and sexual abuse, deprived of food and pay, and having their travel documents confiscated to prevent them from fleeing.
An internal Facebook report this year acknowledged that “gaps remain in our detection of on-platform entities engaged in domestic servitude” and detailed how the company’s platforms are used to recruit, buy, and sell what Facebook’s documents call “domestic servants.”
Apple’s ultimatum to Facebook
Apple threatened to block Facebook over maids being bought and sold on the platform. In 2019, Apple threatened to remove Facebook and Instagram from its App Store over reports of maids being traded on Facebook in the Middle East.
CNN reports that, according to the internal files, Facebook employees “rushed to take down problematic content and make emergency policy changes” to avoid what they described as potentially serious consequences for the company.
The Associated Press adds that Facebook admitted to its internal staff that it had been “under-enforcing the law on suspected abuse.” The problem persists, according to the AP:
Facebook’s crackdown appears to have had limited effect. Even now, a quick search for “khadima,” or “maids” in Arabic, brings up accounts with photographs of African and South Asian women, with ages and prices listed alongside their photos.
This is the case even though the Philippine government has a team of workers who scan Facebook posts every day to protect desperate job seekers from criminal gangs and unscrupulous recruiters on the platform.
Crackdown on Vietnam
Mark Zuckerberg sided with an authoritarian crackdown in Vietnam.
Confronted with an ultimatum from Vietnam’s autocratic government to restrict posts and pages that oppose the government or be forced to end operations in the country, the Facebook CEO bowed to the autocrats. The Washington Post reports that ahead of an election in Vietnam, Zuckerberg personally authorized compliance with the government’s demands.
As a result, “Facebook dramatically increased censorship of posts deemed anti-state, giving the government near-total control over the platform, according to local activists and free speech advocates.” The country has strict rules about opinions expressed on social media, and authorities routinely arrest and prosecute those who violate them.
Zuckerberg’s decision highlights how Facebook’s commitment to free speech varies sharply from country to country. It also illustrates the crucial role the platform plays in spreading information around the globe, an aspect often overlooked in discussions of the company’s American operations.
QAnon content recommended to a conservative user in an internal test
It took just two days for Facebook to begin recommending QAnon content to a conservative user in an internal test. In 2019 and 2020, Facebook researchers created fake user accounts on the platform to study how its recommendation systems surfaced misinformation and polarizing content.
One test user, created in the summer of 2019, was “Carol Smith,” a conservative mother from North Carolina interested in parenting, politics, and Christianity.
Within two days, Facebook began recommending QAnon groups to her, and the recommendations continued even though the test user did not join any of the recommended groups.
In a report titled “Carol’s Journey to QAnon,” the researcher concluded that Facebook served the test user “a flood of graphic, violent, and conspiratorial information.” Facebook has since removed QAnon groups from its platform, but NBC News reports: “The research consistently found that Facebook pushed some users into ‘rabbit holes,’ increasingly narrow echo chambers in which the most violent conspiracy theories flourished. People radicalized through these rabbit holes make up a small portion of all users, but at Facebook’s scale, that can mean millions of people.”
The researcher left the company in 2020, citing Facebook’s slow response to the growth of QAnon in her resignation letter.
Warning signs that raised alarms
Warning signs raised alarms following the November 3, 2020 presidential election.
On November 5, 2020, a Facebook employee told colleagues that election misinformation was circulating in the comments on posts, and that the most harmful of these comments were being amplified to the top of comment threads.
On November 9, a Facebook data scientist told colleagues that roughly 10 percent of all U.S. views of political content on the platform, about one in every 50 views on Facebook overall, were of posts claiming the election was fraudulent. He said there was “also an element of violence-related incitement” in the posts, according to the New York Times.
Facebook rules do not stop the growth of Stop the Steal groups
Facebook’s rules and enforcement did not stop the growth of Stop the Steal groups
The leaked documents show that Facebook rolled back some of the safeguards it had put in place to combat misinformation before and after the 2020 election. The New York Times reports that, according to three former employees, Facebook, wary of a user backlash, began relaxing some of those safeguards in November.
The company also disbanded its 300-member Civic Integrity team in December, just as the Stop the Steal movement was gaining momentum, especially on Facebook.
Some Stop the Steal Facebook groups grew faster than any other groups on Facebook had up to that point, and it was clear the groups’ organizers were working deliberately to evade Facebook’s moderation processes.
The documents, which include internal postmortems written by employees, show that Facebook failed to address the movement as a whole and therefore did not do all it could to slow the expansion of Stop the Steal on the platform. It was left scrambling to take emergency measures on January 6, the day the movement turned into an insurrection at the U.S. Capitol.
Facebook employees slammed the company during an internal debate
On January 6, after Facebook CEO Mark Zuckerberg and CTO Mike Schroepfer posted messages on Facebook’s internal discussion forum condemning the assault on the Capitol, some employees responded with anger. Their posts included:
“I came to this place hoping to effect change and make a difference in society, but all I’ve witnessed is decline and the abdication of accountability.”
“Rank-and-file workers have done their part to identify ways to improve our platforms, but they have been prevented from doing so.”
“This isn’t a new issue. We’ve been watching the behavior of politicians like Trump and the, at best, naive decisions of company leadership for years. We’ve been reading ‘farewell’ posts from reliable, knowledgeable, and beloved colleagues who say they can’t imagine staying at a company that does not act to reduce the negative impact of its platforms.”
“We’ve been fueling this fire for a long time, and we shouldn’t be surprised that it’s now out of control.”
“I wish I felt otherwise, but it’s not enough to say that we’re adapting, because we should have made these changes long ago. There were numerous Stop the Steal groups active as of yesterday, and I doubt they were discussing their plans lightly.”
The leaks provide an incomplete view of the events that took place
The New York Times highlighted in its Friday report:
What the documents do not offer is a complete picture of decision-making inside Facebook. Some internal studies suggest the company struggled to rein in the size of its network and the speed at which information spread, while others suggest Facebook worried about losing engagement or damaging its reputation. What is clear, according to the reports, is that Facebook’s own employees believed the company could do better.
Facebook is in a mess in India
Facebook is in a mess in India, where it has struggled against misinformation, hate speech, and other harmful content
Facebook’s largest user base is in India, where 340 million people use at least one of its platforms. But according to the New York Times, the leaked documents “provide clear evidence of one of the gravest critiques made by human rights advocates and political leaders against the world-spanning company: it moves into a country without fully understanding its potential impact on local politics and culture, and fails to deploy the resources needed to address problems once they occur.”
One leaked document revealed that only 13 percent of Facebook’s global budget for identifying misinformation was allocated to countries outside the U.S., even though 90 percent of Facebook’s users live abroad. (Facebook told the Times the figures were incomplete.)
The documents show that Facebook is struggling in India to stop the spread of misinformation, hate speech (including anti-Muslim content), and celebrations of violence. Its efforts have been hampered by a lack of resources, a lack of expertise in India’s many languages, and other problems, such as bots associated with some of India’s political parties.
In one example, a Facebook researcher ran an experiment in 2019 in which a test user in India followed every recommendation generated by the platform’s algorithms. The researcher later wrote in a report: “Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life.”
EU politicians warned Facebook
EU politicians warned Facebook in 2019 that its site was polarizing political opinion
The Washington Post reported that a Facebook official traveled to the European Union and heard complaints from politicians about a change to the site’s algorithm introduced in 2018. The politicians said the change had altered politics “for the worse,” according to an April 2019 internal report.
The team flagged concerns about Poland, where members of political parties believed Facebook had contributed to a “social civil war” in which negative political content gained more weight and reach on the platform.
In Warsaw, the two main parties, the ruling Law and Justice party and the main opposition Civic Platform, both accused social media of deepening political polarization in Poland, describing the situation as “unsustainable,” the Facebook report said.
“Across several European countries, the major mainstream parties have complained about the incentive structure that rewards political attacks,” the report said. “They see a direct link to the growing influence of radical parties on the political stage.”