Frances Haugen, Facebook’s worst nightmare?
Washington (CNN Business) – Facebook’s week got off to a tumultuous start. On Monday, Facebook, WhatsApp and Instagram went down for about six hours. On Tuesday, Frances Haugen, the Facebook whistleblower, testified before a Senate subcommittee, following the release of thousands of pages of internal research and documents.
Haugen, the 37-year-old former Facebook (FB) product manager who worked on civic integrity issues at the company, revealed her identity during a segment of “60 Minutes” that aired Sunday night.
She has reportedly filed at least eight complaints with the Securities and Exchange Commission, alleging that the company is hiding its research on its shortcomings from investors and the public.
She also shared the documents with regulators and The Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps.
“60 Minutes” published eight of Haugen’s complaints on Monday. Here are four takeaways:
Facebook mechanics promote the spread of misinformation
Internal documents cited in the complaints show that Facebook knows that both hate speech and misinformation on its platforms are having a societal impact and that its “core product mechanics, such as virality, recommendations and optimization for engagement, are a significant part of why these types of speech flourish.”
In a study of misinformation and polarization risks encountered through recommendations, it took Facebook’s algorithm just a few days to recommend conspiracy pages to an account that followed official, verified pages of conservative figures such as Fox News and Donald Trump. It took less than a week for the same account to receive a QAnon recommendation. And according to documents titled “They used to post selfies now they’re trying to reverse the election” and “Does Facebook reward outrage?” cited in the complaints, Facebook’s algorithms not only reward posts on topics like voter-fraud conspiracies with likes and shares, but also “‘the more negative comments a piece of content generates, the greater the likelihood the link gets more traffic.’”
A document titled “What is Collateral Damage?” goes so far as to state that “the net result is that Facebook as a whole will actively (if not necessarily consciously) promote such activities. The mechanics of our platform are not neutral.”
Facebook has taken limited steps to address existing misinformation
According to an internal document on problematic, non-violating narratives referenced in at least two of the complaints, Facebook removes only 3% to 5% of hate speech and less than 1% of content considered violent or incitement to violence. This is because the volume is too great for human reviewers, and it is challenging for its algorithms to accurately classify content when context must be considered.
Internal documents on Facebook’s role in the 2020 elections and the January 6 insurrection also suggest that those who spread misinformation are rarely stopped by the company’s intervention mechanisms. One document notes that “enforcing on pages moderated by page admins who have posted more than 2 pieces of misinformation in the last 67 days would affect 277,000 pages. Of these pages, 11,000 of them are current repeat offender pages.”
Despite Facebook’s claims that it “remove[s] content from Facebook no matter who posts it, when it violates our standards,” according to Haugen, in practice the “XCheck” or “Cross-Check” system effectively “whitelists” privileged and/or high-profile users. An internal error-prevention document cited in a complaint maintains that “‘over the years many XChecked pages, profiles and entities have been exempted from enforcement.’”
Internal documents on “quantifying the concentration of shares and their VPVs among users” and an “automatic shutdown plan for all group recommendation surfaces” indicate that Facebook also reversed some changes that had been shown to reduce misinformation, because those changes also reduced the platform’s growth.
In addition, Haugen claims the company falsely told advertisers that it had done everything possible to prevent the insurrection. According to a document cited in the filing titled “The Capitol Riots Shatters the Glass,” the safer parameters Facebook put in place for the 2020 elections, such as demoting content likely to violate its community standards, including hate speech, were actually rolled back afterward and reinstated only “after the insurrection broke out.”
In one document, a Facebook employee states that “we were willing to act only *after* things had spiraled into a dire state.”
Facebook has misled the public about the negative effects of its platforms on children and adolescents, especially girls
When asked during a congressional hearing in March whether Facebook’s platforms “harm children,” Facebook CEO Mark Zuckerberg said, “I don’t believe so.”
However, according to Facebook’s own internal research cited in one of Haugen’s complaints, “13.5% of teen girls on Instagram say the platform makes thoughts of ‘suicide and self-harm’ worse” and 17% say the Facebook-owned platform makes “eating issues” such as anorexia and bulimia worse. The company’s research also claims that Facebook’s platforms “make body image issues worse for 1 in 3 teen girls.”
Facebook knows that its platforms allow human exploitation
Although Facebook’s community standards state that it “remove[s] content that facilitates or coordinates the exploitation of humans,” internal company documents cited in one of Haugen’s complaints suggest that the company knew “domestic servitude content remained on the platform” prior to a 2019 BBC News investigation into a black market for domestic workers on Instagram.
“We are under-enforcing on confirmed abusive activity with a nexus to the platform,” reads a document titled “Domestic Servitude and Tracking in the Middle East.”
“Our research finding demonstrates that … our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via real-world networks … Traffickers, recruiters and facilitators from these ‘agencies’ used FB profiles, IG profiles, Pages, Messenger and WhatsApp.”