The Cambridge Analytica case, revealed in 2018, seemed to promise a change in how Facebook in particular, and social networks in general, were perceived, along with promises from the company to handle data with greater security and more respect for its users.
Now we know that none of that changed; if anything, things got worse. Since last September 15, the Wall Street Journal has been publishing, through leaks from Facebook employees, a series of reports showing that, beyond Facebook's harms, the company knew about them and knowingly allowed them.
These revelations could test the trust that its 2.9 billion active users (across Facebook, Instagram and WhatsApp) place in Mark Zuckerberg's company, which since 2017 has been trying to shake off a string of scandals that keep surfacing but which, judging by its income statement, do not seem to hurt it.
Facebook has faced three major groups of scandals throughout its history. The spark was Cambridge Analytica, where one of its vulnerabilities, designed to encourage connections and virality, was exploited for political purposes. Then there are its continual data leaks, so numerous that anyone who had a Facebook account before 2019 has roughly a 50% chance of having been affected by one. And finally, its internal practices: from how Zuckerberg tried to hold Instagram back even after buying it so that it would not overshadow Facebook, his own creation, to the findings the WSJ has been revealing, to how the company did not do everything it could when the platform was used to foment the Rohingya genocide.
Here is a summary of the latest revelations:
The Facebook archives: more fuel for an already huge bonfire
The origin: someone inside Facebook gave the WSJ a selection of internal research documents, a catalog of the ways Facebook hosts large amounts of harmful content, knows it, and fails to control it. We all had some idea of this already; what was less clear is the extent to which, as these documents lay out, Facebook has hidden these problems in order to keep improving its social network's metrics.
Time and time again, the documents show that people within Facebook have identified the platform's harmful effects. Despite US Congressional hearings, its own promises, and numerous exposés in the media, the company did not fix them. The documents offer perhaps the clearest picture yet of how widely Facebook's troubles are known inside the company, up to and including Mark Zuckerberg.
Time and again, the documents show that people within Facebook have identified the platform's harmful effects
These reports have been joined by a parallel revelation from The New York Times, which asserts that Facebook has modified its news feed algorithm (the wall) to show friendlier content in the face of so many scandals: a rose-tinting tactic the company had already tried, or at least promised, in the past.
“Facebook Inc. knows, in great detail, that its platforms are plagued with flaws that cause damage, often in a way that only the company fully understands,” reads the WSJ investigation, which has so far been divided into five major installments.
Certain Facebook users don’t have to abide by its rules
The first installment reveals, through internal documents, that the company privately built a system that exempts high-profile users from some or all of its rules, as if there were a class of VIP users who sit above its standards and code of conduct.
The program, known as “cross check” or “XCheck”, was conceived as a quality control measure for high-profile accounts. Today, it protects millions of VIPs, the documents show. Many abuse this privilege, posting material including harassment and incitement to violence that would normally lead to penalties. In its response, set out in a statement signed by Nick Clegg, the former UK Deputy Prime Minister whom Zuckerberg hired specifically to deal with scandals, Facebook says the criticism of the program is fair, that it was designed with good intentions, and that the company is working to fix it.
Facebook knows Instagram is toxic, especially among teens
The next block of revelations shows that Instagram has spent years studying how its photo-sharing app affects millions of young users. Time and again, the company has found that Instagram is harmful to a considerable percentage of them, especially teenage girls.
In public, Facebook has downplayed the app's negative effects, neither publishing its research nor making it available to the academics and lawmakers who have requested it. In response, Facebook says that the negative effects are not pervasive, that its mental health research is valuable, and that some of the harmful aspects are not easy to address.
When Facebook tried to push out the media and ended up with an even angrier social network
This is something anyone who has followed the platform's evolution has noticed. In 2018, Facebook went from being something like what TikTok is today, a place where people shared cats and news videos, to announcing an algorithm change that prioritized posts from close contacts, a change that ended up producing the echo chamber so widely studied by sociologists and that makes the platform so dangerous. In short: whatever you already think and believe, Facebook feeds back to you, creating a swarm that amplifies dangerous positions.
Zuckerberg stated at the time that his goal was to strengthen bonds between users and improve their well-being by encouraging interactions between friends and family. Inside the company, the documents show, employees realized the change was having the opposite effect: it was making Facebook, and those who used it, angrier. Zuckerberg resisted some of his team’s proposed fixes, according to the documents, because he was concerned that people would interact less on Facebook.
Cartels and armed groups also took advantage of Facebook, and the network did next to nothing
The dozens of leaked documents also show employees sounding the alarm about the use of Facebook's platforms in developing countries, where its user base is huge and expanding. Employees noted that human traffickers in the Middle East used the site to lure women into abusive work situations that sometimes appeared to involve sexual exploitation. They warned that armed groups in Ethiopia were using the site to incite violence against ethnic minorities, in a situation reminiscent of the scandal that already hit Facebook over the Rohingya massacre.
According to the documents, some workers sent alerts to their bosses about posts selling organs or pornography. The documents also show the company's response, which in many cases was inadequate or nonexistent.
Facebook tried to use its platform for good in the face of COVID, but its own monster turned against it
The latest revelation even has a touch of karma to it. Facebook backed the promotion of COVID-19 vaccination, a personal bet by Zuckerberg to show that his creation is a powerful force for social good. But it went wrong.
Internal reports show how the alarm went off again inside the company because denialists used the very tools Facebook provides to spread their message. Activists flooded the network with what Facebook calls “anti-vaccination” content, internal memos show.
Another drop in a glass that never fills up
None of these disclosures is, in itself, news: everyone who has investigated Facebook and its algorithms is aware of the damage it can cause and has caused over time, and Facebook itself has said time and again that it would set out to fix it.
In its latest quarterly results, it reported revenue of $29 billion
The only certainty is that, for now, Facebook has not stopped growing. In its latest quarterly results, it reported revenue of $29 billion, a year-on-year increase of 56%. For the moment, the social network seems unaffected, for the first time in its history, neither by these scandals nor by the privacy changes Apple introduced in iOS that were expected to limit the effectiveness of its ads.