Bloomberg

The Web’s Dark Recesses Can Save Facebook and Twitter: Alex Webb

(Bloomberg) – Facebook Inc. has long defended its hands-off stance toward disinformation on its networks, arguing that if it imposed stricter rules, that kind of content would simply proliferate elsewhere. Better to keep the conversation where it can be watched … or something like that. But pushing false information and incitement to violence into the darkest corners of the web might actually be a good idea.

For a long time, Facebook’s line of reasoning has seemed disingenuous, especially given the number of people who turn to the social media giant. It has 2.5 billion monthly users across its platforms, which include Instagram and WhatsApp. Together with Alphabet Inc.’s YouTube, these platforms attract the largest audiences the world has ever seen. Where else could disinformation find such a massive audience?

Facebook’s ban of Donald Trump, along with a similar decision by Twitter Inc., could help us find out. The first alternative for many was Parler, a social media app that boasts of being a bastion of free speech and is backed by the billionaire Mercer family. But that freedom of expression came at a cost: last week, the network failed to moderate the content used to organize the violence on Capitol Hill. As a result, Amazon.com Inc. removed the site from its servers, while Apple Inc. and Google removed its app from their mobile stores. Trump has since said he may build his own social network while Parler struggles to rebuild itself.

Whatever happens with Parler, pulling the most outlandish currents of political discourse off the major platforms could be a good thing. To drive user engagement, social media companies tend to reward provocative content with greater exposure, while algorithms personalize each user’s feed. The result is an engine that incubates and accentuates radicalization, and can turn moderates into radicals.

Facebook is removing all mentions of the slogan “Stop the Steal,” used by U.S. election conspiracy theorists, while Twitter has blocked more than 70,000 accounts for spreading conspiracy theories associated with QAnon. If QAnon believers, election-fraud conspiracists and flat-earthers are pushed elsewhere, say to Parler, or to Gab, a site reportedly frequented by white supremacists, then fewer people would drift from the mainstream into their conspiracy-filled vortexes. Users would have to go looking for misinformation about COVID-19 or anti-vaccination campaigns, rather than encounter it organically in their Facebook or YouTube feeds.

This approach, of course, would not end online radicalization, according to Dipayan Ghosh, author of the book “Terms of Disservice: How Silicon Valley Is Destructive by Design.” It could even push those already inside the bubble into an even more violent frenzy. But it is “the right thing to do because it removes extremist views from the mainstream,” he said.

The major platforms don’t need to worry about anyone overtaking them. Despite Facebook CEO Mark Zuckerberg’s claims that competition in social media is fierce, the advertising giant keeps increasing its user numbers, revenue and profits. Its lead is not easy to beat. In November, Parler’s peak daily active user count was just 3.4 million globally, and it had only 1.6 million daily users last week, according to app analytics firm Apptopia. That’s 0.09% of the 1.8 billion people who log into a Facebook service every day.

Competition will only pose a threat to Facebook’s and YouTube’s dominance when rivals can build sustainable business models. That means securing ad dollars, which isn’t easy, since brands are reluctant to place ads alongside problematic content. It also means managing costs tightly. Google and Apple have already made that harder for Parler by stipulating that it must adhere to their content moderation policies if it wants its app to be available in either store. In other words, the app needs to hire a large number of content moderators. That’s the right thing to do, given the evidence that last week’s riot was planned on the platform. Facebook, too, should have built those costs into its business early on, but it initially opted for low overhead in order to scale quickly.

Taking the most extreme political speech out of the mainstream would formalize content bubbles that, in essence, already exist. The result would be far from the cyber-utopian vision of the internet held in the 1990s, that of a village green or an agora for the free and open exchange of ideas. It would be a long way from that, but it would be for the best.

Original Note: The Web’s Dark Recesses Can Save Facebook and Twitter: Alex Webb

© 2021 Bloomberg LP
