It’s excellent news that the federal government and almost fifty states are suing Facebook for being an illegal monopoly. Their aim is to break up the company. But it’s too bad Facebook management can’t be sued for being immoral creeps. They know how bad they are, just like the managers at cigarette companies who knew they were causing cancer and the oil company executives who knew decades ago they were destroying the climate.
This report is from the Daily Mail back in May (it’s a newspaper that specializes in less important topics — note the brief paragraphs):
Facebook researchers learnt as far back as 2016 that 64 percent of all extremist group joins are due to its own recommendations but executives . . . killed any efforts to fix the problem, according to sources.
Research at the social media giant in 2016 and again in 2018 unearthed a worrying trend linking the platform’s recommendations to extremist views on the site.
But despite researchers coming up with several different solutions to tackle the problem of extremism, no action was taken.
People familiar with the matter have told The Wall Street Journal that the move to dismiss the recommendations was largely down to Facebook VP for policy and former George W. Bush administration official Joel Kaplan, who famously threw Brett Kavanaugh a party when he was appointed Supreme Court Justice in the middle of sexual assault allegations in 2018 . . . .
In 2016, the company carried out research that found there was a worryingly high proportion of extremist content and groups on the platform.
Facebook researcher and sociologist Monica Lee wrote in a presentation at the time that there was an abundance of extremist and racist content in over a third of large German political Facebook groups.
The presentation states ‘64% of all extremist group joins are due to our recommendation tools.’
Most of the joining activity came from the platform’s ‘Groups You Should Join’ and ‘Discover’ algorithms, she found, meaning: ‘Our recommendation systems grow the problem.’
Facebook then launched new research in 2017 looking at how its social media platform polarized the views of its users.
The project was headed up by Facebook’s then-chief product officer Chris Cox who led the task force known as ‘Common Ground’.
It revealed the social media platform was fueling conflict among its users and increasing extremist views.
It also showed that bad behavior among users came from the small groups of people with the most extreme views, with more accounts on the far-right than far-left in the US.
The concerning findings were released in an internal presentation the following year.
‘Our algorithms exploit the human brain’s attraction to divisiveness,’ a slide from the 2018 presentation read.
‘If left unchecked,’ it warned, Facebook would feed users ‘more and more divisive content in an effort to gain user attention and increase time on the platform.’
Cox and his team offered up several solutions to the problem, including building a system for digging out extreme content and suppressing clickbait around politics.
Another initiative called ‘Sparing Sharing’ involved reducing the spread of content by what it called ‘hyperactive users’ – who are highly active on the platform and show extreme views on either the left or the right, the sources told the Journal.
But the efforts – and the research – were reportedly blocked by senior executives including founder Mark Zuckerberg and Kaplan.
According to sources, Kaplan killed any attempts to change the platform, branding the move ‘paternalistic’ and citing concerns that the changes would mainly impact right-wing social media users, the Journal reported.
Unquote.
Facebook has become a big part of the right-wing media machine, partly because the company was criticized for being unfair to right-wingers. In response to that criticism, they hired Republican executives to make sure right-wing lies and conspiracy theories weren’t interfered with and were, in fact, promoted, as the report above shows. Thus, from The Guardian last month:
Since election day, 16 of the top 20 public Facebook posts that include the word “election” feature false or misleading information casting doubt on the election in favor of Txxxx, according to a Guardian analysis of posts with the most interactions using CrowdTangle, a Facebook-owned analytics tool. Of those, 13 are posts by the president’s own page, one is a direct quote from Txxxx published by Fox News, one is by the rightwing evangelical Christian Franklin Graham, and the last is the Newsmax Higbie video [“a laundry list of false and debunked claims casting doubt on the outcome of the presidential election”].
The four posts that do not include misinformation are congratulatory messages by Barack Obama and Michelle Obama for Biden and Kamala Harris and two posts by Graham, including a request for prayers for Txxxx and a remembrance by Graham of his father, the conservative televangelist Billy Graham.