Keep This in Mind When You Hear the Right Claim That They’re Censored on Social Media

It’s bullshit. From The Washington Post:

A new report calls conservative claims of social media censorship “a form of disinformation.”

[The] report concludes that social networks aren’t systematically biased against conservatives, directly contradicting Republican claims that social media companies are censoring them. 

Recent moves by Twitter and Facebook to suspend [the former president’s] accounts in the wake of the violence at the Capitol are inflaming conservatives’ attacks on Silicon Valley. But New York University researchers today released a report stating claims of anti-conservative bias are “a form of disinformation: a falsehood with no reliable evidence to support it.” 

The report found there is no trustworthy large-scale data to support these claims, and even anecdotal examples that tech companies are biased against conservatives “crumble under close examination.” The report’s authors said, for instance, the companies’ suspensions of [the ex-president’s] accounts were “reasonable” given his repeated violation of their terms of service — and if anything, the companies took a hands-off approach for a long time given [his] position.

The report also noted several data sets underscore the prominent place conservative influencers enjoy on social media. For instance, CrowdTangle data shows that right-leaning pages dominate the list of sources providing the most engaged-with posts containing links on Facebook. Conservative commentator Dan Bongino, for instance, far out-performed most major news organizations in the run-up to the 2020 election. 

The report also cites an October 2020 study in which Politico found “right-wing social media influencers, conservative media outlets, and other GOP supporters” dominated the online discussion of Black Lives Matter and election fraud, two of the biggest issues in 2020. Working with the nonpartisan think tank Institute for Strategic Dialogue, researchers found users shared the most viral right-wing social media content about Black Lives Matter more than ten times as often as the most popular liberal posts on the topic. People also shared right-leaning claims on election fraud about twice as often as they shared liberals’ or traditional media outlets’ posts discussing the issue.

But even so, baseless claims of anti-conservative bias are driving Republicans’ approach to regulating tech. Republican lawmakers have concentrated their hearing exchanges with tech executives on the issue, and it’s been driving their legislative proposals. . . .

The New York University researchers called on Washington regulators to focus on what they called “the very real problems of social media.”

“Only by moving forward from these false claims can we begin to pursue that agenda in earnest,” Paul Barrett, the report’s primary author and deputy director of the NYU Stern Center for Business and Human Rights, said in a statement. 

The researchers want the Biden administration to work with Congress to overhaul the tech industry. 

Their recommendations focus particularly on changing Section 230, a decades-old law shielding tech companies from lawsuits for the photos, videos and posts people share on their websites. . . . 

The researchers warn against completely repealing the law. Instead, they argue companies should only receive Section 230 immunity if they agree to accept more responsibilities in policing content such as disinformation and hate speech. The companies could be obligated to ensure their recommendation engines don’t favor sensationalist content or unreliable material just to drive better user engagement. 

“Social media companies that reject these responsibilities would forfeit Section 230’s protection and open themselves to costly litigation,” the report proposed.

The researchers also called for the creation of a new Digital Regulatory Agency, which would serve as an independent body and be tasked with enforcing a revised Section 230. 

The report also suggested Biden could empower a “special commission” to work with the industry on improving content moderation, which would be able to move much more quickly than legal battles over antitrust issues. It also called for the president to expand the task force on online harassment that Biden announced, to cover a broader range of harmful content. 

They also called for greater transparency in Silicon Valley. 

The researchers said the platforms typically don’t provide much justification for sanctioning an account or post, and when people are in the dark they assume the worst. 

“The platforms should give an easily understood explanation every time they sanction a post or account, as well as a readily available means to appeal enforcement actions,” the report said. “Greater transparency—such as that which Twitter and Facebook offered when they took action against [a certain terrible person] in January—would help to defuse claims of political bias, while clarifying the boundaries of acceptable user conduct.”

One Way to Start Fixing the Internet

Yaël Eisenstat has been a CIA officer, White House adviser and Facebook executive. She says the problem with social media isn’t just what users post — it’s what the platforms do with that content. From Harvard Business Review:

While the blame for President Txxxx’s incitement to insurrection lies squarely with him, the biggest social media companies — most prominently my former employer, Facebook — are absolutely complicit. They have not only allowed Txxxx to lie and sow division for years, their business models have exploited our biases and weaknesses and abetted the growth of conspiracy-touting hate groups and outrage machines. They have done this without bearing any responsibility for how their products and business decisions affect our democracy; in this case, including allowing an insurrection to be planned and promoted on their platforms. . . .

The events of last week . . . demand an immediate response. In the absence of any U.S. laws to address social media’s responsibility to protect our democracy, we have ceded the decision-making about which rules to write, what to enforce, and how to steer our public square to CEOs of for-profit internet companies. Facebook intentionally and relentlessly scaled to dominate the global public square, yet it does not bear any of the responsibilities of traditional stewards of public goods, including the traditional media.

It is time to define responsibility and hold these companies accountable for how they aid and abet criminal activity. And it is time to listen to those who have shouted from the rooftops about these issues for years, as opposed to allowing Silicon Valley leaders to dictate the terms.

We need to change our approach not only because of the role these platforms have played in crises like last week’s, but also because of how CEOs have responded — or failed to respond. The reactionary decisions on which content to take down, which voices to downgrade, and which political ads to allow have amounted to tinkering around the margins of the bigger issue: a business model that rewards the loudest, most extreme voices.

Yet there does not seem to be the will to reckon with that problem. Mark Zuckerberg did not choose to block Txxxx’s account until after the U.S. Congress certified Joe Biden as the next president of the United States. . . . And while the decision by many platforms to silence Txxxx is an obvious response to this moment, it’s one that fails to address how millions of Americans have been drawn into conspiracy theories online and led to believe this election was stolen — an issue that has never been truly addressed by the social media leaders.

A look through the Twitter feed of Ashli Babbitt, the woman who was killed while storming the Capitol, is eye-opening. A 14-year Air Force veteran, she spent the last months of her life retweeting conspiracy theorists, QAnon followers, and others calling for the overthrow of the government. . . . The likelihood that social media played a significant part in steering her down the rabbit hole of conspiracy theories is high, but we will never truly know how her content was curated, what groups were recommended to her, or whom the algorithms steered her towards.

If the public, or even a restricted oversight body, had access to the Twitter and Facebook data to answer those questions, it would be harder for the companies to claim they are neutral platforms who merely show people what they want to see. Guardian journalist Julia Carrie Wong wrote in June of this year about how Facebook algorithms kept recommending QAnon groups to her. . . .  The key point is this: This is not about free speech and what individuals post on these platforms. It is about what the platforms choose to do with that content, which voices they decide to amplify, which groups are allowed to thrive and even grow at the hand of the platforms’ own algorithmic help.

So where do we go from here?

I have long advocated that governments must define responsibility for the real-world harms caused by these business models, and impose real costs for the damaging effects they are having on our public health, our public square, and our democracy. As it stands, there are no laws governing how social media companies treat political ads, hate speech, conspiracy theories, or incitement to violence. This issue is unduly complicated by Section 230 of the Communications Decency Act, which has been vastly over-interpreted to provide blanket immunity to all internet companies — or “internet intermediaries” — for any third-party content they host. Many argue that to solve some of these issues, Section 230, which dates back to 1996, must at least be updated. But how, and whether it alone will solve the myriad issues we now face with social media, is hotly debated.

One solution I continue to push is clarifying who should benefit from Section 230 to begin with, which often breaks down into the publisher vs. platform debate. To still categorize social media companies — who curate content, whose algorithms decide what speech to amplify, who nudge users towards the content that will keep them engaged, who connect users to hate groups, who recommend conspiracy theorists — as “internet intermediaries” who should enjoy immunity from the consequences of all this is beyond absurd. The notion that the few tech companies who steer how more than 2 billion people communicate, find information, and consume media enjoy the same blanket immunity as a truly neutral internet company makes it clear that it is time for an upgrade to the rules. They are not just a neutral intermediary.

However, that doesn’t mean that we need to completely re-write or kill Section 230. Instead, why not start with a narrower step by redefining what an “internet intermediary” means? Then we could create a more accurate category to reflect what these companies truly are, such as “digital curators” whose algorithms decide what content to boost, what to amplify, how to curate our content. And we can discuss how to regulate in an appropriate manner, focusing on requiring transparency and regulatory oversight of the tools such as recommendation engines, targeting tools, and algorithmic amplification rather than the non-starter of regulating actual speech.

By insisting on real transparency around what these recommendation engines are doing, how the curation, amplification, and targeting are happening, we could separate the idea that Facebook shouldn’t be responsible for what a user posts from their responsibility for how their own tools treat that content. I want us to hold the companies accountable not for the fact that someone posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms steer people towards it, and how their tools are used to target people with it.
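
To make that distinction concrete, here is a minimal sketch of the difference between hosting a post and amplifying it. Everything in it is hypothetical — the post data, the scoring weights, the function names — and it is not any platform’s actual code. An engagement-only ranker will reliably push the most inflammatory item to the top; the transparency Eisenstat argues for amounts, at minimum, to recording and exposing why each item was boosted.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    predicted_engagement: float  # clicks/comments/shares a model expects (made-up numbers)
    outrage_score: float         # 0..1, how inflammatory a hypothetical classifier thinks it is

def rank_for_engagement(posts):
    """Engagement-only ranking: the business model critics describe.
    Inflammatory posts tend to have high predicted engagement, so they float up."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_with_audit_log(posts, outrage_penalty=0.5):
    """Same ranking, but every scoring decision is recorded so an outside
    auditor (or the user) can see why an item was amplified -- the kind of
    transparency the article argues regulators should require."""
    audit_log = []
    def score(p):
        s = p.predicted_engagement - outrage_penalty * p.outrage_score
        audit_log.append({
            "post_id": p.post_id,
            "predicted_engagement": p.predicted_engagement,
            "outrage_score": p.outrage_score,
            "final_score": s,
        })
        return s
    ranked = sorted(posts, key=score, reverse=True)
    return ranked, audit_log

if __name__ == "__main__":
    feed = [
        Post("a1", "Local bake sale this weekend", 0.20, 0.05),
        Post("a2", "THEY are coming for YOUR ballots!!!", 0.90, 0.95),
        Post("a3", "City council passes budget", 0.30, 0.10),
    ]
    print([p.post_id for p in rank_for_engagement(feed)])   # the outrage post ranks first
    ranked, log = rank_with_audit_log(feed)
    print([p.post_id for p in ranked])
    for entry in log:
        print(entry)
```

The point of the audit log is not that this particular penalty is the right policy; it is that once ranking decisions are recorded in an inspectable form, an outside body can test claims about bias and amplification instead of taking the company’s word for it.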

To be clear: Creating the rules for how to govern online speech and define platforms’ responsibility is not a magic wand to fix the myriad harms emanating from the internet. This is one piece of a larger puzzle of things that will need to change if we want to foster a healthier information ecosystem. But if Facebook were obligated to be more transparent about how they are amplifying content, about how their targeting tools work, about how they use the data they collect on us, I believe that would change the game for the better.

As long as we continue to leave it to the platforms to self-regulate, they will continue to merely tinker around the margins of content policies and moderation. We’ve seen that the time for that is long past — what we need now is to reconsider how the entire machine is designed and monetized. Until that happens, we will never truly address how platforms are aiding and abetting those intent on harming our democracy.

A Surprising Free TV Service for Us Cord Cutters (World Series Edition)

We canceled our cable TV service a few years ago and haven’t really missed it. But there are times when being a “cord cutter” is a problem, like when a certain team is playing football and the game is on a local TV station. (We could try putting an antenna on the roof and watching for free — like in olden times — but that’s not a good option for us.)

Tonight being the first game of the World Series, somebody asked whether we could watch it. In the past, that’s meant signing up for one of the services that transmit local stations over the internet. We’ve used those a couple of times (via our handy Roku box) but they’re not worth the monthly subscription.

In search of a good option, I got a very pleasant surprise. There is a free service that transmits local TV stations on the internet. It’s called Locast. They can explain:

Locast is a not-for-profit service offering users access to broadcast television over the internet. We stream the signal . . . to select US cities. Locast has modernized the delivery of broadcast TV by offering streaming media free of charge. This is your right, this is our mission. 

In today’s modern world, we find ourselves in many different settings. Access to broadcast TV is our right. The existing antiquated technology doesn’t come close to meeting the needs of the average user who deserves to access broadcast programming, using the Internet as we do for almost every other service.

. . . many households just can’t get a proper signal to receive broadcast TV. This can be due to geographic anomalies or living in more isolated rural areas. Rather than relying on a traditional rebroadcast antenna, these folks should be allowed to use a modern method of streaming through our digital transcoding service. Free your TV!

From what I can see, this thing actually works. I created an account, registered our Roku box, and found maybe 30 channels being broadcast out of New York City. Lo and behold, it’s Locast!

The service is free, but they do ask for donations, beginning at $5 a month (a reasonable request):

To do this we will need your support. There are considerable costs for equipment, bandwidth, and operational support that helps run Locast. These costs will only go up as we expand our service to new markets, as well as when more and more people cut the cord to become new Locasters.

There’s actually more to the story. I wondered who’s behind this operation. It turns out to be an organization called Sports Fans Coalition:

SFC is a grassroots sports-fan advocacy organization. We’re made up of sports fans who want to have a say in how the sports industry works, and to put fans first. 

We have one goal: to give you a seat at the table whenever laws or public policy impacting sports are being made.

So in addition to doing things like lobbying Congress and suing TV networks, they are making local TV available to around 44% of the US population. 

But wait! Is this legal? Apparently it is.

Locast.org is a “digital translator,” meaning that Locast.org operates just like a traditional broadcast translator service, except instead of using an over-the-air signal to boost a broadcaster’s reach, we stream the signal over the Internet . . . 

Ever since the dawn of TV broadcasting in the mid-20th Century, non-profit organizations have provided “translator” TV stations as a public service. Where a primary broadcaster cannot reach a receiver with a strong enough signal, the translator amplifies that signal with another transmitter, allowing consumers who otherwise could not get the over-the-air signal to receive important programming, including local news, weather and, of course, sports. Locast.org provides the same public service, except instead of an over-the-air signal transmitter, we provide the local broadcast signal via online streaming.
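
In plumbing terms, a “digital translator” is a short pipeline: capture the over-the-air signal, transcode it for the web, and serve it as a stream only to viewers inside the originating station’s market. The sketch below is purely illustrative; I have no idea what Locast’s actual stack looks like, and the input source, market list, and helper names are all made up. It uses ffmpeg, a common open-source transcoder, to turn a captured feed into an HLS stream.

```python
import subprocess

# Illustrative-only market check: the service may only stream a station to
# viewers inside that station's own broadcast market.
NYC_MARKET_ZIPS = {"10001", "10002", "11201"}  # tiny made-up sample, not real coverage data

def viewer_in_market(zip_code: str) -> bool:
    return zip_code in NYC_MARKET_ZIPS

def retransmit(capture_source: str, output_playlist: str) -> subprocess.Popen:
    """Transcode a captured over-the-air feed into an HLS stream with ffmpeg.

    capture_source might be a TV tuner device or a local network feed; the
    value used below is a placeholder, not a real endpoint."""
    cmd = [
        "ffmpeg",
        "-i", capture_source,      # input: the captured broadcast signal
        "-c:v", "libx264",         # re-encode video for web delivery
        "-c:a", "aac",             # re-encode audio
        "-f", "hls",               # emit an HTTP Live Streaming playlist
        "-hls_time", "6",          # six-second segments
        "-hls_list_size", "10",    # keep a rolling window of recent segments
        output_playlist,
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    if viewer_in_market("10001"):
        proc = retransmit("udp://127.0.0.1:1234", "stream.m3u8")
        proc.wait()
```

The interesting part isn’t the plumbing, though; it’s that copyright law only permits this when the operator is a non-profit recovering no more than its costs, as the next excerpt explains.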

According to Locast, federal law makes this possible:

Before 1976, under two Supreme Court decisions, any company or organization could receive an over-the-air broadcast signal and retransmit it to households in that broadcaster’s market without receiving permission (a copyright license) from the broadcaster. Then, in 1976, Congress passed a law overturning the Supreme Court decisions and making it a copyright violation to retransmit a local broadcast signal without a copyright license. This is why cable and satellite operators . . . must operate under a statutory . . . copyright license or receive permission from the broadcaster.

But Congress made an exception. Any “non-profit organization” could make a “secondary transmission” of a local broadcast signal, provided the non-profit did not receive any “direct or indirect commercial advantage” and either offered the signal for free or for a fee “necessary to defray the actual and reasonable costs” of providing the service. 17 U.S.C. 111(a)(5).

Sports Fans Coalition NY is a non-profit organization under the laws of New York State. Locast.org does not charge viewers for the digital translator service (although we do ask for contributions), and if it ever does charge, it will only recover costs as stipulated in the copyright statute. Finally, in dozens of pages of legal analysis provided to Sports Fans Coalition, an expert in copyright law concluded that under this particular provision of the copyright statute, secondary transmission may be made online, the same way traditional broadcast translators do so over the air.

For these reasons, Locast.org believes it is well within the bounds of copyright law when offering you the digital translator service.

One last word from Locast:

Why hasn’t anyone done this before?

Good question. We don’t know. But we did a lot of due diligence before launching and learned that the technology to offer a digital translator service has gotten a lot less expensive and the law clearly allows a non-profit to provide such a service. So we’re the first. You’re welcome.

Now, if World Series games didn’t average 3 1/2 hours. . .

Sacha Baron Cohen on the Greatest Propaganda Machines in History

The comedian spoke out this week. The problem he discusses may be insurmountable, given that anyone with an internet connection has the technological ability to communicate with everyone else who has one. Nevertheless, it’s encouraging that more people are demanding reasonable limits on the power of these gargantuan, unregulated companies.

The Guardian has a full transcript.


Facebook, Google, Twitter: You Are “Crime Scenes”

British journalist Carole Cadwalladr has taken fifteen important minutes to explain how the tech giants are damaging democracy.

One excellent point she makes is that these massive corporations refuse to divulge which misleading political advertisements are being directed at which voters, and who is behind those advertisements, and how much money is being spent on them. As a result, the British laws that limit campaign spending and have been in effect for 100 years no longer work, thanks to the “gods of Silicon Valley.” She addresses Zuckerberg, Sergey Brin and others directly:

Liberal democracy is broken. And you broke it. This is not democracy. Spreading lies in darkness paid for with illegal cash from God knows where. It’s subversion. And you are accessories to it.

Of the Democrats seeking the presidency, Senator Elizabeth Warren is the one who has offered a plan to rein in the tech giants. You might consider donating to her campaign.

Meanwhile, give Carole Cadwalladr fifteen minutes of your time. She is worth listening to.