Looking Toward January 6, 2025

Republicans are predictably screaming about their cult leader being kept off Facebook for the time being. They’re citing the First Amendment, of course, but that’s got nothing to do with social media platforms (until the government starts operating its own platform or regulating their content).

Or as our congressman, Tom Malinowski, tweeted:

The 1st Amendment gives us the right to say crazy things without gov’t interference. It doesn’t require Random House to give us a book contract, or FOX to give us a prime time show, or Facebook to amplify our rantings to billions of people. Freedom of speech is not freedom of reach.

A much more significant issue is the speed with which the Republican Party is deteriorating. From Greg Sargent of The Washington Post: 

Rep. Liz Cheney’s fate appears sealed: Republicans are set to oust the Wyoming Republican as the No. 3 in the House GOP leadership . . . This is being widely depicted as a battle over the past . . . Most accounts portray it as a sign that in today’s GOP, fealty to the former president is a bedrock requirement, denouncing his lies about 2020 has become unacceptable, and telling the truth about the Jan. 6 insurrection is disqualifying.

All that is true, but the forward-looking dimension to this story is getting lost. What also seems unavoidably at stake is that the GOP appears to be plunging headlong into a level of full-blown hostility to democracy that has deeply unsettling future ramifications.

. . . Republicans may be unshackling themselves from any obligation to acquiesce to future presidential election outcomes they don’t like — that is, liberating themselves to overturn those outcomes by any means necessary.

. . . A Cheney spokesperson denounced her GOP enemies for wanting to “perpetuate lies about the 2020 election and attempt to whitewash what happened on Jan. 6.” This comes after Cheney told GOP colleagues that those lies are “poison in the bloodstream of our democracy” and that insurrection “is a line that cannot be crossed.”

Cheney has also urged Republicans not to turn “their back on the rule of law.” And she insists that the commission examining Jan. 6 should focus on the insurrection, not on leftist extremism that Republicans are hyping to muddy the waters around their ongoing radicalization.

So why is all this disqualifying? [It’s because] she’s demanding something important from fellow Republicans: a full and unequivocal renunciation of the lie that the election’s outcome was dubious. . . .

Now consider what else we’re seeing. Some Republicans are increasingly asserting a willingness to overturn future elections: Rep. Jody Hice’s primary challenge to the Georgia secretary of state is driven by the promise to use his power to invalidate future outcomes.

Other Republicans are asserting the freedom to keep alive the fiction that the election was stolen forever. In Arizona, a GOP-sponsored recount is underway [in hopes of] bolstering that false conclusion.

This combination is toxic: Republicans are untethering themselves from any obligation to recognize future legitimate election outcomes, which will provide the rationale to overturn them, a freedom they are also effectively in the process of appropriating. Cheney is insisting on a GOP future premised on a full repudiation of these tendencies, and getting punished for it.

Guess what: These same House Republicans might control the lower chamber when Congress is counting electors after the 2024 presidential election.

“We should start to very much worry about what Jan. 6, 2025, looks like,” Edward Foley, a renowned election law scholar and a Post contributing columnist, told me.

Imagine a 2024 election decided in one state, where a GOP-controlled legislature sends electors for the GOP candidate in defiance of a close popular vote. The same House Republicans who punished Cheney — many of whom already voted against President Biden’s electors, but now control the House and have continued radicalizing — could vote to certify that slate. . . .

This places burdens on Democrats. Democratic strategist Simon Rosenberg told me that this obliges Democrats to level with voters about the threat Republicans pose to democratic stability.

“If Cheney is ousted, Democrats will have to make the radicalization of the GOP a major part of the 2022 conversation,” Rosenberg said.

And as elections scholar Rick Hasen told me, Democrats should try to get patriotic Republicans to support revisions to the Electoral Count Act, to make it “harder for a legislature to send a separate slate when there was no problem with how the election was run.”

Cheney’s ouster should prompt this, along with a much greater public and media focus on the brute reality of the GOP’s fundamental turn away from democracy.

“The core component of the democratic process is that we count the votes as cast,” Foley told me. The punishing of Cheney, Foley concluded, suggests that the Republican Party might [might???] be institutionally “abandoning the very essence of democracy”.

Using the Legal System Against Facebook and Other Titans of the Internet

Two Democratic members of Congress are trying to stop big social media companies from doing so much damage:

Imagine clicking on a Facebook video alleging that a “deep-state cabal” of Satan-worshiping pedophiles stole the election from [a horrible person]. Moments later, your phone rings. The caller says, “Hey, it’s Freddie from Facebook. We noticed you just watched a cool video on our site, so we’ll send you a few dozen more videos about election-related conspiracy theories. As a bonus, we’ll connect you to some people who share your interest in ‘stopping the steal’. You guys should connect and explore your interest together!”

The scenario is, of course, made up. But it basically captures what social media platforms do every day. In the real world, “Freddie from Facebook” is not a person who calls you, but an algorithm that tracks you online, learns what content you spend the most time with and feeds you more of whatever maximizes your engagement — the time you spend on the platform. Greater engagement means that users see more ads, earning Facebook more revenue.

If you like cat videos, great; you’ll get an endless supply. But the same is true for the darkest content on the Web. Human nature being what it is, the content most likely to keep us glued to our screens is that which confirms our prejudices and triggers our basest emotions. Social media algorithms don’t have a conservative or liberal bias, but they know if we do. Their bias is to reinforce ours at the cost of making us more angry, anxious and afraid.

Facebook recently played down the role of its algorithms in exploiting users’ susceptibilities and enabling radicalization. The company says that users, not its product, are largely responsible for the extreme content showing up in their news feeds.

But Facebook knows how powerful its algorithms can be. In 2016, an internal Facebook study found that 64 percent of people who joined an extremist group on the platform did so only because its algorithm recommended it. Recently, a member of the Wolverine Watchmen, the militia accused of trying to kidnap Michigan Gov. Gretchen Whitmer (D), said he joined the group when it “popped up as a suggestion post” on Facebook because he interacted with pages supporting the Second Amendment.

Policymakers often focus on whether Facebook, YouTube and Twitter should take down hate speech and disinformation. This is important, but these questions are about putting out fires. The problem is that the product these companies make is flammable. It’s that their algorithms deliver to each of us what they think we want to hear, creating individually tailored realities for every American and often amplifying the same content they eventually might choose to take down.

In 1996, Congress passed Section 230 of the Communications Decency Act, which says that websites are not legally liable for content that users post (with some exceptions). While the law helped to enable the growth of the modern Internet economy, it was enacted 25 years ago when many of the challenges we currently face could not have been predicted. Large Internet platforms no longer function like community bulletin boards; instead, they use sophisticated, opaque algorithms to determine what content their users see. If companies such as Facebook push us to view certain posts or join certain groups, should they bear no responsibility if doing so leads to real-world violence?

We recently introduced a bill that would remove Section 230 protection from large social media companies if their algorithms amplify content that contributes to an act of terrorism or to a violation of civil rights statutes meant to combat extremist groups. Our bill would not force YouTube, Facebook or Twitter to censor or remove content. Instead, it would allow courts in cases involving extreme harm to consider victims’ arguments against the companies on the merits, as opposed to quickly tossing out lawsuits on Section 230 grounds as would happen today.

Liability would incentivize changes the companies know how to make. For example, last year Facebook tested a new system in which users rated posts on their news feeds as “good” or “bad” for the world. The algorithm then fed those users more content that they deemed good while demoting the bad. The experiment worked. The company’s engineers referred to the result as the “nicer news feed.” But there was one problem. The nicer news feed led to less time on Facebook (and thus less ad revenue), so the experiment died.

This is the fundamental issue: Engagement-based algorithms made social media giants some of the most lucrative companies on Earth. They won’t voluntarily change the underlying architecture of their networks if it threatens their bottom line. We must decide what’s more important: protecting their profits or our democracy.

Unquote.

The authors of the article are Rep. Tom Malinowski, who represents a traditionally Republican district in suburban New Jersey, and Rep. Anna Eshoo, who represents the part of California that includes Silicon Valley.
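
To make the mechanism the op-ed describes a little more concrete, here is a minimal, purely illustrative sketch in Python. Nothing in it is Facebook's actual code; the posts, scores and weights are invented. It contrasts a feed ranked only on predicted engagement with the kind of "good for the world" re-ranking the authors say the company tested and then shelved.

```python
# Purely illustrative: a toy "engagement-first" feed ranker of the kind the op-ed
# describes, plus a "good for the world" re-ranking like the experiment Facebook
# reportedly tried and dropped. All fields, scores and weights here are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # model's guess at time spent, clicks, shares (0 to 1)
    good_for_world: float        # hypothetical survey-based score, 0 (bad) to 1 (good)

def rank_by_engagement(posts: list[Post]) -> list[Post]:
    """Engagement-maximizing feed: the 'stickiest' content comes first,
    regardless of whether it is true, divisive, or harmful."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_nicer_feed(posts: list[Post], weight: float = 0.5) -> list[Post]:
    """Sketch of a 'nicer news feed': blend engagement with the survey-based
    'good for the world' score so divisive posts are demoted."""
    return sorted(
        posts,
        key=lambda p: (1 - weight) * p.predicted_engagement + weight * p.good_for_world,
        reverse=True,
    )

feed = [
    Post("Grandkids' first visit since March", 0.4, 0.9),
    Post("Outrage-bait conspiracy meme", 0.9, 0.1),
    Post("Local food bank needs volunteers", 0.3, 0.8),
]

print([p.text for p in rank_by_engagement(feed)])  # the conspiracy meme floats to the top
print([p.text for p in rank_nicer_feed(feed)])     # it sinks once "good for the world" counts
```

As the op-ed notes, the catch is that the second kind of ranking tends to lower time on the platform, which is exactly the quantity the first kind is built to maximize.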

Members of Congress Want Action from Facebook, YouTube and Twitter

Below is a press release from my congressman. It contains links to letters he and another member of Congress sent to the CEOs of three social media behemoths. The letters represent the view of dozens of representatives. Each letter is worth looking at, since each one highlights specific problems relating to the company in question:

Today, in the aftermath of the violent attack on the U.S. Capitol, Congressman Tom Malinowski (NJ-7) and Congresswoman Anna G. Eshoo (CA-18) sent letters to the CEOs of Facebook, YouTube, and Twitter urging the companies to address the fundamental design features of their social networks that facilitate the spread of extreme, radicalizing content to their users.

Representatives Malinowski and Eshoo, along with dozens of their colleagues, called on the companies to reexamine their policy of maximizing user engagement as the basis for algorithmic sorting and promotion of news and information, and to make permanent and platform-wide design changes to limit the spread of harmful, conspiratorial content.

The lawmakers note that the rioters who attacked the Capitol earlier this month were radicalized in part in digital echo chambers that these platforms designed, built, and maintained, and that the platforms are partially responsible for undermining our shared sense of objective reality, for intensifying fringe political beliefs, and for facilitating connections between extremists, leading some of them to commit real-world, physical violence.

To view the full text of the letters and their respective signers click on the links below.

  • Letter to Mark Zuckerberg, Facebook 
  • Letter to Susan Wojcicki and Sundar Pichai, YouTube; Alphabet/Google 
  • Letter to Jack Dorsey, Twitter 

“Social media platforms’ algorithms are designed to feed each of us increasingly hateful versions of what we already hate, and fearful versions of what we already fear, so that we stay glued to our screens for as long as possible. In this way, they regularly promote and recommend white supremacist, anti-Semitic, anti-government, and other conspiracy-oriented material to the very people who are most susceptible to it — some of whom just attacked our Capitol,” said Rep. Malinowski. “We are urging the CEOs of these large social media companies to make permanent and platform-wide changes to limit the frictionless spread of extreme, radicalizing content – something they’ve shown they are capable of doing but are consciously choosing not to.” 

“For years social media companies have allowed harmful disinformation to spread through their platforms, polluting the minds of the American people. Online disinformation is not just about removing bad content. I see it as largely a product design issue. The algorithmic amplification and recommendation systems that platforms employ spread content that’s evocative over what’s true,” said Rep. Eshoo. “The horrific damage to our democracy wrought on January 6th demonstrated how these social media platforms played a role in radicalizing and emboldening terrorists to attack our Capitol. These American companies must fundamentally rethink algorithmic systems that are at odds with democracy.”

Last Fall, Representatives Malinowski and Eshoo introduced the Protecting Americans from Dangerous Algorithms Act, legislation to hold large social media platforms accountable for their algorithmic amplification of harmful, radicalizing content that leads to offline violence.

Rep. Malinowski represents New Jersey’s 7th congressional district. . . . Rep. Eshoo represents California’s 18th congressional district, which includes much of Silicon Valley. . . .

Menace to Society

It’s excellent news that the federal government and almost fifty states are suing Facebook for being an illegal monopoly. Their aim is to break up the company. But it’s too bad Facebook management can’t be sued for being immoral creeps. They know how bad they are, just like the managers at cigarette companies who knew they were causing cancer and the oil company executives who knew decades ago they were destroying the climate.

This report is from the Daily Mail back in May (it’s a newspaper that specializes in less important topics — note the brief paragraphs):

Facebook researchers learnt as far back as 2016 that 64 percent of all extremist group joins are due to its own recommendations but executives . . . killed any efforts to fix the problem, according to sources.

Research at the social media giant in 2016 and again in 2018 unearthed a worrying trend linking the platform’s recommendations to extremist views on the site.

But despite researchers coming up with several different solutions to tackle the problem of extremism, no action was taken.

People familiar with the matter have told The Wall Street Journal that the move to dismiss the recommendations was largely down to Facebook VP for policy and former George W. Bush administration official Joel Kaplan, who famously threw Brett Kavanaugh a party when he was appointed Supreme Court Justice in the middle of sexual assault allegations in 2018 . . . .

In 2016, the company carried out research that found there was a worryingly high proportion of extremist content and groups on the platform.

Facebook researcher and sociologist Monica Lee wrote in a presentation at the time that there was an abundance of extremist and racist content in over a third of large German political Facebook groups.

The presentation states ‘64% of all extremist group joins are due to our recommendation tools.’

Most of the joining activity came from the platform’s ‘Groups You Should Join’ and ‘Discover’ algorithms, she found, meaning: ‘Our recommendation systems grow the problem.’

Facebook then launched new research in 2017 looking at how its social media platform polarized the views of its users.

The project was headed up by Facebook’s then-chief product officer Chris Cox who led the task force known as ‘Common Ground’.

It revealed the social media platform was fueling conflict among its users and increasing extremist views.

It also showed that bad behavior among users came from the small groups of people with the most extreme views, with more accounts on the far-right than far-left in the US.

The concerning findings were released in an internal presentation the following year.

‘Our algorithms exploit the human brain’s attraction to divisiveness,’ a slide from the 2018 presentation read.

‘If left unchecked,’ it warned, Facebook would feed users ‘more and more divisive content in an effort to gain user attention and increase time on the platform.’

Cox and his team offered up several solutions to the problem, including building a system for digging out extreme content and suppressing clickbait around politics.

Another initiative called ‘Sparing Sharing’ involved reducing the spread of content by what it called ‘hyperactive users’ – who are highly active on the platform and show extreme views on either the left or the right, the sources told the Journal.

But the efforts – and the research – were reportedly blocked by senior executives including founder Mark Zuckerberg and Kaplan.

According to sources, Kaplan killed any attempts to change the platform, branding the move ‘paternalistic’ and citing concerns that they would mainly impact right-wing social media users, the Journal reported.

Unquote.

Facebook has become a big part of the right-wing media machine, partly because the company was criticized for being unfair to right-wingers. In response to that criticism, it hired Republican executives to make sure right-wing lies and conspiracy theories weren’t interfered with and were, in fact, promoted, as the report above shows. Thus, from The Guardian last month:

Since election day, 16 of the top 20 public Facebook posts that include the word “election” feature false or misleading information casting doubt on the election in favor of Txxxx, according to a Guardian analysis of posts with the most interactions using CrowdTangle, a Facebook-owned analytics tool. Of those, 13 are posts by the president’s own page, one is a direct quote from Txxxx published by Fox News, one is by the rightwing evangelical Christian Franklin Graham, and the last is the Newsmax Higbie video [“a laundry list of false and debunked claims casting doubt on the outcome of the presidential election”].

The four posts that do not include misinformation are congratulatory messages by Barack Obama and Michelle Obama for Biden and Kamala Harris and two posts by Graham, including a request for prayers for Txxxx and a remembrance by Graham of his father, the conservative televangelist Billy Graham.
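
The Guardian’s method is, at bottom, simple counting, and a rough sketch of it fits in a few lines. The rows below are invented and this is not the real CrowdTangle API; an actual analysis would start from an export of public posts with their interaction counts.

```python
# Hypothetical sketch of the Guardian-style analysis; the rows are made up.
posts = [
    {"page": "Example Page A", "text": "The election was stolen!", "interactions": 1_200_000, "misinfo": True},
    {"page": "Example Page B", "text": "Congratulations on winning the election", "interactions": 900_000, "misinfo": False},
    {"page": "Example Page C", "text": "Election fraud caught on camera?!", "interactions": 650_000, "misinfo": True},
]

# keep posts mentioning "election", rank by interactions, inspect the top 20
election_posts = [p for p in posts if "election" in p["text"].lower()]
top = sorted(election_posts, key=lambda p: p["interactions"], reverse=True)[:20]
flagged = sum(p["misinfo"] for p in top)
print(f"{flagged} of the top {len(top)} 'election' posts are flagged as false or misleading")
```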

Escaping Facebook Hell

Charlie Warzel of The New York Times monitored two average people’s Facebook feeds. It was as bad as you’d expect, but his article suggests solutions to the problem:

In mid-October I asked two people I’d never met to give me their Facebook account passwords for three weeks leading up to and after Election Day. I wanted to immerse myself in the feeds of a type of person who has become a trope of sorts in our national discussion about politics and disinformation: baby boomers with an attachment to polarizing social media.

I went looking for older Americans — not full-blown conspiracy theorists, trolls or partisan activists — whose news consumption has increased sharply in the last few years on Facebook. Neither of the two people I settled on described themselves as partisans. Both used to identify as conservatives slowly drifting leftward until Dxxxx Txxxx’s takeover of the Republican Party offered a final push. Both voted for Joe Biden this year in part because of his promise to reach across the aisle. Both bemoaned the toxicity of our current politics.

Every day, Jim Young, 62, opens up his Facebook app and heads into an information hellscape. His news feed is a dizzying mix of mundane middle-class American life and high-octane propaganda.

Here’s a sample:

A set of adoring grandparents posing with rosy-cheeked babies. “Mimi and Pop Pop’s first visit since March,” the post reads.

Next, a meme of Joe Biden next to a photoshopped “for sale” sign. “For more information contact Hunter,” the sign reads.

After that is a post advertising a “Funny rude” metal sign displaying a unicorn in a tutu giving the middle finger. “Thought of you,” the post reads.

Below that is a screenshot of a meme created by the pro-Txxxx group Turning Points USA. “Your city on socialism,” the post reads, displaying a series of photos of abandoned buildings, empty grocery store shelves and bleeding men in makeshift, dirty hospital beds.

The feed goes on like this — an infinite scroll of content without context. Touching family moments are interspersed with Bible quotes that look like Hallmark cards, hyperpartisan fearmongering and conspiratorial misinformation. Mr. Young’s news feed is, in a word, a nightmare. I know because I spent the last three weeks living inside it.

Despite Facebook’s reputation as a leading source for conspiracy theories and misinformation, what goes on in most average Americans’ news feeds is nearly impossible for outsiders to observe. . . .

After years of reading about the ways that Facebook is radicalizing and polarizing people I wanted to see it for myself — not in the aggregate, but up close and over time. What I observed is a platform that gathered our past and present friendships, colleagues, acquaintances and hobbies and slowly turned them into primary news sources. And made us miserable in the process. . . .

Mr. Young joined Facebook in 2008 as a way to reconnect with his high school classmates from Illinois. He reunited quickly with old friends and neighbors. It was exciting to see how people had changed. . . .

It was a little voyeuristic, nostalgic and harmless fun. Before 2016, Mr. Young told me, he’d see the occasional heated disagreement. It wasn’t until the last few years that his feed really started to turn divisive.

He first noticed it in the comments, where discussions that would usually end in some version of “agree to disagree” exploded into drawn-out, conspiratorial comment threads. Political disagreements started to read like dispatches from an alternate reality. He didn’t enjoy fact-checking his friends or picking fights, but when a post appeared obviously untrue he had to say something.

His time on the site ticked upward.

“It’s like going by a car wreck. You don’t want to look, but you have to,” he said. He believes his feed is a perfect storm for conflict in part because he’s lived in both liberal and conservative areas of the country and throughout his life he’s lived, worked with and befriended all manner of liberals and conservatives. . . .

But then he noticed some of his friends start to post more political memes, often with no link or citation. When he’d try to verify one, he’d realize the post was fake or debunked by a news site. “Most times there’s no real debate. Just anger. They’re so closed-minded. Sometimes, it scares me.”

Scrolling through Mr. Young’s feed after Election Day, I found a number of these posts.

Many examples of misinformation came from Facebook text posts created and shared by Mr. Young’s friends repeating baseless voter-fraud claims, [for example, one claiming] the number of votes in Wisconsin exceeded the number of registered voters (with no links to these numbers or any authoritative news source).

On Nov. 5, one of Mr. Young’s friends posted about “something fishy” alongside a link to a Bing search. The link returned a page of information about voters having ballots thrown out after using Sharpies to fill them out, including a link to a Facebook post on #Sharpiegate with over 137,000 shares.

One featured a screenshot from a Fox 2 Detroit news broadcast with the banner “Detroit Voter Roll Lawsuit.” The screenshot alleged potential voter fraud. “And so it begins!” the friend wrote. According to a Snopes debunk, the segment actually aired in December 2019 and had nothing to do with the 2020 election.

Another text post suggested that people shouldn’t trust Facebook’s fact checkers. “When the fact checkers are controlled by the same people doing the lying, what do you call it?” the post read. Below, commenters sounded off. “Democrats,” one exclaimed. . . .

Mr. Young’s feed stood in stark contrast to the other Facebook account I spent time in. That feed belongs to Karen Pierce, a 55-year-old schoolteacher from Virginia. Ms. Pierce described herself to me as a “middle-child peacekeeper who is uncomfortable with politics.”

Unlike Mr. Young, she is not politically active on Facebook and never intervenes, even when she sees things she thinks might be conspiratorial or fake. As a result, her feed surfaced less politically charged content. The day after the election, the first post I noticed from a friend in her feed was a simple, apolitical exclamation: “It’s official! I make a damn good pot of stew!”

The political posts that appeared in Ms. Pierce’s feed were mostly anodyne statements of support for the Biden-Harris campaign peppered in between comments from fellow teachers frustrated by remote learning and an avalanche of cute dog photos and memes. Occasionally, a meme popped up mentioning Hunter Biden’s laptop, but most lacked the vitriol or the contentious commenter debates of Mr. Young’s feed.

Yet, in my conversations with Ms. Pierce over the last month, she expressed just as much frustration with her experience on Facebook as Mr. Young. “It’s so extreme,” she told me in mid-October. “I’ve watched people go from debating the issue to coming up with the craziest thing they can say to get attention. Take the whole anti-abortion debate. People started talking, then started saying ‘if you vote for Biden you’re a murderer.’ Now there’s people posting graphic pictures of fetuses.”

When I told her I hadn’t seen anything that extreme on her page, she suggested it was because of a three-month break she took from the platform this summer. “It got to be too much with the pandemic and the politics,” she said. The final straw was seeing people in her feed post QAnon-adjacent memes and content. “There was a lot of calling Biden a pedophile. Or Txxxx voters posting pictures with assault rifles. It made me very uncomfortable.”

Like millions of Americans, Ms. Pierce logs onto Facebook to feel more connected. “I use it to see how people are doing,” she said. “I believe in prayer and sometimes I check to see who is struggling and to see who to pray for. And then, of course, you see some news and read some articles.”

It was when she was using the platform for news that she started seeing disturbing, conspiracy posts from people in her network. “It was so disappointing to realize the hate that’s out there,” she said. . . .

She’s worried about the long-term effects of such a toxic environment. “I think it’s affecting the mood of everybody.”

Living inside the Facebook account of strangers — even with their permission — feels invasive, like poking around in their medicine cabinet. But it offered me a unique perspective. Two things stood out. The first is the problem of comments, where strangers, even in the most mundane of articles, launched into intense, acrimonious infighting. In most cases, commenters bypassed argumentation for convenient name-calling or escalated a civil discussion by posting contextless claims with no links or source. In many cases, it appeared that a post from one user would get shared by a friend into his or her network, where it would [attract] strangers.

The more I scrolled through them, the more comments felt like a central and intractable issue. Unlike links to outside articles, comments aren’t subject to third-party fact checks or outside moderation. They are largely invisible to those people who study or attempt to police the platform.

Yet in my experience they were a primary source of debunked claims, harassment and divisive rhetoric. I showed one comment thread to a colleague who doesn’t use Facebook and my colleague found it shocking. “Facebook created a town hall for fighting,” they said. “It’s almost like if you were building a machine to make a country divisive and extreme — if you were to sit down and plan what that would look like — it would be this.”

[Facebook’s] evolution, from a friendly social networking site into the world’s largest information platform, is the source of its biggest problems.

Sifting through Mr. Young and Ms. Pierce’s feeds and talking to them about what I saw, it became clear that the two found themselves tormented as a result of decisions they made in their early days on the platform. Both explained that they joined to reconnect with old friends.

Like most of us, they gave little thought to the connections they made. Mr. Young added friends he hadn’t spoken to in decades. When Ms. Pierce joined a nonprofit organization she accepted dozens of friend requests — some from people she’d met only in passing. “I meet people on airplanes all the time and we exchange Facebook handles,” she told me.

But as Facebook evolved, these weak connections became unlikely information nodes. Mr. Young and Ms. Pierce were now getting their commentary from people they hardly knew, whose politics had once been unknown or illegible.

“When Facebook first started it made me feel so good. It feels like I signed up for one thing and it’s become something totally different,” Ms. Pierce said. . . .

Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy . . . , described this phenomenon as what happens when “social-networking sites transformed into social media,” creating “a digital economy built on engagement.” Dr. Donovan argues that this decision spawned the algorithmic echo chambers we now live in and created a fertile environment for our information crisis.

For Mr. Young, the fallout of these decisions is painful. After weeks of watching his feed, I presented him with some of the most notorious posters in his feed. When I read aloud the name of one Facebook friend who constantly shared debunked claims, often with language intended to provoke, he sighed. He described the person as a longtime friend and neighbor who was once so close they practically lived at each other’s houses. Now, he spends his time debating whether it’s worth the energy to stop the friend from sharing conspiracy theories. . . .

The psychological toll of watching friends lose touch with reality has both Mr. Young and Ms. Pierce re-evaluating their choice to spend so much time on the platform. Mr. Young, for his part, tried to stay off during election week; Ms. Pierce is heartened that her feed has become less toxic after her Facebook sabbatical and is planning another. “My emotional and mental state improves greatly the further away I get from this place,” she told me.

Even if both manage to stay away from Facebook for good, their stories are just two in a sea of billions. No story is the same because no feed is the same. And yet these same dynamics that tortured my two participants — a sea of contextless news and acrimonious comments revealing their neighbors’ worst selves — are on display for millions of Americans every day. . . .

Unquote.

So what can be done?

  1. CLOSE YOUR FACEBOOK ACCOUNT. IT’S THE EASIEST AND MOST EFFECTIVE SOLUTION.
  2. UNFOLLOW EVERYONE YOU AREN’T CLOSE TO OR WHO SENDS YOU CRAP.
  3. DON’T READ THE COMMENTS, UNLESS THE SUBJECT IS CATS OR DOGS.

One thing Facebook could do is close the accounts of the people whose lies are shared the most. Researchers have found that a small group of social media accounts are responsible for the spread of a disproportionate amount of false information [New York Times].
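
That “small group, outsized share” finding is just arithmetic once flagged shares are attributed to the accounts that originated them. Here is a hypothetical sketch (the accounts and counts are invented) of how such a concentration figure might be computed:

```python
# Hypothetical: how concentrated is misinformation sharing among a few accounts?
from collections import Counter

# flagged shares attributed to each originating account (numbers are made up)
misinfo_shares = Counter({
    "account_A": 40_000,
    "account_B": 25_000,
    "account_C": 15_000,
    "account_D": 500,
    "account_E": 300,
    "account_F": 200,
})

total = sum(misinfo_shares.values())
top = misinfo_shares.most_common(3)
top_share = sum(count for _, count in top) / total
print(f"The top {len(top)} accounts produce {top_share:.0%} of flagged shares")
```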

But since Facebook has no morality and Republicans revel in the lying, see the list above, especially item 1.