Charlie Warzel of The New York Times monitored two average people’s Facebook feeds. It was as bad as you’d expect, but his article suggests solutions to the problem:
In mid-October I asked two people I’d never met to give me their Facebook account passwords for three weeks leading up to and after Election Day. I wanted to immerse myself in the feeds of a type of person who has become a trope of sorts in our national discussion about politics and disinformation: baby boomers with an attachment to polarizing social media.
I went looking for older Americans — not full-blown conspiracy theorists, trolls or partisan activists — whose news consumption has increased sharply in the last few years on Facebook. Neither of the two people I settled on described themselves as partisans. Both used to identify as conservatives and had been slowly drifting leftward until Dxxxx Txxxx’s takeover of the Republican Party offered a final push. Both voted for Joe Biden this year in part because of his promise to reach across the aisle. Both bemoaned the toxicity of our current politics.
Every day, Jim Young, 62, opens up his Facebook app and heads into an information hellscape. His news feed is a dizzying mix of mundane middle-class American life and high-octane propaganda.
Here’s a sample:
A set of adoring grandparents posing with rosy-cheeked babies. “Mimi and Pop Pop’s first visit since March,” the post reads.
Next, a meme of Joe Biden next to a photoshopped “for sale” sign. “For more information contact Hunter,” the sign reads.
After that is a post advertising a “Funny rude” metal sign displaying a unicorn in a tutu giving the middle finger. “Thought of you,” the post reads.
Below that is a screenshot of a meme created by the pro-Txxxx group Turning Point USA. “Your city on socialism,” the post reads, displaying a series of photos of abandoned buildings, empty grocery store shelves and bleeding men in makeshift, dirty hospital beds.
The feed goes on like this — an infinite scroll of content without context. Touching family moments are interspersed with Bible quotes that look like Hallmark cards, hyperpartisan fearmongering and conspiratorial misinformation. Mr. Young’s news feed is, in a word, a nightmare. I know because I spent the last three weeks living inside it.
Despite Facebook’s reputation as a leading source for conspiracy theories and misinformation, what goes on in most average Americans’ news feeds is nearly impossible for outsiders to observe. . . .
After years of reading about the ways that Facebook is radicalizing and polarizing people I wanted to see it for myself — not in the aggregate, but up close and over time. What I observed is a platform that gathered our past and present friendships, colleagues, acquaintances and hobbies and slowly turned them into primary news sources. And made us miserable in the process. . . .
Mr. Young joined Facebook in 2008 as a way to reconnect with his high school classmates from Illinois. He reunited quickly with old friends and neighbors. It was exciting to see how people had changed. . . .
It was a little voyeuristic, nostalgic and harmless fun. Before 2016, Mr. Young told me, he’d see the occasional heated disagreement. It wasn’t until the last few years that his feed really started to turn divisive.
He first noticed it in the comments, where discussions that would usually end in some version of “agree to disagree” exploded into drawn-out, conspiratorial comment threads. Political disagreements started to read like dispatches from an alternate reality. He didn’t enjoy fact-checking his friends or picking fights, but when a post appeared obviously untrue he had to say something.
His time on the site ticked upward.
“It’s like going by a car wreck. You don’t want to look, but you have to,” he said. He believes his feed is a perfect storm for conflict in part because he’s lived in both liberal and conservative areas of the country, and throughout his life he’s lived alongside, worked with and befriended all manner of liberals and conservatives. . . .
But then he noticed some of his friends starting to post more political memes, often with no link or citation. When he’d try to verify one, he’d realize the post was fake or had been debunked by a news site. “Most times there’s no real debate. Just anger. They’re so closed-minded. Sometimes, it scares me.”
Scrolling through Mr. Young’s feed after Election Day, I found a number of these posts.
Many examples of misinformation came from Facebook text posts created and shared by Mr. Young’s friends repeating baseless voter-fraud claims, [for example, one claiming] the number of votes in Wisconsin exceeded the number of registered voters (with no links to these numbers or any authoritative news source).
On Nov. 5, one of Mr. Young’s friends posted about “something fishy” alongside a link to a Bing search. The link returned a page of information about voters having ballots thrown out after using Sharpies to fill them out, including a link to a Facebook post on #Sharpiegate with over 137,000 shares.
One featured a screenshot from a Fox 2 Detroit news broadcast with the banner “Detroit Voter Roll Lawsuit.” The screenshot alleged potential voter fraud. “And so it begins!” the friend wrote. According to a Snopes debunk, the segment actually aired in December 2019 and had nothing to do with the 2020 election.
Another text post suggested that people shouldn’t trust Facebook’s fact checkers. “When the fact checkers are controlled by the same people doing the lying, what do you call it?” the post read. Below, commenters sounded off. “Democrats,” one exclaimed. . . .
Mr. Young’s feed stood in stark contrast to the other Facebook account I spent time in. That feed belongs to Karen Pierce, a 55-year-old schoolteacher from Virginia. Ms. Pierce described herself to me as a “middle-child peacekeeper who is uncomfortable with politics.”
Unlike Mr. Young, she is not politically active on Facebook and never intervenes, even when she sees things she thinks might be conspiratorial or fake. As a result, her feed surfaced less politically charged content. The day after the election, the first post I noticed from a friend in her feed was a simple, apolitical exclamation: “It’s official! I make a damn good pot of stew!”
The political posts that appeared in Ms. Pierce’s feed were mostly anodyne statements of support for the Biden-Harris campaign peppered in between comments from fellow teachers frustrated by remote learning and an avalanche of cute dog photos and memes. Occasionally, a meme popped up mentioning Hunter Biden’s laptop, but most lacked the vitriol or the contentious commenter debates of Mr. Young’s feed.
Yet, in my conversations with Ms. Pierce over the last month, she expressed just as much frustration with her experience on Facebook as Mr. Young did. “It’s so extreme,” she told me in mid-October. “I’ve watched people go from debating the issue to coming up with the craziest thing they can say to get attention. Take the whole anti-abortion debate. People started talking, then started saying ‘if you vote for Biden you’re a murderer.’ Now there’s people posting graphic pictures of fetuses.”
When I told her I hadn’t seen anything that extreme on her page, she suggested it was because of a three-month break she took from the platform this summer. “It got to be too much with the pandemic and the politics,” she said. The final straw was seeing people in her feed post QAnon adjacent memes and content. “There was a lot of calling Biden a pedophile. Or Txxxx voters posting pictures with assault rifles. It made me very uncomfortable.”
Like millions of Americans, Ms. Pierce logs onto Facebook to feel more connected. “I use it to see how people are doing,” she said. “I believe in prayer and sometimes I check to see who is struggling and to see who to pray for. And then, of course, you see some news and read some articles.”
It was when she was using the platform for news that she started seeing disturbing, conspiracy posts from people in her network. “It was so disappointing to realize the hate that’s out there,” she said. . . .
She’s worried about the long-term effects of such a toxic environment. “I think it’s affecting the mood of everybody.”
Living inside the Facebook account of strangers — even with their permission — feels invasive, like poking around in their medicine cabinet. But it offered me a unique perspective. Two things stood out. The first is the problem of comments, where strangers, even in the most mundane of articles, launched into intense, acrimonious infighting. In most cases, commenters bypassed argumentation for convenient name-calling or escalated a civil discussion by posting contextless claims with no links or source. In many cases, it appeared that a post from one user would get shared by a friend into his or her network, where it would [attract] strangers.
The more I scrolled through them, the more comments felt like a central and intractable issue. Unlike links to outside articles, comments aren’t subject to third-party fact checks or outside moderation. They are largely invisible to those people who study or attempt to police the platform.
Yet in my experience they were a primary source of debunked claims, harassment and divisive rhetoric. I showed one comment thread to a colleague who doesn’t use Facebook, and they found it shocking. “Facebook created a town hall for fighting,” they said. “It’s almost like if you were building a machine to make a country divisive and extreme — if you were to sit down and plan what that would look like — it would be this.”
[Facebook’s] evolution, from a friendly social networking site into the world’s largest information platform, is the source of its biggest problems.
Sifting through Mr. Young and Ms. Pierce’s feeds and talking to them about what I saw, it became clear that the two found themselves tormented as a result of decisions they made in their early days on the platform. Both explained that they joined to reconnect with old friends.
Like most of us, they gave little thought to the connections they made. Mr. Young added friends he hadn’t spoken to in decades. When Ms. Pierce joined a nonprofit organization she accepted dozens of friend requests — some from people she’d met only in passing. “I meet people on airplanes all the time and we exchange Facebook handles,” she told me.
But as Facebook evolved, these weak connections became unlikely information nodes. Mr. Young and Ms. Pierce were now getting their commentary from people they hardly knew, whose politics had once been unknown or illegible.
“When Facebook first started it made me feel so good. It feels like I signed up for one thing and it’s become something totally different,” Ms. Pierce said. . . .
Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy . . . , described this phenomenon as what happens when “social-networking sites transformed into social media,” creating “a digital economy built on engagement.” Dr. Donovan argues that this decision spawned the algorithmic echo chambers we now live in and created a fertile environment for our information crisis.
For Mr. Young, the fallout of these decisions is painful. After weeks of watching his feed, I presented him with some of its most notorious posters. When I read aloud the name of one Facebook friend who constantly shared debunked claims, often with language intended to provoke, he sighed. He described the person as a longtime friend and neighbor who was once so close they practically lived at each other’s houses. Now, he spends his time debating whether it’s worth the energy to stop the friend from sharing conspiracy theories. . . .
The psychological toll of watching friends lose touch with reality has both Mr. Young and Ms. Pierce re-evaluating their choice to spend so much time on the platform. Mr. Young, for his part, tried to stay off during election week; Ms. Pierce is heartened that her feed has become less toxic after her Facebook sabbatical and is planning another. “My emotional and mental state improves greatly the further away I get from this place,” she told me.
Even if both manage to stay away from Facebook for good, their stories are just two in a sea of billions. No story is the same because no feed is the same. And yet these same dynamics that tortured my two participants — a sea of contextless news and acrimonious comments revealing their neighbors’ worst selves — are on display for millions of Americans every day. . . .
So what can be done?
- CLOSE YOUR FACEBOOK ACCOUNT. IT’S THE EASIEST AND MOST EFFECTIVE SOLUTION.
- UNFOLLOW EVERYONE YOU AREN’T CLOSE TO OR WHO SENDS YOU CRAP.
- DON’T READ THE COMMENTS, UNLESS THE SUBJECT IS CATS OR DOGS.
One thing Facebook could do is close the accounts of the people whose lies are shared the most. Researchers have found that a small group of social media accounts are responsible for the spread of a disproportionate amount of false information [New York Times].
But since Facebook has no morality and Republicans revel in the lying, see the list above, especially item 1.