Charlie Warzel of The New York Times monitored two average people’s Facebook feeds. It was as bad as you’d expect, but his article suggests solutions to the problem:
In mid-October I asked two people I'd never met to give me their Facebook account passwords for three weeks leading up to and after Election Day. I wanted to immerse myself in the feeds of a type of person who has become a trope of sorts in our national discussion about politics and disinformation: baby boomers with an attachment to polarizing social media.
I went looking for older Americans (not full-blown conspiracy theorists, trolls or partisan activists) whose news consumption has increased sharply in the last few years on Facebook. Neither of the two people I settled on described themselves as partisans. Both once identified as conservatives and had been drifting slowly leftward, until Dxxxx Txxxx's takeover of the Republican Party offered a final push. Both voted for Joe Biden this year in part because of his promise to reach across the aisle. Both bemoaned the toxicity of our current politics.
Every day, Jim Young, 62, opens up his Facebook app and heads into an information hellscape. His news feed is a dizzying mix of mundane middle-class American life and high-octane propaganda.
Here's a sample:
A set of adoring grandparents posing with rosy-cheeked babies. "Mimi and Pop Pop's first visit since March," the post reads.
Next, a meme of Joe Biden next to a photoshopped "for sale" sign. "For more information contact Hunter," the sign reads.
After that is a post advertising a "Funny rude" metal sign displaying a unicorn in a tutu giving the middle finger. "Thought of you," the post reads.
Below that is a screenshot of a meme created by the pro-Txxxx group Turning Point USA. "Your city on socialism," the post reads, displaying a series of photos of abandoned buildings, empty grocery store shelves and bleeding men in makeshift, dirty hospital beds.
The feed goes on like this: an infinite scroll of content without context. Touching family moments are interspersed with Bible quotes that look like Hallmark cards, hyperpartisan fearmongering and conspiratorial misinformation. Mr. Young's news feed is, in a word, a nightmare. I know because I spent the last three weeks living inside it.
Despite Facebook's reputation as a leading source for conspiracy theories and misinformation, what goes on in most average Americans' news feeds is nearly impossible for outsiders to observe. . . .
After years of reading about the ways that Facebook is radicalizing and polarizing people, I wanted to see it for myself: not in the aggregate, but up close and over time. What I observed is a platform that gathered our past and present friendships, colleagues, acquaintances and hobbies and slowly turned them into primary news sources. And made us miserable in the process. . . .
Mr. Young joined Facebook in 2008 as a way to reconnect with his high school classmates from Illinois. He reunited quickly with old friends and neighbors. It was exciting to see how people had changed. . . .
It was a little voyeuristic, nostalgic and harmless fun. Before 2016, Mr. Young told me, he'd see the occasional heated disagreement. It wasn't until the last few years that his feed really started to turn divisive.
He first noticed it in the comments, where discussions that would usually end in some version of "agree to disagree" exploded into drawn-out, conspiratorial comment threads. Political disagreements started to read like dispatches from an alternate reality. He didn't enjoy fact-checking his friends or picking fights, but when a post appeared obviously untrue, he had to say something.
His time on the site ticked upward.
"It's like going by a car wreck. You don't want to look, but you have to," he said. He believes his feed is a perfect storm for conflict, in part because he has lived in both liberal and conservative areas of the country and, throughout his life, has lived alongside, worked with and befriended all manner of liberals and conservatives. . . .
But then he noticed some of his friends starting to post more political memes, often with no link or citation. When he'd try to verify one, he'd realize the post was fake or had been debunked by a news site. "Most times there's no real debate. Just anger. They're so closed-minded. Sometimes, it scares me."
Scrolling through Mr. Young's feed after Election Day, I found a number of these posts.
Many examples of misinformation came from Facebook text posts created and shared by Mr. Young's friends repeating baseless voter-fraud claims, [for example, one claiming] the number of votes in Wisconsin exceeded the number of registered voters (with no links to these numbers or any authoritative news source).
On Nov. 5, one of Mr. Young's friends posted about "something fishy" alongside a link to a Bing search. The link returned a page of information about voters having ballots thrown out after using Sharpies to fill them out, including a link to a Facebook post on #Sharpiegate with over 137,000 shares.
One featured a screenshot from a Fox 2 Detroit news broadcast with the banner "Detroit Voter Roll Lawsuit." The screenshot alleged potential voter fraud. "And so it begins!" the friend wrote. According to a Snopes debunk, the segment actually aired in December 2019 and had nothing to do with the 2020 election.
Another text post suggested that people shouldn't trust Facebook's fact checkers. "When the fact checkers are controlled by the same people doing the lying, what do you call it?" the post read. Below, commenters sounded off. "Democrats," one exclaimed. . . .
Mr. Young's feed stood in stark contrast to the other Facebook account I spent time in. That feed belongs to Karen Pierce, a 55-year-old schoolteacher from Virginia. Ms. Pierce described herself to me as a "middle-child peacekeeper who is uncomfortable with politics."
Unlike Mr. Young, she is not politically active on Facebook and never intervenes, even when she sees things she thinks might be conspiratorial or fake. As a result, her feed surfaced less politically charged content. The day after the election, the first post I noticed from a friend in her feed was a simple, apolitical exclamation: "It's official! I make a damn good pot of stew!"
The political posts that appeared in Ms. Pierce's feed were mostly anodyne statements of support for the Biden-Harris campaign, peppered in between comments from fellow teachers frustrated by remote learning and an avalanche of cute dog photos and memes. Occasionally, a meme popped up mentioning Hunter Biden's laptop, but most lacked the vitriol or the contentious commenter debates of Mr. Young's feed.
Yet, in my conversations with Ms. Pierce over the last month, she expressed just as much frustration with her experience on Facebook as Mr. Young did. "It's so extreme," she told me in mid-October. "I've watched people go from debating the issue to coming up with the craziest thing they can say to get attention. Take the whole anti-abortion debate. People started talking, then started saying 'if you vote for Biden you're a murderer.' Now there's people posting graphic pictures of fetuses."
When I told her I hadn't seen anything that extreme on her page, she suggested it was because of a three-month break she took from the platform this summer. "It got to be too much with the pandemic and the politics," she said. The final straw was seeing people in her feed post QAnon-adjacent memes and content. "There was a lot of calling Biden a pedophile. Or Txxxx voters posting pictures with assault rifles. It made me very uncomfortable."
Like millions of Americans, Ms. Pierce logs onto Facebook to feel more connected. "I use it to see how people are doing," she said. "I believe in prayer and sometimes I check to see who is struggling and to see who to pray for. And then, of course, you see some news and read some articles."
It was when she was using the platform for news that she started seeing disturbing, conspiratorial posts from people in her network. "It was so disappointing to realize the hate that's out there," she said. . . .
She's worried about the long-term effects of such a toxic environment. "I think it's affecting the mood of everybody."
Living inside the Facebook accounts of strangers, even with their permission, feels invasive, like poking around in their medicine cabinet. But it offered me a unique perspective. Two things stood out. The first is the problem of comments, where strangers, even on the most mundane of articles, launched into intense, acrimonious infighting. In most cases, commenters bypassed argumentation for convenient name-calling or escalated a civil discussion by posting contextless claims with no links or sources. In many cases, it appeared that a post from one user would get shared by a friend into his or her network, where it would [attract] strangers.
The more I scrolled through them, the more comments felt like a central and intractable issue. Unlike links to outside articles, comments aren't subject to third-party fact checks or outside moderation. They are largely invisible to those who study or attempt to police the platform.
Yet in my experience they were a primary source of debunked claims, harassment and divisive rhetoric. I showed one comment thread to a colleague who doesn't use Facebook, and they found it shocking. "Facebook created a town hall for fighting," they said. "It's almost like if you were building a machine to make a country divisive and extreme, if you were to sit down and plan what that would look like, it would be this."
[Facebook's] evolution from a friendly social networking site into the world's largest information platform is the source of its biggest problems.
As I sifted through Mr. Young's and Ms. Pierce's feeds and talked to them about what I saw, it became clear that the two found themselves tormented as a result of decisions they made in their early days on the platform. Both explained that they joined to reconnect with old friends.
Like most of us, they gave little thought to the connections they made. Mr. Young added friends he hadn't spoken to in decades. When Ms. Pierce joined a nonprofit organization, she accepted dozens of friend requests, some from people she'd met only in passing. "I meet people on airplanes all the time and we exchange Facebook handles," she told me.
But as Facebook evolved, these weak connections became unlikely information nodes. Mr. Young and Ms. Pierce were now getting their commentary from people they hardly knew, whose politics had once been unknown or illegible.
"When Facebook first started it made me feel so good. It feels like I signed up for one thing and it's become something totally different," Ms. Pierce said. . . .
Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy . . . , described this phenomenon as what happens when "social-networking sites transformed into social media," creating "a digital economy built on engagement." Dr. Donovan argues that this decision spawned the algorithmic echo chambers we now live in and created a fertile environment for our information crisis.
For Mr. Young, the fallout of these decisions is painful. After weeks of watching, I presented him with some of the most notorious posters in his feed. When I read aloud the name of one Facebook friend who constantly shared debunked claims, often with language intended to provoke, he sighed. He described the person as a longtime friend and neighbor who was once so close they practically lived at each other's houses. Now, he spends his time debating whether it's worth the energy to stop the friend from sharing conspiracy theories. . . .
The psychological toll of watching friends lose touch with reality has both Mr. Young and Ms. Pierce re-evaluating their choice to spend so much time on the platform. Mr. Young, for his part, tried to stay off during election week; Ms. Pierce is heartened that her feed has become less toxic after her Facebook sabbatical and is planning another. "My emotional and mental state improves greatly the further away I get from this place," she told me.
Even if both manage to stay away from Facebook for good, their stories are just two in a sea of billions. No story is the same because no feed is the same. And yet the same dynamics that tortured my two participants, a sea of contextless news and acrimonious comments revealing their neighbors' worst selves, are on display for millions of Americans every day. . . .
Unquote.
So what can be done?
- CLOSE YOUR FACEBOOK ACCOUNT. IT’S THE EASIEST AND MOST EFFECTIVE SOLUTION.
- UNFOLLOW EVERYONE YOU AREN’T CLOSE TO OR WHO SENDS YOU CRAP.
- DON’T READ THE COMMENTS, UNLESS THE SUBJECT IS CATS OR DOGS.
One thing Facebook could do is close the accounts of the people whose lies are shared the most. Researchers have found that a small group of social media accounts are responsible for the spread of a disproportionate amount of false information [New York Times].
But since Facebook has no morality and Republicans revel in the lying, see the list above, especially item 1.