Harry Truman’s Healthcare Plan and Our Current Sorry State

What President Truman tried to do and where we are today, by David Oshinsky for The New York Review of Books:

“Bow your heads, folks, conservatism has hit America,” The New Republic lamented following the 1946 elections. “All the rest of the world is moving Left, America is moving Right.” Having dominated both houses of Congress throughout President Franklin Roosevelt’s three-plus terms in office (1933–1945), Democrats lost their majorities in a blowout. Some blamed it on the death of FDR, others on the emerging Soviet threat or the bumpy return to civilian life following World War II. The incoming Republican “Class of ’46” would leave a deep mark on history; its members, including California’s Richard Nixon and Wisconsin’s Joseph McCarthy, were determined to root out Reds in government and rein in the social programs of the New Deal.

One issue in particular became fodder for the Republican assault. In 1945 President Harry Truman had delivered a special message to Congress laying out a plan for national health insurance—an idea the pragmatic and immensely popular FDR had carefully skirted. As an artillery officer in World War I, Truman had been troubled by the poor health of his recruits, and as chairman of a select Senate committee to investigate the defense program during World War II, his worries had grown. More than five million draftees had been rejected as “unfit for military service,” not counting the 1.5 million discharged for medical reasons following their induction. For Truman, these numbers went beyond military preparedness; they spoke to the glaring inequities of American life. “People with low or moderate incomes do not get the same medical attention as those with high incomes,” he said. “The poor have more sickness, but they get less medical care.”

Truman proposed federal grants for hospital construction and medical research. He insisted, controversially, not only that the nation had too few doctors, but that the ones it did have were clustered in the wrong places. And he addressed the “principal reason” that forced so many Americans to forgo vital medical care: “They cannot afford to pay for it.”

The facts seemed to bear him out. Close to half the counties in the United States lacked a general hospital. Government estimates showed that about $11 million was spent annually on “new treatments and cures for disease,” as opposed to $275 million for “industrial research.” Though the nation claimed to have approximately one physician per 1,500 people, the ratio in poor and rural counties regularly dipped below one per 3,000, the so-called danger line. On average, studies showed, two thirds of the population lacked the means to meet a sustained health crisis.

The concept of government health insurance was not entirely new. A few states had toyed with instituting it, but their intent was to replace wages lost to illness or injury, not to pay the cost of medical care. Truman’s plan called for universal health insurance—unlike the Social Security Act of 1935, which excluded more than 40 percent of the nation’s labor force, mostly agricultural and domestic workers. Funded by a federal payroll tax, the plan offered full medical and dental coverage—office visits, hospitalization, tests, procedures, drugs—to all wage and salary earners and their dependents. “Needy persons and other groups” were promised equal coverage “paid for them by public agencies.”

People would be free to choose their own doctors, who in turn could participate fully, partly, or not at all in the plan. Private health insurance programs would continue to operate, with policyholders required to contribute to the federal system as well—a stipulation the president compared to a taxpayer choosing to send a child to private school. “What I am recommending is not socialized medicine,” Truman insisted. “Socialized medicine means that all doctors work as employees of government. The American people want no such system. No such system is here proposed.”

It did him no good. At the first Senate hearing on the proposal, Ohio’s Robert A. Taft, . . .  known to his admirers as “Mr. Republican,” denounced it as “the most socialistic measure that this Congress has ever had before it.” A shouting match ensued. . . . Taft retreated, but not before vowing to kill any part of the plan that reached the Senate floor.

. . .  A predictable coalition soon emerged, backed by pharmaceutical and insurance companies but directed by the American Medical Association, which levied a $25 political assessment on its members to finance the effort. At its crudest, the campaign pushed a kind of medical McCarthyism by accusing the White House of inventing ways to turn a brave, risk-taking people into a bunch of “dainty, steam-heated, rubber-tired, beauty-rested, effeminized, pampered sissies”—easy pickings for the nation’s godless cold war foe. “UN–AMERICAN SYSTEM BLUEPRINTED IN THE KREMLIN HEADQUARTERS OF THE COMMUNIST INTERNATIONALE,” read one AMA missive describing the origins of Truman’s plan.

Precious freedoms were at stake, Americans were told: when the president claimed that medical choices would remain in private hands, he was lying; federal health insurance meant government control; decisions once made by doctors and patients would become the province of faceless bureaucrats; quality would suffer and privacy would vanish. Skeptics were reminded of Lenin’s alleged remark—likely invented by an opponent of Truman’s health plan—that socialized medicine represented “the keystone to the arch of the socialized state.”

The economist Milton Friedman once described the AMA as “perhaps the strongest trade union in the United States.” It influenced medical school curriculums, limited the number of graduates, and policed the rules for certification and practice. For the AMA, Truman’s proposal not only challenged the profession’s autonomy, it also made doctors look as if they could not be trusted to place the country’s needs above their own. As a result, the AMA ran a simultaneous campaign congratulating its members for making Americans the healthiest people in the world. The existing system worked, it claimed, because so many physicians followed the golden rule, charging patients on a sliding scale that turned almost no one away. If the patient was wealthy, the fee went up; others paid less, or nothing at all. What was better in a free society: the intrusive reach of the state or the big-hearted efforts of the medical community?

Given the stakes, the smearing of national health insurance was not unexpected. What did come as a surprise, however, was the palpable lack of support for the idea. For many Americans, the return to prosperity following World War II made Truman’s proposal seem less urgent than the sweeping initiatives that had ended the bread lines and joblessness of the Great Depression. Even the Democratic Party’s prime constituency—organized labor—showed limited interest. During the war, to compensate workers for the income lost to wage controls, Congress had passed a law that exempted health care benefits from federal taxation. Designed as a temporary measure, it proved so popular that it became a permanent part of the tax code.

Unions loved the idea of companies providing health insurance in lieu of taxable wages. It appeared to offer the average American the sort of write-off reserved for the privileged classes, and indeed it did. Current studies show that union members are far more likely to have health insurance and paid sick leave than nonunion workers in the same industry. . . .

At about the same time, popular insurance plans like Blue Cross emerged to offer cheap, prepaid hospital care . . . . In 1939 fewer than six million people carried such insurance; by 1950, that number had increased fivefold. In the years after Truman’s plan died in Congress, the government filled some of the egregious gaps in the private insurance system with expensive programs for the poor, the elderly, and others in high-risk categories, thereby cementing America’s outlier status as the world’s only advanced industrial nation without universal health care. . . .

[In the United Kingdom, the National Health Service] succeeded because the Labour Party won a landslide victory in 1945 in a country battered by war and facing a bleak economic future—precisely the opposite of the American experience. Opinion polls in the UK showed strong support for a government-run system offering universal, comprehensive, and free health care financed by general taxation. But the threat of a physicians’ strike forced Labour’s health minister, Aneurin Bevan, to scrap the idea of turning doctors into full-time government employees. . . .

The UK excels in universal coverage, simplicity of payment, and protection of low-income groups. While the NHS remains quite popular, it also is seriously underfunded: the UK ranks dead last in both health care spending per capita ($3,900) and health care spending as a percentage of gross domestic product (9.6) among the six European nations [reviewed in Ezekiel Emanuel’s book Which Country Has the World’s Best Healthcare?]. The most common complaints . . .  concern staff shortages and wait times for primary care appointments, elective surgeries, and even cancer treatments . . .  “The public does not want to replace the system with an alternative,” writes Emanuel. “All the public wants is a fully operational NHS.”

By contrast, the US health care system—if one can call it that—excludes more people, provides thinner coverage, and is far less affordable. It combines socialized medicine practiced by the Department of Veterans Affairs, four-part federal Medicare (A, B, C, D) for the elderly and disabled, state-by-state Medicaid for the poor, health coverage provided by employers, and policies bought privately through an insurance agent or an Affordable Care Act exchange—all of which still leave 10 percent of the population unprotected. . . . “The United States basically has every type of health financing ever invented,” Emanuel adds. “This is preposterous.”

And extremely expensive. America dwarfs other nations in both health care spending per capita ($10,700) and health care spending as a percentage of GDP (17.9). Hospital stays, doctor services, prescription drugs, medical devices, laboratory testing—the excesses are legion. Childbirth costs on average about $4,000 in Western Europe, where midwives are used extensively and charges are bundled together, but close to $30,000 in the US, where the patient is billed separately by specialists—radiologists, pathologists, anesthesiologists—whom she likely never meets, and where charges pile up item by item in what one recent study called a “wasteful overuse of drugs and technologies.” There is no evidence that such extravagance makes for better health care outcomes. The rates of maternal and infant death in the US are higher than in other industrialized nations, partly because the poor, minorities, and children are disproportionately uninsured.

For head-spinning price disparities, however, nothing compares to pharmaceuticals. Americans account for almost half the $1 trillion spent annually for prescription drugs worldwide, while comprising less than 5 percent of the world’s population. It is probably [i.e. definitely] no coincidence that the pharmaceutical industry spent almost twice as much on political lobbying between 1998 and 2020 as its nearest competitor, the insurance industry. . . .

Unquote.

Whenever the president is asked why he wants to eliminate the Affordable Care Act (which means people with “pre-existing conditions” would no longer be protected, among other things), he says he’s going to announce a beautiful replacement for the ACA “in two weeks”. Or “next month”. It’s always in two weeks or next month. Reporters never press him for details, because they know he’s full of crap.

Una volta un truffatore, sempre un truffatore (once a con man, always a con man).

PS: Ezekiel Emanuel says different countries do different things very well, but if he had to choose his personal favorite, he’d pick healthcare in the Netherlands, with Germany, Norway and Taiwan in the running.

“Merit” vs. Community

An Oxford professor of economics and public policy writes about “meritocracy and its critics” for the Times Literary Supplement:

What is going on with our conception of community? Amid the prevailing cacophony of mutual abuse, serious answers to that question are sorely needed and, belatedly, the cavalry is arriving. Communitarian intellectuals, who see a good society as a web of mutual regard rather than a random accumulation of entitled individuals, are beginning to turn the tide on decades of damaging ideas. Michael Sandel’s new book, The Tyranny of Merit, is a valuable reinforcement to this process: Sandel is the most important and influential living philosopher. And Sandel is not alone. For example, in The Third Pillar (2019), Raghuram Rajan, the world’s most respected financial economist, set out a powerful critique of our exaggerated reliance on states and markets: his missing third pillar was community. Many other similar analyses are out or currently in press: an intellectual cascade is under way.

Journalists have also caught up with community. David Goodhart’s new book, Head Hand Heart, critiques the excessive prestige awarded to cognitive skills, relative to equally demanding vocational skills, and the moral strengths needed for care work. In a telling statistic, the author shows that, in contrast to other European societies, the UK spends eight times more on training the cognitively gifted half of the population than on everyone else. . . .

The tide may be turning but Sandel and his fellow communitarians are all building on a long-dead, and mutually acknowledged, pioneer: The Rise of the Meritocracy (1958) by Michael Young, the remarkable social activist who wrote the Labour Party manifesto of 1945. Young presciently realized that meritocracy would be even more socially divisive than the then-prevailing class system of inherited status. His essential insight, based on his experience as a social anthropologist in the East End, was that a fully meritocratic society, with widespread ladders by which “the best” could ascend, would create a new class of “the best”, thereby turning those left at the bottom into “the worst”, bereft of dignity.

And so it has proven. The costs are both physical and mental – physical as evidenced by the falling life expectancy recently documented by Anne Case and Angus Deaton in Deaths of Despair and the Future of Capitalism; mental as evidenced by the anger harnessed by populist politicians in recent years. For while the intellectual cavalry was still asleep, mavericks spotted it coming and offered snake oil remedies that identified the anxiety while proposing fantasy solutions, leading to the political mutinies that baffled and exasperated so many of the successful. Even in 1958, this argument was uncongenial to many on the Left. The Fabian Society refused to publish Young’s book; denial has since become more entrenched.

Sandel develops Young’s critique of meritocracy by tracing its history back to theological disputes between grace and deeds as the criteria for entry into heaven. In the fifth century Saint Augustine emphasized grace, arguing that we did not earn heaven but were granted it by God’s grace. Yet heaven as the reward for deeds kept reappearing. The sale of indulgences by the Church to finance St Peter’s helped to provoke Martin Luther’s rebellious insistence on grace. The same dispute then rapidly infected Protestants. John Calvin took the power of grace into the cul de sac of predestination: some were born blessed by grace and others were not. How could we tell who was blessed? Because they performed good deeds.

Repeatedly, Sandel argues, societies have veered into exaggerated respect for success. . . . Meritocracy intrinsically over-emphasizes the distinctive individual attributes of “the best”. And as those attributes in modern materialist society are exceptional cognitive ability and exceptional effort, the rich and successful have come to see themselves as uniquely clever and hard-working. And deserving. This attitude is Sandel’s target, and it has been the leitmotif of our times . . .

Yet something is lost in that translation of grace into a secular vocabulary. It is the need to transcend “me” and “now”. In short, Sandel offers a profound critique of individualism, making the case for the move away from self to community, from “my wants now” to “the common good”. By this approach we transcend ourselves neither by the utilitarian calculus of the biggest sum of utilities nor the Rawlsian contrivance of detachment from our place in society by a veil of ignorance, but rather through the satisfaction gained from fulfilling social obligations. . . . A healthy society would aim to equip everyone to be able to contribute in some way to our common good: an objective quite different from “let the best rise”. . . .

An efficient journalistic magpie, Goodhart picks out an eclectic range of telling evidence. On the rise of “my wants now”, he cites the sharp decline in moral language: the use of words such as “gratitude”, “humility” and “kindness”, he claims, drawing on a Google study of words published in books, has dramatically reduced over recent decades, to be replaced by more economic language. On his final page Goodhart cites recent research on measuring wisdom, not a social science concept but one used by psychiatrists. They find it, he tells us, to be unrelated to cognitive ability. Psychiatrists define wisdom as “concern for the common good”, the loss of which being where Goodhart ends and Sandel starts.

I end with my initial question. What Sandel, Goodhart and all the communitarians are lambasting is the recent division of society created by a cognitive route to success that belittles all else. . . . Sandel’s thesis is . . . accurately captured as one of “insiders” versus “outsiders”, a distinction first formulated in the analysis of the labour market. Insiders have habitually defended privilege from outsiders: see the professionals such as lawyers, medics and accountants, whose high earnings are protected by their various associations through control of entry (eg setting entry standards unnecessarily high to prevent delegation to the less skilled). But insider advantage extends far beyond the labour market: many of our aspirations are set by the prevailing narratives of the privileged. In Happy Ever After (2019) the behavioural scientist Paul Dolan . . . showed how unwarranted norms set by the insider class, such as the over-emphasis on cognitive achievement, condemn the outsider class to a loss of respect and self-worth. . . .

Insider privilege has become both educational and spatial: the cognitively endowed, clustered together in the metropolis, have life chances radically superior to those of the outsiders. And insider advantage, just like the class system that it replaced, replicates itself. By assortative mating and hothousing their children, the insiders pass their privilege on: they have rapidly become a hereditary caste. All have the opportunity to succeed but the insiders have decisively rearranged the ladders, while – especially on the Left – bemoaning the “inequality” for which [the insiders] are primarily responsible. Goodhart tells a story about the advice offered by senior civil servants to the Minister of Education during the UK years of austerity. It was to save money by closing the colleges of further education. The 8:1 differential in spending on tertiary education, in favour of universities, would become 8:0. Their justification was that “nobody would notice”. What they meant was that the insiders (such as they themselves) wouldn’t notice, since they sent their children to university.

Not before time, the smugly successful are getting their comeuppance: our understanding of contemporary society is finally changing. An insider with a belated conscience, as these disruptive ideas are absorbed by my class, I will try to resist the pleasures of watching hubris turn to nemesis.

Unquote.

I was suspicious about psychiatrists saying “wisdom” involves concern for the common good, but the American Psychological Association offers this definition:

wisdom: the ability of an individual to make sound decisions, to find the right—or at least good—answers to difficult and important life questions, and to give advice about the complex problems of everyday life and interpersonal relationships. The role of knowledge and life experience and the importance of applying knowledge toward a common good through balancing one’s own, others’, and institutional interests are two perspectives that have received significant psychological study.

Will society ever devote fewer resources to cultivating the head and more to helping the hand and heart? Recent appreciation for workers who keep society functioning, not just doctors and nurses and medical technicians but truck drivers, grocery store workers, sanitation workers, nursing home staff, et al. seems unlikely to reorder society’s priorities unless government takes much more control of “the market”. Will more people’s merit be recognized and rewarded? Time and the results of future elections will tell.

Conspiracies and Conspiracists

Below is one-third of a piece “on the conspiracist mind” by British novelist James Meek. If you’re interested in the whole London Review of Books article, which includes 5,000 words I left out, go here.

When the pandemic hit, social media, hyper-partisan broadcasters, Txxxx-era populism and conspiracy theory were already creating a self-contained alternative political thought space conducive to the cross-fertilisation of conspiracist ideas. Covid-19 and government efforts to control it . . . appear, in the conspiracist mind, as the most open moves yet by a secret group of sadistic tyrants who want to reduce the human population and enslave those who remain. The pandemic and official countermeasures are interpreted as proof, and Covid becomes the string on which any and all conspiracy theories may be threaded. Seen through the conspiracist filter, by forcing us to wear masks, by closing bars and isolating the frail elderly, by trying to terrify us over, as they see it, a dose of flu, or by microwaving us with 5G, the secret elite has shown its hand.

Now that its existence, nature and power have been proved to us, why shouldn’t we believe that the members of this group arranged 9/11? Or that Bill Gates is planning to kill us with vaccines, or inject us with nanochips hidden in vaccines, or both? Why shouldn’t the entire course of world events have been planned by a group of elite families hundreds, even thousands, of years ago? Why shouldn’t there be a link between the bounds to individual freedoms that governments have drawn up to slow climate change and the restrictions they’re carrying out in the name of beating Covid? Surely these two hoaxes are cooked up by the same firm, with the same agenda? Why, as followers of the American conspiracy theory known as QAnon insist, shouldn’t a group of politicians, tycoons and celebrities be kidnapping and torturing children on a massive scale?

A large survey in May conducted by researchers in Oxford found that only about half of English adults were free of what they termed ‘conspiracy thinking.’ Three-quarters of the population have doubts about the official explanations of the cause of the pandemic; most people think there’s at least a chance it was man-made. Almost half think it may have been deliberately engineered by China against ‘the West’. Between a fifth and a quarter are ready to blame Jews, Muslims or Bill Gates, or to give credence to the idea that ‘the elite have created the virus in order to establish a one-world government’; 21 per cent believe – a little, moderately, a lot or definitely – that 5G is to blame, about the same number who think it is ‘an alien weapon to destroy humanity’. Conspiracy beliefs, the researchers concluded, were “likely to be both indexes and drivers of societal corrosion … Fringe beliefs may now be mainstream. A previously defining element that the beliefs are typically only held by a minority may require revision … Healthy scepticism may have tipped over into a breakdown of trust”.

A friend, a BBC journalist, told me about a conversation he’d had with an acquaintance who began talking about the dangers of 5G and claimed that ‘every time a new kind of electromagnetic energy is invented, it causes a new kind of disease, like the invention of radar caused Spanish flu.’

‘But Spanish flu happened in 1918, and radar wasn’t invented till the 1930s,’ my friend said.

‘You would say that, wouldn’t you?’ This was uttered without a trace of a smile.

The author then provides a long description of an anti-lockdown rally in London. The star attraction was David Icke, a well-known former professional soccer player and sports broadcaster.

. . . At a time when Britain had a handful of TV channels, everyone knew his face. Shortly before he left the BBC in 1990 he experienced a metaphysical epiphany in a newsagent’s on the Isle of Wight. Not long afterwards . . . he declared he’d been chosen by a benign godlike agency as a vehicle for the revelation of truths essential to the survival of Earth and humanity. . . . Since then, Icke has worked on his material and his brand, developing his following, writing books, and giving lectures and interviews around the world. . . . In May, following an appeal from the Centre for Countering Digital Hate, which pointed out that millions of people had been exposed to online material in which he blamed Jews for the pandemic, denied the reality of Covid-19, played down the infectiousness of viruses in general and lent support to 5G conspiracists, both Facebook and YouTube – though not Twitter – took down Icke’s pages. The action had no appreciable effect on his profile, except perhaps to give him the lustre of the martyr. YouTube, and YouTube wannabes like BrandNewTube, are still thick with Icke interviews by small-time videocasters. Google will point you to them. And although he has been banned from Facebook, his fans haven’t, nor have links to his material. . . . Amazon still distributes his books.

The conspiracy narrative Icke began to weave in the early 1990s is a sprawling affair that changes to follow the headlines, veers off on tangents and is full of internal inconsistencies, but some core elements remain. Icke’s story bears similarities to the influential American conspiracist text Behold a Pale Horse by William Cooper (which was published at about the time Icke reinvented himself as a prophet), and to the pseudo-leaks that drive QAnon, though QAnon tends to avoid the extraterrestrial. A cursory and much rationalised summary of Icke’s conspiracy theory goes like this: thousands of years ago, a race of reptilian beings from another world drew up a marvellously slow plan for the enslavement of humanity, to be carried out by a tiny elite of either – the exact mechanism varies – human proxies of surpassing wickedness, or reptiles in human form. (‘I once had an extraordinary experience with former prime minister Ted Heath,’ Icke told the Guardian in 2006. ‘Both of his eyes, including the whites, turned jet black.’)

The plan continues to unfold, regularly missing prophesied deadlines. . . .

Next, the author discusses an encounter with Dominic, a young man handing out leaflets in a London park:

I skimmed the contents of the leaflet. It seemed a combination of falsehoods, misunderstandings, exaggerations and out of context snippets supporting the evil plan theory of events, all culled without attribution from the internet. . . . I somehow felt I had to intervene, not to change Dominic’s mind or to stop him handing out the leaflets, but simply to make him register that there was resistance to the falsehoods he was spreading. I went over to him – he was handing out his material to a large group of young people sitting on the grass – and told him off. I wasn’t eloquent. I said his leaflets were full of rubbish, and that he should destroy them. He said I should destroy my mask . . . I walked away. It was the kind of futile encounter between the self-appointed rationalist and the self-declared bearer of esoteric truths that happens online all the time, and it was no more satisfactory in the flesh. . . .

Karl Popper​ coined the phrase ‘conspiracy theory’ in 1952, in his book The Open Society and Its Enemies. He framed it as something that would always be singular, like game theory or chaos theory: it was only later that people started talking about ‘conspiracy theories’. . . . Popper’s notion of conspiracy theory referred to a personal predisposition that could attach itself to anything, precisely because it was nested in the holder’s brain.

Popper saw conspiracy theory as something very old, connected to the religious impulse. ‘The belief in the Homeric gods whose conspiracies explain the history of the Trojan War is gone,’ he writes. ‘The gods are abandoned. But their place is filled by powerful men or groups – sinister pressure groups whose wickedness is responsible for all the evils we suffer from – such as the Learned Elders of Zion, or the monopolists, or the capitalists, or the imperialists.’ . . .

Conspiracy theory fixes on diverse manifestations of injustice, technology and strife, on anything that’s hard to explain. That’s not to say it doesn’t have a dominant key. The othering of ethnicities or particular groups and accusations of Satanism or child abuse are frequent markers of conspiracies, but they all have in common an anarchic, nihilistic libertarianism that takes government as its ultimate enemy – specifically the kind of social democratic or socialist government that shifts resources from the wealthiest to the less well off, that offers a trade-off between greater equality and curtailments of personal freedom for the rich. This might seem implausible, given how central the idea of a gang of super-rich families is to conspiracy theory.

But only a few families are included; conspiracy theory tends to pass over the wealthy as a class. It’s striking that the two billionaires most often accused of being the chief New World Order Satanists – George Soros and Bill Gates – are the ones who have, if at times ham-fistedly, given away the largest chunks of their fortunes to worthy causes, one in support of the principle of democracy, the other in support of better health for the poorest. Gates is targeted because of the vast sums he gives to the World Health Organisation and for vaccine research, rather than for what one might assume enslavement-fearing conspiracy theorists would attack him for, the fact that the firm he used to run provides the software for most of their computers. It’s as if, to the conspiracists, Bill Gates of Microsoft is a perfectly respectable American tycoon and his philanthropic self a wicked alter ego. . . .

This isn’t a conspiracy theory about the origin of conspiracy theories. It’s an observation that the interests of conspiracy theorists and the interests of the selfish end of the plutocracy have a way of aligning. Both are cynical and mistrustful of institutions of authority, the courts, the media, the government, legislatures: the conspiracists because they think such bodies are malign agents of a secret elite, the plutocrats because they place limits on their wealth and power.

Txxxx was not the first conspiracy theorist to come to power. . . . Txxxx’s election was unusual not just because the American establishment saw itself as immune to capture by a conspiracy theorist, but because he embodies in one person the two poles of hostility to liberal democratic institutions: the plutocrat who hates taxes, regulations and impertinent journalists, and the conspiracy theorist with paranoid delusions about a deep state plot against the people. Perhaps it was inevitable that he would become a character in a phenomenon like QAnon.

Some have described QAnon as more like a religion than a conspiracy theory, and it does stand out from the others in that it imagines two duelling conspiracies – an evil conspiracy, with Hillary Clinton, Hollywood celebs and a pack of evil Democrats running a gigantic operation to kidnap hundreds of thousands of children, keep them prisoner in underground tunnels, torture them, rape them, drink their blood and use them in satanic rituals; and a good conspiracy, led by Txxxx and a team of loyal heroes in the US military, whose members are preparing to burst out, break up the paedophile Satanist ring and save the children. In QAnon, Txxxx is portrayed as a cross between Jack Ryan, the tough, smart, patriotic family man played by Harrison Ford in the movies based on the Tom Clancy novels, and the archangel Michael.

There’s​ a danger that in writing about QAnon – a social phenomenon not just in the US but in Britain, Germany and many other countries, and endorsed by a number of Republican candidates – you make it sound more interesting and mysterious than it is. It is interesting, but in the way hitting yourself in the face with a hammer is interesting: novel, painful and incredibly stupid. . . .

Although Q’s impact depends on followers believing that the posts come from a source at the heart of the American defence establishment, it seems unlikely that they would have found an audience without help. Obscure, dull, posted on websites with byzantine interfaces and repulsive content, they would have languished had it not been for two 4Chan moderators . . . persuaded a struggling YouTuber . . . to start making videos interpreting and embroidering the posts. The videos were a hit. . . . The QAnon movement spread when people who would never have gone near 4Chan began dissecting and arguing over each post, first on YouTube and Reddit, then on Facebook. Sites sprang up to relay the posts in accessible formats. . . . Websites and internet entrepreneurs discovered they could increase traffic and make money by tapping into the interest in QAnon. Faded Instagram influencers and obscure wellness gurus found new audiences by pushing hard on the child abuse angle. . . .

There have been efforts to portray QAnon followers as directly dangerous: one article in the Financial Times warned that ‘QAnon has the makings of America’s al-Qaida.’ Few Q-adjacent conspiracists have gone as far as [the] North Carolinian who in 2016 marched into a pizza parlour in Washington DC with three loaded guns, intending to rescue the children he believed . . . were being kept prisoner there. But Q isn’t urging people to take direct action. He tells his followers – he refers to them as ‘patriots’ – to sit back, not worry, and enjoy the spectacle of Txxxx’s plan unfolding. ‘Get the popcorn, Friday and Sunday will deliver,’ he said in 2017 when making one abortive prediction. ‘Trust the plan. Step back,’ he told an impatient supporter in 2018. Q has told followers to ‘trust the plan’ 27 times – a plan they have no role in carrying out.

The danger of conspiracy theories is not that they promote action to tear down society but that they delegitimise, distract and divert: they divert large numbers of people from engaging in political action, leaving the field clear for the cynical, the greedy and the violently intolerant. They distract them from questioning authority about society’s real problems by promoting a gory soap opera as if it were real and the result of ‘research’. And they delegitimise the idea that institutions – courts, parliaments, the education system, the salaried media – can be anything other than malign.

To talk to conspiracy theorists like Dominic and Martin is to find yourself pitied as a credulous centrist, relegated to the world of ‘No, but …’ ‘Do you think kidnapping, raping and murdering children and drinking their blood is OK?’ ‘No, but …’ ‘Do you like the increasing control faceless corporations, unaccountable billionaires and remote authorities have over our lives?’ ‘No, but …’ ‘Are you happy about the relentless spread of incomprehensible, intrusive technology?’ ‘No, but …’

. . . In a way the saddest aspect of the epidemic of conspiracism is not the delusions about conspiracy but the delusions about what it is to learn. [In a recent book about conspiracy, A Lot of People Are Saying, the authors] write, ‘knowledge does not demand certainty; it demands doubt.’ How did it get to the point where a smart young man like Dominic can believe in a binary, red pill-blue pill world of epistemics, in which there are only two hermetically distinct streams of knowledge to choose from, his preferred ‘truth’ and the other, ‘mainstream’, ‘official’ version, which [according to him] all those who reject his truth believe without question?

Unquote.

Yes, believing these convoluted conspiracy theories offers a sense of certainty, a feeling of being “in the know”.

On the other hand, Euripides, Shakespeare and Diderot all felt (if you can believe the internet) that “a prudent skepticism is the most profitable quality a man can have”, “modest doubt is call’d the beacon of the wise” and “scepticism is the first step towards truth”.

Hmm. They all sound like reptiles in human form to me.

2020 Won’t Be 2016 (or 2000)

We’re entering what’s been called, and what’s sure to be, “the longest two weeks in human history”. A neuroscientist who writes for Scientific American says we shouldn’t worry too much about what’s going to happen:

Will we be surprised again this November the way Americans were on Nov. 9, 2016 when they awoke to learn that reality TV star Dxxxx Txxxx had been elected president?

. . . Another surprise victory is unlikely to happen again if this election is looked at from the same perspective of neuroscience that I used to account for the surprising outcome in 2016. Briefly, that article explained how our brain provides two different mechanisms of decision-making; one is conscious and deliberative, and the other is automatic, driven by emotion and especially by fear.

Txxxx’s strategy does not target the neural circuitry of reason in the cerebral cortex; it provokes the limbic system. In the 2016 election, undecided voters were influenced by the brain’s fear-driven impulses—more simply, gut instinct—once they arrived inside the voting booth, even though they were unable to explain their decision to pre-election pollsters in a carefully reasoned manner.

In 2020, Txxxx continues to use the same strategy of appealing to the brain’s threat-detection circuitry and emotion-based decision process to attract votes and vilify opponents. . . .

But fear-driven appeals will likely persuade fewer voters this time, because we overcome fear in two ways: by reason and experience. Inhibitory neural pathways from the prefrontal cortex to the limbic system will enable reason to quash fear if the dangers are not grounded in fact. . . .

A psychology- and neuroscience-based perspective also illuminates Txxxx’s constant interruptions and insults during the first presidential debate, steamrolling over the moderator’s futile efforts to have a reasoned airing of facts and positions. The structure of a debate is designed to engage the deliberative reasoning in the brain’s cerebral cortex, so Txxxx annihilated the format to inflame emotion in the limbic system.

Txxxx’s dismissal of experts, be they military generals, career public servants, scientists or even his own political appointees, is necessary for him to sustain the subcortical decision-making in voters’ minds that won him election and sustains his support. . . . In his rhetoric, Txxxx does not address factual evidence; he dismisses or suppresses it even for events that are apparent to many, including global warming, foreign intervention in U.S. elections, the trivial head count at his inauguration, and even the projected path of a destructive hurricane. Instead, “alternative facts” or fabrications are substituted.

. . . Reason cannot always overcome fear, as [Post-Traumatic Stress Disorder] demonstrates; but the brain’s second mechanism of neutralizing its fear circuitry—experience—can do so. Repeated exposure to the fearful situation where the outcome is safe will rewire the brain’s subcortical circuitry. This is the basis for “extinction therapy” used to treat PTSD and phobias. For many, credibility has been eroded by Txxxx’s outlandish assertions, like suggesting injections of bleach might cure COVID-19, or enthusing over a plant toxin touted by a pillow salesman, while scientific experts in attendance grimace and bite their lips.

In the last election Txxxx was a little-known newcomer as a political figure, but that is not the case this time with either candidate. The “gut-reaction” decision-making process excels in complex situations where there is not enough factual information or time to make a reasoned decision. We follow gut instinct, for example, when selecting a dish from a menu at a new restaurant, where we have never seen or tasted the offering before. We’ve had our fill of the politics this time, no matter what position one may favor. Whether voters choose to vote for Txxxx on the basis of emotion or reason, they will be better able to articulate the reasons, or rationalizations, for their choice. This should give pollsters better data to make a more accurate prediction.

Unquote.

Pollsters did make an accurate prediction of the national vote in 2016 (Clinton won it). Most of them didn’t take into account the Electoral College, however, or anticipate the last-minute intervention by big-mouth FBI Director James Comey.

In 2000, the Electoral College result depended on an extremely close election in one state. That allowed the Republicans on the Supreme Court to get involved. There’s no reason to think that will happen again, despite the president’s hopes that it will.

When Our Votes Will Be Counted

With so many ballots being mailed or otherwise submitted before Election Day, people are wondering when we’ll know the results. The good news is that only four states wait until Election Day to begin processing ballots. I think this means Election Night will provide some blessed relief, especially if states let us know what percentage of the ballots have been counted (the percentage of “precincts reported” probably won’t be as meaningful this year). Even if the result isn’t clear that night, it should be clear by the next day.

I say that because I’m convinced this election won’t be very close. Millions of voters gave the maniac the benefit of the doubt four years ago. Now they know what they had to lose (jobs, health, peace of mind, not hearing about a dangerous fool every day, etc.).

This is from The New York Times, which has more information about the process.