Why Are False Beliefs So Persistent?
In the last essay, I wrote about historical examples of belief perseverance. While writing, I collected some ideas on belief reinforcement and false beliefs, which I share below. The video is an interview with Warren S. Brown, Professor of Psychology at Fuller Theological Seminary, which is relevant to the topic.
Belief perseverance, or conceptual conservatism, what I called Cartesian dogmatism as the antithesis of Cartesian doubt, is the observation that beliefs can be extraordinarily resistant to disconfirming evidence from reality. A famous example is detailed in “When Prophecy Fails,” in which the social psychologists Leon Festinger, Henry Riecken, and Stanley Schachter studied a small Chicago-area UFO cult that believed in an impending catastrophe from which its members would be spared. Their conclusions are directly applicable to much of what we have seen in recent years with QAnon and other ideological peculiarities.
The ideology of the cult was delivered through channeling, automatic writing, and the dreams of the group’s founder, Dorothy Martin. The group’s growth and publicity were facilitated by Dr. Charles Laughead, a doctor at a nearby college, who led an extracurricular group named The Seekers that came to be the core constituency of the cult. Over time, members quit or lost their jobs, ended their college studies, gave away money, and cut off relationships with non-believers, believing that after December 21st they would be living among the Guardians, as the spacemen were known. The press coverage the group solicited attracted the interest of the researchers, who observed the group under the pretense of shared belief. As December 21st drew closer, the group got more press, which brought prank calls and curiosity seekers.
Through the excruciatingly disappointing night of the 21st, the covert researchers peppered the group with questions. In the wee hours of the morning, Mrs. Martin channeled a new prophecy; as business hours commenced, the group frantically called the press, more invested than ever in convincing others. The latest prophecy was announced with more fanfare than any previous prediction: spacemen would arrive at 6 pm on Christmas Eve in front of Mrs. Martin’s home; the group would be on the sidewalk outside singing Christmas carols; the event was fully open to the press and public.
Around 200 unruly people showed up as the group sang carols for 20 minutes and then disappeared into the house. In numerous interviews afterward, members gave evasive, contradictory responses. The researchers followed up the next day by having a heretofore unseen psychologist visit to ask the group questions. He was treated as a guest of honor as they asked him for information and orders, some confirmation of their beliefs. Following the Christmas Eve event and a slew of community complaints, warrants were issued for Mrs. Martin and Dr. Laughead, and they were threatened with involuntary commitment. Under these conditions, the group couldn’t continue meeting, though they stayed in touch.
Out of this experience, the researchers developed the theory of cognitive dissonance as a causal mechanism for belief perseverance. While the adherents who went home for the holiday became disillusioned, those who had been in Mrs. Martin’s home largely became more intent on proselytizing their beliefs in the wake of disconfirmation. The researchers outlined five conditions that make it more likely a belief will deepen in the face of disconfirming evidence:
1. A belief influences action and behavior.
People are often socially penalized for inconsistent behavior. In general, there can be strong reinforcement from family, the workplace, and other social systems for people to remain consistent with past behavior, which in turn can become internalized and create a felt prohibition against changing behavior. Beliefs that influence behavior can come to be seen as part of our identity; changing a belief or its resultant behavior can be treated as a betrayal in some contexts; conversely, consistency can be treated as loyalty.
2. The person holding the belief has taken an action that is difficult to undo.
Taking an action that is irreversible is a strong demonstration of commitment to a belief. In the case of the narrative above, this was literally spoken aloud by Dr. Laughead after the disappointment on the 21st, “I’ve given up just about everything. I’ve cut every tie: I’ve burned every bridge. I’ve turned my back on the world. I can’t afford to doubt. I have to believe.”
3. The belief has to be specific and consequential to the real world, so that it can be definitively proven true or false.
Claims that we don’t have the ability to confirm or refute aren’t of much consequence. Horoscopes typically make low-stakes claims; it can be jarring if a horoscope makes extremely specific predictions about your day. On the other hand, a claim that Hillary Clinton will be charged and imprisoned on a specific date, made with a lot of publicity and fanfare, is a high-stakes claim: if it’s false, that will be plainly seen. Consequently, publicly attesting to a belief of this nature is a significant investment of reputation and credibility.
4. Undeniable evidence that the belief is false must occur and the holder of the belief must recognize it.
When a date for apocalypse comes and goes with no apocalypse, the people who predicted it face a lot of pressure to admit they were wrong. This is an extremely uncomfortable mental place to be. To resolve it, either the belief has to be abandoned, or there has to be some rationalization, some other way that the belief is true, some way in which we weren’t wrong after all. In our present-day context, some fanciful examples of rationalization can be seen in the QAnon community, like the idea that Joe Biden is actually Trump in disguise, or the ongoing belief in massive voter fraud.
5. There is social support from other believers.
This is the crucial ingredient for a belief persisting in the face of obvious disconfirmation. Groups that share a belief provide support for holding it and can effectively facilitate the refutation of even the most obvious and glaring disconfirmations. In general, we look to others to inform our social behavior; we are all role models for one another, and observing others is fundamental to how we learn. What a phenomenon like a cult or QAnon represents is the hijacking of this law of nature in a maladaptive way. In the 50s this effect was confined to a location, to the actual presence of other people; today the internet has changed how this influence works. Social support for beliefs can now occur entirely at a distance, in literal isolation. While it’s inarguably a boon that people can connect and share ideas more easily, the shadow of this newfound ease is that people can find like-minded company for beliefs that are harmful to themselves or others.
Belief is to some degree a matter of social permission: if you want to believe the moon is made of cheese but the whole of society will tell you you’re an idiot, you’re likely to abandon the belief. However, if there’s a society of Moon Is Cheese Truthers making YouTube videos, providing evidence, and holding conventions, one can look to these fellow believers and feel justified in the belief. “It’s ok for me to think the moon is cheese, look at all these other people who think the moon is cheese. If it was a foolish belief there’s no way these people, some of whom seem really smart and have good jobs, would share this belief. It must be true.”
Social and cultural reinforcement isn’t the only way false beliefs are maintained. Many studies have shown our natural fidelity to beliefs even after our senses discredit them. One study gave a group of 19 natural scientists with Ph.D.s a fake formula for calculating the volume of a sphere. Six participants knew the real formula beforehand, but all 19 were flummoxed by the unexpected discrepancies in their results, without initially suspecting the formula they’d been given as the source of the errors. Commenting on his results, one participant said, “what I really came away with is how I try to force the real world into the theoretical and am uncomfortable when it doesn't fit."
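To make the dynamic concrete, here’s a minimal sketch of the situation the participants were in. The true volume of a sphere is V = (4/3)πr³; the distorted formula the study actually handed out isn’t reproduced above, so the inflated coefficient below is purely a hypothetical stand-in. The point the sketch illustrates is that a wrong formula produces systematic error, not random error, which should implicate the formula rather than one’s measurements.

```python
import math

def true_volume(radius):
    """Actual volume of a sphere: V = (4/3) * pi * r^3."""
    return (4.0 / 3.0) * math.pi * radius ** 3

def given_volume(radius):
    """Hypothetical distorted formula; the study's real fake formula
    isn't known here, so the 1.5 coefficient is an illustrative
    stand-in, not the actual manipulation."""
    return 1.5 * (4.0 / 3.0) * math.pi * radius ** 3

# Checking spheres of known size: a wrong formula produces a
# systematic mismatch, not random scatter.
for radius in (1.0, 2.0, 5.0):
    measured = true_volume(radius)    # what careful measurement reports
    predicted = given_volume(radius)  # what the handed-down formula says
    print(f"r={radius}: measured={measured:8.2f}  "
          f"predicted={predicted:8.2f}  ratio={predicted / measured:.2f}")
```

The ratio between prediction and measurement comes out constant across every radius, which is the signature of a bad formula rather than sloppy measurement. The unsettling finding is that trained scientists saw discrepancies like these and still doubted their own measurements before doubting the formula.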
Cognitive inertia is a broad term for the general idea of habitual thinking. It manifests in things like brand loyalty, or in not taking a pandemic seriously, as occurred in 1918 and 2020. People get used to thinking of the world a certain way; the longer an idea of the world is maintained, the more difficult it can be to change. William James, writing about habitual thought, compared it to water carving a rut into the earth by its flow. His chapter on habit from The Principles of Psychology is among the best work I’ve read on the topic.
This tendency human beings have to rationalize their behavior, to think “it’s good because I’m thinking/doing it,” can help us maintain many habits of thought that are counterproductive or outright self-destructive. Nature rewards habit: what is habitual is less labor-intensive, cognitively and physically, so our brain is eager to turn anything it can into a habit, including our thoughts. But we don’t tend to ascribe our thoughts to habit; we tend to imagine that what we think, how we use our attention, or how we feel has some merit, some reason for being that’s rational, some justification.
Recently I caught myself feeling anxiety over sharing a piece of writing. I justified the anxiety to myself with the belief that it served a purpose: making me a better writer by making me more concerned about the outcome. The justification was nonsense; the anxiety had nothing to do with the quality of the work or the labor I do, and in some cases it was a serious hindrance. The anxiety was a pointless artifact of memory, a leftover habit of thought from an earlier time in life. Nonetheless, in the moment, my first instinct was to rationalize it as having some value because it’s mine. Why would I do something or feel something that doesn’t have a utility? Well, lots of reasons: prior experiences, maladaptive beliefs, maybe just habit.
Denialism is the collective counterpart of personal denial. In response to a painful reality like a holocaust or a pandemic, a group can engage in mass denial. Conspiracy theories can be a means to this end: rather than facing up to painful data about one’s beliefs or behavior, one can take great comfort in a conspiracy theory. The Holocaust has gotten this response from some people, as has climate change, as has the subject of evolution among people who hold certain religious beliefs.
Status quo bias is something we encounter all the time. It’s the single biggest obstacle to making a change of any sort, in any culture, at any time. It doesn’t matter how bad things are; people get accustomed to something like exploitative labor practices, and a significant percentage will defend it as a way of life, even if they’re miserable because of it. The status quo is immediately evident, it exists to us; intangible future states are at a perpetual disadvantage. Alternatives may not even be known. For example, in the US the simplistic capitalist/socialist dichotomy is knowingly used to manipulate people and reinforce the status quo, not just by people who might be taxed more heavily, but by everyone who identifies with or aspires to be those people.
In the process of becoming familiar with a community or a job, we learn all the justifications, all the reasons why the conventions of the status quo are “good” or “unavoidable.” Spending time with people, we develop relationships; we may begin as critics of a field only to find ourselves making excuses and exceptions for the very behavior we used to criticize as outsiders. This reality is the thinking behind term limits in the Senate, and why people think it’s so hard to hold senators accountable for their crimes.
People often hold a lower opinion of those seeking change and reform, judging them as extreme, self-interested, and unreasonable, due to the cognitive biases mentioned above. In this way, monarchy and despotism have always had robust social movements defending them against those seeking the freedom to choose new leadership. People are uncomfortable with the indeterminate and go to great lengths to feel that the world has some order to it; there is a strong incentive to justify and explain injustices. For example, wealth inequality, mass incarceration, and the lack of worker rights in the US all have robust defenders, as one discovers if one speaks out publicly on these topics to enough people. Manifest destiny, scientific racism, and social Darwinism are all elaborate rationalizations people have used to validate and defend the injustices in society, to think of them as inevitable and unfixable.
So how can we guard against holding onto false beliefs? The most crucial ingredient is to pay attention to and trust your senses. Theories are always less valuable than the data you’re getting from the world. Habitually attending to and prioritizing sense information can be practiced with meditation: simply paying attention to what comes into the senses, your breathing, what you hear, sensations in the body, to the exclusion of the babble of discursive thought. You can only pay attention to one thing at a time, so when you focus on sensory information, you’re not focusing on thoughts. Practicing using your mind in this way can help build a habit of seeing the world with your senses rather than through your theories.
In a similar vein, listen to yourself. Time and again in experiments, people revealed doubts that they pushed out of their minds, rationalizing that someone else knew more than they did, whether the experimenter, the group, or a medium. Research suggests that being disagreeable, meaning unafraid of conflict and nonconformist, has a protective effect on memory in aging.
Another valuable tool is interdisciplinary education, not being a specialist. Buckminster Fuller wrote a lot about the ills of specialization:
We are in an age that assumes the narrowing trends of specialization to be logical, natural, and desirable. Consequently, society expects all earnestly responsible communication to be crisply brief. Advancing science has now discovered that all the known cases of biological extinction have been caused by overspecialization, whose concentration of only selected genes sacrifices general adaptability. Thus the specialist’s brief for pinpointing brevity is dubious. In the meantime, humanity has been deprived of comprehensive understanding. Specialization has bred feelings of isolation, futility, and confusion in individuals. It has also resulted in the individual’s leaving responsibility for thinking and social action to others. Specialization breeds biases that ultimately aggregate as international and ideological discord, which in turn leads to war.
We are all capable of learning to see things holistically, of understanding the principles of different disciplines. A big reason why conspiracy theories can be so effective today, and why people have tended toward increased partisanship, is the cultural trend toward increasingly narrow specialization. The general lack of medical knowledge continues to exacerbate confusion, a problem so pervasive that Yale has created a free online course centered on understanding medical research.
In general, confusion tends to stem from not understanding principles. We can all read the law for ourselves, but without knowing the conventions or principles of legal practice, that knowledge can be worse than no knowledge at all, because it creates false confidence. In the US there are many examples of this in the news regarding the legal field specifically, but a lot of fields have analogous situations. Learning the principles of how something works is at once easier and more fruitful than trying to memorize details, just as learning how and why mathematical functions work can be easier than memorizing a lot of equations. Systems theory is good to learn to this end: it’s a distillation of what systems have in common, a framework that can help one make sense of different disciplines more easily. Donella Meadows’s book Thinking in Systems is a great introduction to the topic, as is Fritjof Capra’s The Systems View of Life.
The fact that people are reading medical studies they misunderstand speaks to a problem of abundance. Twenty years ago, people who weren’t in the medical field weren’t reading medical studies; nobody was posting their ideas about medical studies on Myspace. On the internet we have the greatest means of self-education humanity has ever had. It’s never been easier to be an autodidact.