You’re driving along the road, lost in thought, when suddenly a black BMW cuts you off. The offending vehicle, plastered with bumper stickers and a Maryland license plate, speeds away. But, as luck would have it, they hit a red light a quarter mile down the road. As you pull up alongside them, you get a good look. You see their race, gender, age, clothing, and preference in fuzzy dice color. For the sake of argument, imagine you learn everything there is to know about them: their Zodiac sign, favorite pastime, darkest secret, and favorite Taylor Swift song. The question is, what do you do with this information?
Perhaps you think nothing of it and ignore the whole thing. Perhaps you conclude that people who drive BMWs are jerks but otherwise forget about it. Or, perhaps, you could conclude that people who drive BMWs are jerks AND Maryland drivers are terrible. It’s not obvious what conclusions to draw from this. In principle, you could update every single category that they belong to as being slightly more likely to cut people off.
People's choices about which categories to acknowledge or disregard in their judgments hinge on their individual world models. I’ve discussed world models before (notably, here), but the summary is that they are mental representations of the world, maps of the surrounding territory. These models enable you to simulate the external world and predict how it will respond to your actions. Everyday knowledge, such as understanding that a book will fall if dropped, is incorporated into this model. Similarly, interpersonal dynamics and cultural norms are embedded within these models. From infancy, humans continuously evolve these models by observing cause-and-effect relationships, recognizing patterns, and integrating new information. These world models are powerful tools, allowing you to simulate experiences you’ve never had, preparing you for life's myriad possibilities.
Immutable Beliefs
In this post, I explore two contrasting paradigms for shaping one's world model: immutable beliefs and the Litany of Tarski. Immutable beliefs are those core convictions so deeply held that they are tenaciously resistant to change. These beliefs are firmly embedded in an individual's worldview and value system, shaping their perceptions, choices, and actions. They are not merely intellectual positions but are deeply felt and integral to a person's sense of self. They serve as anchors, tying us to our communities, cultures, and personal identities while providing stability and continuity in a rapidly changing world. Contrary evidence tends to have little impact on these beliefs, and it would take an overwhelming amount of information and new experiences to change them, if they change at all.
Immutable beliefs can take many forms. In this particular case, you might believe that all people are equally good at driving, so this incident is meaningless. Or you might believe that some attributes of people, such as their age, can influence their driving, while others, such as their race or gender, have no effect. That is, some biases are OK and others are not. Still others might think that we all started the same, but our abilities have diverged through experience and opportunity. Some are conflicted about their immutable beliefs—perhaps they notice that a certain demographic of people keep cutting them off, but just assume it’s bias on their part and feel bad about themselves for stereotyping. These are all manifestations of immutable beliefs at work.
Immutable beliefs play a pivotal role in simplifying our understanding of the world by offering clear-cut narratives for complex scenarios. Consider the example of gender disparities in driving behaviors or accident rates. When one holds an immutable belief in the inherent equality of men and women in driving skills, any observed differences in behavior or accident rates are automatically attributed to external factors such as societal biases or varying opportunities. This perspective negates the need to consider a more nuanced or multifaceted explanation. While this simplification process can offer comfort by reducing the complexity of the information we need to process, it also underscores the powerful influence of deep-seated beliefs in shaping our understanding of and responses to the world around us. Immutable beliefs, in this way, function as cognitive filters, providing a structured and consistent lens through which we view and interpret the world.
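One way to make this filtering effect concrete is to treat beliefs as probabilities and new evidence as a Bayesian update. This is an analogy for illustration, not a claim about how cognition actually works, but it captures something real: a belief held with probability exactly 1.0 (or 0.0) is mathematically immune to evidence, no matter how much of it arrives.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after one piece of evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# An open belief (prior of 0.5) moves with the evidence.
open_belief = 0.5
for _ in range(5):
    open_belief = bayes_update(open_belief, 0.8, 0.2)  # evidence favoring the hypothesis

# An "immutable" belief is a prior of exactly 1.0: the (1 - prior)
# term vanishes, so contrary evidence cannot move it at all.
immutable = 1.0
for _ in range(5):
    immutable = bayes_update(immutable, 0.2, 0.8)  # evidence against the hypothesis

print(round(open_belief, 3))  # climbs well above 0.5
print(immutable)              # stays exactly 1.0
```

This is sometimes called Cromwell’s rule in the statistics literature: never assign probability exactly 0 or 1 to an empirical claim, or you have opted out of updating entirely.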
The Litany of Tarski
The alternative to holding immutable beliefs is to embrace the Litany of Tarski, a philosophy focused on aligning beliefs with truth, irrespective of personal biases or preferences. The core principle of the Litany of Tarski is: “If X is true, I want to believe X.” This means that for any given true statement X, the goal is to believe that X is true. It is a desire to believe everything that is true about the world and not believe anything that is untrue. The description from LessWrong reads as follows:
The Litany of Tarski is a template to remind oneself that beliefs should stem from reality, from what actually is, as opposed to what we want, or what would be convenient. For any statement X, the litany takes the form "If X, I desire to believe that X".
The Litany of Tarski champions the abandonment of immutable beliefs in favor of a dynamic approach to knowledge. It is the philosophy of taking everything into account. The goal is to use each new piece of information to create a more accurate understanding of the world. In this framework, beliefs are not static; they are incrementally adjusted with every new interaction, leading to a continuous evolution of our worldview.
Each time you get cut off in traffic, your belief that people who fit that description are more likely to cut people off rises a little bit. This process doesn’t require actively seeking counterbalancing evidence; if an equalizing factor exists, it will present itself naturally. For example, if Virginia drivers are just as bad as Maryland drivers, you’ll be cut off by one before too long, and then you’ll be perfectly calibrated again. If you aren’t cut off by a Virginia driver, then it seems you weren’t properly calibrated in the first place. If, over the course of a year, significantly more Maryland drivers cut you off than Virginia drivers, your model will naturally shift to reflect this pattern.
Your new belief isn't set in stone, either; you continuously adjust it based on new experiences. Not all updates deserve equal weight, though. For example, the findings from a scientific study typically warrant a more substantial update than a solitary anecdotal experience.
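In the same probabilistic spirit, weighting evidence differently just means assigning different likelihood ratios to different kinds of evidence. A minimal sketch, with the ratios invented purely for illustration:

```python
def update_odds(odds, likelihood_ratio):
    """Multiply the prior odds of a hypothesis by the likelihood
    ratio of a new piece of evidence (Bayes' rule in odds form)."""
    return odds * likelihood_ratio

# Start at 1:1 odds on "drivers from that state cut people off more often".
odds = 1.0

odds = update_odds(odds, 1.1)  # one anecdote: a weak nudge
odds = update_odds(odds, 3.0)  # a (hypothetical) careful study: a much larger shift

probability = odds / (1 + odds)
print(round(probability, 3))  # 0.767
```

The anecdote barely moves the needle; the study does most of the work. The point isn’t the specific numbers, just that the framework has a natural slot for “how much should this observation count?”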
Naturally, these contrasting approaches represent a spectrum, with few, if any, individuals aligning perfectly at either extreme. Most people's belief systems lie somewhere in between, blending elements of both immutable beliefs and the Litany of Tarski.
Difficulties in Thinking
The Litany of Tarski sounds like an ideal we’re all supposed to aspire to, almost as if it’s the “correct answer”, but how realistic is it? For instance, when I was talking about the guy who cut you off and I mentioned “people who fit that description,” it’s worth asking: what exactly constitutes “that description”? Is it race, gender, the type of car, license plate, or a combination of these factors? Ideally, the Litany of Tarski would have us consider all such variables. But there is no limit to the variables we could track, and we quickly run into an intractable problem. Perhaps the Litany of Tarski is the best technique if you could do an infinite number of calculations, but what if you don’t have a supercomputer in your pocket?
The key to following the Litany of Tarski is maintaining an unbiased stance and being willing to adjust your beliefs with each new piece of evidence. As long as you can unbiasedly update each time, you’ll end up in the right place. But there’s the rub. This technique only leads to accurate world models in the absence of bias. And that’s very hard to do because our minds are rife with bias. Perhaps you believe that Asian women are bad drivers and white men are good drivers. As you pull up next to that black BMW, you look over and see a white man. Suddenly, you wonder if he had really cut you off—perhaps you overreacted? The streets had merged in a weird way, and you hadn’t been paying as much attention as you should have been—maybe it was you who was in the wrong?
Furthermore, accurately updating your world model involves not only overcoming personal biases but also understanding how individual experiences relate to the broader context. If you observe that Maryland drivers really are more likely to cut you off, that’s only true in the places you visit. You have to adjust for sampling bias. You’re far more likely to be cut off by a Maryland driver when living in Virginia than when living in California.
Say you were just cut off by a white man in a BMW. If everyone around you is white, then the fact that this person was white isn’t a big update. But if you’ve only seen three BMWs today and two of them cut you off, that’s a more significant update to your beliefs. To accurately update your beliefs, you need to estimate what fraction of nearby drivers are white, male, and drive a BMW. Unfortunately, you don’t have the data and even if you did, performing all the calculations would make your head spin. Again, the Litany of Tarski has led us to an intractable problem, and we haven’t even started talking about interaction effects. And remember, you have to update every time someone doesn’t cut you off too.
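The base-rate point can be sketched with the same odds-form machinery. How much an observation should move your beliefs depends on how common the observed description is among the drivers who cut people off versus everyone else. All the numbers below are made up purely for illustration:

```python
def likelihood_ratio(frac_among_cutters, frac_among_other_drivers):
    """How much one observation should multiply your odds:
    P(description | driver cuts people off) / P(description | other drivers)."""
    return frac_among_cutters / frac_among_other_drivers

# If 90% of local drivers match the description anyway, seeing it on
# the driver who cut you off tells you almost nothing:
print(round(likelihood_ratio(0.92, 0.90), 2))  # 1.02, a negligible update

# If only 2% of local drivers are in BMWs, but BMWs account for 6% of
# the cars that cut you off, that's a threefold update:
print(round(likelihood_ratio(0.06, 0.02), 2))  # 3.0
```

The catch, as the paragraph above notes, is that you don’t actually have these denominators: estimating what fraction of nearby drivers fit any given description is exactly the intractable part.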
This complexity often compels us to use heuristics—mental shortcuts that streamline our decision-making processes. These shortcuts enable us to efficiently navigate the myriad decisions we face, but they are not without drawbacks. They can introduce distortions in our perception and reasoning, which become particularly problematic when trying to adhere to the principles of the Litany of Tarski. Cognitive biases[1] stem from these heuristics and can profoundly influence our judgment, leading us toward conclusions that not only oversimplify complex realities but also veer away from factual accuracy. Oftentimes, cognitive biases influence our thinking without us even realizing it. Ultimately, these hidden biases can inadvertently divert us from objective truth, compromising our efforts to cultivate a more accurate worldview.
Countering Biases
Some communities, such as the Rationalists, try to make approaches such as the Litany of Tarski work by minimizing their cognitive biases. They congregate around blogs with names such as ‘Overcoming Bias’ and ‘LessWrong’, which serve as hubs for discussions aimed at refining their thought processes and decision-making. It’s hard to know how much they succeed though. These communities have made strides in promoting awareness of cognitive biases and encouraging strategies to mitigate them, which, in theory, should lead to more accurate beliefs and better decision-making.
However, there's a potential paradox inherent in this endeavor. The very act of striving to overcome biases can, in itself, create a new form of bias—a confidence in one's ability to be less biased than others, which might lead to overconfidence in one's own beliefs. We know we cannot escape all cognitive biases, so the question is whether the effort to reduce them is a net positive. Furthermore, while these communities aim for rationality, they are not immune to the influence of social dynamics and groupthink, which can sometimes lead to a reinforcement of shared beliefs, regardless of their accuracy.
Tradeoffs
While the ethos behind the Litany of Tarski may sound like a good idea, it's not necessarily the optimal approach. For many people, it may prove more pragmatic to embrace the immutable beliefs prevalent in their society. These beliefs, though not always up-to-date or completely accurate, have been shaped and refined over generations. Adherence to most of them is unlikely to cause significant harm, largely because these societal beliefs have emerged from the collective experience and wisdom of many generations. Even with potential flaws, they offer a time-tested framework for understanding and navigating the world.
With the Litany of Tarski, by contrast, falling too far off the mark can lead to more harm than good. During the COVID pandemic, the phrase “do your own research” became somewhat synonymous with being anti-vaccine. This approach certainly aligns more closely with the ethos of the Litany of Tarski. But there are so many ways this can go wrong. A prime example of this is when you encounter bad information. In theory, you should encounter good information to counter it, but what if you’re unwittingly caught in an echo chamber? If you think you’re evaluating a random sample of evidence but are in an echo chamber, all of your calibrations will be off, and you’re just reinforcing incorrect beliefs. This inadvertent confirmation bias can lead to significantly distorted perceptions of reality. This is the difficulty with following the Litany of Tarski; a single, seemingly minor oversight in evaluating sources or data can cascade into a major misalignment of one’s worldview. If these people, instead of trying to read and understand the evidence, had simply accepted the societal position that being vaccinated is safer than not being vaccinated, they would have been better off.
With immutable beliefs, you know where you stand. Your friends know where you stand. Your beliefs give you a sense of identity and stability. You are proud of your beliefs. You can put a sign in your yard telling people where you stand. But there’s a downside: sometimes you need to stand elsewhere. Immutable beliefs can perpetuate and justify the status quo, regardless of its ethical or moral correctness. In moments when society stands on the cusp of beneficial transformation, these immutable beliefs can blind people to the ethical or moral shifts that beckon a society towards progress.
Conversely, if you adopt a mindset where beliefs are fluid and subject to change based on data, your connection to those beliefs is less personal. Your current beliefs don’t form a core part of your identity; they don’t evoke pride or shame in you. They are simply reflections of your best synthesis of the current evidence.
Allowing your beliefs to take you anywhere means you have no idea where you’ll end up. There are no closed problem spaces. This approach can result in surprising and unconventional conclusions. You might arrive at the notion that wild animals experience net suffering, or that eating beef is good for cows because, otherwise, they wouldn’t exist. This approach can have practical ramifications too. If you base your dietary choices on moral standards, and new research suggests that shrimp possess more complex capacities than previously thought, you might find yourself compelled to alter your diet. Adapting to such revelations can be inconvenient and challenging.
The Litany of Tarski presents a formidable challenge, both personally and societally. It demands that we be willing to accept truths, even when they make us uncomfortable or clash with our pre-existing beliefs. It could force us into certain beliefs on sensitive topics such as religion or human capabilities. One implication of such thinking is that there are no fundamental truths about humans; there is no inherent equality or inequality between groups written into the fabric of the universe. This could leave you saying, “I would have liked for X to be true, but it is not.”
For example, consider a scenario where scientific evidence suggests that Population X is smarter than Population Y. How many people would agree with the statement, “If Population X is smarter than Population Y, then I would like to believe that Population X is smarter than Population Y?” When confronted with such evidence, many people start looking for an exit, saying things like, “Everyone is smart in different ways,” or “IQ isn’t real,” or “The study is flawed.” But they seldom follow it up with “Let’s fix it and run it again.” It’s more a case of, “Good thing we found a problem and can justify ignoring the data. Let’s never run the study again. Crisis averted!”
At a societal level, this challenge becomes even more complex. In a diverse society, where populations X and Y coexist, could a fundamental inequality be acknowledged? It’s hard to know. This topic involves not only a commitment to truth but also an understanding of the ethical and moral implications of these truths. It's one thing for an individual to grapple with uncomfortable truths, but quite another for a society to collectively confront and integrate them into the broader societal narrative.
Conclusion
The contrast between immutable beliefs and the philosophy of the Litany of Tarski presents a framework for understanding how we process and integrate information into our worldviews. Immutable beliefs can add real value through a sense of identity, anchoring us in a world that is chaotic and unpredictable. But they can also blind us to new truths and hinder our ability to adapt to changing realities. On the other hand, the Litany of Tarski, which advocates for a continuously updating belief system, encourages flexibility and openness to change or revision but requires a level of knowledge and calculation that is impossible to perform, so we can only make our best attempts.
As I said, these contrasting frameworks represent a spectrum, and no one is at either extreme. But I think most people would benefit from moving further in the direction of the Litany of Tarski. I think it’s valuable for people to remember that beliefs should stem from reality, as opposed to what we want. I’m willing to admit that it’s impossible to do this perfectly, and maybe even damaging when done incorrectly. But I still think people should try to be aware of their immutable beliefs and question whether they add value to their lives.
The challenge lies in distinguishing between genuinely evidence-based convictions and those rooted in immutable beliefs. If you find yourself proudly displaying your beliefs, whether through a bumper sticker or a yard sign, it may be time to introspect. Such declarations could indicate a deeper entrenchment into immutable beliefs. While these beliefs might be comforting or even beneficial, acknowledging and examining them is important. This process doesn't necessarily mean we must abandon our beliefs, but rather, it encourages us to engage with them thoughtfully, ensuring they truly serve us well.
"That which can be destroyed by the truth should be."
P.C. Hodgell
[1] It's important to clarify the distinction between the broad concept of bias and the more specific term "cognitive biases," as the word "bias" can be interpreted differently in various contexts. While bias often carries a negative connotation, its general definition refers to any systematic deviation from a neutral or unbiased state. Essentially, a bias implies a tendency to lean in a certain direction or to believe things that are not uniformly distributed across all possibilities.
For example, holding the belief that 20-year-old men are more likely to be terrorists than 90-year-old women represents a form of bias. This belief indicates a predisposition to associate a certain demographic (20-year-old men) with a specific behavior (terrorism) at a higher rate than another demographic (90-year-old women). Whether it’s right or wrong, it’s a bias because it reflects an uneven, non-neutral expectation about the likelihood of different groups engaging in a particular action.
Cognitive biases are specific types of biases that cause systematic errors in how we process and integrate information into our world models. An example of a cognitive bias is the status quo bias, where people prefer things the way they are, even if a change would lead to improvements. Cognitive biases often arise from the brain's attempt to simplify information processing. They are the mental shortcuts, known as heuristics, that our brain uses to process information quickly but can lead to errors in thinking, decision-making, and judgment.