Kinnu

The Irrational Brain and Its Hidden Mechanisms

Understanding Cognitive Biases

On the night of August 18, 1913, at the casino in Monte Carlo, a roulette wheel landed on black for the 10th time in a row. The gamblers, convinced the next spin was bound to be red, began to bet heavily, and more and more people joined the crowd.

A roulette wheel. Image: Ralf Roletschek, CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons

Surely a red would be next? By the time it finally came, after 26 black spins in a row, the crowd had lost a colossal fortune – all because of a cognitive bias.
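The crowd's mistake – known as the gambler's fallacy – is easy to check with a quick simulation. The sketch below is illustrative only: it assumes a fair European wheel with 18 red, 18 black, and 1 green pocket, and compares the chance of red overall with the chance of red immediately after a run of blacks.

```python
import random

random.seed(42)

# European roulette: 18 red, 18 black, 1 green pocket.
POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]

# Simulate a long night at the wheel.
spins = random.choices(POCKETS, k=1_000_000)

# Overall fraction of red outcomes.
reds_overall = spins.count("red") / len(spins)

# Collect the outcome that follows every streak of 5+ blacks.
streak = 0
after_streak = []
for outcome in spins:
    if streak >= 5:
        after_streak.append(outcome)
    streak = streak + 1 if outcome == "black" else 0

reds_after_streak = after_streak.count("red") / len(after_streak)

print(f"P(red) overall:         {reds_overall:.3f}")
print(f"P(red) after 5+ blacks: {reds_after_streak:.3f}")
# Both hover around 18/37 ≈ 0.486 – the wheel has no memory.
```

However long the run of blacks, the next spin's odds are unchanged: each spin is independent, and that is exactly what the crowd's intuition refused to accept.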

We make thousands of decisions every day. Some of them are difficult, some of them are so easy we don’t even notice we are making them. How can we know if we are making good decisions? How can we avoid falling into traps, and betting on red?

Cognitive psychology attempts to answer these questions by uncovering the unconscious biases that underlie so many of our behaviors.

These are called cognitive biases – the many ways in which we apply our pre-existing beliefs to decisions that should be made on more rational grounds.

To date, researchers have identified around 200 cognitive biases, and you will already be familiar with many of them.

When you see a heavily discounted price and are more likely to make a purchase, this is called ‘anchoring bias’. When you are more critical of the behavior of rival football supporters than of your own team’s, this is ‘in-group bias’. And when you are more likely to believe a news report that supports an opinion you already hold, this is ‘confirmation bias’.

A colorful cartoon of rival football teams chanting - AI generated image by DALL-E 2

After taking this pathway, you will be able to identify the most significant cognitive biases that affect our lives, and know how to avoid them.

You will also have learned about the fascinating psychology experiments that revealed these biases, and the brilliant psychologists behind them. Lastly, you will have explored the evolutionary explanations for cognitive biases, and engaged with the debate over why they might not always be a bad thing.

Understanding Decision-Making

As we move through the world, we are constantly making decisions and judgments. Some choices we make come as second nature, and we do not even realize in that exact moment that a decision is being made. Mundanities like crossing the road when the light turns green are seemingly automatic habits that for most people require little effort, if any.

Crossing the road. Image: Zebra crossing, Leith Walk by Richard Webb, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons

On the other hand, we also face difficult decisions that demand more deliberation – getting married, moving interstate for work, changing careers. In these situations, some of us draw up a pros and cons list, or consult with close friends or mentors. Here, we tend to favor rationality over emotion, our brains over our hearts.

The ‘best’ course of action is one backed by numbers and cold, hard facts. But studies in cognitive psychology would beg to differ. Instead, research suggests that many of the decisions we make on a day-to-day basis are colored by unconscious biases that operate in the background.

When psychologists talk about bias, they're referring to our tendency to favor something or someone for whatever reason. Biases can lead to unfair situations – racial and gender stereotypes can lead to microaggressions in daily interactions. But, although the word ‘bias’ carries a negative connotation, biases are not bad per se; they are simply inclinations or preferences we carry.

One can make distinctions between conscious and unconscious biases. As the name suggests, conscious or explicit biases are those we are aware of. They help form an individual’s identity, such as in a self-proclaimed feminist.

Conscious bias can lead to good decisions – say, someone who identifies as an environmentalist who strives to minimize their carbon footprint. But conscious biases can also lead to less desirable outcomes, as when a hiring manager with racist beliefs consistently makes hiring decisions based on interviewees’ skin color.

Someone choosing to recycle. Image: R. Henrik Nilsson, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Unconscious biases can run counter to conscious biases. An individual may claim to be in favor of diversity and tolerance, but their actions may contradict their conscious beliefs. This is because unconscious or implicit biases often escape our control and consciousness.

So, how do humans come to hold biases? Some biases are a direct result of social and cultural influences – ideas we absorb from the media we consume, beliefs and attitudes modeled by our parents when we were growing up, or even ‘truths’ we learned from religious doctrine. Someone who received a strictly Catholic education, for example, may hold views on sexuality that fall in step with how they were taught in their formative years.

A traditional Catholic wedding. Image: Public domain via Wikimedia.

Biases may also come from the innately human habit of pattern seeking. Human brains look for patterns as a way of making sense of the world, and sometimes we may see patterns where there really are no meaningful ones to speak of.

In line with pattern recognition, human brains also have a tendency to take mental shortcuts to solve problems efficiently. These shortcuts are called heuristics, which refers to any of the techniques or ‘rules of thumb’ we employ in our everyday lives to simplify otherwise complicated cognitive tasks.

Mechanisms of Bias

Heuristics present themselves as common-sense tools that aid in judgment calls or decision-making. A commonly used example is the anchoring heuristic, in which we use the first piece of information we receive as a reference point for judging everything that follows.

For instance, a coat that was initially priced at $700 but is now for sale at 50% off will come across as a bargain, even though we may not necessarily have perceived it as such if it had originally been offered to us at $350.

Jackets on sale. Image: Peachyeung316, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

As in the previous example, heuristics are by no means infallible, but they often suffice when we have to make snap decisions based on limited information.

The concept of heuristics plays into the idea of ‘satisficing,’ where a good enough or satisfactory decision works just as well as an optimal solution. Think of it as a trade-off between accuracy and speed. Heuristics may not give us the perfect answer all the time, but they work well enough in most situations – and sometimes, good enough is good enough.
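The accuracy-speed trade-off can be made concrete with a toy search problem. In this hypothetical sketch (the listings, scores, and threshold are all invented for illustration), an ‘optimizing’ strategy inspects every option before choosing, while a satisficing strategy stops at the first option that is good enough.

```python
import random

random.seed(0)

# Hypothetical apartment hunt: each score is how well a listing suits us.
listings = [random.uniform(0, 10) for _ in range(1000)]

# Optimizing (system-2 style): inspect every option, then pick the best.
best = max(listings)
inspected_optimizing = len(listings)

# Satisficing (heuristic): take the first option that clears a threshold.
THRESHOLD = 8.0
inspected_satisficing = 0
choice = None
for score in listings:
    inspected_satisficing += 1
    if score >= THRESHOLD:
        choice = score
        break

print(f"Optimizing:  score {best:.2f} after {inspected_optimizing} inspections")
print(f"Satisficing: score {choice:.2f} after {inspected_satisficing} inspections")
```

The satisficer rarely finds the single best option, but it finds an acceptable one after a tiny fraction of the effort – which is precisely the bargain heuristics strike.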

For all their utility, one downside of heuristics is that, when they fail, they often lead to cognitive biases, a term coined by psychologists Daniel Kahneman and Amos Tversky in the 1970s.

When we attempt to oversimplify a complex world such as ours, we fall prey to unconscious thinking errors – conflating ideas, misinterpreting information, and even misremembering events. Errors can occur in any of the four steps of decision-making – when we gather information, when we process data, when we make judgments, or when we receive feedback.

Doomscrolling: By Japanexperterna.se - www.japanexperterna.se/?attachment_id=3068 (archived: https://web.archive.org/web/20240511030329/https://www.japanexperterna.se/17470913285_a8eae3ebc0_o), CC BY-SA 2.0, https://commons.wikimedia.org/w/index.php?curid=115769182

Cognitive biases arise when individuals allow their perception of reality to be shaped by their pre-existing ideas. When we are on social media, for example, we have a tendency to click on headlines that align with our beliefs and to scroll past those we disagree with. When a colleague receives a promotion, it's because they were lucky; but if we achieve a work milestone, it's because we earned it through hard work.

In his book Thinking, Fast and Slow, Daniel Kahneman, winner of the 2002 Nobel Prize in Economics, presents a framework that allows us to better appreciate why cognitive biases are so pervasive. Kahneman distinguishes between system 1 and system 2 thinking. System 1 is fast, instinctive, and emotional, but error-prone. In contrast, system 2 is deliberative, rational, and logical – but effortful and slow.

Nobel Prize Winner Daniel Kahneman. Image: nrkbeta, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons

Specifically, system 1 drives automatic processes that require minimal effort, like basic addition (1+1=2), understanding simple sentences, or judging distances. System 2 takes over when more conscious effort is involved – walking faster than normal, doing long division, or parallel parking.

System 1 allows us to act quickly; system 2 provides the rationality required by complex decision-making. Though the two systems seem diametrically opposed, they work in tandem and feed off each other efficiently. System 1 sends signals to system 2, and the latter supports the former when intuition falls short.

The Role of Intuition

Adults make roughly 35,000 decisions on a daily basis. Imagine a world in which system 2 handled every one of them: our brains would be bogged down by cognitive overload. So, while the slow, deliberate nature of our rational selves may appear more sensible, it makes sense that our intuition – system 1 – does around 98% of our thinking.

We need mental shortcuts to continuously process the world around us without frying our brain. We cannot afford anything less than quick action when we are faced with emergencies like an oncoming car or a blazing fire that needs to be put out.

In fact, psychologist Gerd Gigerenzer from the Max Planck Institute for Human Development posits that heuristics, rather than being indicative of the human brain’s limited capacity, make up an ‘adaptive toolbox’ to circumvent the limitations imposed by an uncertain world.

Gerd Gigerenzer. Image: Heinrich-Böll-Stiftung from Berlin, Deutschland, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons

That is, heuristics – and as an extension, cognitive biases – are a means for us to restore order amidst a haze of information. When we fail to see the ways in which our brain tricks us, our decision-making suffers. To address the potentially harmful outcomes of our biases, we must first be aware of them and be able to recognize them.

To date, researchers have identified around 200 cognitive biases. Some biases overlap, and their sheer number is enough to trigger information overload in its own right. A number of experts and thinkers have therefore sought to design frameworks that group biases into neat, more digestible categories.

Cognitive overload. Image: Braun Barbara, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons

Author Buster Benson suggests categories based on problems we encounter in decision-making – information, meaning, urgency, and memory. John Manoogian III’s Cognitive Bias Codex provides a visualization of this grouping. Meanwhile, Halvorson and Rock’s SEEDS model groups biases into five categories, each with a corresponding mitigation strategy – similarity, expedience, experience, distance, and safety.