How Do Mental Shortcuts Influence Our Decision-Making?

Examine how decision-making processes are more complicated than they appear, and how mental shortcuts drive most of the decisions we make in our daily lives.

Introduction

As we move through the world, we are constantly making decisions and judgments. Some choices we make come as second nature, and we do not even realize in that exact moment that a decision is being made. Mundanities like making a pot of coffee in the morning, brushing our teeth, crossing the road when the light turns green – these are seemingly automatic habits that require little effort on our part, if any.

On the other hand, we also face difficult decisions that demand more deliberation – getting married, moving interstate for work, changing careers. In these situations, some of us draw up a pros and cons list, or consult with close friends or mentors. Either way, we tend to favor rationality over emotion, our brains over our hearts. The ‘best’ course of action is one backed by numbers and cold, hard facts. But studies in cognitive psychology would beg to differ. Instead, research suggests that **many of the decisions we make on a day-to-day basis are colored by unconscious biases** that operate in the background.

Biases, conscious and unconscious

When psychologists talk about bias, they’re referring to our tendency to favor something or someone for whatever reason. Biases can create unfair situations – racial and gender stereotypes, for example, can lead to microaggressions in daily interactions. But although the word ‘bias’ carries a negative connotation, biases are not bad *per se*; they are simply inclinations or preferences we carry.

We can draw a distinction between conscious and unconscious biases. As the name suggests, conscious or explicit biases are those we are aware of. They help form an individual’s identity – think of a self-proclaimed feminist. Conscious bias can lead to good decisions – say, a self-styled environmentalist who strives to minimize their carbon footprint. But conscious biases can also lead to less desirable outcomes, as when a hiring manager with racist beliefs consistently makes hiring decisions based on interviewees’ skin color.

Unconscious biases can run counter to conscious biases. An individual may claim to be in favor of diversity and tolerance, but their actions may contradict their conscious beliefs. This is because unconscious or implicit biases often escape our control and consciousness.

How we form biases

So, how do humans come to hold biases? Some biases are a direct result of social and cultural influences – ideas we absorb from the media we consume, beliefs and attitudes modeled by our parents when we were growing up, or even ‘truths’ we learned from religious doctrine. Someone who received a strictly Catholic education, for example, may hold views on sexuality that fall in step with what they were taught in their formative years.

Biases may also come from the innately human habit of pattern seeking. Human brains look for patterns as a way of making sense of the world, and sometimes we may see patterns where there really are no meaningful ones to speak of.

In line with pattern recognition, human brains also have a tendency to take mental shortcuts to solve problems efficiently. These shortcuts are called **heuristics**, and are any of the techniques or ‘rules of thumb’ we employ in our everyday lives to simplify otherwise complicated cognitive tasks.

Heuristics, or mental shortcuts

Heuristics present themselves as common-sense tools that aid in judgment calls or decision-making. A commonly used example is the anchoring heuristic, in which we use the first piece of information we receive to **anchor** subsequent data. For instance, a coat that was initially priced at $700 but is now on sale at 50% off comes across as a bargain, even though we might not have perceived it as such had it originally been offered to us at $350.

As in the previous example, heuristics are by no means infallible, but they often suffice when we have to make snap decisions based on limited information. The concept of heuristics plays into the idea of ‘satisficing,’ whereby we settle for a good enough or satisfactory decision rather than search for an optimal one. Think of it as a trade-off between accuracy and speed. Heuristics may not give us the perfect answer all the time, but they work well enough in most situations – and sometimes, good enough is good enough.

Cognitive bias

For all their utility, one downside of heuristics is that, when they fail, they often lead to cognitive biases, a term coined by psychologists Daniel Kahneman and Amos Tversky in the 1970s. When we attempt to oversimplify a complex world, we fall prey to unconscious thinking errors – conflating ideas, misinterpreting information, and even misremembering events. Errors can occur at any of the four steps of decision-making – when we gather information, when we process data, when we make judgments, or when we receive feedback.

Cognitive biases arise when individuals allow their perception of reality to be shaped by their pre-existing ideas. When we are on social media, for example, we have a tendency to click on headlines that align with our beliefs and scroll past those we disagree with. When a colleague receives a promotion, it’s because they were lucky; but when we achieve a work milestone, it’s because we earned it through hard work.

Daniel Kahneman: System 1 vs. System 2

In his book *Thinking, Fast and Slow*, Daniel Kahneman, winner of the 2002 Nobel Prize in Economics, presents a framework that allows us to better appreciate why cognitive biases are so pervasive. Kahneman distinguishes between **system 1** and **system 2** thinking. System 1 is fast, instinctive, and emotional – but error-prone. In contrast, system 2 is deliberative, rational, and logical – but effortful and slow.

Specifically, system 1 drives automatic processes that require minimal effort, like basic addition (1+1=2), understanding simple sentences, or judging distances. System 2 takes over when more conscious effort is involved – walking more quickly than normal, doing long division, or parallel parking. System 1 allows us to act quickly; system 2 provides the rationality required by complex decision-making. Though the two systems seem diametrically opposed, they work in tandem and feed off each other: system 1 sends signals to system 2, and the latter steps in when intuition falls short.

In defense of intuition

Adults make roughly 35,000 decisions on a daily basis. Imagine a world in which system 2 took over every one of them: our brains would be bogged down by cognitive overload. So, while the slow, deliberate nature of our rational selves may appear more sensible, it makes sense that our intuition – system 1 – does 98% of our thinking. We need mental shortcuts to continuously process the world around us without frying our brains. We cannot afford anything less than quick action when faced with emergencies like an oncoming car or a blazing fire.

In fact, psychologist Gerd Gigerenzer of the Max Planck Institute for Human Development posits that heuristics, rather than being indicative of the human brain’s limited capacity, make up an ‘adaptive toolbox’ for circumventing the limitations imposed by an uncertain world. That is, heuristics – and, by extension, cognitive biases – are a means for us to restore order amidst a haze of information.

Naming (and categorizing) our biases

When we fail to see the ways in which our brains trick us, our decision-making suffers. To address the potentially harmful outcomes of our biases, we must first be able to name and recognize them.

To date, **researchers have identified around 200 cognitive biases**. Some biases overlap, and the sheer number of them is enough to trigger information overload. A number of experts and thinkers have therefore sought to design frameworks that group biases into neater, more digestible categories. Author Buster Benson suggests categories based on the problems we encounter in decision-making – information, meaning, urgency, and memory – which John Manoogian III visualizes in *The Cognitive Bias Codex*. Meanwhile, Halvorson and Rock’s *SEEDS model* groups biases into five categories – similarity, expedience, experience, distance, and safety – each with a corresponding course of action for mitigating it.
