What are some Cognitive Biases that Result from Ambiguity?

Humans create their own stories to fill in gaps where meaning is lacking, resorting to generalizations and simplifications as a way of adapting.

Introduction

The human brain has an affinity for structure and order, and so it seeks patterns to create meaning out of the randomness of everyday life. Pattern seeking is one of the ways we learn. As children, we discover that when we place our hands on something hot, we hurt ourselves, and that when we squabble with a sibling, we get a scolding. Recognizing patterns provides us with a level of predictability and order, and we learn to avoid activities that lead to undesirable outcomes.

And so, we grow up seeking meaning in the seemingly mundane and arbitrary. That’s why, as a species, we love clichés like ‘everything happens for a reason.’ We dislike it when unpredictable things happen for no apparent reason, because we yearn to control the world around us.

In the process of trying to fit chaos and randomness into neat little boxes that match our worldview, our brains take shortcuts that sometimes harden into cognitive biases. In trying to simplify a complicated world, we end up oversimplifying it, and this can lead to erroneous decision-making.

The gambler’s fallacy

Imagine that a series of coin tosses produces a consecutive sequence of five heads. Is the next outcome more likely to be heads or tails?

It’s a bit of a trick question, isn’t it? Nothing in the previous paragraph provides any hint as to what the next outcome will be. A fair coin toss has an equal chance of producing either heads or tails.

Does it matter that we’ve been seeing heads consecutively? It shouldn’t. But a brain that likes to impose order on the world may tell us otherwise. If it’s truly a 50/50 chance of heads versus tails, one might intuit that logic ‘demands’ a tails soon, because six heads in a row is just ‘ridiculous.’ Except it isn’t, because each coin toss is independent of the others and has no bearing on the next outcome. When we fall prey to the **gambler’s fallacy**, we believe that previous outcomes of a random event affect the probability of a future event, even though causality does not figure into the equation.
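If the intuition is hard to shake, a quick simulation can make the independence concrete. The sketch below is illustrative only – it is not from the original article, and the toss count and variable names are arbitrary choices – but it tallies what actually follows a run of five heads across a long sequence of simulated fair tosses:

```python
import random

# Simulate a long run of fair coin tosses and record the outcome that
# immediately follows every streak of five consecutive heads.
random.seed(42)
tosses = [random.choice("HT") for _ in range(1_000_000)]

follow_ups = []
for i in range(5, len(tosses)):
    if tosses[i - 5:i] == ["H"] * 5:
        follow_ups.append(tosses[i])

share_heads = follow_ups.count("H") / len(follow_ups)
print(f"Five-head streaks observed: {len(follow_ups)}")
print(f"Share of heads on the very next toss: {share_heads:.3f}")  # hovers around 0.5
```

Run it with any seed and the share stays near 0.5; no number of prior heads nudges the next toss toward tails.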

Monte Carlo Casino, 1913

The Monte Carlo Casino incident of 1913 illustrates how operating under the spell of the gambler’s fallacy can lead to disaster. On the night of August 18, 1913, a roulette wheel at the casino landed on black for 10 consecutive spins. Believing that a red outcome was overdue, gamblers started betting heavily on red.

The more the wheel landed on black, the more gamblers expected it to go red. As the roulette wheel rebelled against the crowd’s expectations, its audience ballooned in size, and so did their wagers. By the time the gamblers ‘won’ and the wheel finally landed on red, the crowd had collectively lost a fortune to the casino. The streak had run to 26 consecutive black spins before a single red outcome appeared.
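For a sense of scale, here is a rough back-of-the-envelope calculation, under the simplifying assumption of an even red-or-black chance on each spin (a real wheel’s green zero lowers both probabilities slightly). The streak itself is astronomically unlikely, yet the odds on the next spin never move:

```python
# Simplified model: assume black and red are equally likely on each spin,
# ignoring the green zero on a real roulette wheel.
p_black = 0.5

# Probability that a pre-specified run of 26 spins comes up black every time
p_streak = p_black ** 26
print(f"P(26 blacks in a row) = 1 in {1 / p_streak:,.0f}")  # 1 in 67,108,864

# Probability of black on the next spin, given 26 blacks already observed:
# spins are independent, so it is still 0.5 -- exactly what the crowd missed.
print(f"P(black on the next spin) = {p_black}")
```

The improbability belongs to the 26-spin streak taken as a whole, not to spin 27; by the time the crowd was piling onto red, the ‘unlikely’ part had already happened.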

The moral of the story? Random events are just that: random. They are in no way affected by previous events. To think otherwise can lead to catastrophic loss, as the Monte Carlo Casino’s patrons learned that fateful night.

In search of balance and order

Besides the go-to examples in gambling, the gambler’s fallacy appears elsewhere in life. Following a streak of bad luck, we might complain, *“There’s no way to go but up now. Things have to turn around soon.”* Unless we’re actively doing something to change our luck, that mindset is symptomatic of biased thinking.

Playing the stock market is its own brand of gambling too. According to economists, amateur investors *“sell winners too early and hold losers too long,”* squandering their investments. They reason that, if a stock price is surging, it’s bound to dip soon. They misunderstand the fundamentals of stock pricing and fail to see that, although future stock prices depend on many factors, past performance is not part of the equation.

The line of thinking behind the gambler’s fallacy betrays our desire for balance and order: we incorrectly expect even short runs of random events to reflect the long-run probabilities at hand. To avoid this flawed reasoning, we should ask ourselves why we assume certain causal relationships exist, and whether those expectations are reasonable.

In-group bias

There’s almost an unwritten rule in team sports that fans of rival teams shouldn’t fraternize with each other. The same can be said of fervent supporters of political candidates during election periods. We treat individuals within our group with kindness and respect, and those outside our group with just a bit more skepticism and, perhaps, condescension. This preference doesn’t depend on shared values or culture, nor on camaraderie or shared experience: **in-group bias** – where we display preferential treatment for members of our in-group – can be observed even in the most arbitrary and meaningless of groups.

In-group bias can go beyond merely favoring our group members. It manifests itself in the unfair treatment of individuals in the out-group, and can even see us engaging in immoral acts for the benefit of our in-group. At its worst, it can result in actively harming outsiders – for example, hate crimes against minorities. When left unchecked, widespread in-group bias can lead to social injustice.

Why we favor our groups

The concept of in-group bias was first introduced by sociologist William Sumner in his 1906 work on ethnocentrism. He pointed out how humans tend to form groups, where *“each group nourishes its own pride and vanity, boasts itself superior, exalts its own divinities, and looks with contempt on outsiders.”* On a grander scale, this in-group bias is what he referred to as ethnocentrism.

Social identity theory provides insight into why this tendency exists. Being part of a group contributes to our concept of self. A woman, a mother, a nurse, a Muslim – these are all social categories that help us form a self-image and piece together our role in larger society.

But why do we, as members of social groups, hold such strong biases in favor of our in-groups? It’s because we humans like to feel good about ourselves. Being part of a group we take pride in elevates our self-esteem. But if we take that a few steps further, it may lead to putting down out-groups for the sake of elevating our in-groups.

Overcoming in-group bias

Affinity for our in-groups is healthy, but we have to keep in-group bias under control lest it transform into prejudice and discrimination against the ‘other.’ Bending rules for people in our social groups, allocating a larger portion of limited resources to our in-groups, mistreating and mistrusting those who are not part of our circle, and considering ourselves superior to those who aren’t like us – these behaviors have serious consequences when left unchecked, such as structural inequality, institutional racism, and violence against women.

Studies suggest that one way to reduce the impact of in-group bias in group dynamics is to provide clear incentives for working together. A 1950s study by Muzafer Sherif demonstrated that forced cooperation and a shared goal can help to reduce the effects of in-group bias. Through repeated interactions with members of an out-group, an individual’s in-group may shift and expand to include the out-group.

The just-world hypothesis

One of the most disheartening things to witness when a sexual assault complaint goes public is the victim blaming that often follows. Instead of being offered empathy, the victim is questioned about what they could have done differently to ‘protect’ themselves. Maybe if they had dressed more conservatively, drunk more judiciously, or acted less flirtatiously that night?

Though most people are not prone to making such callous suggestions, psychology does provide insight into why some people engage in victim blaming. The **just-world hypothesis** refers to the belief that the world is fair and that, therefore, whatever happens to us is deserved. If we do good, we are rewarded. If we do wrong, we are punished. So, when misfortune befalls someone, we justify it by looking for some flaw – *“He has cancer because he doesn’t watch what he eats”* – focusing on personal traits instead of situational factors.

Why a just world works – in theory

Dr. Melvin Lerner’s formal research on the just-world mindset is credited with bringing attention to the topic. During his training as a psychologist, Lerner was intrigued by what he observed: though the healthcare practitioners he worked with were kind and intelligent, they did not extend the same kindness to the patients under their care. In fact, they were often critical of their patients. This observation kicked off Lerner’s studies of justice beliefs.

Our belief in a just world, according to Lerner, is functional. A just world suggests a predictable world – one in which our actions dictate the consequences we face. Thus, it gives us a semblance of control and protects us from helplessness, ultimately benefiting our mental wellbeing.

We don’t come into this world expecting it to be fair. But, as children, we are socialized into this worldview. We hear stories about heroes who punish villains for their misdeeds, and we are taught, through positive and negative reinforcement, about the consequences of our actions. For better or worse, we carry this belief into adulthood.

Toward an empathetic world

A just-world mindset starts out functional and adaptive, but holding on too tightly to this belief is like wearing blinders. When we assume that another person’s suffering is their own doing, we turn a blind eye to the circumstances that may have contributed to their situation. When we say that all poor people are lazy, we ignore the social structures that perpetuate inequity and the mechanisms that keep poor people poor. We disregard how many low-income people work far longer hours, at far more back-breaking jobs, than millionaires do.

Strong belief in a just world impacts our justice system and social support structures. When our perception of the world tells us that everyone gets what they deserve, we fail to see where our systems need changing. When we come to accept that the world may not always be fair, only then can we move toward empathy and a kinder sense of justice.
