What Cognitive Biases Arise from Urgency?

Deliberating on every decision we make would be exhausting and time-consuming. Fortunately, the brain has shortcuts for that – shortcuts aplenty.

Introduction

Daily living entails an endless list of decisions we need to make, ranging from the mundane (what I want for lunch) to the life-changing (whether I should marry this man). Some decisions are straightforward, requiring only a simple yes or no; others pose a seemingly infinite set of options.

With the constant barrage of questions that need answering and decisions that need making, the human brain uses a number of techniques to expedite decision-making. We gravitate towards the safety of simplicity rather than the uncertain and complex. We are inclined to overestimate our abilities so that we are armed with the confidence to act. We zero in on what’s in front of us instead of worrying too much about a distant, hypothetical future. And we finish the things we start, sometimes with a doggedness that defies rationality.

Fundamental attribution error

Are high-performing students successful at school because they are smart? Without giving it too much thought, most people would agree that they are.

As humans, we are prone to associate high performance with competence and vice versa. We look at others’ actions and assume that these reflect how they are as people, ignoring the contexts and circumstances that may have driven them. This cognitive bias is called the **fundamental attribution error**. It’s why we automatically think that students who earn consistently good grades are intelligent. Maybe they are, but it could just as easily be because they work hard and have good study habits. Or they might come from a background of privilege or have a robust support system outside of school.

Though reality is often complicated and driven by countless external factors, the human brain likes simplicity. And so, we tend to overemphasize people’s personal traits when assessing their actions, even if situational factors are more likely to blame – or thank – for what they do.

Jones and Harris, 1967

A classic example of the fundamental attribution error is the 1967 experiment by Edward E. Jones and Victor Harris. The study asked individuals to write essays on Fidel Castro and assigned them to take either a positive or negative view. Another set of participants then read these essays and rated how each writer felt about Castro. Writers assigned to take a positive stance were judged pro-Castro, while those who wrote against him were rated anti-Castro.

The same assessments persisted even after subjects found out that the writers had merely been assigned a position to take. Readers ignored the external factor shaping each essay and stuck with their assumption that it reflected its author’s personal opinion of Castro.

One reason behind the fundamental attribution error could be unawareness of the situations driving others’ behavior. But even taking this out of the equation, reasoning out why people act the way they do demands cognitive effort. It is easier to assume that people do things because it’s just ‘the way they are.’

The impacts of fundamental attribution error

Failure to see past the lens of fundamental attribution error can lead to incorrect and unfair judgments. People do not always engage in objectionable behavior just because. Their actions may be a product of circumstance. If we do not afford others generosity in judgment, unfair assessments can turn into stigma.

Consider individuals struggling with substance abuse. They get a bad rap for ‘not helping themselves.’ If they internalize that their addiction stems from an innate personality trait, what hope do they have of overcoming it? If we do not acknowledge the barriers to their progress, how can we address systemic issues and provide proper support?

Conversely, associating good performance with competence without recognizing context can also be dangerous. Compare a mediocre salesperson who operated in a high-growth market with a colleague who hustled tirelessly in a challenging environment. Mistaking the former’s sales numbers for innate capability rewards the wrong employee if they are promoted over the latter; and when that newly promoted manager must navigate tough markets in the future, the company will suffer.

Hyperbolic discounting

Humans love instant gratification, so much so that we opt for immediate rewards over long-term benefits, even when the latter are objectively preferable. Or at least, the concept of **hyperbolic discounting** says that we tend to do so. Isn’t that why, instead of investing this month’s extra cash into a retirement fund, we treat ourselves to a nice dinner? Why we put off until tomorrow the chore we have been avoiding all week? Or why scientists clamoring for climate action are so often ignored: acting now entails an immediate expense, while leaving the problem to the next decade might prove exponentially more costly – but at least that is a problem for the future.

Through the lens of hyperbolic discounting, the future is a hazy, hypothetical concept where we magically get around to doing all the things we have been putting off. Meanwhile, the here and now deserves our full attention and enjoyment. That is why, when other variables are more or less comparable, immediacy can play the deciding role.

Loewenstein and Thaler, ‘intertemporal’ choice

What do driver’s licenses have to do with high school dropout rates? Apparently, according to George Loewenstein and Richard H. Thaler’s 1989 paper *Anomalies*, they make a good pair for demonstrating how hyperbolic discounting works – at least in the context of late 1980s West Virginia. When the state passed a law revoking driving permits of school dropouts under the age of 18, it saw dropout rates fall by a third. The economists surmised that such a staggering effect could only be caused by what they called ‘extremely myopic preferences.’

Our ‘temporal myopia’ makes us focus on the here and now because the hypothetical future is uncertain. But when the consequences of our actions are immediate, as in the West Virginia example, we may reconsider our choices. Conversely, when it comes to receiving a reward, our risk aversion makes us want to secure the gain immediately, even if that means accepting a somewhat smaller amount. Perceptually, the farther away a reward lies, the smaller its value appears.
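
To make that ‘smaller with distance’ idea concrete: hyperbolic discounting is commonly modelled with the function V = A / (1 + kD), where A is the reward, D the delay, and k an impatience parameter. The article does not commit to a specific model, so the sketch below is only a minimal illustration using that textbook form with made-up numbers, contrasted with a time-consistent exponential benchmark.

```python
import math

# A minimal sketch of hyperbolic vs. exponential discounting.
# The formula V = A / (1 + k * D) and every parameter value below are
# illustrative assumptions, not figures from the studies cited in this article.

def hyperbolic_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Perceived present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

def exponential_value(amount: float, delay_days: float, r: float = 0.005) -> float:
    """A constant-rate benchmark that never reverses its preferences."""
    return amount * math.exp(-r * delay_days)

if __name__ == "__main__":
    # Near choice: $100 now vs. $110 in 30 days -> the immediate reward "looks" bigger.
    print(hyperbolic_value(100, 0), hyperbolic_value(110, 30))        # 100.0 vs. 44.0
    # Far choice: same $10 gap and 30-day wait, pushed a year out -> the later $110 now wins.
    print(hyperbolic_value(100, 360), hyperbolic_value(110, 390))     # ~5.26 vs. ~5.37
    # The exponential benchmark is consistent: it favors the same option in both cases.
    print(exponential_value(100, 0) > exponential_value(110, 30),
          exponential_value(100, 360) > exponential_value(110, 390))  # True, True
```

The second line is the telling one: once both rewards sit a year away, the hyperbolic curve flips and the larger, later reward looks better again, which is one concrete reading of the temporal myopia described above.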

Potential perils of hyperbolic discounting

When we sacrifice too much of the long term in favor of the present, we fail to invest in long-term outcomes that benefit us – be it our financial security, physical health, or mental wellbeing in later years.

When organizations prioritize short-term profit over anything else, their flawed decision-making jeopardizes their own long-term prospects and sustainability. We see this mindset in companies where middle management is constantly under pressure to outperform themselves month after month. Management focuses on looking good now, as there is no point in planting seeds for the future when one will have already moved on to greener pastures by the time their efforts bear fruit.

The same goes for governments of the day that care mainly about getting re-elected. They focus on currying favor with their constituents instead of making sound long-term plans and programs that truly benefit the public. A government with such a myopic view ultimately does a disservice to its people.

Sunk cost fallacy

Advice columns often feature toxic friendships and relationships where one party knows that the dynamic isn’t working but they’ve just invested too much time and effort to walk away. *“I’ve known this person since I was a toddler. She’s my oldest friend,”* or, *“I can’t just give up on a 10-year relationship.”* This line of thinking is symptomatic of the **sunk cost fallacy**.

Close relationships are naturally difficult to break off – doing so pushes us out of our comfort zones. But the regret and guilt we feel over walking away from something we have invested time and effort in come from cognitive bias. Instead of weighing future costs against the benefits we expect from maintaining the status quo, we fixate on the past – on resources we have already expended and can no longer recoup whether we stay or leave.

In reality, the mental equation needs only two variables: what it will cost us to stay and what we stand to gain by staying. However that equation works out, the sunk cost is irretrievable and therefore irrelevant to it – a mere distraction, so to speak.
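
As a toy illustration of that mental equation, the sketch below (with a hypothetical `should_continue` helper and made-up figures) decides whether to keep a project running purely on expected future benefit versus future cost; the amount already spent is passed in only to show that a rational rule never reads it.

```python
# A toy sketch of a sunk-cost-free stay-or-leave rule.
# The function name and all figures are hypothetical, chosen only for illustration.

def should_continue(future_cost: float, future_benefit: float, sunk_cost: float = 0.0) -> bool:
    """Continue only if what we still stand to gain exceeds what we still have to spend.

    `sunk_cost` is accepted to mirror how people frame the decision, but the rule
    deliberately ignores it: money already spent cannot be recovered whether we
    stay or leave, so it has no bearing on the choice.
    """
    return future_benefit > future_cost  # sunk_cost is intentionally unused

# Example: $2M already spent, $1.5M still needed, only $1M of benefit expected.
# The fallacy says "we've invested too much to quit"; the rule says walk away.
print(should_continue(future_cost=1_500_000, future_benefit=1_000_000, sunk_cost=2_000_000))  # False
```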

Pitfalls of clinging to sunk cost

The sunk cost fallacy frequently plays out in financial decision-making. We embark on projects with optimism. Along the way, challenges sober us up. We may eventually realize that our undertaking will not deliver as expected. In such cases, a rational decision requires an objective assessment of the situation.

Unfortunately, we are less rational than we think we are. This irrationality explains why some government projects persist even when sustaining them no longer makes sense. These programs waste taxpayers’ money – resources that could be diverted to more deserving projects. Likewise, companies sometimes pour more funds into failing business units out of the misguided optimism that the tide will turn and they will recoup their investment.

**Loss aversion** is partly to blame for these scenarios – the notion that investments in a project are not ‘lost’ as long as it keeps running. We also dread the narrative of failure, of having given up prematurely. All this highlights how tricky stay-or-leave decisions can be: on top of the objective considerations required of us, we have to contend with feelings of pride, loss, and discomfort.
