Much is said about the repercussions of unchecked cognitive biases. This title examines how cognitive bias impacts different aspects of our everyday lives.
Why care about cognitive bias?
Cognitive bias can reach far and wide and entrench itself deep in our unconscious if left unchecked. Completely eradicating it is an impossible task and not necessarily something we should aim to do. But, in many aspects of life, making better decisions requires addressing the flaws in our reasoning. Doing so starts with acknowledging how unconscious biases cloud our thinking and color our perception.
As can be seen with biases like the peak-end rule, the framing effect, and hyperbolic discounting, understanding how our patterns of thinking play into everyday decisions allows us to leverage them for our benefit. Properly used, our biases can nudge us into making better choices.
Understanding how cognitive bias affects different aspects of our lives sets us up to better address it. Awareness of bias also gives us insight into the behavior of those around us. This knowledge allows us to head off unwanted behavior with the mitigation strategies at our disposal.
Information sharing in the age of the Internet
When the Internet started to grow in popularity, pundits rejoiced at how it would revolutionize the way the world shared information. Indeed, most of us can effortlessly jump on the Internet and read about any topic we want, but whether we as a society have reached an information utopia is debatable. Looking at how increasingly polarized politics has wreaked havoc in some countries, one might ask whether we are in fact growing more disconnected from each other.
It’s not just a matter of wildly disparate opinions either; we disagree with each other about what constitutes fact. Information sharing has become so frictionless and accessible that just about anyone with a mobile phone or computer – and Internet access – can become a content creator. The information we consume and spread on social media need not come from a subject matter expert. It might be a random stranger’s late-night thoughts, written with an air of authority. Worse, it could be a bad actor intentionally posting ‘alternative facts’ to prey on our gullibility – and our cognitive biases. *Do you believe what you are reading now?*
Cognitive bias in a post-truth era
Conspiracy theories and misinformation predate the Internet. But our current landscape lets us indulge in selective interactions more than ever. We ‘follow’ or ‘like’ thought leaders that share our convictions, and we ‘mute’ or ‘unfollow’ conflicting views. Customizing our news feeds lets us construct silos, where we are safe from the discomfort of cognitive dissonance. Tethered to our in-groups, we can nod in agreement and engage in groupthink.
Our internal biases aside, social media algorithms confine us in a self-reinforcing loop. Internet activist Eli Pariser coined the term ‘filter bubble’ to highlight how algorithms shape worldviews. In *The Filter Bubble*, Pariser explains how two people performing the same Google search can receive wildly different results. Algorithms take our search history, extrapolate a persona, and feed back results that supposedly reflect our values.
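To make that loop concrete, here is a minimal, purely illustrative sketch – not any platform’s actual algorithm – of how ranking content by an inferred interest profile narrows what a user sees over time. The topics, catalog, and scoring rule are all invented for the example.

```python
from collections import Counter

# Toy model of a personalization feedback loop (illustrative only;
# real recommender systems are far more complex and proprietary).

def rank_items(items, profile):
    """Score each item by how strongly its topic matches the user's history."""
    return sorted(items, key=lambda item: profile[item["topic"]], reverse=True)

def simulate_feed(rounds=5):
    # A single early click seeds the inferred persona.
    profile = Counter({"politics_left": 1})
    catalog = [
        {"title": "Op-ed A", "topic": "politics_left"},
        {"title": "Op-ed B", "topic": "politics_right"},
        {"title": "Science news", "topic": "science"},
    ]
    for _ in range(rounds):
        feed = rank_items(catalog, profile)
        clicked = feed[0]               # the user clicks the top-ranked item...
        profile[clicked["topic"]] += 1  # ...which reinforces the inferred persona
    return profile

print(simulate_feed())  # Counter({'politics_left': 6}): the bubble hardens
```

Because the top-ranked item is always the one that best matches past clicks, the dissenting and unfamiliar items never surface, and the extrapolated persona only ever confirms itself.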
The longer we keep to our silos, the higher the walls around us. To quote author and academic Lee McIntyre, *“We are all beholden to our sources of information. But we are especially vulnerable when they tell us exactly what we want to hear.”*
Cognitive bias in politics
Politics is another area heavily impacted by cognitive biases. Public opinion can be manipulated by encouraging the polarization of sociopolitical views, appealing to emotions (especially our fears), and selectively reporting information.
Our political ideologies are integral to our self-identity. We cling to existing beliefs and trust potentially suspect news sources – so long as they align with our views. And we are hypercritical of those who stand on the other side of the political fence. As political views shift further apart, differences in opinion can devolve into mistrust and hostility. Unchecked, the political divide becomes personal, no longer focused on ideologies or even the goal of nation building.
The peak-end rule and availability heuristic can also benefit unscrupulous politicians running for re-election. They ‘behave’ in the months leading up to elections, pull some PR stunts to garner the sympathy of the electorate, then revert to their old ways once they secure another term in office. After all, voters care about the issues of the day. Misdeeds from years past are easy to forgive when they are not fresh in the mind.
Cognitive bias and the 2008 global financial crisis
Numbers dominate the financial world – monetary figures, interest and discount rates, probabilities for different outcomes. Quantitative models guide decision-making in investment banks and financial firms, but ultimately, humans make the final call.
In 2007, the US housing bubble burst. The global financial crisis that followed resulted from a complex interaction of factors culminating in a perfect storm – including, thanks to the human element, cognitive bias. First was investors’ overconfidence that housing prices would appreciate indefinitely. They had seen the historical trend and predicted future growth from past performance – symptomatic of the hot-hand fallacy, the belief that a streak will continue. With the expectation of massive gains, investors took out mortgages, which they defaulted on when the bubble burst and prices dropped. Suffering sizable losses and driven by loss aversion, investors rid their portfolios of ‘risky’ assets. This selling spree caused a further dip in prices, worsening the burgeoning crisis.
On the institutional side, hyperbolic discounting explains why bankers accepted an excessive level of long-term risk: the prospect of massive immediate gains blinded them to what would eventually transpire.
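Hyperbolic discounting has a standard textbook form, V = A / (1 + kD), where A is the future amount, D the delay, and k an impatience parameter. The sketch below contrasts it with ordinary exponential discounting; the parameter values are arbitrary illustrations, not market data.

```python
import math

# Present value of a future payoff under two discounting models.
#   V_hyp = A / (1 + k*D)    -- standard hyperbolic form
#   V_exp = A * exp(-r*D)    -- standard exponential form
# k and r below are purely illustrative parameters.

def hyperbolic_value(amount: float, delay: float, k: float = 0.5) -> float:
    return amount / (1 + k * delay)

def exponential_value(amount: float, delay: float, r: float = 0.5) -> float:
    return amount * math.exp(-r * delay)

for delay in (0, 1, 2, 5, 10):  # delay in years
    print(f"{delay:>2}y  hyperbolic: {hyperbolic_value(100, delay):6.1f}  "
          f"exponential: {exponential_value(100, delay):6.1f}")
```

Under the hyperbolic curve, most of a payoff’s perceived value evaporates within the first year or two, after which the curve flattens – so this quarter’s bonus towers over a risk that sits five or ten years out.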
Cognitive bias in consumer marketing
Between the early 2000s and the mid-2010s, behavioral economics garnered three Nobel Prize wins. As ‘cognitive bias’ became a buzzword, marketers gleefully counted the ways in which businesses could leverage our collective biases to better market to consumers.
Some practices we see in consumer marketing nowadays are deliberate applications of concepts from cognitive bias research. Others were in use long before psychologists and behavioral economists gave them a name. Take the mere-exposure effect, which refers to people’s tendency to prefer the familiar – what they are repeatedly exposed to. Household names like Coca-Cola and McDonald’s continue to invest heavily in advertising because constant exposure makes us more likely to regard them in a positive light.
The phenomenon of shrinkflation applies the concept of loss aversion. Instead of raising prices, businesses opt to reduce the size of a product to cover additional costs. This way, consumers are spared the sting of higher prices – a loss from their perspective. Besides, consumers’ fixation on price means that they will sooner notice a price increase than a size reduction.
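A quick back-of-the-envelope calculation, using hypothetical product numbers, shows why this works: shrinking the package is mathematically identical to raising the unit price, yet only the sticker price is salient to the shopper.

```python
# Shrinkflation as a hidden price increase (hypothetical numbers).

def unit_price(price: float, grams: float) -> float:
    return price / grams

before = unit_price(3.00, 500)  # $3.00 for a 500 g package
after = unit_price(3.00, 450)   # same $3.00, quietly reduced to 450 g

increase = (after - before) / before * 100
print(f"Effective price increase: {increase:.1f}%")  # ~11.1%
```

A shopper who would balk at a sticker price of $3.33 barely registers the same 11% increase hidden in the package size.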
More cognitive bias in consumer marketing
Businesses use free trials to capture potential customers who have one foot in the door. Most people won’t refuse a free trial, and, once they’re enjoying a service, they will likely place a higher value on what they already consider theirs – the endowment effect. When the trial lapses, the odds of them paying for the service are higher.
Some trials require a credit card to sign up. Once the trial lapses, users who do not cancel their subscription will find their credit card automatically charged. It’s an annoying feature for individuals interested only in the free trial, but it is a clever application of hyperbolic discounting. The consumer wants to enjoy the free subscription immediately, so they’ll worry about the inconvenience of unsubscribing later. If they forget to do so, the company makes money.
Alternatively, say the user confirms their subscription. The service sends them a quick email to affirm what a good choice they’ve made. It’s a simple implementation, but the reassurance makes the user feel good about their decision, effectively leveraging confirmation bias.
Cognitive bias in disaster preparedness
Scientists worldwide had been sounding the alarm about the threat of a pandemic long before COVID-19 became a global problem. But their warnings fell on deaf ears, and countries were blindsided. The same can be said of other disasters. Massive floods, wildfires, and earthquakes – while we cannot predict them with complete accuracy, they are usually a question of when, not if. We know we should prepare for them. But most of the time, we do not.
Cognitive bias is a mental barrier to disaster preparedness. The normalcy bias, or ostrich effect, leads us to avoid negative information because we are averse to loss and risk. We take a stance of denial and focus on the positive instead.
One can argue that governments operate on limited resources. Should they budget for a hypothetical future disaster, or should they fund programs that address more urgent problems? When weighing the options for such a dilemma, avoiding the optimism bias is crucial. Otherwise, we might discount the threat of a future danger in favor of a current crisis.
The ostrich effect
In their book *The Ostrich Paradox*, Robert Meyer and Howard Kunreuther identify six biases that prevent us from sufficiently preparing for disaster – myopia, amnesia, optimism, inertia, simplification, and herding. When things are calm and trouble is nowhere to be seen on the horizon, we tend to focus on short timeframes and forget the importance of investing resources for the long term. We forget past catastrophes and the lessons they teach us, and, because we have the luxury of temporal distance, we downplay the likelihood of potential future threats. As such, we choose the easy way out. We maintain the status quo.
Even when we take the time to consider potential disasters, we tend to simplify them and take a narrow view of matters. When looking at a hypothetical situation laced with uncertainty, the distance and unfamiliarity of the situation allow complexity and nuance to fall by the wayside. And so, we may instead choose to copy what others around us are doing, because, as they say, there’s safety in numbers.
Cognitive bias and the COVID-19 pandemic
In humanity’s quest to overcome the COVID-19 pandemic, vaccine uptake has been a hot issue. Governments and epidemiologists have tried to get as much of the population as possible vaccinated and boosted, but goals set for achieving herd immunity have largely been scrapped, despite best efforts to encourage compliance.
To address vaccine hesitancy due to mistrust, for example, organizations sought to demonstrate transparency by releasing reports on adverse effects. The goal was to show skeptics that vaccines were safe and that incidence of adverse reactions was low. This plan failed to consider two cognitive biases.
First, a skeptic’s propensity to seek confirmatory information meant that, when they read about adverse events, regardless of how rare those events were, they fell further into the belief that vaccines were indeed unsafe. Second, reading detailed reports on adverse reactions created a vivid picture of what could go wrong. Holding this imagery in mind can trigger the availability bias, meaning that this information – the potential risks of getting vaccinated – would weigh more heavily on the decision of whether to get vaccinated.
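The base-rate arithmetic here is worth spelling out, with hypothetical figures: even a vanishingly low adverse-event rate yields a large absolute number of vivid reports once enough people are vaccinated, and it is the count of stories, not the rate, that the availability bias latches onto.

```python
# Why low-incidence adverse events can still feel common
# (all figures below are hypothetical, for illustration only).

vaccinated = 10_000_000      # people vaccinated
adverse_rate = 2 / 100_000   # assumed rate: 2 adverse events per 100,000

reports = vaccinated * adverse_rate
print(f"Adverse-event rate: {adverse_rate:.5%}")          # 0.00200%
print(f"Individual reports available to read: {reports:,.0f}")  # 200
```

Two hundred detailed accounts are more than enough to fill a skeptic’s feed, even though the assumed rate is two in a hundred thousand.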
Cognitive bias in healthcare
Cognitive bias weighs heavily on healthcare, with potentially grave consequences. According to The Joint Commission, many sentinel events – adverse events resulting in patient death or severe injury – are caused by unconscious biases. Diagnostic errors alone account for 6–17% of adverse events in hospitals.
In the hospital setting, we often see confirmation bias operating in tandem with the anchoring bias. When a patient first recounts their symptoms, their physician forms a first impression. This first impression then becomes the anchor around which further testing and evaluation are conducted. As humans, we tend to seek confirmation of what we believe. Some physicians fail to consider and test for alternative diagnoses. Others may cling to an incorrect diagnosis despite evidence pointing to other probable causes. Unfortunately, misdiagnosis can lead to delayed treatment, further injury, or death.
Healthcare workers are often time-poor and overworked. Whether consciously or not, they rely on mental shortcuts. This reality highlights the importance of a systems approach to tackling bias and minimizing human error, so that stakeholders can enjoy better outcomes.