The breadth of information we encounter daily challenges our brain’s limited memory capacity. As such, the brain optimizes storage capacity using a few mental tricks.
Human memory is a tricky thing. Although recent studies suggest it can be surprisingly accurate, the long-standing consensus is that memory is inconsistent, unreliable, and ever-changing. True enough, we carry cognitive biases that stem from limitations in the way we encode, store, and retrieve information.
For example, we are prone to simplifying observations into generalities, shedding the extra weight of specifics. We trim experiences down to their most crucial components. And even after we have locked information away in the depths of our minds, we continue to edit it, consciously or not, so that every time we retrieve a memory from storage, it looks just a little different than it did before.
The concept of **confirmation bias** dates back to the Greek historian Thucydides, who opined, *“For it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.”* To this day, humans engage in the questionable habit of favoring evidence that conforms to our existing beliefs.
In 1960, cognitive psychologist Peter Wason demonstrated that people seek not to disprove their beliefs and hypotheses but to confirm what they already believe. Research subjects had to identify the logical rule behind the number series 2, 4, 6. Each then proposed their own three-number series to test whether their theory was correct (e.g. 8, 10, 12), and could keep offering further series to test it. However, instead of exploring a wide range of options – choosing numbers that fit their initial hypothesis as well as numbers that did not – most participants only offered series conforming to the rule they wanted to prove. Most subjects failed the task because they could not see past their hypothesis, making no effort to disprove it or consider other options.
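The logic of Wason's task can be made concrete in a short sketch. The hidden rule in the original study – any three numbers in ascending order – is broader than the rule participants typically hypothesize ("each number is two more than the last"); the specific triples below are illustrative, not drawn from the study:

```python
def hidden_rule(triple):
    """The experimenter's actual rule: any ascending sequence."""
    a, b, c = triple
    return a < b < c

def participant_hypothesis(triple):
    """A typical participant's guess: each number is 2 more than the last."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory tests: triples chosen because they fit the hypothesis.
confirming = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]

# Every confirming guess also satisfies the hidden rule, so such tests
# can never distinguish the narrow hypothesis from the true, broader rule.
assert all(hidden_rule(t) for t in confirming)

# A disconfirming test is what reveals the truth: this triple violates
# the hypothesis yet still passes, proving the hypothesis is too narrow.
probing = (1, 2, 3)
assert hidden_rule(probing) and not participant_hypothesis(probing)
```

Only a guess designed to *break* one's own hypothesis carries any information here – which is exactly the move most of Wason's subjects never made.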
The path of least resistance
There’s a certain satisfaction in being right. It strokes our ego and boosts our self-esteem. Conversely, we do not like being told we are wrong because our self-esteem takes a beating. So, to protect our self-confidence, we seek out information that aligns with our beliefs.
Some opinions we maintain can turn into deeply held beliefs over time, potentially forming part of our self-identity. When these beliefs are challenged, it can feel like the very core of our being is likewise under threat. This unease further heightens our need to be correct because being wrong could send us into an existential crisis.
Ultimately, confirmation bias is a mental shortcut meant to expedite information processing. We save mental resources by simply reaffirming existing beliefs instead of exploring alternative views. It protects us from the need to acquaint ourselves with new ways of seeing and thinking about a topic – a win for efficiency.
Overcoming confirmation bias
Failure to consider viewpoints that do not align with our own can cause us to overlook crucial details that could affect our decision-making. If our view on important matters is restricted by willful blindness, we cannot make truly informed choices.
Confirmation bias can have a significant impact on individuals. Take someone who chooses to only see the positive qualities of a romantic partner and turns a blind eye to red flags. This individual may end up staying in an abusive or otherwise unhealthy relationship. On a wider scale, confirmation bias can affect the level of public discourse and impact social and political issues.
Though we might not be able to eliminate confirmation bias from our lives, being aware of it allows us to take active steps in widening our lens. We can start to practice critical thinking by being genuinely open to alternative viewpoints, questioning our research, and relying only on credible sources of information.
In 1972, President Richard Nixon flew to Beijing and Moscow. Before that happened, researchers Baruch Fischhoff and Ruth Beyth-Marom rounded up a few participants for an experiment. Subjects had to predict the likelihood of a range of outcomes for Nixon’s trips. After the President’s return, the participants were called back, this time to reconstruct their earlier guesses. With the benefit of hindsight, the subjects’ ‘predictions’ had shifted to more closely reflect what had actually occurred.
Fischhoff and Beyth-Marom’s study was one of the first on **hindsight bias**. Also called the ‘knew-it-all-along’ effect, this bias refers to our tendency to reflect on an unpredictable event and think that the outcome was actually easy to predict. According to psychologists Neal Roese and Kathleen Vohs, hindsight bias involves three elements – misremembering our initial opinions, a false belief that what transpired was inevitable, and confidence in our ability to have predicted the event.
Predicting the past with confidence
Why do we engage in hindsight bias? We distort our memories and falsely believe we ‘knew it all along’ because we feel safer in a predictable world with clear-cut cause-and-effect mechanics.
Though hindsight bias seems harmless, experts warn that it may lead to overconfidence in our ability to predict future events. This unfounded confidence may then nudge us to pursue unwarranted risks when we incorrectly assess certain situations.
Additionally, when we trick ourselves into thinking that we predicted things correctly, we fail to acknowledge our error. This denial robs us of the chance to examine and learn from our decision-making flaws.
To avoid committing hindsight bias, we can keep a decision journal – a log of past decisions and the corresponding expectations for each choice. This provides us with an objective record to refer to in the future. Experts also advise running through a list of what-could-have-beens after our predictions are proven wrong. This way, we remind ourselves that what transpired was not actually inevitable, and we can rid ourselves of the illusion that we knew it all along.
When you look back on your college years, what do you remember? Most people might think about the nervous excitement of starting a new identity, the anxiety-ridden all-nighters, the carefree nights partying with friends, and the joyful optimism of graduation.
When we look back on memories, we do not think about that specific time of our life in complete detail, like a movie reel, from end to end. Our mind retrieves snapshots that it deems important – the most emotionally charged moments, the start, and the end. The tendency to remember past experiences in this manner is called the **peak-end rule**, and it shows how, to a certain extent, our memories are biased.
For example, studies show that mothers’ memories of childbirth are tempered by the result – a beautiful baby, cherished and loved. No matter how painful labor was at its worst, the remembered experience reflects roughly the average of the peak pain and the pain of the final moments, rather than the ordeal as a whole.
A handy tool, if properly wielded
The peak-end rule distorts our memories, sometimes to our benefit, sometimes not. It allows us to look back on ‘better’ times with rose-colored glasses. Reframing the past positively may boost our mental wellbeing – unless doing so keeps us stuck in the past and checked out of the present. Someone who suffers trauma from a previous doctor’s visit may avoid check-up sessions for years. This can have serious health implications if left unaddressed.
Conscious of how our memory works in this way, we can use it to our advantage. To compensate for a mediocre date, end it on a high by enjoying an indulgent dessert. Don’t end a workout or a day of chores abruptly. Instead, taper off with a cool-down or a less excruciating chore, respectively. This makes us less likely to drag our feet the next time we have to do it again.
As a corollary, if you ever decide to open a restaurant, remember that dessert can be a decisive course. Just as first impressions matter, so too does our last point of contact with customers.