Halvorson and Rock’s SEEDS model proposes five categories of cognitive biases – similarity, expedience, experience, distance, and safety – and offers suggestions for mitigating bias from a systems approach.
The SEEDS Model
Most literature on cognitive bias provides mitigation strategies piecemeal. But when we engage in decision-making, it is impractical to dissect each flawed line of thinking before applying an individual strategy to address it. Social psychologist Heidi Grant Halvorson and leadership expert David Rock’s SEEDS model offers an alternative approach to mitigating bias. The framework provides five ‘brain-based’ categories of bias – similarity, expedience, experience, distance, and safety – and offers corresponding debiasing strategies for each, focusing on the organizational or systems level.
The SEEDS model entails three steps. The first is acknowledging and accepting the universality of bias and realizing that biases lie beneath the surface whether we are conscious of them or not. In the second step, individuals anticipate and label any bias that might creep into the decision-making process; here, we are encouraged to use the categories provided by the model so as to avoid being intimidated by the overwhelming number of cognitive biases. The final step involves implementing the mitigation strategies that correspond to each category.
Under the SEEDS model, similarity relates to our preference for our in-groups, people with whom we share commonalities. By extension, it also encompasses the out-group bias, which describes our negative perception of individuals beyond our in-group. This type of bias is common to all but requires special attention in an organization’s recruitment and promotion activities.
Instead of futilely resisting our base instincts, removing opportunities for bias to creep in may prove a more effective mitigation strategy. Across the board, we can broaden our affiliations and bring more people into our in-group so as to level the playing field. In terms of recruitment and evaluation, hiring managers are encouraged to remove potentially biasing information from materials up for review so as to minimize the impact of stereotypes on decision-making. Joint evaluations – say, assessing prospective employees’ qualifications side-by-side – may also help by refocusing our attention on individuals’ capabilities. Comparing individuals head-on based on pertinent details minimizes the need to resort to stereotypes to fill in the information gaps that we might encounter with separate evaluations.
Biases under the expedience category arise when our automatic, instinctive decision-making mechanism (system 1) fails us. In our rush, we make poor decisions based on our limited understanding of situations. To avoid expedience biases, we have to hit the brakes on our decision-making and engage our more deliberate, rational system 2. Organizations that want to minimize expedience biases can encourage employees to slow down their thought processes by relaxing deadlines and lightening workloads or providing incentives that encourage careful deliberation.
Managers can exercise subordinates’ critical reasoning by asking them to walk through thought processes leading to their decisions. Breaking down complex problems into smaller components not only prevents individuals from feeling overwhelmed but also slows down thinking by forcing decision-makers to examine each of the parts that make up the whole. Finally, a literal slowdown can be built into processes by imposing a mandatory cool-off period prior to finalizing decisions. This way, action steps can be re-evaluated with fresh eyes before receiving a final stamp of approval.
Experience biases relate to how we get caught up in our own perception of reality, failing to realize how our personal histories, our biases, and our emotional states color what we see. This category includes the fundamental attribution error, hindsight bias, and the illusion of transparency. By default, we move through the world assuming that other people see the same things we do. We are, after all, the main characters of our own lives.
Stepping outside of our heads requires conscious effort, but failing to do so prevents us from considering other perspectives and, thus, impacts our decision-making ability. Mere awareness of experience biases does not help us to avoid them. Instead, organizations seeking to mitigate the effects of this bias type need to set up systems that encourage the exchange of ideas. Seeking outsider opinion, for example, allows organizations to appreciate issues with a fresh set of eyes.
Humans place more value on things in close proximity, whether physically or in time. This is the primary mechanism behind cognitive biases like the endowment effect (our tendency to regard what we already own as more valuable than what we do not) and hyperbolic discounting (our preference for immediate gains over long-term rewards of equivalent or greater value).
Allowing our decisions to be swayed by distance biases may produce suboptimal outcomes, as organizations that focus disproportionately on the short term fail to plan for a longer time horizon. Likewise, an unwarranted preference for choices within physical proximity may limit our options and cause us to forego better – though more distant – opportunities.
To limit the impact of distance bias on decisions, organizations can take physical and temporal distance out of the equation when comparing potential outcomes – for example, by removing location information when deciding between two alternatives. Doing so levels the playing field for the options at hand and allows for a closer comparison without the undue influence of distance.
The safety category includes biases like loss aversion, the framing effect, and the sunk cost fallacy. These biases describe the human tendency to place greater weight on negative consequences than on positive outcomes. We frequently see this line of thinking in areas where financial decisions and risk assessment are involved, especially where limited resources are allocated across an array of options.
Playing things too safe may see key decision-makers forego viable opportunities deemed too risky, or cling to current projects that no longer make business sense. Neither scenario is in the best interest of the organization. To avoid safety biases, individuals can impose some psychological distance between themselves and the situation: they can imagine advising a friend in the same circumstances, or pretend that the decision had already been made and that they are simply re-evaluating it after the fact. Removing oneself from the decision allows for a more objective assessment, made with clearer eyes and less emotional attachment.
An organizational perspective
The SEEDS model reiterates that bias is universal and in no way indicative of experience or expertise. We are all – across the board – prone to cognitive biases, whether we believe it or not. That said, individuals high in self-awareness are better equipped to avoid some types of bias.
Most cognitive biases are best avoided by slowing down our thinking so that our decisions reflect cognitive effort rather than gut instinct. But any serious attempt at mitigating bias involves eliminating the space for it to occur, not tackling it head-on as it happens. To this end, organizations should put systems and processes in place that nudge individuals into debiasing practices.
Finally, prior research on group intelligence points out that addressing cognitive bias is more effective when regarded as a group endeavor rather than an individual effort. As such, organizations are encouraged to embrace a systems approach and to foster a culture that seeks to address bias by playing to its members’ diverse strengths and skills.