Because leaders and managers should aim to make decisions on a rational basis through a deliberate, analytical process, reducing bias in decision making is essential. This means taking an approach to decision making that recognizes that biases exist and takes steps to counteract them. Simply telling yourself to “slow down and make better decisions” may not be sufficient on its own. This reading reviews several tactics and practices for structuring and managing the decision-making process to minimize the influence of bias.
Carving out time for reflection can counteract bias. Even spending a short time thinking about the task ahead and the biases that might affect it can significantly improve decisions. Reflection encourages deliberation, builds in time to examine data and conclusions more critically, and promotes more thorough analysis.
One simple way to overcome bias is to switch from evaluating options one at a time to evaluating them simultaneously. For example, rather than considering an employee for promotion based solely on her own perceived merits, a manager could compare her performance to others at or near her level. As compared to “separate evaluation,” this type of “joint evaluation” results in less biased decisions. It works by nudging decision makers to focus more on employees’ past performance, relative to that of others, than on gender and other stereotypes that managers might fall back on when making their decision.
It can be helpful to talk a decision through with an outsider, such as a trusted friend or colleague, who is more likely to generalize across situations and identify potential pitfalls because of their distance from the problem. It is also important to solicit ideas from a diverse group of individuals, which increases the chance of finding fresh ideas and new information. Alternatively, simply trying to view the decision from an outsider’s perspective may help you overcome bias. Doing so might lead you, for example, to compare your business plan to new businesses that you have seen fail and to view it more critically.
You can also reduce bias by increasing accountability for decisions in your organization. Accountability achieves the best results when decision makers are responsible for the quality of the decision-making process rather than solely for its outcomes. Accountability can mean that people must justify their decision-making strategy before they begin, or that they must list in advance the most important types of information on which to focus.
Confirmation Bias
People tend to give considerable credence to evidence and information that support what they already believe. Conversely, we tend to discount or even dismiss data that contradicts our point of view. We can overcome confirmation bias by methodically looking for information that disconfirms our point of view.
Looking for information that challenges what we believe can lead us to important discoveries. A team, for instance, might specifically assign one member to play the role of “devil’s advocate,” looking for data that could poke holes in the group’s thinking. The information you do consider should be examined with equal diligence. That may mean spending additional time collecting and analyzing data, or trying different methods of analysis better suited to the data at hand. For instance, the data for one alternative may be readily available from a similar previous project, while another alternative has never been studied before. It would be easy and tempting to rely on what is at hand rather than undertake a costly effort to collect new data.
Anchoring
The anchoring effect is the tendency to rely unduly on a single, usually initial, piece of information when forming subsequent judgments. In the business world, decision makers frequently risk being inappropriately swayed by anchors. For example, conversations about sales targets, budgets, and salaries tend to begin with last year’s numbers, which may not offer relevant guidance for the future.
Amateurs and experts alike tend to be overly swayed by initial estimates and offers in decision making, but you may be able to overcome this trap by focusing on your alternatives and objectives. One way to do this is to attack a problem from multiple angles. Rather than going with your original thought about something, step back and try to view the situation from different vantage points. This will allow you to see other paths to a solution or uncover new ideas.
Another way to avoid anchors is to delay consulting others until you have given the problem some initial thought of your own. This way you won’t be unduly influenced by their ideas. When you do consult others, it is better not to share too much of your thinking and tentative conclusions with them; doing so may bias them and the advice or information they give you.
Availability Heuristic
The availability heuristic is a mental shortcut used to make sense of a problem or decision. Relying on it can lead to bias because we tend to judge things that come to mind easily as common or important. In practice, this can mean we are unduly influenced by the most recent advice we hear or the last piece of information we collect. The problem is that relying on familiarity skews our perception, leading us to attribute more value to things than they deserve. We are more likely to jump to conclusions based on incomplete or faulty information simply because it is readily accessible. For example, we might be tempted to sell stock in a company on a single piece of bad news even if the fundamentals of the business have not changed.
To avoid focusing only on what is vivid and easy to recall, hunt for less flashy, less memorable information that could be just as important. Undertaking rigorous research is one approach; it can lead to greater awareness of information and other factors that may have an important bearing on a decision. Resisting snap judgments is another: seeking and examining alternatives slows down the decision process and reduces the chances of being trapped by the availability heuristic.
Overconfidence Bias
Overconfidence leads us to be overly optimistic about our abilities and our judgments. This bias can be particularly acute when making estimates and forecasts. Actively questioning our beliefs about what we know and what we think the future will bring can lead to smarter decisions. One way to counter the tendency to be overconfident is to start with a range of values, from the extreme high end to the extreme low end, and then try to find evidence or supporting data for those outcomes by thinking about the conditions that would lead to them. The goal is to be realistic: what would need to happen to reach the upper limit of the estimate, and how likely are those things to occur? This type of analysis can justify trusting your ability to make a good estimate.
Framing Bias
The way a decision and its alternatives are framed establishes the reference points and boundaries of the choice to be made. Because of framing, we are influenced by the way a problem is formulated even though the formulation should have no bearing on the solution. It is important to recognize the initial frame and not accept it without question (Hammond, Keeney, and Raiffa 2006). To avoid being unduly influenced by the way a problem is framed, look at it from different perspectives. For instance, choices can be presented in terms of challenges (e.g., how to avoid losing market share) or in terms of opportunities (e.g., how to maintain and increase market share). Each frame would lead to a different set of actions: growing market share would be more likely to lead to investments in innovation and new product development, while preventing loss might result in more defensive moves, such as costly price reductions.
REFERENCES
Hammond, J. S., Keeney, R. L., and Raiffa, H. (2006, January). The Hidden Traps in Decision Making. Harvard Business Review. Retrieved 18 July 2016 from https://hbr.org/2006/01/the-hidden-traps-in-decision-making