What are Cognitive Biases?
Growing up, we are all taught different values, raised in various environments, and shaped by cultural beliefs we might not even be aware of. These factors and many others play a large role in the way analysts collect information, whether they realize it or not. Cognitive biases are shortcuts our brains take when trying to solve a problem. Heuer (1999) explains that cognitive biases are not decisions made out of self-interest; they are mental errors we are not aware of (111). Although in intelligence cognitive biases are considered "tricks," they are actually nature's way of helping us make quick decisions and assumptions to keep us safe. Because they are unconscious, this type of bias is one of the hardest to overcome.
Biases in Evaluation of Evidence
As an analyst you are required to look through mountains of evidence, often coming from a variety of sources and methods. Biases in evaluation of evidence relate to how vivid, real-world experiences and events influence our decisions more than statistical studies and detailed analytics. Schroeder illustrates this with a good example: some people are terrified of flying, but will hop in a car and drive the same distance, if not farther, without any hesitation. Even though statistics show that flying is much safer than driving, a person will remember a devastating airplane crash and treat flying as the more dangerous option (1).
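The statistical point can be made concrete with a quick calculation. The per-mile fatality rates below are illustrative assumptions invented for this sketch, not official figures; the point is only that comparing base rates, rather than recalling one vivid crash, is what gives the right answer:

```python
# Illustrative (not official) per-mile fatality rates, chosen only to
# show why base rates matter more than vivid memories.
DRIVING_DEATHS_PER_100M_MILES = 1.3   # hypothetical figure
FLYING_DEATHS_PER_100M_MILES = 0.01   # hypothetical figure

def trip_risk(miles, deaths_per_100m_miles):
    """Expected fatality risk for a single trip of the given length."""
    return miles * deaths_per_100m_miles / 100_000_000

trip = 500  # miles
drive = trip_risk(trip, DRIVING_DEATHS_PER_100M_MILES)
fly = trip_risk(trip, FLYING_DEATHS_PER_100M_MILES)
print(f"driving risk is {drive / fly:.0f}x the flying risk")  # prints 130x
```

Under these assumed rates, driving the same distance carries 130 times the risk, yet the memorable plane crash still dominates the gut decision.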
Biases in Perception of Cause and Effect
Humans are always searching for answers, and in gathering intelligence we do the same, sometimes pushing an incorrect explanation for something that has none. If something happens that we don't fully understand, we search for the cause; there must be one, right? Not always. Sometimes things happen by complete chance, with no explanation, and when we have none our brains fill in the blank with an inaccurate answer instead. Heuer (1999) relates biases in perception of cause and effect to intelligence with the example of an analyst overestimating the lengths a country will go to in pursuing a rational goal, thereby causing the analyst to overestimate his or her own ability to predict outcomes (127).
Biases in Estimating Probabilities
Another flaw appears when we estimate probabilities. One way we do this is by using the availability rule: we use whatever is "available" in our memory when estimating an outcome. If the last time I rode a motorcycle in the rain the wheels slipped out from under me and I crashed, I will assume that another person riding in the rain will crash too. What I am not taking into account is that the rider may be a professional who trains others to ride in wet conditions. This can be especially detrimental when, for example, an analyst takes how Muslims act in Western states and applies the same assumptions to the same group in a different region. Second, related to biases in estimating probabilities, is the anchoring effect: an individual relies heavily on the first piece of information received on a topic and applies it to future estimates.
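The availability rule can be illustrated with a rough simulation (every number here is an invented assumption, not data): if dramatic events are easier to recall than ordinary ones, then an estimate built from recalled examples will overshoot the true base rate.

```python
import random

random.seed(0)

# Hypothetical sketch of the availability rule. Crashes actually occur
# at a low true rate, but dramatic events are assumed to be far more
# memorable, so recall-based estimates overshoot the base rate.
TRUE_CRASH_RATE = 0.01      # assumed true frequency of crashes
RECALL_WEIGHT_CRASH = 20    # assumption: crashes are ~20x more memorable
RECALL_WEIGHT_SAFE = 1

# What actually happened over many rides.
events = ["crash" if random.random() < TRUE_CRASH_RATE else "safe"
          for _ in range(100_000)]

# What the mind retrieves: memorable events are sampled more often.
weights = [RECALL_WEIGHT_CRASH if e == "crash" else RECALL_WEIGHT_SAFE
           for e in events]
recalled = random.choices(events, weights=weights, k=1_000)

actual = events.count("crash") / len(events)
estimated = recalled.count("crash") / len(recalled)
print(f"actual rate: {actual:.3f}, availability-based estimate: {estimated:.3f}")
```

The recalled sample makes crashes look many times more common than they are, which is the motorcycle-in-the-rain judgment in miniature.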
Hindsight Biases in Evaluation of Intelligence Reporting
"I'm willing to admit when I'm wrong and don't know something." Oh are you? Well unfortunately you may not even realize you are being effected by hindsight biases. Heuer (1999) explains how these biases effect the evaluation of intelligence in three ways. Analysts will often overestimate their accuracy of prior judgments, consumers will underestimate their limited knowledge of subject and how much they learned from the intelligence, and lastly, when looking at intelligence failures, supervisors will often look at the faults as easily seen when in reality at the time they weren't (161).
Can we do anything about it?
Cognitive biases, as said previously, are unconscious, and often we don't even realize they are happening. The unfortunate part is that sometimes we will not be able to avoid them, but there are steps we can take to minimize the role they play in intelligence gathering. Heuer (1999) lays out a checklist that you, as an analyst, can follow to help combat these biases: "defining the problem, generating hypotheses, collecting information, evaluating hypotheses, selecting the most likely hypothesis, and the ongoing monitoring of new information" (173). The common denominator of this six-step process is constant reflection on what you are looking at. You define the problem itself before even considering a hypothesis. After collecting information related to the hypotheses, you re-evaluate the different hypotheses (OBJECTIVELY), then select the most likely one. All done now, right? Wrong! You must constantly monitor what is happening in relation to the information you collected, because at any time an event or new information could change the conclusion altogether. The key to overcoming biases is to always question not only the information being presented but your own conclusions as well.
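For readers who think in code, the checklist's shape can be sketched as a loop. This is a minimal, hypothetical sketch: the function names and toy scoring are invented here, and real analytic tradecraft is far richer than a dozen lines.

```python
# Hypothetical sketch of the six-step checklist as a loop: the key
# property is that new information triggers re-evaluation of ALL
# hypotheses, not just confirmation of the current favorite.
def run_checklist(problem, generate_hypotheses, collect, score, new_info):
    hypotheses = generate_hypotheses(problem)                  # step 2
    evidence = collect(problem)                                # step 3
    best = max(hypotheses, key=lambda h: score(h, evidence))   # steps 4-5
    for info in new_info:                                      # step 6: monitoring
        evidence.append(info)
        best = max(hypotheses, key=lambda h: score(h, evidence))  # re-evaluate
    return best

# Toy usage: the score is simply how many evidence items mention a hypothesis.
best = run_checklist(
    problem="toy",
    generate_hypotheses=lambda p: ["A", "B"],
    collect=lambda p: ["A seen"],
    score=lambda h, ev: sum(h in e for e in ev),
    new_info=["B seen", "B seen again"],
)
print(best)  # prints "B": new information flipped the conclusion from A
```

The toy run starts with evidence favoring hypothesis A, but the monitoring loop lets later reporting overturn that conclusion, which is exactly the habit the checklist is meant to build.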