There are many models for decision making. As Baker et al. defined it in their 2002 guidebook, “efficient decision making involves a series of steps that require the input of information at different stages of the process, as well as a process for feedback.” An “informed” decision can only be as good as the information that formed it. So how do we make an “informed” decision? Our children try to tell us what happened to the broken chair, but we already “know.” Our spouses “do it again,” confirming what we already knew. One political side or the other says or acts in a way that clearly shows it is worthy of our most villainous thoughts and is on the very wrong side of what is “right.” Our news programs feed us information tailored to our particular preferences, reinforcing what we want to hear. All of this leads to misinformation, poor decisions, damaged relationships, the polarization of our country and misunderstanding of each other.
In decision making, we must first consider the source of the information that we are consuming. Miller’s Law (George Miller 1980) states that “in order to understand what another person is saying you must assume that it is true and try to imagine what it is true of.” Imagining what it is “true of” is the trick. We are bombarded by multiple sources of information in this information age. How do we discern “true” when many times the information presented to us is specifically designed to sway our decisions? To understand what is “true,” we must then look at how we, as individuals, take information in.
What humans tend to do when gathering information, especially when trying to make decisions about complex issues or when there is high uncertainty, is to use shortcuts (“rules of thumb”) called heuristics. A few examples include:
Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual or emotionally charged examples.
Representativeness heuristic – judging probabilities on the basis of how closely something resembles a familiar case or stereotype.
These shortcuts allow us to be more efficient in processing information (or so we think), but they come at a price. When we use heuristics, we are more likely to make decisions based on Cognitive Bias (distortions in how we perceive reality). It is these distortions that are exploited by others who are trying to influence our decisions.
Edward Bernays’ book Propaganda (1928) suggested that the “conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.” Super PACs on both political sides would not be spending millions of dollars if they did not think that their money could influence voters’ decisions and, once their candidates are successfully elected, those officials’ decisions.
Examples of the results of these “thinking errors” and the use of propaganda exist throughout our collective history. A chilling example comes from the abstract of White’s Hitler, Roosevelt, and the nature of war propaganda (1949): “Statistical value analysis of Hitler’s and Roosevelt’s speeches between 1935 and 1939 shows that both men used frequent appeals to traditional moral values and to ideas of national grandeur. Both painted a simple black picture of the opposition. The main distinction was that Hitler emphasized ideas of persecution and need for strength, whereas Roosevelt stressed economic values and concern for welfare of other peoples. By painting a paranoid picture of world events, Hitler strengthened German hostilities and distorted his people’s interpretation of events while clamoring for peace. This analysis shows the danger in present Soviet and U.S. propaganda in which both sides paint a completely black picture of the opposition and interpret every action in terms of a paranoid fear of attack.”
A much more recent study, Evolving judgments of terror risks: Foresight, hindsight, and emotion: A reanalysis (Fischhoff et al., 2012), illustrates the impact of bias. The researchers studied cognitive and emotional responses to terror risks among Americans between late 2001 and late 2002. They found that while people’s judgments of risk (how dangerous the world is) did change, people did not recognize these changes. Subjects showed hindsight bias (i.e., “I knew that all along”) in their memories of how they used to judge risk. Interestingly, despite an intensive “debiasing” procedure, subjects failed to return to a foresightful (future-oriented) perspective; they continued to look to the past to form their judgments of risk. The researchers also used fear- and anger-inducing manipulations to test how emotions affected memories and judgments. They found that priming emotions shaped not only perceptions of an abstract future (what may happen), but also perceptions of a concrete past (what did happen). This points to the importance of psychological research in ensuring an informed public, and shows how deliberately targeting emotions through manipulated information can change how people feel about the future as well as how they recall the past.
Financial decisions are another area where bias has wreaked havoc. A recent visit to Investopedia.com turned up a list of common psychological biases plaguing investors, in an attempt to help people understand how bias can lead to negative financial outcomes (no need to go into more detail here…). Some of the biases listed include familiarity bias, mood and optimism, overconfidence bias, status quo bias, media bias and internet information bias.
Cognitive Bias can affect individual human behavior, as well as larger social groups in a variety of ways including how we form our beliefs and how we make business and economic decisions. While there are far too many to list here, a few examples of Cognitive Bias include:
Anchoring Bias – getting stuck (anchored) on one way of looking at something. Once the anchor is set, there is a bias toward seeing everything that follows in that light. For example, when buying a house you fixate on its appearance rather than on how well it was maintained.
Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same. Related to “groupthink” and herd behavior.
Conservatism (not politically defined) – the tendency to underestimate high likelihoods/probabilities/frequencies and overestimate low ones. Given the observed evidence, your estimates are not extreme enough; you are overly “conservative” in your estimates.
Exaggerated Expectation – real-world evidence turns out to be less extreme than your expectations (the opposite of conservatism).
Bayesian Likelihood Bias – the tendency to change your beliefs insufficiently in response to new evidence; you fail to take new information fully into account.
Illusory Correlation – you think that things are related when they are not. For example, people sometimes assume that because two events occurred together at one point in the past, one event must have caused the other.
Attentional Bias – you don’t examine all possible outcomes when making a judgment. You may focus on one or two possibilities while ignoring the rest. This can also appear as an over-focus on symptoms in some mental health disorders, such as depression and anxiety, where you tend to focus on the evidence that confirms the world is unsafe or depressing.
Confirmation Bias – you search for or interpret information in a way that confirms your existing ideas. You only listen to, or watch, “your channel” for information.
Hindsight Bias – the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened. “Hindsight is 20/20.”
Just-World Hypothesis – the tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).
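Conservatism and the Bayesian Likelihood Bias both describe under-updating relative to what the evidence actually warrants. As a rough illustration (the urn scenario and its numbers are my own toy example, not taken from any study cited here), here is a short Python sketch of the classic setup: Bayes’ rule says the evidence should move your belief much further than most people’s gut answer does.

```python
from math import comb

# Toy scenario: an urn is either 70% red (hypothesis A) or 30% red
# (hypothesis B), each equally likely before we look. We draw 8 balls
# with replacement and see 6 red. How confident should we be in A?

def posterior_a(red, draws, p_a=0.7, p_b=0.3, prior_a=0.5):
    """Probability of hypothesis A after observing `red` reds in `draws`."""
    like_a = comb(draws, red) * p_a**red * (1 - p_a)**(draws - red)
    like_b = comb(draws, red) * p_b**red * (1 - p_b)**(draws - red)
    # Bayes' rule: posterior is prior-weighted likelihood, renormalized.
    return prior_a * like_a / (prior_a * like_a + (1 - prior_a) * like_b)

normative = posterior_a(6, 8)
print(f"Bayesian answer: {normative:.2f}")  # ~0.97
```

Classic conservatism experiments with setups like this found that people typically report a figure well below the normative answer: they update in the right direction, but not nearly far enough.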
What to Do?
Be aware of your own heuristics and biases. Consider how your biases affect your relationships with family members and co-workers. Unfortunately, most of the research shows that these processes operate at an unconscious level; we are not likely to see them in ourselves (though we spot them expertly in others!). Quite amazingly, even when we are conscious of them, we may still make decisions based on them. Caesar is said to have been shadowed daily by a guard whose task was to remind him he “was but mortal.” Without your own private guard, you can still consider what information you are drawn to and why, challenge yourself with alternative views, talk with people who hold differing views and try to listen as if you haven’t already made up your mind. Practice “active” listening: summarize what the other person has told you and ask for clarification. Ask yourself why the information was presented the way it was, and what is being “sold.”
Train yourself, and teach your children, to question information sources. Consider the source of your information carefully. Just because the tagline says “Fair and Balanced” doesn’t mean it is; likewise, “Lean Forward” may still mean leaning too far to one side. Peer-reviewed sources mean that other professionals have reviewed the information and judged it valid or useful. Trust sources with reputations for solid scientific or professional work, seek multiple sources, separate fact from opinion and watch out for statistics (good ones are helpful, but be leery of results based on “all our callers” or “the number of texts sent to that number”). Be a good consumer of information. Remember Miller’s Law? When your children tell you something, listen carefully and assume what they are telling you is the truth, then ask, “What is that true of?” While it may be “true” that they are trying to get out of trouble again, you will have gained their trust just by listening, and that is a start in helping them regain yours. Communication is difficult; we need to be alert and open.
Baker, D., Bridges, D., Hunter, R., Johnson, G., Krupa, J., Murphy, J. and Sorenson, K. (2002) Guidebook to Decision-Making Methods, WSRC-IM-2002-00002, Department of Energy, USA. http://emi-web.inel.gov/Nissmg/Guidebook_2002.pdf
White, R. K. (1949). Hitler, Roosevelt, and the nature of war propaganda. The Journal of Abnormal and Social Psychology, 44(2), 157-174.
Fischhoff, B., Gonzalez, R. M., Lerner, J. S., & Small, D. A. (2012). Evolving judgments of terror risks: Foresight, hindsight, and emotion: A reanalysis. Journal of Experimental Psychology: Applied, 18(2), e1-e16.
Osinsky, R., Lösch, A., Hennig, J., Alexander, N., & MacLeod, C. (2012). Attentional bias to negative information and 5-HTTLPR genotype interactively predict students’ emotional reactivity to first university semester. Emotion, 12(3), 460-469.
Babad, E. (2005). The psychological price of media bias. Journal of Experimental Psychology: Applied, 11(4), 245-255.
Johnson, J. T., & Judd, C. M. (1983). Overlooking the incongruent: Categorization biases in the identification of political statements. Journal of Personality and Social Psychology, 45(5), 978-996.
Winter, D. G. (1987). Enhancement of an enemy’s power motivation as a dynamic of conflict escalation. Journal of Personality and Social Psychology, 52(1), 41-46.
Campbell, R. S., Gibbs, B. N., Guinn, J. S., Josephs, R. A., Newman, M. L., Rentfrow, P. J., & Stone, L. D. (2002). A biased view of liberal bias. American Psychologist, 57(4), 297-298.
Ottati, V., Fishbein, M., & Middlestadt, S. E. (1988). Determinants of voters’ beliefs about the candidates’ stands on the issues: The role of evaluative bias heuristics and the candidates’ expressed message. Journal of Personality and Social Psychology, 55(4), 517-529.
Cohen, G. L., Sherman, D. K., Bastardi, A., Hsu, L., McGoey, M., & Ross, L. (2007). Bridging the partisan divide: Self-affirmation reduces ideological closed-mindedness and inflexibility in negotiation. Journal of Personality and Social Psychology, 93(3), 415-430.
Husband, R. W. (1939). Review of Psychological aspects of business. Psychological Bulletin, 36(1), 54-55.