The Brain and Biases

Much has been written about how marvelous, complex and remarkable the human brain is.  Its many different parts work together to produce truly awe-inspiring results: space travel, the Panama Canal, the Mona Lisa and Carmen, to name but a few.  But it has also produced war, slavery and the Edsel.

Now I’m not knocking the brain, but think about it: way back when, a human brain came up with the idea that war was the best way to solve a particular problem.  Not only that, but many other human brains got on board with the idea, and it evolved to the point where we are today: killing other human beings who don’t agree with our ideology is considered an acceptable way to resolve conflicts.  Certainly not the brain (and mankind) at its best.

Now maybe there is a perfectly rational explanation for this.  Maybe, because of poor lighting, the first murderer mistook his victim for a saber-toothed tiger.  Maybe, upon examination after the fact, the murderer didn’t recognize the victim as part of his tribe and concluded that killing the interloper was indeed the best way to deal with the situation.  And if it works on an individual scale, perhaps it will work equally well, or better, on a large scale.  And so it goes.

It turns out that the human mind is nowhere near as rational as we humans tend to think it is.  Books like Nudge and Predictably Irrational detail dozens of cognitive biases that humans are subject to.  To varying degrees, we all tend to: believe we are less biased than other people (bias blind spot), remember our choices as better than they actually were (choice-supportive bias), remember information in a way that confirms our preconceptions (confirmation bias), and tip attractive wait staff better than their less attractive counterparts (the halo effect).

Of course, many biases come into play in the process of procuring life insurance.  Normalcy bias (the refusal to plan for, or react to, a disaster that has never happened before), omission bias (the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions or inactions), the ostrich effect (ignoring an obvious negative situation), outcome bias (the tendency to judge a decision by its eventual outcome rather than by the quality of the decision at the time it was made), and optimism bias (the tendency to be over-optimistic, overestimating favorable and pleasing outcomes) are just a few that come to mind.

As impressive and magnificent as the human brain is, it isn’t perfect, and it isn’t particularly rational.  So it is in our own self-interest to be aware of the many biases that affect us and develop a plan to minimize their effect.
