We suck at thinking, all of us, humanity included. It’s poetically tragic, given that we haven’t yet met any life forms who can do a better job of it.
We skeptics enjoy thinking of ourselves as rational and reasonable, smugly superior among a vast sea of credulous, closed-minded believers. But we’re not nearly as clever as we think, nor are we very different from the true believers.
Humans didn’t evolve to think rationally, but rather expediently. Our myriad cognitive biases probably came about because they were useful shortcuts for solving the kinds of problems hominids faced prehistorically. Some may be primitive processes that worked well enough to avoid being bred out over the millennia. Other biases probably serve as heuristics that let us quickly reach the right answers even though our brains are not very fast or accurate processors. I’m not qualified to make claims about the origins of cognitive bias; however it happened, here we are in modern times, saddled with a lot of built-in dumb.
Putting aside bias in order to think rationally is a learned behavior. We can jury-rig our minds into a feedback loop of self-analysis and doubt to avoid or detect as much of our own bias as possible. In some ways it will always be a Sisyphean effort since we are running this “rationality” program on top of an inherently biased system anyway.
When it’s really important to be fair, we have to make the human mind cancel out of the equation. Studies have to be designed with blinding so that neither participants nor experimenters can inadvertently affect the results. Even then, the hypothesis or experiment can have an undetected bias baked in via unstated assumptions. And even a flawless study must run a gauntlet of human peer reviewers and publishers. Finally, if a study fails to turn up something interesting (i.e. a “negative” finding), it is far less likely to even see publication.
A fully rational analysis would consist of evaluating all relevant data without bias. How is a casual human thinker to know what information can be relied upon, or where to find it? It’s clearly impossible to become an expert in every field, and even learned experts may disagree among themselves or be proven wrong in time. To be able to function, we must routinely discard the majority of information available to us. What we accept and internalize, by default, tends to align with our existing notions. See confirmation bias, expectation bias, selective perception, mere exposure, not-invented-here (N.I.H.) bias, status quo bias, and wishful thinking.
Discarding possibly relevant information when deciding (or defending) a position is a pretty good definition of being closed-minded. In the excellent Skeptoid program (#134), Brian Dunning points out that we tend to be automatically skeptical of some information based on our personal biases. Conversely, we are uncritical of information that resonates with our world view.
We are all skeptics. We are all believers.
Believers, even atheists? Scientists? Absolutely. I’m not propping up that old saw that science is some form of religion; it’s absolutely not. What we all believe in is our current mental inventory of experience and information: we each believe that we are right, and that people who disagree with us are using flawed reasoning.
The kernel of progress
The preceding enlightenment was my main reward for listening to this week’s Skeptoid. Brian also makes a related observation that pulls the focus back to rational thinking:
The real difference between skeptics and believers is that skeptics have a useful foundation of scientific knowledge and an aptitude for following the scientific method. These tools allow us to distinguish poor quality evidence from good quality evidence. And, importantly, they help restrain us from drawing poorly supported conclusions from the evidence that we do accept, no matter how strongly we want those conclusions to be justified.
The scientific endeavor
Science embodies the best approach we have to convert ideas and information into a robust understanding of the cosmos. It’s a collective endeavor, because individual biases can be canceled out by getting a lot of different people to argue over the same stuff. Properly applied, science will reward the mitigation of bias, and ruthlessly prefer theories that best explain the data. Since it’s run by closed-minded, skeptical, credulous humans, science is not perfect. But with each revision it gets a little better.