Friday, July 1, 2011

The Believer's Brain


In a recent book, The Believing Brain (Times Books, 2011), Michael Shermer makes a strong case that the human brain is necessarily a belief engine. His case is that pattern seeking, and assigning agency to the patterns, is a survival trait built into the brain, and that we believe first and think about it later, if ever. In my experience this is as true of atheists and skeptics (including Shermer) as it is of religious believers.

As many will testify, dragging out a belief, say about UFOs, and asking whether it is justified is extremely difficult for most people. Whether you are for 'em or ag'in 'em, you just cannot decide that you simply don't know. My experience is that most people can't be unsure on any belief-based subject, which is to say, if Shermer is right, on all subjects. It is as if "I just don't know" doesn't have a home in the human brain.

Even Shermer falls prey to this syndrome. He spends many pages debunking UFOs, ESP, God beliefs, political beliefs, etc. It seems his confirmation bias won't admit data that casts doubt on his skeptical belief system. That is not at all unusual, and in effect his bias is proof of his thesis. I would not argue that any of these belief systems are true, but in many cases, e.g. God and ESP, there is enough data to indicate that "I don't know" is a reasonable position to take.

"That's right!" has many homes in the brain for decisions affecting beliefs. The dorsolateral prefrontal cortex which is the center for rational evaluation of data is effectively shut down when first evaluating new data. The orbital frontal cortex the emotional center in the brain and the anterior cingulate cortex which is the conflict resolution center are both active. These centers essentially decide whether or not this data conflicts with my belief system.  Once a emotionally satisfying resolution of the conflict, that is to throw out the conflicting data, the reward mechanisms in the ventral striatum kick in. See P 260. for the experimental data. This makes a lot of sense, in the modern world "I don't know" gets in the way of many necessary decisions. Which stock to buy, which way to bet on a business decision, etc, as they say, it is better to go with the gut, i.e. the belief systems in the brain, and just do it.

The escape hatch for this belief reinforcement is the scientific method, which lets us check our belief systems after the fact. But as Shermer points out, even scientists have beliefs which shape their protocols, and they may in fact be testing only things that reinforce those beliefs. Shermer ends with the belief that "The truth is out there, and although it may be difficult to find, science is the best tool we have for uncovering it."

One of the reasons I enjoyed the book is that he makes a hard scientific case, materialistic and rational, for woo-woo. Maybe I am belief-disabled, or had the wrong upbringing and went to the wrong school, but I have never been able to understand how extremely intelligent and rational people can believe weird things. I think I understand it better now, but I am still an outsider looking in.

Whether you are a believer, a skeptic, or that rare breed I call "acred" who agree with Lazarus Long in Time Enough for Love (Robert A. Heinlein, 1973) that "belief gets in the way of learning," you will find this book extremely helpful in dealing with believers of any persuasion. It seems that "La, la, la, I can't hear you" is a built-in function of most human brains. According to Shermer it is an evolved and necessary brain function for dealing with decisions under uncertainty. As my world-class-athlete father would frequently say: "Don't think, you weaken the ball club." A fascinating and apparently nearly ubiquitous attitude. Shermer's insight provides a useful tool for understanding all those morons who don't agree with you.
