The Stupidity of the Intelligent

September 27, 2018

The following article originally appeared in the May/June 2013 edition of The American Rationalist.


Does intelligence have anything to do with rationality? Intelligent people are good at reasoning, so we expect them to think things through and reach reasonable conclusions based on evidence and logic.

On the other hand, we associate low intelligence with gullibility. Why do scammers target the elderly? Because they associate age with cognitive impairment, so they think that old people are more likely to be suckers. Smart people, by contrast, supposedly tend to be critical thinkers who skeptically evaluate claims.

No purveyor of snake oil would set up a booth at a Mensa convention, right?

There may be some truth to these stereotypes, but, really, it is amazing how weakly intelligence and rationality correlate. Very many of the smartest people use their formidable intellectual powers to promote schemes of howling lunacy.

In fact, Cloud Cuckoo Land is inhabited by many who would qualify for Mensa. Pick any goofy conspiracy theory, weird doctrine, extremist ideology, bizarre religious belief, outrageous pseudoscience—or any other kind of hogwash, flapdoodle, or balderdash—and you can bet that some very intelligent people will be supporters, often zealous ones.

When you stop and think about it, some of the most famous literary, philosophical, and scientific geniuses were also notable advocates of claims and causes that ranged from the eccentric to the bizarre to the pernicious.

Sir Arthur Conan Doyle, creator of Sherlock Holmes, wrote a book called The Coming of the Fairies that defended the existence of the tiny gossamer-winged beings.

William Butler Yeats, Nobel Prize–winning poet, dabbled in all sorts of supernatural mumbo-jumbo.

Famous empiricist philosopher George Berkeley vigorously promoted the virtues of tar-water, a concoction that he touted as a panacea.

Sir Isaac Newton devoted over a million words to the exegesis of biblical prophecies of the end-time; however, he modestly held that the end would not occur until after 2060.

Martin Heidegger, perhaps the leading German philosopher of the twentieth century, was a vigorous proponent of Nazi ideology. Clearly intelligence, even genius, is a poor safeguard against irrational obsessions.

What happens when very smart people buy into very unreasonable beliefs? How do we avoid irrationality? To answer these questions we need first to delineate the nature of rationality and then note why we are so bad at achieving it.


My view is that rationality is a matter of epistemic rights and duties. Such a view of rationality was developed by nineteenth-century English mathematician and philosopher William Kingdon Clifford, whose incandescent intellectual brilliance was tragically snuffed out when he died at age thirty-three.

Clifford’s essay “The Ethics of Belief” notes that we have epistemic rights and duties, i.e., rights and duties with respect to how we form and hold our beliefs. For instance, when a belief needs the support of evidence, we have a duty to make sure that the evidence is adequate to ground the belief. In general, when we have conscientiously checked the credentials of a belief, then we have performed all our epistemic duties with respect to it, and we have an epistemic right to that belief. In that case, it is rational for us to believe it.

If, on the other hand, we form and hold a belief without checking its credentials adequately, then we have no epistemic right to that belief and so we hold it irrationally. For instance, when we hold a belief for psychological reasons—say, because we find it emotionally gratifying or are afraid not to believe—then we hold it because of our needs or desires, not on the basis of its rational credentials.

There is no reason to think that intelligent people are any less prone than others to forming beliefs because of psychological needs rather than rational inquiry. Indeed, they may be more prone to do so since very smart people are often very good at rationalizing.

Books such as Michael Shermer’s The Believing Brain give us the bad news about human rationality. Drawing on much data from psychology and cognitive science, Shermer shows that brains are belief-generating engines that first impose patterns, in the form of beliefs, on the deliverances of the senses, and then look for confirming evidence. In short, our normal belief-forming practice is simply the reverse of the rational order, which requires evidence to precede belief. You do not have to accept all of Shermer’s gloomy theses (I don’t) to admit that he has a point and that, at the very least, rationality is far more fragile and much harder to achieve than philosophers would like to think.

We all have irrational beliefs. Many we form unreflectively or unconsciously, and they are often relatively harmless. For instance, a half-heard commercial message might slip in under our radar and make us think that one brand of toilet tissue is softer than another.

Irrational beliefs start to get dangerous when we become attached to them and refuse to reject them when they are shown to be wrong, or even refuse to consider evidence that they might be wrong.

Such beliefs become even more dangerous when we become hostile toward anyone who tries to get us to examine them critically, and more dangerous still when we start to become intolerant of those who do not share the belief. Let’s say then that a dogmatic belief is one that is not only held irrationally but in an aggressive, intolerant, and unquestioning way.

Dogma is the stupidity of the intelligent. When an intelligent, educated person (especially one educated in philosophy) is devoted to a dogma, he or she has a very formidable array of weapons to deploy in defense of the dogma and can overawe or browbeat most of those who challenge the belief.

Each such successful defense of the dogma against skeptical assault further entrenches the dogma. Further, as Shermer notes, confirmation bias works on the intelligent as much as on the unintelligent. That is, we notice information or argument that tends to confirm our beliefs and automatically dismiss or downplay any that count against them. Archimedes boasted that he could move the Earth with a lever if he had an immovable fulcrum. He would have had a much harder time nudging a very intelligent person from a dogma.

The most pernicious aspect of dogma arises from the fact that, as philosopher W. V. O. Quine liked to put it, our body of beliefs forms an immense and intricately connected web, with beliefs relating to other beliefs in various complex ways.

Let me explain: Beliefs do not exist separately and cannot be changed separately but, like a spider’s web, constitute a mutually supporting structure. Quine was fond of saying that our beliefs face the tribunal of experience not separately but as a body. He meant that our beliefs are logically and evidentially interdependent in so many ways that when experience leads (or compels) us to change one belief, many other beliefs will be variously affected and have to be abandoned or revised. Further, some beliefs form more structurally important strands of the web than others, so that any change in those beliefs may require revisions throughout our whole network of beliefs.

What, then, if a dogma constitutes the central strands of one’s belief web? For instance, suppose that someone is a dogmatic young-Earth creationist who believes that the Earth and all living things were created in six literal days no more than 10,000 years ago.

How does this commitment affect your other beliefs? Well, if you are committed to young-Earth creationism, you will need to adjust your beliefs on innumerable topics—including geology, paleontology, physics, astronomy, biology, anthropology, archaeology, linguistics, and history—to accommodate your creationist creed.

For instance, you will have to explain the entire fossil record in terms of a single cataclysmic flood event (Noah’s flood). You will then have to explain where such a volume of water came from and how precisely it could account for other features of the Earth’s surface, such as the Grand Canyon.

You will probably want to explain the origin of human languages in a single, historical event (the Tower of Babel). You will have to account for dinosaurs and other extinct creatures, perhaps by putting them on the ark with Noah (the Creation Museum in Kentucky shows dinosaurs among the animals on the ark, and one with a saddle for the convenience of antediluvian dinosaur riders).

You will need to dismiss radioactive isotope dating by holding—contrary to the claims of physicists—that decay rates have changed drastically over the last few millennia. In other words, to be a young-Earth creationist you have to reject a great deal of modern science and maintain that certain results, which scientific communities regard as very well-confirmed, are in fact wrong.

Clearly, if an unreasonable belief is at the center of your web, then your other beliefs must be made to fit that belief. A whole host of wacky beliefs is required to accommodate one big wacky belief. For this reason, irrationality cannot be isolated but, like a virulent pathogen, spreads until one’s whole belief system is infected. Now maintaining a whole system of irrational beliefs is very hard work.

Your beliefs are constantly colliding with reality—and with one another—and some sort of plausible accommodation must be made or the cognitive dissonance becomes too painfully cacophonous. Really, it takes a very intelligent person to support a whole scheme of reticulated nonsense without the absurdity of the whole structure becoming apparent even to the true believer.

High intelligence, then, is not only compatible with irrationality but is really necessary for the successful maintenance of a major system of unreason (like young-Earth creationism). This is why some of the smartest people are some of the biggest cranks.

Now for the big question: Can you be sure that you are not a crank? We all fancy ourselves as pretty smart, but we have seen that intelligence is not an effective guardian against irrationality and may actually work against rationality. How can you be rational when even reason so often leads us astray? Indeed, as Immanuel Kant argued at great length: The purer the reason, the better it is at building castles on the clouds.

The best advice I have seen comes from philosopher Robert Fogelin’s terrific little book Walking the Tightrope of Reason. In a nutshell, his advice is this: Insofar as possible let your concepts be constrained by something nonconceptual. Don’t permit reason to rely only on its own resources; discipline reason with testable, measurable fact whenever possible. Practically speaking, this means that your cognitive practices should employ the methods and techniques of science as much as possible, since those methods and techniques are the most effective ones we have found for testing beliefs vis-à-vis the natural world. As for beliefs not amenable to such testing, such as the claims of metaphysics, it is best to cultivate a healthy skepticism.

So that’s it? Be more scientific and skeptical if you want to be less dogmatic? Gee, it hardly takes a PhD in philosophy to dispense such advice. But the problem is not in saying what needs to be done but in doing it. Intellectual virtue, like moral virtue, requires careful and assiduous cultivation. Intelligence is natural, but rationality is learned, and learning to be rational can be a lifelong task. Like courage, temperance, or generosity, rationality must be cultivated and made habitual. Sadly, “critical thinking” is a faddish phrase now found too often on the lips of ignoramuses who know nothing about it and are conspicuously incapable of practicing it. However, if we could actually achieve a system of education that really did inculcate the practice of critical thinking, this would be the pearl beyond price, and the best possible antidote to dogma.