Atheistic Scientism Revisited

I recently had the great pleasure of reading The Devil’s Delusion: Atheism and Its Scientific Pretensions, by David Berlinski. (Many thanks to Roger Barnett for recommending the book to me.) Berlinski, who knows far more about science than I do, writes with flair and scathing logic. I can’t do justice to his book, but I will try to convey its gist.

Before I do that, I must tell you that I enjoyed Berlinski’s book not only because of the author’s acumen and biting wit, but also because he agrees with me. (I suppose I should say, in modesty, that I agree with him.) I have argued against atheistic scientism in many blog posts (see below).

Here is my version of the argument against atheism in its briefest form (June 15, 2011):

  1. In the material universe, cause precedes effect.
  2. Accordingly, the material universe cannot be self-made. It must have a “starting point,” but the “starting point” cannot be in or of the material universe.
  3. The existence of the universe therefore implies a separate, uncaused cause.

There is no reasonable basis — and certainly no empirical one — on which to prefer atheism to deism or theism. Strident atheists merely practice a “religion” of their own. They have neither logic nor science nor evidence on their side — and eons of belief against them.

As for scientism, I call upon Friedrich Hayek:

[W]e shall, wherever we are concerned … with slavish imitation of the method and language of Science, speak of “scientism” or the “scientistic” prejudice…. It should be noted that, in the sense in which we shall use these terms, they describe, of course, an attitude which is decidedly unscientific in the true sense of the word, since it involves a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed. The scientistic as distinguished from the scientific view is not an unprejudiced but a very prejudiced approach which, before it has considered its subject, claims to know what is the most appropriate way of investigating it. [The Counter Revolution Of Science]

As Berlinski amply illustrates and forcefully argues, atheistic scientism is rampant in the so-called sciences. I have reproduced below some key passages from Berlinski’s book. They are representative, but far from exhaustive (though I did nearly exhaust the publisher’s copy limit on the Kindle edition). I have forgone the block-quotation style for ease of reading, and have inserted triple asterisks to indicate (sometimes subtle) changes of topic.

*   *   *

Richard Dawkins, the author of The God Delusion, … is not only an intellectually fulfilled atheist, he is determined that others should be as full as he. A great many scientists are satisfied that at last someone has said out loud what so many of them have said among themselves: Scientific and religious belief are in conflict. They cannot both be right. Let us get rid of the one that is wrong….

Because atheism is said to follow from various scientific doctrines, literary atheists, while they are eager to speak their minds, must often express themselves in other men’s voices. Christopher Hitchens is an example. With forthcoming modesty, he has affirmed his willingness to defer to the world’s “smart scientists” on any matter more exigent than finger-counting. Were smart scientists to report that a strain of yeast supported the invasion of Iraq, Hitchens would, no doubt, conceive an increased respect for yeast….

If nothing else, the attack on traditional religious thought marks the consolidation in our time of science as the single system of belief in which rational men and women might place their faith, and if not their faith, then certainly their devotion. From cosmology to biology, its narratives have become the narratives. They are, these narratives, immensely seductive, so much so that looking at them with innocent eyes requires a very deliberate act. And like any militant church, this one places a familiar demand before all others: Thou shalt have no other gods before me.

It is this that is new; it is this that is important….

For scientists persuaded that there is no God, there is no finer pleasure than recounting the history of religious brutality and persecution. Sam Harris is in this regard especially enthusiastic, The End of Faith recounting in lurid but lingering detail the methods of torture used in the Spanish Inquisition….

Nonetheless, there is this awkward fact: The twentieth century was not an age of faith, and it was awful. Lenin, Stalin, Hitler, Mao, and Pol Pot will never be counted among the religious leaders of mankind….

… Just who has imposed on the suffering human race poison gas, barbed wire, high explosives, experiments in eugenics, the formula for Zyklon B, heavy artillery, pseudo-scientific justifications for mass murder, cluster bombs, attack submarines, napalm, intercontinental ballistic missiles, military space platforms, and nuclear weapons?

If memory serves, it was not the Vatican….

What Hitler did not believe and what Stalin did not believe and what Mao did not believe and what the SS did not believe and what the Gestapo did not believe and what the NKVD did not believe and what the commissars, functionaries, swaggering executioners, Nazi doctors, Communist Party theoreticians, intellectuals, Brown Shirts, Black Shirts, gauleiters, and a thousand party hacks did not believe was that God was watching what they were doing.

And as far as we can tell, very few of those carrying out the horrors of the twentieth century worried overmuch that God was watching what they were doing either.

That is, after all, the meaning of a secular society….

Richard Weikart, … in his admirable treatise, From Darwin to Hitler: Evolutionary Ethics, Eugenics, and Racism in Germany, makes clear what anyone capable of reading the German sources already knew: A sinister current of influence ran from Darwin’s theory of evolution to Hitler’s policy of extermination.

*   *   *

It is wrong, the nineteenth-century British mathematician W. K. Clifford affirmed, “always, everywhere, and for anyone, to believe anything upon insufficient evidence.” I am guessing that Clifford believed what he wrote, but what evidence he had for his belief, he did not say.

Something like Clifford’s injunction functions as the premise in a popular argument for the inexistence of God. If God exists, then his existence is a scientific claim, no different in kind from the claim that there is tungsten to be found in Bermuda. We cannot have one set of standards for tungsten and another for the Deity….

There remains the obvious question: By what standards might we determine that faith in science is reasonable, but that faith in God is not? It may well be that “religious faith,” as the philosopher Robert Todd Carroll has written, “is contrary to the sum of evidence,” but if religious faith is found wanting, it is reasonable to ask for a restatement of the rules by which “the sum of evidence” is computed….

… The concept of sufficient evidence is infinitely elastic…. What a physicist counts as evidence is not what a mathematician generally accepts. Evidence in engineering has little to do with evidence in art, and while everyone can agree that it is wrong to go off half-baked, half-cocked, or half-right, what counts as being baked, cocked, or right is simply too variable to suggest a plausible general principle….

Neither the premises nor the conclusions of any scientific theory mention the existence of God. I have checked this carefully. The theories are by themselves unrevealing. If science is to champion atheism, the requisite demonstration must appeal to something in the sciences that is not quite a matter of what they say, what they imply, or what they reveal.

*   *   *

The universe in its largest aspect is the expression of curved space and time. Four fundamental forces hold sway. There are black holes and various infernal singularities. Popping out of quantum fields, the elementary particles appear as bosons or fermions. The fermions are divided into quarks and leptons. Quarks come in six varieties, but they are never seen, confined as they are within hadrons by a force that perversely grows weaker at short distances and stronger at distances that are long. There are six leptons in four varieties. Depending on just how things are counted, matter has as its fundamental constituents twenty-four elementary particles, together with a great many fields, symmetries, strange geometrical spaces, and forces that are disconnected at one level of energy and fused at another, together with at least a dozen different forms of energy, all of them active.

… It is remarkably baroque. And it is promiscuously catholic. For the atheist persuaded that materialism offers him a no-nonsense doctrinal affiliation, materialism in this sense comes to the declaration of a barroom drinker that he will have whatever he’s having, no matter who he is or what he is having. What he is having is what he always takes, and that is any concept, mathematical structure, or vagrant idea needed to get on with it. If tomorrow, physicists determine that particle physics requires access to the ubiquity of the body of Christ, that doctrine would at once be declared a physical principle and treated accordingly….

What remains of the ideology of the sciences? It is the thesis that the sciences are true— who would doubt it?— and that only the sciences are true. The philosopher Michael Devitt thus argues that “there is only one way of knowing, the empirical way that is the basis of science.” An argument against religious belief follows at once on the assumptions that theology is not science and belief is not knowledge. If by means of this argument it also follows that neither mathematics, the law, nor the greater part of ordinary human discourse have a claim on our epistemological allegiance, they must be accepted as casualties of war.

*   *   *

The claim that the existence of God should be treated as a scientific question stands on a destructive dilemma: If by science one means the great theories of mathematical physics, then the demand is unreasonable. We cannot treat any claim in this way. There is no other intellectual activity in which theory and evidence have reached this stage of development….

Is there a God who has among other things created the universe? “It is not by its conclusions,” C. F. von Weizsäcker has written in The Relevance of Science, “but by its methodological starting point that modern science excludes direct creation. Our methodology would not be honest if this fact were denied . . . such is the faith in the science of our time, and which we all share” (italics added).

In science, as in so many other areas of life, faith is its own reward….

The medieval Arabic argument known as the kalam is an example of the genre [cosmological argument].

Its first premise: Everything that begins to exist has a cause.

And its second: The universe began to exist.

And its conclusion: So the universe had a cause.

This is not by itself an argument for the existence of God. It is suggestive without being conclusive. Even so, it is an argument that in a rush covers a good deal of ground carelessly denied by atheists. It is one thing to deny that there is a God; it is quite another to deny that the universe has a cause….

The universe, orthodox cosmologists believe, came into existence as the expression of an explosion— what is now called the Big Bang. The word explosion is a sign that words have failed us, as they so often do, for it suggests a humanly comprehensible event— a gigantic explosion or a stupendous eruption. This is absurd. The Big Bang was not an event taking place at a time or in a place. Space and time were themselves created by the Big Bang, the measure along with the measured….

Whatever its name, as far as most physicists are concerned, the Big Bang is now a part of the established structure of modern physics….

… Many physicists have found the idea that the universe had a beginning alarming. “So long as the universe had a beginning,” Stephen Hawking has written, “we could suppose it had a creator.” God forbid!

… Big Bang cosmology has been confirmed by additional evidence, some of it astonishing. In 1963, the physicists Arno Penzias and Robert Wilson observed what seemed to be the living remnants of the Big Bang— and after 14 billion years!— when they detected, by means of a hum in their equipment, a signal in the night sky they could only explain as the remnants of the microwave radiation background left over from the Big Bang itself.

More than anything else, this observation, and the inference it provoked, persuaded physicists that the structure of Big Bang cosmology was anchored into fact….

“Perhaps the best argument in favor of the thesis that the Big Bang supports theism,” the astrophysicist Christopher Isham has observed, “is the obvious unease with which it is greeted by some atheist physicists. At times this has led to scientific ideas, such as continuous creation or an oscillating universe, being advanced with a tenacity which so exceeds their intrinsic worth that one can only suspect the operation of psychological forces lying very much deeper than the usual academic desire of a theorist to support his or her theory.”…

… With the possibility of inexistence staring it in the face, why does the universe exist? To say that the universe just is, as Stephen Hawking has said, is to reject out of hand any further questions. We know that it is. It is right there in plain sight. What philosophers such as ourselves wish to know is why it is. It may be that at the end of these inquiries we will answer our own question by saying that the universe exists for no reason whatsoever. At the end of these inquiries, and not the beginning….

Among physicists, the question of how something emerged from nothing has one decisive effect: It loosens their tongues. “One thing [that] is clear,” a physicist writes, “in our framing of questions such as ‘How did the Universe get started?’ is that the Universe was self-creating. This is not a statement on a ‘cause’ behind the origin of the Universe, nor is it a statement on a lack of purpose or destiny. It is simply a statement that the Universe was emergent, that the actual Universe probably derived from an indeterminate sea of potentiality that we call the quantum vacuum, whose properties may always remain beyond our current understanding.”

It cannot be said that “an indeterminate sea of potentiality” has anything like the clarifying effect needed by the discussion, and indeed, except for sheer snobbishness, physicists have offered no reason to prefer this description of the Source of Being to the one offered by Abu al-Hassan al-Ashari in ninth-century Baghdad. The various Islamic versions of that indeterminate sea of being he rejected in a spasm of fierce disgust. “We confess,” he wrote, “that God is firmly seated on his throne. We confess that God has two hands, without asking how. We confess that God has two eyes, without asking how. We confess that God has a face.”…

Proposing to show how something might emerge from nothing, [the physicist Victor Stenger] introduces “another universe [that] existed prior to ours that tunneled through . . . to become our universe. Critics will argue that we have no way of observing such an earlier universe, and so this is not very scientific” (italics added). This is true. Critics will do just that. Before they do, they will certainly observe that Stenger has completely misunderstood the terms of the problem that he has set himself, and that far from showing how something can arise from nothing, he has shown only that something might arise from something else. This is not an observation that has ever evoked a firestorm of controversy….

… [A]ccording to the many-worlds interpretation [of quantum mechanics], at precisely the moment a measurement is made, the universe branches into two or more universes. The cat who was half dead and half alive gives rise to two separate universes, one containing a cat who is dead, the other containing a cat who is alive. The new universes cluttering up creation embody the quantum states that were previously in a state of quantum superposition.

The many-worlds interpretation of quantum mechanics is rather like the incarnation. It appeals to those who believe in it, and it rewards belief in proportion to which belief is sincere….

No less than the doctrines of religious belief, the doctrines of quantum cosmology are what they seem: biased, partial, inconclusive, and largely in the service of passionate but unexamined conviction.

*   *   *

The cosmological constant is a number controlling the expansion of the universe. If it were negative, the universe would appear doomed to contract in upon itself, and if positive, equally doomed to expand out from itself. Like the rest of us, the universe is apparently doomed no matter what it does. And here is the odd point: If the cosmological constant were larger than it is, the universe would have expanded too quickly, and if smaller, it would have collapsed too early, to permit the appearance of living systems….

“Scientists,” the physicist Paul Davies has observed, “are slowly waking up to an inconvenient truth— the universe looks suspiciously like a fix. The issue concerns the very laws of nature themselves. For 40 years, physicists and cosmologists have been quietly collecting examples of all too convenient ‘coincidences’ and special features in the underlying laws of the universe that seem to be necessary in order for life, and hence conscious beings, to exist. Change any one of them and the consequences would be lethal.”….

Why? Yes, why?

An appeal to still further physical laws is, of course, ruled out on the grounds that the fundamental laws of nature are fundamental. An appeal to logic is unavailing. The laws of nature do not seem to be logical truths. The laws of nature must be intrinsically rich enough to specify the panorama of the universe, and the universe is anything but simple. As Newton remarks, “Blind metaphysical necessity, which is certainly the same always and everywhere, could produce no variety of things.”

If the laws of nature are neither necessary nor simple, why, then, are they true?

Questions about the parameters and laws of physics form a single insistent question in thought: Why are things as they are when what they are seems anything but arbitrary?

One answer is obvious. It is the one that theologians have always offered: The universe looks like a put-up job because it is a put-up job.

*   *   *

Any conception of a contingent deity, Aquinas argues, is doomed to fail, and it is doomed to fail precisely because whatever He might do to explain the existence of the universe, His existence would again require an explanation. “Therefore, not all beings are merely possible, but there must exist something the existence of which is necessary.”…

… “We feel,” Wittgenstein wrote, “that even when all possible scientific questions have been answered, the problems of life remain completely untouched.” Those who do feel this way will see, following Aquinas, that the only inference calculated to overcome the way things are is one directed toward the way things must be….

“The key difference between the radically extravagant God hypothesis,” [Dawkins] writes, “and the apparently extravagant multiverse hypothesis, is one of statistical improbability.”

It is? I had no idea, the more so since Dawkins’s very next sentence would seem to undercut the sentence he has just written. “The multiverse, for all that it is extravagant, is simple,” because each of its constituent universes “is simple in its fundamental laws.”

If this is true for each of those constituent universes, then it is true for our universe as well. And if our universe is simple in its fundamental laws, what on earth is the relevance of Dawkins’s argument?

Simple things, simple explanations, simple laws, a simple God.

Bon appétit.

*   *   *

As a rhetorical contrivance, the God of the Gaps makes his effect contingent on a specific assumption: that whatever the gaps, they will in the course of scientific research be filled…. Western science has proceeded by filling gaps, but in filling them, it has created gaps all over again. The process is inexhaustible. Einstein created the special theory of relativity to accommodate certain anomalies in the interpretation of Clerk Maxwell’s theory of the electromagnetic field. Special relativity led directly to general relativity. But general relativity is inconsistent with quantum mechanics, the largest visions of the physical world alien to one another. Understanding has improved, but within the physical sciences, anomalies have grown great, and what is more, anomalies have grown great because understanding has improved….

… At the very beginning of his treatise Vertebrate Paleontology and Evolution, Robert Carroll observes quite correctly that “most of the fossil record does not support a strictly gradualistic account” of evolution. A “strictly gradualistic” account is precisely what Darwin’s theory demands: It is the heart and soul of the theory.

But by the same token, there are no laboratory demonstrations of speciation either, millions of fruit flies coming and going while never once suggesting that they were destined to appear as anything other than fruit flies. This is the conclusion suggested as well by more than six thousand years of artificial selection, the practice of barnyard and backyard alike. Nothing can induce a chicken to lay a square egg or to persuade a pig to develop wheels mounted on ball bearings….

… In a research survey published in 2001, and widely ignored thereafter, the evolutionary biologist Joel Kingsolver reported that in sample sizes of more than one thousand individuals, there was virtually no correlation between specific biological traits and either reproductive success or survival. “Important issues about selection,” he remarked with some understatement, “remain unresolved.”

Of those important issues, I would mention prominently the question whether natural selection exists at all.

Computer simulations of Darwinian evolution fail when they are honest and succeed only when they are not. Thomas Ray has for years been conducting computer experiments in an artificial environment that he has designated Tierra. Within this world, a shifting population of computer organisms meet, mate, mutate, and reproduce.

Sandra Blakeslee, writing for the New York Times, reported the results under the headline “Computer ‘Life Form’ Mutates in an Evolution Experiment: Natural Selection Is Found at Work in a Digital World.”

Natural selection found at work? I suppose so, for as Blakeslee observes with solemn incomprehension, “the creatures mutated but showed only modest increases in complexity.” Which is to say, they showed nothing of interest at all. This is natural selection at work, but it is hardly work that has worked to intended effect.

What these computer experiments do reveal is a principle far more penetrating than any that Darwin ever offered: There is a sucker born every minute….

… Daniel Dennett, like Mexican food, does not fail to come up long after he has gone down. “Contemporary biology,” he writes, “has demonstrated beyond all reasonable doubt that natural selection— the process in which reproducing entities must compete for finite resources and thereby engage in a tournament of blind trial and error from which improvements automatically emerge— has the power to generate breathtakingly ingenious designs” (italics added).

These remarks are typical in their self-enchanted self-confidence. Nothing in the physical sciences, it goes without saying— right?— has been demonstrated beyond all reasonable doubt. The phrase belongs to a court of law. The thesis that improvements in life appear automatically represents nothing more than Dennett’s conviction that living systems are like elevators: If their buttons are pushed, they go up. Or down, as the case may be. Although Darwin’s theory is very often compared favorably to the great theories of mathematical physics on the grounds that evolution is as well established as gravity, very few physicists have been heard observing that gravity is as well established as evolution. They know better and they are not stupid….

… The greater part of the debate over Darwin’s theory is not in service to the facts. Nor to the theory. The facts are what they have always been: They are unforthcoming. And the theory is what it always was: It is unpersuasive. Among evolutionary biologists, these matters are well known. In the privacy of the Susan B. Anthony faculty lounge, they often tell one another with relief that it is a very good thing the public has no idea what the research literature really suggests.

“Darwin?” a Nobel laureate in biology once remarked to me over his bifocals. “That’s just the party line.”

In the summer of 2007, Eugene Koonin, of the National Center for Biotechnology Information at the National Institutes of Health, published a paper entitled “The Biological Big Bang Model for the Major Transitions in Evolution.”

The paper is refreshing in its candor; it is alarming in its consequences. “Major transitions in biological evolution,” Koonin writes, “show the same pattern of sudden emergence of diverse forms at a new level of complexity” (italics added). Major transitions in biological evolution? These are precisely the transitions that Darwin’s theory was intended to explain. If those “major transitions” represent a “sudden emergence of new forms,” the obvious conclusion to draw is not that nature is perverse but that Darwin was wrong….

Koonin is hardly finished. He has just started to warm up. “In each of these pivotal nexuses in life’s history,” he goes on to say, “the principal ‘types’ seem to appear rapidly and fully equipped with the signature features of the respective new level of biological organization. No intermediate ‘grades’ or intermediate forms between different types are detectable.”…

… [H]is views are simply part of a much more serious pattern of intellectual discontent with Darwinian doctrine. Writing in the 1960s and 1970s, the Japanese mathematical biologist Motoo Kimura argued that on the genetic level— the place where mutations take place— most changes are selectively neutral. They do nothing to help an organism survive; they may even be deleterious…. Kimura was perfectly aware that he was advancing a powerful argument against Darwin’s theory of natural selection. “The neutral theory asserts,” he wrote in the introduction to his masterpiece, The Neutral Theory of Molecular Evolution, “that the great majority of evolutionary changes at the molecular level, as revealed by comparative studies of protein and DNA sequences, are caused not by Darwinian selection but by random drift of selectively neutral or nearly neutral mutations” (italics added)….

… Writing in the Proceedings of the National Academy of Sciences, the evolutionary biologist Michael Lynch observed that “Dawkins’s agenda has been to spread the word on the awesome power of natural selection.” The view that results, Lynch remarks, is incomplete and therefore “profoundly misleading.” Lest there be any question about Lynch’s critique, he makes the point explicitly: “What is in question is whether natural selection is a necessary or sufficient force to explain the emergence of the genomic and cellular features central to the building of complex organisms.”…

When asked what he was in awe of, Christopher Hitchens responded that his definition of an educated person is that you have some idea how ignorant you are. This seems very much as if Hitchens were in awe of his own ignorance, in which case he has surely found an object worthy of his veneration.

*   *   *

Do read the whole thing. It will take you only a few hours. And it will remind you — as we badly need reminding these days — that sanity reigns in some corners of the universe.


Related posts:

Same Old Story, Same Old Song and Dance
Atheism, Religion, and Science
The Limits of Science
Beware of Irrational Atheism
The Thing about Science
Evolution and Religion
Words of Caution for Scientific Dogmatists
The Legality of Teaching Intelligent Design
Science, Logic, and God
Debunking “Scientific Objectivity”
Science’s Anti-Scientific Bent
The Big Bang and Atheism
Atheism, Religion, and Science Redux
Religion as Beneficial Evolutionary Adaptation
A Non-Believer Defends Religion
The Greatest Mystery
Landsburg Is Half-Right
Evolution, Human Nature, and “Natural Rights”
More Thoughts about Evolutionary Teleology
A Digression about Probability and Existence
Existence and Creation
Probability, Existence, and Creation
The Atheism of the Gaps
Demystifying Science
Religion on the Left
Scientism, Evolution, and the Meaning of Life
Something from Nothing?
Something or Nothing
My Metaphysical Cosmology
Further Thoughts about Metaphysical Cosmology
Nothingness
Pinker Commits Scientism
Spooky Numbers, Evolution, and Intelligent Design
The Limits of Science (II)
The Pretence of Knowledge
“The Science Is Settled”
The Limits of Science, Illustrated by Scientists
Some Thoughts about Evolution
Rationalism, Empiricism, and Scientific Knowledge
Fine-Tuning in a Wacky Wrapper
Beating Religion with the Wrong End of the Stick
Quantum Mechanics and Free Will
“Science” vs. Science: The Case of Evolution, Race, and Intelligence
The Fragility of Knowledge
Altruism, One More Time
Religion, Creation, and Morality
Evolution, Intelligence, and Race

Not-So-Random Thoughts (XIII)

Links to the other posts in this occasional series may be found at “Favorite Posts,” just below the list of topics.

*     *     *

Jeremy Egerer says this in “In Defense of a Beautiful Boss” (American Thinker, February 8, 2015):

Leftists have been waging a war against nearly every personal advantage for years: if they aren’t upset because your parents are rich, they’ll insult you because your parents are white, or maybe because you have a penis.  In their most unreasonable moments, they might even be upset that you deserve your own job.  It seems only reasonable to expect that sooner or later, they would be complaining about whether or not our bosses keep themselves in shape.

This is because at the heart of all leftism lies an unreasonable envy of all advantage (disguised as an advocacy of the disadvantaged) and an unhealthy hatred of actual diversity (disguised as an appreciation of difference).  They call life a meritocracy when your successful parents raise you to win, which is a lot like complaining that your parents raised you at all.  It’s almost enough to make you wonder whether they loathe the laws of cause and effect.  In the fight against all odds – not his, but everyone’s – the leftist hasn’t only forgotten that different people breed different people; he’s forgotten that different people are diversity itself, and that diversity, the thing he claims to be championing, means that someone is going to have natural advantages.

Spot on. I have addressed the left’s war on “lookism” in “How to Combat Beauty-ism” and “An Economist’s Special Pleading: Affirmative Action for the Ugly.”

*     *     *

John Ray tackles “Conservative and Liberal Brains Again” (A Western Heart, February 14, 2015):

Most such reports [Current Biology 21, 677–680, April 26, 2011 ©2011. DOI 10.1016/j.cub.2011.03.017] are … parsimoniously interpreted as conservatives being more cautious, which is hardly a discovery. And if there is something wrong with caution then there is everything wrong with a lot of things.  Science, for instance, is a sustained exercise in caution. So conservatives are born more cautious and Leftist brains miss most of that out.  So [a commentary that conservatives are] “sensitive to fear” … could be equally well restated as “cautious”.  And the finding that liberals “have a higher capacity to tolerate uncertainty and conflicts” is pure guesswork [on the part of the commentators].  As the report authors note, that is just “one of the functions of the anterior cingulate cortex”.

Despite the apparent even-handedness of the authors of the study cited by Dr. Ray, the field of psychology has long had a pro-left tilt. See, for example, my posts “Conservatism, Libertarianism, and the ‘Authoritarian Personality’,” “The F Scale, Revisited,” and “The Psychologist Who Played God.”

*     *     *

Income inequality is another item in the long list of subjects about which leftists obsess, despite the facts of the matter. Mark J. Perry, as usual, deals in facts: “US Middle Class Has Disappeared into Higher-Income Groups; Recent Stagnation Explained by Changing Household Demographics?” (AEI.org, February 4, 2015) and “Evidence Shows That Affluence in the US Is Much More Fluid and Widespread Than The Rigid Class Structure Narrative Suggests” (AEI.org, February 25, 2015). The only problem with these two posts is Perry’s unnecessary inclusion of a question mark in the title of the first one. For more on the subject, plus long lists of related posts and readings, see my post, “Mass (Economic) Hysteria: Income Inequality and Related Themes.”

*     *     *

Speaking of leftists who obsess about income inequality — and get it wrong — there’s Thomas Piketty, author of the much-rebutted Capital in the Twenty-First Century. I have much to say about Deirdre McCloskey’s take-down of Piketty in “McCloskey on Piketty.” David Henderson, whose review of Capital is among the several related readings listed in my post, has more to say; for example:

McCloskey’s review is a masterpiece. She beautifully weaves together economic history, simple price theory, basic moral philosophy, and history of economic thought. Whereas I had mentally put aside an hour to read and think, it took only about 20 minutes. I highly recommend it. (“McCloskey on Piketty,” EconLog, February 25, 2015)

Henderson continues by sampling some of Piketty’s many errors of fact, logic, and economic theory that McCloskey exposes.

*     *     *

Although it won’t matter to committed leftists, Piketty seems to have taken some of his critics to heart. James Pethokoukis writes:

[I]n a new paper, Piketty takes a step or two backward. He now denies that he views his simple economic formula “as the only or even the primary tool for considering changes in income and wealth in the 20th century, or for forecasting the path of income and wealth inequality in the 21st century.” Seems his fundamental law isn’t so fundamental after all once you factor in things like how some of that wealth is (a) spent on super-yachts and bad investments; (b) divided among children through the generations; and (c) already taxed fairly heavily. In particular, the rise in income inequality, as opposed to wealth inequality, has “little to do” with “r > g,” he says….

Piketty’s modest retreat isn’t all that surprising, given the withering academic assault on his research. In a survey of top economists late last year, 81 percent disagreed with his thesis. And several used fairly rough language — at least for scholars — such as “weak” and not “particularly useful,” with one accusing Piketty of “poor theory” and “negligible empirics.”

This is all rather bad news for what I have termed the Unified Economic Theory of Modern Liberalism: Not only are the rich getting richer — and will continue to do so because, you know, capitalism — but this growing gap is hurting economic growth. Redistribution must commence, tout de suite!

But Piketty’s clarification isn’t this politically convenient theory’s only problem. The part about inequality and growth has also suffered a setback. The link between the two is a key part of the “secular stagnation” theory of superstar Democratic economist Lawrence Summers. Since the rich save more than the middle class, growing income inequality is sapping the economy of consumer demand. So government must tax more and spend more. But Summers recently offered an updated view, saying that while boosting consumer demand is necessary, it is not sufficient for strong economic growth. Washington must also do the sort of “supply-side” stuff that Republicans kvetch about, such as business tax reform.

…[C]oncern about the income gap shouldn’t be used [as] an excuse to ignore America’s real top problem, a possible permanent downshift in the growth potential of the U.S. economy. At least Piketty got half his equation right. [“The Politically Convenient but Largely Bogus Unified Economic Theory of Modern Liberalism,” The Week, March 11, 2015]

About that bogus inequality-hurts-growth meme, see my post, “Income Inequality and Economic Growth.”

*     *     *

Harvard’s Robert Putnam is another class warrior, whose propagandistic effusion “E Pluribus Unum: Diversity and Community in the Twenty-first Century” I skewer in “Society and the State” and “Genetic Kinship and Society.” I was therefore gratified to read, in Henry Harpending’s post “Charles Murray and Robert Putnam on Class” (West Hunter, March 20, 2015), some things said by John Derbyshire about Putnam’s paper:

That paper has a very curious structure. After a brief introduction (two pages), there are three main sections, headed as follows:

The Prospects and Benefits of Immigration and Ethnic Diversity (three pages)
Immigration and Diversity Foster Social Isolation (nineteen pages)
Becoming Comfortable with Diversity (seven pages)

I’ve had some mild amusement here at my desk trying to think up imaginary research papers similarly structured. One for publication in a health journal, perhaps, with three sections titled:

Health benefits of drinking green tea
Green tea causes intestinal cancer
Making the switch to green tea

Social science research in our universities cries out for a modern Jonathan Swift to lampoon its absurdities.

Amen.

*     *     *

Putnam is a big booster of “diversity,” which — in the left’s interpretation — doesn’t mean diversity of political, social, and economic views. What it means is the forced association of persons of irreconcilably opposed social norms. I say some things about that in “Society and the State” and “Genetic Kinship and Society.” Fred Reed has much more to say in a recent column:

In Ferguson blacks are shooting policemen as others cheer. It does a curmudgeon’s soul good: Everything gets worse, the collapse continues, and unreasoning stupidity goes thundering into the future.

We will hear I suppose that it wasn’t racial, that teens did it, that discrimination  caused it, white privilege, racism, institutional racism, slavery, colonialism, bigots, Southerners, rednecks—everything but the hatred of blacks for whites.

And thus we will avoid the unavoidable, that racial relations are a disaster, will remain a disaster, will get worse, are getting worse, and will lead to some awful denouement no matter how much we lie, preen, vituperate, chatter like Barbary apes, or admire ourselves.

It isn’t working. There is no sign that it ever will. What now?

The only solution, if there is a solution, would seem to be an amicable separation. This methinks would be greatly better than the slow-motion, intensifying racial war we now see, and pretend not to see. When the races mix, there is trouble. So, don’t mix them….

The racial hostility of blacks for whites can be seen elsewhere, for example in targeting of crime, most starkly in interracial rates of rape…. The numbers on rape, almost entirely black on white, also check out as cold fact… This has been analyzed to death, and ignored to death, but perhaps the most readable account is Jim Goad’s For Whom the Cat Calls (the numbers of note come below the ads).

Even without the (inevitable) racial hostility, togetherheid would not work well. The races have little or nothing in common. They do not want the same things. Whites come from a literate European tradition dating at least from the Iliad in 800 BC, a tradition characterized by literature, mathematics, architecture, philosophy, and the sciences. Africa, having very different social traditions, was barely touched by this, and today blacks still show little interest. Even in the degenerate America of today, whites put far more emphasis on education than do blacks.

The media paint the problems of blacks as consequent to discrimination, but they clearly are not. If blacks in white schools wanted to do the work, or could, whites would applaud. If in black schools they demanded thicker textbooks with bigger words and smaller pictures, no white would refuse. The illiteracy, the very high rates of illegitimacy, the crime in general, the constant killing of young black men by young black men in particular—whites do not do these. They are either genetic, and irremediable, or cultural, and remediable, if at all, only in the very long run. We live in the short run.

Would it then not be reasonable to encourage a voluntary segregation? Having only black policemen in black regions would slow the burning of cities. If we let people live among their own, let them study what they chose to study, let them police themselves and order their schools as they chose, considerable calm would fall over the country.

If the races had the choice of running their own lives apart, they would. If this is not true, why do we have to spend such effort trying to force them together?

It is a great fallacy to think that because we ought to love one another, we will; or that because bloodshed among groups makes no sense, it won’t happen. The disparate seldom get along, whether Tamils and Sinhalese or Hindus and Moslems or Protestants and Catholics or Jews and Palestinians. The greater the cultural and genetic difference, the greater the likelihood and intensity of conflict. Blacks and whites are very, very different….

Separation does not imply disadvantage. The assertion that “separate is inherently unequal” is a catchiphrastic embodiment of the Supreme Court’s characteristic blowing in the political wind. A college for girls is not inherently inferior to a college for boys, nor a yeshiva for Jews inherently inferior to a parish school for Catholics. And maybe it is the business of girls and boys, Catholics and Jews, to decide what and where they want to study—not the government’s business….

Anger hangs over the country. Not everyone white is a professor or collegiate sophomore or network anchor. Not every white—not by a long shot—in Congress or the federal bureaucracy is a Mother Jones liberal, not in private conversation. They say aloud what they have to say. But in the Great Plains and small-town South, in corner bars in Chicago and Denver, in the black enclaves of the cities, a lot of people are ready to rumble. Read the comments section of the St. Louis papers after the riots. We can call the commenters whatever names we choose but when we finish, they will still be there. The shooting of policemen for racial reasons–at least four to date–is not a good sign. We will do nothing about it but chatter. [“The Symptoms Worsen,” Fred on Everything, March 15, 2015]

See also Reed’s column “Diversity: Koom. Bah. Humbug” (January 13, 2015) and my posts, “Race and Reason: The Achievement Gap — Causes and Implications,” “The Hidden Tragedy of the Assassination of Lincoln,” “‘Conversing’ about Race,” “‘Wading’ into Race, Culture, and IQ,” “Round Up the Usual Suspects,” and “Evolution, Culture, and ‘Diversity’.”

*     *     *

In “The Fallacy of Human Progress” I address at length the thesis of Steven Pinker’s ludicrous The Better Angels of Our Nature: Why Violence Has Declined. In rebuttal to Pinker, I cite John Gray, author of The Silence of Animals: On Progress and Other Modern Myths:

Gray’s book — published 18 months after Better Angels — could be read as a refutation of Pinker’s book, though Gray doesn’t mention Pinker or his book.

Well, Gray recently published a refutation of Pinker’s book, which I can’t resist quoting at length:

The Better Angels of Our Nature: a history of violence and humanity (2011) has not only been an international bestseller – more than a thousand pages long and containing a formidable array of graphs and statistics, the book has established something akin to a contemporary orthodoxy. It is now not uncommon to find it stated, as though it were a matter of fact, that human beings are becoming less violent and more altruistic. Ranging freely from human pre-history to the present day, Pinker presents his case with voluminous erudition. Part of his argument consists in showing that the past was more violent than we tend to imagine…. This “civilising process” – a term Pinker borrows from the sociologist Norbert Elias – has come about largely as a result of the increasing power of the state, which in the most advanced countries has secured a near-monopoly of force. Other causes of the decline in violence include the invention of printing, the empowerment of women, enhanced powers of reasoning and expanding capacities for empathy in modern populations, and the growing influence of Enlightenment ideals….

Another proponent of the Long Peace is the well-known utilitarian philosopher Peter Singer, who has praised The Better Angels of Our Nature as “a supremely important book … a masterly achievement. Pinker convincingly demonstrates that there has been a dramatic decline in violence, and he is persuasive about the causes of that decline.” In a forthcoming book, The Most Good You Can Do, Singer describes altruism as “an emerging movement” with the potential to fundamentally alter the way humans live….

Among the causes of the outbreak of altruism, Pinker and Singer attach particular importance to the ascendancy of Enlightenment thinking….

…Pinker’s response when confronted with [contrary] evidence is to define the dark side of the Enlightenment out of existence. How could a philosophy of reason and toleration be implicated in mass murder? The cause can only be the sinister influence of counter-Enlightenment ideas….

The picture of declining violence presented by this new orthodoxy is not all it seems to be. As some critics, notably John Arquilla, have pointed out, it’s a mistake to focus too heavily on declining fatalities on the battlefield….

If great powers have avoided direct armed conflict, they have fought one another in many proxy wars. Neocolonial warfare in south-east Asia, the Korean war and the Chinese invasion of Tibet, British counter-insurgency warfare in Malaya and Kenya, the abortive Franco-British invasion of Suez, the Angolan civil war, the Soviet invasions of Hungary, Czechoslovakia and Afghanistan, the Vietnam war, the Iran-Iraq war, the first Gulf war, covert intervention in the Balkans and the Caucasus, the invasion of Iraq, the use of airpower in Libya, military aid to insurgents in Syria, Russian cyber-attacks in the Baltic states and the proxy war between the US and Russia that is being waged in Ukraine – these are only some of the contexts in which great powers have been involved in continuous warfare against each other while avoiding direct military conflict.

While it is true that war has changed, it has not become less destructive. Rather than a contest between well-organised states that can at some point negotiate peace, it is now more often a many-sided conflict in fractured or collapsed states that no one has the power to end….

It may be true that the modern state’s monopoly of force has led, in some contexts, to declining rates of violent death. But it is also true that the power of the modern state has been used for purposes of mass killing, and one should not pass too quickly over victims of state terror…. Pinker goes so far as to suggest that the 20th-century Hemoclysm might have been a gigantic statistical fluke, and cautions that any history of the last century that represents it as having been especially violent may be “apt to exaggerate the narrative coherence of this history” (the italics are Pinker’s). However, there is an equal or greater risk in abandoning a coherent and truthful narrative of the violence of the last century for the sake of a spurious quantitative precision….

While the seeming exactitude of statistics may be compelling, much of the human cost of war is incalculable…. [T]he statistics presented by those who celebrate the arrival of the Long Peace are morally dubious if not meaningless.

The radically contingent nature of the figures is another reason for not taking them too seriously. (For a critique of Pinker’s statistical methods, see Nassim Nicholas Taleb’s essay on the Long Peace.)…

Certainly the figures used by Pinker and others are murky, leaving a vast range of casualties of violence unaccounted for. But the value of these numbers for such thinkers comes from their very opacity. Like the obsidian mirrors made by the Aztecs for purposes of divination, these rows of graphs and numbers contain nebulous images of the future – visions that by their very indistinctness can give comfort to believers in human improvement….

Unable to tolerate the prospect that the cycles of conflict will continue, many are anxious to find continuing improvement in the human lot. Who can fail to sympathise with them? Lacking any deeper faith and incapable of living with doubt, it is only natural that believers in reason should turn to the sorcery of numbers. How else can they find meaning in their lives? [“John Gray: Steven Pinker Is Wrong about Violence and War,” The Guardian, March 13, 2015]

 *     *     *

I close this super-sized installment of “Thoughts” by returning to the subject of so-called net neutrality, which I addressed almost nine years ago in “Why ‘Net Neutrality’ Is a Bad Idea.” Now it’s a bad idea that the FCC has imposed on ISPs and their customers — until, one hopes, it’s rejected by the Supreme Court as yet another case of Obamanomic overreach.

As Robert Tracinski notes,

[b]illionaire investor Mark Cuban recently commented, about a push for new regulations on the Internet, that “In my adult life I have never seen a situation that paralleled what I read in Ayn Rand’s books until now with Net Neutrality.” He continued, “If Ayn Rand were an up-and-coming author today, she wouldn’t write about steel or railroads, it would be Net Neutrality.”

She certainly would, but if he thinks this is the first time real life has imitated Ayn Rand’s fiction, he needs to be paying a little more attention. Atlas has been shrugging for a long, long time. [“Net Neutrality: Yes, Mark Cuban, Atlas Is Shrugging,” The Federalist, March 18, 2015]

The rest of the story is outlined by the headings in Tracinski’s article:

The Relationship Between Net Neutrality and Atlas Shrugged

Internet Execs Are Already Uncomfortable with the Net Neutrality They Demanded

The Parallels Extend Into Fracking

Government Shuts Down Any Runaway Success

Atlas Shrugged Is Coming True Before Our Eyes

As I did in my post, Julian Adorney focuses on the economics of net neutrality:

After a number of false starts and under pressure from the White House, the FCC gave in and voted to regulate the Internet as a public utility in order to ban such practices, thus saving the Internet from a variety of boogeymen.

This is a tempting narrative. It has conflict, villains, heroes, and even a happy ending. There’s only one problem: it’s a fairy tale. Such mischief has been legal for decades, and ISPs have almost never behaved this way. Any ISP that created “slow lanes” or blocked content to consumers would be hurting its own bottom line. ISPs make money by seeking to satisfy consumers, not by antagonizing them.

There are two reasons that ISPs have to work to satisfy their customers. First, every company needs repeat business….

For Internet service providers, getting new business is expensive…. Satisfying customers so that they continue subscribing is cheaper, easier, and more profitable than continually replacing them. ISPs’ self-interest pushes them to add value to their customers just to keep them from jumping ship to their competitors.

In fact, this is what we’ve seen. ISPs have invested heavily in new infrastructure, and Internet speeds have increased by leaps and bounds…. These faster speeds have not been limited to big corporate customers: ISPs have routinely improved their services to regular consumers. They didn’t do so because the FCC forced them. For the past twenty years, “slow lanes” have been perfectly legal and almost as perfectly imaginary….

…ISPs shy away from creating slow lanes not because they have to but because they have a vested interest in offering fast service to all customers.

Contrary to the myth about ISPs being localized monopolies, 80 percent of Americans live in markets with access to multiple high-speed ISPs. While expensive regulations can discourage new players from entering the market, competition in most cities is increasingly robust….

ISPs still have to compete with each other for customers. If one ISP sticks them in the slow lane or blocks access to certain sites — or even just refuses to upgrade its service — consumers can simply switch to a competitor.

The second reason that ISPs seek to satisfy customers is that every business wants positive word of mouth. Consumers who receive excellent service talk up the service to their friends, generating new sign-ups. Consumers who receive mediocre service not only leave but badmouth the company to everyone they know.

In fact, this happened in one of the few cases where an ISP chose to discriminate against content. When Verizon blocked text messages from a pro-choice activist group in 2007, claiming the right to block “controversial or unsavory” messages, the backlash was fierce. Consumer Affairs notes that, “after a flurry of criticism, Verizon reversed its policy” on the pro-choice texts. The decision may have been ideological, but more likely Verizon reversed a policy that was driving away consumers, generating bad press, and hurting its bottom line.

In 2010, an FCC order made such “unreasonable discrimination” illegal (until the rule was struck down in 2014), but even without this rule, consumers proved more than capable of standing up to big corporations and handling such discrimination themselves.

In competitive markets, the consumer’s demand for quality prevents companies from cutting corners. Before the FCC imposed public utility regulations on the Internet, ISPs were improving service and abandoning discriminatory practices in order to satisfy their users. Net Neutrality advocates have spent years demanding a government solution to a problem that  markets had already solved. [“Net Nonsense,” The Freeman, March 18, 2015]

Amen, again.

Pinker Commits Scientism

Steven Pinker, who seems determined to outdo Bryan Caplan in wrongheadedness, devotes “Science Is Not Your Enemy” (The New Republic, August 6, 2013) to the defense of scientism. Actually, Pinker doesn’t overtly defend scientism, which is indefensible; he just redefines it to mean science:

The term “scientism” is anything but clear, more of a boo-word than a label for any coherent doctrine. Sometimes it is equated with lunatic positions, such as that “science is all that matters” or that “scientists should be entrusted to solve all problems.” Sometimes it is clarified with adjectives like “simplistic,” “naïve,” and “vulgar.” The definitional vacuum allows me to replicate gay activists’ flaunting of “queer” and appropriate the pejorative for a position I am prepared to defend.

Scientism, in this good sense, is not the belief that members of the occupational guild called “science” are particularly wise or noble. On the contrary, the defining practices of science, including open debate, peer review, and double-blind methods, are explicitly designed to circumvent the errors and sins to which scientists, being human, are vulnerable.

After that slippery performance, it’s all smooth sailing — or so Pinker thinks — because all he has to do is point out all the good things about science. And if scientism=science, then scientism is good, right?

Wrong. Scientism remains indefensible, and there’s a lot of scientism in what passes for science. You don’t need to take my word for it; Pinker’s own words tell the tale.

But, first, let’s get clear about the meaning and fallaciousness of scientism. The various writers cited by Pinker describe it well, but Hayek probably offers the most thorough indictment of it; for example:

[W]e shall, wherever we are concerned … with slavish imitation of the method and language of Science, speak of “scientism” or the “scientistic” prejudice…. It should be noted that, in the sense in which we shall use these terms, they describe, of course, an attitude which is decidedly unscientific in the true sense of the word, since it involves a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed. The scientistic as distinguished from the scientific view is not an unprejudiced but a very prejudiced approach which, before it has considered its subject, claims to know what is the most appropriate way of investigating it…..

The blind transfer of the striving for quantitative measurements to a field in which the specific conditions are not present which give it its basic importance in the natural sciences, is the result of an entirely unfounded prejudice. It is probably responsible for the worst aberrations and absurdities produced by scientism in the social sciences. It not only leads frequently to the selection for study of the most irrelevant aspects of the phenomena because they happen to be measurable, but also to “measurements” and assignments of numerical values which are absolutely meaningless. What a distinguished philosopher recently wrote about psychology is at least equally true of the social sciences, namely that it is only too easy “to rush off to measure something without considering what it is we are measuring, or what measurement means. In this respect some recent measurements are of the same logical type as Plato’s determination that a just ruler is 729 times as happy as an unjust one.”…

Closely connected with the “objectivism” of the scientistic approach is its methodological collectivism, its tendency to treat “wholes” like “society” or the “economy,” “capitalism” (as a given historical “phase”) or a particular “industry” or “class” or “country” as definitely given objects about which we can discover laws by observing their behavior as wholes. While the specific subjectivist approach of the social sciences starts … from our knowledge of the inside of these social complexes, the knowledge of the individual attitudes which form the elements of their structure, the objectivism of the natural sciences tries to view them from the outside; it treats social phenomena not as something of which the human mind is a part and the principles of whose organization we can reconstruct from the familiar parts, but as if they were objects directly perceived by us as wholes….

The belief that human history, which is the result of the interaction of innumerable human minds, must yet be subject to simple laws accessible to human minds is now so widely held that few people are at all aware what an astonishing claim it really implies. Instead of working patiently at the humble task of rebuilding from the directly known elements the complex and unique structures which we find in the world, and of tracing from the changes in the relations between the elements the changes in the wholes, the authors of these pseudo-theories of history pretend to be able to arrive by a kind of mental short cut at a direct insight into the laws of succession of the immediately apprehended wholes. However doubtful their status, these theories of development have achieved a hold on public imagination much greater than any of the results of genuine systematic study. “Philosophies” or “theories” of history (or “historical theories”) have indeed become the characteristic feature, the “darling vice” of the 19th century. From Hegel and Comte, and particularly Marx, down to Sombart and Spengler these spurious theories came to be regarded as representative results of social science; and through the belief that one kind of “system” must as a matter of historical necessity be superseded by a new and different “system,” they have even exercised a profound influence on social evolution. This they achieved mainly because they looked like the kind of laws which the natural sciences produced; and in an age when these sciences set the standard by which all intellectual effort was measured, the claim of these theories of history to be able to predict future developments was regarded as evidence of their pre-eminently scientific character. 
Though merely one among many characteristic 19th century products of this kind, Marxism more than any of the others has become the vehicle through which this result of scientism has gained so wide an influence that many of the opponents of Marxism equally with its adherents are thinking in its terms. (Friedrich A. Hayek, The Counter Revolution Of Science [Kindle Locations 120-1180], The Free Press.)

After a barrage like that (and this), what’s a defender of scientism to do? Pinker’s tactic is to stop using “scientism” and start using “science.” This makes it seem as if he really isn’t defending scientism, but rather trying to show how science can shed light onto subjects that are usually not in the province of science. In reality, Pinker preaches scientism by calling it science.

For example:

The new sciences of the mind are reexamining the connections between politics and human nature, which were avidly discussed in Madison’s time but submerged during a long interlude in which humans were assumed to be blank slates or rational actors. Humans, we are increasingly appreciating, are moralistic actors, guided by norms and taboos about authority, tribe, and purity, and driven by conflicting inclinations toward revenge and reconciliation.

There is nothing new in this, as Pinker admits by adverting to Madison. Nor was the understanding of human nature “submerged” except in the writings of scientistic social “scientists.” We ordinary mortals were never fooled. Moreover, Pinker’s idea of scientific political science seems to be data-dredging:

With the advent of data science—the analysis of large, open-access data sets of numbers or text—signals can be extracted from the noise and debates in history and political science resolved more objectively.

As explained here, data-dredging is about as scientistic as it gets:

When enough hypotheses are tested, it is virtually certain that some falsely appear statistically significant, since every data set with any degree of randomness contains some spurious correlations. Researchers using data mining techniques if they are not careful can be easily misled by these apparently significant results, even though they are mere artifacts of random variation.
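
The mechanism described in that passage is easy to demonstrate. The following minimal sketch (the variable counts, the random seed, and the |r| cutoff of 0.361 — the two-tailed .05 critical value for samples of 30 — are my own illustrative choices, not anything from the cited source) dredges through every pair of pure-noise variables and counts how many clear the conventional significance bar:

```python
import random
import statistics

random.seed(42)


def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


n_obs = 30        # observations per variable
n_vars = 40       # independent noise variables to dredge through
threshold = 0.361 # |r| above this is "significant" at p < .05 for n = 30

# Every variable is pure noise, so any "discovered" relationship is spurious.
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

spurious = sum(
    1
    for i in range(n_vars)
    for j in range(i + 1, n_vars)
    if abs(pearson_r(data[i], data[j])) > threshold
)

pairs = n_vars * (n_vars - 1) // 2
print(f"Tested {pairs} pairs of pure-noise variables; "
      f"{spurious} looked 'significant' at p < .05.")
```

Roughly five percent of the 780 pairs will look “significant” even though, by construction, no real relationship exists; that is exactly why dredging a large data set for patterns, absent pre-specified hypotheses, is scientistic rather than scientific.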

Turning to the humanities, Pinker writes:

[T]here can be no replacement for the varieties of close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities countless possibilities for innovation in understanding. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate [sic] and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, to say nothing of the kind of a progressive agenda that appeals to deans and donors. The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanists.

What on earth is Pinker talking about? This is over-the-top bafflegab worthy of Professor Irwin Corey. But because it comes from the keyboard of a noted (self-promoting) academic, we are meant to take it seriously.

Yes, art, culture, and society are products of human brains. So what? Poker is, too, and it’s a lot more amenable to explication by the mathematical tools of science. But the successful application of those tools depends on traits that are more art than science (bluffing, spotting “tells,” and concealing one’s own, for example).

More “explanatory depth” in the humanities means a deeper pile of B.S. Great art, literature, and music aren’t concocted formulaically. If they could be, modernism and postmodernism wouldn’t have yielded mountains of trash.

Oh, I know: It will be different next time. As if the tools of science are immune to misuse by obscurantists, relativists, and practitioners of political correctness. Tell it to those climatologists who dare to challenge the conventional wisdom about anthropogenic global warming. Tell it to the “sub-human” victims of the Third Reich’s medical experiments and gas chambers.

Pinker anticipates this kind of objection:

At a 2011 conference, [a] colleague summed up what she thought was the mixed legacy of science: the eradication of smallpox on the one hand; the Tuskegee syphilis study on the other. (In that study, another bloody shirt in the standard narrative about the evils of science, public-health researchers beginning in 1932 tracked the progression of untreated, latent syphilis in a sample of impoverished African Americans.) The comparison is obtuse. It assumes that the study was the unavoidable dark side of scientific progress as opposed to a universally deplored breach, and it compares a one-time failure to prevent harm to a few dozen people with the prevention of hundreds of millions of deaths per century, in perpetuity.

But the Tuskegee study was only a one-time failure in the sense that it was the only Tuskegee study. As a type of failure — the misuse of science (witting and unwitting) — it goes hand-in-hand with the advance of scientific knowledge. Should science be abandoned because of that? Of course not. But the hard fact is that science, qua science, is powerless against human nature, which defies scientific control.

Pinker plods on by describing ways in which science can contribute to the visual arts, music, and literary scholarship:

The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces and landscapes. Music scholars have much to discuss with the scientists who study the perception of speech and the brain’s analysis of the auditory world.

As for literary scholarship, where to begin? John Dryden wrote that a work of fiction is “a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind.” Linguistics can illuminate the resources of grammar and discourse that allow authors to manipulate a reader’s imaginary experience. Cognitive psychology can provide insight about readers’ ability to reconcile their own consciousness with those of the author and characters. Behavioral genetics can update folk theories of parental influence with discoveries about the effects of genes, peers, and chance, which have profound implications for the interpretation of biography and memoir—an endeavor that also has much to learn from the cognitive psychology of memory and the social psychology of self-presentation. Evolutionary psychologists can distinguish the obsessions that are universal from those that are exaggerated by a particular culture and can lay out the inherent conflicts and confluences of interest within families, couples, friendships, and rivalries that are the drivers of plot.

I wonder how Rembrandt and the Impressionists (among other pre-moderns) managed to create visual art of such evident excellence without relying on the kinds of scientific mechanisms invoked by Pinker. I wonder what music scholars would learn about excellence in composition that isn’t already evident in the general loathing of audiences for most “serious” modern and contemporary music.

As for literature, great writers know instinctively and through self-criticism how to tell stories that realistically depict character, social psychology, culture, conflict, and all the rest. Scholars (and critics), at best, can acknowledge what rings true and has dramatic or comedic merit. Scientistic pretensions in scholarship (and criticism) may result in promotions and raises for the pretentious, but they do not add to the sum of human enjoyment — which is the real aim of literature.

Pinker inveighs against critics of scientism (science, in Pinker’s vocabulary) who cry “reductionism” and “simplification.” With respect to the former, Pinker writes:

Demonizers of scientism often confuse intelligibility with a sin called reductionism. But to explain a complex happening in terms of deeper principles is not to discard its richness. No sane thinker would try to explain World War I in the language of physics, chemistry, and biology as opposed to the more perspicuous language of the perceptions and goals of leaders in 1914 Europe. At the same time, a curious person can legitimately ask why human minds are apt to have such perceptions and goals, including the tribalism, overconfidence, and sense of honor that fell into a deadly combination at that historical moment.

It is reductionist to explain a complex happening in terms of a deeper principle when that principle fails to account for the complex happening. Pinker obscures that essential point by offering a silly and irrelevant example about World War I. This bit of misdirection is unsurprising, given Pinker’s foray into reductionism, The Better Angels of Our Nature: Why Violence Has Declined, which I examine here.

As for simplification, Pinker says:

The complaint about simplification is misbegotten. To explain something is to subsume it under more general principles, which always entails a degree of simplification. Yet to simplify is not to be simplistic.

Pinker again dodges the issue. Simplification is simplistic when the “general principles” fail to account adequately for the phenomenon in question.

If Pinker is right about anything, it is when he says that “the intrusion of science into the territories of the humanities has been deeply resented.” The resentment, though some of it may be wrongly motivated, is fully justified.

Related reading (added 08/10/13 and 09/06/13):
Bill Vallicella, “Steven Pinker on Scientism, Part One,” Maverick Philosopher, August 10, 2013
Leon Wieseltier, “Crimes Against Humanities,” The New Republic, September 3, 2013 (gated)

Related posts about Pinker:
Nonsense about Presidents, IQ, and War
The Fallacy of Human Progress

Related posts about modernism:
Speaking of Modern Art
Making Sense about Classical Music
An Addendum about Classical Music
My Views on Classical Music, Vindicated
But It’s Not Music
A Quick Note about Music
Modernism in the Arts and Politics
Taste and Art
Modernism and the Arts

Related posts about science:
Science’s Anti-Scientific Bent
Modeling Is Not Science
Physics Envy
We, the Children of the Enlightenment
Demystifying Science
Analysis for Government Decision-Making: Hemi-Science, Hemi-Demi-Science, and Sophistry
Scientism, Evolution, and the Meaning of Life
The Candle Problem: Balderdash Masquerading as Science
Mysteries: Sacred and Profane
The Glory of the Human Mind

The Fallacy of Human Progress

Steven Pinker’s The Better Angels of Our Nature: Why Violence Has Declined is cited gleefully by leftists and cockeyed optimists as evidence that human beings, on the whole, are becoming kinder and gentler because of:

  • The Leviathan – the rise of the modern nation-state and judiciary “with a monopoly on the legitimate use of force,” which “can defuse the [individual] temptation of exploitative attack, inhibit the impulse for revenge, and circumvent…self-serving biases”;
  • Commerce – the rise of “technological progress [allowing] the exchange of goods and services over longer distances and larger groups of trading partners,” so that “other people become more valuable alive than dead” and “are less likely to become targets of demonization and dehumanization”;
  • Feminization – increasing respect for “the interests and values of women”;
  • Cosmopolitanism – the rise of forces such as literacy, mobility, and mass media, which “can prompt people to take the perspectives of people unlike themselves and to expand their circle of sympathy to embrace them”;
  • The Escalator of Reason – an “intensifying application of knowledge and rationality to human affairs,” which “can force people to recognize the futility of cycles of violence, to ramp down the privileging of their own interests over others’, and to reframe violence as a problem to be solved rather than a contest to be won.”

I can tell you that Pinker’s book is hogwash because two very bright leftists — Peter Singer and Will Wilkinson — have strongly and wrongly endorsed some of its key findings. Singer writes:

Pinker argues that enhanced powers of reasoning give us the ability to detach ourselves from our immediate experience and from our personal or parochial perspective, and frame our ideas in more abstract, universal terms. This in turn leads to better moral commitments, including avoiding violence. It is just this kind of reasoning ability that has improved during the 20th century. He therefore suggests that the 20th century has seen a “moral Flynn effect, in which an accelerating escalator of reason carried us away from impulses that lead to violence” and that this lies behind the long peace, the new peace, and the rights revolution. Among the wide range of evidence he produces in support of that argument is the tidbit that since 1946, there has been a negative correlation between an American president’s I.Q. and the number of battle deaths in wars involving the United States.

I disposed of this staggeringly specious correlation here:

There is the convenient cutoff point of 1946. Why 1946? Well, it enables Pinker-Singer to avoid the inconvenient fact that the Civil War, World War I, and World War II happened while the presidency was held by three men who [purportedly] had high IQs: Lincoln, Wilson, and FDR….

If you buy the brand of snake oil being peddled by Pinker-Singer, you must believe that the “dumbest” and “smartest” presidents are unlikely to get the U.S. into wars that result in a lot of battle deaths, whereas some (but, mysteriously, not all) of the “medium-smart” presidents (Lincoln, Wilson, FDR) are likely to do so….

Let us advance from one to two explanatory variables. The second explanatory variable that strongly suggests itself is political party. And because it is not good practice to omit relevant statistics (a favorite gambit of liars), I estimated an equation based on “IQ” and battle deaths for the 27 men who served as president from the first Republican presidency (Lincoln’s) through the presidency of GWB….

In other words, battle deaths rise at the rate of 841 per IQ point (so much for Pinker-Singer). But there will be fewer deaths with a Republican in the White House (so much for Pinker-Singer’s implied swipe at GWB)….

All of this is nonsense, of course, for two reasons: [the] estimates of IQ are hogwash, and the number of U.S. battle deaths is a meaningless number, taken by itself.

… [The] estimates of presidents’ IQs put every one of them — including the “dumbest,” U.S. Grant — in the top 2.3 percent of the population. And the mean of Simonton’s estimates puts the average president in the top 0.1 percent (one-tenth of one percent) of the population. That is literally incredible.
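The mechanics of the two-variable estimate described in the quoted passage can be sketched as follows. The numbers below are invented purely for illustration — they are not the actual presidential “IQ” estimates or battle-death figures — and the point is only to show how a party dummy variable enters alongside “IQ” in an ordinary least-squares fit.

```python
import numpy as np

# Invented, purely illustrative data -- NOT the actual presidential
# "IQ" estimates or battle-death figures discussed in the text.
rng = np.random.default_rng(1)
n = 27                                  # presidents, Lincoln through GWB
iq = rng.normal(140, 8, n)              # hypothetical "IQ" estimates
republican = rng.integers(0, 2, n)      # dummy: 1 = Republican, 0 = Democrat
deaths = 500 * iq - 40000 * republican + rng.normal(0, 30000, n)

# Ordinary least squares with an intercept, an "IQ" term, and a party dummy.
X = np.column_stack([np.ones(n), iq, republican])
coefs, *_ = np.linalg.lstsq(X, deaths, rcond=None)
intercept, per_iq_point, party_effect = coefs
print(per_iq_point, party_effect)
```

With real inputs, `per_iq_point` would be the “battle deaths per IQ point” figure and `party_effect` the shift associated with a Republican in the White House — which is precisely why omitting a relevant variable like party can flip the story the one-variable fit appears to tell.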

As for Wilkinson, he praises statistics adduced by Pinker that show a decline in the use of capital punishment:

In the face of such a decisive trend in moral culture, we can say a couple different things. We can say that this is just change and says nothing in particular about what is really right or wrong, good or bad. Or we can say this is evidence of moral progress, that we have actually become better. I prefer the latter interpretation for basically the same reasons most of us see the abolition of slavery and the trend toward greater equality between races and sexes as progress and not mere morally indifferent change. We can talk about the nature of moral progress later. It’s tricky. For now, I want you to entertain the possibility that convergence toward the idea that execution is wrong counts as evidence that it is wrong.

My observation:

I would count convergence toward the idea that execution is wrong as evidence that it is wrong, if … that idea were (a) increasingly held by individuals who (b) had arrived at their “enlightenment” uninfluenced by operatives of the state (legislatures and judges), who take it upon themselves to flout popular support of the death penalty. What we have, in the case of the death penalty, is moral regress, not moral progress.

Moral regress because the abandonment of the death penalty puts innocent lives at risk. Capital punishment sends a message, and the message is effective when it is delivered: it deters homicide. And even if it didn’t, it would at least remove killers from our midst, permanently. By what standard of morality can one claim that it is better to spare killers than to protect innocents? For that matter, by what standard of morality is it better to kill innocents (in the womb) than to spare killers? Proponents of abortion (like Singer and Wilkinson) — who by and large oppose capital punishment — are completely lacking in moral authority.

Returning to Pinker’s thesis that violence has declined, I quote a review at Foseti:

Pinker’s basic problem is that he essentially defines “violence” in such a way that his thesis that violence is declining becomes self-fulfilling. “Violence” to Pinker is fundamentally synonymous with behaviors of older civilizations. On the other hand, modern practices are defined to be less violent than older practices.

A while back, I linked to a story about a guy in my neighborhood who’s been arrested over 60 times for breaking into cars. A couple hundred years ago, this guy would have been killed for this sort of vandalism after he got caught the first time. Now, we feed him and shelter him for a while and then we let him back out to do this again. Pinker defines the new practice as a decline in violence – we don’t kill the guy anymore! Someone from a couple hundred years ago would be appalled that we let the guy continue destroying other peoples’ property without consequence. In the mind of those long dead, “violence” has in fact increased. Instead of a decline in violence, this practice seems to me like a decline in justice – nothing more or less.

Here’s another example: Pinker uses creative definitions to show that the conflicts of the 20th Century pale in comparison to previous conflicts. For example, all the Mongol Conquests are considered one event, even though they cover 125 years. If you lump all these various conquests together and you split up WWI, WWII, Mao’s takeover in China, the Bolshevik takeover of Russia, the Russian Civil War, and the Chinese Civil War (yes, he actually considers this a separate event from Mao), you unsurprisingly discover that the events of the 20th Century weren’t all that violent compared to events in the past! Pinker’s third most violent event is the “Mideast Slave Trade” which he says took place between the 7th and 19th Centuries. Seriously. By this standard, all the conflicts of the 20th Century are related. Is the Russian Revolution or the rise of Mao possible without WWII? Is WWII possible without WWI? By this consistent standard, the 20th Century wars of Communism would have been the worst conflict by far. Of course, if you fiddle with the numbers, you can make any point you like.

There’s much more to the review, including some telling criticisms of Pinker’s five reasons for the (purported) decline in violence. That the reviewer somehow still wants to believe in the rightness of Pinker’s thesis says more about the reviewer’s optimism than it does about the validity of Pinker’s thesis.

That thesis is fundamentally flawed, as Robert Epstein points out in a review at Scientific American:

[T]he wealth of data [Pinker] presents cannot be ignored—unless, that is, you take the same liberties as he sometimes does in his book. In two lengthy chapters, Pinker describes psychological processes that make us either violent or peaceful, respectively. Our dark side is driven by an evolution-based propensity toward predation and dominance. On the angelic side, we have, or at least can learn, some degree of self-control, which allows us to inhibit dark tendencies.

There is, however, another psychological process—confirmation bias—that Pinker sometimes succumbs to in his book. People pay more attention to facts that match their beliefs than those that undermine them. Pinker wants peace, and he also believes in his hypothesis; it is no surprise that he focuses more on facts that support his views than on those that do not. The SIPRI arms data are problematic, and a reader can also cherry-pick facts from Pinker’s own book that are inconsistent with his position. He notes, for example, that during the 20th century homicide rates failed to decline in both the U.S. and England. He also describes in graphic and disturbing detail the savage way in which chimpanzees—our closest genetic relatives in the animal world—torture and kill their own kind.

Of greater concern is the assumption on which Pinker’s entire case rests: that we look at relative numbers instead of absolute numbers in assessing human violence. But why should we be content with only a relative decrease? By this logic, when we reach a world population of nine billion in 2050, Pinker will conceivably be satisfied if a mere two million people are killed in war that year.

The biggest problem with the book, though, is its overreliance on history, which, like the light on a caboose, shows us only where we are not going. We live in a time when all the rules are being rewritten blindingly fast—when, for example, an increasingly smaller number of people can do increasingly greater damage. Yes, when you move from the Stone Age to modern times, some violence is left behind, but what happens when you put weapons of mass destruction into the hands of modern people who in many ways are still living primitively? What happens when the unprecedented occurs—when a country such as Iran, where women are still waiting for even the slightest glimpse of those better angels, obtains nuclear weapons? Pinker doesn’t say.

Pinker’s belief that violence is on the decline reminds me of “it’s different this time,” a phrase that was on the lips of hopeful stock-pushers, stock-buyers, and pundits during the stock-market bubble of the late 1990s. That bubble ended, of course, in the spectacular crash of 2000.

Predictions about the future of humankind are better left in the hands of writers who see human nature whole, and who are not out to prove that it can be shaped or contained by the kinds of “liberal” institutions that Pinker so obviously favors.

Consider this, from an article by Robert J. Samuelson at The Washington Post:

[T]he Internet’s benefits are relatively modest compared with previous transformative technologies, and it brings with it a terrifying danger: cyberwar. Amid the controversy over leaks from the National Security Agency, this looms as an even bigger downside.

By cyberwarfare, I mean the capacity of groups — whether nations or not — to attack, disrupt and possibly destroy the institutions and networks that underpin everyday life. These would be power grids, pipelines, communication and financial systems, business record-keeping and supply-chain operations, railroads and airlines, databases of all types (from hospitals to government agencies). The list runs on. So much depends on the Internet that its vulnerability to sabotage invites doomsday visions of the breakdown of order and trust.

In a report, the Defense Science Board, an advisory group to the Pentagon, acknowledged “staggering losses” of information involving weapons design and combat methods to hackers (not identified, but probably Chinese). In the future, hackers might disarm military units. “U.S. guns, missiles and bombs may not fire, or may be directed against our own troops,” the report said. It also painted a specter of social chaos from a full-scale cyberassault. There would be “no electricity, money, communications, TV, radio or fuel (electrically pumped). In a short time, food and medicine distribution systems would be ineffective.”

But Pinker wouldn’t count the resulting chaos as violence, as long as human beings were merely starving and dying of various diseases. That violence would ensue, of course, is another story, which is told by John Gray in The Silence of Animals: On Progress and Other Modern Myths. Gray’s book — published 18 months after Better Angels — could be read as a refutation of Pinker’s book, though Gray doesn’t mention Pinker or his book.

The gist of Gray’s argument is faithfully recounted in a review of Gray’s book by Robert W. Merry at The National Interest:

The noted British historian J. B. Bury (1861–1927) … wrote, “This doctrine of the possibility of indefinitely moulding the characters of men by laws and institutions . . . laid a foundation on which the theory of the perfectibility of humanity could be raised. It marked, therefore, an important stage in the development of the doctrine of Progress.”

We must pause here over this doctrine of progress. It may be the most powerful idea ever conceived in Western thought—emphasizing Western thought because the idea has had little resonance in other cultures or civilizations. It is the thesis that mankind has advanced slowly but inexorably over the centuries from a state of cultural backwardness, blindness and folly to ever more elevated stages of enlightenment and civilization—and that this human progression will continue indefinitely into the future…. The U.S. historian Charles A. Beard once wrote that the emergence of the progress idea constituted “a discovery as important as the human mind has ever made, with implications for mankind that almost transcend imagination.” And Bury, who wrote a book on the subject, called it “the great transforming conception, which enables history to define her scope.”

Gray rejects it utterly. In doing so, he rejects all of modern liberal humanism. “The evidence of science and history,” he writes, “is that humans are only ever partly and intermittently rational, but for modern humanists the solution is simple: human beings must in future be more reasonable. These enthusiasts for reason have not noticed that the idea that humans may one day be more rational requires a greater leap of faith than anything in religion.” In an earlier work, Straw Dogs: Thoughts on Humans and Other Animals, he was more blunt: “Outside of science, progress is simply a myth.”

…Gray has produced more than twenty books demonstrating an expansive intellectual range, a penchant for controversy, acuity of analysis and a certain political clairvoyance.

He rejected, for example, Francis Fukuyama’s heralded “End of History” thesis—that Western liberal democracy represents the final form of human governance—when it appeared in this magazine in 1989. History, it turned out, lingered long enough to prove Gray right and Fukuyama wrong….

Though for decades his reputation was confined largely to intellectual circles, Gray’s public profile rose significantly with the 2002 publication of Straw Dogs, which sold impressively and brought him much wider acclaim than he had known before. The book was a concerted and extensive assault on the idea of progress and its philosophical offspring, secular humanism. The Silence of Animals is in many ways a sequel, plowing much the same philosophical ground but expanding the cultivation into contiguous territory mostly related to how mankind—and individual humans—might successfully grapple with the loss of both metaphysical religion of yesteryear and today’s secular humanism. The fundamentals of Gray’s critique of progress are firmly established in both books and can be enumerated in summary.

First, the idea of progress is merely a secular religion, and not a particularly meaningful one at that. “Today,” writes Gray in Straw Dogs, “liberal humanism has the pervasive power that was once possessed by revealed religion. Humanists like to think they have a rational view of the world; but their core belief in progress is a superstition, further from the truth about the human animal than any of the world’s religions.”

Second, the underlying problem with this humanist impulse is that it is based upon an entirely false view of human nature—which, contrary to the humanist insistence that it is malleable, is immutable and impervious to environmental forces. Indeed, it is the only constant in politics and history. Of course, progress in scientific inquiry and in resulting human comfort is a fact of life, worth recognition and applause. But it does not change the nature of man, any more than it changes the nature of dogs or birds. “Technical progress,” writes Gray, again in Straw Dogs, “leaves only one problem unsolved: the frailty of human nature. Unfortunately that problem is insoluble.”

That’s because, third, the underlying nature of humans is bred into the species, just as the traits of all other animals are. The most basic trait is the instinct for survival, which is placed on hold when humans are able to live under a veneer of civilization. But it is never far from the surface. In The Silence of Animals, Gray discusses the writings of Curzio Malaparte, a man of letters and action who found himself in Naples in 1944, shortly after the liberation. There he witnessed a struggle for life that was gruesome and searing. “It is a humiliating, horrible thing, a shameful necessity, a fight for life,” wrote Malaparte. “Only for life. Only to save one’s skin.” Gray elaborates:

Observing the struggle for life in the city, Malaparte watched as civilization gave way. The people the inhabitants had imagined themselves to be—shaped, however imperfectly, by ideas of right and wrong—disappeared. What were left were hungry animals, ready to do anything to go on living; but not animals of the kind that innocently kill and die in forests and jungles. Lacking a self-image of the sort humans cherish, other animals are content to be what they are. For human beings the struggle for survival is a struggle against themselves.

When civilization is stripped away, the raw animal emerges. “Darwin showed that humans are like other animals,” writes Gray in Straw Dogs, expressing in this instance only a partial truth. Humans are different in a crucial respect, captured by Gray himself when he notes that Homo sapiens inevitably struggle with themselves when forced to fight for survival. No other species does that, just as no other species has such a range of spirit, from nobility to degradation, or such a need to ponder the moral implications as it fluctuates from one to the other. But, whatever human nature is—with all of its capacity for folly, capriciousness and evil as well as virtue, magnanimity and high-mindedness—it is embedded in the species through evolution and not subject to manipulation by man-made institutions.

Fourth, the power of the progress idea stems in part from the fact that it derives from a fundamental Christian doctrine—the idea of providence, of redemption….

“By creating the expectation of a radical alteration in human affairs,” writes Gray, “Christianity . . . founded the modern world.” But the modern world retained a powerful philosophical outlook from the classical world—the Socratic faith in reason, the idea that truth will make us free; or, as Gray puts it, the “myth that human beings can use their minds to lift themselves out of the natural world.” Thus did a fundamental change emerge in what was hoped of the future. And, as the power of Christian faith ebbed, along with its idea of providence, the idea of progress, tied to the Socratic myth, emerged to fill the gap. “Many transmutations were needed before the Christian story could renew itself as the myth of progress,” Gray explains. “But from being a succession of cycles like the seasons, history came to be seen as a story of redemption and salvation, and in modern times salvation became identified with the increase of knowledge and power.”

Thus, it isn’t surprising that today’s Western man should cling so tenaciously to his faith in progress as a secular version of redemption. As Gray writes, “Among contemporary atheists, disbelief in progress is a type of blasphemy. Pointing to the flaws of the human animal has become an act of sacrilege.” In one of his more brutal passages, he adds:

Humanists believe that humanity improves along with the growth of knowledge, but the belief that the increase of knowledge goes with advances in civilization is an act of faith. They see the realization of human potential as the goal of history, when rational inquiry shows history to have no goal. They exalt nature, while insisting that humankind—an accident of nature—can overcome the natural limits that shape the lives of other animals. Plainly absurd, this nonsense gives meaning to the lives of people who believe they have left all myths behind.

In The Silence of Animals, Gray explores all this through the works of various writers and thinkers. In the process, he employs history and literature to puncture the conceits of those who cling to the progress idea and the humanist view of human nature. Those conceits, it turns out, are easily punctured when subjected to Gray’s withering scrutiny….

And yet the myth of progress is so powerful in part because it gives meaning to modern Westerners struggling, in an irreligious era, to place themselves in a philosophical framework larger than just themselves….

Much of the human folly catalogued by Gray in The Silence of Animals makes a mockery of the earnest idealism of those who later shaped and molded and proselytized humanist thinking into today’s predominant Western civic philosophy.

There was an era of realism, but it was short-lived:

But other Western philosophers, particularly in the realm of Anglo-Saxon thought, viewed the idea of progress in much more limited terms. They rejected the idea that institutions could reshape mankind and usher in a golden era of peace and happiness. As Bury writes, “The general tendency of British thought was to see salvation in the stability of existing institutions, and to regard change with suspicion.” With John Locke, these thinkers restricted the proper role of government to the need to preserve order, protect life and property, and maintain conditions in which men might pursue their own legitimate aims. No zeal here to refashion human nature or remake society.

A leading light in this category of thinking was Edmund Burke (1729–1797), the British statesman and philosopher who, writing in his famous Reflections on the Revolution in France, characterized the bloody events of the Terror as “the sad but instructive monuments of rash and ignorant counsel in time of profound peace.” He saw them, in other words, as reflecting an abstractionist outlook that lacked any true understanding of human nature. The same skepticism toward the French model was shared by many of the Founding Fathers, who believed with Burke that human nature isn’t malleable but rather potentially harmful to society. Hence, it needed to be checked. The central distinction between the American and French revolutions, in the view of conservative writer Russell Kirk, was that the Americans generally held a “biblical view of man and his bent toward sin,” whereas the French opted for “an optimistic doctrine of human goodness.” Thus, the American governing model emerged as a secular covenant “designed to restrain the human tendencies toward violence and fraud . . . [and] place checks upon will and appetite.”

Most of the American Founders rejected the French philosophes in favor of the thought and history of the Roman Republic, where there was no idea of progress akin to the current Western version. “Two thousand years later,” writes Kirk, “the reputation of the Roman constitution remained so high that the framers of the American constitution would emulate the Roman model as best they could.” They divided government powers among men and institutions and created various checks and balances. Even the American presidency was modeled generally on the Roman consular imperium, and the American Senate bears similarities to the Roman version. Thus did the American Founders deviate from the French abstractionists and craft governmental structures to fit humankind as it actually is—capable of great and noble acts, but also of slipping into vice and treachery when unchecked. That ultimately was the genius of the American system.

But, as the American success story unfolded, a new collection of Western intellectuals, theorists and utopians—including many Americans—continued to toy with the idea of progress. And an interesting development occurred. After centuries of intellectual effort aimed at developing the idea of progress as an ongoing chain of improvement with no perceived end into the future, this new breed of “Progress as Power” thinkers began to declare their own visions as the final end point of this long progression.

Gray calls these intellectuals “ichthyophils,” which he defines as “devoted to their species as they think it ought to be, not as it actually is or as it truly wants to be.” He elaborates: “Ichthyophils come in many varieties—the Jacobin, Bolshevik and Maoist, terrorizing humankind in order to remake it on a new model; the neo-conservative, waging perpetual war as a means to universal democracy; liberal crusaders for human rights, who are convinced that all the world longs to become as they imagine themselves to be.” He includes also “the Romantics, who believe human individuality is everywhere repressed.”

Throughout American politics, as indeed throughout Western politics, a large proportion of major controversies ultimately are battles between the ichthyophils and the Burkeans, between the sensibility of the French Revolution and the sensibility of the American Revolution, between adherents of the idea of progress and those skeptical of that potent concept. John Gray has provided a major service in probing with such clarity and acuity the impulses, thinking and aims of those on the ichthyophil side of that great divide. As he sums up, "Allowing the majority of humankind to imagine they are flying fish even as they pass their lives under the waves, liberal civilization rests on a dream."

And so it goes. On the left there are the ichthyophils of America, represented in huge numbers by "progressives" and their constituents and dupes (i.e., a majority of the public). They are given aid and comfort by a small but vociferous number of pseudo-libertarians (as discussed here, for example). On the right stands a throng of pseudo-conservatives — mainly identified with the Republican Party — who are prone to adopt the language and ideals of progressivism, out of power-lust and ignorance. Almost entirely muted by the sound and fury emanating from left and right — and relatively few in number — are the true libertarians: Burkean conservatives.

And so Leviathan grows, crushing the liberty envisioned by our Burkean Founders in the name of “progress” (i.e., social and economic engineering). And as Robert Samuelson points out, the growth of Leviathan doesn’t ensure our immunity to chaos and barbarity in the event of a debilitating attack on our fragile infrastructure. It is ironic that we would be better able to withstand such an attack without descending into chaos and barbarity had not Leviathan weakened and sundered many true social bonds, in the name of “progress.”

Our thralldom to an essentially impotent Leviathan is of no importance to Pinker, to “progressives,” or the dupes and constituents of “progressivism.” They have struck their Faustian bargain with Leviathan, and they will pay the price, sooner or later. Unfortunately, all of us will pay the price — even those of us who despise and resist Leviathan.

*     *     *

Related reading: Wesley Morganston, “The Long, Slow Collapse: What Whig History Can’t Explain,” Theden, October 26, 2014

Related posts:
Democracy vs. Liberty
Something Controversial
More about Democracy and Liberty
Yet Another Look at Democracy
Law, Liberty, and Abortion
Abortion and the Slippery Slope
Privacy: Variations on the Theme of Liberty
An Immigration Roundup
Illogic from the Pro-Immigration Camp
The Ruinous Despotism of Democracy
On Liberty
Illegal Immigration: A Note to Libertarian Purists
Inside-Outside
A Moralist’s Moral Blindness
Pseudo-Libertarian Sophistry vs. True Libertarianism
The Folly of Pacifism
Positivism, “Natural Rights,” and Libertarianism
What Are “Natural Rights”?
The Golden Rule and the State
Libertarian Conservative or Conservative Libertarian?
Bounded Liberty: A Thought Experiment
Evolution, Human Nature, and “Natural Rights”
More Pseudo-Libertarianism
More about Conservative Governance
The Meaning of Liberty
Positive Liberty vs. Liberty
On Self-Ownership and Desert
In Defense of Marriage
Understanding Hayek
Rethinking the Constitution: Freedom of Speech and of the Press
The Golden Rule as Beneficial Learning
Why I Am Not an Extreme Libertarian
Facets of Liberty
Burkean Libertarianism
Rights: Source, Applicability, How Held
The Folly of Pacifism, Again
What Is Libertarianism?
True Libertarianism, One More Time
Human Nature, Liberty, and Rationalism
Utilitarianism and Psychopathy
Privacy Is Not Sacred
A Declaration and Defense of My Prejudices about Governance
The Libertarian-Conservative Fusion Is Alive and Well
Libertarianism and Morality
Libertarianism and Morality: A Footnote
Merit Goods, Positive Rights, and Cosmic Justice
More about Merit Goods
What Is Bleeding-Heart Libertarianism?
Society and the State
Prohibition, Abortion, and “Progressivism”
Liberty, Negative Rights, and Bleeding Hearts
Cato, the Kochs, and a Fluke
Conservatives vs. “Liberals”
Not-So-Random Thoughts (II)
Why Conservatism Works
The Pool of Liberty and “Me” Libertarianism
Bleeding-Heart Libertarians = Left-Statists
Enough with the Bleeding Hearts, Already
Not Guilty of Libertarian Purism
Liberty and Society
The Eclipse of “Old America”
Genetic Kinship and Society
Liberty as a Social Construct: Moral Relativism?
A Contrarian View of Universal Suffrage
Well-Founded Pessimism
Defending Liberty against (Pseudo) Libertarians

Nonsense about Presidents, IQ, and War

Peter Singer outdoes his usual tendentious self in this review of Steven Pinker’s The Better Angels of Our Nature: Why Violence Has Declined. In the course of the review, Singer writes:

Pinker argues that enhanced powers of reasoning give us the ability to detach ourselves from our immediate experience and from our personal or parochial perspective, and frame our ideas in more abstract, universal terms. This in turn leads to better moral commitments, including avoiding violence. It is just this kind of reasoning ability that has improved during the 20th century. He therefore suggests that the 20th century has seen a “moral Flynn effect, in which an accelerating escalator of reason carried us away from impulses that lead to violence” and that this lies behind the long peace, the new peace, and the rights revolution. Among the wide range of evidence he produces in support of that argument is the tidbit that since 1946, there has been a negative correlation between an American president’s I.Q. and the number of battle deaths in wars involving the United States.

Singer does not give the source of the IQ estimates on which Pinker relies, but the supposed correlation points to a discredited piece of historiometry by Dean Keith Simonton, “Presidential IQ, Openness, Intellectual Brilliance, and Leadership: Estimates and Correlations for 42 U.S. Chief Executives” (Political Psychology, Vol. 27, No. 4, 2006). Simonton jumps through various hoops to assess the IQs of every president from Washington to Bush II — to one decimal place. That is a feat on a par with reconstructing the final thoughts of Abel, ere Cain slew him.

Before I explain the discrediting of Simonton’s obviously discreditable “research,” there is some fun to be had with the Pinker-Singer story of presidential IQ (Simonton-style) for battle deaths. First, of course, there is the convenient cutoff point of 1946. Why 1946? Well, it enables Pinker-Singer to avoid the inconvenient fact that the Civil War, World War I, and World War II happened while the presidency was held by three men who (in Simonton’s estimation) had high IQs: Lincoln, Wilson, and FDR.

The next several graphs depict best-fit relationships between Simonton’s estimates of presidential IQ and the U.S. battle deaths that occurred during each president’s term of office.* The presidents, in order of their appearance in the titles of the graphs are Harry S Truman (HST), George W. Bush (GWB), Franklin Delano Roosevelt (FDR), (Thomas) Woodrow Wilson (WW), Abraham Lincoln (AL), and George Washington (GW). The number of battle deaths is rounded to the nearest thousand, so that the prevailing value is 0, even in the case of the Spanish-American War (385 U.S. combat deaths) and George H.W. Bush’s Gulf War (147 U.S. combat deaths).

This is probably the relationship referred to by Singer, though Pinker may show a linear fit, rather than the tighter polynomial fit used here:

It looks bad for the low “IQ” presidents — if you believe Simonton’s estimates of IQ, which you shouldn’t, and if you believe that battle deaths are a bad thing per se, which they aren’t. I will come back to those points. For now, just suspend your well-justified disbelief.

If the relationship for the HST-GWB era were statistically meaningful, it would not change much with the introduction of additional statistics about “IQ” and battle deaths, but it does:




If you buy the brand of snake oil being peddled by Pinker-Singer, you must believe that the “dumbest” and “smartest” presidents are unlikely to get the U.S. into wars that result in a lot of battle deaths, whereas some (but, mysteriously, not all) of the “medium-smart” presidents (Lincoln, Wilson, FDR) are likely to do so.

In any event, if you believe in Pinker-Singer’s snake oil, you must accept the consistent “humpback” relationship that is depicted in the preceding four graphs, rather than the highly selective, one-shot negative relationship of the HST-GWB graph.

More seriously, the relationship in the HST-GWB graph is an evident ploy to discredit certain presidents (especially GWB, I suspect), which is why it covers only the period since WWII. Why not just say that you think GWB is a chimp-like, war-mongering moron and be done with it? Pseudo-statistics of the kind offered up by Pinker-Singer are nothing more than a talking point for those already convinced that Bush=Hitler.

But as long as this silly game is in progress, let us continue it, with a new rule. Let us advance from one to two explanatory variables. The second explanatory variable that strongly suggests itself is political party. And because it is not good practice to omit relevant statistics (a favorite gambit of liars), I estimated an equation based on “IQ” and battle deaths for the 27 men who served as president from the first Republican presidency (Lincoln’s) through the presidency of GWB. The equation looks like this:

U.S. battle deaths (000) “owned” by a president = -80.6 + 0.841 x “IQ” – 31.3 x party (where 0 = Dem, 1 = GOP)

In other words, battle deaths rise at the rate of 841 per IQ point (so much for Pinker-Singer). But there will be fewer deaths with a Republican in the White House (so much for Pinker-Singer’s implied swipe at GWB).
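The equation above is easy to exercise by hand. A minimal sketch, plugging a hypothetical "IQ" of 130 (an illustrative input, not one of Simonton's estimates) into the quoted fit:

```python
def predicted_battle_deaths(iq, gop):
    """Predicted U.S. battle deaths (in thousands) per the equation
    quoted in the text: -80.6 + 0.841 x "IQ" - 31.3 x party.

    iq  -- a Simonton-style "IQ" estimate
    gop -- 0 for a Democrat, 1 for a Republican
    """
    return -80.6 + 0.841 * iq - 31.3 * gop

# Hypothetical president with an "IQ" of 130:
dem = predicted_battle_deaths(130, gop=0)  # about 28.7 thousand
rep = predicted_battle_deaths(130, gop=1)  # about -2.6 thousand

# The party term shifts every prediction by a constant 31.3 thousand
# deaths, and the negative prediction in the GOP case illustrates how
# little sense such a linear fit makes when taken literally.
print(round(dem, 2), round(rep, 2))
```

Note that the negative prediction for the Republican case is a reminder that the fit, like the Pinker-Singer correlation it answers, is a statistical parlor trick rather than a model of anything.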

All of this is nonsense, of course, for two reasons: Simonton’s estimates of IQ are hogwash, and the number of U.S. battle deaths is a meaningless number, taken by itself.

With regard to hogwash, Simonton’s estimates of presidents’ IQs put every one of them — including the “dumbest,” U.S. Grant — in the top 2.3 percent of the population. And the mean of Simonton’s estimates puts the average president in the top 0.1 percent (one-tenth of one percent) of the population. That is literally incredible. Good evidence of the unreliability of Simonton’s estimates is found in an entry by Thomas C. Reeves at George Mason University’s History News Network. Reeves is the author of A Question of Character: A Life of John F. Kennedy, the negative reviews of which are evidently the work of JFK idolators who refuse to be disillusioned by facts. Anyway, here is Reeves:

I’m a biographer of two of the top nine presidents on Simonton’s list and am highly familiar with the histories of the other seven. In my judgment, this study has little if any value. Let’s take JFK and Chester A. Arthur as examples.

Kennedy was actually given an IQ test before entering Choate. His score was 119…. There is no evidence to support the claim that his score should have been more than 40 points higher [i.e., the IQ of 160 attributed to Kennedy by Simonton]. As I described in detail in A Question Of Character [link added], Kennedy’s academic achievements were modest and respectable, his published writing and speeches were largely done by others (no study of Kennedy is worthwhile that downplays the role of Ted Sorensen)….

Chester Alan Arthur was largely unknown before my Gentleman Boss was published in 1975. The discovery of many valuable primary sources gave us a clear look at the president for the first time. Among the most interesting facts that emerged involved his service during the Civil War, his direct involvement in the spoils system, and the bizarre way in which he was elevated to the GOP presidential ticket in 1880. His concealed and fatal illness while in the White House also came to light.

While Arthur was a college graduate, and was widely considered to be a gentleman, there is no evidence whatsoever to suggest that his IQ was extraordinary. That a psychologist can rank his intelligence 2.3 points ahead of Lincoln’s suggests access to a treasure of primary sources from and about Arthur that does not exist.

This historian thinks it impossible to assign IQ numbers to historical figures. If there is sufficient evidence (as there usually is in the case of American presidents), we can call people from the past extremely intelligent. Adams, Wilson, TR, Jefferson, and Lincoln were clearly well above average intellectually. But let us not pretend that we can rank them by tenths of a percentage point or declare that a man in one era stands well above another from a different time and place.

My educated guess is that this recent study was designed in part to denigrate the intelligence of the current occupant of the White House….

That is an excellent guess.

The meaninglessness of battle deaths as a measure of anything — but battle deaths — should be evident. But in case it is not evident, here goes:

  • Wars are sometimes necessary, sometimes not. (I give my views about the wisdom of America’s various wars at this post.) Necessary or not, presidents usually act in accordance with popular and elite opinion about the desirability of a particular war. Imagine, for example, the reaction if FDR had not gone to Congress on December 8, 1941, to ask for a declaration of war against Japan, or if GWB had not sought the approval of Congress for action in Afghanistan.
  • Presidents may have a lot to do with the decision to enter a war, but they have little to do with the external forces that help to shape that decision. GHWB, for example, had nothing to do with Saddam’s decision to invade Kuwait and thereby threaten vital U.S. interests in the Middle East. GWB, to take another example, was not a party to the choices of earlier presidents (GHWB and Clinton) that enabled Saddam to stay in power and encouraged Osama bin Laden to believe that America could be brought to its knees by a catastrophic attack.
  • The number of battle deaths in a war depends on many things outside the control of a particular president; for example, the size and capabilities of enemy forces, the size and capabilities of U.S. forces (which have a lot to do with the decisions of earlier administrations and Congresses), and the scope and scale of a war (again, largely dependent on the enemy).
  • Battle deaths represent personal tragedies, but — in and of themselves — are not a measure of a president’s wisdom or acumen. Whether the deaths were in vain is a separate issue that depends on the aforementioned considerations. To use battle deaths as a single, negative measure of a president’s ability is rank cynicism — the rankness of which is revealed in Pinker’s decision to ignore Lincoln and FDR and their “good” but deadly wars.

To put the last point another way, if the number of battle deaths is a bad thing, Lincoln and FDR should be rotting in hell for the wars that brought an end to slavery and Hitler.

__________
* The numbers of U.S. battle deaths, by war, are available at infoplease.com, “America’s Wars: U.S. Casualties and Veterans.” The deaths are “assigned” to presidents as follows (numbers in parentheses indicate thousands of deaths):

All of the deaths (2) in the War of 1812 occurred on Madison’s watch.

All of the deaths (2) in the Mexican-American War occurred on Polk’s watch.

I count only Union battle deaths (140) during the Civil War; all are “Lincoln’s.” Let the Confederate dead be on the head of Jefferson Davis. This is a gift, of sorts, to Pinker-Singer because if Confederate dead were counted as Lincoln’s, with his high “IQ,” it would make Pinker-Singer’s hypothesis even more ludicrous than it is.

WW is the sole “owner” of WWI battle deaths (53).

Some of the U.S. battle deaths in WWII (292) occurred while HST was president, but Truman was merely presiding over the final months of a war that was almost won when FDR died. Truman’s main role was to hasten the end of the war in the Pacific by electing to drop the A-bombs on Hiroshima and Nagasaki. So FDR gets “credit” for all WWII battle deaths.

The Korean War did not end until after Eisenhower succeeded Truman, but it was “Truman’s war,” so he gets “credit” for all Korean War battle deaths (34). This is another “gift” to Pinker-Singer because Ike’s “IQ” is higher than Truman’s.

Vietnam was “LBJ’s war,” but I’m sure that Singer would not want Nixon to go without “credit” for the battle deaths that occurred during his administration. Moreover, LBJ had effectively lost the Vietnam War through his gradualism, but Nixon chose nevertheless to prolong the agony. So I have shared the “credit” for Vietnam War battle deaths between LBJ (deaths in 1965-68: 29) and RMN (deaths in 1969-73: 17). To do that, I apportioned total Vietnam War battle deaths, as given by infoplease.com, according to the total number of U.S. deaths in each year of the war, 1965-1973.

The wars in Afghanistan and Iraq are “GWB’s wars,” even though Obama has continued them. So I have “credited” GWB with all the battle deaths in those wars, as of May 27, 2011 (5).

The relative paucity of U.S. combat deaths in other post-WWII actions (e.g., Lebanon, Somalia, Persian Gulf) is attested to by “Post-Vietnam Combat Casualties,” at infoplease.com.

Related posts about war and peace:
Libertarian Nay-Saying on Foreign and Defense Policy
Libertarian Nay-Saying on Foreign and Defense Policy, Revisited
Libertarians and the Common Defense
Libertarianism and Pre-emptive War: Part I
An Aside about Libertarianism and the War
Right On! For Libertarian Hawks Only
Why Sovereignty?
Understanding Libertarian Hawks
More about Libertarian Hawks and Doves
Defense, Anarcho-Capitalist Style
War Can Be the Answer
Getting It Almost Right about Iraq
Philosophical Obtuseness
But Wouldn’t Warlords Take Over?
Sorting Out the Libertarian Hawks and Doves
Now, Let’s Talk About Something Else
Libertarianism and Preemptive War: Part II
Give Me Liberty or Give Me Non-Aggression?
My View of Warlordism, Seconded
The Fatal Naïveté of Anarcho-Libertarianism
Final (?) Words about Preemption and the Constitution
More Final (?) Words about Preemption and the Constitution
QandO Saved Me the Trouble
Thomas Woods and War
“Proportionate Response” in Perspective
Parsing Peace
Not Enough Boots
Defense as the Ultimate Social Service
I Have an Idea
September 11: Five Years On
How to View Defense Spending
Reaching the Limit?
The Best Defense . . .
More Stupidity from Cato
A Critique of Extreme Libertarianism
Anarchistic Balderdash
Not Enough Boots: The Why of It
Blood for Oil
It *Is* the Oil
The End of Slavery in the United States
Liberalism and Sovereignty
Cato’s Usual Casuistry on Matters of War and Peace
The Media, the Left, and War
A Point of Agreement
The Decision to Drop the Bomb
The “Predator War” and Self-Defense
The National Psyche and Foreign Wars
Delusions of Preparedness
Inside-Outside
A Moralist’s Moral Blindness
A Grand Strategy for the United States
The Folly of Pacifism
Why We Should (and Should Not) Fight
Rating America’s Wars
Transnationalism and National Defense
The Next 9/11?
The Folly of Pacifism, Again
September 20, 2001: Hillary Clinton Signals the End of “Unity”
NEVER FORGIVE, NEVER FORGET, NEVER RELENT!

Previous posts about Peter Singer:
Peter Singer’s Fallacy
Peter Singer’s Agenda
Singer Said It
Rationing and Health Care
Peter Presumes to Preach