Not-So-Random Thoughts (XVII)

Links to the other posts in this occasional series may be found at “Favorite Posts,” just below the list of topics.

*     *     *

Victor Davis Hanson offers “The More Things Change, the More They Actually Don’t.” It echoes what I say in “The Fallacy of Human Progress.” Hanson opens with this:

In today’s technically sophisticated and globally connected world, we assume life has been completely reinvented. In truth, it has not changed all that much.

And he proceeds to illustrate his point (and mine).

*     *     *

Dr. James Thompson, an English psychologist, often blogs about intelligence. Here are some links from last year that I’ve been hoarding:

“Intelligence: All That Matters” (a review of a book by Stuart Ritchie)

“GCSE Genes” (commentary about research showing the strong relationship between genes and academic achievement)

“GWAS Hits and Country IQ” (commentary about preliminary research into the alleles related to intelligence)

Also, from the International Journal of Epidemiology, comes “The Association between Intelligence and Lifespan Is Mostly Genetic.”

All of this is by way of reminding you of my many posts about intelligence, which are sprinkled throughout this list and this one.

*     *     *

How bad is it? This bad:

Thomas Lifson, “Mark Levin’s Plunder and Deceit”

Arthur Milikh, “Alexis de Tocqueville Predicted the Tyranny of the Majority in Our Modern World”

Steve McCann, “Obama and Neo-fascist America”

Related reading: “Fascism, Pots, and Kettles,” by me, of course.

Also worth a look is Adam Freedman’s book, A Less than Perfect Union: The Case for States’ Rights. States’ rights can be perfected by secession, and I make the legal case for it in “A Resolution of Secession.”

*     *     *

In a different vein, there’s Francis Menton’s series about anthropogenic global warming. The latest installment is “The Greatest Scientific Fraud of All Time — Part VIII.” For my take on the subject, start with “AGW in Austin?” and check out the readings and posts listed at the bottom.

Ty Cobb and the State of Science

This post was inspired by “Layman’s Guide to Understanding Scientific Research” at bluebird of bitterness.

The thing about history is that it’s chock full of lies. Well, a lot of the lies are just incomplete statements of the truth. Think of history as an artificially smooth surface, where gaps in knowledge have been filled by assumptions and guesses, and where facts that don’t match the surrounding terrain have been sanded down. Charles Leerhsen offers an excellent example of the lies that became “history” in his essay “Who Was Ty Cobb? The History We Know That’s Wrong.” (I’m now reading the book on which the essay is based, and it tells the same tale, at length.)

Science is much like history in its illusory certainty. Stand back from things far enough and you see a smooth, mathematical relationship. Look closer, however, and you find rough patches. A classic example is found in physics, where the big picture of general relativity doesn’t mesh with the small picture of quantum mechanics.

Science is based on guesses, also known as hypotheses. The guesses are usually informed by observation, but they are guesses nonetheless. Even when a guess has been lent credence by tests and observations, it only becomes a theory — a working model of a limited aspect of physical reality. A theory is never proven; it can only be disproved.

Science, in other words, is never “settled.” Napoleon is supposed to have said “What is history but a fable agreed upon?” It seems, increasingly, that so-called scientific facts are nothing but a fable that some agree upon because they wish to use those “facts” as a weapon with which to advance their careers and political agendas. Or they simply wish to align themselves with the majority, just as Barack Obama’s popularity soared (for a few months) after he was re-elected.

*     *     *

Related reading:

Wikipedia, “Replication Crisis”

John P.A. Ioannidis, “Why Most Published Research Findings Are False,” PLOS Medicine, August 30, 2005

Liberty Corner, “Science’s Anti-Scientific Bent,” April 12, 2006

Politics & Prosperity, “Modeling Is Not Science,” April 8, 2009

Politics & Prosperity, “Physics Envy,” May 26, 2010

Politics & Prosperity, “Demystifying Science,” October 5, 2011 (also see the long list of related posts at the bottom)

Politics & Prosperity, “The Science Is Settled,” May 25, 2014

Politics & Prosperity, “The Limits of Science, Illustrated by Scientists,” July 28, 2014

Steven E. Koonin, “Climate Science Is Not Settled,” WSJ.com, September 19, 2014

Joel Achenbach, “No, Science’s Reproducibility Problem Is Not Limited to Psychology,” The Washington Post, August 28, 2015

William A. Wilson, “Scientific Regress,” First Things, May 2016

Jonah Goldberg, “Who Are the Real Deniers of Science?,” AEI.org, May 20, 2016

Steven Hayward, “The Crisis of Scientific Credibility,” Power Line, May 25, 2016

There’s a lot more here.

Turning Points

American Revolution — 1775-1783. The Colonies became sovereign States, bound by a compact (the Articles of Confederation) in which each State clearly retained its sovereignty. And yet, those sovereign States, bound by a common language and culture, successfully banded together to defeat a stronger enemy.

Drafting and ratification of the U.S. Constitution — 1787-1790. The States, relying on the hopes of the Framers, entered into a compact which created a national government that, inevitably, would subsume the power and authority of the States.

Nullification Crisis — 1832-1833. An attempt by South Carolina to reject an unconstitutional act of Congress was stifled by a threat of military intervention by the national government. This set the stage for…

Civil War and Texas v. White — 1861-1865 and 1869. Regardless of the motivation for secession, the Southern States acted legally in seceding. Mr. Lincoln’s romantic (if not power-hungry) quest for perpetual union led not only to the bloodiest conflict ever likely to be fought on American soil, but also undoubtedly deterred any future attempt to secede. The majority opinion in Texas v. White essentially ruled that might makes right when it converted a military victory into an (invalid) holding against the constitutionality of secession.

The assassination of William McKinley — 1901. This elevated Theodore Roosevelt to the presidency. Roosevelt’s extra-constitutional activism became the exemplar for most of the presidents who followed him — especially (though not exclusively) the Democrats.

Ratification of the 16th and 17th Amendments to the Constitution, and creation of the Federal Reserve system — 1913. The amendments enabled the national income tax and wrested control of U.S. Senate seats from State legislatures, thus ensuring the aggrandizement of the national government and the subjugation of the States. The creation of the Fed gave the national government yet another tool for exercising central control of the economy — a tool that has often been used with disastrous results for Americans.

The stock-market crash of 1929. The Fed’s policies contributed to the crash and helped turn what would have been a transitory financial crisis into the Great Depression. This one-off series of events set the stage for an unprecedented power grab by the national government — the New Deal — which was aided by several spineless Supreme Court rulings. Thus empowered, the national government has spent most of the past 80 years enlarging on the New Deal, with additional help from the Supreme Court along the way.

The assassination of John F. Kennedy — 1963. This assassination, like that of McKinley, led to the elevation of a hyper-active politician whose twin legacies were the expansion of the New Deal and the eventual demise of the ultimate guarantee of America’s security: military supremacy and the will to use it. Kennedy’s assassination also marked a cultural turning point that I have addressed elsewhere.

The Vietnam War — 1965-1973. The Korean War was a warmup for this one. The losing strategy of gradualism, and a (predictable) loss dictated by the media and academe, were followed, as day follows night, by a wave of unilateral disarmament. Reagan’s rearmament and a quick (but incomplete) victory in the Gulf War merely set the stage for the next wave of unilateral disarmament, which was reversed, briefly, by the shock of 9/11. The wars that followed in Iraq and Afghanistan were fought with the same vacillation and vituperation (from media and academe) as the Vietnam War. Unilateral disarmament continues, even as Russia and China become militarily stronger and bolder in their international gestures.

The demise of economic and social liberty in the United States, which is the predictable result of the growth of the national government’s power, will matter not one whit when the U.S. is surrounded by and effectively dictated to by the great powers to its east and west.

As the world turns: from Colonies to colonies.

 

The Old Normal

The recent terrorist attacks in Brussels and Lahore might lead the impressionable to conclude that the world is falling apart. I submit that terrorist attacks shock because they occur against a backdrop of relative peacefulness. Yes, there are wars here and there, but they are mere skirmishes — albeit with tragic consequences for many — compared with what happened in the twentieth century.

From 1914 to 1973 — a span of two generations — World War I, World War II, the Korean War, the Vietnam War, and the genocides presided over by Stalin, Hitler, and Mao produced casualties on a scale never attained before or since.  Despite subsequent events — the Soviet invasion of Afghanistan, the Gulf War, the wars in Afghanistan and Iraq, and dozens of terror attacks, large and small in scale — the world today is more quiescent than it has been in more than a century.

But the problem with history is that the future isn’t part of it. It is fatuous to suggest, as some have, that the better angels of our nature have conquered violence on a twentieth-century scale. That was the prevailing view in Europe for several years before the outbreak of World War I.

Violence is in fact an essential, ineradicable component of human nature. There will always be armed conflict, and sometimes it will involve the forces of many nations. The proximate causes and timing of war are unpredictable. Conciliatory gestures can be just as provocative as saber-rattling; the former can be taken as a sign of weakness and unpreparedness, the latter as a sign of resolve and preparedness.

If history holds any lesson regarding war, it is this one: Hope for the best but prepare for the worst. This is a lesson that American leftists seem dead set on ignoring.

Who Shot JFK, and Why? (Updated)

Here.

Not-So-Random Thoughts (XV)

Links to the other posts in this occasional series may be found at “Favorite Posts,” just below the list of topics.

*     *     *

Victor Davis Hanson writes:

This descent into the Dark Ages will not end well. It never has in the past. [“Building the New Dark-Age Mind,” Works and Days, June 8, 2015]

Hanson’s chronicle of political correctness and doublespeak echoes one theme of my post, “1963: The Year Zero.”

*     *     *

Timothy Taylor does the two-handed economist act:

It may be that the question of “does inequality slow down economic growth” is too broad and diffuse to be useful. Instead, those of us who care about both the rise in inequality and the slowdown in economic growth should be looking for policies to address both goals, without presuming that substantial overlap will always occur between them. [“Does Inequality Reduce Economic Growth: A Skeptical View,” The Conversible Economist, May 29, 2015]

The short answer to the question “Does inequality reduce growth?” is no. See my post “Income Inequality and Economic Growth.” Further, even if inequality does reduce growth, the idea of reducing inequality (through income redistribution, say) to foster growth is utilitarian and therefore morally egregious. (See “Utilitarianism vs. Liberty.”)

*     *     *

In “Diminishing Marginal Utility and the Redistributive Urge” I write:

[L]eftists who deign to offer an economic justification for redistribution usually fall back on the assumption of the diminishing marginal utility (DMU) of income and wealth. In doing so, they commit (at least) four errors.

The first error is the fallacy of misplaced concreteness which is found in the notion of utility. Have you ever been able to measure your own state of happiness? I mean measure it, not just say that you’re feeling happier today than you were when your pet dog died. It’s an impossible task, isn’t it? If you can’t measure your own happiness, how can you (or anyone) presume to measure and aggregate the happiness of millions or billions of individual human beings? It can’t be done.

Which brings me to the second error, which is an error of arrogance. Given the impossibility of measuring one person’s happiness, and the consequent impossibility of measuring and comparing the happiness of many persons, it is pure arrogance to insist that “society” would be better off if X amount of income or wealth were transferred from Group A to Group B….

The third error lies in the implicit assumption embedded in the idea of DMU. The assumption is that as one’s income or wealth rises one continues to consume the same goods and services, but more of them….

All of that notwithstanding, the committed believer in DMU will shrug and say that at some point DMU must set in. Which leads me to the fourth error, which is an error of introspection….  [If over the years] your real income has risen by a factor of two or three or more — and if you haven’t messed up your personal life (which is another matter) — you’re probably incalculably happier than when you were just able to pay your bills. And you’re especially happy if you put aside a good chunk of money for your retirement, the anticipation and enjoyment of which adds a degree of utility (such a prosaic word) that was probably beyond imagining when you were in your twenties, thirties, and forties.

Robert Murphy agrees:

[T]he problem comes in when people sometimes try to use the concept of DMU to justify government income redistribution. Specifically, the argument is that (say) the billionth dollar to Bill Gates has hardly any marginal utility, while the 10th dollar to a homeless man carries enormous marginal utility. So clearly–the argument goes–taking a dollar from Bill Gates and giving it to a homeless man raises “total social utility.”

There are several serious problems with this type of claim. Most obvious, even if we thought it made sense to attribute units of utility to individuals, there is no reason to suppose we could compare them across individuals. For example, even if we thought a rich man had units of utility–akin to the units of his body temperature–and that the units declined with more money, and likewise for a poor person, nonetheless we have no way of placing the two types of units on the same scale….

In any event, this is all a moot point regarding the original question of interpersonal utility comparisons. Even if we thought individuals had cardinal utilities, it wouldn’t follow that redistribution would raise total social utility.

Even if we retreat to the everyday usage of terms, it still doesn’t follow as a general rule that rich people get less happiness from a marginal dollar than a poor person. There are many people, especially in the financial sector, whose self-esteem is directly tied to their earnings. And as the photo indicates, Scrooge McDuck really seems to enjoy money. Taking gold coins from Scrooge and giving them to a poor monk would not necessarily increase happiness, even in the everyday psychological sense. [“Can We Compare People’s Utilities?,” Mises Canada, May 22, 2015]

See also David Henderson’s “Murphy on Interpersonal Utility Comparisons” (EconLog, May 22, 2015) and Henderson’s earlier posts on the subject, to which he links. Finally, see my comment on an earlier post by Henderson, in which he touches on the related issue of cost-benefit analysis.
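Murphy’s point about incommensurable scales is a standard result in utility theory, and it can be stated compactly. In the expected-utility framework, a utility function is unique only up to a positive affine transformation, which is exactly why summing utilities across persons is arbitrary. (This is a textbook sketch, not something drawn from Murphy’s post; the symbols are mine.)

```latex
% Utilities are unique only up to positive affine transformation:
% if U_A represents person A's preferences, then so does
\[
  V_A(x) = a\,U_A(x) + b, \qquad a > 0 .
\]
% A social aggregate that adds utilities across persons,
\[
  W = U_A(x_A) + U_B(x_B),
\]
% is therefore not well defined: replacing U_A with V_A can reverse the
% ranking of two allocations under W, even though no one's preferences
% have changed.
```

Because the choice of $a$ and $b$ is arbitrary, any claim that a transfer “raises total social utility” depends on an arbitrary choice of scale, not on anyone’s actual preferences.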

*     *     *

Here’s a slice of what Robert Tracinski has to say about “reform conservatism”:

The key premise of this non-reforming “reform conservatism” is the idea that it’s impossible to really touch the welfare state. We might be able to alter its incentives and improve its clanking machinery, but only if we loudly assure everyone that we love it and want to keep it forever.

And there’s the problem. Not only is this defeatist at its core, abandoning the cause of small government at the outset, but it fails to address the most important problem facing the country.

“Reform conservatism” is an answer to the question: how can we promote the goal of freedom and small government—without posing any outright challenge to the welfare state? The answer: you can’t. All you can do is tinker around the edges of Leviathan. And ultimately, it won’t make much difference, because it will all be overwhelmed in the coming disaster. [“Reform Conservatism Is an Answer to the Wrong Question,” The Federalist, May 22, 2015]

Further, as I observe in “How to Eradicate the Welfare State, and How Not to Do It,” the offerings of “reform conservatives”

may seem like reasonable compromises with the left’s radical positions. But they are reasonable compromises only if you believe that the left wouldn’t strive vigorously to undo them and continue the nation’s march toward full-blown state socialism. That’s the way leftists work. They take what they’re given and then come back for more, lying and worse all the way.

See also Arnold Kling’s “Reason Roundtable on Reform Conservatism” (askblog, May 22, 2015) and follow the links therein.

*     *     *

I’ll end this installment with a look at science and the anti-scientific belief in catastrophic anthropogenic global warming.

Here’s Philip Ball in “The Trouble With Scientists”:

It’s likely that some researchers are consciously cherry-picking data to get their work published. And some of the problems surely lie with journal publication policies. But the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. “Seeing the reproducibility rates in psychology and other empirical science, we can safely say that something is not working out the way it should,” says Susann Fiedler, a behavioral economist at the Max Planck Institute for Research on Collective Goods in Bonn, Germany. “Cognitive biases might be one reason for that.”

Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?

Whereas the falsification model of the scientific method championed by philosopher Karl Popper posits that the scientist looks for ways to test and falsify her theories—to ask “How am I wrong?”—Nosek says that scientists usually ask instead “How am I right?” (or equally, to ask “How are you wrong?”). When facts come up that suggest we might, in fact, not be right after all, we are inclined to dismiss them as irrelevant, if not indeed mistaken….

Given that science has uncovered a dizzying variety of cognitive biases, the relative neglect of their consequences within science itself is peculiar. “I was aware of biases in humans at large,” says [Chris] Hartgerink [of Tilburg University in the Netherlands], “but when I first ‘learned’ that they also apply to scientists, I was somewhat amazed, even though it is so obvious.”…

One of the reasons the science literature gets skewed is that journals are much more likely to publish positive than negative results: It’s easier to say something is true than to say it’s wrong. Journal referees might be inclined to reject negative results as too boring, and researchers currently get little credit or status, from funders or departments, from such findings. “If you do 20 experiments, one of them is likely to have a publishable result,” [Ivan] Oransky and [Adam] Marcus [who run the service Retraction Watch] write. “But only publishing that result doesn’t make your findings valid. In fact it’s quite the opposite.” [Nautilus, May 14, 2015]
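The “20 experiments” remark from Oransky and Marcus is ordinary multiple-comparisons arithmetic. If each experiment tests a true null hypothesis at the conventional 0.05 significance level, the chance of at least one spurious “publishable” result grows quickly with the number of tests. (A back-of-the-envelope sketch; the 0.05 threshold and the independence of the tests are my assumptions, the usual textbook ones.)

```python
# Chance of at least one false positive among n independent significance
# tests of true null hypotheses, each conducted at level alpha.
def prob_any_false_positive(n: int, alpha: float = 0.05) -> float:
    return 1.0 - (1.0 - alpha) ** n

# Twenty experiments at the conventional 0.05 level:
print(f"{prob_any_false_positive(20):.3f}")  # roughly 0.642
```

So even if no real effect exists anywhere, a lab that runs 20 studies and reports only the one that “worked” will have something publishable about two times out of three.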

Zoom to AGW. Robert Tracinski assesses the most recent bit of confirmation bias:

A lot of us have been pointing out one of the big problems with the global warming theory: a long plateau in global temperatures since about 1998. Most significantly, this leveling off was not predicted by the theory, and observed temperatures have been below the lowest end of the range predicted by all of the computerized climate models….

Why, change the data, of course!

Hence a blockbuster new report: a new analysis of temperature data since 1998 “adjusts” the numbers and magically finds that there was no plateau after all. The warming just continued….

How convenient.

It’s so convenient that they’re signaling for everyone else to get on board….

This is going to be the new party line. “Hiatus”? What hiatus? Who are you going to believe, our adjustments or your lying thermometers?…

The new adjustments are suspiciously convenient, of course. Anyone who is touting a theory that isn’t being borne out by the evidence and suddenly tells you he’s analyzed the data and by golly, what do you know, suddenly it does support his theory—well, he should be met with more than a little skepticism.

If we look, we find some big problems. The most important data adjustments by far are in ocean temperature measurements. But anyone who has been following this debate will notice something about the time period for which the adjustments were made. This is a time in which the measurement of ocean temperatures has vastly improved in coverage and accuracy as a whole new set of scientific buoys has come online. So why would this data need such drastic “correcting”?

As climatologist Judith Curry puts it:

The greatest changes in the new NOAA surface temperature analysis is to the ocean temperatures since 1998. This seems rather ironic, since this is the period where there is the greatest coverage of data with the highest quality of measurements–ARGO buoys and satellites don’t show a warming trend. Nevertheless, the NOAA team finds a substantial increase in the ocean surface temperature anomaly trend since 1998.

….

I realize the warmists are desperate, but they might not have thought through the overall effect of this new “adjustment” push. We’ve been told to take very, very seriously the objective data showing global warming is real and is happening—and then they announce that the data has been totally changed post hoc. This is meant to shore up the theory, but it actually calls the data into question….

All of this fits into a wider pattern: the global warming theory has been awful at making predictions about the data ahead of time. But it has been great at going backward, retroactively reinterpreting the data and retrofitting the theory to mesh with it. A line I saw from one commenter, I can’t remember where, has been rattling around in my head: “once again, the theory that predicts nothing explains everything.” [“Global Warming: The Theory That Predicts Nothing and Explains Everything,” The Federalist, June 8, 2015]

Howard Hyde also weighs in with “Climate Change: Where Is the Science?” (American Thinker, June 11, 2015).

Bill Nye, the so-called Science Guy, seems to epitomize the influence of ideology on “scientific knowledge.”  I defer to John Derbyshire:

Bill Nye the Science Guy gave a commencement speech at Rutgers on Sunday. Reading the speech left me thinking that if this is America’s designated Science Guy, I can be the nation’s designated swimsuit model….

What did the Science Guy have to say to the Rutgers graduates? Well, he warned them of the horrors of climate change, which he linked to global inequality.

We’re going to find a means to enable poor people to advance in their societies in countries around the world. Otherwise, the imbalance of wealth will lead to conflict and inefficiency in energy production, which will lead to more carbon pollution and a no-way-out overheated globe.

Uh, given that advanced countries use far more energy per capita than backward ones—the U.S.A. figure is thirty-four times Bangladesh’s—wouldn’t a better strategy be to keep poor countries poor? We could, for example, encourage all their smartest and most entrepreneurial people to emigrate to the First World … Oh, wait: we already do that.

The whole climate change business is now a zone of hysteria, generating far more noise—mostly of a shrieking kind—than its importance justifies. Opinions about climate change are, as Greg Cochran said, “a mark of tribal membership.” It is also the case, as Greg also said, that “the world is never going to do much about in any event, regardless of the facts.”…

When Ma Nature means business, stuff happens on a stupendously colossal scale.  And Bill Nye the Science Guy wants Rutgers graduates to worry about a 0.4ºC warming over thirty years? Feugh.

The Science Guy then passed on from the dubiously alarmist to the batshit barmy.

There really is no such thing as race. We are one species … We all come from Africa.

Where does one start with that? Perhaps by asserting that: “There is no such thing as states. We are one country.”

The climatological equivalent of saying there is no such thing as race would be saying that there is no such thing as weather. Of course there is such a thing as race. We can perceive race with at least three of our five senses, and read it off from the genome. We tick boxes for it on government forms: I ticked such a box for the ATF just this morning when buying a gun.

This is the Science Guy? The foundational text of modern biology bears the title On the Origin of Species by Means of Natural Selection, or the Preservation of Favored Races in the Struggle for Life. Is biology not a science?

Darwin said that populations of a species long separated from each other will diverge in their biological characteristics, forming races. If the separation goes on long enough, any surviving races will diverge all the way to separate species. Was Ol’ Chuck wrong about that, Mr. Science Guy?

“We are one species”? Rottweilers and toy poodles are races within one species, a species much newer than ours; yet they differ mightily, not only in appearance but also—gasp!—in behavior, intelligence, and personality. [“Nye Lied, I Sighed,” Taki’s Magazine, May 21, 2015]

This has gone on long enough. Instead of quoting myself, I merely refer you to several related posts:

Demystifying Science
AGW: The Death Knell
Evolution and Race
The Limits of Science (II)
The Pretence of Knowledge
“The Science Is Settled”
The Limits of Science, Illustrated by Scientists
Rationalism, Empiricism, and Scientific Knowledge
AGW in Austin?

Signature

1963: The Year Zero

[A] long habit of not thinking a thing WRONG, gives it a superficial appearance of being RIGHT…. Time makes more converts than reason.

Thomas Paine, Common Sense

If ignorance and passion are the foes of popular morality, it must be confessed that moral indifference is the malady of the cultivated classes. The modern separation of enlightenment and virtue, of thought and conscience, of the intellectual aristocracy from the honest and common crowd is the greatest danger that can threaten liberty.

Henri Frédéric Amiel, Journal

The Summer of Love ignited the loose, Dionysian culture that is inescapable today. The raunch and debauchery, radical individualism, stylized non-conformity, the blitzkrieg on age-old authorities, eventually impaired society’s ability to function.

Gilbert T. Sewall, “Summer of Love, Winter of Decline

*     *     *

If, like me, you were an adult when John F. Kennedy was assassinated, you may think of his death as a watershed moment in American history. I say this not because I’m an admirer of Kennedy the man (I am not), but because American history seemed to turn a corner when Kennedy was murdered. To take the metaphor further, the corner marked the juncture of a sunny, tree-lined street (America from the end of World War II to November 22, 1963) and a dingy, littered street (America since November 22, 1963).

Changing the metaphor, I acknowledge that the first 18 years after V-J Day were by no means halcyon, but they were the spring that followed the long, harsh winter of the Great Depression and World War II. Yes, there was the Korean War, but that failure of political resolve was only a rehearsal for later debacles. McCarthyism, a political war waged (however clumsily) on America’s actual enemies, was benign compared with the war on civil society that began in the 1960s and continues to this day. The threat of nuclear annihilation, which those of you who were schoolchildren of the 1950s will remember well, had begun to subside with the advent of JFK’s military policy of flexible response, and seemed to evaporate with JFK’s resolution of the Cuban Missile Crisis (however poorly he managed it). And for all of his personal faults, JFK was a paragon of grace, wit, and charm — a movie-star president — compared with his many successors, with the possible exception of Ronald Reagan, who had been a real movie star.

What follows is an impression of America since November 22, 1963, when spring became a long, hot summer, followed by a dismal autumn and another long, harsh winter — not of deprivation, and perhaps not of war, but of rancor and repression.

This petite histoire begins with the Vietnam War and its disastrous mishandling by LBJ, its betrayal by the media, and its spawning of the politics of noise. “Protests” in public spaces and on campuses are a main feature of the politics of noise. In the new age of instant and sympathetic media attention to “protests,” civil and university authorities often refuse to enforce order. The media portray obstructive and destructive disorder as “free speech.” Thus do “protestors” learn that they can, with impunity, inconvenience and cow the masses who simply want to get on with their lives and work.

Whether “protestors” learned from rioters, or vice versa, they learned the same lesson. Authorities, in the age of Dr. Spock, lack the guts to use force, as necessary, to restore civil order. (LBJ’s decision to escalate gradually in Vietnam — “signaling” to Hanoi — instead of waging all-out war was of a piece with the “understanding” treatment of demonstrators and rioters.) Rioters learned another lesson — if a riot follows the arrest, beating, or death of a black person, it’s a “protest” against something (usually white-racist oppression, regardless of the facts), not wanton mayhem. After a lull of 21 years, urban riots resumed in 1964, and continue to this day.

LBJ’s “Great Society” marked the resurgence of FDR’s New Deal — with a vengeance — and the beginning of a long decline of America’s economic vitality. The combination of the Great Society (and its later extensions, such as Medicare Part D and Obamacare) with the rampant growth of regulatory activity has cut the rate of economic growth from 5 percent to 2 percent.  The entrepreneurial spirit has been crushed; dependency has been encouraged and rewarded; pension giveaways have bankrupted public treasuries across the land. America since 1963 has been visited by a perfect storm of economic destruction that seems to have been designed by America’s enemies.
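The difference between those two growth rates compounds dramatically. Taking the 5 percent and 2 percent figures at face value, here is the arithmetic over an illustrative 50-year horizon (the horizon is my choice for illustration, not a figure from the text):

```python
# Cumulative effect of a sustained difference in annual growth rates.
def growth_factor(rate: float, years: int) -> float:
    return (1.0 + rate) ** years

years = 50  # illustrative horizon
high = growth_factor(0.05, years)  # about 11.5x
low = growth_factor(0.02, years)   # about 2.7x
print(f"5%: {high:.1f}x, 2%: {low:.1f}x, ratio: {high / low:.1f}x")
```

Over two generations, an economy growing at 5 percent a year ends up more than four times larger than one growing at 2 percent.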

The Civil Rights Act of 1964 unnecessarily crushed property rights, along with freedom of association, to what end? So that a violent, dependent, Democrat-voting underclass could arise from the Great Society? So that future generations of privilege-seekers could cry “discrimination” if anyone dares to denigrate their “lifestyles”? There was a time when immigrants and other persons who seemed “different” had the good sense to strive for success and acceptance as good neighbors, employees, and merchants. But the Civil Rights Act of 1964 and its various offspring — State and local as well as federal — are meant to short-circuit that striving and to force acceptance, whether or not a person has earned it. The vast, silent majority is caught between empowered privilege-seekers and powerful privilege-granters. The privilege-seekers and privilege-granters are abetted by dupes who have, as usual, succumbed to the people’s romance — the belief that government represents society.

Presidents, above all, like to think that they represent society. What they represent, of course, are their own biases and the interests to which they are beholden. Truman, Ike, and JFK were imperfect presidential specimens, but they are shining idols by contrast with most of their successors. The downhill slide from the Vietnam War and the Great Society to Obamacare and lawlessness on immigration has been punctuated by many shameful episodes; for example:

  • LBJ — the botched war in Vietnam, repudiation of property rights and freedom of association (the Civil Rights Act)
  • Nixon — price controls, Watergate
  • Carter — dispiriting leadership and fecklessness in the Iran hostage crisis
  • Reagan — bugout from Lebanon, rescue of Social Security
  • Bush I — failure to oust Saddam when it could have been done easily, the broken promise about taxes
  • Clinton — bugout from Somalia, push for an early version of Obamacare, budget-balancing at the cost of defense, and perjury
  • Bush II — No Child Left Behind Act, Medicare Part D, the initial mishandling of Iraq, and Wall Street bailouts
  • Obama — stimulus spending, Obamacare, reversal of Bush II’s eventual success in Iraq, naive backing for the “Arab spring,” acquiescence to Iran’s nuclear ambitions, unwillingness to acknowledge or do anything about the expansionist aims of Russia and China, neglect or repudiation of traditional allies (especially Israel), and refusal to take care that the immigration laws are executed faithfully.

Only Reagan’s defense buildup and its result — victory in the Cold War — stands out as a great accomplishment. But the victory was squandered: The “peace dividend” should have been peace through continued strength, not unpreparedness for the post-9/11 wars and the resurgence of Russia and China.

The war on defense has been accompanied by a war on science. The party that proclaims itself the party of science is anything but that. It is the party of superstitious, Luddite anti-science. Witness the embrace of extreme environmentalism, the arrogance of proclamations that AGW is “settled science,” unjustified fear of genetically modified foodstuffs, the implausible doctrine that race is nothing but a social construct, and on and on.

With respect to the nation’s moral well-being, the most destructive war of all has been the culture war, which assuredly began in the 1960s. Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-/filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the airwaves, the internet, theater screens, and “entertainment” venues.

Adherents of the ascendant culture esteem protest for its own sake, and have stock explanations for all perceived wrongs (whether or not they are wrongs): racism, sexism, homophobia, Islamophobia, hate, white privilege, inequality (of any kind), Wall Street, climate change, Zionism, and so on.

Then there is the campaign to curtail freedom of speech. The purported beneficiaries of the campaign are the gender-confused and the easily offended (thus “microaggressions” and “trigger warnings”). The true beneficiaries are leftists. Free speech is all right if it’s acceptable to the left. Otherwise, it’s “hate speech,” and must be stamped out. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty.

There’s a lot more, unfortunately. The organs of the state have been enlisted in an unrelenting campaign against civilizing social norms. As I say here,

we now have not just easy divorce, subsidized illegitimacy, and legions of non-mothering mothers, but also abortion, concerted (and deluded) efforts to defeminize females and to neuter or feminize males, forced association (with accompanying destruction of property and employment rights), suppression of religion, absolution of pornography, and the encouragement of “alternative lifestyles” that feature disease, promiscuity, and familial instability. The state, of course, doesn’t act of its own volition. It acts at the behest of special interests — interests with a “cultural” agenda….  They are bent on the eradication of civil society — nothing less — in favor of a state-directed Rousseauvian dystopia from which morality and liberty will have vanished, except in Orwellian doublespeak.

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity — thus the epigrams at the top of the post. The history of the United States since 1963 supports the proposition that the nation is indeed going to hell in a handbasket.

Read on.

*     *     *

Related reading:

*     *     *

Related posts:

Refuting Rousseau and His Progeny
Killing Free Speech in Order to Save It
The Adolescent Rebellion Syndrome
The Ruinous Despotism of Democracy
Academic Bias
The F-Scale Revisited
The Modern Presidency: A Tour of American History
The People’s Romance
Intellectuals and Capitalism
On Liberty
Greed, Cosmic Justice, and Social Welfare
Positive Rights and Cosmic Justice
Liberalism and Sovereignty
The Interest-Group Paradox
Getting It Wrong and Right about Iran
The Real Constitution and Civil Disobedience
The State of the Union 2010
The Shape of Things to Come
Sexist Nonsense
The Constitution: Original Meaning, Corruption, and Restoration
Delusions of Preparedness
Inside-Outside
Asymmetrical (Ideological) Warfare
A Grand Strategy for the United States
The Folly of Pacifism
The Unconstitutionality of the Individual Mandate
“Intellectuals and Society”: A Review
Does the Power to Tax Give Congress Unlimited Power?
Does Congress Have the Power to Regulate Inactivity?
Is the Anger Gone?
Government vs. Community
Social Justice
The Left’s Agenda
More Social Justice
Why We Should (and Should Not) Fight
Rating America’s Wars
The Public-School Swindle
The Evil That Is Done with Good Intentions
Transnationalism and National Defense
Luck-Egalitarianism and Moral Luck
The Left and Its Delusions
In Defense of Wal-Mart
The Destruction of Society in the Name of “Society”
The Folly of Pacifism, Again
An Economist’s Special Pleading: Affirmative Action for the Ugly
September 20, 2001: Hillary Clinton Signals the End of “Unity”
The War on Terror, As It Should Have Been Fought
Obamacare: Neither Necessary nor Proper
The Spoiled Children of Capitalism
Politics, Sophistry, and the Academy
Subsidizing the Enemies of Liberty
Defense as an Investment in Liberty and Prosperity
Our Perfect, Perfect Constitution
Free Will, Crime, and Punishment
Constitutional Confusion
Obamacare, Slopes, Ratchets, and the Death-Spiral of Liberty
Obamacare and Zones of Liberty
Race and Reason: The Achievement Gap — Causes and Implications
Obama’s Big Lie
Liberty and Society
The Eclipse of “Old America”
The Capitalist Paradox Meets the Interest-Group Paradox
Genetic Kinship and Society
America: Past, Present, and Future
Defending Liberty against (Pseudo) Libertarians
Left-Libertarians, Obama, and the Zimmerman Case
“Conversing” about Race
The Fallacy of Human Progress
Fighting Modernity
Political Correctness vs. Civility
IQ, Political Correctness, and America’s Present Condition
AGW: The Death Knell
The Barbarians Within and the State of the Union
Defining Liberty
The World Turned Upside Down
“We the People” and Big Government
Evolution and Race
The Culture War
Defense Spending: One More Time
The Fall and Rise of American Empire
Some Inconvenient Facts about Income Inequality
Modern Liberalism as Wishful Thinking
Mass (Economic) Hysteria: Income Inequality and Related Themes
Presidential Treason
Getting Liberty Wrong
Romanticizing the State
The Limits of Science (II)
The Pretence of Knowledge
“The Science Is Settled”
“Wading” into Race, Culture, and IQ
“Liberalism” and Personal Responsibility
Income Inequality and Economic Growth
Round Up the Usual Suspects
Walking the Tightrope Reluctantly
Poverty, Crime, and Big Government
Evolution, Culture, and “Diversity”
A Case for Redistribution, Not Made
Greed, Conscience, and Big Government
Ruminations on the Left in America
The Harmful Myth of Inherent Equality
My View of Libertarianism
Crime Revisited
Getting “Equal Protection” Right
McCloskey on Piketty
The Rahn Curve Revisited
The Slow-Motion Collapse of the Economy
A Cop-Free World?
Nature, Nurture, and Inequality
Tolerance
How to Eradicate the Welfare State, and How Not to Do It
Does Obama Love America?
The Real Burden of Government
No Wonder Liberty Is Disappearing
Diminishing Marginal Utility and the Redistributive Urge
How to Protect Property Rights and Freedom of Association and Expression
Obamanomics in Action
Democracy, Human Nature, and the Future of America
Rationalism, Empiricism, and Scientific Knowledge
The Gaystapo at Work
The Gaystapo and Islam
AGW in Austin?


More Presidential Trivia: Deaths

I drew on “Facts about Presidents” to compile some more trivia. These trivia pertain to the deaths of presidents. Let’s start with this table, which lists the presidents in the order in which they died and gives the gap (in years) between their deaths:

[Image: Presidents-death dates and gaps]

The gap between the deaths of Washington and Jefferson is 26.55 years, and so on down the list. It happens that the first gap is the longest one. The next longest gap is the 21.25 years between the deaths of LBJ and Nixon. (Aside: When LBJ died in January 1973, Nixon continued a precedent established by Truman after the death of FDR and declared a national day of mourning for LBJ. The declaration meant a day of paid leave for federal employees and contractors. Many said that it was the best thing LBJ did for them.)
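For readers who like to check such figures, the gaps are easy to reproduce. A minimal sketch in Python (death dates are from the public record; dividing elapsed days by 365.25 converts them to average years):

```python
from datetime import date

def gap_years(d1: date, d2: date) -> float:
    """Elapsed time between two dates, in average (365.25-day) years."""
    return (d2 - d1).days / 365.25

# Washington died December 14, 1799; Jefferson died July 4, 1826.
print(round(gap_years(date(1799, 12, 14), date(1826, 7, 4)), 2))   # 26.55
# LBJ died January 22, 1973; Nixon died April 22, 1994.
print(round(gap_years(date(1973, 1, 22), date(1994, 4, 22)), 2))   # 21.25
```

The same function, applied pairwise down the list of death dates, yields every gap in the table.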

The chart below depicts the death years of presidents. The years are plotted in a saw-tooth pattern, from left to right: row 1 (bottom row), row 2, row 3, row 4, row 5, row 6, row 1, row 2, etc. (If you’re uncertain about the interpretation of the initials, see the key at the bottom of this post.) The vertical green and white bands delineate presidential administrations. Washington’s is the first green band, followed by a white band for John Adams, and so on. (For a sequential list of administrations, see the table later in this post. For exact dates of administrations and deaths, see “Facts about Presidents.”)

[Image: Presidents-death years]

Many administrations didn’t experience any presidential deaths. Those administrations with more than one presidential death are as follows:

  • John Quincy Adams — Thomas Jefferson and John Adams
  • Andrew Jackson — James Monroe and James Madison
  • Abraham Lincoln — John Tyler, Martin Van Buren, and Abraham Lincoln (I consider the death of a sitting president to have occurred during his administration.)
  • Ulysses S. Grant — Franklin Pierce, Millard Fillmore, and Andrew Johnson
  • Grover Cleveland (first administration) — Ulysses S. Grant and Chester Alan Arthur
  • William McKinley — Benjamin Harrison and William McKinley
  • Herbert C. Hoover — William Howard Taft and Calvin Coolidge
  • Richard M. Nixon — Dwight D. Eisenhower, Harry S Truman, and Lyndon B. Johnson
  • George W. Bush — Ronald W. Reagan and Gerald R. Ford.

What about the number of ex-presidents living during the administrations of sitting presidents? Lincoln, Clinton, and G.W. Bush are tied for the most living ex-presidents (5 each):

[Image: Presidents-living ex-presidents]

__________
KEY TO PRESIDENTS’ INITIALS: [Image: Presidents-key to initials]


What, If Anything, Will Unite Americans?

I don’t expect that Americans can ever be united in their political principles and policy preferences. But the cacophony that emanates from the present state of disunity is figuratively deafening. America would be on the verge of another civil war if the States were now as militarily strong, relative to the central government, as they were in 1861.

Because of the present imbalance of power, a “hot” civil war is unlikely. What then, if not a civil war, might put an end to America’s internal strife? Or is it America’s fate to muddle along in clangorous divisiveness?

History tells me that when the world seems headed in a particular direction, a cataclysmic exogenous event intervenes. Here are some examples:

World War I brought an end to Edwardian elegance, sparked the demise of the British class system, and stamped the United States as a world power.

The Great Depression curtailed the Jazz Age and its associated “excesses,” as they were then considered.

World War II created the economic conditions that helped put an end to the Great Depression in the United States.

Assassinations and anti-war protests rang down the curtain on “Camelot” and deference to authority figures.

The Reagan-Volcker inflation-busting shock of the early 1980s did much to end the “malaise” that characterized the 1970s — from Watergate to the Iran hostage crisis — and fostered almost 30 years of relative prosperity.

Gorbachev’s sudden “surrender” — due in large part to Reagan’s defense buildup — put an end to the tense and costly 40-year-long Cold War.

A stock-market crash, followed closely by 9/11, ended the relative peace and prosperity of the Reagan-Clinton era.

What, if anything, could bring an end to — or at least muffle — the prevailing political cacophony? It’s impossible to say, of course. But two possibilities strike me as most likely:

– A major war in the Middle East, into which the U.S. is drawn because of oil, Israel, or both.

– A terrorist attack on the U.S. that claims many lives (far more than 9/11), cripples vital infrastructure, or both.

There is a peaceful possibility, though it doesn’t preclude the two unappealing scenarios: de facto secession. It has begun in a small way, with State-level legalization of marijuana, which has happened in spite of the central government’s de jure power to criminalize it.

It is therefore imaginable that GOP control of the White House and Congress would embolden some GOP-controlled States to openly flout federal laws, regulations, and judicial decrees about such matters as same-sex marriage, environmental emissions, and Obamacare — to name a few obvious targets. The result, if it came to pass, would be something like the kind of federalism envisioned by the Framers of the Constitution.

But leftists would resist, loudly and demagogically. Given their need to control others, they would use every trick in the book to keep GOP-controlled States in line while giving free rein to Democrat-controlled States. In the end, the cacophony might intensify, not diminish.

*     *     *

Related posts:
A Grand Strategy for the United States
The Folly of Pacifism
Why We Should (and Should Not) Fight
Rating America’s Wars
Transnationalism and National Defense
The Next 9/11?
The Folly of Pacifism, Again
September 20, 2001: Hillary Clinton Signals the End of “Unity”
NEVER FORGIVE, NEVER FORGET, NEVER RELENT!
Patience as a Tool of Strategy
The War on Terror, As It Should Have Been Fought
Preemptive War
Preemptive War and Iran
Some Thoughts and Questions about Preemptive War
Defense as an Investment in Liberty and Prosperity
Reclaiming Liberty throughout the Land
Secession, Anyone?
Obamacare and Zones of Liberty
Mission Not Accomplished
Secession for All Seasons
A New Constitution for a New Republic
Restoring Constitutional Government: The Way Ahead
The World Turned Upside Down
Secession Made Easy
More about “Secession Made Easy”
The Culture War
Defense Spending: One More Time
A Home of One’s Own
The Criminality and Psychopathy of Statism
Romanticizing the State
Surrender? Hell No!
Social Accounting: A Tool of Social Engineering
Has America Always Been Leftist?
Let’s Make a Deal
Jerks and Demagogues
Decline
Walking the Tightrope Reluctantly
How to Eradicate the Welfare State, and How Not to Do It
The Obamacare Effect: Greater Distrust of Government
“Blue Wall” Hype
Does Obama Love America?


Presidential Trivia: Recurring First Names

Among the 44 Presidents of the United States, there are eight first names that occur more than once. Do you know the eight names? Do you know the middle names (if any) and last names that go with the first names? Try to answer those questions without peeking at a list of presidents, then go to the bottom of this page for the answers.

Who Shot JFK, and Why?

ORIGINALLY PUBLISHED 11/22/13; UPDATED 09/25/14 AND 06/26/15; REVISED 06/27/15

On this, the 50th anniversary of the murder of John F. Kennedy, I refer you to my recollections of the day (posted two years ago), and offer the following thoughts about the killing.

I recently watched the NOVA production, Cold Case JFK, which documents the application of current forensic technology to the 50-year-old case. The investigators give a convincing explanation of the shooting in Dallas. The explanation supports the original “verdict” of the Warren Report: Lee Harvey Oswald was the lone shooter. He, and only he, fired the shots that killed JFK and wounded John Connally, then governor of Texas. The NOVA program moves me from “reasonable doubt” to “beyond a reasonable doubt,” with respect to who shot JFK and how.

What about the thesis advanced by James B. Reston Jr. that Oswald’s real target was Connally? Possibly, inasmuch as Oswald wasn’t a sniper-class shooter. Here’s a scenario that’s consistent with the timing of events in Dealey Plaza: Oswald could tell that his first shot had missed his target. He got off a quick second shot, which hit JFK, who was in line with Connally, passed through JFK, and hit Connally. There was no obvious, dramatic reaction from Connally, even though he was hit. So Oswald fired a quick third shot, which hit Kennedy in the back of the head instead of hitting Connally, who by that time had slumped into his wife’s lap. (Go here for the Warren Commission’s chronology of the shots and their effects.)

Reston’s thesis is that Oswald went after Connally because Oswald’s discharge from the Marine Corps was downgraded from “honorable” to “undesirable” while Connally was Secretary of the Navy. The downgrading hurt Oswald’s pride and his ability to get a good job in those days when employers were allowed to hire whom they pleased. I have read Reston’s book and can tell you that it is a piece of padded fluff. Padded as it is, the book consists of only 183 pages of text, set in large type on small pages. Reston offers no evidence other than Oswald’s grudge against Connally and (supposed) lack of a grudge against JFK. Reston harms his case by ascribing unsourced and obviously fabricated words and thoughts to his characters. I say “characters” because Reston has fashioned something that reads more like a novel than a history.

Reston could be right, but we’ll never know if he is or isn’t. The truth of the matter died with Oswald on November 24, 1963. In any event, if Reston is right, it would mean that there was no conspiracy to murder JFK.

The only conspiracy theory that might still be worth considering is the idea that Oswald was gunning for JFK because he was somehow maneuvered into doing so by LBJ, the CIA, Fidel Castro, the Mafia, or the Russians. (See, for example, Philip Shenon’s “‘Maybe We Missed Something’: Warren Commission Insider Publicly Concedes That JFK Assassination Was Likely a Conspiracy,” The Washington Post, September 22, 2014, republished in The National Post.) The murder of Oswald by Ruby conveniently plays into that theory. But I say that the burden of proof is on conspiracy theorists, for whom the obvious is not titillating enough. The obvious is Oswald — a leftist loser and less-than-honorably discharged Marine with a chip on his shoulder, a domineering mother, an unhappy home life, and a menial job. In other words, the kind of loser with a gun who now appears almost daily in the news, having slaughtered family members, former co-workers, or random strangers. (This ubiquity is, of course, a manifestation of the media’s anti-gun bias, but that’s another story.)

Finally, after 50 years as a moderate skeptic of the Warren Report, I am satisfied that Lee Harvey Oswald was the only shooter in Dallas, and that he wasn’t part of a conspiracy.  Given that, it is immaterial whether Oswald was gunning for JFK or Connally. He killed JFK, and the rest — for good or ill — is history.

That history still resounds with absurd claims that an “atmosphere of hate in Dallas” (right-wing, of course) was somehow responsible for the murder of JFK. How did this “atmosphere” — invented by the media to deflect blame from a leftist killer — cause Oswald to take aim at JFK or Connally? By “atmospheric” induction? There’s a conspiracy theory for you.

Timely Trivia Question

One person administered the presidential oath of office nine times (a record). Who was that person, and to which presidents did he administer the oath? Scroll down for the answer.

John Marshall, Chief Justice of the United States from 1801 to 1835, administered the oath to Thomas Jefferson in 1801 and 1805, James Madison in 1809 and 1813, James Monroe in 1817 and 1821, John Quincy Adams in 1825, and Andrew Jackson in 1829 and 1833.

Roger B. Taney, Marshall’s successor as Chief Justice (1836 to 1864), administered the oath of office seven times. Warren E. Burger (Chief Justice from 1969 to 1986) administered the oath six times.

For more trivia about inauguration day, go here.

November 22, 1963

I have said all that I wish to say about November 22, 1963, as a political event, and about JFK’s performance as president. My purpose here is simply to mark what ranks as the third-most shocking day of my lifetime. The most shocking, because I remember it all too well, is September 11, 2001. The second-most shocking, which I remember not at all (because I was so young), is December 7, 1941.

JFK’s assassination was a mighty shock for two reasons:

  • It had been 62 years since the assassination of a president (William McKinley, 1901).
  • There was, in the early 1960s, less of the intense political polarization that would now render a president’s assassination almost unsurprising.

Presidential Heights

I once remarked on the longevity of presidents:

The [following] graph highlights trends (such as they are) in the age at which presidents have died (or to which they have survived if still living), the age at which they were elected or succeeded to the presidency, and the number of years by which they survived (or have thus far survived) election or succession. (I have omitted assassinated presidents from the data for age of death and number of years surviving, thus the gaps in the first and third series.)

It seems to me that the early presidents were generally “healthy and wise” (and wealthy, by the standards of their time). That is, they were of superior genetic stock, relative to the average person. Their successors have tended to be of less-superior stock, and it shows in the downward trends after 1836.

The general rise in life expectancies since 1900 masks the relative inferiority of twentieth century presidents. The rising age of accession to the presidency after 1932 and the rise in years of survivorship after 1924 (both with wide variations around the trend) should not be taken to indicate that presidents of the twentieth century are on a par, genetically, with the early presidents. They are not.

These observations are consistent with the following graph of presidents’ heights (here including only those men who were elected to the presidency):

Source: “Heights of United States presidents and presidential candidates” at Wikipedia.

With the notable exception of Lincoln, presidential heights generally diminished from the late 1700s to the late 1800s. The upward trend since 1900 attests to the general health and vigor of the population; it says nothing about the relative robustness of the men who have been elected to the presidency in the 20th and 21st centuries.

Popular-Vote Margins in Presidential Elections

I present the following graph as a matter of historical interest; no political commentary is intended or implied.

Draw your own conclusions, if there are any to be drawn.

Random Thoughts

Why is “gunite” pronounced gun-ite, whereas “granite” is pronounced gran-it?

If, in 1950, Harry Truman had said “four score and seven years ago,” he would have been referring to 1863, the year in which Abraham Lincoln uttered that famous phrase.

In the computer industry, “email” is preferred to “e-mail.” But it seems to me that “e-mail” better represents the phrase “electronic mail.” The meaning of “e-mail” is immediately obvious to me; “email,” at first glance, looks like a typo.

If the dismal northern weather of early April and late October — which delayed the start of the 2008 baseball season in some cities and then disrupted the World Series — doesn’t convince Major League Baseball to lop two weeks from each end of the regular season, nothing will.

One of the funniest movies I’ve seen is Harold Lloyd’s Dr. Jack (1922). It starts slowly, but builds to a hilariously frantic finish. Lloyd’s Safety Last! is better known — and deservedly considered a comedy classic — but it isn’t half as funny as Dr. Jack.

Between novels, I have been slogging my way through Thomas K. McCraw’s Prophet of Innovation: Joseph Schumpeter and Creative Destruction. There’s too much armchair psychology in it, but it whets my appetite for Schumpeter’s classic Capitalism, Socialism, and Democracy, which (I hate to admit) I haven’t read. Schumpeter’s famous term for capitalism, “creative destruction,” often is applied with an emphasis on “destruction”; the emphasis should be on “creative.”

I must observe, relatedly, that my grandmother’s lifetime (1880-1977) spanned the invention and adoption of far more new technology than is likely to emerge in my lifetime, even if I live as long as my grandmother did.

Election 2008: Signs and Portents

This isn’t a “political” post. Read on.

Forty-two different men have served as president of the United States, although the official number of presidents is 43 because Grover Cleveland was elected to two non-consecutive terms, each of which is counted as a separate presidency. Herein, I present some important facts about those 42 men and their 43 presidencies, and about the implications of those facts for the outcome of election 2008.

No person whose last name begins with “O” has served as president. The last names of three presidents begin with “M” (Madison, Monroe, McKinley); the last name of one president begins with “Mc” (McKinley). Advantage: McCain

Only three presidents’ last names end in vowel sounds (Monroe, McKinley, Kennedy); all the rest end in consonant sounds. Advantage: McCain

No president’s last name ends with “a”; 15 presidents’ last names end with “n.” Advantage: McCain

The mean number of letters in the presidents’ last names is 6.67; the median number is 7. McCain (6) is closer to the norm than Obama (5). Advantage: McCain

Of the 43 presidencies, 38 have occurred by election. (The five presidents who didn’t serve elected terms of office were Tyler, Fillmore, A. Johnson, Arthur, and Ford.) There have been 37 elected successions (Washington didn’t succeed anyone). In two of those successions, the newly elected president was the same age as his predecessor was when the predecessor was elected; in 15 cases, the successor was younger than his predecessor was; in 20 cases, the successor was older than his predecessor was. It is, therefore, more usual than otherwise for a newly elected president to be older than his predecessor was upon election. Such would be the case if McCain (72 by the time of this year’s election) succeeds G.W. Bush (54 at the time of his election in 2000). Alternatively, Obama (47 by the time of this year’s election) would be younger than G.W. Bush was in 2000. Advantage: McCain
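The mean and median letter counts are easy to verify. A quick sketch in Python, assuming letters-only counting (so the space in “Van Buren” is ignored) and counting repeat surnames once per man, across the 42 men through George W. Bush:

```python
from statistics import mean, median

# Last names of the 42 men who had served as president through 2008,
# in order of first accession; repeated surnames appear once per man.
LAST_NAMES = [
    "Washington", "Adams", "Jefferson", "Madison", "Monroe", "Adams",
    "Jackson", "Van Buren", "Harrison", "Tyler", "Polk", "Taylor",
    "Fillmore", "Pierce", "Buchanan", "Lincoln", "Johnson", "Grant",
    "Hayes", "Garfield", "Arthur", "Cleveland", "Harrison", "McKinley",
    "Roosevelt", "Taft", "Wilson", "Harding", "Coolidge", "Hoover",
    "Roosevelt", "Truman", "Eisenhower", "Kennedy", "Johnson", "Nixon",
    "Ford", "Carter", "Reagan", "Bush", "Clinton", "Bush",
]

# Count letters only, so the space in "Van Buren" doesn't inflate its length.
lengths = [sum(ch.isalpha() for ch in name) for name in LAST_NAMES]

print(len(LAST_NAMES))           # 42
print(round(mean(lengths), 2))   # 6.67
print(median(lengths))           # 7.0
```

With six letters, McCain sits just below both the mean (6.67) and the median (7); Obama, at five, is farther from both.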

Mr. Cranky has the edge over Mr. Change.

Historical Bias 101

Guest commentary by Postmodern Conservative.

Stupid bias in books is ubiquitous, but it is particularly obvious in children’s literature. There is a reason for that. Not only are most works of popularized history and social science lowbrow, but the level of juvenile books is even lower. For that reason I give my twelve-year-old credit for spotting the obvious bias in The Cold War by Britta Bjornlund, which we got from our local library. Reagan was an “aggressive” leader, but Gorbachev gets all the credit for ending the forty-year standoff of East and West. Ms. Bjornlund also has a pet cat named “Trotsky,” so go figure.

But there are some good books for younger readers if you hunt for them. A truly first-rate study is Albert Marrin’s Stalin: Russia’s Man of Steel, which Viking Penguin put out in the late 1980s, and which draws heavily on the work of scholars like Robert Conquest — the British historian who was one of the first to tell western readers about the full scope of Russia’s mass murders. I’d recommend Marrin’s work for older readers as well. It provides an accurate and unflinching portrayal of the USSR and the man who came to rule it.

Slavery: East and West

Guest commentary by Postmodern Conservative.

Islam expert Robert Spencer, writing in First Things, observes that it goes unacknowledged that “Christian principles played” a big role in the abolition of slavery in the West, which was “an enterprise unprecedented in the annals of human history.” By contrast,

Slavery was taken for granted throughout Islamic history, as it was, of course, in the West as well up until relatively recent times. Yet while the European and American slave trade get lavish attention from historians… the Islamic slave trade actually lasted longer and brought suffering to a larger number of people…. There is evidence that slavery still continues beneath the surface in some majority-Muslim countries as well… (“Slavery, Christianity, and Islam“).

"John Adams"

Regarding the HBO mini-series, John Adams, I have two comments:

1. I never got used to the idea of Paul Giamatti as John Adams. Giamatti simply doesn’t look the part, and he never seemed to be comfortable with the hybrid English-Yankee accent chosen for his character.

2. The mini-series was geared to viewers who are largely ignorant of American history. Why else would the script writers have kept injecting trite dialog to “establish” well-known facts? The final episode, for example, included cumbersome reminders that John Quincy Adams (John’s eldest son) also served as president (1825-29), and that both John Adams and Thomas Jefferson died on the Fourth of July 1826, the fiftieth anniversary of the Declaration of Independence.