Philosophical Musings: Part V

Desiderata as beliefs.


How many things does a human being believe because he wants to believe them, and not because there is compelling evidence to support his beliefs? Here is a small sample of what must be an extremely long list:

  • There is a God. (1a)

  • There is no God. (1b)

  • There is a Heaven. (2a)

  • There is no Heaven. (2b)

  • Jesus Christ was the Son of God. (3a)

  • Jesus Christ, if he existed, was a mere mortal. (3b)

  • Marriage is the eternal union, blessed by God, of one man and one woman. (4a)

  • Marriage is a civil union, authorized by the state, of one or more consenting adults (or not) of any gender, as the participants in the marriage so define themselves to be. (4b)

  • All human beings should have equal rights under the law, and those rights should encompass both negative rights (e.g., not to be murdered or defrauded) and positive rights (e.g., to benefit from government-granted promotions, college admissions, and other people’s money). (5a)

  • Human beings are, at bottom, feral animals and cannot therefore be expected to abide always by artificial constructs, such as equal rights under the law. Accordingly, there will always be persons who use the law (or merely brute force) to set themselves above other persons. (5b)

  • The rise in global temperatures over the past 170 years has been caused primarily by a greater concentration of carbon dioxide in the atmosphere, an increase that has been caused by human activity – and especially by the burning of fossil fuels. This rise, if it isn’t brought under control, will make human existence far less bearable and prosperous than it has been in recent human history. (6a)

  • The rise in global temperatures over the past 170 years has not been uniform across the globe, and has not been in lockstep with the rise in the concentration of atmospheric carbon dioxide. The temperatures of recent decades, and the rate at which they are supposed to have risen, are not unprecedented in the long view of Earth’s history, and may therefore be due to conditions that have not been given adequate consideration by believers in anthropogenic global warming (e.g., natural shifts in ocean currents that have different effects on various regions of Earth, the effects of cosmic radiation on cloud formation as influenced by solar activity and the position of the solar system and the galaxy with respect to other objects in the universe, the shifting of Earth’s magnetic field, and the movement of Earth’s tectonic plates and its molten core). In any event, the models of climate change have been falsified against measured temperatures (even when the temperature record has been adjusted to support the models). And predictions of catastrophe do not take into account the beneficial effects of warming (e.g., lower mortality rates, longer growing seasons), whatever causes it, or the ability of technology to compensate for undesirable effects at a much lower cost than the economic catastrophe that would result from preemptive reductions in the use of fossil fuels. (6b)

Not one of those assertions, even the ones that seem to be supported by facts, is true beyond a reasonable doubt. I happen to believe 1a (with some significant qualifications about the nature of God), 2b, 3b (given my qualified version of 1a), a modified version of 4a (monogamous, heterosexual marriage is socially and economically preferable, regardless of its divine blessing or lack thereof), 5a (but only with negative rights) and 5b, and 6b.  But I cannot “prove” that any of my beliefs is the correct one, nor should anyone believe that anyone can “prove” such things.

Take the belief that all persons are created equal. No one who has eyes, ears, and a minimally functioning brain believes that all persons are created equal, though they may (if they are law-abiding) deserve equal treatment under the law (restricted to the enforcement of their negative rights).

Abraham Lincoln, the Great Emancipator, didn’t believe that all persons are created equal:

On September 18, 1858, at Charleston, Illinois, Lincoln told the assembled audience:

I am not, nor ever have been, in favor of bringing about in any way the social and political equality of the white and black races, that I am not, nor ever have been, in favor of making voters or jurors of negroes, nor of qualifying them to hold office, nor to intermarry with white people; and I will say in addition to this that there is a physical difference between the white and black races which I believe will forever forbid the two races living together on terms of social and political equality … I will add to this that I have never seen, to my knowledge, a man, woman, or child who was in favor of producing a perfect equality, social and political, between negroes and white men….

This was before Lincoln was elected president and before the outbreak of the Civil War, but Lincoln’s speeches, writings, and actions after these events continued to reflect this point of view about race and equality.

African American abolitionist Frederick Douglass, for his part, remained very skeptical about Lincoln’s intentions and program, even after the president issued a preliminary emancipation proclamation in September 1862.

Douglass had good reason to mistrust Lincoln. On December 1, 1862, one month before the scheduled issuing of an Emancipation Proclamation, the president offered the Confederacy another chance to return to the Union and preserve slavery for the foreseeable future. In his annual message to Congress, Lincoln recommended a constitutional amendment, which, if it had passed, would have been the Thirteenth Amendment to the Constitution.

The amendment proposed gradual emancipation that would not be completed for another thirty-seven years, taking slavery in the United States into the twentieth century; compensation, not for the enslaved, but for the slaveholder; and the expulsion, supposedly voluntary but essentially a new Trail of Tears, of formerly enslaved Africans to the Caribbean, Central America, and Africa….

Douglass’ suspicions about Lincoln’s motives and actions once again proved to be legitimate. On December 8, 1863, less than a month after the Gettysburg Address, Abraham Lincoln offered full pardons to Confederates in a Proclamation of Amnesty and Reconstruction that has come to be known as the 10 Percent Plan.

Self-rule in the South would be restored when 10 percent of the “qualified” voters according to “the election law of the state existing immediately before the so-called act of secession” pledged loyalty to the Union. Since blacks could not vote in these states in 1860, this was not to be government of the people, by the people, for the people, as promised in the Gettysburg Address, but a return to white rule.

It is unnecessary, though satisfying, to read Charles Murray’s account in Human Diversity of the broad range of inherent differences in intelligence and other traits that are associated with the sexes, various genetic groups of geographic origin (sub-Saharan Africans, East Asians, etc.), and various ethnic groups (e.g., Ashkenazi Jews).

But even if all persons are not created equal, either mentally or physically, aren’t they equal under the law? If you believe that, you might just as well believe in the tooth fairy. As it says in 5b,

Human beings are, at bottom, feral animals and cannot therefore be expected to abide always by artificial constructs, such as equal rights under the law. Accordingly, there will always be persons who use the law (or merely brute force) to set themselves above other persons.

Yes, it’s only a hypothesis, but one for which there is ample evidence in the history of mankind. It is confirmed by every instance of theft, murder, armed aggression, scorched-earth warfare, mob violence as catharsis, bribery, election fraud, gratuitous cruelty, and so on into the night.

And yet, human beings (Americans especially, it seems) persist in believing tooth-fairy stories about the inevitable triumph of good over evil, self-correcting science, and the emergence of truth from the marketplace of ideas. Balderdash, all of it.

But desiderata become beliefs. And beliefs are what bind people – or make enemies of them.

The Meaning of the Red Ripple

Fasten your seat belts and get ready for a hard landing.

Like many other conservatives, I expected much more from the mid-term election than (perhaps) a slim majority in the House of Representatives. I said this (optimistically) in “The Bitter Fruits of America’s Disintegration”:

Is there hope for an American renaissance? The upcoming mid-term election will be pivotal but not conclusive. It will be a very good thing if the GOP regains control of Congress. But it will take more than that to restore sanity to the land.

A Republican (of the right kind) must win in 2024. The GOP majority in Congress must be enlarged. A purge of the deep state must follow, and it must scour every nook and cranny of the central government to remove every bureaucrat who has a leftist agenda and the ability to thwart the administration’s initiatives.

Beyond that, the American people should be rewarded for their (aggregate) return to sanity by the elimination of several burdensome (and unconstitutional) departments of the executive branch, by the appointment of dozens of pro-constitutional judges, and by the appointment of a string of pro-constitutional justices of the Supreme Court.

After that, the rest will take care of itself: Renewed economic vitality, a military whose might deters our enemies, and something like the restoration of sanity in cultural matters. (Bandwagon effects are powerful, and they can go uphill as well as downhill.)

But all of that is hope. The restoration of America’s greatness will not be easy or without acrimony and setbacks.

If America’s greatness isn’t restored, America will become a vassal state. And the leftists who made it possible will be the first victims of their new masters.

The election was pivotal, but not in the way that I expected it to be. It marked the end of hope for a restoration of sanity. All that happened is that some Red States got Redder, some Blue States got Bluer, and the loony left in the House gained several new members. The latter development is probably the best indication of what will happen in the coming elections.

If there was no “Red wave” this year, it’s unlikely that there’ll be one in the future. What’s more likely is that the election of 2024 will return a Democrat (not Biden) to the White House and Democrat control of Congress will be restored. From there, expect the following:

  • “Wokeness” will be in the saddle.

  • Influential institutions (Big Tech, the media, the academy, public “education”, and most government bureaucracies) will be more than ever dominated by the left.

  • Violent crime and the coddling of criminals will continue apace, or get worse.

  • There will be more suppression of conservative views through electronic censorship, financial blackmail, and selective enforcement of laws.

  • The insanity of replacing reliable and cheap fossil fuels with unreliable and therefore expensive “renewables” will continue with a vengeance.

  • The regulatory-welfare state will control more and more of the economy.

  • Inflation will continue to wreak economic havoc until price controls make things worse by further disincentivizing productive capital investments and entrepreneurship.

  • Defense spending will continue to be well below what is required to deter America’s enemies and protect Americans’ overseas interests.

In short, the decline of America — social, economic, and military — will continue. Our enemies will be able to dictate the terms on which America may survive — economically and politically. There will be no Chamberlain-esque moment of surrender; it will just happen gradually and with official approval.

The only hope for (some) Americans is a national divorce. But with the left in the saddle, that is no more likely to happen than was the emancipation of slaves by peaceful means.

I hope I’m wrong again, but I fear that I am right.

Philosophical Musings: Part IV

Irrational rationality.


I discussed Type 1 and Type 2 thinking in the previous entry. Type 2 thinking — deliberate reasoning — has two main branches: scientific and scientistic.

The scientific branch leads (often in roundabout ways) to improvements in the lot of mankind: better and more abundant food, better clothing, better shelter, faster and more comfortable means of transportation, better sanitation, a better understanding of diseases and more effective means of combating them, and on and on.

You might protest that not all of those things, and perhaps only a minority of them, emanated from formal scientific endeavors conducted by holders of Ph.D. and M.D. degrees working out of pristine laboratories or with delicate equipment. But science is much more than that. Science includes learning by doing, which encompasses everything from the concoction of effective home remedies to the hybridization of crops to the invention and refinement of planes, trains, and automobiles – and, needless to say, to the creation and development of much of the electronic technology and related software with which we are “blessed” today.

The scientific branch yields its fruits because it is based on facts about the so-called material universe. The essence of the universe may be unknown and unknowable, as discussed in an earlier entry, but it manifests itself in observable and often predictable ways.

The scientific branch, in sum, is inductive at its core. Observations of specific phenomena lead to guesses about the causes of those phenomena or the relationships between them. The guesses are codified as hypotheses, often in mathematical form. The hypotheses are tested against new observations of the same kinds of phenomena. If the hypotheses are found wanting, they are either rejected outright or modified to take into account the new observations. Revised hypotheses are then tested against newer observations, and so on. (There is nothing scientific about testing a new hypothesis against the observations that led to it; that is a scientistic trick used by, among others, climate “scientists” who wish to align their models with historical climate data.)
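To make that loop concrete, here is a minimal sketch in Python (my illustration, not anything from the entries above; the data, the model names, and the numbers are all assumptions). It fits two hypotheses, one modest and one lavishly flexible, to a batch of observations, then scores both against the observations that produced them and against new observations of the same kind. The lavish hypothesis can be made to agree almost perfectly with the data that suggested it, which is exactly why that kind of agreement is no test at all.

    import numpy as np
    from numpy.polynomial import Polynomial

    rng = np.random.default_rng(1)

    def observe(n):
        """A hypothetical phenomenon: y depends roughly linearly on x, plus noise."""
        x = rng.uniform(0.0, 10.0, n)
        y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, n)
        return x, y

    def mean_abs_error(model, x, y):
        """Average disagreement between a fitted hypothesis and observations."""
        return float(np.mean(np.abs(y - model(x))))

    old_x, old_y = observe(12)                # the observations that suggested the hypotheses
    modest = Polynomial.fit(old_x, old_y, 1)  # a straight line: few knobs to twiddle
    lavish = Polynomial.fit(old_x, old_y, 9)  # a ninth-degree curve: bends to fit the old data

    new_x, new_y = observe(12)                # NEW observations of the same kind of phenomenon

    for name, model in (("modest", modest), ("lavish", lavish)):
        print(name,
              "| error on the old observations:", round(mean_abs_error(model, old_x, old_y), 2),
              "| error on new observations:", round(mean_abs_error(model, new_x, new_y), 2))

    # The lavish hypothesis agrees beautifully with the observations that produced it
    # (the trick described above) and, typically, badly with observations it has never
    # seen.  Only the second number is a test.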

If new observations are found to comport with a new hypothesis, the hypothesis is said to be confirmed. Confirmed doesn’t mean proven; it just means not disproved. Lay persons — and a lot of scientists, apparently — mistake confirmation, in the scientific sense, for proof. There is no such thing in science.

The scientistic branch of Type 2 thinking is deductive. It assumes general truths and then draws conclusions from those assumptions; for example:

  • All Cretans are liars, according to Epimenides (a Cretan who lived ca. 600 BC).

  • Epimenides was a Cretan.

  • Therefore, Epimenides was a liar.

But if Epimenides was lying when he said it, then not all Cretans are liars, and the first premise is false. The syllogism is valid in form, yet it establishes nothing: the flaw in the argument is the unprovable (and self-undermining) generalization that all Cretans are liars.
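For readers who want the self-reference spelled out, here is the argument in compact form (a sketch only, reading “liar” in the strong sense of someone whose every assertion is false):

    \[
    \begin{aligned}
    &P:\ \forall x\,\bigl(\mathrm{Cretan}(x) \rightarrow \mathrm{Liar}(x)\bigr)\quad \text{(asserted by Epimenides)}\\
    &Q:\ \mathrm{Cretan}(\mathrm{Epimenides})\\
    &P \wedge Q \;\Rightarrow\; \mathrm{Liar}(\mathrm{Epimenides}) \;\Rightarrow\; \text{his assertions are all false} \;\Rightarrow\; \neg P.
    \end{aligned}
    \]

So P cannot be true: at least one Cretan is not a liar, and the syllogism collapses along with its premise.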

The syllogism illustrates the fatuousness of deductive reasoning, that is, reasoning which proceeds from general statements that cannot be disproven (falsified).

Though deductive reasoning can be useful in contriving hypotheses, it cannot be used to “prove” anything. But there are persons who claim to be scientists, or who claim to “believe” science, who do reason deductively. It starts when a hypothesis that has been advanced by a scientist becomes an article of faith to that scientist, to a group of scientists, or to non-scientists who use their belief to justify political positions – which they purport to be “scientific” or “science-based”.

There is no more “science” in such positions than there is in the belief that the Sun revolves around the Earth or that all persons are created equal. The Sun may seem to revolve around the Earth if one’s perspective is limited to the relative motions of Sun and Earth and anchored in the implicit assumption that Earth’s position is fixed. All persons may be deemed equal in a narrow and arbitrary way — as in the legal doctrine of equal treatment under the law — but that hardly makes all persons equal in every respect; for example, in intelligence, physical strength, athletic ability, attractiveness to the opposite sex, work ethic, conditions of birth, or proneness to various ailments. (I will say more about equality as a non-scientific desideratum in the next entry.)

This isn’t to say that some scientific hypotheses — and their implications — can’t be relied upon. If they couldn’t be, humans wouldn’t have benefited from the many things mentioned earlier in this post — and much more. But confirmed hypotheses can be relied upon because they are based on observed phenomena, tested in the acid of use, and — most important — employed with ample safeguards, which still may be inadequate to real-world conditions. Despite the best efforts of physicists, chemists, and engineers, airplanes crash, bridges collapse, and so on, because there is never enough knowledge to foresee all of the conditions that might arise in the real world.

This isn’t to say that human beings would be better off without science. Far from it. Science and its practical applications have made us far better off than we would be without them. But neither scientists nor those who apply the (tentative) findings of science are infallible.

Economists and Voting

Extreme economism on tap.

It’s the time of year when economists like to remind the unwashed that voting is a waste of time. And right on schedule there’s “Sorry, But Your Vote Doesn’t Count” by Pierre Lemieux, writing at EconLog. A classic of the genre appeared 17 years ago, in the form of “Why Vote?” by Stephen J. Dubner and Steven D. Levitt (of Freakonomics fame). Here are some relevant passages:

The odds that your vote will actually affect the outcome of a given election are very, very, very slim. This was documented by the economists Casey Mulligan and Charles Hunter, who analyzed more than 56,000 Congressional and state-legislative elections since 1898. For all the attention paid in the media to close elections, it turns out that they are exceedingly rare. The median margin of victory in the Congressional elections was 22 percent; in the state-legislature elections, it was 25 percent. Even in the closest elections, it is almost never the case that a single vote is pivotal. Of the more than 40,000 elections for state legislator that Mulligan and Hunter analyzed, comprising nearly 1 billion votes, only 7 elections were decided by a single vote, with 2 others tied. Of the more than 16,000 Congressional elections, in which many more people vote, only one election in the past 100 years – a 1910 race in Buffalo – was decided by a single vote….

Still, people do continue to vote, in the millions. Why? Here are three possibilities:

1. Perhaps we are just not very bright and therefore wrongly believe that our votes will affect the outcome.

2. Perhaps we vote in the same spirit in which we buy lottery tickets. After all, your chances of winning a lottery and of affecting an election are pretty similar. From a financial perspective, playing the lottery is a bad investment. But it’s fun and relatively cheap: for the price of a ticket, you buy the right to fantasize how you’d spend the winnings – much as you get to fantasize that your vote will have some impact on policy.

3. Perhaps we have been socialized into the voting-as-civic-duty idea, believing that it’s a good thing for society if people vote, even if it’s not particularly good for the individual. And thus we feel guilty for not voting. [The New York Times Magazine, November 6, 2005]

In true economistic fashion, Dubner and Levitt omit a key reason for voting: It makes a person feel good. Even if one’s vote will not change the outcome of an election, one attains a degree of satisfaction from taking an official (even if secret) stand in favor of or in opposition to a certain candidate, bond issue, or other issue on a ballot.

Dubner and Levitt (and their ilk) seem to inhabit a world in which a thing is not worth doing unless the payoff can be measured with some precision and compared with other, similarly quantifiable, uses of one’s time and money. I doubt that they govern their own lives accordingly. If they do, they must be missing out on a lot of life’s pleasures: sex and ice cream, to name only two.
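For what it’s worth, here is the sort of back-of-the-envelope arithmetic that economists of this stripe have in mind: a sketch of my own in Python, not Dubner and Levitt’s figures. The electorate size, the odds, and the dollar stake are all assumptions. It computes the chance that one vote breaks an exact tie under a simple binomial model, and the expected payoff that follows.

    from math import exp, lgamma, log

    def log_prob_tie(n_others, p_support):
        """Log-probability that the other voters split exactly evenly (so that your
        vote decides the outcome), assuming each of n_others voters independently
        supports your side with probability p_support.  n_others is taken to be even."""
        k = n_others // 2
        log_comb = lgamma(n_others + 1) - 2 * lgamma(k + 1)   # ln C(n_others, k)
        return log_comb + k * log(p_support) + k * log(1 - p_support)

    n_others = 100_000     # assumed size of the rest of the electorate
    p_support = 0.51       # assumed (slightly lopsided) support for your side
    stake = 1_000_000.0    # assumed dollar value, to you, of your side winning

    p_pivotal = exp(log_prob_tie(n_others, p_support))
    print("chance your vote is decisive: about 1 in", format(1 / p_pivotal, ",.0f"))
    print("expected payoff of voting: $", format(stake * p_pivotal, ".8f"))

    # On this reckoning the expected payoff is a minuscule fraction of a cent, which is
    # precisely the calculation that leaves out the satisfaction of voting itself.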

Their article continues on a different tack:

But wait a minute, you say. If everyone thought about voting the way economists do, we might have no elections at all. No voter goes to the polls actually believing that her single vote will affect the outcome, does she? And isn’t it cruel to even suggest that her vote is not worth casting?

This is indeed a slippery slope – the seemingly meaningless behavior of an individual, which, in aggregate, becomes quite meaningful. Here’s a similar example in reverse. Imagine that you and your 8-year-old daughter are taking a walk through a botanical garden when she suddenly pulls a bright blossom off a tree.

“You shouldn’t do that,” you find yourself saying.

“Why not?” she asks.

“Well,” you reason, “because if everyone picked one, there wouldn’t be any flowers left at all.”

“Yeah, but everybody isn’t picking them,” she says with a look. “Only me.”

Clever, what? Too clever by half. This argument overlooks the powerful effect of exemplary behavior — where “exemplary”, as used here, does not imply “laudable”. By Dubner and Levitt’s account, allowing a vandal to deface a public building would not encourage other vandals to do the same thing, and would not lead to the widespread defacement of buildings and other anti-social acts. (I refer, of course, to James Q. Wilson’s Broken Windows Theory, on which Levitt and Dubner tried to cast doubt in Freakonomics. They wrongly suggested that the onset of legalized abortion was instrumental in the reduction of crime rates.)

Dubner and Levitt’s argument also overlooks the key fact that when economists preach against voting, they are not just preaching to themselves. Dubner and Levitt’s sermon appeared in the pages of one of the country’s most widely read and influential publications. It was not addressed to an individual person, but to thousands upon thousands of persons. And I doubt that they would have objected if the article had appeared in every newspaper and magazine in the country. In effect, the Dubner-Levitt argument is not just an argument that the marginal vote makes little difference — it is advice to millions of Americans that they should abstain from voting.

That’s paradoxical advice. Abstention by millions of Americans could very well make a difference in the outcome of an election. The tendency to abstain might, in a particular election, be disproportionate to party affiliation. That’s why political campaigns try to counter apathy by whipping up enthusiasm. For example, Democrats might be able to pare their losses in the coming election if they can convince enough pro-Democrat voters that defeat isn’t inevitable, and that their votes will make a difference.

In any event, Levitt, Dubner, and their ilk are guilty of paternalism as well as economism.


Related posts:

The Rationality Fallacy

Externalities and Statism

Extreme Economism

Irrational Rationality

Not-So-Random Thoughts (III) (third item)

Obesity and Statism

“Libertarian Paternalism” Revisited


Philosophical Musings: Part III

Survival and thinking.


It’s true that instinctive (or impulsive) actions can be foolish, dangerous, and deadly. But they can also be beneficial. If, in your peripheral vision, you see an object hurtling toward you at high speed, you don’t deliberately compute its trajectory and decide whether to move out of its path. No, your brain does that for you without your having to “think” about it. And if your brain works quickly enough, you will have moved out of the object’s path before you would have finished “thinking” about what to do.

In sum, you (your brain) engaged in Type 1 thinking about the problem at hand and resolved it quickly. If you had engaged in deliberate, Type 2 thinking, you might have been killed by the impact of the object that was hurtling toward you.

The distinction that I’m making here is one that Daniel Kahneman labors over in Thinking, Fast and Slow. But I won’t bore you with the details of that boring book. Life is too short, and certainly shorter for me than for most of you. Let’s just say that there’s nothing especially meritorious about Type 2 thinking, and that it can lead to actions that are as foolish, dangerous, and deadly as those that result from “instinct”.

I will go further and say that Type 2 thinking has brought Americans to the brink of bankruptcy, serfdom, and civil war. But to understand why I say that, you will have to follow this series to its bitter-sweet ending.

*     *     *

If the need to survive ever had anything to do with the advancement of human intelligence and knowledge, that day is long past for most human beings in “developed” nations.

Type 1 thinking is restricted mainly to combat, competitive sports, operating motorized equipment, playing video games, and reacting to photos of Donald Trump or Joe Biden. It is the key to survival in a narrow range of activities aside from combat, such as driving on a busy highway, ducking when a lethal projectile is headed your way, and instinctively avoiding persons whose actions or appearance seem menacing. The erosion of the avoidance instinct is due in part to the cosseted lives that most Westerners (and Japanese) lead, and in part to the barrage of propaganda that denies differences in the behavior of various classes, races, and ethnic groups. (Thus, for example, disruptive black children aren’t to be ejected from classrooms unless an equal proportion of white children, disruptive or not, is likewise ejected.)

Type 2 thinking of the kind that might advance useful knowledge and its beneficial application is a specialty of the educated, intermarrying elite – a class that dominates academia and the applied sciences (e.g., medicine, medical research, and the various fields of engineering). The same class also dominates the media (including so-called entertainment), “technology” companies (most of which don’t really produce technology), the upper echelons of major corporations, and the upper echelons of government.

But, aside from academicians and professionals whose work advances practical knowledge (how to build a better mousetrap, a more earthquake-resistant building, a less collapsible bridge, or an effective vaccine), the members of the aforementioned class have nothing on the yeomen who become skilled in sundry trades (construction, plumbing, electrical work) by the heuristic method — learning and improving by doing. That, too, is Type 2 thinking (though it often incorporates the sudden insights yielded by Type 1 thinking). But practical knowledge accumulates over years and is tested in the acid of use, unlike the kind of Type 2 thinking that produces intricate but wildly inaccurate climate models, which their designers believe in and defend because they are emotional human beings, like all of us.

Type 2 thinking, despite the stereotype that it is deliberate and dispassionate, is riddled with emotion. Emotion isn’t just rage, lust, and the like. Those are superficial manifestations of the thing that drives us all: egoism.

No matter how you slice it, everything that a person does deliberately — including Type 2 thinking — is done to bolster his own sense of well-being. Altruism is merely the act of doing good for others so that one may feel better about oneself. You cannot be another person, and actually feel what another person is experiencing. You can only be a person whose sense of self is invested in loving another person or being thought of as loving mankind — whatever that means.

Type 2 thinking — the Enlightenment’s exalted “reason” — is both an aid to survival and a hindrance to it. It is an aid in ways such as those mentioned above, that is, in the advancement of practical knowledge to defeat disease, move people faster and more safely, build dwellings that will stand up against the elements, and so on.

It is a hindrance when, as Shakespeare’s Hamlet says, “the native hue of resolution Is sicklied o’er with the pale cast of thought”. Type 1 thinking causes us to smite an enemy. Type 2 thinking causes us to believe, quite wrongly, that by sparing an enemy we somehow become a law-abiding exemplar whose forbearance diminishes the level of violence in the world and the likelihood that violence will be visited upon us in the future.

Neville Chamberlain exemplified Type 2 thinking when he settled for Hitler’s empty promise of peace instead of gearing up to fight an inevitable war. Lyndon Johnson exemplified Type 2 thinking in his vacillating prosecution of the war in Vietnam, where he was more concerned with “world opinion” (whatever that is) and “public opinion” (i.e., the bleating of pundits and protestors) than he was with the real job of the commander-in-chief, which is to fight and win or not fight at all. George H.W. Bush exemplified Type 2 thinking when he declined to depose Saddam Hussein in 1991. Barack Obama exemplified Type 2 thinking when he made a costly deal with Iran’s ayatollahs that profited them greatly for an easily betrayed promise to refrain from the development of nuclear weapons. Type 2 thinking of the kind exemplified by Chamberlain, Johnson, Bush, and Obama is egoistic and delusional: It reflects and justifies the thinker’s inner view of the world as he wants it to be, not the world as it is.

Type 2 thinking is valuable to the survival of humanity when it passes the acid test of use. It is a danger to the survival of humanity when it arises from a worldview that excludes the facts of life. One of those facts of life is that predators exist and must be killed or somehow (and usually at greater expense) neutralized.

To be continued in Part IV.

Philosophical Musings: Part II

Evolution or devolution?


Evolution is simply change in organic (living) objects. Evolution, as a subject of scientific inquiry, is an attempt to explain how humans (and other animals) came to be what they are today.

Evolution (as a discipline) is as much scientism as it is science. Scientism, according to thefreedictionary.com, is “the uncritical application of scientific or quasi-scientific methods to inappropriate fields of study or investigation.” When scientists proclaim truths instead of propounding hypotheses, they are guilty of practicing scientism. Two notable scientistic scientists are Richard Dawkins and Peter Singer. It is unsurprising that Dawkins and Singer are practitioners of scientism. Both are strident atheists, and strident atheists merely practice a “religion” of their own. They have neither logic nor science nor evidence on their side.

Dawkins, Singer, and many other scientistic atheists share an especially “religious” view of evolution. In brief, they seem to believe that evolution rules out God. Evolution rules out nothing. Evolution may be true in outline, but it does not bear close inspection. On that point, I turn to David Gelernter’s “Giving Up Darwin” (Claremont Review of Books, Spring 2019):

Darwin himself had reservations about his theory, shared by some of the most important biologists of his time. And the problems that worried him have only grown more substantial over the decades. In the famous “Cambrian explosion” of around half a billion years ago, a striking variety of new organisms—including the first-ever animals—pop up suddenly in the fossil record over a mere 70-odd million years. This great outburst followed many hundreds of millions of years of slow growth and scanty fossils, mainly of single-celled organisms, dating back to the origins of life roughly three and a half billion years ago.

Darwin’s theory predicts that new life forms evolve gradually from old ones in a constantly branching, spreading tree of life. Those brave new Cambrian creatures must therefore have had Precambrian predecessors, similar but not quite as fancy and sophisticated. They could not have all blown out suddenly, like a bunch of geysers. Each must have had a closely related predecessor, which must have had its own predecessors: Darwinian evolution is gradual, step-by-step. All those predecessors must have come together, further back, into a series of branches leading down to the (long ago) trunk.

But those predecessors of the Cambrian creatures are missing. Darwin himself was disturbed by their absence from the fossil record. He believed they would turn up eventually. Some of his contemporaries (such as the eminent Harvard biologist Louis Agassiz) held that the fossil record was clear enough already, and showed that Darwin’s theory was wrong. Perhaps only a few sites had been searched for fossils, but they had been searched straight down. The Cambrian explosion had been unearthed, and beneath those Cambrian creatures their Precambrian predecessors should have been waiting—and weren’t. In fact, the fossil record as a whole lacked the upward-branching structure Darwin predicted.

The trunk was supposed to branch into many different species, each species giving rise to many genera, and towards the top of the tree you would find so much diversity that you could distinguish separate phyla—the large divisions (sponges, mosses, mollusks, chordates, and so on) that comprise the kingdoms of animals, plants, and several others—take your pick. But, as [David] Berlinski points out, the fossil record shows the opposite: “representatives of separate phyla appearing first followed by lower-level diversification on those basic themes.” In general, “most species enter the evolutionary order fully formed and then depart unchanged.” The incremental development of new species is largely not there. Those missing pre-Cambrian organisms have still not turned up. (Although fossils are subject to interpretation, and some biologists place pre-Cambrian life-forms closer than others to the new-fangled Cambrian creatures.)

Some researchers have guessed that those missing Precambrian precursors were too small or too soft-bodied to have made good fossils. Meyer notes that fossil traces of ancient bacteria and single-celled algae have been discovered: smallness per se doesn’t mean that an organism can’t leave fossil traces—although the existence of fossils depends on the surroundings in which the organism lived, and the history of the relevant rock during the ages since it died. The story is similar for soft-bodied organisms. Hard-bodied forms are more likely to be fossilized than soft-bodied ones, but many fossils of soft-bodied organisms and body parts do exist. Precambrian fossil deposits have been discovered in which tiny, soft-bodied embryo sponges are preserved—but no predecessors to the celebrity organisms of the Cambrian explosion.

This sort of negative evidence can’t ever be conclusive. But the ever-expanding fossil archives don’t look good for Darwin, who made clear and concrete predictions that have (so far) been falsified—according to many reputable paleontologists, anyway. When does the clock run out on those predictions? Never. But any thoughtful person must ask himself whether scientists today are looking for evidence that bears on Darwin, or looking to explain away evidence that contradicts him. There are some of each. Scientists are only human, and their thinking (like everyone else’s) is colored by emotion.

Yes, emotion, the thing that colors thought. Emotion is something that humans and other animals have. If Darwin and his successors are correct, emotion must be a faculty that improves the survival and reproductive fitness of a species.

But that can’t be true because emotion is the spark that lights murder, genocide, and war. World War II, alone, is said to have occasioned the deaths of more than one-hundred million humans. Prominent among those killed were six million Ashkenazi Jews, members of a distinctive branch of humanity whose members (on average) are significantly more intelligent than other branches, and who have contributed beneficially to science, literature, and the arts (especially music).

The evil by-products of emotion – such as the near-extermination of peoples (Ashkenazi Jews among them) – should cause one to doubt that the persistence of a trait in the human population means that the trait is beneficial to survival and reproduction.

David Berlinski, in The Devil’s Delusion: Atheism and Its Scientific Pretensions, addresses the lack of evidence for evolution before striking down the notion that persistent traits are necessarily beneficial:

At the very beginning of his treatise Vertebrate Paleontology and Evolution, Robert Carroll observes quite correctly that “most of the fossil record does not support a strictly gradualistic account” of evolution. A “strictly gradualistic” account is precisely what Darwin’s theory demands: It is the heart and soul of the theory….

In a research survey published in 2001, and widely ignored thereafter, the evolutionary biologist Joel Kingsolver reported that in sample sizes of more than one thousand individuals, there was virtually no correlation between specific biological traits and either reproductive success or survival. “Important issues about selection,” he remarked with some understatement, “remain unresolved.”

Of those important issues, I would mention prominently the question whether natural selection exists at all.

Computer simulations of Darwinian evolution fail when they are honest and succeed only when they are not. Thomas Ray has for years been conducting computer experiments in an artificial environment that he has designated Tierra. Within this world, a shifting population of computer organisms meet, mate, mutate, and reproduce.

Sandra Blakeslee, writing for The New York Times, reported the results under the headline “Computer ‘Life Form’ Mutates in an Evolution Experiment: Natural Selection Is Found at Work in a Digital World.”

Natural selection found at work? I suppose so, for as Blakeslee observes with solemn incomprehension, “the creatures mutated but showed only modest increases in complexity.” Which is to say, they showed nothing of interest at all. This is natural selection at work, but it is hardly work that has worked to intended effect.

What these computer experiments do reveal is a principle far more penetrating than any that Darwin ever offered: There is a sucker born every minute….

“Contemporary biology,” [Daniel Dennett] writes, “has demonstrated beyond all reasonable doubt that natural selection— the process in which reproducing entities must compete for finite resources and thereby engage in a tournament of blind trial and error from which improvements automatically emerge— has the power to generate breathtakingly ingenious designs” (italics added).

These remarks are typical in their self-enchanted self-confidence. Nothing in the physical sciences, it goes without saying— right?— has been demonstrated beyond all reasonable doubt. The phrase belongs to a court of law. The thesis that improvements in life appear automatically represents nothing more than Dennett’s conviction that living systems are like elevators: If their buttons are pushed, they go up. Or down, as the case may be. Although Darwin’s theory is very often compared favorably to the great theories of mathematical physics on the grounds that evolution is as well established as gravity, very few physicists have been heard observing that gravity is as well established as evolution. They know better and they are not stupid….

The greater part of the debate over Darwin’s theory is not in service to the facts. Nor to the theory. The facts are what they have always been: They are unforthcoming. And the theory is what it always was: It is unpersuasive. Among evolutionary biologists, these matters are well known. In the privacy of the Susan B. Anthony faculty lounge, they often tell one another with relief that it is a very good thing the public has no idea what the research literature really suggests.

“Darwin?” a Nobel laureate in biology once remarked to me over his bifocals. “That’s just the party line.”

In the summer of 2007, Eugene Koonin, of the National Center for Biotechnology Information at the National Institutes of Health, published a paper entitled “The Biological Big Bang Model for the Major Transitions in Evolution.”

The paper is refreshing in its candor; it is alarming in its consequences. “Major transitions in biological evolution,” Koonin writes, “show the same pattern of sudden emergence of diverse forms at a new level of complexity” (italics added). Major transitions in biological evolution? These are precisely the transitions that Darwin’s theory was intended to explain. If those “major transitions” represent a “sudden emergence of new forms,” the obvious conclusion to draw is not that nature is perverse but that Darwin was wrong….

Koonin is hardly finished. He has just started to warm up. “In each of these pivotal nexuses in life’s history,” he goes on to say, “the principal ‘types’ seem to appear rapidly and fully equipped with the signature features of the respective new level of biological organization. No intermediate ‘grades’ or intermediate forms between different types are detectable.”…

[H]is views are simply part of a much more serious pattern of intellectual discontent with Darwinian doctrine. Writing in the 1960s and 1970s, the Japanese mathematical biologist Motoo Kimura argued that on the genetic level— the place where mutations take place— most changes are selectively neutral. They do nothing to help an organism survive; they may even be deleterious…. Kimura was perfectly aware that he was advancing a powerful argument against Darwin’s theory of natural selection. “The neutral theory asserts,” he wrote in the introduction to his masterpiece, The Neutral Theory of Molecular Evolution, “that the great majority of evolutionary changes at the molecular level, as revealed by comparative studies of protein and DNA sequences, are caused not by Darwinian selection but by random drift of selectively neutral or nearly neutral mutations” (italics added)….

Writing in the Proceedings of the National Academy of Sciences, the evolutionary biologist Michael Lynch observed that “Dawkins’s agenda has been to spread the word on the awesome power of natural selection.” The view that results, Lynch remarks, is incomplete and therefore “profoundly misleading.” Lest there be any question about Lynch’s critique, he makes the point explicitly: “What is in question is whether natural selection is a necessary or sufficient force to explain the emergence of the genomic and cellular features central to the building of complex organisms.”

Survival and reproduction depend on many traits. A particular trait, considered in isolation, may seem to be helpful to the survival and reproduction of a group. But that trait may not be among the particular collection of traits that is most conducive to the group’s survival and reproduction. If that is the case, the trait will become less prevalent.

Alternatively, if the trait is an essential member of the collection that is conducive to survival and reproduction, it will survive. But its survival depends on the other traits. The fact that X is a “good trait” does not, in itself, ensure the proliferation of X. And X will become less prevalent if other traits become more important to survival and reproduction.

In any event, it is my view that genetic fitness for survival has become almost irrelevant in places like North America, Europe, and Japan. The rise of technology and the “social safety net” (state-enforced pseudo-empathy) have enabled the survival and reproduction of traits that would have dwindled in times past.

In fact, there is a supportable hypothesis that humans in cosseted realms (i.e., the West) are, on average, becoming less intelligent. But, first, it is necessary to explain why it seemed for a while that humans were becoming more intelligent.

David Robson is on the case:

When the researcher James Flynn looked at [IQ] scores over the past century, he discovered a steady increase – the equivalent of around three points a decade. Today, that has amounted to 30 points in some countries.

Although the cause of the Flynn effect is still a matter of debate, it must be due to multiple environmental factors rather than a genetic shift.

Perhaps the best comparison is our change in height: we are 11cm (around 5 inches) taller today than in the 19th Century, for instance – but that doesn’t mean our genes have changed; it just means our overall health has changed.

Indeed, some of the same factors may underlie both shifts. Improved medicine, reducing the prevalence of childhood infections, and more nutritious diets, should have helped our bodies to grow taller and our brains to grow smarter, for instance. Some have posited that the increase in IQ might also be due to a reduction of the lead in petrol, which may have stunted cognitive development in the past. The cleaner our fuels, the smarter we became.

This is unlikely to be the complete picture, however, since our societies have also seen enormous shifts in our intellectual environment, which may now train abstract thinking and reasoning from a young age. In education, for instance, most children are taught to think in terms of abstract categories (whether animals are mammals or reptiles, for instance). We also lean on increasingly abstract thinking to cope with modern technology. Just think about a computer and all the symbols you have to recognise and manipulate to do even the simplest task. Growing up immersed in this kind of thinking should allow everyone [hyperbole alert] to cultivate the skills needed to perform well in an IQ test….

[Psychologist Robert Sternberg] is not alone in questioning whether the Flynn effect really represented a profound improvement in our intellectual capacity, however. James Flynn himself has argued that it is probably confined to some specific reasoning skills. In the same way that different physical exercises may build different muscles – without increasing overall “fitness” – we have been exercising certain kinds of abstract thinking, but that hasn’t necessarily improved all cognitive skills equally. And some of those other, less well-cultivated, abilities could be essential for improving the world in the future.

Here comes the best part:

You might assume that the more intelligent you are, the more rational you are, but it’s not quite this simple. While a higher IQ correlates with skills such as numeracy, which is essential to understanding probabilities and weighing up risks, there are still many elements of rational decision making that cannot be accounted for by a lack of intelligence.

Consider the abundant literature on our cognitive biases. Something that is presented as “95% fat-free” sounds healthier than “5% fat”, for instance – a phenomenon known as the framing bias. It is now clear that a high IQ does little to help you avoid this kind of flaw, meaning that even the smartest people can be swayed by misleading messages.

People with high IQs are also just as susceptible to the confirmation bias – our tendency to only consider the information that supports our pre-existing opinions, while ignoring facts that might contradict our views. That’s a serious issue when we start talking about things like politics.

Nor can a high IQ protect you from the sunk cost bias – the tendency to throw more resources into a failing project, even if it would be better to cut your losses – a serious issue in any business. (This was, famously, the bias that led the British and French governments to continue funding Concorde planes, despite increasing evidence that it would be a commercial disaster.)

Highly intelligent people are also not much better at tests of “temporal discounting”, which require you to forgo short-term gains for greater long-term benefits. That’s essential, if you want to ensure your comfort for the future.

Besides a resistance to these kinds of biases, there are also more general critical thinking skills – such as the capacity to challenge your assumptions, identify missing information, and look for alternative explanations for events before drawing conclusions. These are crucial to good thinking, but they do not correlate very strongly with IQ, and do not necessarily come with higher education. One study in the USA found almost no improvement in critical thinking throughout many people’s degrees.

Given these looser correlations, it would make sense that the rise in IQs has not been accompanied by a similarly miraculous improvement in all kinds of decision making.

So much for the bright people who promote and pledge allegiance to socialism and its various manifestations (e.g., the Green New Deal, and Medicare for All). So much for the bright people who suppress speech with which they disagree because it threatens the groupthink that binds them.

Robson also discusses evidence of dysgenic effects in IQ:

Whatever the cause of the Flynn effect, there is evidence that we may have already reached the end of this era – with the rise in IQs stalling and even reversing. If you look at Finland, Norway and Denmark, for instance, the turning point appears to have occurred in the mid-90s, after which average IQs dropped by around 0.2 points a year. That would amount to a seven-point difference between generations.
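(The arithmetic behind the quoted “seven-point difference” presumably assumes a generation of about 35 years: 0.2 points a year × 35 years = 7 points.)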

Psychologist (and intelligence specialist) James Thompson has addressed dysgenic effects at his blog on the website of The Unz Review. In particular, he had a lot to say about the work of an intelligence researcher named Michael Woodley. Here’s a sample from a post by Thompson:

We keep hearing that people are getting brighter, at least as measured by IQ tests. This improvement, called the Flynn Effect, suggests that each generation is brighter than the previous one. This might be due to improved living standards as reflected in better food, better health services, better schools and perhaps, according to some, because of the influence of the internet and computer games. In fact, these improvements in intelligence seem to have been going on for almost a century, and even extend to babies not in school. If this apparent improvement in intelligence is real we should all be much, much brighter than the Victorians.

Although IQ tests are good at picking out the brightest, they are not so good at providing a benchmark of performance. They can show you how you perform relative to people of your age, but because of cultural changes relating to the sorts of problems we have to solve, they are not designed to compare you across different decades with say, your grandparents.

Is there no way to measure changes in intelligence over time on some absolute scale using an instrument that does not change its properties? In the Special Issue on the Flynn Effect of the journal Intelligence Drs Michael Woodley (UK), Jan te Nijenhuis (the Netherlands) and Raegan Murphy (Ireland) have taken a novel approach in answering this question. It has long been known that simple reaction time is faster in brighter people. Reaction times are a reasonable predictor of general intelligence. These researchers have looked back at average reaction times since 1889 and their findings, based on a meta-analysis of 14 studies, are very sobering.

It seems that, far from speeding up, we are slowing down. We now take longer to solve this very simple reaction time “problem”.  This straightforward benchmark suggests that we are getting duller, not brighter. The loss is equivalent to about 14 IQ points since Victorian times.

So, we are duller than the Victorians on this unchanging measure of intelligence. Although our living standards have improved, our minds apparently have not. What has gone wrong?

From a later post by Thompson:

The Flynn Effect co-exists with the Woodley Effect. Since roughly 1870 the Flynn Effect has been stronger, at an apparent 3 points per decade. The Woodley effect is weaker, at very roughly 1 point per decade. Think of Flynn as the soil fertilizer effect and Woodley as the plant genetics effect. The fertilizer effect seems to be fading away in rich countries, while continuing in poor countries, though not as fast as one would desire. The genetic effect seems to show a persistent gradual fall in underlying ability.

Woodley’s claim is based on a set of papers written since 2013, which have been recently reviewed by [Matthew] Sarraf.

The review is unusual, to say the least. It is rare to read so positive a judgment on a young researcher’s work, and it is extraordinary that one researcher has changed the debate about ability levels across generations, and all this in a few years since starting publishing in psychology.

The table in that review which summarizes the main findings is shown below. As you can see, the range of effects is very variable, so my rough estimate of 1 point per decade is a stab at calculating a median. It is certainly less than the Flynn Effect in the 20th Century, though it may now be part of the reason for the falling of that effect, now often referred to as a “negative Flynn effect”….

Here are the findings which I have arranged by generational decline (taken as 25 years).

  • Colour acuity, over 20 years (0.8 generation) 3.5 drop/decade.

  • 3D rotation ability, over 37 years (1.5 generations) 4.8 drop/decade.

  • Reaction times, females only, over 40 years (1.6 generations) 1.8 drop/decade.

  • Working memory, over 85 years (3.4 generations) 0.16 drop/decade.

  • Reaction times, over 120 years (4.8 generations) 0.57-1.21 drop/decade.

  • Fluctuating asymmetry, over 160 years (6.4 generations) 0.16 drop/decade.

Either the measures are considerably different, and do not tap the same underlying loss of mental ability, or the drop is unlikely to be caused by dysgenic decrements from one generation to another. Bar massive dying out of populations, changes do not come about so fast from one generation to the next. The drops in ability are real, but the reason for the falls are less clear. Gathering more data sets would probably clarify the picture, and there is certainly cause to argue that on various real measures there have been drops in ability. Whether this is dysgenics or some other insidious cause is not yet clear to me.…

My view is that whereas formerly the debate was only about the apparent rise in ability, discussions are now about the co-occurrence of two trends: the slowing down of the environmental gains and the apparent loss of genetic quality. In the way that James Flynn identified an environmental/cultural effect, Michael Woodley has identified a possible genetic effect, and certainly shown that on some measures we are doing less well than our ancestors.

How will they be reconciled? Time will tell, but here is a prediction. I think that the Flynn effect will fade in wealthy countries, persist with fading effect in poor countries, and that the Woodley effect will continue, though I do not know the cause of it.

Here’s my hypothesis: The less-intelligent portions of the populace are breeding faster than the more-intelligent portions. As I said earlier, the rise of technology and the “social safety net” (state-enforced pseudo-empathy) have enabled the survival and reproduction of traits that would have dwindled in times past.

Evolution — in the absence of challenges that ensure survival of the fittest — seems to result in devolution.

Philosophical Musings: Part I

Time, existence, and science.

In this post I use the academic “we”, as opposed to the royal “we” and the politician’s presumptuous “we”.


Before we can consider time and existence, we must consider whether they are illusions.

Regarding time, there’s a reasonable view that nothing exists but the present — the now — or, rather, an infinite number of nows. In the conventional view, one now succeeds another, which creates the illusion of the passage of time. A problem with the conventional view of time is that not everyone perceives the same now. The compilation of a comprehensive now is a practical impossibility, though it could be done in theory. (Einstein’s theories of relativity notwithstanding, the Lorentz transformation restores simultaneity.)

In the view of some physicists, however, all nows exist at once, and we merely perceive sequential slices of all nows. A problem with the view that all nows exist at once (known as the many-worlds view) is that it’s purely a mathematical concoction. Inasmuch as there seems to be general agreement as to the contents of the slice, the only (very weak) evidence that many nows exist in parallel is provided by claims about such phenomena as clairvoyance, visions, and co-location. I won’t wander into that thicket.

What distinguishes one now from another now? The answer is change. If things didn’t change, there would be only a now, not an infinite series of them. More precisely, if things didn’t seem to change, time would seem to stand still. This is another way of saying that a succession of nows creates the illusion of the passage of time.

What happens between one now and the next now? Change, not the passage of time. What we think of as the passage of time is really an artifact of change.

Time is really nothing more than the awareness of events that supposedly occur at set intervals — the “ticking” of an atomic clock, for example. I say supposedly because there’s no absolute measure of time against which one can calibrate the “ticking” of an atomic clock, or any other kind of clock.

In summary: Clocks don’t measure time, which is an illusion caused by change. Clocks merely change (“tick”) at supposedly regular intervals, and those intervals are used in the representation of other things, such as the speed of an automobile or the duration of a 100-yard dash.

Change is real. But change in what — of what does reality consist?

There are two basic views of reality. One of them, posited by Bishop Berkeley and his followers, is that the only reality is that which goes on in one’s own mind. But that’s just another way of saying that humans don’t perceive the external world directly. Rather, it is perceived second-hand, through the senses that detect external phenomena and transmit signals to the brain, which is where a person’s “reality” is formed.

The sensible view, held by most humans (even most scientists), is that there is an objective reality out there, beyond the confines of one’s mind. How can so many people agree about the existence of certain things (e.g., Cleveland) if there’s not something out there? Mass psychosis, perhaps? No, because that arises from a desire to believe that a thing exists. Cleveland is real because it is and has been actually experienced by myriad persons. The widespread belief in catastrophic “climate change” is a kind of mass psychosis, triggered by a combination of scientific malfeasance; greed (for research grants and notoriety); politicians’ and bureaucrats’ naïveté and power-lust; and laypersons’ naïveté, virtue-signaling, and conformity to peers’ beliefs.

The big question is how reality came into being. This has been debated for millennia. There are two main schools of thought:

  • Things just exist and have always existed.

  • Things can’t come into existence on their own, so some non-thing must have caused things to exist. The non-thing must necessarily have always existed apart from things; that is, it is timeless and immaterial.

How can the issue be resolved? It can’t be resolved by logic alone, though logic is on the side of the second option. It can’t be resolved by facts because facts are about perceptible things. If it could be resolved by facts, there would be wide agreement about the answer. (Not perfect agreement because many human beings are impervious to facts.)

In sum, existence is a profound mystery.

Why is that? Can’t scientists someday trace the existence of things – call it the universe – back to a source? Isn’t that what the Big Bang Theory is all about? No and no. If the universe has always existed, there’s no source to be tracked down. And if the universe was created by a non-thing, how can scientists detect the non-thing if they’re only equipped to deal with things?

The Big Bang Theory posits a definite beginning, at a more or less definite point in time. But even if the theory is correct, it doesn’t tell us how that beginning began. Did things start from scratch, and if they did, what caused them to do so? And maybe they didn’t; maybe the Big Bang was just the result of the collapse of a previous universe, which was the result of a previous one, etc., etc., etc., ad infinitum. But that gets back to the question of what started it all.

Some scientists who think about such things don’t believe that the universe was created by a non-thing. But they don’t believe it because they don’t want to believe it. The much smaller number of similar scientists who believe that the universe was created by a non-thing hold that belief because they want to hold it, and because logic is on their side.

That’s life in the world of science, just as it is in the world of non-science, where believers, non-believers, and those who can’t make up their minds find all kinds of ways in which to rationalize what they believe (or don’t believe), even though they know less than scientists do about the universe.

Let’s just accept that and move on to another big question: What is it that exists?  It’s not “stuff” as we usually think of it – like mud or sand or water droplets. It’s not even atoms and their constituent particles. Those are just convenient abstractions for what seem to be various manifestations of electromagnetic forces, or emanations thereof, such as light.

But what are electromagnetic forces? And what does their behavior (to be anthropomorphic about it) have to do with the way that things like planets, stars, and galaxies move in relation to one another? Those are more big questions that probably won’t be answered, or at least not answered definitively.

That’s the thing about science: It’s a process, not a particular result. Human understanding of the universe offers a good example. Here’s a short list of beliefs about the universe that were considered true and then rejected:

  • Thales (c. 620 – c. 530 BC): The Earth rests on water.

  • Anaximenes (c. 540 – c. 475 BC): Everything is made of air.

  • Heraclitus (c. 540 – c. 450 BC): All is fire.

  • Empedocles (c. 493 – c. 435 BC): There are four elements: earth, air, fire, and water.

  • Democritus (c. 460 – c. 370 BC): Atoms (basic elements of nature) come in an infinite variety of shapes and sizes.

  • Aristotle (384 – 322 BC): Heavy objects must fall faster than light ones. The universe is a series of crystalline spheres that carry the sun, moon, planets, and stars around Earth.

  • Ptolemy (90 – 168 AD): Ditto the Earth-centric universe, with a mathematical description.

  • Copernicus (1473 – 1543): The planets revolve around the sun in perfectly circular orbits.

  • Brahe (1546 – 1601): The planets revolve around the sun, but the sun and moon revolve around Earth.

  • Kepler (1571 – 1630): The planets revolve around the sun in elliptical orbits, and their trajectory is governed by magnetism.

  • Newton (1642 – 1727): The course of the planets around the sun is determined by gravity, which is a force that acts at a distance. Light consists of corpuscles; ordinary matter is made of larger corpuscles. Space and time are absolute and uniform.

  • Rutherford (1871 – 1937), Bohr (1885 – 1962), and others: The atom has a center (nucleus), which consists of two elemental particles, the neutron and proton.

  • Einstein (1879 – 1955): The universe is neither expanding nor shrinking.

That’s just a small fraction of the mistaken and incomplete theories that have held sway in the field of physics. There are many more such mistakes and lacunae in the other natural sciences: biology, chemistry, and earth science — each of which, like physics, has many branches. And in all of the branches there are many unresolved questions. For example, the Standard Model of particle physics, despite its complexity, is known to be incomplete. And it is thought (by some) to be unduly complex; that is, there may be a simpler underlying structure waiting to be discovered.

Given all of this, it is grossly presumptuous to claim that climate science – to take a salient example — is “settled” when the phenomena that it encompasses are so varied, complex, often poorly understood, and often given short shrift (e.g., the effects of solar radiation on the intensity of cosmic radiation reaching Earth, which affects low-level cloud formation, which affects atmospheric temperature and precipitation).

Anyone who says that any aspect of science is “settled” is either ignorant, stupid, or freighted with a political agenda. Anyone who says that “science is real” is merely parroting an empty slogan.

Matt Ridley (quoted by Judith Curry) explains:

In a lecture at Cornell University in 1964, the physicist Richard Feynman defined the scientific method. First, you guess, he said, to a ripple of laughter. Then you compute the consequences of your guess. Then you compare those consequences with the evidence from observations or experiments. “If [your guess] disagrees with experiment, it’s wrong. In that simple statement is the key to science. It does not make a difference how beautiful the guess is, how smart you are, who made the guess or what his name is…it’s wrong….

In general, science is much better at telling you about the past and the present than the future. As Philip Tetlock of the University of Pennsylvania and others have shown, forecasting economic, meteorological or epidemiological events more than a short time ahead continues to prove frustratingly hard, and experts are sometimes worse at it than amateurs, because they overemphasize their pet causal theories….

Peer review is supposed to be the device that guides us away from unreliable heretics. Investigations show that peer review is often perfunctory rather than thorough; often exploited by chums to help each other; and frequently used by gatekeepers to exclude and extinguish legitimate minority scientific opinions in a field.

Herbert Ayres, an expert in operations research, summarized the problem well several decades ago: “As a referee of a paper that threatens to disrupt his life, [a professor] is in a conflict-of-interest position, pure and simple. Unless we’re convinced that he, we, and all our friends who referee have integrity in the upper fifth percentile of those who have so far qualified for sainthood, it is beyond naive to believe that censorship does not occur.” Rosalyn Yalow, winner of the Nobel Prize in medicine, was fond of displaying the letter she received in 1955 from the Journal of Clinical Investigation noting that the reviewers were “particularly emphatic in rejecting” her paper.

The health of science depends on tolerating, even encouraging, at least some disagreement. In practice, science is prevented from turning into religion not by asking scientists to challenge their own theories but by getting them to challenge each other, sometimes with gusto.

As I said, there is no such thing as “settled science”. Real science is a vast realm of unsettled uncertainty. Newton put it thus:

I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.

Certainty is the last refuge of a person whose mind is closed to new facts and new ways of looking at old facts.

How uncertain is the real world, especially the world of events yet to come? Consider a simple, three-event model in which event C depends on the occurrence of event B, which depends on the occurrence of event A; in which the value of the outcome is the summation of the values of the events that occur; and in which the value of each event is binary – a value of 1 if it happens, 0 if it doesn’t happen. Even in a simple model like that, there is a wide range of possible outcomes; thus:

  • A doesn’t occur (B and C therefore don’t occur) = 0.

  • A occurs but B fails to occur (and C therefore doesn’t occur) = 1.

  • A occurs, B occurs, but C fails to occur = 2.

  • A occurs, B occurs, and C occurs = 3.

Even when A occurs, subsequent events (or non-events) will yield final outcomes ranging in value from 1 to 3. A factor of 3 is a big deal. It’s why .300 hitters make millions of dollars a year and .100 hitters sell used cars.
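
A minimal sketch of that model makes the enumeration above explicit; the only structure assumed is that each event can occur only if its predecessor occurred.

```python
from itertools import product

# Enumerate the three-event model: C depends on B, B depends on A, each event
# is worth 1 if it occurs and 0 if it doesn't, and the outcome is the sum.
outcomes = set()
for a, b, c in product([0, 1], repeat=3):
    b = b if a else 0   # B can occur only if A occurred
    c = c if b else 0   # C can occur only if B occurred
    outcomes.add(a + b + c)

print(sorted(outcomes))   # [0, 1, 2, 3]
```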

Let’s leave it at that and move on.

The White House Brochures on Climate Change

Down the memory hole — well, not quite.

BACKGROUND

Roy Spencer posted this at Roy Spencer, Ph.D. on January 8, 2021:

White House Brochures on Climate (There is no climate crisis)

January 8th, 2021 by Roy W. Spencer, Ph. D.

Late last year, several of us were asked by David Legates (White House Office of Science and Technology Policy) to write short, easily understandable brochures that supported the general view that there is no climate crisis or climate emergency, and pointing out the widespread misinformation being promoted by alarmists through the media.

Below are the resulting 9 brochures, and an introduction by David. Mine is entitled, “The Faith-Based Nature of Human Caused Global Warming”.

David hopes to be able to get these posted on the White House website by January 20 (I presume so they will become a part of the outgoing Administration’s record) but there is no guarantee given recent events.

He said we are free to disseminate them widely. I list them in no particular order. We all thank David for taking on a difficult job in more hostile territory than you might imagine.

Introduction (Dr. David Legates)

The Sun Climate Connection (Drs. Michael Connolly, Ronan Connolly, Willie Soon)

Systematic Problems in the Four National Assessments of Climate Change Impacts on the US (Dr. Patrick Michaels)

Record Temperatures in the United States (Dr. John Christy)

Radiation Transfer (Dr. William Happer)

Is There a Climate Emergency? (Dr. Ross McKitrick)

Hurricanes and Climate Change (Dr. Ryan Maue)

Climate, Climate Change, and the General Circulation (Dr. Anthony Lupo)

Can Computer Models Predict Climate? (Dr. Christopher Essex)

The Faith-Based Nature of Human-Caused Global Warming (Dr. Roy Spencer)

Spencer followed up with this post on January 12, 2021:

At the White House, the Purge of Skeptics Has Started

January 12th, 2021 by Roy W. Spencer, Ph. D.

Dr. David Legates has been Fired by White House OSTP Director and Trump Science Advisor, Kelvin Droegemeier

[Image of the seal of the Executive Office of the President]

President Donald Trump has been sympathetic with the climate skeptics’ position, which is that there is no climate crisis, and that all currently proposed solutions to the “crisis” are economically harmful to the U.S. specifically, and to humanity in general.

Today I have learned that Dr. David Legates, who had been brought to the Office of Science and Technology Policy to represent the skeptical position in the Trump Administration, has been fired by OSTP Director and Trump Science Advisor, Dr. Kelvin Droegemeier.

The event that likely precipitated this is the invitation by Dr. Legates for about a dozen of us to write brochures that we all had hoped would become part of the official records of the Trump White House. We produced those brochures (no funding was involved), and they were formatted and published by OSTP, but not placed on the WH website. My understanding is that David Legates followed protocols during this process.

So What Happened?

What follows is my opinion. I believe that Droegemeier (like many in the administration with hopes of maintaining a bureaucratic career in the new Biden Administration) has turned against the President for political purposes and professional gain. If Kelvin Droegemeier wishes to dispute this, let him… and let’s see who the new Science Advisor/OSTP Director is in the new (Biden) Administration.

I would also like to know if President Trump approved of his decision to fire Legates.

In the meantime, we have been told to remove links to the brochures, which is the prerogative of the OSTP Director since they have the White House seal on them.

But their content will live on elsewhere, as will Dr. Droegemeier’s decision.

I have saved the ten brochures in their original (.pdf) format. The following links to the files are listed in the order in which Dr. Spencer listed them in his post of January 8, 2021:

Introduction

The Sun Climate Connection

Systematic Problems in the Four National Assessments of Climate Change Impacts on the US

Record Temperatures in the United States

Radiation Transfer

Is There a Climate Emergency?

Hurricanes and Climate Change

Climate, Climate Change, and the General Circulation

Can Computer Models Predict Climate?

The Faith-Based Nature of Human-Caused Global Warming

U.S. Supreme Court: Lines of Succession

From the beginning to the present.

Though there are now only nine seats on the U.S. Supreme Court, the tables below list eleven lines of succession. There is one for the chief justiceship and ten for the associate justiceships that Congress has created at one time or another as it has changed the size of the Court. In other words, two associate justiceships have “died out” in the course of the Court’s history. The present members of the Court, in addition to the chief justice, hold the first, second, third, fourth, sixth, eighth, ninth, and tenth associate justiceships created by Congress.

Reading across, there is a column for each president, a column for each chief justice, and columns for the ten associate justiceships. Justices who have held each seat are listed in chronological order, beginning with the justices nominated by the heroic George Washington and ending with the justices nominated by the lying, fear-mongering, chameleon-like Joe Whatshisname.

There are two horizontal divisions. The first, indicated by double red lines, delineates presidencies. The beginning of every justice’s term is associated with the president who nominated that person to a seat on the Court. The end of each justice’s term is associated with the president who was in office when the justice’s term ended by resignation or death.

The second horizontal division, indicated by alternating bands of gray and white, delineates chief justiceships. Thus the reader can see which justices served with a particular chief justice. The “Roberts Court”, for example, has thus far included Roberts and — in order of ascension to the Court — Stevens, O’Connor, Scalia, Kennedy, Souter, Thomas, Ginsburg, Breyer, Alito, Sotomayor, Kagan, Gorsuch, Kavanaugh, Barrett, and Jackson.

Because there is a separate line of succession for the chief justiceship, persons who were already on the Court and then elevated to the chief justiceship are listed in two different places. Also, the names of a few other justices appear in more than one place because they served non-consecutive terms on the Court.

The table is divided into three parts for ease of reading. (Zoom in if the type is too small for you.) Part I covers the chief justiceship (currently Roberts) and associate justice positions 1-3 (currently Sotomayor, Jackson, and Kavanaugh). Part II covers associate justice positions 4-7 (currently Kagan, 4; Barrett, 6). Part III covers associate justice positions 8-10 (currently Alito, Gorsuch, and Thomas).

Part I

Part II

Part III

Looking for Something?

I have moved many posts and pages to my new blog, Loquitur’s Letter. You may find what you’re looking for at my list of favorite posts.

Reflections on Aging

It’s better than the alternative — so far.

Aging is of interest to me because I am among the oldest five percent of Americans. Not that I feel old — I don’t — but objectively I am old.

I am probably also among the more solitary of Americans. But I am not lonely in my solitude, for it is and long has been of my own choosing.

This is so because of my strong introversion. I suppose that the seeds of my introversion are genetic, but the symptoms didn’t appear in earnest until I was in my early thirties. After that I became steadily more focused on a few friendships (which eventually dwindled to none) and decidedly uninterested in the aspects of work that required more than brief meetings (one-on-one preferred). Finally, enough became more than enough and I quit full-time work at the age of fifty-six. There followed, a few years later, a stint of part-time work that also became more than enough. And so, at the age of fifty-nine, I banked my final paycheck. Happily.

What does my introversion have to do with my aging? I suspected that my continued withdrawal from social intercourse (more about that below) might be a symptom of aging. And I found this, in the Wikipedia article “Disengagement Theory”:

The disengagement theory of aging states that “aging is an inevitable, mutual withdrawal or disengagement, resulting in decreased interaction between the aging person and others in the social system he belongs to”. The theory claims that it is natural and acceptable for older adults to withdraw from society….

Disengagement theory was formulated by [Elaine] Cumming and [William Earl] Henry in 1961 in the book Growing Old, and it was the first theory of aging that social scientists developed….

The disengagement theory is one of three major psychosocial theories which describe how people develop in old age. The other two major psychosocial theories are the activity theory and the continuity theory, and the disengagement theory [is at] odds with both.

The continuity theory

states that older adults will usually maintain the same activities, behaviors, relationships as they did in their earlier years of life. According to this theory, older adults try to maintain this continuity of lifestyle by adapting strategies that are connected to their past experiences [whatever that means].

I don’t see any conflict between the continuity theory and the disengagement theory. A strong introvert like me, for example, finds it easy to maintain the same activities, behaviors, and relationships in retirement as before it. Which is to say that I had begun minimizing my social interactions before retiring, and continued to do so after retiring.

What about the activity theory? Well, it’s a normative theory, unlike the other two (which are descriptive), and it goes like this:

The activity theory … proposes that successful aging occurs when older adults stay active and maintain social interactions. It takes the view that the aging process is delayed and the quality of life is enhanced when old people remain socially active.

That’s just a social worker’s view of “appropriate” behavior for older persons. Take my word for it, introverts don’t need social activity, which is stressful for them, and they resent those who try to push them into it. The life of the mind is far more rewarding than chit-chat and bingo with geezers.

The life of the mind is certainly more rewarding than “social media”. My use of that peculiar institution was limited to Facebook. And my use of it dwindled from occasional to never a few years ago. And there it will stay.

Anyway, I mentioned my continued withdrawal from social intercourse. A particular, recent instance of withdrawal sparked this post. For about fifteen years I corresponded regularly with a former colleague. He has a malady that I have dubbed email-arrhea: several messages a day (links and jokes, nothing original) to a large mailing list, with many insipid replies from recipients who choose “reply all”. Enough of that finally became too much, and I declared to him my intention to refrain from correspondence until … whenever. (“Don’t call me, I’ll call you.”) So all of his messages and those of his other correspondents were dumped automatically into my email trash folder. He finally got the message, so to speak, and quit transmitting.

My withdrawal from that particular mode of social intercourse was eased by the fact that the correspondent is a “collaborator” with a deep-state mindset. So it was satisfying to terminate our relationship — and to devote more time to things that I enjoy, like blogging.

Daylight Saving Time Doesn't Kill

But “springing forward” does.

It’s almost time to “fall back”, which reminds me of the perennial controversy about daylight saving time (that’s “saving” not “savings”). The main complaint seems to be the stress that results from moving clocks ahead in March:

Springing forward may be hazardous to your health. The Monday following the start of daylight saving time (DST) is a particularly bad one for heart attacks, traffic accidents, workplace injuries and accidental deaths. Now that most Americans have switched their clocks an hour ahead, studies show many will suffer for it.

Most Americans slept about 40 minutes less than normal on Sunday night, according to a 2009 study published in the Journal of Applied Psychology…. Since sleep is important for maintaining the body’s daily performance levels, much of society is broadly feeling the impact of less rest, which can include forgetfulness, impaired memory and a lower sex drive, according to WebMD.

One of the most striking effects of this annual shift: Last year, Colorado researchers reported finding a 25 percent increase in the number of heart attacks that occur on the Monday after DST starts, as compared with a normal Monday…. A cardiologist in Croatia recorded about twice as many heart attacks as expected during that same day, and researchers in Sweden have also witnessed a spike in heart attacks in the week following the time adjustment, particularly among those who were already at risk.

Workplace injuries are more likely to occur on that Monday, too, possibly because workers are more susceptible to a loss of focus due to too little sleep. Researchers at Michigan State University used over 20 years of data from the Mine Safety and Health Administration to determine that three to four more miners than average sustain a work-related injury on the Monday following the start of DST. Those injuries resulted in 2,649 lost days of work, which is a 68 percent increase over the hours lost from injuries on an average day. The team found no effects following the nation’s one-hour shift back to standard time in the fall….

There’s even more bad news: Drivers are more likely to be in a fatal traffic accident on DST’s first Monday, according to a 2001 study in Sleep Medicine. The authors analyzed 21 years of data on fatal traffic accidents in the U.S. and found that, following the start of DST, drivers are in 83.5 accidents as compared with 78.2 on the average Monday. This phenomenon has also been recorded in Canadian drivers and British motorists.

If all that wasn’t enough, a researcher from the University of British Columbia who analyzed three years of data on U.S. fatalities reported that accidental deaths of any kind are more likely in the days following a spring forward. Their 1996 analysis showed a 6.5 percent increase, which meant that about 200 more accidental deaths occurred immediately after the start of DST than would typically occur in a given period of the same length.

I’m convinced. But the solution to the problem isn’t to get rid of DST. No, the solution is to get rid of standard time and use DST year around.

I’m not arguing for year-around DST from an economic standpoint. The evidence about the economic advantages of DST is inconclusive.

I’m arguing for year-around DST as a way to eliminate “spring forward” stress and enjoy an extra hour of daylight in the winter.

Don’t you enjoy those late summer sunsets? I sure do, and a lot of other people seem to enjoy them, too. That’s why daylight saving time won’t be abolished.

But if you love those late summer sunsets, you should also enjoy an extra hour of daylight at the end of a drab winter day. I know that I would. And it’s not as if you’d miss anything if sunrise occurs an hour later in the winter, as it would with DST. Even with standard time, most working people and students have to be up and about before winter sunrise.
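
For concreteness, here is a minimal sketch of the one-hour shift involved. The standard-time sunrise and sunset values are hypothetical examples of mine, not figures from the table mentioned below.

```python
from datetime import datetime, timedelta

def shift_to_dst(time_str: str) -> str:
    """Return a clock time one hour later, i.e., standard time re-expressed as DST."""
    shifted = datetime.strptime(time_str, "%H:%M") + timedelta(hours=1)
    return shifted.strftime("%H:%M")

# Hypothetical shortest-day times for some city under standard time.
winter_standard = {"sunrise": "07:15", "sunset": "17:30"}
winter_dst = {event: shift_to_dst(t) for event, t in winter_standard.items()}
print(winter_dst)   # {'sunrise': '08:15', 'sunset': '18:30'}
```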

How would year-around DST affect you? The following table gives the times of sunrise and sunset on the longest and shortest days for nine major cities, north to south and west to east:

I report, you decide. If it were up to me, the decision would be year-around DST. I hate “spring forward”.

The Bitter Fruits of America's Disintegration

The lunatics are in charge of the asylum, and have set it on fire.

Almost 250 years ago, a relatively small but determined band of revolutionaries overthrew British rule of the colonies that became known as the United States of America. The act of defying the Crown and establishing a new government was an open conspiracy, but it was nevertheless a conspiracy because it arose from “an agreement to perform together … [a] subversive act.” That conspiracy, of course, was the American Revolution.

Now, twelve score and six years since that conspiracy was announced to the world in the Declaration of Independence, the resulting polity — the United States of America — is approaching a crisis that is the result of another conspiracy, which I have described here.

What the conspirators seek is a secular theocracy, in which they are the high priests and theologians. If that reminds you of Mussolini’s Italy, Hitler’s Germany, the USSR, Communist China, and similar regimes, that’s because it’s of the same ilk: leftism.

Leftists have a common trait: wishful thinking. Thomas Sowell calls it the unconstrained vision; I call it the unrealistic vision. It’s also known as magical thinking, in which “ought” becomes “is” and the forces of nature and human nature can be held in abeyance by edict; for example:

  • California wildfires caused by misguided environmentalism.

  • The excremental wasteland that is San Francisco. (And Blue cities, generally, because of the encouragement of homelessness.)

  • The killing of small businesses, especially restaurants, by minimum wage laws.

  • The killing of jobs for people who need them the most, by ditto.

  • Bloated pension schemes for Blue-State (and city) employees, which are bankrupting those States (and cities) and penalizing their citizens who aren’t government employees.

  • The idea that men can become women and should be allowed to compete with women in athletic competitions because the men in question have endured some surgery and taken some drugs.

  • The idea that it doesn’t and shouldn’t matter to anyone that a self-identified “woman” uses women’s rest-rooms where real women and girls become prey for prying eyes and worse.

  • Mass murder on a Hitlerian-Stalinist scale in the name of a “woman’s right to choose”, when she made that choice (in almost every case) by engaging in consensual sex.

  • Disrespect for and attacks on the police and military personnel who keep the spoiled children of capitalism safe in their cosseted existences.

  • The under-representation of women and blacks in certain fields is due to rank discrimination, not genetic differences (but it’s all right if blacks dominate certain sports and women now far outnumber men on college campuses).

  • Peace can be had without preparedness for war.

  • Regulation doesn’t reduce the rate of economic growth or foster “crony capitalism”.

  • The cost of health care will go down while the number of mandates is increased.

  • Every “right” under the sun can be granted without cost (e.g., affirmative action racial-hiring quotas, which penalize blameless whites; the Social Security Ponzi scheme, which burdens today’s workers and cuts into growth-inducing saving).

Closely related to magical thinking is the nirvana fallacy (hypothetical perfection always seems better than feasible reality), large doses of neurotic hysteria (e.g., the overpopulation fears of Paul Ehrlich, the AGW hoax of Al Gore et al.), and rampant adolescent rebelliousness (e.g., instant protests about everything, the post-election tantrum-riots of 2016).

But to say any of the foregoing about the left’s agenda, the assumptions and attitudes underlying it, the left’s strategic and tactical methods, or the psychological underpinnings of leftism, is to be “hateful”. (In my observation, nothing is more full of hate than a leftist who has been contradicted or thwarted.) So, through the magic of psychological projection, those who dare speak the truth about leftism are called “haters”, “racists”, “fascists”, “Nazis”, and other things that apply to leftists themselves.

Labeling anti-leftists as evil “justifies” the left’s violent enforcement of its agenda. The violence takes many forms, from riots (as in the George Floyd “protests”), to suppression by force (e.g., Stalin’s war on the Cossacks), to genocide (e.g., the Holocaust), to overtly peaceful but coercive state action (e.g., forced unionization of American industry, the J6 committee’s Stalinesque “show trial”, suppression of religious liberty and freedom of association in the name of same-sex “marriage”, and the vast accumulation of economic regulations).

In a word: disintegration.

THE “GREATEST GENERATION” AND THE WASP ESTABLISHMENT SET THE STAGE FOR DISINTEGRATION

Every line of human endeavor reaches a peak, from which decline is sure to follow if the things that caused it to peak are mindlessly rejected for the sake of novelty (i.e., rejection of old norms just because they are old). This is nowhere more obvious than in the arts.

I have written elsewhere that 1963 (or thereabouts) was a “year zero” in American history. It was then that the post-World War II promise of social and economic progress, built on a foundation of unity (or as much of it as a heterogeneous nation is likely to muster), began to crumble.

At first, the “adults in the room” forgot their main duty: to be exemplars for the next generation.

As I wrote here,

the world in which we live … seems more and more to resemble the kind of world in which parents have failed in their duty to inculcate in their children the values of honesty, respect, and hard work….

I subscribe to the view that the rot set in after World War II….

The sudden emergence [in the 1960s of “campus rebels”] was due to the failure of too many members of the so-called Greatest Generation to inculcate in their children the values of honesty, respect, and hard work. How does one do that? By being clear about expectations and by setting limits on behavior — limits that are enforced swiftly, unequivocally, and sometimes with the palm of a hand. When children learn that they can “get away” with dishonesty, disrespect, and sloth, guess what? They become dishonest, disrespectful, and slothful. They give vent to their disrespect through whining, tantrum-like behavior, and even violence.

But the rot goes deeper than that. Wallace S. Moyle, writing in The New Criterion (“Facing the Music: On the Decline of the WASP Establishment”, September 2022), takes Averell Harriman as an exemplar of his class:

[H]eir to an immense railroad fortune, a polo champion, and a Groton graduate, Harriman had been the first man in his class tapped for Yale’s Skull and Bones society…. Harriman … founded Brown Brothers Harriman, implemented Roosevelt’s National Industrial Recovery Act, and served as the U.S. ambassador to the Soviet Union. Later, he was elected governor of New York and advised Presidents Kennedy, Johnson, and Carter.

… E. Digby Baltzell, the preeminent exponent of the “Protestant Establishment” and the coiner of the term “wasp,” lamented in the early 1980s that the United States no longer boasted patricians with the moral authority to keep down the McCarthyite rabble, then newly ascendant, in Baltzell’s depressingly conventional opinion, in the figure of Ronald Reagan.

Harriman at the same time deplored Reagan as much as Baltzell did. In 1983, at the age of ninety-one, he traveled to Moscow just to reassure Communist Party General Secretary Yuri Andropov that not all Americans saw the Soviet Union as an “Evil Empire.” Before Reagan emerged, Harriman’s bête noire was Richard Nixon, whose success in exposing communists in the U.S. government was unforgiveable. Mercifully for their poor, quivering souls, neither Baltzell nor Harriman lived long enough to witness the rise of Donald Trump….

Exactly what Harriman and his brethren stood for politically is not easy to discern. Harriman’s politics shifted throughout his life. He voted for Harding in 1920. It was only at the urging of his fashionably engagée sister that he even entered the Roosevelt administration. Harriman went from Cold War hawk in 1946 and champion of George Kennan’s long telegram to dove in the aftermath of Vietnam. Unable to identify what principles wasp leaders stood for, their defenders frequently praise their dedication to what they call “service.”… [Harriman] sniffed at men who “didn’t do a damn thing.” “Service,” “giving back,” and “doing things”: these of course are polite euphemisms for exercising power. If power had to be wielded at all, America’s mid-twentieth-century Wise Men reasoned, it naturally ought to be in their hands.

Their upbringing made that an easy assumption. By the early twentieth century, the American upper class had constructed a cursus honorum as rigid as that faced by any Roman senator’s son….

The system was designed to produce custodians rather than leaders….

Having thrived in a system that rewarded conformity, the final generation of wasp leaders lacked what George H.W. Bush, the acknowledged last of their breed, called “the vision thing.”… When in the mid-1960s a guest expressed support for Vietnam protestors, Harriman denounced her as a traitor. Just a few years later, he was joining them….

From time to time, outsiders would plead with the Protestant Establishment to recover some moral fortitude. In God and Man at Yale, William F. Buckley Jr., a Catholic, warned that Yale was not only failing to uphold what Buckley called “individualism” (i.e., the free-enterprise system) and Christianity, but was also actively undermining them. For his trouble, Yale’s leaders denounced him as a reactionary bigot. McGeorge Bundy—yet another Bonesperson—called the book “dishonest in its use of facts, false in its theory, and a discredit to its author.” As a national security advisor in the Vietnam era, Bundy would go on to become literally the textbook example of policy failure. Later in life, he defended the spread of affirmative action….

[T]he wasps seemed to have erected institutions that uniquely selected for men who, as baseball scouts used to say, looked good in a uniform. Harriman may have earned middling grades and may not have been able to speak a single foreign language, but from adolescence on he looked the part of an ambassador. Harriman thus rose to the top of his Yale class. Thirty years later, he was negotiating with Stalin on behalf of the United States.

The 1960s generation is often blamed for contemporary woes. But it was the last generation of wasps that set in motion the forces that, as Buckley predicted, would lead the United States to ruin. Affirmative action, the tolerance of vagrancy (redubbed “homelessness” in the Lindsay era), the dishonoring of Christianity in public life, living constitutionalism in law, the ever-spreading blight of modern architecture, and the sacking of our cities by criminals: all of these features of the American regime were instituted by wasp patricians. America may have won the Cold War against communism, but within a generation it has fallen to a woke Marxian regime of its own making.

The wasp’s ancestors created the freest, most prosperous nation in history. By the time of the Protestant Establishment’s fading, its luminaries had left a nation ugly, depraved, and enthralled. They received a goodly heritage and squandered it.

THE NEW “ESTABLISHMENT” BECOMES THE ENEMY

The “establishment” has diversified over the years. As the “old boys” of Harriman’s generation died off, they were replaced by new men — and women. What we have is a new “elite” that has found a new way to distinguish itself from the “masses”.

Richard Hanania analyzes the phenomenon:

The American culture war is part of a global trend. The German far right marches against covid restrictions and immigration. In France, Le Pen wins the countryside and gets crushed in urban centers. Throughout the developed world you see the same cleavages opening up, with an educated urban elite that is more likely to support left-wing parties, and an exurban and rural populist backlash that looks strikingly similar across different societies….

  1. Increasing wealth causes class differentiation and segregation. One thing people with money buy is separation from poor people or others not like them, while assortative mating moves these trends along.

  2. With modern communications technology and women playing a larger role in intellectual life, genetic (i.e., true) explanations of class differentiation are disfavored, as is anything that would blame the poor or otherwise unfortunate for their own problems [i.e., leftist condescension].

  3. Despite social desirability bias leading to the triumph of egalitarian ideologies, the natural tendency towards a kind of class consciousness does not go away. The higher class therefore becomes more strenuous in defining itself as aesthetically and morally superior to the lower classes….

  4. The more egalitarian the official ideology, the harder the upper class has to work to find some other grounds on which to differentiate itself from the masses, leading to an exaggeration of the moral differences between the two tribes….

Thence the use of governmental power — directly and indirectly — to impose the left’s ideology on the “masses”. There is a government-corporate-technology-media-academic complex that moves together not just in matters of military spending or foreign policy, but in matters fundamental to the daily lives and livelihoods of Americans — “climate change”, energy policy, gender identity, the definition of marriage, immigration policy, the treatment of criminals, and much more. The approved positions on such matters are leftist, of course, and so the new establishment consists almost entirely of persons, corporations, foundations, and think-tanks that are effectively organs of the Democrat Party.

Thus did the establishment — old and new — allow, encourage, and abet the disintegration of America that is now in full spate.

In the remaining sections of this post I will trace a few of the symptoms and consequences of disintegration: military failure, economic rot, and the rise of pseudo-science in the service of leftist causes. There’s no need for me to say any more about social disintegration, the evidence of which is everywhere to be seen.

MILITARY FAILURE AS A SYMPTOM OF NATIONAL ROT

A critical element of America’s disintegration has been the unalloyed record of military futility and defeat since the end of World War II. No amount of belligerent talk can compensate for the fact that the enemies of America see that — with the exception of the Reagan and Trump years — America’s defense policy is to balk at doing what must be done to win, to disarm at the first hint of “peace”, and then fail to rearm quickly enough to prevent the next war.

The record of futility and fecklessness actually began at the end of World War II when an enfeebled FDR, guided by the Communists in his administration, gave away Eastern Europe to Stalin. The giveaway was unnecessary. The U.S. had been relatively unscathed by the war; the Soviet Union’s losses in life, property, and industrial capacity had been devastating. The U.S. (with Britain) was in a position to dictate to Stalin.

The Korean War was unnecessary, in that it was invited by the Truman administration’s policies: exclusion of Korea from the Asian defense perimeter (announced by another “old boy”) and massive cuts in the U.S. defense budget. But it was essential to defend South Korea so that the powers behind North Korea (Communist China and, by extension, the USSR) would grasp the willingness of the U.S. to maintain a forward defensive posture against aggression. That signal was blunted by Truman’s decision to sack MacArthur when the general persisted in his advocacy of attacking Chinese bases following the entry of China into the war. The end result was a stalemate, where a decisive victory might have broken the back of communistic adventurism around the globe. The Korean War, as it was fought by the U.S., became “a war to foment war”.

Anti-war propaganda disguised as journalism helped to snatch defeat from the jaws of victory in Vietnam. What was shaping up as a successful military campaign collapsed under the weight of the media’s overwrought and erroneous depiction of the Tet offensive as a Vietcong victory (indeed, the offensive was given a “heroic” cast), the bombing of North Vietnam as “barbaric”, and the deaths of American soldiers as somehow “in vain”, though many more deaths a generation earlier had not been in vain. (What a difference there was between Edward R. Murrow and Walter Cronkite and his sycophants.) Unlike in Korea, U.S. forces were withdrawn from Vietnam, and it took little time for North Vietnam to swallow South Vietnam.

The Gulf War of 1990-91 began with Saddam Hussein’s invasion of oil-rich Kuwait. U.S. action to repel the invasion was fully justified by the potential economic effects of Saddam’s capture of Kuwait’s petroleum reserves and oil production. The proper response to Saddam’s aggression would have been not only to defeat the Iraqi army but also to depose Saddam. The failure to do so further reinforced the pattern of compromise and retreat that had begun at the end of World War II, and necessitated the long, contentious Iraq War of the 2000s.

The quick victory in Iraq, coupled with the coincidental end of the Cold War, helped to foster a belief that the peace had been won. (That belief was given an academic imprimatur in Francis Fukuyama’s The End of History and the Last Man.) The stage was set for Clinton’s much-ballyhooed fiscal restraint, which was achieved by cutting the defense budget. Clinton’s lack of resolve in the face of terrorism underscored the evident unwillingness of American “leaders” to defend Americans’ interests, thus inviting 9/11.  (For more about Clinton’s foreign and defense policy, go here and scroll down to the section on Clinton.)

What can be said about the wars in Iraq and Afghanistan of 2001-2021 but that they were conducted in the same spirit as the wars in Korea, Vietnam, and the earlier war in Iraq. Rather than reproduce a long post that I wrote at the mid-point of the futile, post-9/11 wars, I will point you to it: “The War on Terror as It Should Have Been Fought”. Subsequent events — and especially Biden’s disgraceful bugout from Afghanistan — only underscore the main point of that post: Going to war and failing to win only encourages America’s enemies.

The war in Ukraine is a costly sideshow that detracts from the ability of the U.S. to prepare for a real showdown with Russia, China, Iran, and North Korea — a showdown that has been made more likely by the rush to arrange an unnecessary confrontation with Putin. There are, in fact, good reasons to believe that (a) he is actually trying to protect Russia and Russians and (b) he has the facts of history on his side.

The axis of China, Russia, Iran, and North Korea can play the “long game”, which the U.S. and the West demonstrably cannot do because of their political systems and thrall to “public (elite) opinion”. By the time the axis is ready to bring the West to its knees, an outright attack of some kind probably won’t be necessary, as Putin has shown by cutting off vital fuel supplies to western Europe.

The only way to ensure that the U.S. isn’t cowed by the axis is to arm to the teeth, have a leader with moral courage, and dare the axis to harm vital U.S. interests. What is more likely to happen, given America’s present course, is a de facto surrender by the U.S. (and the West) — marked by significant concessions on trade and the scope of military operations and influence.

America — once an impregnable fortress — is on a path to becoming an isolated, subjugated, and exploited colony of the axis.

ECONOMIC ROT

The wisepersons who wrought America’s military decline are of the same breed as those who wrought its economic decline. In the first instance they rushed into wars that they were not willing to see through to victory. In the second instance they rushed into policy-making whose economic consequences they could have foreseen if they hadn’t been preoccupied with “social justice” and similar hogwash.

America’s economic rot can be traced to the early 1900s, when the toll of Progressivism (the original brand) began to be felt. It is no coincidence that a leading Progressive of the time was Teddy Roosevelt, a card-carrying member of the old establishment.

Consider the following graph, which is derived from estimates of constant-dollar GDP per capita that are available here:

There are four eras, as shown by the legend (1942-1946 omitted because of the vast economic distortions caused by World War II):

  • 1866-1907 — annual growth of 2.0 percent — A robust economy, fueled by (mostly) laissez-faire policies and the concomitant rise of industry, mass production, technological innovation, and entrepreneurship.

  • 1908-1941 — annual growth of 1.4 percent — A dispirited economy, shackled by the fruits of “progressivism”; for example, trust-busting; the onset of governance through regulation; the establishment of the income tax; the creation of the destabilizing Federal Reserve; and the New Deal, which prolonged the Great Depression.

  • 1947-2007 — annual growth of 2.2 percent — A rejuvenated economy, buoyed by the end of the New Deal and the fruits of advances in technology and business management. The rebound in the rate of growth meant that the earlier decline wasn’t the result of an “aging” economy, which is an inapt metaphor for a living thing that is constantly replenished with new people, new capital, and new ideas.

  • 2008-2021 — annual growth of 1.0 percent — An economy sagging under the cumulative weight of the fruits of “progressivism” (old and new); for example, the never-ending expansion of Medicare, Medicaid, and Social Security; and an ever-growing mountain of regulatory restrictions on business. (In a similar post, which I published in 2009, I wrote presciently that “[u]nless Obama’s megalomaniacal plans are aborted by a reversal of the Republican Party’s fortunes, the U.S. will enter a new phase of economic growth — something close to stagnation”.)

Had the economy of the U.S. not been deflected from the course that it was on from 1866 to 1907, per capita GDP would now be about 1.4 times its present level. Compare the position of the dashed green line in 2021 — $83,000 — with per capita GDP in that year — $58,000.

If that seems unbelievable to you, it shouldn’t. A growing economy is a kind of compound-interest machine; some of its output is invested in intellectual and physical capital that enables the same number of workers to produce more, better, and more varied products and services. (More workers, of course, will produce even more products and services.) As the experience of 1947-2007 attests, nothing other than government interventions (or a war far more devastating to the U.S. than World War II) could have kept the economy from growing along the path of 1866-1907. (I should add that economic growth in 1947-2007 would have been even greater than it was but for the ever-rising tide of government interventions.)
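
As a rough check on the 1.4 figure, here is a minimal sketch of the compound-growth arithmetic. The 1.7 percent “blended” post-1907 rate is an illustrative assumption of mine, chosen only to show how a modest difference in annual growth compounds over 114 years; it is not taken from the underlying series.

```python
def growth_factor(annual_rate: float, years: int) -> float:
    """Total growth factor from compounding a fixed annual rate for `years` years."""
    return (1 + annual_rate) ** years

years = 2021 - 1907
trend_factor = growth_factor(0.020, years)    # the 1866-1907 trend, continued
actual_factor = growth_factor(0.017, years)   # illustrative blended post-1907 rate

print(round(trend_factor / actual_factor, 2))  # roughly 1.4, the gap cited above
```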

The sum of the annual gaps between what could have been (the dashed green line) and the reality after 1907 (omitting 1942-1946) is almost $700,000 — that’s per person in 2012 dollars. It’s $800,000 per person in 2021 dollars, and even more in 2022 dollars.

That cumulative gap represents our mega-depression.

I have identified the specific causes of the mega-depression elsewhere. They are — unsurprisingly — government spending as a fraction of GDP, government regulatory activity, reductions in private business investment (resulting from the first two items), and the rate of inflation. Based on recent values of those variables, the rate of real GDP growth for the next 10 years will be about -6 percent. Yes, that’s minus 6 percent!

Is such a thing possible in the United States? Yes! The estimates of inflation-adjusted GDP available at the website of the Bureau of Economic Analysis (an official arm of the U.S. government) yield these frightening statistics: Constant-dollar GDP dropped at an annualized rate of -9.3 percent from 1929 to 1932, and at an annualized rate of -7.4 percent from 1929 to 1933.
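
For readers who want to reproduce that kind of figure, here is a minimal sketch of how an annualized rate of change is derived from start- and end-year levels. The GDP values shown are hypothetical placeholders, not the BEA figures cited above.

```python
def annualized_rate(start_value: float, end_value: float, years: int) -> float:
    """Constant annual rate that carries start_value to end_value over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical index values implying a decline of roughly -9.3 percent per year.
gdp_1929, gdp_1932 = 1_000.0, 746.0
print(f"{annualized_rate(gdp_1929, gdp_1932, 3):.1%}")   # about -9.3%
```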

In any event, the outlook is gloomy.

PSEUDO-SCIENCE IN THE SADDLE: SOME EXAMPLES

The Keynesian Multiplier

It is fitting to begin this section with a summary of “The Keynesian Multiplier: Fiction vs. Fact”. When push comes to shove, the advocates of big government (which undermines economic growth) love to spend like drunken sailors (with other people’s money), claiming that such spending will stimulate the economy. And, by extension, they claim (against common sense and statistical evidence) that government spending is economically beneficial, as well as necessary (as long as it’s not for defense).

The Keynesian multiplier is a pseudo-scientific product of the pseudo-science of macroeconomics. It is nothing more than a descriptive equation without operational significance. What it is supposed to mean is that if spending rises by X, the rise in spending will cause GDP to rise by a multiple (k) of X. What it really means is that if the relationship between GDP and spending remains constant, when GDP rises by some amount spending will have necessarily risen by a fraction of that amount. This relationship holds true regardless of the kind of spending under discussion — private investment, private consumption, or government. But proponents of government spending prefer to put “government” in front of “spending”, and then pretend (or uncritically believe) that the causation runs from government spending to GDP and not the other way around.
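
For the record, here is the textbook algebra being critiqued, in a minimal sketch. With consumption taken as a fixed fraction b of GDP and Y = C + G, the identity Y = G/(1 - b) follows; the 0.8 spending fraction below is an illustrative assumption, and nothing in the sketch settles the direction of causation, which is precisely the point at issue.

```python
def implied_gdp(autonomous_spending: float, spending_fraction: float) -> float:
    """GDP implied by the identity Y = G / (1 - b), given G and the fraction b of GDP re-spent."""
    return autonomous_spending / (1 - spending_fraction)

b = 0.8        # illustrative fraction of GDP re-spent as consumption
g = 1_000.0    # illustrative government (autonomous) spending
y = implied_gdp(g, b)
print(y, y / g)   # 5000.0 and the "multiplier" k = 1 / (1 - b) = 5
```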

“Climate Change”

Here is a case of scientists becoming invested in an invalid hypothesis. The hypothesis in question is that atmospheric CO2 is largely responsible for the rise in measured temperatures (averaged “globally”) by about 1.5 degrees Celsius since the middle of the 19th century. The hypothesis has been falsified (i.e., disproved) in so many ways that I have lost count (though one will do). You can read dozens of scientific rebuttals here, and some of my own contributions here, here, and here.

As one writer puts it,

the “science” behind the claim that human carbon emissions are heading us toward some kind of planetary catastrophe is not only not “settled,” but actually non-existent.

None of that matters — so far — because the “climatistas” have brainwashed Western political “leaders”, Western bureaucracies, and the media-information industry, which is doing its damnedest to suppress and discredit “climate deniers” (i.e., people who actually follow the science). The cost of having the “climatistas” in charge has been revealed: soaring fuel prices and freezing Europeans. There’s worse to come if the “climatistas” aren’t ejected from their positions of influence — vast economic destruction and the social disruption that goes with it.

The Response to COVID-19

I’ll start with a Washington Monthly article:

While most countries imposed draconian restrictions, there was an exception: Sweden. Early in the pandemic, Swedish schools and offices closed briefly but then reopened. Restaurants never closed. Businesses stayed open. Kids under 16 went to school.

That stood in contrast to the U.S. By April 2020, the CDC and the National Institutes of Health recommended far-reaching lockdowns that threw millions of Americans out of work. A kind of groupthink set in. In print and on social media, colleagues attacked experts who advocated a less draconian approach. Some received obscene emails and death threats. Within the scientific community, opposition to the dominant narrative was castigated and censored, cutting off what should have been vigorous debate and analysis.

In this intolerant atmosphere, Sweden’s “light touch,” as it is often referred to by scientists and policy makers, was deemed a disaster. “Sweden Has Become the World’s Cautionary Tale,” carped The New York Times. Reuters reported, “Sweden’s COVID Infections Among Highest in Europe, With ‘No Sign Of Decrease.’” Medical journals published equally damning reports of Sweden’s folly.

But Sweden seems to have been right. Countries that took the severe route to stem the virus might want to look at the evidence found in a little-known 2021 report by the Kaiser Family Foundation. The researchers found that among 11 wealthy peer nations, Sweden was the only one with no excess mortality among individuals under 75. None, zero, zip.

That’s not to say that Sweden had no deaths from COVID. It did. But it appears to have avoided the collateral damage that lockdowns wreaked in other countries. The Kaiser study wisely looked at excess mortality, rather than the more commonly used metric of COVID deaths. This means that researchers examined mortality rates from all causes of death in the 11 countries before the pandemic and compared those rates to mortality from all causes during the pandemic. If a country averaged 1 million deaths per year before the pandemic but had 1.3 million deaths in 2020, excess mortality would be 30 percent….

The Kaiser results might seem surprising, but other data have confirmed them. As of February, Our World in Data, a database maintained by the University of Oxford, shows that Sweden continues to have low excess mortality, now slightly lower than Germany, which had strict lockdowns. Another study found no increased mortality in Sweden in those under 70. Most recently, a Swedish commission evaluating the country’s pandemic response determined that although it was slow to protect the elderly and others at heightened risk from COVID in the initial stages, its laissez-faire approach was broadly correct….

One of the most pernicious effects of lockdowns was the loss of social support, which contributed to a dramatic rise in deaths related to alcohol and drug abuse. According to a recent report in the medical journal JAMA, even before the pandemic such “deaths of despair” were already high and rising rapidly in the U.S., but not in other industrialized countries. Lockdowns sent those numbers soaring.

The U.S. response to COVID was the worst of both worlds. Shutting down businesses and closing everything from gyms to nightclubs shielded younger Americans at low risk of COVID but did little to protect the vulnerable. School closures meant chaos for kids and stymied their learning and social development. These effects are widely considered so devastating that they will linger for years to come. While the U.S. was shutting down schools to protect kids, Swedish children were safe even with school doors wide open. According to a 2021 research letter, there wasn’t a single COVID death among Swedish children, despite schools remaining open for children under 16….

Of the potential years of life lost in the U.S., 30 percent were among Blacks and another 31 percent were among Hispanics; both rates are far higher than the demographics’ share of the population. Lockdowns were especially hard on young workers and their families. According to the Kaiser report, among those who died in 2020, people lost an average of 14 years of life in the U.S. versus eight years lost in peer countries. In other words, the young were more likely to die in the U.S. than in other countries, and many of those deaths were likely due to lockdowns rather than COVID.
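The excess-mortality arithmetic described in the passage above is simple enough to spell out. Here is a minimal sketch in Python (illustrative only; the function name and the figures are the hypothetical ones from the quoted example, not actual national data):

    def excess_mortality_pct(baseline_deaths, pandemic_deaths):
        # Excess mortality expressed as a percentage of the pre-pandemic baseline.
        return 100.0 * (pandemic_deaths - baseline_deaths) / baseline_deaths

    # Hypothetical example from the quoted passage:
    # 1 million deaths per year before the pandemic, 1.3 million in 2020.
    print(excess_mortality_pct(1_000_000, 1_300_000))  # prints 30.0

The virtue of excess mortality as a yardstick, as the quoted passage notes, is that it captures deaths from all causes, including the collateral damage of lockdowns, and is not distorted by differences in how countries classify COVID deaths.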

And that isn’t all. There’s also this working paper from the National Bureau of Economic Research, which concludes:

The first estimates of the effects of COVID-19 on the number of business owners from nationally representative April 2020 CPS data indicate dramatic early-stage reductions in small business activity. The number of active business owners in the United States plunged from 15.0 million to 11.7 million over the crucial two-month window from February to April 2020. No other one-, two- or even 12-month window of time has ever shown such a large change in business activity. For comparison, from the start to end of the Great Recession the number of business owners decreased by 730,000 representing only a 5 percent reduction. In general, business ownership is relatively steady over the business cycle (Fairlie 2013; Parker 2018). The loss of 3.3 million business owners (or 22 percent) was comprised of large drops in important subgroups such as owners working roughly two days per week (28 percent), owners working four days a week (31 percent), and incorporated businesses (20 percent).

And that was more than two years ago, before the political panic had spawned a destructive tsunami of draconian measures. Such measures made the pandemic worse by creating the conditions for the evolution of more contagious strains of the coronavirus.

The correct (i.e., scientific) approach would have been to quarantine and care for the most vulnerable members of the populace: the old, those with compromised immune systems, and those with diseases that left them especially vulnerable (heart disease, COPD, morbid obesity, etc.). As for the rest of us, widespread exposure to the coronavirus would have produced natural immunization of the populace through the development of antibodies.

In the end, millions of people have been made poorer, deprived of education and beneficial human interaction, and left to suffer and die needlessly because politicians and bureaucrats couldn’t (and can’t) resist the urge to do something — especially when doing something means trying to conquer nature and suppress human nature.

(For much more on this subject, see David Stockman’s “The Macroeconomic Consequences Of Lockdowns & The Aftermath”, reproduced at ZeroHedge.)

The Wages of Pseudo-Science

The worst thing about fallacies such as the three that I have just discussed isn’t the fact that they are widely accepted, even by scientists (if you can call economics a science). The worst thing is that they have been embraced by politicians and bureaucrats eager to “solve” a “problem”, whether or not it is within their power to solve it. The result is the concoction and enforcement of economically and socially destructive policies. But that matters little to cosseted elites who — like their counterparts in the USSR — can live high on the hog while the masses are starving and freezing.

CODA

Is there hope for an American renaissance? The upcoming mid-term election will be pivotal but not conclusive. It will be a very good thing if the GOP regains control of Congress. But it will take more than that to restore sanity to the land.

A Republican (of the right kind) must win in 2024. The GOP majority in Congress must be enlarged. A purge of the deep state must follow, and it must scour every nook and cranny of the central government to remove every bureaucrat who has a leftist agenda and the ability to thwart the administration’s initiatives.

Beyond that, the American people should be rewarded for their (aggregate) return to sanity by the elimination of several burdensome (and unconstitutional) departments of the executive branch, by the appointment of dozens of pro-constitutional judges, and by the appointment of a string of pro-constitutional justices of the Supreme Court.

After that, the rest will take care of itself: renewed economic vitality, a military whose might deters our enemies, and something like the restoration of sanity in cultural matters. (Bandwagon effects are powerful, and they can go uphill as well as downhill.)

But all of that is hope. The restoration of America’s greatness will not be easy or without acrimony and setbacks.

If America’s greatness isn’t restored, America will become a vassal state. And the leftists who made it possible will be the first victims of their new masters.