Peak Civilization?

Here is an oft-quoted observation, spuriously attributed to Socrates, about youth:

The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.

Even though Socrates didn’t say it, the sentiment has nevertheless been stated and restated since 1907, when the observation was concocted, and probably had been shared widely for decades, and even centuries, before that. I use a form of it when I discuss the spoiled children of capitalism (e.g., here).

Is there something to it? No and yes.

No, because rebelliousness and disrespect for elders and old ways seem to be part of the natural processes of physical and mental maturation.

Not all adolescents and young adults are rebellious and disrespectful. But many rebellious and disrespectful adolescents and young adults carry their attitudes with them through life, even if less obviously than in youth, as they climb the ladders of various callings. The callings that seem to be most attractive to the rebellious are the arts (especially the written, visual, thespian, terpsichorean, musical, and cinematic ones), the professoriate, the punditocracy, journalism, and politics.

Which brings me to the yes answer, and to the spoiled children of capitalism. Rebelliousness, though in some persons never entirely outgrown or suppressed by maturity, will more often be outgrown or suppressed in economically tenuous conditions, the challenges of which almost fully occupy people’s bodies and minds. (Opinionizers and sophists were accordingly much thinner on the ground in the parlous days of yore.)

However, as economic growth and concomitant technological advances have yielded abundance far beyond the necessities of life for most inhabitants of the Western world, the beneficiaries of that abundance have acquired yet another luxury: the luxury of learning about and believing in systems that, in the abstract, seem to offer vast improvements on current conditions. It is the old adage “Idle hands are the devil’s tools” brought up to date, with “minds” joining “hands” in the devilishness.

Among many bad things that result from such foolishness (e.g., the ascendancy of ideologies that crush liberty and, ironically, economic growth) is the loss of social cohesion. I was reminded of this by Noah Smith’s fatuous article, “The 1950s Are Greatly Overrated”.

Smith is an economist who blogs and writes an opinion column for Bloomberg News. My impression of him is that he is a younger version of Paul Krugman, the former economist who has become a left-wing whiner. The difference between them is that Krugman remembers the 1950s fondly, whereas Smith does not.

I once said this about Krugman’s nostalgia for the 1950s, a decade during which he was a mere child:

[The nostalgia] is probably rooted in golden memories of his childhood in a prosperous community, though he retrospectively supplies an economic justification. The 1950s were (according to him) an age of middle-class dominance before the return of the Robber Barons who had been vanquished by the New Deal. This is zero-sum economics and class warfare on steroids — standard Krugman fare.

Smith, a mere toddler relative to Krugman and a babe in arms relative to me, takes a dim view of the 1950s:

For all the rose-tinted sentimentality, standards of living were markedly lower in the ’50s than they are today, and the system was riddled with vast injustice and inequality.

Women and minorities are less likely to have a wistful view of the ’50s, and with good reason. Segregation was enshrined in law in much of the U.S., and de facto segregation was in force even in Northern cities. Black Americans, crowded into ghettos, were excluded from economic opportunity by pervasive racism, and suffered horrendously. Even at the end of the decade, more than half of black Americans lived below the poverty line.

Women, meanwhile, were forced into a narrow set of occupations, and few had the option of pursuing fulfilling careers. This did not mean, however, that a single male breadwinner was always able to provide for an entire family. About a third of women worked in the ’50s, showing that many families needed a second income even if it defied the gender roles of the day.

For women who didn’t work, keeping house was no picnic. Dishwashers were almost unheard of in the 1950s, few families had a clothes dryer, and fewer than half had a washing machine.

But even beyond the pervasive racism and sexism, the 1950s wasn’t a time of ease and plenty compared to the present day. For example, by the end of the decade, even after all of that robust 1950s growth, the white poverty rate was still 18.1%, more than double that of the mid-1970s.

Nor did those above the poverty line enjoy the material plenty of later decades. Much of the nation’s housing stock in the era was small and cramped. The average floor area of a new single-family home in 1950 was only 983 square feet, just a bit bigger than the average one-bedroom apartment today.

To make matters worse, households were considerably larger in the ’50s, meaning that big families often had to squeeze into those tight living spaces. Those houses also lacked many of the things that make modern homes comfortable and convenient — not just dishwashers and clothes dryers, but air conditioning, color TVs and in many cases washing machines.

And those who did work had to work significantly more hours per year. Those jobs were often difficult and dangerous. The Occupational Safety and Health Administration wasn’t created until 1971. As recently as 1970, the rate of workplace injury was several times higher than now, and that number was undoubtedly even higher in the ’50s. Pining for those good old factory jobs is common among those who have never had to stand next to a blast furnace or work on an unautomated assembly line for eight hours a day.

Outside of work, the environment was in much worse shape than today. There was no Environmental Protection Agency, no Clean Air Act or Clean Water Act, and pollution of both air and water was horrible. The smog in Pittsburgh in the 1950s blotted out the sun. In 1952 the Cuyahoga River in Cleveland caught fire. Life expectancy at the end of the ’50s was only 70 years, compared to more than 78 today.

So life in the 1950s, though much better than what came before, wasn’t comparable to what Americans enjoyed even two decades later. In that space of time, much changed because of regulations and policies that reduced or outlawed racial and gender discrimination, while a host of government programs lowered poverty rates and cleaned up the environment.

But on top of these policy changes, the nation benefited from rapid economic growth both in the 1950s and in the decades after. Improved production techniques and the invention of new consumer products meant that there was much more wealth to go around by the 1970s than in the 1950s. Strong unions and government programs helped spread that wealth, but growth is what created it.

So the 1950s don’t deserve much of the nostalgia they receive. Though the decade has some lessons for how to make the U.S. economy more equal today with stronger unions and better financial regulation, it wasn’t an era of great equality overall. And though it was a time of huge progress and hope, the point of progress and hope is that things get better later. And by most objective measures they are much better now than they were then.

See? A junior Krugman who sees the same decade as a glass half-empty instead of half-full.

In the end, Smith admits the irrelevance of his irreverence for the 1950s when he says that “the point of progress and hope is that things get better later.” In other words, if there is progress the past will always look inferior to the present. (And, by the same token, the present will always look inferior to the future when it becomes the present.)

I could quibble with some of Smith’s particulars (e.g., racism may be less overt than it was in the 1950s, but it still boils beneath the surface, and isn’t confined to white racism; stronger unions and stifling financial regulations hamper economic growth, which Smith prizes so dearly). But I will instead take issue with his assertion, which precedes the passages quoted above, that “few of those who long for a return to the 1950s would actually want to live in those times.”

It’s not that anyone yearns for a return to the 1950s as it was in all respects, but for a return to the 1950s as it was in some crucial ways:

There is … something to the idea that the years between the end of World War II and the early 1960s were something of a Golden Age…. But it was that way for reasons other than those offered by Krugman [and despite Smith’s demurrer].

Civil society still flourished through churches, clubs, civic associations, bowling leagues, softball teams and many other voluntary organizations that (a) bound people together and (b) promulgated and enforced social norms.

Those norms proscribed behavior considered harmful — not just criminal, but harmful to the social fabric (e.g., divorce, unwed motherhood, public cursing and sexuality, overt homosexuality). The norms also prescribed behavior that signaled allegiance to the institutions of civil society (e.g., church attendance, veterans’ organizations), thereby helping to preserve them and the values that they fostered.

Yes, it was an age of “conformity”, as sneering sophisticates like to say, even as they insist on conformity to reigning leftist dogmas that are destructive of the social fabric. But it was also an age of widespread mutual trust, respect, and forbearance.

Those traits, as I have said many times (e.g., here), are the foundations of liberty, which is a modus vivendi, not a mystical essence. The modus vivendi that arises from the foundations is peaceful, willing coexistence and its concomitant: beneficially cooperative behavior — liberty, in other words.

The decade and a half after the end of World War II wasn’t an ideal world of utopian imagining. But it approached a realizable ideal. That ideal — for the nation as a whole — has been put beyond reach by the vast, left-wing conspiracy that has subverted almost every aspect of life in America.

What happened was the 1960s — and its long aftermath — which saw the rise of capitalism’s spoiled children (of all ages), who have spat on and shredded the very social norms that in the 1940s and 1950s made the United States of America as united as they ever would be. Actual enemies of the nation — communists — were vilified and ostracized, and that’s as it should have been. And people weren’t banned and condemned by “friends”, “followers”, Facebook, Twitter, etc. etc., for the views that they held. Not even on college campuses, on radio and TV shows, in the print media, or in Hollywood movies.

What do the spoiled children have to show for their rejection of social norms — other than economic progress that is actually far less robust than it would have been were it not for the interventions of their religion-substitute, the omnipotent central government? Omnipotent at home, that is, and impotent (or drastically weakened) abroad, thanks to rounds of defense cuts and perpetual hand-wringing about what the “world” might think, or about what some militarily inferior opponent might do, if the U.S. government were to defend Americans and protect their interests abroad.

The list of the spoiled children’s “accomplishments” is impossibly long to recite here, so I will simply offer a very small sample of things that come readily to mind:

California wildfires caused by misguided environmentalism.

The excremental wasteland that is San Francisco. (And Blue cities, generally.)

Flight from California wildfires, high taxes, excremental streets, and anti-business environment.

The killing of small businesses, especially restaurants, by imbecilic Blue-State minimum wage laws.

The killing of businesses, period, by oppressive Blue-State regulations.

The killing of jobs for people who need them the most, by ditto and ditto.

Bloated pension schemes for Blue-State (and city) employees, which are bankrupting those States (and cities) and penalizing their citizens who aren’t government employees.

The hysteria (and even punishment) that follows from drawing a gun or admitting gun ownership.

The idea that men can become women and should be allowed to compete with women in athletic competitions because the men in question have endured some surgery and taken some drugs.

The idea that it doesn’t and shouldn’t matter to anyone that a self-identified “woman” uses women’s rest-rooms where real women and girls become prey for prying eyes and worse.

Mass murder on a Hitlerian-Stalinist scale in the name of a “woman’s right to choose”, when she made that choice by (in almost every case) engaging in consensual sex.

Disrespect for the police and military personnel who keep them safe in their cosseted existences.

Applause for attacks on the same.

Applause for America’s enemies, which the delusional, spoiled children won’t recognize as their enemies until it’s too late.

Longing for impossible utopias (e.g., “true” socialism) because they promise what is actually impossible in the real world — and result in actual dystopias (e.g., the USSR, Cuba, Britain’s National Health Service).

Noah Smith is far too young to remember an America in which such things were almost unthinkable — rather than routine. People then didn’t have any idea how prosperous they would become, or how morally bankrupt and divided.

Every line of human endeavor reaches a peak, from which decline is sure to follow if the things that caused it to peak are mindlessly rejected for the sake of novelty (i.e., rejection of old norms just because they are old). This is nowhere more obvious than in the arts.

It should be equally obvious to anyone who takes an objective look at the present state of American society and is capable of comparing it with American society of the 1940s and 1950s. For all of its faults, it was a golden age. Unfortunately, most Americans now living (Noah Smith definitely included) are too young and too fixated on material things to understand what has been lost — irretrievably, I fear.


I was going to append a list of related posts, but the list would be so long that I can only refer you to “Favorite Posts” — especially those listed in the following sections:

I. The Academy, Intellectuals, and the Left
II. Affirmative Action, Race, and Immigration
IV. Conservatism and Other Political Philosophies
V. The Constitution and the Rule of Law
VI. Economics: Principles and Issues
VIII. Infamous Thinkers and Political Correctness
IX. Intelligence and Psychology
X. Justice
XI. Politics, Politicians, and the Consequences of Government
XII. Science, Religion, and Philosophy
XIII. Self-Ownership (abortion, euthanasia, marriage, and other aspects of the human condition)
XIV. War and Peace

American Foreign Policy: Feckless No More?

In “The Subtle Authoritarianism of the ‘Liberal Order’”, I take on the “liberals” of all parties who presume to know what’s best for all of us, and are bent on making it so through the power of the state. I also had in mind, but didn’t discuss, the smug “liberals” who have long presided over U.S. foreign policy.

One of the smuggies whom I most despise for his conduct of foreign policy is the sainted George H.W. Bush. War hero or not, he failed to protect America and its interests on two notable occasions during his presidency.

The first occasion came during the Gulf War. I have this to say about it in “The Modern Presidency: From TR to DJT”:

The main event of Bush’s presidency was the Gulf War of 1990-1991. Iraq, whose ruler was Saddam Hussein, invaded the small neighboring country of Kuwait. Kuwait produces and exports a lot of oil. The occupation of Kuwait by Iraq meant that Saddam Hussein might have been able to control the amount of oil shipped to other countries, including Europe and the United States. If Hussein had been allowed to control Kuwait, he might have moved on to Saudi Arabia, which produces much more oil than Kuwait. President Bush asked Congress to approve military action against Iraq. Congress approved the action, although most Democrats voted against giving President Bush authority to defend Kuwait. The war ended in a quick defeat for Iraq’s armed forces. But President Bush decided not to allow U.S. forces to finish the job and end Saddam Hussein’s reign as ruler of Iraq.

And the rest is a long, sad history of what probably wouldn’t have happened in 2003 and the years since then.

What I didn’t appreciate when I wrote about Bush’s misadventure in Iraq was his utter fecklessness as the Soviet Union was collapsing. I learned about it from Vladimir Bukovsky’s Judgment in Moscow: Soviet Crimes and Western Complicity. Bukovsky is the “I” in the following passages from chapter 6 of the book:

George Bush and his Secretary of State Jim Baker … outdid everyone [including Margaret Thatcher and Ronald Reagan], [in] opposing the inevitable disintegration of the USSR until the very last day.

“Yes, I think I can trust Gorbachev,” said George Bush to Time magazine just when Gorbachev was beginning to lose control and was tangled hopelessly in his own lies—“I looked him in the eye, I appraised him. He was very determined. Yet there was a twinkle. He is a guy quite sure of what he is doing. He has got a political feel.” [Like father, like son.]

It is notable that this phrase is illogical: that your opponent “believes deeply in what he is doing” does not necessarily mean that he is trustworthy. After all, Hitler also “believed deeply in what he was doing.” But the thought that their aims were diametrically opposed did not enter George Bush’s head. It is not surprising that with such presidential perspicacity, their top-level meeting in Malta (2-3 December 1989) was strongly reminiscent of a second Yalta: in any case, after this the US Department of State invariably maintained that the growing Soviet pressure on the Baltics was “an internal USSR matter.” Even two months prior to the collapse of the Soviet Union Bush, on a visit to Kiev, exhorted Ukraine not to break away.

The extent to which Bush’s administration did not understand the Soviet games in Europe is clear from its position on the reunification of Germany. Secretary of State Baker, who hurried to Berlin immediately after the fall of the Wall, evaluated this event as a demonstration of Gorbachev’s “remarkable realism. To give President Gorbachev his due, he was the first Soviet leader to have the daring and foresight to allow the revocation of the policy of repressions in Eastern Europe.”

And possibly in gratitude for this, Baker’s main interest was to respect the “lawful concern” of his eastern partner by slowing down the process of reunification by all means [quoting Baker:]

In the interest of overall stability in Europe, the move toward reunification must be of a peaceful nature, it must be gradual, and part of a step-by-step process.

The plan he proposed was a total disaster, for it corresponded completely to the Soviet scheme of the creation of a “common European home”: it was envisaged at first to reinforce the European Community, the Helsinki process and promote the further integration of Europe. All this, naturally, without undue haste but “step by step” over the passage of years [again quoting Baker:]

As these changes proceed, as they overcome the division of Europe, so too will the divisions of Germany and Berlin be overcome in peace and freedom.

Furthermore, even without consulting Bonn, he rushed to embrace the Kremlin’s new puppets in Eastern Germany in order to signal “US intentions to try to improve the credibility of the East German political leadership and to forestall a power vacuum that could trigger a rush to unification.” And this was in January 1990, i.e. shortly before the elections in the GDR that actually solved the key question: would Germany reunite on Soviet conditions, or Western ones? Luckily the East Germans were less “patient” and smarter: knowing well what they were dealing with, they voted for immediate reunification, ignoring Baker and the pressure of the whole world.

Why, then, did the West and the USA with its seemingly conservative, even anti-communist administration, yearn for this “stabilization” or, to put it more simply, salvation of the Soviet regime?

Let us allow that Baker was ignorant, pompous and big-headed, dreaming of some kind of global structures “from Vancouver to Vladivostok”, of which he would be the architect (“the Baker doctrine”). I remember at one press-conference I even suggested introducing a unit of measurement for political brainlessness—one baker (the average man in the street would be measured in millibakers). At the very height of the bloody Soviet show in Bucharest at Christmas in 1989, he stated that “They are attempting to pull off the yoke of a very oppressive and repressive dictatorship. So I think that we would be inclined probably to follow the example of France, who today has said that if the Warsaw Pact felt it necessary to intervene on behalf of the opposition, that it would support that action.” The new pro-Soviet policy of the USA after the top-level meeting in Malta he explained by saying that “the Soviet Union has switched sides, from that of oppression and dictatorships to democracy and change.” This was said at the moment when the Soviet army was smashing the democratic opposition in Baku, killing several hundred people (which Baker also “treated with understanding”). But Baker was not alone, and this cannot be explained away by sheer stupidity. That is the tragedy, that such an idiotic position was shared by practically all Western governments, including the conservative ones.

Baker and Bush, what a team.

America’s enemies will do what they will do, whether our “leaders” are nice to them or confront them. And when they are confronted forcefully (and even forcibly), they are more likely to be deterred (and even prevented) from acting against America.

For most of the past century, U.S. foreign policy has been run by smug “liberals” who have projected their own feelings onto the likes of Hitler, Stalin, Mao, Ho, Putin, Saddam, and the ayatollahs. And where has it landed us? Scrambling from behind to win in World War II, on the defensive against Communist expansion, losing or failing to win wars against vastly inferior enemies, and giving our enemies time (and money) in which to arm themselves to attack our overseas interests and even our homeland. This tragic history has been abetted by hand-wringing from the usual suspects in the academy, the media, the foreign-policy tea-leaf-reading-signal-sending society, the left generally (though I am being redundant), and “liberals” of all political persuasions who are feckless to the core when it comes to dealing with domestic and foreign thugs.

Enough! I hope and believe that’s what President Trump just said, in effect, when he authorized the killing of Iran’s General Soleimani.


Related posts:
A Grand Strategy for the United States
The Folly of Pacifism
Transnationalism and National Defense
The Folly of Pacifism, Again
September 20, 2001: Hillary Clinton Signals the End of “Unity”
Patience as a Tool of Strategy
The War on Terror, As It Should Have Been Fought
The Cuban Missile Crisis, Revisited
Preemptive War
Preemptive War and Iran
Some Thoughts and Questions about Preemptive War
Defense as an Investment in Liberty and Prosperity
The Barbarians Within and the State of the Union
The World Turned Upside Down
Utilitarianism and Torture
Defense Spending: One More Time
The President’s Power to Kill Enemy Combatants
My Defense of the A-Bomb
Pacifism
LBJ’s Dereliction of Duty
Terrorism Isn’t an Accident
The Ken Burns Apology Tour Continues
Planning for the Last War
A Rearview Look at the Invasion of Iraq and the War on Terror
Preemptive War Revisited
It’s a MAD, MAD, MAD, MAD World
The Folly of Pacifism (III)
MAD, Again
“MAD, Again”: A Footnote
More MADness: Mistaking Bureaucratic Inertia for Strategy
World War II As an Aberration
Reflections on the “Feel Good” War

Not-So-Random Thoughts (XXV)

“Not-So-Random Thoughts” is an occasional series in which I highlight writings by other commentators on varied subjects that I have addressed in the past. Other entries in the series can be found at these links: I, II, III, IV, V, VI, VII, VIII, IX, X, XI, XII, XIII, XIV, XV, XVI, XVII, XVIII, XIX, XX, XXI, XXII, XXIII, and XXIV. For more in the same style, see “The Tenor of the Times” and “Roundup: Civil War, Solitude, Transgenderism, Academic Enemies, and Immigration”.

CONTENTS

The Real Unemployment Rate and Labor-Force Participation

Is Partition Possible?

Still More Evidence for Why I Don’t Believe in “Climate Change”

Transgenderism, Once More

Big, Bad Oligopoly?

Why I Am Bunkered in My Half-Acre of Austin

“Government Worker” Is (Usually) an Oxymoron


The Real Unemployment Rate and Labor-Force Participation

There was much celebration (on the right, at least) when it was announced that the official unemployment rate, as of November, is only 3.5 percent, and that 266,000 jobs were added to the employment rolls (see here, for example). The exultation is somewhat overdone. Yes, things would be much worse if Obama’s anti-business rhetoric and policies still prevailed, but Trump is pushing a big boulder of deregulation uphill.

In fact, the real unemployment rate is a lot higher than the official figure. I refer you to “Employment vs. Big Government and Disincentives to Work”, which begins with this:

The real unemployment rate is several percentage points above the nominal rate. Officially, the unemployment rate stood at 3.5 percent as of November 2019. Unofficially — but in reality — the unemployment rate was 9.4 percent.

The explanation is that the labor-force participation rate has declined drastically since peaking in January 2000. When the official unemployment rate is adjusted to account for that decline (and for a shift toward part-time employment), the result is a considerably higher real unemployment rate.
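The arithmetic behind that adjustment is simple enough to sketch. As an illustration only — the inputs below are rounded BLS figures for the January 2000 participation peak and for November 2019, and the shift toward part-time work that the quoted passage also folds in is omitted — the calculation looks something like this:

```python
# Back-of-the-envelope version of the participation-rate adjustment
# described above. Rounded BLS inputs; the part-time adjustment
# mentioned in the quoted passage is omitted.

peak_participation = 0.673  # labor-force participation rate, January 2000 peak
participation_now = 0.632   # labor-force participation rate, November 2019
official_rate = 0.035       # official (U-3) unemployment rate, November 2019

# Employed persons as a share of the working-age population.
employed_share = participation_now * (1.0 - official_rate)

# Count as unemployed everyone who would be in the labor force at the
# 2000 peak participation rate but is not actually employed.
real_rate = 1.0 - employed_share / peak_participation

print(f"Adjusted unemployment rate: {real_rate:.1%}")  # ≈ 9.4%
```

With those inputs the adjusted rate comes out to about 9.4 percent, which matches the figure quoted above.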

Arnold Kling recently discussed the labor-force participation rate:

[The] decline in male labor force participation among those without a college degree is a significant issue. Note that even though the unemployment rate has come down for those workers, their rate of labor force participation is still way down.

Economists on the left tend to assume that this is due to a drop in demand for workers at the low end of the skill distribution. Binder’s claim is that instead one factor in declining participation is an increase in the ability of women to participate in the labor market, which in turn lowers the advantage of marrying a man. The reduced interest in marriage on the part of women attenuates the incentive for men to work.

Could be. I await further analysis.


Is Partition Possible?

Angelo Codevilla peers into his crystal ball:

Since 2016, the ruling class has left no doubt that it is not merely enacting chosen policies: It is expressing its identity, an identity that has grown and solidified over more than a half century, and that it is not capable of changing.

That really does mean that restoring anything like the Founders’ United States of America is out of the question. Constitutional conservatism on behalf of a country a large part of which is absorbed in revolutionary identity; that rejects the dictionary definition of words; that rejects common citizenship, is impossible. Not even winning a bloody civil war against the ruling class could accomplish such a thing.

The logical recourse is to conserve what can be conserved, and for it to be done by, of, and for those who wish to conserve it. However much force of what kind may be required to accomplish that, the objective has to be conservation of the people and ways that wish to be conserved.

That means some kind of separation.

As I argued in “The Cold Civil War,” the natural, least stressful course of events is for all sides to tolerate the others going their own ways. The ruling class has not been shy about using the powers of the state and local governments it controls to do things at variance with national policy, effectively nullifying national laws. And they get away with it.

For example, the Trump Administration has not sent federal troops to enforce national marijuana laws in Colorado and California, nor has it punished persons and governments who have defied national laws on immigration. There is no reason why the conservative states, counties, and localities should not enforce their own view of the good.

Not even President Alexandria Ocasio-Cortez would order troops to shoot to re-open abortion clinics were Missouri or North Dakota, or any city, to shut them down. As Francis Buckley argues in American Secession: The Looming Breakup of the United States, some kind of separation is inevitable, and the options regarding it are many.

I would like to believe Mr. Codevilla, but I cannot. My money is on a national campaign of suppression, which will begin the instant that the left controls the White House and Congress. Shooting won’t be necessary, given the massive displays of force that will be ordered from the White House, ostensibly to enforce various laws, including but far from limited to “a woman’s right to an abortion”. Leftists must control everything because they cannot tolerate dissent.

As I say in “Leftism”,

Violence is a good thing if your heart is in the “left” place. And violence is in the hearts of leftists, along with hatred and the irresistible urge to suppress that which is hated because it challenges leftist orthodoxy — from climate skepticism and the negative effect of gun ownership on crime to the negative effect of the minimum wage and the causal relationship between Islam and terrorism.

There’s more in “The Subtle Authoritarianism of the ‘Liberal Order’”; for example:

[Quoting Sumantra Maitra] Domestically, liberalism divides a nation into good and bad people, and leads to a clash of cultures.

The clash of cultures was started and sustained by so-called liberals, the smug people described above. It is they who — firmly believing themselves to be smarter, on the side of science, and on the side of history — have chosen to be the aggressors in the culture war.

Hillary Clinton’s remark about Trump’s “deplorables” ripped the mask from the “liberal” pretension to tolerance and reason. Clinton’s remark was tantamount to a declaration of war against the self-appointed champion of the “deplorables”: Donald Trump. And war it has been, much of it waged by deep-state “liberals” who cannot entertain the possibility that they are on the wrong side of history, and who will do anything — anything — to make history conform to their smug expectations of it.


Still More Evidence for Why I Don’t Believe in “Climate Change”

This is a sequel to an item in the previous edition of this series: “More Evidence for Why I Don’t Believe in Climate Change”.

Dave Middleton debunks the claim that 50-year-old climate models correctly predicted the subsequent (but not steady) rise in the globe’s temperature (whatever that is). He then quotes a talk by Dr. John Christy of the University of Alabama-Huntsville Climate Research Center:

We have a change in temperature from the deep atmosphere over 37.5 years, we know how much forcing there was upon the atmosphere, so we can relate these two with this little ratio, and multiply it by the ratio of the 2x CO2 forcing. So the transient climate response is to say, what will the temperature be like if you double CO2– if you increase at 1% per year, which is roughly what the whole greenhouse effect is, and which is achieved in about 70 years. Our result is that the transient climate response in the troposphere is 1.1 °C. Not a very alarming number at all for a doubling of CO2. When we performed the same calculation using the climate models, the number was 2.31°C. Clearly, and significantly different. The models’ response to the forcing – their ∆t here, was over 2 times greater than what has happened in the real world….

There is one model that’s not too bad, it’s the Russian model. You don’t go to the White House today and say, “the Russian model works best”. You don’t say that at all! But the fact is they have a very low sensitivity to their climate model. When you look at the Russian model integrated out to 2100, you don’t see anything to get worried about. When you look at 120 years out from 1980, we already have 1/3 of the period done – if you’re looking out to 2100. These models are already falsified [emphasis added], you can’t trust them out to 2100, no way in the world would a legitimate scientist do that. If an engineer built an aeroplane and said it could fly 600 miles and the thing ran out of fuel at 200 and crashed, he might say: “I was only off by a factor of three”. No, we don’t do that in engineering and real science! A factor of three is huge in the energy balance system. Yet that’s what we see in the climate models….

Theoretical climate modelling is deficient for describing past variations. Climate models fail for past variations, where we already know the answer. They’ve failed hypothesis tests and that means they’re highly questionable for giving us accurate information about how the relatively tiny forcing … will affect the climate of the future.
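Christy’s ratio method amounts to a single line of arithmetic: scale the observed temperature change by the ratio of the doubled-CO2 forcing to the forcing actually experienced over the observation period. Here is a minimal sketch; the 3.7 W/m² figure is the conventional forcing for a doubling of CO2, while the two observed inputs are illustrative placeholders (the talk refers to them but doesn’t give the numbers), chosen so the output lands on his 1.1 °C:

```python
# Sketch of the transient-climate-response (TCR) ratio described above:
#   TCR = delta_T_observed * (forcing_for_2xCO2 / forcing_observed)
# The two "observed" inputs are illustrative placeholders.

forcing_2xco2 = 3.7  # W/m^2, conventional forcing for a doubling of CO2
delta_t_obs = 0.50   # deg C, tropospheric change over 37.5 years (placeholder)
forcing_obs = 1.68   # W/m^2, forcing over the same period (placeholder)

tcr = delta_t_obs * (forcing_2xco2 / forcing_obs)
print(f"TCR ≈ {tcr:.2f} °C per doubling of CO2")  # ≈ 1.10 with these inputs
```

Plug in the models’ temperature response — over twice as large, by Christy’s account — and the same calculation yields his 2.31 °C figure.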

For a lot more in this vein, see my pages “Climate Change” and “Modeling and Science”.


Transgenderism, Once More

Theodore Dalrymple (Anthony Daniels, M.D.) is on the case:

The problem alluded to in [a paper in the Journal of Medical Ethics] is, of course, the consequence of a fiction, namely that a man who claims to have changed sex actually has changed sex, and is now what used to be called the opposite sex. But when a man who claims to have become a woman competes in women’s athletic competitions, he often retains an advantage derived from the sex of his birth. Women competitors complain that this is unfair, and it is difficult not to agree with them….

Man being both a problem-creating and solving creature, there is, of course, a very simple way to resolve this situation: namely that men who change to simulacra of women should compete, if they must, with others who have done the same. The demand that they should suffer no consequences that they neither like nor want from the choices they have made is an unreasonable one, as unreasonable as it would be for me to demand that people should listen to me playing the piano though I have no musical ability. Thomas Sowell has drawn attention to the intellectual absurdity and deleterious practical consequences of the modern search for what he calls “cosmic justice.”…

We increasingly think that we live in an existential supermarket in which we pick from the shelf of limitless possibilities whatever we want to be. We forget that limitation is not incompatible with infinity; for example, that our language has a grammar that excludes certain forms of words, without in any way limiting the infinite number of meanings that we can express. Indeed, such limitation is a precondition of our freedom, for otherwise nothing that we said would be comprehensible to anybody else.

That is a tour de force typical of the good doctor. In the span of three paragraphs, he addresses matters that I have treated at length in “The Transgender Fad and Its Consequences” (and later in the previous edition of this series), “Positive Rights and Cosmic Justice”, and “Writing: A Guide” (among other entries at this blog).


Big, Bad Oligopoly?

Big Tech is giving capitalism a bad name, as I discuss in “Why Is Capitalism Under Attack from the Right?”, but it’s still the best game in town. Even oligopoly and its big brother, monopoly, aren’t necessarily bad. See, for example, my posts, “Putting in Some Good Words for Monopoly” and “Monopoly: Private Is Better than Public”. Arnold Kling makes the essential point here:

Do indicators of consolidation show us that the economy is getting less competitive or more competitive? The answer depends on which explanation(s) you believe to be most important. For example, if network effects or weak resistance to mergers are the main factors, then the winners from consolidation are quasi-monopolists that may be overly insulated from competition. On the other hand, if the winners are firms that have figured out how to develop and deploy software more effectively than their rivals, then the growth of those firms at the expense of rivals just shows us that the force of competition is doing its work.


Why I Am Bunkered in My Half-Acre of Austin

Randal O’Toole takes aim at the planners of Austin, Texas, and hits the bullseye:

Austin is one of the fastest-growing cities in America, and the city of Austin and Austin’s transit agency, Capital Metro, have a plan for dealing with all of the traffic that will be generated by that growth: assume that a third of the people who now drive alone to work will switch to transit, bicycling, walking, or telecommuting by 2039. That’s right up there with planning for dinner by assuming that food will magically appear on the table the same way it does in Hogwarts….

[W]hile Austin planners are assuming they can reduce driving alone from 74 to 50 percent, it is actually moving in the other direction….

Planners also claim that 11 percent of Austin workers carpool to work, an amount they hope to maintain through 2039. They are going to have trouble doing that as carpooling, in fact, only accounted for 8.0 percent of Austin workers in 2018.

Planners hope to increase telecommuting from its current 8 percent (which is accurate) to 14 percent. That could be difficult as they have no policy tools that can influence telecommuting.

Planners also hope to increase walking and bicycling from their current 2 and 1 percent to 4 and 5 percent. Walking to work is almost always greater than cycling to work, so it’s difficult to see how they plan to magic cycling to be greater than walking. This is important because cycling trips are longer than walking trips and so have more of a potential impact on driving.

Finally, planners want to increase transit from 4 to 16 percent. In fact, transit carried just 3.24 percent of workers to their jobs in 2018, down from 3.62 percent in 2016. Changing from 4 to 16 percent is an almost impossible 300 percent increase; changing from 3.24 to 16 is an even more formidable 394 percent increase. Again, reality is moving in the opposite direction from planners’ goals….

Planners have developed two main approaches to transportation. One is to estimate how people will travel and then provide and maintain the infrastructure to allow them to do so as efficiently and safely as possible. The other is to imagine how you wish people would travel and then provide the infrastructure assuming that to happen. The latter method is likely to lead to misallocation of capital resources, increased congestion, and increased costs to travelers.

Austin’s plan is firmly based on this second approach. The city’s targets of reducing driving alone by a third, maintaining carpooling at an already too-high number, and increasing transit by 394 percent are completely unrealistic. No American city has achieved similar results in the past two decades and none are likely to come close in the next two decades.

Well, that’s the prevailing mentality of Austin’s political leaders and various bureaucracies: magical thinking. Failure is piled upon failure (e.g., more bike lanes crowding out traffic lanes, a hugely wasteful curbside composting plan) because to admit failure would be to admit that the emperor has no clothes.
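Incidentally, the percent-increase arithmetic in O’Toole’s passage checks out; here is a two-line verification, with the mode shares taken straight from the quote:

```python
# Percent increase needed to move a mode share from `old` to `new`.
def pct_increase(old: float, new: float) -> float:
    return (new - old) / old * 100.0

print(pct_increase(4.0, 16.0))   # 300.0  -- from the planners' claimed 4 percent
print(pct_increase(3.24, 16.0))  # ~393.8 -- from the actual 2018 transit share
```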

You want to learn more about Austin? You’ve got it:

Driving and Politics (1)
Life in Austin (1)
Life in Austin (2)
Life in Austin (3)
Driving and Politics (2)
AGW in Austin?
Democracy in Austin
AGW in Austin? (II)
The Hypocrisy of “Local Control”
Amazon and Austin


“Government Worker” Is (Usually) an Oxymoron

In “Good News from the Federal Government” I sarcastically endorse the move to grant all federal workers 12 weeks of paid parental leave:

The good news is that there will be a lot fewer civilian federal workers on the job, which means that the federal bureaucracy will grind a bit more slowly when it does the things that it does to screw up the economy.

The next day, Audacious Epigone put some rhetorical and statistical meat on the bones of my informed prejudice in “Join the Crooks and Liars: Get a Government Job!”:

That [the title of the post] used to be a frequent refrain on Radio Derb. Though the gag has been made emeritus, the advice is even better today than it was when the Derb introduced it. As he explains:

The percentage breakdown is private-sector 76 percent, government 16 percent, self-employed 8 percent.

So one in six of us works for a government, federal, state, or local.

Which group does best on salary? Go on: see if you can guess. It’s government workers, of course. Median earnings 52½ thousand. That’s six percent higher than the self-employed and fourteen percent higher than the poor shlubs toiling away in the private sector.

If you break down government workers into two further categories, state and local workers in category one, federal workers in category two, which does better?

Again, which did you think? Federal workers are way out ahead, median earnings 66 thousand. Even state and local government workers are ahead of us private-sector and self-employed losers, though.

Moral of the story: Get a government job! — federal for strong preference.

….

Though it is well known that a government gig is a gravy train, opinions of the people with said gigs are embarrassingly low, as the results from several additional survey questions show.

First, how frequently can the government be trusted “to do what’s right”? [“Just about always” and “most of the time” badly trail “some of the time”.]

….

Why can’t the government be trusted to do what’s right? Because the people who populate it are crooks and liars. Asked whether “hardly any”, “not many” or “quite a few” people in the federal government are crooked, the following percentages answered with “quite a few” (“not sure” responses, constituting 12% of the total, are excluded). [Responses of “quite a few” range from 59 percent to 77 percent across an array of demographic categories.]

….

Accompanying a strong sense of corruption is the perception of widespread incompetence. Presented with a binary choice between “the people running the government are smart” and “quite a few of them don’t seem to know what they are doing”, a solid majority chose the latter (“not sure”, at 21% of all responses, is again excluded). [The “don’t know what they’re doing” responses ranged from 55 percent to 78 percent across the same demographic categories.]

Are the skeptics right? Well, most citizens have had dealings with government employees of one kind and another. The “wisdom of crowds” certainly applies in this case.

Dangerous Millennials?

I return to Joel Kotkin’s essay, “America’s Drift Toward Feudalism” (American Affairs Journal, Winter 2019), which I quoted recently and favorably. This is from the final passages of the essay:

In the world envisioned by the oligarchs [the ultrarich, especially the czars of Big Tech and financial institutions] and the clerisy [affluent professionals and members of the academic-government-information-media complex], the poor and much of the middle class are destined to become more dependent on the state. This dependency could be accelerated as their labor is devalued both by policy hostile to the industrial economy, and by the greater implementation of automation and artificial intelligence.

Opposing these forces will be very difficult, particularly given the orientation of our media, academia, and the nonprofit world, as well as the massive wealth accumulated by the oligarchs. A system that grants favors and entertainment to its citizens but denies them property expects little in return. This kind of state, Tocqueville suggested, can be used to keep its members in “perpetual childhood”; it “would degrade men rather than tormenting them.”

Reversing our path away from a new feudalism will require, among other things, a rediscovery of belief in our basic values and what it means to be an American. Nearly 40 percent of young Americans, for example, think the country lacks “a history to be proud of.” Fewer young people than previous generations place an emphasis on family, religion, or patriotism. Rather than look at what binds a democratic society together, the focus on both right and left has been on narrow identities incapable of sustaining a democratic and pluralistic society. The new generation has become cut off from the traditions and values of our past. If one does not even know of the legacies underpinning our democracy, one is not likely to notice when they are lost. Recovering a sense of pride and identification with America’s achievements is an essential component of any attempt to recover the drive, ambition, and self-confidence that propelled the United States to the space age. If we want to rescue the future from a new and pernicious form of feudalism, we will have to recover this ground.

To reverse neo-feudalism, the Third Estate—the class most threatened by the ascendency of the oligarchs and the clerisy—needs to reinvigorate its political will, just as it did during the Revolution and in the various struggles that followed. “Happy the nation whose people has not forgotten how to rebel,” noted the British historian R. H. Tawney. Whether we can understand and defy the new feudalism will determine the kind of world our children will inherit.

There is altogether too much reification going on here. Take the final paragraph, for example, where Kotkin says that the Third Estate (the poor and middle class) “needs to reinvigorate its political will”. The Third Estate is an abstraction, not an actual association of persons united for the purpose of taking collective action.

Individual members of the Third Estate will do whatever it is that they choose to do and are capable of doing. One frightening possibility is that enough of them will take to the polls and increasingly tip the balance toward left-wing politicians who promise to share the wealth. Having followed Kotkin’s blog for some time, I doubt that that is an outcome he prefers, inasmuch as efforts to share the wealth are economically destructive — especially for members of the Third Estate.

For more about the economic status of Millennials (as an abstract group), see Timothy Taylor’s “About Millennials”.

What’s in a Name?

A lot, especially if it’s the name of a U.S. Navy ship. Take the aircraft carrier, for instance, which has been the Navy’s capital ship since World War II. The first aircraft carrier in the U.S. fleet was the USS Langley, commissioned in 1922. Including escort carriers, which were smaller than the relatively small carriers of World War II, a total of 154 carriers have been commissioned and put into service in the U.S. Navy. (During World War II, some escort carriers were transferred to the Royal Navy upon commissioning.)

As far as I am able to tell, not one of the 82 escort carriers was named for a person. Of the 72 “regular” carriers, which includes 10 designated as light aircraft carriers, none was named for a person until CVB-42, the Franklin D. Roosevelt, was commissioned in 1945, several months after the death of its namesake. The next such naming came in 1947, with the commissioning of the Wright, named for Wilbur and Orville Wright, the aviation pioneers. There was a hiatus of 8 years, until the commissioning of the Forrestal in 1955, a ship named for the late James Forrestal, the first secretary of defense.

The dam burst in 1968, with the commissioning of John F. Kennedy. That carrier and the 11 commissioned since have been named for persons, only one of whom, Fleet Admiral Chester W. Nimitz, was a renowned naval person. In addition to Kennedy, the namesakes include former U.S. presidents (Eisenhower, T. Roosevelt, Lincoln, Washington, Truman, Reagan, Bush 41, and Ford), Carl Vinson (a long-serving chairman of the House Armed Services Committee), and John C. Stennis (a long-serving chairman of the Senate Armed Services Committee). Reagan and Bush were honored while still living (though Reagan may have been unaware of the honor because of the advanced state of his Alzheimer’s disease).

All but the Kennedy are on active service. And the Kennedy, which was decommissioned in 2007, is due to be replaced by a namesake next year. But that may be the end of it. Wisdom may have prevailed before the Navy becomes embroiled in nasty, needless controversies over the prospect of naming a carrier after Lyndon Johnson, Richard Nixon, Jimmy Carter, Bill Clinton, George Bush, Barack Obama, or Donald Trump.

The carrier after Kennedy (II) will be named Enterprise — the third carrier to be thus named. Perhaps future carriers will take the dashing names of those that I remember well from my days as a young defense analyst: Bon Homme Richard (a.k.a., Bonny Dick), Kearsarge, Oriskany, Princeton, Shangri-La, Lake Champlain, Tarawa, Midway, Coral Sea, Valley Forge, Saipan, Saratoga, Ranger, Independence, Kitty Hawk, Constellation, Enterprise (II), and America.

And while we’re at it, perhaps the likes of Admiral William McRaven (USN ret.) will do their duty, become apolitical, and shut up.

The Modern Presidency: From TR to DJT

This is a revision and expansion of a post that I published at my old blog late in 2007. The didactic style of this post reflects its original purpose, which was to give my grandchildren some insights into American history that aren’t found in standard textbooks. Readers who consider themselves already well-versed in the history of American politics should nevertheless scan this post for its occasionally provocative observations.

Theodore Roosevelt Jr. (1858-1919) was elected Vice President as a Republican in 1900, when William McKinley was elected to a second term as President. Roosevelt became President when McKinley was assassinated in September 1901. Roosevelt was re-elected President in 1904, with 56 percent of the “national” popular vote. (I mention popular-vote percentages here and throughout this post because they are a gauge of the general popularity of presidential candidates, though an inaccurate gauge if a strong third-party candidate emerges to distort the usual two-party dominance of the popular vote. There is, in fact, no such thing as a national popular vote. Rather, it is the vote in each State which determines the distribution of that State’s electoral votes between the various candidates. The electoral votes of all States are officially tallied about a month after the general election, and the president-elect is the candidate with the most electoral votes. I have more to say about electoral votes in several of the entries that follow this one.)

Theodore Roosevelt (also known as TR) served almost two full terms as President, from September 14, 1901, to March 4, 1909. (Before 1937, a President’s term of office began on March 4 of the year following his election to office.)

Roosevelt was an “activist” President. Roosevelt used what he called the “bully pulpit” of the presidency to gain popular support for programs that exceeded the limits set in the Constitution. Roosevelt was especially willing to use the power of government to regulate business and to break up companies that had become successful by offering products that consumers wanted. Roosevelt was typical of politicians who inherited a lot of money and didn’t understand how successful businesses provided jobs and useful products for less-wealthy Americans.

Roosevelt was more like the Democrat Presidents of the Twentieth Century. He did not like the “weak” government envisioned by the authors of the Constitution. The authors of the Constitution designed a government that would allow people to decide how to live their own lives (as long as they didn’t hurt other people) and to run their own businesses as they wished to (as long as they didn’t cheat other people). The authors of the Constitution thought government should exist only to protect people from criminals and foreign enemies.

William Howard Taft (1857-1930), a close friend of Theodore Roosevelt, served as President from March 4, 1909, to March 4, 1913. Taft ran for the presidency as a Republican in 1908 with Roosevelt’s support. But Taft didn’t carry out Roosevelt’s anti-business agenda aggressively enough to suit Roosevelt. So, in 1912, when Taft ran for re-election as a Republican, Roosevelt ran for election as a Progressive (a newly formed political party). Many Republican voters decided to vote for Roosevelt instead of Taft. The result was that a Democrat, Woodrow Wilson, won the most electoral votes. Although Taft was defeated for re-election, he later became Chief Justice of the United States, making him the only person ever to have served as head of the executive and judicial branches of the U.S. Government.

Thomas Woodrow Wilson (1856-1924) served as President from March 4, 1913, to March 4, 1921. (Wilson didn’t use his first name, and was known officially as Woodrow Wilson.) Wilson is the only President to have earned the degree of doctor of philosophy. Wilson’s field of study was political science, and he had many ideas about how to make government “better”. But “better” government, to Wilson, was “strong” government of the kind favored by Theodore Roosevelt. In fact, it was government by executive decree rather than according to the Constitution’s rules for law-making, in which Congress plays the central role.

Wilson was re-elected in 1916 because he promised to keep the United States out of World War I, which had begun in 1914. But Wilson changed his mind in 1917 and asked Congress to declare war on Germany. After the war, Wilson tried to get the United States to join the League of Nations, an international organization that was supposed to prevent future wars by having nations assemble to discuss their differences. The U.S. Senate, which must approve America’s membership in international organizations, refused to join the League of Nations. The League did not succeed in preventing future wars because wars are started by leaders who don’t want to discuss their differences with other nations.

Warren Gamaliel Harding (1865-1923), a Republican, was elected in 1920 and inaugurated on March 4, 1921. Harding asked voters to reject the kind of government favored by Democrats, and voters gave Harding what is known as a “landslide” victory; he received 60 percent of the votes cast in the 1920 election for president, one of the highest percentages ever recorded. Harding’s administration was about to become involved in a major scandal when Harding died suddenly on August 3, 1923, while he was on a trip to the West Coast. The exact cause of Harding’s death is unknown, but he may have had a stroke when he learned of the impending scandal, which involved Albert Fall, Secretary of the Interior. Fall had secretly allowed some of his business associates to lease government land for oil-drilling, in return for personal loans.

There were a few other scandals, but Harding probably had nothing to do with any of them. Because of the scandals, most historians say that they consider Harding to have been a poor President. But that isn’t the real reason for their dislike of Harding. Most historians, like most college professors, favor “strong” government. Historians don’t like Harding because he didn’t use the power of government to interfere in the nation’s economy. An important result of Harding’s policy (called laissez-faire, or “hands off”) was high employment and increasing prosperity during the 1920s.

John Calvin Coolidge (1872-1933), who was Harding’s Vice President, became President upon Harding’s death in 1923. (Coolidge didn’t use his first name, and was known as Calvin.) Coolidge was elected President in 1924. He served as President from August 3, 1923, to March 4, 1929. Coolidge continued Harding’s hands-off policy of not interfering in the economy, and people continued to become more prosperous as businesses grew and hired more people and paid them higher wages. Coolidge was known as “Silent Cal” because he was a man of few words. He said only what was necessary for him to say, and he meant what he said. That was in keeping with his approach to the presidency. He was not the “activist” that reporters and historians like to see in the presidency; he simply did the job required of him by the Constitution, which was to execute the laws of the United States. Coolidge chose not to run for re-election in 1928, even though he was quite popular.

Herbert Clark Hoover (1874-1964), a Republican who had been Secretary of Commerce under Coolidge, was elected to the presidency in 1928. He served as President from March 4, 1929, to March 4, 1933.

Hoover won 58 percent of the popular vote, an endorsement of the hands-off policy of Harding and Coolidge. Hoover’s administration is known mostly for the huge drop in the price of stocks (shares of corporations, which are bought and sold on stock exchanges), which became known as the “Crash”, and for the Great Depression that the Crash helped to cause. The rate of unemployment (the percentage of American workers without jobs) rose from 3 percent just before the Crash to 25 percent by 1933, at the depth of the Great Depression.

The Crash had two main causes: speculation and borrowed money. Stock prices began to rise sharply in the late 1920s, which led many persons to borrow money in order to buy stocks, in the hope that prices would continue to rise. If prices kept rising, buyers could sell their stocks at a profit and repay what they had borrowed. But when stock prices got very high in the fall of 1929, some buyers began to worry that prices would fall, so they began to sell. That drove prices down and caused more buyers to sell, in the hope of getting out of the stock market before prices fell further. Prices fell so quickly that almost everyone who owned stocks lost money, and prices kept falling. By 1933, many stocks had become worthless, and most stocks were selling for only a small fraction of the prices they had commanded before the Crash.
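
To make the arithmetic of buying stocks with borrowed money concrete, here is a minimal sketch in Python. Every number in it is invented for illustration; nothing reflects actual 1929 prices or loan terms.

```python
# Illustrative only: how buying stocks with borrowed money magnifies
# gains and losses. All figures are invented.

own_money = 100.0   # the buyer's own savings
borrowed = 900.0    # money borrowed from a broker or bank
price = 10.0        # hypothetical purchase price per share
shares = (own_money + borrowed) / price  # 100 shares

def cash_after_selling(sale_price: float) -> float:
    """Sell every share, repay the loan, and return what is left."""
    return shares * sale_price - borrowed

print(cash_after_selling(11.0))  # price up 10%: 200.0 (the buyer's money doubles)
print(cash_after_selling(9.0))   # price down 10%: 0.0 (his savings are wiped out)
print(cash_after_selling(8.0))   # price down 20%: -100.0 (he owes more than he has)
```

A 10 percent move in the price becomes a 100 percent move in the buyer’s own money, which is why falling prices ruined so many buyers so quickly.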

Borrowed money was the second cause. Because so many people had borrowed to buy stocks, they went broke when stock prices dropped, and they were then unable to pay their other debts. That had a ripple effect throughout the economy. People who went broke spent less money, and banks had less money to lend. Because people were buying less from businesses, and because businesses couldn’t get loans to stay in operation, many businesses closed and people lost their jobs. The people who lost their jobs then had less money to spend, and so still more people lost their jobs.
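
The ripple effect can be pictured as successive rounds of lost spending, each smaller than the last. Here is a minimal sketch, again with invented numbers (the respending rate is an assumption chosen only for illustration):

```python
# Illustrative only: each dollar of lost income reduces someone else's
# income, which reduces spending further. The 0.8 rate is invented.

initial_loss = 1_000_000.0  # hypothetical direct losses from the Crash
respend_rate = 0.8          # fraction of lost income that would have been spent

total_loss = 0.0
round_loss = initial_loss
while round_loss > 1.0:        # stop when a round's loss is negligible
    total_loss += round_loss
    round_loss *= respend_rate # each round is 80% of the one before

print(round(total_loss))  # about 5,000,000: five times the direct loss
```

The closed form is initial_loss / (1 - respend_rate); the point is only that a modest direct loss can shrink total spending by a multiple of itself.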

The effects of the Great Depression were felt in other countries because Americans couldn’t afford to buy as much as they used to from other countries. Also, Congress passed a law known as the Smoot-Hawley Tariff Act, which President Hoover signed. The Smoot-Hawley Act raised tariffs (taxes) on items imported into the United States, which meant that Americans bought even less from foreign countries. Foreign countries passed similar laws, which meant that foreigners began to buy less from Americans, which put more Americans out of work.

The economy might have recovered quickly, as it had done in the past when stock prices fell and unemployment increased. But the actions of government — raising tariffs and making loans harder to get — only made things worse. What could have been a brief recession turned into the Great Depression. People were frightened. They blamed President Hoover for their problems, although Hoover didn’t cause the Crash. Hoover ran for re-election in 1932, but he lost to Franklin Delano Roosevelt, a Democrat.

Franklin Delano Roosevelt (1882-1945), known as FDR, served as President from March 4, 1933, until his death on April 12, 1945, less than a month before V-E Day. FDR was elected to the presidency in 1932, 1936, 1940, and 1944 — the only person elected more than twice. Roosevelt was a very popular President because he served during the Depression and World War II, when most Americans — having lost faith in themselves — sought reassurance that “someone was in charge”. But FDR was not universally popular; his share of the popular vote rose from 57 percent in 1932 to 61 percent in 1936, then dropped to 55 percent in 1940 and 54 percent in 1944. Americans were coming to understand what FDR’s opponents knew at the time, and what objective historians have said since: FDR’s program prolonged the Depression instead of ending it.

FDR’s program to end the Great Depression was known as the New Deal. It consisted of welfare and work-relief programs, which put people to work on government projects instead of on making useful things. It also imposed higher taxes and other restrictions on business, which discouraged people from starting and investing in businesses; and the growth of business is the real cure for unemployment.

Roosevelt did try to face up to the growing threat from Germany and Japan. However, he wasn’t able to do much to prepare America’s defenses because of strong isolationist and anti-war feelings in the country. Those feelings were the result of America’s involvement in World War I. (Similar feelings in Great Britain kept that country from preparing for war with Germany, which encouraged Hitler’s belief that he could easily conquer Europe.)

When America went to war after Japan’s attack on Pearl Harbor, Roosevelt proved to be an able and inspiring commander-in-chief. But toward the end of the war his health was failing, and he was influenced by close aides who were pro-communist and sympathetic to the Soviet Union (Union of Soviet Socialist Republics, or USSR). Roosevelt allowed Soviet forces to claim Eastern Europe, including half of Germany. Roosevelt also encouraged the formation of the United Nations, where the Soviet Union (whose seat is now held by the Russian Federation) has had a strong voice because it was made a permanent member of the Security Council, the policy-making body of the UN. As a permanent member of the Security Council, Russia can veto actions proposed by the United States. (In any event, the UN has long since become a hotbed of anti-American, left-wing sentiment.)

Roosevelt’s appeasement of the USSR caused Josef Stalin (the Soviet dictator) to believe that the U.S. had weak leaders who would not challenge the USSR’s efforts to spread Communism. The result was the Cold War, which lasted for 45 years. During the Cold War the USSR developed nuclear weapons, built large military forces, kept a tight rein on countries behind the Iron Curtain (in Eastern Europe), and expanded its influence to other parts of the world.

Stalin’s belief in the weakness of U.S. leaders was largely correct, until Ronald Reagan became President. As I will discuss, Reagan’s policies led to the end of the Cold War.

Harry S Truman (1884-1972), who was Vice President in FDR’s fourth term, became President upon FDR’s death. Truman was elected in his own right in 1948, so he served as President from April 12, 1945, until January 20, 1953 — almost two full terms.

Truman made one right decision during his presidency. He approved the dropping of atomic bombs on Japan. Although hundreds of thousands of Japanese were killed by the bombs, the Japanese soon surrendered. If the Japanese hadn’t surrendered then, U.S. forces would have invaded Japan, and millions of American and Japanese lives would have been lost in the battles that followed the invasion.

Truman ordered drastic reductions in the defense budget because he thought that Stalin was an ally of the United States. (Truman, like FDR, had advisers who were Communists.) Truman changed his mind about defense budgets, and about Stalin, when Communist North Korea attacked South Korea in 1950. The attack came after Truman’s Secretary of State, Dean Acheson (the man responsible for relations with other countries), gave a speech about the countries that the United States would defend. South Korea was not one of them.

When South Korea was invaded, Truman asked General of the Army Douglas MacArthur to lead the defense of South Korea. MacArthur planned and executed the amphibious landing at Inchon, which turned the war in favor of South Korea and its allies. The allied forces then succeeded in pushing the front line far into North Korea. Communist China then entered the war on the side of North Korea. MacArthur wanted to counterattack Communist Chinese bases and supply lines in Manchuria, but Truman wouldn’t allow that. Truman then “fired” MacArthur because MacArthur spoke publicly about his disagreement with Truman’s decision. The Chinese Communists pushed allied forces back and the Korean War ended in a deadlock, just about where it had begun, near the 38th parallel.

In the meantime, Communist spies had stolen the secret plans for making atomic bombs. They were able to do that because Truman refused to hear the truth about Communist spies who were working inside the government. By the time Truman left office, the Soviet Union had manufactured nuclear weapons, had strengthened its grip on Eastern Europe, and was beginning to expand its influence into the Third World (the developing nations of Africa, Asia, the Middle East, and Latin America).

Truman was very unpopular by 1952. As a result he chose not to run for re-election, even though he could have done so. (The Twenty-Second Amendment to the Constitution, which limits a President to two elected terms, was adopted while Truman was President, but it didn’t apply to him.)

Dwight David Eisenhower (1890-1969), a Republican, served as President from January 20, 1953 to January 20, 1961. Eisenhower (also known by his nickname, “Ike”) received 55 percent of the popular vote in 1952 and 57 percent in 1956; his Democrat opponent in both elections was Adlai Stevenson. The Republican Party chose Eisenhower as a candidate mainly because he had become famous as a general during World War II. Republican leaders thought that by nominating Eisenhower they could end the Democrats’ twenty-year hold on the presidency. The Republican leaders were right about that, but in choosing Eisenhower as a candidate they rejected the Republican Party’s traditional stand in favor of small government.

Eisenhower was a “moderate” Republican. He was not a “big spender” but he did not try to undo all of the new government programs that had been started by FDR and Truman. Traditional Republicans eventually fought back and, in 1964, nominated a small-government candidate named Barry Goldwater. I will discuss him when I get to President Lyndon B. Johnson.

Eisenhower was a popular President, and he was a good manager, but he gave the impression of being “laid back” and not “in charge” of things. The news media had led Americans to believe that “activist” Presidents are better than laissez-faire Presidents, and so by 1960 there was a lot of talk about “getting the country moving again” — as if it were the job of the President to “run” the country instead of executing the laws duly enacted in accordance with the Constitution.

John Fitzgerald Kennedy (1917-1963), a Democrat, was elected in 1960 to succeed President Eisenhower. Kennedy, who became known as JFK, served from January 20, 1961, until November 22, 1963, when he was assassinated in Dallas, Texas.

One reason that Kennedy won the election of 1960 (with 50 percent of the popular vote) was his image of “vigorous youth” (he was 27 years younger than Eisenhower). In fact, JFK had been in bad health for most of his life. He seemed to be healthy only because he used a lot of medications. Those medications probably impaired his judgment and would have caused him to die at a relatively early age if he hadn’t been assassinated.

Late in Eisenhower’s administration a Communist named Fidel Castro had taken over Cuba, which is only 90 miles south of Florida. The Central Intelligence Agency then began to work with anti-Communist exiles from Cuba. The exiles were going to attempt an invasion of Cuba at a place called the Bay of Pigs. In addition to providing the necessary military equipment, the U.S. was also going to provide air support during the invasion.

JFK succeeded Eisenhower before the invasion took place, in April 1961. JFK approved changes in the invasion plan that resulted in its failure; the most important change was to discontinue air support for the invading forces. The exiles were defeated, and Castro remained firmly in control of Cuba for decades afterward.

The failed invasion caused Castro to turn to the USSR for military and economic assistance. In exchange for that assistance, Castro agreed to allow the USSR to install medium-range ballistic missiles in Cuba. That led to the so-called Cuban Missile Crisis in 1962. Many historians give Kennedy credit for resolving the crisis and avoiding a nuclear war with the USSR. The Russians withdrew their missiles from Cuba, but JFK had to agree to withdraw American missiles from bases in Turkey.

The myth that Kennedy had stood up to the Russians made him more popular in the U.S. His major accomplishment, which Democrats today like to ignore, was to initiate tax cuts, which became law after his assassination. The Kennedy tax cuts helped to make America more prosperous during the 1960s by giving people more money to spend, and by encouraging businesses to expand and create jobs.

The assassination of JFK on November 22, 1963, in Dallas was a shocking event. It also led many Americans to believe that JFK would have become a great President if he had lived and been re-elected to a second term. There is little evidence that JFK would have become a great President. His record in Cuba suggests that he would not have done a good job of defending the country.

Lyndon Baines Johnson (1908-1973), also known as LBJ, was Kennedy’s Vice President and became President upon Kennedy’s assassination. LBJ was elected in his own right in 1964; he served as President from November 22, 1963, to January 20, 1969. LBJ’s Republican opponent in 1964 was Barry Goldwater, an old-style Republican conservative in favor of limited government and a strong defense. LBJ portrayed Goldwater as a threat to America’s prosperity and safety, when it was LBJ who was the real threat. Americans were still in shock over JFK’s assassination, and so they rallied around LBJ, who won 61 percent of the popular vote.

LBJ is known mainly for two things: his “Great Society” program and the war in Vietnam. The Great Society program was an expansion of FDR’s New Deal. It included such things as the creation of Medicare, which is medical care for retired persons that is paid for by taxes. Medicare is an example of a “welfare” program. Welfare programs take money from people who earn it and give money to people who don’t earn it. The Great Society also included many other welfare programs, such as more benefits for persons who are unemployed. The stated purpose of the expansion of welfare programs under the Great Society was to end poverty in America, but that didn’t happen. The reason it didn’t happen is that when people receive welfare they don’t work as hard to take care of themselves and their families, and they don’t save enough money for their retirement. Welfare actually makes people worse off in the long run.

America’s involvement in Vietnam began in the 1950s, when Eisenhower was President. South Vietnam was under attack by Communist guerrillas, who were sponsored by North Vietnam. Small numbers of U.S. forces were sent to South Vietnam to train and advise South Vietnamese forces. More U.S. advisers were sent by JFK, but within a few years after LBJ became President he had turned the war into an American-led defense of South Vietnam against Communist guerrillas and regular North Vietnamese forces. LBJ decided that it was important for the U.S. to defeat a Communist country and stop Communism from spreading in Southeast Asia.

However, LBJ was never willing to commit enough forces in order to win the war. He allowed air attacks on North Vietnam, for example, but he wouldn’t invade North Vietnam because he was afraid that the Chinese Communists might enter the war. In other words, like Truman in Korea, LBJ was unwilling to do what it would take to win the war decisively. Progress was slow and there were a lot of American casualties from the fighting in South Vietnam. American newspapers and TV began to focus attention on the casualties and portray the war as a losing effort. That led a lot of Americans to turn against the war, and college students began to protest the war (because they didn’t want to be drafted). Attention shifted from the war to the protests, giving the world the impression that America had lost its resolve. And it had.

LBJ had become so unpopular because of the war in Vietnam that he decided not to run for President in 1968. Most of the candidates for President campaigned by saying that they would end the war. In effect, the United States had announced to North Vietnam that it would not fight the war to win. The inevitable outcome was the withdrawal of U.S. forces from Vietnam, which finally happened in 1973, under LBJ’s successor, Richard Nixon. South Vietnam was left on its own, and it fell to North Vietnam in 1975.

Richard Milhous Nixon (1913-1994) was a Republican. He won the election of 1968 by beating the Democrat candidate, Hubert H. Humphrey (who had been LBJ’s Vice President), and a third-party candidate, George C. Wallace. Nixon and Humphrey each received 43 percent of the popular vote; Wallace received 14 percent. If Wallace had not been a candidate, most of the votes cast for him probably would have been cast for Nixon.

Even though Nixon received less than half of the popular vote, he won the election because he received a majority of electoral votes. Electoral votes are awarded to the winner of each State’s popular vote. Nixon won a lot more States than Humphrey and Wallace, so Nixon became President.
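
A minimal sketch of that winner-take-all arithmetic, with invented States, vote totals, and elector counts, shows how a candidate can win the presidency while losing the nationwide popular vote:

```python
# Illustrative only: winner-take-all awarding of electoral votes.
# The States, vote totals, and elector counts below are invented.

states = [  # (electoral votes, popular votes by candidate)
    (10, {"A": 400_000, "B": 390_000, "C": 150_000}),  # A carries the State narrowly
    (20, {"A": 900_000, "B": 880_000, "C": 300_000}),  # A carries the State narrowly
    (15, {"A": 200_000, "B": 700_000, "C": 100_000}),  # B wins here in a landslide
]

electoral = {"A": 0, "B": 0, "C": 0}
popular = {"A": 0, "B": 0, "C": 0}

for ev, votes in states:
    winner = max(votes, key=votes.get)  # a plurality carries the whole State
    electoral[winner] += ev
    for candidate, n in votes.items():
        popular[candidate] += n

print(electoral)  # {'A': 30, 'B': 15, 'C': 0}: A wins 30 of 45 electors
print(popular)    # yet B leads the popular vote, 1,970,000 to 1,500,000
```

Narrow pluralities in many States outweigh an opponent’s landslides in a few, which is how Nixon turned 43 percent of the popular vote into an electoral majority.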

Nixon won re-election in 1972, with 61 percent of the popular vote, by beating a Democrat (George McGovern) who would have expanded LBJ’s Great Society and cut America’s armed forces even more than they were cut after the Vietnam War ended. Nixon’s victory was more a repudiation of McGovern than it was an endorsement of Nixon. His second term ended in disgrace when he resigned the presidency on August 9, 1974.

Nixon called himself a conservative, but he did nothing during his presidency to curb the power of government. He did not cut back on the Great Society. He spent a lot of time on foreign policy, but his diplomatic efforts did nothing to make the USSR and Communist China friendlier to the United States. Nixon had shown that he was essentially a weak President by allowing U.S. forces to withdraw from Vietnam. Dictatorial rulers do not respect countries that display weakness.

Nixon was the first (and only) President who resigned from office. He resigned because the House of Representatives was ready to impeach him. An impeachment is like a criminal indictment; it is a set of charges against the holder of a public office. If Nixon had been impeached by the House of Representatives, he would have been tried by the Senate. If two-thirds of the Senators had voted to convict him he would have been removed from office. Nixon knew that he would be impeached and convicted, so he resigned.

The main charge against Nixon was that he ordered his staff to cover up his involvement in a crime that happened in 1972, when Nixon was running for re-election. The crime was a break-in at the Democratic Party’s headquarters, which was located in the Watergate Building in Washington, D.C.; the episode therefore became known as the Watergate Scandal.

The purpose of the break-in was to obtain documents that might help Nixon’s re-election effort. The men who participated in the break-in were hired by aides to Nixon. Details about the break-in and Nixon’s involvement were revealed as a result of investigations by Congress, which were helped by reporters who were doing their own investigative work.

But there is good reason to believe that Nixon was unjustly forced from office by the concerted efforts of the news media (most of which had long been biased against Nixon), Democrats in Congress, and many Republicans who were anxious to rid themselves of Nixon, who was a magnet for controversy.

Gerald Rudolph Ford (born Leslie King Jr.) (1913-2006), who was Nixon’s Vice President at the time Nixon resigned, became President on August 9, 1974, and served until January 20, 1977. As Vice President, Ford had succeeded Spiro T. Agnew, who resigned on October 10, 1973, because he had been taking bribes while he was Governor of Maryland (the job he had before becoming Vice President).

Ford became the first Vice President chosen in accordance with the Twenty-Fifth Amendment to the Constitution, which spells out procedures for filling vacancies in the presidency and vice presidency. When Vice President Agnew resigned, President Nixon nominated Ford as Vice President, and the nomination was approved by majority votes of the House and Senate. Then, when Ford became President, he nominated Nelson Rockefeller to fill the vice presidency, and Rockefeller was likewise confirmed by the House and Senate.

Ford ran for election in his own right in 1976, but he was defeated by James Earl Carter, mainly because of the Watergate Scandal. Ford was not involved in the scandal, but voters often cast votes for silly reasons. Carter’s election was a rejection of Richard Nixon, who had left office two years earlier, not a vote of confidence in Carter.

James Earl (“Jimmy”) Carter Jr. (1924 – ), a Democrat who had been Governor of Georgia, received only 50 percent of the popular vote. He was defeated for re-election in 1980, so he served as President from January 20, 1977 to January 20, 1981.

Carter was an ineffective President who failed at the most important duty of a President, which is to protect Americans from foreign enemies. His failure came late in his term of office, during the Iran Hostage Crisis. The Shah of Iran had ruled the country for 38 years. He was overthrown in 1979 by a group of Muslim clerics (religious men) who disliked the Shah’s pro-American policies. In November 1979 a group of students loyal to the new Muslim government of Iran invaded the American embassy in Tehran (Iran’s capital city) and took 66 hostages. Carter approved rescue efforts, but they were poorly planned. The hostages were still captive at the time of the presidential election in 1980, and Carter lost largely because of his feeble rescue efforts.

In recent years Carter has become an outspoken critic of America’s foreign policy. Carter is sympathetic to America’s enemies and he opposes strong military action in defense of America.

Ronald Wilson Reagan (1911-2004), a Republican, succeeded Jimmy Carter as President. Reagan won 51 percent of the popular vote in 1980. Reagan would have received more votes, but a former Republican (John Anderson) ran as a third-party candidate and took 7 percent of the popular vote. Reagan was re-elected in 1984 with 59 percent of the popular vote. He served as President from January 20, 1981, until January 20, 1989.

Reagan had two goals as President: to reduce the size of government and to increase America’s military strength. He was unable to reduce the size of government because, for most of his eight years in office, Democrats were in control of Congress. But Reagan was able to get Congress to approve large reductions in income-tax rates. Those reductions led to more spending on consumer goods and more investment in the creation of new businesses. As a result, Americans had more jobs and higher incomes.

Reagan succeeded in rebuilding America’s military strength. He knew that the only way to defeat the USSR, without going to war, was to show the USSR that the United States was stronger. A lot of people in the United States opposed spending more on military forces; they thought that it would cause the USSR to spend more, and that a war between the U.S. and the USSR would result. Reagan knew better. He knew that the USSR could not afford to keep up with the United States. Reagan was right. Not long after the end of his presidency, the countries of Eastern Europe saw that the USSR was really a weak country, and they began to break away from Soviet control. Residents of Berlin demolished the Berlin Wall, which had been erected in 1961 to keep East Berliners from crossing over into West Berlin. East Germany was freed from Communist rule, and it reunited with West Germany. The USSR collapsed, and many of the countries that had been part of the USSR became independent. We owe the end of the Soviet Union and its influence to President Reagan’s determination to defeat the threat that it posed.

George Herbert Walker Bush (1924 – 2019), a Republican, was Reagan’s Vice President. He won 54 percent of the popular vote when he defeated his Democrat opponent, Michael Dukakis, in the election of 1988. Bush lost the election of 1992. He served as President from January 20, 1989 to January 20, 1993.

The main event of Bush’s presidency was the Gulf War of 1990-1991. Iraq, whose ruler was Saddam Hussein, invaded the small neighboring country of Kuwait. Kuwait produces and exports a lot of oil. The occupation of Kuwait by Iraq meant that Saddam Hussein might have been able to control the amount of oil shipped to other countries, including Europe and the United States. If Hussein had been allowed to control Kuwait, he might have moved on to Saudi Arabia, which produces much more oil than Kuwait. President Bush asked Congress to approve military action against Iraq. Congress approved the action, although most Democrats voted against giving President Bush authority to defend Kuwait. The war ended in a quick defeat for Iraq’s armed forces. But President Bush decided not to allow U.S. forces to finish the job and end Saddam Hussein’s reign as ruler of Iraq.

Bush’s other major blunder was to raise taxes, which helped to cause a recession. The country was recovering from the recession in 1992, when Bush ran for re-election, but his opponents convinced voters that he hadn’t done enough to end it. In spite of his quick (but incomplete) victory in the Gulf War, Bush lost his bid for re-election because voters were concerned about the state of the economy.

William Jefferson Clinton (born William Jefferson Blythe III) (1946 – ), a Democrat, defeated George H.W. Bush in the 1992 election by gaining a majority of the electoral vote. But Clinton won only 43 percent of the popular vote. Bush won 37 percent, and 19 percent went to H. Ross Perot, a third-party candidate who received many votes that probably would otherwise have been cast for Bush.

Clinton’s presidency got off to a bad start when he sent Congress a proposal that would have put health care under government control. Congress rejected the plan, and a year later (in 1994) voters went to the polls in large numbers to elect Republican majorities to the House and Senate.

Clinton was able to win re-election in 1996, but he received only 49 percent of the popular vote. He was re-elected mainly because fewer Americans were out of work and incomes were rising. This economic “boom” was a continuation of the recovery that began under President Reagan. Clinton got credit for the “boom” of the 1990s, which occurred in spite of tax increases passed by Congress while it was still controlled by Democrats.

Clinton was perceived as a “moderate” Democrat because he tried to balance the government’s budget; that is, he tried not to spend more money than the government was receiving in taxes. He was eventually able to balance the budget, but only because he cut defense spending. In addition, Clinton made several bad decisions about defense issues. In 1993 he withdrew American troops from Somalia instead of continuing the military mission there after some troops were captured and killed by Somali militiamen. In 1994 he signed an agreement with North Korea that was supposed to keep North Korea from developing nuclear weapons, but the North Koreans had fooled Clinton and continued to work on building nuclear weapons. By 1998 Clinton knew that al Qaeda had become a major threat, when terrorists bombed two U.S. embassies in Africa, but he failed to go to war against al Qaeda. Only after terrorists struck a Navy ship, the USS Cole, in 2000 did Clinton declare terrorism to be a major threat. By then, his term of office was almost over.

Clinton was the second President to be impeached. The House of Representatives impeached him in 1998. He was charged with perjury (lying under oath), committed when he was the defendant (the person being sued) in a lawsuit. The Senate didn’t convict Clinton because every Democrat senator refused to vote for conviction, in spite of overwhelming evidence that Clinton was guilty. The day before Clinton left office, he acknowledged his guilt by agreeing to a five-year suspension of his law license. A federal judge also held Clinton in contempt of court for his misleading testimony and fined him about $90,000.

Clinton was involved in other scandals during his presidency, but he remains popular with many people because he is good at giving the false impression that he is a nice, humble person.

Clinton’s scandals had more effect on his Vice President, Al Gore, who ran for President as the nominee of the Democrat Party in 2000. Gore’s main opponent was George W. Bush, a Republican; a third-party candidate named Ralph Nader also received a lot of votes. The election of 2000 was the closest presidential election since 1876. Bush and Gore each won about 48 percent of the popular vote (Gore’s percentage was slightly higher than Bush’s); Nader won 3 percent. The winner was decided by the outcome of the vote in Florida, which was the subject of legal proceedings for six weeks and finally had to be resolved by the U.S. Supreme Court.

Initial returns in Florida gave that State’s electoral votes to Bush, which meant that he would become President. But the Supreme Court of Florida decided that election officials should violate Florida’s election laws and keep recounting the ballots in certain counties. Those counties were selected because they had more Democrats than Republicans, and so it was likely that recounts would favor Gore, the Democrat. The case finally went to the U.S. Supreme Court, which decided that the Florida Supreme Court was wrong. The U.S. Supreme Court ordered an end to the recounts, and Bush was declared the winner of Florida’s electoral votes.

George Walker Bush (1946 – ), a Republican, was the second son of a President to become President. (The first was John Quincy Adams, the sixth President, whose father, John Adams, was the second President. Also, Benjamin Harrison, the 23rd President, was the grandson of William Henry Harrison, the ninth President.) Bush won re-election in 2004, with 51 percent of the popular vote. He served as President from January 20, 2001, to January 20, 2009.

President Bush’s major accomplishment before September 11, 2001, was to get Congress to cut taxes. The tax cuts were necessary because the economy had been in a recession since 2000. The tax cuts gave people more money to spend and encouraged businesses to expand and create new jobs.

The terrorist attacks on September 11, 2001, caused President Bush to give most of his time and attention to the War on Terror. The invasion of Afghanistan, late in 2001, was part of a larger campaign to disrupt terrorist activities. Afghanistan was ruled by the Taliban, a group that gave support and shelter to al Qaeda terrorists. The U.S. quickly defeated the Taliban and destroyed al Qaeda bases in Afghanistan.

The invasion of Iraq, which took place in 2003, was also intended to combat al Qaeda, but in a different way. Iraq, under Saddam Hussein, had been an enemy of the U.S. since the Persian Gulf War of 1990-1991. Hussein was trying to acquire deadly weapons to use against the U.S. and its allies. Hussein was also giving money to terrorists and sheltering them in Iraq. The defeat of Hussein, which came quickly after the invasion of Iraq, was intended to establish a stable, friendly government in the Middle East.

The invasion of Iraq produced some of the intended results, but there was much unrest there because of long-standing animosity between Sunni Muslims and Shi’a Muslims. There was also much defeatist talk about Iraq — especially by Democrats and the media. That defeatist talk encouraged those who were creating the unrest; it gave them hope that the U.S. would abandon Iraq, just as it had abandoned Vietnam more than 30 years earlier. The country had become almost uncontrollable by the time Bush authorized a military “surge” — enough additional troops to quell the unrest.

However, Bush, like his father, failed to take a strategically decisive course of action. He should have ended the pretense of “nation-building”, beefed up U.S. military presence, and installed a compliant Iraqi government. That would have created a U.S. stronghold in the Middle East and stifled Iran’s moves toward regional hegemony, just as the presence of U.S. forces in Europe for decades after World War II kept the USSR from seizing new territory and eventually wore it down.

With Iraq as a U.S. base of operations, it would have been easier to quell Afghanistan and to launch preemptive strikes on Iran’s nuclear-weapons program while it was still in its early stages.

But the early failures in Iraq — and the futility of the Afghan operation (also done on the cheap) — meant that Bush had no political backing for bolder military measures. Further, the end of his second term was blighted by a financial crisis that led to a stock-market crash, the failure of some major financial firms, the bailout of some others, and thence to the Great Recession.

The election of 2008 coincided with the economic downturn, and it was no surprise that the Democrat candidate handily beat the feckless Republican (in-name-only) candidate, John Sidney McCain III.

Barack Hussein Obama II (1961 – ) was the Democrat who defeated McCain. Obama, like most of his predecessors, was a professional politician, but most of his political experience was as a “community organizer” (i.e., rabble-rouser and shakedown artist) in Chicago. He was still serving in his first major office (as U.S. Senator from Illinois) when he vaulted ahead of Hillary Rodham Clinton and seized the Democrat nomination for the presidency. He served as President from January 20, 2009, until January 20, 2017.

Obama’s ascendancy was owed in large part to the perception of him as youthful and energetic. He was careful to seem moderate in his campaign rhetoric, though those in the know (party leaders and activists) were well aware of his strong left-wing leanings, which had been revealed in his Senate votes and positions. Clinton, by contrast, was perceived as middle-of-the-road, but only because the road had shifted well to the left over the years. It was she, for example, who propounded the health-care nationalization scheme known as HillaryCare. The scheme was defeated in Congress, but it was responsible in large part for the massive swing of House seats in 1994, which returned the House to GOP control for the first time in 42 years.

Obama’s election was due also to a healthy dose of white “guilt”. Here was an opportunity for many voters to “prove” (and to brag about) their lack of racism. And so, given the experience of Iraq, the onset of the Great Recession, and a me-too Republican candidate, they did the easy thing by voting for Obama, and enjoyed the feel-good sensation that went with it.

At any rate, Obama served two terms (the second was secured by defeating Willard Mitt Romney, another feckless RINO). His presidency throughout both terms was marked by disastrous policies; for example:

  • Obamacare, which drastically raised health-care costs and insurance premiums and added millions of freeloaders to Medicaid
  • encouragement of illegal immigration, which imposes heavy burdens on middle-class taxpayers and is intended to swell the rolls of Democrat voters through amnesty schemes
  • increases in marginal tax rates for individuals and businesses
  • issuance of economically stultifying regulations at an unprecedented pace
  • nomination of dozens of left-wing judges and two left-wing Supreme Court Justices, partly to ensure “empathic” (leftist) rulings rather than rulings in accordance with the Constitution
  • sharp reductions in defense spending
  • meddling in Libya, which through Hillary Clinton’s negligence cost the lives of American diplomats
  • Clinton’s use of a private e-mail server, in which Obama was complicit, and which resulted in the compromise of sensitive, classified information
  • a drastic military draw-down in Iraq, with immediately dire consequences (and a just-in-time reversal by Obama)
  • persistent anti-white and anti-American rhetoric (the latter especially on foreign soil and at the UN)
  • persistent anti-business rhetoric that, together with tax increases and regulatory excesses, killed the recovery from the Great Recession and put the U.S. firmly on the road to economic stagnation.

It should therefore have been a simple matter for voters to reject Obama’s inevitable successor: Hillary Clinton. But the American public has been indoctrinated in leftism for decades by public schools, the mainstream media, and a plethora of TV shows and movies, with the result that Clinton acquired almost 3 million more popular votes, nationwide, than did her Republican opponent. The foresight of the Framers of the Constitution proved providential: her opponent carefully chose his battlegrounds and won handily in the electoral college. Thus …

Donald John Trump (1946 – ) succeeded Obama and was inaugurated as President on January 20, 2017. He is only in the third year of his presidency, but has accomplished much despite a “resistance” movement that began as soon as his election was assured in the early-morning hours of November 9, 2016. (The “resistance”, which I discuss here, is a continuation of political and social trends that are rooted in the 1960s.)

These are among Trump’s accomplishments, many of them the result of a successful collaboration with Congress, both houses of which were controlled by Republicans for the first two years of Trump’s presidency (the Senate remains under GOP control):

  • the end of Obamacare’s requirement to buy some form of health insurance or pay a “tax”, which penalized the healthy and forced many to do something that they would otherwise not do
  • discouragement of illegal immigration through tougher enforcement (against a huge, left-wing-financed influx of illegals)
  • decreases in marginal tax rates for individuals and businesses
  • the repeal of many economically stultifying regulations and a drastic slowdown in the issuance of regulations
  • nomination of dozens of conservative judges and two conservative Supreme Court Justices
  • sharp increases in defense spending
  • the beginning of the end of foreign adventures that are unrelated to the interests of Americans (e.g., the drawdown in Syria)
  • relative stability in Iraq
  • pro-American rhetoric on foreign soil and at the UN
  • persistent pro-business rhetoric that, together with tax-rate cuts and regulatory reform, is helping to buoy the U.S. economy despite slowdowns elsewhere and Trump’s “trade war”, which is really aimed at creating a level playing field for American companies and workers.

This story will be continued.

More Presidential Trivia

The modern presidency began with the adored “activist”, Teddy Roosevelt. From TR to the present, only four of twenty presidents first competed in a general election as candidates for the presidency itself: Taft, Hoover, Eisenhower, and Trump. Trump is alone in having had no previous governmental service before becoming president. There’s no moral to this story. Make of it what you will.

(See also “Presidents: Key Dates and Various Trivia“, to which this commentary has been added.)

That “Hurtful” Betsy Ross Flag

Fox News has the latest:

Two Democratic hopefuls have expressed their support for Nike after the sportswear company pulled sneakers featuring the Betsy Ross-designed American flag ahead of the Fourth of July holiday. The company did so after former NFL quarterback and Nike endorser Colin Kaepernick raised concerns about the shoes.

Former HUD Secretary Julián Castro told CBS News on Wednesday that he was “glad to see” Nike remove the shoes from the shelves, comparing the “painful” symbol to the Confederate flag.

“There are a lot of things in our history that are still very painful,” Castro explained. As an example, he cited “the Confederate flag that still flies in some places and is used as a symbol.”

Former Texas congressman Beto O’Rourke also approved of Nike’s decision, noting that “white nationalist groups” have “appropriated” the Betsy Ross flag.

“I think it’s really important to take into account the impression that kind of symbol would have for many of our fellow Americans,” he said, according to Jewish Insider senior political reporter Ben Jacobs.

As I understand it, the Betsy Ross flag, which became the symbol of the rebellious, united States (i.e., Colonies) in 1777, is “hurtful” because it dates from an era when slavery was legal in what became the United States. How that historical fact is “hurtful” to anyone is beyond me. The fact of slavery is reprehensible, but a flag that merely denotes America’s struggle for independence from Britain really has nothing to do with slavery, except in the slippery way that “social justice” warriors have just invented. (Clearly, they are running low on ideas.)

Well, if the Betsy Ross flag is “hurtful” to professional virtue-signalers and malcontents, it is certainly — and more legitimately — hurtful to me. I am a direct descendant of a man who, with three of his sons (one of whom I am also directly descended from), fought on the British side in the Revolutionary War. They had settled in the Colony of Pennsylvania in the 1750s and, perhaps not unwisely, chose to defend the Crown against presumptuous rebels like George Washington and the 56 signatories of the Declaration of Independence, among them Samuel Adams and Thomas Jefferson, all of whom used to be called patriots. (Washington, Jefferson, and many of the signatories owned slaves, but that wasn’t why they rebelled; slavery was then still legal throughout the British Empire.)

In any event, because my ancestors were Loyalists, they fled to Canada at the end of the war. And from then until the birth of my father in the United States more than 130 years later, the ancestors in my paternal line of descent were Canadian and therefore (nominally, at least) subjects of the British monarch.

So if anyone has a right to be offended by the Betsy Ross flag, it is I. But I am not offended by the flag, though I am deeply offended by the useless twits who profess to be offended by it.

The Fall of America

Victor Davis Hanson, like many others before him (and like me), sees the unraveling of America portended by Petronius’s The Satyricon (ca. 60 AD):

Certain themes … are timeless and still resonate today.

The abrupt transition from a society of rural homesteaders into metropolitan coastal hubs had created two Romes. One world was a sophisticated and cosmopolitan network of traders, schemers, investors, academics, and deep-state imperial cronies. Their seaside corridors were not so much Roman as Mediterranean. And they saw themselves more as “citizens of the world” than as mere Roman citizens.

In the novel, vast, unprecedented wealth had produced license. On-the-make urbanites suck up and flatter the childless rich in hopes of being given estates rather than earning their own money….

[The] novel’s accepted norms are pornography, gratuitous violence, sexual promiscuity, transgenderism, delayed marriage, childlessness, fear of aging, homelessness, social climbing, ostentatious materialism, prolonged adolescence, and scamming and conning in lieu of working.

The characters are fixated on expensive fashion, exotic foods, and pretentious name-dropping. They are the lucky inheritors of a dynamic Roman infrastructure that had globalized three continents. Rome had incorporated the shores of the Mediterranean under uniform law, science, institutions—all kept in check by Roman bureaucracy and the overwhelming power of the legions, many of them populated by non-Romans.

Never in the history of civilization had a generation become so wealthy and leisured, so eager to gratify every conceivable appetite—and yet so bored and unhappy.

But there was also a second Rome in the shadows. Occasionally the hipster antiheroes of the novel bump into old-fashioned rustics, shopkeepers, and legionaries. They are what we might now call the ridiculed “deplorables” and “clingers.”…

Globalization had enriched and united non-Romans into a world culture. That was an admirable feat. But such homogenization also attenuated the very customs, traditions, and values that had led to such astounding Roman success in the first place….

But the new empire also diluted a noble and unique Roman agrarianism. It eroded nationalism and patriotism. The empire’s wealth, size, and lack of cohesion ultimately diminished Roman unity, as well as traditional marriage, child-bearing, and autonomy….

[W]ide reading ensures erudition and sophistication, and helps science supplant superstition. But sometimes education is also ambiguous. Students become idle, pretentious loafers. Professors are no different from loud pedants. Writers are trite and boring. Elite pundits sound like gasbags.

Petronius seems to imply that whatever the Rome of his time was, it was likely not sustainable—but would at least be quite exciting in its splendid decline.

Petronius also argues that with too much rapid material progress comes moral regress. His final warning might be especially troubling for the current generation of Western Europeans and Americans. Even as we brag of globalizing the world and enriching the West materially and culturally, we are losing our soul in the process.

Getting married, raising families, staying in one place, still working with our hands, and postponing gratification may be seen as boring and out of date. But nearly 2,000 years later, all of that is what still keeps civilization alive.

Hanson omits — because Petronius’s prescience was limited — the end game, in which the glory that was Rome was extinguished by internal rot, military failure, and invasion. The first of those — internal rot — is well underway in the United States, “thanks” to the Democrat Party. The second — military failure — has become more or less a habit since the Korean War — a habit that will resume with the eventual return to power of the Democrat Party. The third — invasion — probably will be accomplished in bloodless form by the determination of China’s leadership, when a Democrat administration (having disarmed the country) accedes to military and economic coercion.

And, ironically (but blessedly) that will put paid to the kinds of excesses that Democrats have fostered in their zeal for (evanescent) power: pornography, gratuitous violence, sexual promiscuity, transgenderism, delayed marriage, childlessness, fear of aging, homelessness, social climbing, ostentatious materialism, prolonged adolescence, and scamming and conning in lieu of working.

America’s virtual state of servitude will also put paid to the last vestiges of liberty in the land, though they would have eventually disappeared under Democrat rule.

End of a Generation

The so-called greatest generation has died out in my family, as it soon will die out across the land. The recent death of my mother-in-law at age 98 removed from the scene the last of my wife’s and my parents and their siblings: 26 of them in all.

Their birth years ranged from 1903 to 1922. There were, oddly, 18 males as against only 8 females, and the disparity held for all four sets of siblings:

  • 7 to 3 for my mother’s set
  • 2 to 1 for my father’s set
  • 5 to 3 for my wife’s mother’s set
  • 4 to 1 for my wife’s father’s set.

Only one of the 26 died before reaching adulthood (my father’s younger brother, at 18 months). Two others (also males) died relatively young. One of my mother’s brothers died just a few weeks before his 40th birthday as a result of a jeep accident (he was on active duty in the Coast Guard). One of my wife’s mother’s brothers died at age 48 as a long-delayed result of a blow to the head by a police truncheon.

The other 15 males lived to ages ranging from 65 to 96, with an average age at death of 77 years. The 8 females lived to ages ranging from 69 to 99, with an average age at death of 87 years. The longest-lived of the males was the only one to pass the 90 mark. Four of the females lived into their 90s, dying at ages 91, 96, 98, and 99.

All of the 25 who reached adulthood also married. Only two of them had a marriage end in divorce. All of them were raised in near-poverty or in somewhat comfortable circumstances that vanished with the onset of the Great Depression. All of them worked hard, whether in the home or outside of it; none of them went on welfare; most of the men and two of the women served in uniform during World War II.

Thus passeth a generation sui generis.

Where are Elmer, Herman, Bert, Tom and Charley,
The weak of will, the strong of arm, the clown, the boozer, the fighter?
All, all, are sleeping on the hill….

Where are Ella, Kate, Mag, Lizzie and Edith,
The tender heart, the simple soul, the loud, the proud, the happy one?
All, all, are sleeping on the hill.

Edgar Lee Masters, Spoon River Anthology (“The Hill“)

Peak Civilization

The fate of most human endeavors is that they reach a peak of attainment, which is then followed by a decline due to excess on the one hand and neglect on the other. “Classical” music is a favorite example of mine. The form peaked around the turn of the 20th century, then went over the top into — variously — cacophony, atonality, and arrhythmic confusion. The best of contemporary “classical” music is merely derivative of the form as it was at its peak.

So it is with myriad endeavors, the most important of which is the endeavor of rational inquiry. In the West, rational inquiry seems to have peaked in the early 1960s. I needn’t remind you of the subsequent descent: mobs, riots, the din of “entertainment”, quasi-religious movements from hippiedom to “climate change”, and on and on into the night.

It all makes me glad that I came of age in the 1950s, when civilized discourse was still possible and scientists were dedicated to the pursuit of truth, not the projection of their hopes, fears, and feelings.

Not with a Bang

This is the way the world ends
This is the way the world ends
This is the way the world ends
Not with a bang but a whimper.

T. S. Eliot, “The Hollow Men”

It’s also the way that America is ending. Yes, there are verbal fireworks aplenty, but there will not be a “hot” civil war. The country that my parents and grandparents knew and loved — the country of my youth in the 1940s and 1950s — is just fading away.

This would not necessarily be a bad thing if the remaking of America were a gradual, voluntary process, leading to time-tested changes for the better. But that isn’t the case. The very soul of America has been and is being ripped out by the government that was meant to protect that soul, and by movements that government not only tolerates but fosters.

Before I go further, I should explain what I mean by America, which is not the same thing as the geopolitical entity known as the United States, though the two were tightly linked for a long time.

America was a relatively homogeneous cultural order that fostered mutual respect, mutual trust, and mutual forbearance — or far more of those things than one might expect in a nation as populous and far-flung as the United States. Those things — conjoined with a Constitution that has been under assault since the New Deal — made America a land of liberty. That is to say, they fostered real liberty, which isn’t an unattainable state of bliss but an actual (and imperfect) condition of peaceful, willing coexistence and its concomitant: beneficially cooperative behavior.

The attainment of this condition depends on social comity, which depends in turn on (a) genetic kinship and (b) the inculcation and enforcement of social norms, especially the norms that define harm.

All of that is going by the boards because the emerging cultural order is almost diametrically opposite that which prevailed in America. The new dispensation includes:

  • casual sex
  • serial cohabitation
  • subsidized illegitimacy
  • abortion on demand
  • easy divorce
  • legions of non-mothering mothers
  • concerted (and deluded) efforts to defeminize females and to neuter or feminize males
  • gender-confusion as a burgeoning norm
  • “alternative lifestyles” that foster disease, promiscuity, and familial instability
  • normalization of drug abuse
  • forced association (with accompanying destruction of property and employment rights)
  • suppression of religion
  • rampant obscenity
  • identity politics on steroids
  • illegal immigration as a “right”
  • “free stuff” from government (Social Security was meant to be self-supporting)
  • America as the enemy
  • all of this (and more) as gospel to influential elites whose own lives are modeled mostly on old America.

As the culture has rotted, so have the ties that bound America.

The rot has occurred to the accompaniment of cacophony. Cultural coarsening begets loud and inconsiderate vulgarity. Worse than that is the cluttering of the ether with vehement and belligerent propaganda, most of it aimed at taking down America.

The advocates of the new dispensation haven’t quite finished the job of dismantling America. But that day isn’t far off. Complete victory for the enemies of America is only a few election cycles away. The squishy center of the electorate — as is its wont — will swing back toward the Democrat Party. With a Democrat in the White House, a Democrat-controlled Congress, and a few party switches in the Supreme Court (or the packing of it), the dogmas of the anti-American culture will become the law of the land; for example:

Billions and trillions of dollars will be wasted on various “green” projects, including but far from limited to the complete replacement of fossil fuels by “renewables”, with the resulting impoverishment of most Americans (except for the comfortable elites who press such policies).

It will be illegal to criticize, even by implication, such things as abortion, illegal immigration, same-sex marriage, transgenderism, anthropogenic global warming, or the confiscation of firearms. These cherished beliefs will be mandated for school and college curricula, and enforced by huge fines and draconian prison sentences (sometimes in the guise of “re-education”).

Any hint of Christianity or Judaism will be barred from public discourse, and similarly punished. Islam will be held up as a model of unity and tolerance.

Reverse discrimination in favor of females, blacks, Hispanics, gender-confused persons, and other “protected” groups will be required and enforced with a vengeance. But “protections” will not apply to members of such groups who are suspected of harboring libertarian or conservative impulses.

Sexual misconduct (as defined by the “victim”) will become a crime, and any male person may be found guilty of it on the uncorroborated testimony of any female who claims to have been the victim of an unwanted glance, touch (even if accidental), innuendo (as perceived by the victim), etc.

There will be parallel treatment of the “crimes” of racism, anti-Islamism, nativism, and genderism.

All health care in the United States will be subject to review by a national, single-payer agency of the central government. Private care will be forbidden, though ready access to doctors, treatments, and medications will be provided for high officials and other favored persons. The resulting health-care catastrophe that befalls most of the populace (like that of the UK) will be shrugged off as a residual effect of “capitalist” health care.

The regulatory regime will rebound with a vengeance, contaminating every corner of American life and regimenting all businesses except those daring to operate in an underground economy. The quality and variety of products and services will decline as their real prices rise as a fraction of incomes.

The dire economic effects of single-payer health care and regulation will be compounded by massive increases in other kinds of government spending (defense excepted). The real rate of economic growth will approach zero.

The United States will maintain token armed forces, mainly for the purpose of suppressing domestic uprisings. Given its independence from foreign oil (achieved ruinously through “renewables”) and its depressed economy, it will become a simulacrum of the USSR and Mao’s China — and not a rival to the new superpowers, Russia and China, which will largely ignore it as long as it doesn’t interfere in the pillaging of their respective spheres of influence. A policy of non-interference (i.e., tacit collusion) will be the order of the era in Washington.

Though it would hardly be necessary to rig elections in favor of Democrats, given the flood of illegal immigrants who will pour into the country and enjoy voting rights, a way will be found to do just that. The most likely method will be election laws requiring candidates to pass ideological purity tests by swearing fealty to the “law of the land” (i.e., abortion, unfettered immigration, same-sex marriage, freedom of gender choice for children, etc., etc., etc.). Those who fail such a test will be barred from holding any kind of public office, no matter how insignificant.

Are my fears exaggerated? I don’t think so, given what has happened in recent decades and the cultural revolutionaries’ tightening grip on the Democrat Party. What I have sketched out can easily happen within a decade after Democrats seize total control of the central government.

Will the defenders of liberty rally to keep it from happening? Perhaps, but I fear that they will not have a lot of popular support, for three reasons:

First, there is the problem of asymmetrical ideological warfare, which favors the party that says “nice” things and promises “free” things.

Second, what has happened thus far — mainly since the 1960s — has happened slowly enough that it seems “natural” to too many Americans. They are like fish in water who cannot grasp the idea of life in a different medium.

Third, although change for the worse has accelerated in recent years, it has occurred mainly in forums that seem inconsequential to most Americans, for example, in academic fights about free speech, in the politically correct speeches of Hollywood stars, and in culture wars that are conducted mainly in the blogosphere. The unisex-bathroom issue seems to have faded as quickly as it arose, mainly because it really affects so few people. The latest gun-control mania may well subside — though it has reached new heights of hysteria — but it is only one battle in the broader war being waged by the left. And most Americans lack the political and historical knowledge to understand that there really is a civil war underway — just not a “hot” one.

Is a reversal possible? Possible, yes, but unlikely. The rot is too deeply entrenched. Public schools and universities are cesspools of anti-Americanism. The affluent elites of the information-entertainment-media-academic complex are in the saddle. Republican politicians, for the most part, are of no help because they are more interested in preserving their comfortable sinecures than in defending America or the Constitution.

On that note, I will take a break from blogging — perhaps forever. I urge you to read one of my early posts, “Reveries“, for a taste of what America means to me. As for my blogging legacy, please see “A Summing Up“, which links to dozens of posts and pages that amplify and support this post.

Il faut cultiver notre jardin.

Voltaire, Candide


Related reading:

Michael Anton, “What We Still Have to Lose“, American Greatness, February 10, 2019

Rod Dreher, “Benedict Option FAQ“, The American Conservative, October 6, 2015

Roger Kimball, “Shall We Defend Our Common History?“, Imprimis, February 2019

Joel Kotkin, “Today’s Cultural Engineers“, newgeography, January 26, 2019

Daniel Oliver, “Where Has All the Culture Gone?“, The Federalist, February 8, 2019

Malcolm Pollack, “On Civil War“, Motus Mentis, March 7, 2019

Fred Reed, “The White Man’s Burden: Reflections on the Custodial State“, Fred on Everything, January 17, 2019

Gilbert T. Sewall, “The Diminishing Authority of the Bourgeois Culture“, The American Conservative, February 4, 2019

Bob Unger, “Requiem for America“, The New American, January 24, 2019

A Summing Up

This post has been updated and moved to “Favorite Posts“.

“The Little Drummer Girl” and War

My wife and I recently watched a six-episode, made-for-TV adaptation of The Little Drummer Girl, a novel by John Le Carré that was published in 1983. The story

follows the manipulations of Martin Kurtz, an Israeli spymaster who intends to kill Khalil – a Palestinian terrorist who is bombing Jewish-related targets in Europe, particularly Germany – and Charlie, an English actress and double agent working on behalf of the Israelis….

Kurtz … recruits Charlie, a “21 or 22-year-old” radical left-wing English actress, as part of an elaborate scheme to discover the whereabouts of Khalil… Joseph is Charlie’s case officer. Khalil’s younger brother Salim is abducted, interrogated, and killed by Kurtz’s unit. Joseph impersonates Salim and travels through Europe with Charlie to make Khalil believe that Charlie and Salim are lovers. When Khalil discovers the affair and contacts Charlie, the Israelis are able to track him down.

Charlie is taken to Palestinian refugee camps to be trained as a bomber. She becomes more sympathetic to the Palestinian cause, and her divided loyalties bring her close to collapse. Charlie is sent on a mission to place a bomb at a lecture given by an Israeli moderate whose peace proposals are not to Khalil’s liking. She carries out the mission under the Israelis’ supervision. As a result, Joseph kills Khalil. Charlie subsequently has a mental breakdown caused by the strain of her mission and her own internal contradictions.

I recall that the 1984 feature-film version was widely thought to be pro-Palestinian and, therefore, anti-Israeli.

Neither my wife nor I have seen the 1984 film. She has read the novel, though she doesn’t remember much about it. I haven’t read the novel. I therefore came to the made-for-TV series with little baggage, though I feared that it might prove to be anti-Israeli propaganda. I will render a verdict later in this post, after considering some relevant evidence about the novel and feature film.

According to a piece in The New York Times, published soon after the release of the feature film, the novel and film were meant to be neutral:

The main problem in attempting to remain faithful to the book was dealing with what the filmmakers saw as its political balance – striving to be even-handed in the portrayal of Israelis and Palestinians engaged in a violent struggle for their respective causes and survival in the supercharged, highly sensitive arena of current history involving the ongoing agony of the Middle East.

”We weren’t making a political film,” said [director George Roy] Hill. ”We have no political ax to grind. We were making a suspense story that happened to have a political background. But we wanted to be true to the book, which we believe to be even-handed. The book shows the Palestinians for the first time in a human light. Up until then, they were seen as bloodthirsty monsters.”…

Like the book, the film does humanize the Palestinians and, perhaps because of the medium itself which makes them and their ultimate decimation visually and painfully real to the audience, it seems likely that the film will engender even more controversy than did the book.

Mr. Le Carre thinks controversy arose because the Palestinians never had a fair hearing in the United States. ”It is true,” he said, ”that some people think that it is heretical, anti-Semitic and probably even anti-American to suggest that there is even anything to be said for the Palestinian side.”

The novelist has continued to arouse passions by publishing some articles sympathetic to the Palestinians after the Shatila massacre in 1982. Nevertheless, he denies that this makes him anti-Israeli. ”It’s almost a vulgarity to confuse a balance of compassion with a want of sympathy for Israel,” he said. ”If I had written the book later, after the full extent of the Israeli operation was known, I would have made it angrier. But I begin and I end, believe it or not, as a tremendous supporter of a concept of Israel.”…

Indeed, the movie does not proclaim itself explicitly on one side or the other. A catalog of the ills shown suffered by each side would probably add up to a fairly even score….

But still, making the movie called for tremendous amounts of surgery and, in some cases, amputation….

The change in Charlie’s character is interesting because Mr. Le Carre had specified in his original contract that Charlie be played by an English actress. ”We were unable to find a suitable English actress,” Mr. Hill said. ”When I first spoke to Diane about the part we discussed the possibility of playing it with an English accent. But then I saw the advantage of making her American – to isolate her even more from the European community. This difference, and her more advanced age, makes the whole ending scene more moving, gives it more impact. By the end she can no longer act, she can’t pretend. She has been destroyed.”…

While the changes in Charlie’s personality added a dimension, the changes in Kurtz’s removed an aspect of his character – a moral one.

In the book, Kurtz, the master-spy, has many of the same doubts as Joseph, the agent Charlie loves. The two resolve their doubts in different ways. Kurtz pushes past them by working to stop the Palestinians even if in the process he has to act against his own conscience….

In the movie Mr. Kinski, who has previously played many fierce and even demonic characters, plays Kurtz as a hard-liner. He becomes a super-efficient agent with a touch of fanaticism, who resolutely brushes away all moral qualms. The effect is to make the Israelis seem like a ruthlessly moving machine pitted against the more vulnerable Palestinians.

Mr. Le Carre originally objected to the casting of Mr. Kinski because ”I thought he carried too much baggage with him.” He said he thinks his own Kurtz is probably ”more Israeli” and not as harsh. Mr. Hill said the casting choice was made for dramatic reasons. It would have been boring, he maintains, to have on screen two characters as similar as Joseph and Kurtz. But it’s one example of how a change made for dramatic impact can subtly change the film’s psychological effect.

It would seem that the crucial casting of Kinski as Kurtz gave the film an anti-Israeli tone — intended or not — even if the novel was meant to be neutral, as Le Carré insists. The made-for-TV series struck me as truer to the spirit of the novel, as Le Carré describes it.

The TV series can be viewed superficially, as just another story with some compelling characters, suspenseful sequences, and a conclusive climax. The series can also seem pro-Israeli or pro-Palestinian, depending on the stance you bring to your viewing.

I admit to having been staunchly pro-Israeli for a long time, but on reflection I conclude that the TV series conveys a pro-Israeli message — and more.

Charlie’s pangs of conscience after the killings of Khalil and his henchpersons are short-lived. She retreats to a seaside resort, recovers quickly, and reconciles with Joseph. I see these anti-climactic events as indicative of a pro-Israeli slant. Although the anti-climactic events might have been contrived merely to give the series a happy ending, they rather obviously (though subtly) endorse the rightness of the cause to which Charlie was recruited.

The series also conveys, even more subtly, this crucial message: One cannot win a war — or stave off defeat — by being less than ruthless. It’s probably true that most Palestinians, like most Israelis, are just “ordinary people” trying to get on with daily life. But that doesn’t negate the reality of the unrelenting Arab-Muslim effort to terrorize and kill Israelis and to undermine Israel as a sovereign state.

The need for ruthlessness is a lesson that American leaders seemed to have learned in World War II, but which their successors failed to apply in the Korean War, the Vietnam War, the 1990-91 Gulf War, and the wars in Afghanistan and Iraq.


Related posts:
The Decision to Drop the Bomb
Delusions of Preparedness
Inside-Outside
A Moralist’s Moral Blindness
A Grand Strategy for the United States
Why We Should (and Should Not) Fight
Rating America’s Wars
Transnationalism and National Defense
Patience as a Tool of Strategy
The War on Terror, As It Should Have Been Fought
Preemptive War
Some Thoughts and Questions about Preemptive War
Defense as an Investment in Liberty and Prosperity
Defense Spending: One More Time
My Defense of the A-Bomb
Pacifism
Presidents and War
LBJ’s Dereliction of Duty
The Ken Burns Apology Tour Continues
Planning for the Last War
A Rearview Look at the Invasion of Iraq and the War on Terror
Preemptive War Revisited
The Folly of Pacifism (III)

GHWB

George Herbert Walker Bush (June 12, 1924 – November 30, 2018) is the new leader in the clubhouse. That is to say, he is now the oldest member of the Dead Presidents Club, at the age of 94.47 years.

GHWB replaced Gerald Ford, who made it to 93.45. Ford replaced Ronald Reagan, who made it to 93.33.

Jimmy (now 94.17) will replace GHWB if he lives to March 25, 2019.

John Adams (90.67) and Herbert Hoover (90.19) are the other occupants of the club’s exclusive 90+ room. As many members of the club (5) lived into their 90s as lived into their 80s.

Will Jimmy make it 6 to 5, or will he become the club’s first centenarian? He already holds the record for having outlived his presidency. He’s almost at the 38-year mark, well beyond Hoover’s 31.63. The only president of the past half-century who was worse than Carter is Obama, who will probably break Carter’s miserable record for post-presidential pestilence.
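The ages cited here are simple date arithmetic: the number of days between two dates, divided by the average length of a year. Here is a minimal sketch in Python; the 365.25-day divisor is my assumption about the conversion, though it does reproduce the figures above:

    from datetime import date

    def age_in_years(born: date, on: date) -> float:
        # Elapsed days divided by the average Gregorian year length (assumed).
        return (on - born).days / 365.25

    # George H. W. Bush: June 12, 1924 to November 30, 2018
    print(round(age_in_years(date(1924, 6, 12), date(2018, 11, 30)), 2))  # 94.47

    # Jimmy Carter (born October 1, 1924) on the date given above
    print(round(age_in_years(date(1924, 10, 1), date(2019, 3, 25)), 2))   # 94.48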

For more in this vein, see the updated version of “Presidents: Key Dates and Various Trivia“.

55 Years Ago — And Today

From “Where Were You?“, which I posted seven years ago:

I have long since repented of my admiration for JFK (e.g., here). But my repentance is irrelevant to this story. The events in Dallas on November 22, 1963, burned into my brain a memory that will remain with me for the rest of my life….

I have come to see that the emotions that stirred in me 48 years ago were foolish ones. The greatest tragedy of JFK’s passing was LBJ’s succession to the presidency. LBJ’s cynical use of JFK’s memory helped him to unleash policies that have divided America and threaten to bankrupt it.

From “Who Shot JFK, and Why?“:

What about the thesis advanced by James B. Reston Jr. that Oswald’s real target was Connally? Possibly, inasmuch as Oswald wasn’t a sniper-class shooter. Here’s a scenario that’s consistent with the timing of events in Dealey Plaza: Oswald could tell that his first shot had missed his target. He got off a quick second shot, which hit JFK, who was in line with Connally, passed through JFK, and hit Connally. There was no obvious, dramatic reaction from Connally, even though he was hit. So Oswald fired a quick third shot, which hit Kennedy in the back of the head instead of hitting Connally, who by that time had slumped into his wife’s lap. (Go here for the Warren Commission’s chronology of the shots and their effects.)…

Reston could be right, but we’ll never know if he is or isn’t. The truth of the matter died with Oswald on November 24, 1963. In any event, if Reston is right, it would mean that there was no conspiracy to murder JFK.

The only conspiracy theory that might still be worth considering is the idea that Oswald was gunning for JFK because he was somehow maneuvered into doing so by LBJ, the CIA, Fidel Castro, the Mafia, or the Russians. (See, for example, Philip Shenon’s “‘Maybe We Missed Something’: Warren Commission Insider Publicly Concedes That JFK Assassination Was Likely a Conspiracy,” The Washington Post, September 22, 2014, republished in The National Post.) The murder of Oswald by Ruby conveniently plays into that theory. But I say that the burden of proof is on conspiracy theorists, for whom the obvious is not titillating enough. The obvious is Oswald — a leftist loser and less-than-honorably discharged Marine with a chip on his shoulder, a domineering mother, an unhappy home life, and a menial job.

From “1963: The Year Zero“:

If, like me, you were an adult when John F. Kennedy was assassinated, you may think of his death as a watershed moment in American history. I say this not because I’m an admirer of Kennedy the man (I am not), but because American history seemed to turn a corner when Kennedy was murdered….

This petite histoire begins with the Vietnam War and its disastrous mishandling by LBJ, its betrayal by the media, and its spawning of the politics of noise. “Protests” in public spaces and on campuses are a main feature of the politics of noise. In the new age of instant and sympathetic media attention to “protests,” civil and university authorities often refuse to enforce order. The media portray obstructive and destructive disorder as “free speech.” Thus do “protestors” learn that they can, with impunity, inconvenience and cow the masses who simply want to get on with their lives and work….

LBJ’s “Great Society” marked the resurgence of FDR’s New Deal — with a vengeance — and the beginning of a long decline of America’s economic vitality….

The Civil Rights Act of 1964 unnecessarily crushed property rights, along with freedom of association. To what end? So that a violent, dependent, Democrat-voting underclass could arise from the Great Society? So that future generations of privilege-seekers could cry “discrimination” if anyone dares to denigrate their “lifestyles”?…

The war on defense has been accompanied by a war on science. The party that proclaims itself the party of science is anything but that. It is the party of superstitious, Luddite anti-science. Witness the embrace of extreme environmentalism, the arrogance of proclamations that AGW is “settled science,” unjustified fear of genetically modified foodstuffs, the implausible doctrine that race is nothing but a social construct, and on and on.

With respect to the nation’s moral well-being, the most destructive war of all has been the culture war, which assuredly began in the 1960s. Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the air waves, the internet, theater screens, and “entertainment” venues….

Then there is the campaign to curtail freedom of speech. The purported beneficiaries of the campaign are the gender-confused and the easily offended (thus “microaggressions” and “trigger warnings”). The true beneficiaries are leftists. Free speech is all right if it’s acceptable to the left. Otherwise, it’s “hate speech,” and must be stamped out. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty….

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity….


Related reading:

Victor Davis Hanson, “Did 1968 Win the Culture War?“, American Greatness, November 22, 2018

Will Lloyd, “How the Myth of JFK Tortured the Democratic Party for 55 Years“, Spectator USA, November 22, 2018

Jamie Palmer, “My Misspent Years of Conspiracism“, Quillette, November 22, 2018

Enough, Already!

In “Trump Defies Gravity“, and other posts about electoral trends, I contrast President Trump’s approval ratings with those of his predecessor, over whom the media fawned ad nauseam. As I often note, Trump’s ratings are higher than Obama’s, despite the anti-Trump hysteria in which most of the media engage.

A new page at this blog, “Trump Coverage: A Chronology“, summarizes events related to Donald Trump’s presidency that have drawn media attention. The chronology is taken from Wikipedia’s pages about newsworthy events in the United States during 2016, 2017, and 2018.

The summary begins with the aftermath of the election of November 8, 2016. Not all of the events listed in Wikipedia’s chronologies occurred in the U.S., which leads me to wonder why the “migrant caravans” of 2018 aren’t included. They were and are clearly aimed at challenging Trump’s stance on immigration, and provoking incidents that cast Trump in a bad light.

At any rate, the tone of Wikipedia’s narratives — which I copied verbatim — reflects the one-sided, negative, and apocalyptic coverage that bombards those Americans who bother to read or view mainstream media outlets.

V-J Day Stirs Memories

V-J Day in the United States commemorates the official surrender of Japan to the Allied Forces, and the end of World War II. The surrender ceremony took place on September 2, 1945 (the date in Japan), beginning at 9:00 a.m. Tokyo time. The ceremony was held in Tokyo Bay, aboard U.S.S. Missouri, and was presided over by General Douglas MacArthur:

Though it was actually September 1 in the United States at the time of the ceremony, V-J Day is traditionally observed in the U.S. on September 2.
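The date-line point is easy to verify with a bit of time-zone arithmetic. Here is a minimal sketch in Python; the stateside offset is my assumption (Eastern War Time, UTC-4, which was in effect in 1945):

    from datetime import datetime, timedelta, timezone

    TOKYO = timezone(timedelta(hours=9))   # Japan Standard Time, UTC+9
    EWT = timezone(timedelta(hours=-4))    # U.S. Eastern War Time, UTC-4 (assumed)

    # 9:00 a.m., September 2, 1945, in Tokyo Bay
    ceremony = datetime(1945, 9, 2, 9, 0, tzinfo=TOKYO)
    print(ceremony.astimezone(EWT))  # 1945-09-01 20:00:00-04:00, still September 1 in the U.S.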

The Monday after the surrender was Labor Day in the U.S. And in those more civilized times (barbarous wars aside), school began on the day after Labor Day.

On September 4, 1945 (the day after Labor Day), I entered kindergarten at the age of 4-2/3 years. Here’s the school that I attended:

[Photo: Polk School]

In those innocent days, students got to school and back home by walking. Here’s the route that I followed as a kindergartener:

[Map: route from home to Polk School]

A 4-year-old walking several blocks between home and school, usually alone most of the way? Unheard of today, it seems. But that was a different time, in many ways.

For more, see “The Passing of Red-Brick Schoolhouses and a Way of Life“.

Suicide or Destiny?

The list of related reading at the bottom of this post is updated occasionally.

The suicide to which I refer is the so-called suicide of the West, about which Jonah Goldberg has written an eponymous book. This is from Goldberg’s essay based on the book, “Suicide of the West” (National Review, April 12, 2018):

Almost everything about modernity, progress, and enlightened society emerged in the last 300 years. If the last 200,000 years of humanity were one year, nearly all material progress came in the last 14 hours. In the West, and everywhere that followed our example, incomes rose, lifespans grew, toil lessened, energy and water became ubiquitous commodities.

Virtually every objective, empirical measure that capitalism’s critics value improved with the emergence of Western liberal-democratic capitalism. Did it happen overnight? Sadly, no. But in evolutionary terms, it did….

Of course, material prosperity isn’t everything. But the progress didn’t stop there. Rapes, deaths by violence and disease, slavery, illiteracy, torture have all declined massively, while rights for women, minorities, the disabled have expanded dramatically. And, with the exception of slavery, which is a more recent human innovation made possible by the agricultural revolution, material misery was natural and normal for us. Then suddenly, almost overnight, that changed.

What happened? We stumbled into a different world. Following sociologist Robin Fox and historian Ernest Gellner, I call this different world “the Miracle.”…

Why stress that the Miracle was both unnatural and accidental? Because Western civilization generally, and America particularly, is on a suicidal path. The threats are many, but beneath them all is one constant, eternal seducer: human nature. Modernity often assumes that we’ve conquered human nature as much as we’ve conquered the natural world. The truth is we’ve done neither….

The Founders closely studied human nature, recognizing the dangers of despots and despotic majorities alike. They knew that humans would coalesce around common interests, forming “factions.” They also understood that you can’t repeal human nature. So, unlike their French contemporaries, they didn’t try. Instead, they established our system of separated powers and enumerated rights so that no faction, including a passionate majority, could use the state’s power against other factions.

But the Founders’ vision assumed many preconditions, the two most important of which were the people’s virtue and the role of civil society. “The general government . . . can never be in danger of degenerating into a monarchy, an oligarchy, an aristocracy, or any despotic or oppressive form so long as there is any virtue in the body of the people,” George Washington argued.

People learn virtue first and most importantly from family, and then from the myriad institutions family introduces them to: churches, schools, associations, etc. Every generation, Western civilization is invaded by barbarians, Hannah Arendt observed: “We call them children.” Civil society, starting with the family, civilizes barbarians, providing meaning, belonging, and virtue.

But here’s the hitch. When that ecosystem breaks down, people still seek meaning and belonging. And it is breaking down. Its corruption comes from reasons too numerous and complex to detail here, but they include family breakdown, mass immigration, the war on assimilation, and the rise of virtual communities pretending to replace real ones.

First, the market, as Joseph Schumpeter argued, maximizes efficiency with relentless rationality, tending to break down the sinews of tradition and the foundations of civil society that enable and instill virtue. Yet those pre-rational virtues make capitalism possible in the first place.

Second, capitalism also creates a mass class of resentful intellectuals, artists, journalists, and bureaucrats who are professionally, psychologically, and ideologically committed to undermining capitalism’s legitimacy (as noted by Schumpeter and James Burnham, the author of another book titled “Suicide of the West”). This adversarial elite is its own coalition.

Thus, people increasingly look to Washington and national politics for meaning and belonging they can’t find at home. As Mary Eberstadt recently argued, the rise in identity politics coincided with family breakdown, as alienated youth looked to the artificial tribes of racial or sexual solidarity for meaning. Populism, which always wants the national government to solve local problems, is in vogue on left and right precisely because local institutions and civil society generally no longer do their jobs. Indeed, populism is its own tribalism, because “We the People” invariably means “my people.” As Jan-Werner Müller notes in his book What Is Populism?: “Populism is always a form of identity politics.”

A video at the 2012 Democratic National Convention proclaimed that “government is the only thing we all belong to.” For conservatives, this was Orwellian. But for many Americans, it was an invitation to belong. That was the subtext of “The Life of Julia” and President Obama’s call for Americans to emulate SEAL Team Six and strive in unison — towards his goals….

The American Founding’s glory is that those English colonists took their cousins’ tradition, purified it into a political ideology, and extended it farther than the English ever dreamed. And they wrote it down, thank God. The Founding didn’t apply these principles as universally as its rhetoric implied. But that rhetoric was transformative. When the Declaration of Independence was written, some dismissed the beginning as flowery boilerplate; what mattered was the ending: Independence! But the boilerplate became a creed, and America’s story is the story of that creed — those mere words — unfolding to its logical conclusion….

It seems axiomatic to me that whatever words can create, they can destroy. And ingratitude is the destroyer’s form. We teach children that the moral of the Goose that Lays the Golden Egg is the danger of greed. But the real moral of the story is ingratitude. A farmer finds an animal, which promises to make him richer than he ever imagined. But rather than nurture and protect this miracle, he resents it for not doing more. In one version, the farmer demands two golden eggs per day. When the goose politely demurs, he kills it out of a sense of entitlement — the opposite of gratitude.

The Miracle is our goose. And rather than be grateful for it, our schools, our culture, and many of our politicians say we should resent it for not doing more. Conservatism is a form of gratitude, because we conserve only what we are grateful for. Our society is talking itself out of gratitude for the Miracle and teaching our children resentment. Our culture affirms our feelings as the most authentic sources of truth when they are merely the expressions of instincts, and considers the Miracle a code word for white privilege, greed, and oppression.

This is corruption. And it is a choice. Collectively, we are embracing entitlement over gratitude. That is suicidal.

I would put it this way: About 300 years ago there arose in the West the idea of innate equality and inalienable rights. At the same time, and not coincidentally, there arose the notion of economic betterment through free markets. The two concepts — political and economic liberty — are in fact inseparable. One cannot have economic liberty without political liberty; political liberty — the ownership of oneself — implies the ownership of the fruits of one’s own labor and the right to strive for prosperity. This latter striving, as Adam Smith pointed out, works not only for the betterment of the striver but also for the betterment of those who engage in trade with him. The forces of statism are on the march (and have been for a long time). The likely result is the loss of liberty and of the vibrancy and prosperity that arise from it.

I want to be clear about liberty. It is not a spiritual state of bliss. It is, as I have written,

a modus vivendi, not the result of a rational political scheme. Though a rational political scheme, such as the one laid out in the Constitution of the United States, could promote liberty.

The key to a libertarian modus vivendi is the evolutionary development and widespread observance of social norms that foster peaceful coexistence and mutually beneficial cooperation.

Liberty, in sum, is not an easy thing to attain or preserve because it depends on social comity: mutual trust, mutual respect, and mutual forbearance. These are hard to inculcate and sustain in the relatively small groupings of civil society (family, church, club, etc.). They are almost impossible to attain or sustain in a large, diverse nation-state. Interests clash and factions clamor and claw for ascendancy over other factions. (It is called tribalism, and even anti-tribalists are tribal in their striving to impose their values on others). The Constitution, as Goldberg implies, has proved unequal to the task of preserving liberty, for reasons to which I will come.

I invoke the Constitution deliberately. This essay is about the United States, not the West in general. (Goldberg gets to the same destination after a while.) Much of the West has already committed “suicide” by replacing old-fashioned (“classical“) liberalism with oppressive statism. The U.S. is far down the same path. The issue at hand, therefore, is whether America’s “suicide” can be avoided.

Perhaps, but only if the demise of liberty is a choice. It may not be a choice, however, as Goldberg unwittingly admits when he writes about human nature.

On that point I turn to John Daniel Davidson, writing in “The West Isn’t Committing Suicide, It’s Dying of Natural Causes” (The Federalist, May 18, 2018):

Perhaps the Miracle, wondrous as it is, needs more than just our gratitude to sustain it. Perhaps the only thing that can sustain it is an older order, one that predates liberal democratic capitalism and gave it its vitality in the first place. Maybe the only way forward is to go back and rediscover the things we left behind at the dawn of the Enlightenment.

Goldberg is not very interested in all of that. He does not ask whether there might be some contradictions at the heart of the liberal order, whether it might contain within it the seeds of its undoing. Instead, Goldberg makes his stand on rather narrow grounds. He posits that the Enlightenment Miracle can be defended in purely secular, utilitarian terms, which he supposes are the only terms skeptics of liberal democratic capitalism will accept.

That forces him to treat the various illiberal ideologies that came out of Enlightenment thought (like communism) as nothing more than a kind of tribalism rather than a natural consequence of the hyper-rational scientism embedded in the liberal order itself. As Richard M. Reinsch II noted last week in an excellent review of Goldberg’s book over at Law and Liberty, “If you are going to set the Enlightenment Miracle as the standard of human excellence, one that we are losing, you must also clearly state the dialectic it introduces of an exaltation of reason, power, and science that can become something rather illiberal.”

That is to say, we mustn’t kid ourselves about the Miracle. We have to be honest, not just about its benefits but also its costs….

What about science and medical progress? What about the eradication of disease? What about technological advances? Isn’t man’s conquest of nature a good thing? Hasn’t the Enlightenment and the scientific revolution and the invention of liberal democratic capitalism done more to alleviate poverty and create wealth than anything in human history? Shouldn’t we preserve this liberal order and pass it on to future generations? Shouldn’t we inculcate in our children a profound sense of gratitude for all this abundance and prosperity?

This is precisely Goldberg’s argument. Yes, he says, man’s conquest of nature is a good thing. It’s the same species of argument raised earlier this year in reaction to Patrick Deneen’s book, “Why Liberalism Failed,” which calls into question the entire philosophical system that gave us the Miracle….

[Deneen] is not chiefly interested in the problems of the modern progressive era or the contemporary political Left. He isn’t alarmed merely by political tribalism and the fraying of the social order. Those things are symptoms, not the cause, of the illness he’s diagnosing. Even the social order at its liberal best—the Miracle itself—is part of the illness.

Deneen’s argument reaches back to the foundations of the liberal order in the sixteenth  and seventeenth centuries—prior to the appearance of the Miracle, in Goldberg’s telling—when a series of thinkers embarked on a fundamentally revisionist project “whose central aim was to disassemble what they concluded were irrational religious and social norms in the pursuit of civil peace that might in turn foster stability and prosperity, and eventually individual liberty of conscience and action.”

The project worked, as Goldberg has chronicled at length, but only up to a point. Today, says Deneen, liberalism is a 500-year-old experiment that has run its course and now “generates endemic pathologies more rapidly and pervasively than it is able to produce Band-Aids and veils to cover them.”

Taking the long view of history, Deneen’s book could be understood as an extension of Lewis’s argument in “The Abolition of Man.” The replacement of moral philosophy and religion with liberalism and applied science has begun, in our lifetimes, to manifest the dangers that Lewis warned about. Deneen, writing more than a half-century after Lewis, declares that the entire liberal project manifestly has failed.

Yes, the Miracle gave us capitalism and democracy, but it also gave us hyper-individualism, scientism, and communism. It gave us liberty and universal suffrage, but it also gave us abortion, euthanasia, and transgenderism. The abolition of man was written into the Enlightenment, in other words, and the suicide of the West that Goldberg warns us about isn’t really a suicide at all, because it isn’t really a choice: we aren’t committing suicide, we’re dying of natural causes.

Goldberg is correct that we have lost our sense of gratitude, that we don’t really feel like things are as good as all that. But a large part of the reason is that the liberal order itself has robbed us of our ability to articulate what constitutes human happiness. We have freedom, we have immense wealth, but we have nothing to tell us what we should do with it, nothing to tell us what is good.

R.R. Reno, in “The Smell of Death” (First Things, May 31, 2018), comes at it this way:

At every level, our elites oppose traditional regulation of behavior based on clear moral norms, preferring a therapeutic and bureaucratic approach. They seek to decriminalize marijuana. They have deconstructed male and female roles for children. They correct anyone who speaks of “sex,” preferring to speak of “gender,” which they insist is “socially constructed.” They have ushered in a view of free speech that makes it impossible to prevent middle school boys from watching pornography on their smart phones. They insist upon a political correctness that rejects moral correctness.

The upshot is American culture circa 2018. Our ideal is a liquid world of self-definition, characterized by plenary acceptance and mutual affirmation. In practice, the children of our elites are fortunate: Their families and schools carefully socialize them into the disciplines of twenty-first-century meritocratic success while preaching openness, inclusion, and diversity. But the rest are not so fortunate. Most Americans gasp for air as they tread water. More and more drown….

Liberalism has always been an elite project of deregulation. In the nineteenth century, it sought to deregulate pre-modern economies and old patterns of social hierarchy. It worked to the advantage of the talented, enterprising, and ambitious, who soon supplanted the hereditary aristocracy.

In the last half-century, liberalism has focused on deregulating personal life. This, too, has been an elite priority. It makes options available to those with the resources to exploit them. But it has created a world in which disordered souls kill themselves with drugs and alcohol—and in which those harboring murderous thoughts feel free to act upon them.

The penultimate word goes to Malcolm Pollack (“The Magic Feather“, Motus Mentis, July 6, 2018):

Our friend Bill Vallicella quoted this, from Michael Anton, on Independence Day:

For the founders, government has one fundamental purpose: to protect person and property from conquest, violence, theft and other dangers foreign and domestic. The secure enjoyment of life, liberty and property enables the “pursuit of happiness.” Government cannot make us happy, but it can give us the safety we need as the condition for happiness. It does so by securing our rights, which nature grants but leaves to us to enforce, through the establishment of just government, limited in its powers and focused on its core responsibility.

Bill approves, and adds:

This is an excellent statement. Good government secures our rights; it does not grant them. Whether they come from nature, or from God, or from nature qua divine creation are further questions that can be left to the philosophers. The main thing is that our rights are not up for democratic grabs, nor are they subject to the whims of any bunch of elitists that manages to insinuate itself into power.

I agree all round. I hope that my recent engagement with Mr. Anton about the ontology of our fundamental rights did not give readers the impression that I doubt for a moment the importance of Americans believing they possess them, or of the essential obligation of government to secure them (or of the people to overthrow a government that won’t).

My concerns are whether the popular basis for this critically important belief is sustainable in an era of radical and corrosive secular doubt (and continuing assault on those rights), and whether the apparently irresistible tendency of democracy to descend into faction, mobs, and tyranny was in fact a “poison pill” baked into the nation at the time of the Founding. I am inclined to think it was, but historical contingency and inevitability are nearly impossible to parse with any certainty.

Arnold Kling (“Get the Story Straight“, Library of Economics and Liberty, July 9, 2018) is more succinct:

Lest we fall back into a state of primitive tribalism, we need to understand the story of the Miracle. We need to understand that it is unnatural, and we should be grateful for the norms and institutions that restrained human nature in order to make the Miracle possible.

All of the writers I have quoted are on to something, about which I have written in “Constitution: Myths and Realities“. I call it the Framers’ fatal error.

The Framers held a misplaced faith in the Constitution’s checks and balances (see Madison’s Federalist No. 51 and Hamilton’s Federalist No. 81). The Constitution’s wonderful design — containment of a strictly limited central government through horizontal and vertical separation of powers — worked rather well until the Progressive Era. The design then cracked under the strain of greed and the will to power, as the central government began to impose national economic regulation at the behest of muckrakers and do-gooders. The design then broke during the New Deal, which opened the floodgates to violations of constitutional restraint (e.g., Medicare, Medicaid, Obamacare, the vast expansion of economic regulation, and the destruction of civilizing social norms), as the Supreme Court has enabled the national government to impose its will in matters far beyond its constitutional remit.

In sum, the “poison pill” baked into the nation at the time of the Founding is human nature, against which no libertarian constitution is proof unless it is enforced resolutely by a benign power.

Barring that, it may be too late to rescue liberty in America. I am especially pessimistic because of the unraveling of social comity since the 1960s and because of a related development: the frontal assault on freedom of speech, which is the final constitutional bulwark against oppression.

Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the air waves, the internet, theater screens, and “entertainment” venues.

The 1960s and early 1970s were a tantrum-throwing time, and many of the tantrum-throwers moved into positions of power, influence, and wealth, having learned from the success of their main ventures: the end of the draft and the removal of Nixon from office. They schooled their psychological descendants well, and sometimes literally on college campuses. Their successors on the campuses of today — students, faculty, and administrators — carry on the tradition of reacting with violent hostility toward persons and ideas that they oppose, and supporting draconian punishments for infractions of their norms and edicts. (For myriad examples, see The College Fix.)

Adherents of the ascendant culture esteem protest for its own sake, and have stock explanations for all perceived wrongs (whether or not they are wrongs): racism, sexism, homophobia, Islamophobia, hate, white privilege, inequality (of any kind), Wall Street, climate change, Zionism, and so on. All of these are to be combated by state action that deprives citizens of economic and social liberties.

In particular danger are the freedoms of speech and association. The purported beneficiaries of the campaign to destroy those freedoms are “oppressed minorities” (women, Latinos, blacks, Muslims, the gender-confused, etc.) and the easily offended. The true beneficiaries are leftists. Free speech is speech that is acceptable to the left. Otherwise, it’s “hate speech”, and must be stamped out. Freedom of association is bigotry, except when it is practiced by leftists in anti-male, anti-conservative, pro-ethnic, and pro-racial causes. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty.

The organs of the state have been enlisted in an unrelenting campaign against civilizing social norms. We now have not just easy divorce, subsidized illegitimacy, and legions of non-mothering mothers, but also abortion, concerted (and deluded) efforts to defeminize females and to neuter or feminize males, forced association (with accompanying destruction of property and employment rights), suppression of religion, absolution of pornography, and the encouragement of “alternative lifestyles” that feature disease, promiscuity, and familial instability.

The state, of course, doesn’t act of its own volition. It acts at the behest of special interests — interests with a “cultural” agenda. They are bent on the eradication of civil society — nothing less — in favor of a state-directed Rousseauvian dystopia from which Judeo-Christian morality and liberty will have vanished, except in Orwellian doublespeak.

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity. The history of the United States since the 1960s supports the proposition that the nation is indeed going to hell in a handbasket.

In fact, the speed at which it is going to hell seems to have accelerated since the Charleston church shooting and the legal validation of same-sex “marriage” in 2015. It’s a revolution (e.g., this) piggy-backing on mass hysteria. Here’s the game plan:

  • Define opposition to illegal immigration, Islamic terrorism, same-sex marriage, transgenderism, and other kinds of violent and anti-social behavior as “hate“.
  • Associate “hate” with conservatism.
  • Watch as normally conservative politicians, business people, and voters swing left rather than look “mean” and put up a principled fight for conservative values. (Many of them can’t put up such a fight, anyway. Trump’s proper but poorly delivered refusal to pin all of the blame on neo-Nazis for the Charlottesville riot just added momentum to the left’s cause because he’s Trump and a “fascist” by definition.)
  • Watch as Democrats play the “hate” card to retake the White House and Congress.

With the White House in the hands of a left-wing Democrat (is there any other kind now?) and an aggressive left-wing majority in Congress, freedom of speech, freedom of association, and property rights will become not-so-distant memories. “Affirmative action” (a.k.a. “diversity”) will be enforced on an unprecedented scale and with unprecedented ferocity. The nation will become vulnerable to foreign enemies while billions of dollars are wasted on the hoax of catastrophic anthropogenic global warming and “social services” for the indolent. The economy, already buckling under the weight of statism, will teeter on the brink of collapse as the regulatory regime goes into high gear and entrepreneurship is all but extinguished by taxation and regulation.

All of that will be secured by courts dominated by left-wing judges — from here to eternity.

And most of the affluent white enablers and dupes of the revolution will come to rue their actions. But they won’t be free to say so.

Thus will liberty — and prosperity — die in America.

And it is possible that nothing can prevent it because it is written in human nature; specifically, a penchant for the kind of mass hysteria that seems to dominate campuses, the “news” and “entertainment” media, and the Democrat Party.

Christopher Booker describes this phenomenon presciently in his book about England and America of the 1950s and 1960s, The Neophiliacs (1970):

[T]here is no dream so powerful as one generated and subscribed to by a whole mass of people simultaneously — one of those mass projections of innumerable individual neuroses which we may call a group fantasy. This is why the twentieth century has equally been dominated by every possible variety of collective make-believe — whether expressed through mass political movements and forms of nationalism, or through mass social movements….

Any group fantasy is in some sense a symptom of social disintegration, of the breaking down of the balance and harmony between individuals, classes, generations, the sexes, or even nations. For the organic relationships of a stable and secure community, in which everyone may unself-consciously exist in his own separate place and right, a group fantasy substitutes the elusive glamor of identification with a fantasy community, of being swept along as part of a uniform mass united in a common cause. But the individuals making up the mass are not, of course, united in any real sense, except through their common dress, catch phrases, slogans, and stereotyped attitudes. Behind their conformist exteriors they remain individually as insecure as ever — and indeed become even more so, for the collective dream, such as that expressed through mass advertising or the more hysterical forms of fashion, is continually aggravating their fantasy-selves and appealing to them through their insecurities to merge themselves in the mass ever more completely….

This was the phenomenon of mass psychology which was portrayed in an extreme version by George Orwell in his 1984…. But in fact the pattern described was that of every group fantasy; exactly the same that we can see, for instance, in the teenage subculture of the fifties and sixties, … or that of the left-wing progressive intellectuals, with their dream heroes such as D. H. Lawrence or Che Guevara and their ritual abuse of the “reactionaries”….

… Obviously no single development in history has done more to promote both social disintegration and unnatural conformity than the advance and ubiquity of machines and technology. Not only must the whole pressure of an industrialized, urbanized, mechanized society tend to weld its members into an ever more rootless uniform mass, by the very nature of its impersonal organization and of the processes of mass-production and standardization. But in addition the twentieth century has also provided two other factors to aggravate and to feed the general neurosis; the first being the image-conveying apparatus of films, radio, television, advertising, mass-circulation newspapers and magazines; the second the feverishly increased pace of life, from communications and transport to the bewildering speed of change and innovation, all of which has created a profound subconscious restlessness which neurotically demands to be assuaged by more speed and more change of every kind….

The essence of fantasy is that it feeds on a succession of sensations or unresolved images, each one of which arouses anticipation, followed by inevitable frustration, leading to the demand for a new image to be put in its place. But the very fact that each sensation is fundamentally unsatisfying means that the fantasy itself becomes progressively more jaded…. And so we arrive at the fantasy spiral.

Whatever pattern of fantasy we choose to look at … we shall find that it is straining through a spiral of increasingly powerful sensations toward some kind of climax…. What happens therefore is simply that, in its pursuit of the elusive image of life, freedom, and self-assertion, the fantasy pushes on in an ever-mounting spiral of demand, ever more violent, more dream-like and fragmentary, and ever more destructive of the framework of order. Further and further pushes the fantasy, always in pursuit of the elusive climax, always further from reality — until it is actually bringing about the very opposite of its aims.

That, of course, is what will happen when the left and its dupes bring down the Constitution and all that it was meant to stand for: the protection of citizens and their voluntary institutions and relationships from predators, including not least governmental predators and the factions they represent.

The Constitution, in short, was meant to shield Americans from human nature. But it seems all too likely that human nature will destroy the shield.

Thus my call for a “Preemptive (Cold) Civil War“.


Related reading:
Fred Reed, “The Symptoms Worsen”, Fred on Everything, March 15, 2015
Christopher Booker, Global Warming: A Case Study in Groupthink, Global Warming Policy Foundation, 2018
Michael Mann, “Have Wars and Violence Declined?“, Theory and Society, February 2018
John Gray, “Steven Pinker Is Wrong about Violence and War”, The Guardian, March 13, 2015
Nikita Vladimirov, “Scholar Traces Current Campus Intolerance to 60’s Radicals“, Campus Reform, March 14, 2018
Nick Spencer, “Enlightenment and Progress: Why Steven Pinker Is Wrong“, Mercatornet, March 19, 2018
Steven Hayward, “Deja Vu on Campus?“, PowerLine, April 15, 2018
William A. Nitze, “The Tech Giants Must Be Stopped“, The American Conservative, April 16, 2018
Steven Hayward, “Jonah’s Suicide Hotline, and All That Stuff“, PowerLine, May 15, 2018
Jeff Groom, “40 Years Ago Today: When Solzhenitsyn Schooled Harvard“, The American Conservative, June 8, 2018
Graham Allison, “The Myth of the Liberal Order: From Historical Accident to Conventional Wisdom“, Foreign Affairs, July/August 2018
Gilbert T. Sewall, “The America That Howard Zinn Made“, The American Conservative, July 10, 2018
Mary Eberstadt, “Two Nations, Revisited“, National Affairs, Summer 2018

Related posts and pages:
Constitution: Myths and Realities
Leftism
The Psychologist Who Played God
We, the Children of the Enlightenment
Society and the State
The Eclipse of “Old America”
Genetic Kinship and Society
The Fallacy of Human Progress
The Culture War
Ruminations on the Left in America
1963: The Year Zero
Academic Ignorance
The Euphemism Conquers All
Defending the Offensive
Superiority
Whiners
A Dose of Reality
Turning Points
God-Like Minds
Non-Judgmentalism as Leftist Condescension
An Addendum to (Asymmetrical) Ideological Warfare
Social Justice vs. Liberty
The Left and “the People”
Liberal Nostrums
Liberty and Social Norms Re-examined
Equality
Academic Freedom, Freedom of Speech, and the Demise of Civility
Leftism As Crypto-Fascism: The Google Paradigm
What’s Going On? A Stealth Revolution
Disposition and Ideology
Down the Memory Hole
“Tribalists”, “Haters”, and Psychological Projection
Mass Murder: Reaping What Was Sown
Utopianism, Leftism, and Dictatorship
The Framers, Mob Rule, and a Fatal Error
Abortion, the “Me” Generation, and the Left
Abortion Q and A
Whence Polarization?
Negative Rights, Etc.
Social Norms, the Left, and Social Disintegration
Order vs. Authority
Can Left and Right Be Reconciled?
Rage on the Left
Rights, Liberty, the Golden Rule, and Leviathan