What’s in a Name?

A lot, especially if it’s the name of a U.S. Navy ship. Take the aircraft carrier, for instance, which has been the Navy’s capital ship since World War II. The first aircraft carrier in the U.S. fleet was the USS Langley, commissioned in 1922. Including escort carriers, which were smaller even than the relatively small fleet carriers of World War II, a total of 154 carriers have been commissioned and put into service in the U.S. Navy. (During World War II, some escort carriers were transferred to the Royal Navy upon commissioning.)

As far as I am able to tell, not one of the 82 escort carriers was named for a person. Of the 72 “regular” carriers, which include 10 designated as light aircraft carriers, none was named for a person until CVB-42, the Franklin D. Roosevelt, was commissioned in 1945, several months after the death of its namesake. The next such naming came in 1947, with the commissioning of the Wright, named for Wilbur and Orville Wright, the aviation pioneers. There was a hiatus of 8 years, until the commissioning of the Forrestal in 1955, a ship named for the late James Forrestal, the first secretary of defense.

The dam burst in 1968, with the commissioning of the John F. Kennedy. That carrier and the 11 commissioned since have been named for persons, only one of whom, Fleet Admiral Chester W. Nimitz, was a renowned naval figure. In addition to Kennedy, the namesakes include former U.S. presidents (Eisenhower, T. Roosevelt, Lincoln, Washington, Truman, Reagan, Bush 41, and Ford), Carl Vinson (a long-serving chairman of the House Armed Services Committee), and John C. Stennis (a long-serving chairman of the Senate Armed Services Committee). Reagan and Bush were honored while still living (though Reagan may have been unaware of the honor because of the advanced state of his Alzheimer’s disease).

All but the Kennedy are on active service. And the Kennedy, which was decommissioned in 2007, is due to be replaced by a namesake next year. But that may be the end of it. Wisdom may yet prevail before the Navy becomes embroiled in nasty, needless controversies over the prospect of naming a carrier after Lyndon Johnson, Richard Nixon, Jimmy Carter, Bill Clinton, George Bush, Barack Obama, or Donald Trump.

The carrier after Kennedy (II) will be named Enterprise — the third carrier to be thus named. Perhaps future carriers will take the dashing names of those that I remember well from my days as a young defense analyst: Bon Homme Richard (a.k.a., Bonny Dick), Kearsarge, Oriskany, Princeton, Shangri-La, Lake Champlain, Tarawa, Midway, Coral Sea, Valley Forge, Saipan, Saratoga, Ranger, Independence, Kitty Hawk, Constellation, Enterprise (II), and America.

And while we’re at it, perhaps the likes of Admiral William McRaven (USN ret.) will do their duty, become apolitical, and shut up.

The Modern Presidency: From TR to DJT

This is a revision and expansion of a post that I published at my old blog late in 2007. The didactic style of this post reflects its original purpose, which was to give my grandchildren some insights into American history that aren’t found in standard textbooks. Readers who consider themselves already well-versed in the history of American politics should nevertheless scan this post for its occasionally provocative observations.

Theodore Roosevelt Jr. (1858-1919) was elected Vice President as a Republican in 1900, when William McKinley was elected to a second term as President. Roosevelt became President when McKinley was assassinated in September 1901. Roosevelt was re-elected President in 1904, with 56 percent of the “national” popular vote. (I mention popular-vote percentages here and throughout this post because they are a gauge of the general popularity of presidential candidates, though an inaccurate gauge if a strong third-party candidate emerges to distort the usual two-party dominance of the popular vote. There is, in fact, no such thing as a national popular vote. Rather, it is the vote in each State which determines the distribution of that State’s electoral votes between the various candidates. The electoral votes of all States are officially tallied about a month after the general election, and the president-elect is the candidate with the most electoral votes. I have more to say about electoral votes in several of the entries that follow this one.)

Theodore Roosevelt (also known as TR) served almost two full terms as President, from September 14, 1901, to March 4, 1909. (Before 1937, a President’s term of office began on March 4 of the year following his election to office.)

Roosevelt was an “activist” President. He used what he called the “bully pulpit” of the presidency to gain popular support for programs that exceeded the limits set in the Constitution. He was especially willing to use the power of government to regulate business and to break up companies that had become successful by offering products that consumers wanted. In that respect, Roosevelt was typical of politicians who inherited a lot of money and didn’t understand how successful businesses provided jobs and useful products for less-wealthy Americans.

Roosevelt was more like the Democrat Presidents of the Twentieth Century. He did not like the “weak” government envisioned by the authors of the Constitution, who designed a government that would allow people to decide how to live their own lives (as long as they didn’t hurt other people) and to run their own businesses as they wished (as long as they didn’t cheat other people). In their view, government should exist only to protect people from criminals and foreign enemies.

William Howard Taft (1857-1930), a close friend of Theodore Roosevelt, served as President from March 4, 1909, to March 4, 1913. Taft ran for the presidency as a Republican in 1908 with Roosevelt’s support. But Taft didn’t carry out Roosevelt’s anti-business agenda aggressively enough to suit Roosevelt. So, in 1912, when Taft ran for re-election as a Republican, Roosevelt ran for election as a Progressive (a newly formed political party). Many Republican voters decided to vote for Roosevelt instead of Taft. The result was that a Democrat, Woodrow Wilson, won the most electoral votes. Although Taft was defeated for re-election, he later became Chief Justice of the United States, making him the only person ever to have served as head of the executive and judicial branches of the U.S. Government.

Thomas Woodrow Wilson (1856-1924) served as President from March 4, 1913, to March 4, 1921. (Wilson didn’t use his first name, and was known officially as Woodrow Wilson.) Wilson is the only President to have earned the degree of doctor of philosophy. Wilson’s field of study was political science, and he had many ideas about how to make government “better”. But “better” government, to Wilson, was “strong” government of the kind favored by Theodore Roosevelt. In fact, it was government by executive decree rather than according to the Constitution’s rules for law-making, in which Congress plays the central role.

Wilson was re-elected in 1916 because he promised to keep the United States out of World War I, which had begun in 1914. But Wilson changed his mind in 1917 and asked Congress to declare war on Germany. After the war, Wilson tried to get the United States to join the League of Nations, an international organization that was supposed to prevent future wars by having nations assemble to discuss their differences. The U.S. Senate, which must approve the treaties by which America joins international organizations, refused its approval, and the United States never joined the League of Nations. The League did not succeed in preventing future wars because wars are started by leaders who don’t want to discuss their differences with other nations.

Warren Gamaliel Harding (1865-1923), a Republican, was elected in 1920 and inaugurated on March 4, 1921. Harding asked voters to reject the kind of government favored by Democrats, and voters gave Harding what is known as a “landslide” victory; he received 60 percent of the votes cast in the 1920 election for president, one of the highest percentages ever recorded. Harding’s administration was about to become involved in a major scandal when Harding died suddenly on August 2, 1923, while he was on a trip to the West Coast. The exact cause of Harding’s death is unknown, but he may have had a stroke when he learned of the impending scandal, which involved Albert Fall, Secretary of the Interior. Fall had secretly allowed some of his business associates to lease government land for oil-drilling, in return for personal loans.

There were a few other scandals, but Harding probably had nothing to do with any of them. Because of the scandals, most historians consider Harding to have been a poor President. But that isn’t the real reason for their dislike of him. Most historians, like most college professors, favor “strong” government. They don’t like Harding because he didn’t use the power of government to interfere in the nation’s economy. An important result of Harding’s policy (called laissez-faire, or “hands off”) was high employment and increasing prosperity during the 1920s.

John Calvin Coolidge (1872-1933), who was Harding’s Vice President, became President upon Harding’s death in 1923. (Coolidge didn’t use his first name, and was known as Calvin.) Coolidge was elected President in his own right in 1924. He served as President from August 3, 1923, to March 4, 1929. Coolidge continued Harding’s policy of not interfering in the economy, and people continued to become more prosperous as businesses grew, hired more people, and paid higher wages. Coolidge was known as “Silent Cal” because he was a man of few words. He said only what was necessary, and he meant what he said. That was in keeping with his approach to the presidency. He was not the “activist” that reporters and historians like to see in the office; he simply did the job required of him by the Constitution, which was to execute the laws of the United States. Coolidge chose not to run for re-election in 1928, even though he was quite popular.

Herbert Clark Hoover (1874-1964), a Republican who had been Secretary of Commerce under Coolidge, was elected to the presidency in 1928. He served as President from March 4, 1929, to March 4, 1933.

Hoover won 58 percent of the popular vote, an endorsement of the hands-off policy of Harding and Coolidge. Hoover’s administration is known mostly for the huge drop in the price of stocks (shares of corporations, which are bought and sold in places known as stock exchanges), and for the Great Depression that was caused partly by the “Crash” — as it became known. The rate of unemployment (the percentage of American workers without jobs) rose from 3 percent just before the Crash to 25 percent by 1933, at the depth of the Great Depression.

The Crash had two related causes: speculation and borrowed money. Stock prices began to rise sharply in the late 1920s, which caused many persons to borrow money in order to buy stocks, in the hope that prices would continue to rise. If they did, buyers could sell their stocks at a profit and repay the money they had borrowed. But when stock prices got very high in the fall of 1929, some buyers began to worry that prices would fall, so they began to sell their stocks. That drove down prices, and caused more owners to sell in the hope of getting out of the stock market before prices fell further. But prices went down so quickly that almost everyone who owned stocks lost money. By 1933, many stocks had become worthless, and most were selling for only a small fraction of their pre-Crash prices.

Because so many people had borrowed money to buy stocks, they went broke when stock prices dropped, and when they went broke they were unable to pay their other debts. That had a ripple effect throughout the economy. People who went broke spent less money, and banks had less money to lend. Because people were buying less from businesses, and because businesses couldn’t get loans to stay in business, many businesses closed and people lost their jobs. The people who lost their jobs then had less money to spend, and so still more people lost their jobs.

The effects of the Great Depression were felt in other countries because Americans couldn’t afford to buy as much as they used to from other countries. Also, Congress passed a law known as the Smoot-Hawley Tariff Act, which President Hoover signed. The Smoot-Hawley Act raised tariffs (taxes) on items imported into the United States, which meant that Americans bought even less from foreign countries. Foreign countries passed similar laws, which meant that foreigners began to buy less from Americans, which put more Americans out of work.

The economy would have recovered quickly, as it had done in the past when stock prices fell and unemployment rose. But the actions of government — raising tariffs and making loans harder to get — only made things worse. What could have been a brief recession turned into the Great Depression. People were frightened. They blamed President Hoover for their problems, although he didn’t cause the Crash. Hoover ran for re-election in 1932, but he lost to Franklin Delano Roosevelt, a Democrat.

Franklin Delano Roosevelt (1882-1945), known as FDR, served as President from March 4, 1933 until his death on April 12, 1945, just a month before V-E Day. FDR was elected to the presidency in 1932, 1936, 1940, and 1944 — the only person elected more than twice. Roosevelt was a very popular President because he served during the Depression and World War II, when most Americans — having lost faith in themselves — sought reassurance that “someone was in charge”. FDR was not universally popular; his share of the popular vote rose from 57 percent in 1932 to 61 percent in 1936, but then dropped to 55 percent in 1940 and 54 percent in 1944. Americans were coming to understand what FDR’s opponents knew at the time, and what objective historians have said since: the New Deal did more to prolong the Depression than to end it.

FDR’s program to end the Great Depression was known as the New Deal. It consisted of welfare programs, which put people to work on government projects instead of making useful things, and of higher taxes and other restrictions on business, which discouraged people from starting and investing in businesses — the real cure for unemployment.

Roosevelt did try to face up to the growing threat from Germany and Japan. However, he wasn’t able to do much to prepare America’s defenses because of strong isolationist and anti-war feelings in the country. Those feelings were the result of America’s involvement in World War I. (Similar feelings in Great Britain kept that country from preparing for war with Germany, which encouraged Hitler’s belief that he could easily conquer Europe.)

When America went to war after Japan’s attack on Pearl Harbor, Roosevelt proved to be an able and inspiring commander-in-chief. But toward the end of the war his health was failing and he was influenced by close aides who were pro-communist and sympathetic to the Soviet Union (Union of Soviet Socialist Republics, or USSR). Roosevelt allowed Soviet forces to claim Eastern Europe, including half of Germany. Roosevelt also encouraged the formation of the United Nations, where the Soviet Union (now the Russian Federation) has had a strong voice because it was made a permanent member of the Security Council, the policy-making body of the UN. As a member of the Security Council, Russia can obstruct actions proposed by the United States. (In any event, the UN has long since become a hotbed of anti-American, left-wing sentiment.)

Roosevelt’s appeasement of the USSR caused Josef Stalin (the Soviet dictator) to believe that the U.S. had weak leaders who would not challenge the USSR’s efforts to spread Communism. The result was the Cold War, which lasted for 45 years. During the Cold War the USSR developed nuclear weapons, built large military forces, kept a tight rein on countries behind the Iron Curtain (in Eastern Europe), and expanded its influence to other parts of the world.

Stalin’s belief in the weakness of U.S. leaders was largely correct, until Ronald Reagan became President. As I will discuss, Reagan’s policies led to the end of the Cold War.

Harry S Truman (1884-1972), who was Vice President in FDR’s fourth term, became President upon FDR’s death. Truman was elected in his own right in 1948, so he served as President from April 12, 1945 until January 20, 1953 — almost two full terms.

Truman made one right decision during his presidency. He approved the dropping of atomic bombs on Japan. Although hundreds of thousands of Japanese were killed by the bombs, the Japanese soon surrendered. If the Japanese hadn’t surrendered then, U.S. forces would have invaded Japan and millions of American and Japanese lives would have been lost in the battles that followed the invasion.

Truman ordered drastic reductions in the defense budget because he thought that Stalin was an ally of the United States. (Truman, like FDR, had advisers who were Communists.) Truman changed his mind about defense budgets, and about Stalin, when Communist North Korea attacked South Korea in 1950. The attack on South Korea came after Truman’s Secretary of State (the man responsible for relations with other countries) made a speech about countries that the United States would defend. South Korea was not one of those countries.

When South Korea was invaded, Truman asked General of the Army Douglas MacArthur to lead the defense of South Korea. MacArthur planned and executed the amphibious landing at Inchon, which turned the war in favor of South Korea and its allies. The allied forces then succeeded in pushing the front line far into North Korea. Communist China then entered the war on the side of North Korea. MacArthur wanted to counterattack Communist Chinese bases and supply lines in Manchuria, but Truman wouldn’t allow that. Truman then “fired” MacArthur because MacArthur spoke publicly about his disagreement with Truman’s decision. The Chinese Communists pushed allied forces back and the Korean War ended in a deadlock, just about where it had begun, near the 38th parallel.

In the meantime, Communist spies had stolen the secret plans for making atomic bombs. They were able to do that because Truman refused to hear the truth about Communist spies who were working inside the government. By the time Truman left office the Soviet Union had manufactured nuclear weapons, had strengthened its grip on Eastern Europe, and was beginning to expand its influence into the Third World (the nations of Africa and the Middle East).

Truman was very unpopular by 1952. As a result he chose not to run for re-election, even though he could have done so. (The Twenty-Second Amendment to the Constitution, which limits a President to two elected terms, and to one elected term if he has already served more than two years of a predecessor’s term, was ratified while Truman was President, but it expressly didn’t apply to him.)

Dwight David Eisenhower (1890-1969), a Republican, served as President from January 20, 1953 to January 20, 1961. Eisenhower (also known by his nickname, “Ike”) received 55 percent of the popular vote in 1952 and 57 percent in 1956; his Democrat opponent in both elections was Adlai Stevenson. The Republican Party chose Eisenhower as a candidate mainly because he had become famous as a general during World War II. Republican leaders thought that by nominating Eisenhower they could end the Democrats’ twenty-year hold on the presidency. The Republican leaders were right about that, but in choosing Eisenhower as a candidate they rejected the Republican Party’s traditional stand in favor of small government.

Eisenhower was a “moderate” Republican. He was not a “big spender” but he did not try to undo all of the new government programs that had been started by FDR and Truman. Traditional Republicans eventually fought back and, in 1964, nominated a small-government candidate named Barry Goldwater. I will discuss him when I get to President Lyndon B. Johnson.

Eisenhower was a popular President, and he was a good manager, but he gave the impression of being “laid back” and not “in charge” of things. The news media had led Americans to believe that “activist” Presidents are better than laissez-faire Presidents, and so there was by 1960 a lot of talk about “getting the country moving again” — as if it were the job of the President to “run” the country instead of executing the laws duly enacted in accordance with the Constitution.

John Fitzgerald Kennedy (1917-1963), a Democrat, was elected in 1960 to succeed President Eisenhower. Kennedy, who became known as JFK, served from January 20, 1961, until November 22, 1963, when he was assassinated in Dallas, Texas.

One reason that Kennedy won the election of 1960 (with 50 percent of the popular vote) was his image of “vigorous youth” (he was 27 years younger than Eisenhower). In fact, JFK had been in bad health for most of his life. He seemed to be healthy only because he used a lot of medications. Those medications probably impaired his judgment and would have caused him to die at a relatively early age if he hadn’t been assassinated.

Late in Eisenhower’s administration a Communist named Fidel Castro had taken over Cuba, which is only 90 miles south of Florida. The Central Intelligence Agency then began to work with anti-Communist exiles from Cuba. The exiles were going to attempt an invasion of Cuba at a place called the Bay of Pigs. In addition to providing the necessary military equipment, the U.S. was also going to provide air support during the invasion.

JFK succeeded Eisenhower before the invasion took place, in April 1961. JFK approved changes in the invasion plan that resulted in its failure. The most important change was to discontinue air support for the invading forces. The exiles were defeated, and Castro remained firmly in control of Cuba for decades.

The failed invasion caused Castro to turn to the USSR for military and economic assistance. In exchange for that assistance, Castro agreed to allow the USSR to install medium-range ballistic missiles in Cuba. That led to the so-called Cuban Missile Crisis in 1962. Many historians give Kennedy credit for resolving the crisis and avoiding a nuclear war with the USSR. The Russians withdrew their missiles from Cuba, but JFK had to agree to withdraw American missiles from bases in Turkey.

The myth that Kennedy had stood up to the Russians made him more popular in the U.S. His major accomplishment, which Democrats today like to ignore, was to initiate tax cuts, which became law after his assassination. The Kennedy tax cuts helped to make America more prosperous during the 1960s by giving people more money to spend, and by encouraging businesses to expand and create jobs.

The assassination of JFK on November 22, 1963, in Dallas was a shocking event. It also led many Americans to believe that JFK would have become a great President if he had lived and been re-elected to a second term. There is little evidence that JFK would have become a great President. His record in Cuba suggests that he would not have done a good job of defending the country.

Lyndon Baines Johnson (1908-1973), also known as LBJ, was Kennedy’s Vice President and became President upon Kennedy’s assassination. LBJ was re-elected in 1964; he served as President from November 22, 1963 to January 20, 1969. LBJ’s Republican opponent in 1964 was Barry Goldwater, who was an old-style Republican conservative, in favor of limited government and a strong defense. LBJ portrayed Goldwater as a threat to America’s prosperity and safety, when it was LBJ who was the real threat. Americans were still in shock about JFK’s assassination, and so they rallied around LBJ, who won 61 percent of the popular vote.

LBJ is known mainly for two things: his “Great Society” program and the war in Vietnam. The Great Society program was an expansion of FDR’s New Deal. It included such things as the creation of Medicare, which is medical care for retired persons that is paid for by taxes. Medicare is an example of a “welfare” program. Welfare programs take money from people who earn it and give money to people who don’t earn it. The Great Society also included many other welfare programs, such as more benefits for persons who are unemployed. The stated purpose of the expansion of welfare programs under the Great Society was to end poverty in America, but that didn’t happen. The reason it didn’t happen is that when people receive welfare they don’t work as hard to take care of themselves and their families, and they don’t save enough money for their retirement. Welfare actually makes people worse off in the long run.

America’s involvement in Vietnam began in the 1950s, when Eisenhower was President. South Vietnam was under attack by Communist guerrillas, who were sponsored by North Vietnam. Small numbers of U.S. forces were sent to South Vietnam to train and advise South Vietnamese forces. More U.S. advisers were sent by JFK, but within a few years after LBJ became President he had turned the war into an American-led defense of South Vietnam against Communist guerrillas and regular North Vietnamese forces. LBJ decided that it was important for the U.S. to defeat a Communist country and stop Communism from spreading in Southeast Asia.

However, LBJ was never willing to commit enough forces in order to win the war. He allowed air attacks on North Vietnam, for example, but he wouldn’t invade North Vietnam because he was afraid that the Chinese Communists might enter the war. In other words, like Truman in Korea, LBJ was unwilling to do what it would take to win the war decisively. Progress was slow and there were a lot of American casualties from the fighting in South Vietnam. American newspapers and TV began to focus attention on the casualties and portray the war as a losing effort. That led a lot of Americans to turn against the war, and college students began to protest the war (because they didn’t want to be drafted). Attention shifted from the war to the protests, giving the world the impression that America had lost its resolve. And it had.

LBJ had become so unpopular because of the war in Vietnam that he decided not to run for President in 1968. Most of the candidates for President campaigned by saying that they would end the war. In effect, the United States had announced to North Vietnam that it would not fight the war to win. The inevitable outcome was the withdrawal of U.S. forces from Vietnam, which finally happened in 1973, under LBJ’s successor, Richard Nixon. South Vietnam was left on its own, and it fell to North Vietnam in 1975.

Richard Milhous Nixon (1913-1994) was a Republican. He won the election of 1968 by beating the Democrat candidate, Hubert H. Humphrey (who had been LBJ’s Vice President), and a third-party candidate, George C. Wallace. Nixon and Humphrey each received 43 percent of the popular vote; Wallace received 14 percent. If Wallace had not been a candidate, most of the votes cast for him probably would have been cast for Nixon.

Even though Nixon received less than half of the popular vote, he won the election because he received a majority of electoral votes. Electoral votes are awarded to the winner of each State’s popular vote. Nixon won a lot more States than Humphrey and Wallace, so Nixon became President.

Nixon won re-election in 1972, with 61 percent of the popular vote, by beating a Democrat (George McGovern) who would have expanded LBJ’s Great Society and cut America’s armed forces even more than they were cut after the Vietnam War ended. Nixon’s victory was more a repudiation of McGovern than it was an endorsement of Nixon. His second term ended in disgrace when he resigned the presidency on August 9, 1974.

Nixon called himself a conservative, but he did nothing during his presidency to curb the power of government. He did not cut back on the Great Society. He spent a lot of time on foreign policy. But Nixon’s diplomatic efforts did nothing to make the USSR and Communist China friendlier to the United States. Nixon had shown that he was essentially a weak President by allowing U.S. forces to withdraw from Vietnam. Dictatorial rulers do not respect countries that display weakness.

Nixon was the first (and only) President who resigned from office. He resigned because the House of Representatives was ready to impeach him. An impeachment is like a criminal indictment; it is a set of charges against the holder of a public office. If Nixon had been impeached by the House of Representatives, he would have been tried by the Senate. If two-thirds of the Senators had voted to convict him he would have been removed from office. Nixon knew that he would be impeached and convicted, so he resigned.

The main charge against Nixon was that he ordered his staff to cover up his involvement in a crime that happened in 1972, when Nixon was running for re-election. The crime was a break-in at the headquarters of the Democratic National Committee, which was located in the Watergate complex in Washington, D.C. For that reason, the episode became known as the Watergate Scandal.

The purpose of the break-in was to obtain documents that might help Nixon’s re-election effort. The men who participated in the break-in were hired by aides to Nixon. Details about the break-in and Nixon’s involvement were revealed as a result of investigations by Congress, which were helped by reporters who were doing their own investigative work.

But there is good reason to believe that Nixon was unjustly forced from office by the concerted efforts of the news media (most of which had long been biased against Nixon), Democrats in Congress, and many Republicans who were anxious to rid themselves of Nixon, who was a magnet for controversy.

Gerald Rudolph Ford (born Leslie King Jr.) (1913-2006), who was Nixon’s Vice President at the time Nixon resigned, became President on August 9, 1974 and served until January 20, 1977. As Vice President, Ford had succeeded Spiro T. Agnew, who resigned on October 10, 1973, because he had taken bribes while Governor of Maryland (the job he held before becoming Vice President).

Ford became the first Vice President chosen in accordance with the Twenty-Fifth Amendment to the Constitution. That amendment spells out procedures for filling vacancies in the presidency and vice presidency. When Vice President Agnew resigned, President Nixon nominated Ford as Vice President, and the nomination was confirmed by majority votes of the House and Senate. Then, when Ford became President, he nominated Nelson Rockefeller to fill the vice presidency, and Rockefeller was likewise confirmed by the House and Senate.

Ford ran for re-election in 1976, but he was defeated by James Earl Carter, mainly because of the Watergate Scandal. Ford was not involved in the scandal, but voters often cast votes for silly reasons. Carter’s election was a rejection of Richard Nixon, who had left office two years earlier, not a vote of confidence in Carter.

James Earl (“Jimmy”) Carter Jr. (1924- ), a Democrat who had been Governor of Georgia, won the 1976 election with only 50 percent of the popular vote. He was defeated for re-election in 1980, so he served as President from January 20, 1977 to January 20, 1981.

Carter was an ineffective President who failed at the most important duty of a President, which is to protect Americans from foreign enemies. His failure came late in his term of office, during the Iran Hostage Crisis. The Shah of Iran had ruled the country for 38 years. He was overthrown in 1979 by a group of Muslim clerics (religious men) who disliked the Shah’s pro-American policies. In November 1979 a group of students loyal to the new Muslim government of Iran invaded the American embassy in Tehran (Iran’s capital city) and took 66 hostages. Carter approved rescue efforts, but they were poorly planned. The hostages were still captive at the time of the presidential election in 1980. Carter lost the election largely because of his feeble rescue efforts.

In recent years Carter has become an outspoken critic of America’s foreign policy. Carter is sympathetic to America’s enemies and he opposes strong military action in defense of America.

Ronald Wilson Reagan (1911 – 2004), a Republican, succeeded Jimmy Carter as President. Reagan won 51 percent of the popular vote in 1980. Reagan would have received more votes, but a former Republican (John Anderson) ran as a third-party candidate and took 7 percent of the popular vote. Reagan was re-elected in 1984 with 59 percent of the popular vote. He served as President from January 20, 1981, until January 20, 1989.

Reagan had two goals as President: to reduce the size of government and to increase America’s military strength. He was unable to reduce the size of government because, for most of his eight years in office, Democrats were in control of Congress. But Reagan was able to get Congress to approve large reductions in income-tax rates. Those reductions led to more spending on consumer goods and more investment in the creation of new businesses. As a result, Americans had more jobs and higher incomes.

Reagan succeeded in rebuilding America’s military strength. He knew that the only way to defeat the USSR, without going to war, was to show the USSR that the United States was stronger. A lot of people in the United States opposed spending more on military forces; they thought that it would cause the USSR to spend more. They also thought that a war between the U.S. and USSR would result. Reagan knew better. He knew that the USSR could not afford to keep up with the United States. Reagan was right. Not long after the end of his presidency the countries of Eastern Europe saw that the USSR was really a weak country, and they began to break away from the USSR. Residents of Berlin demolished the Berlin Wall, which the Soviet-backed East German government had erected in 1961 to keep East Berliners from crossing over into West Berlin. East Germany was freed from Communist rule, and it reunited with West Germany. The USSR collapsed, and many of the countries that had been part of the USSR became independent. We owe the end of the Soviet Union and its influence to President Reagan’s determination to defeat the threat that it posed.

George Herbert Walker Bush (1924 – 2019), a Republican, was Reagan’s Vice President. He won 54 percent of the popular vote when he defeated his Democrat opponent, Michael Dukakis, in the election of 1988. Bush lost the election of 1992. He served as President from January 20, 1989 to January 20, 1993.

The main event of Bush’s presidency was the Gulf War of 1990-1991. Iraq, whose ruler was Saddam Hussein, invaded the small neighboring country of Kuwait. Kuwait produces and exports a lot of oil. The occupation of Kuwait by Iraq meant that Saddam Hussein might have been able to control the amount of oil shipped to other countries, including the nations of Europe and the United States. If Hussein had been allowed to control Kuwait, he might have moved on to Saudi Arabia, which produces much more oil than Kuwait. President Bush asked Congress to approve military action against Iraq. Congress approved the action, although most Democrats voted against giving President Bush authority to defend Kuwait. The war ended in a quick defeat for Iraq’s armed forces. But President Bush decided not to allow U.S. forces to finish the job and end Saddam Hussein’s reign as ruler of Iraq.

Bush’s other major blunder was to raise taxes, which helped to cause a recession. The country was recovering from the recession in 1992, when Bush ran for re-election, but his opponents were able to convince voters that Bush hadn’t done enough to end the recession. In spite of his quick (but incomplete) victory in the Persian Gulf War, Bush lost his bid for re-election because voters were concerned about the state of the economy.

William Jefferson Clinton (born William Jefferson Blythe III) (1946 – ), a Democrat, defeated George H.W. Bush in the 1992 election by gaining a majority of the electoral vote. But Clinton won only 43 percent of the popular vote. Bush won 37 percent, and 19 percent went to H. Ross Perot. Perot, a third-party candidate, received many votes that probably would have been cast for Bush.

Clinton’s presidency got off to a bad start when he sent to Congress a proposal that would have put health care under government control. Congress rejected the plan, and a year later (in 1994) voters went to the polls in large numbers to elect Republican majorities to the House and Senate.

Clinton was able to win re-election in 1996, but he received only 49 percent of the popular vote. He was re-elected mainly because fewer Americans were out of work and incomes were rising. This economic “boom” was a continuation of the recovery that began under President Reagan. Clinton got credit for the “boom” of the 1990s, which occurred in spite of tax increases passed by Congress while it was still controlled by Democrats.

Clinton was perceived as a “moderate” Democrat because he tried to balance the government’s budget; that is, he tried not to spend more money than the government was receiving in taxes. He was eventually able to balance the budget, but only because he cut defense spending. In addition to that, Clinton made several bad decisions about defense issues. In 1993 he withdrew American troops from Somalia, instead of continuing with the military mission there after some troops were captured and killed by natives. In 1994 he signed an agreement with North Korea that was supposed to keep North Korea from developing nuclear weapons, but the North Koreans continued to work on building nuclear weapons because they had fooled Clinton. By 1998 Clinton knew that al Qaeda had become a major threat when terrorists bombed two U.S. embassies in Africa, but Clinton failed to go to war against al Qaeda. Only after terrorists struck a Navy ship, the USS Cole, in 2000 did Clinton declare terrorism to be a major threat. By then, his term of office was almost over.

Clinton was the second President to be impeached. The House of Representatives impeached him in 1998. He was charged with perjury (lying under oath) when he was the defendant (the person charged with wrongdoing) in a lawsuit. The Senate didn’t convict Clinton because every Democrat senator refused to vote for conviction, in spite of overwhelming evidence that Clinton was guilty. The day before Clinton left office he acknowledged his guilt by agreeing to a five-year suspension of his law license. A federal judge later found Clinton guilty of contempt of court for his misleading testimony and fined him $90,000.

Clinton was involved in other scandals during his presidency, but he remains popular with many people because he is good at giving the false impression that he is a nice, humble person.

Clinton’s scandals had a greater effect on his Vice President, Al Gore, who ran for President as the nominee of the Democrat Party in 2000. His main opponent was George W. Bush, a Republican. A third-party candidate named Ralph Nader also received a lot of votes. The election of 2000 was the closest presidential election since 1876. Bush and Gore each won about 48 percent of the popular vote (Gore’s percentage was slightly higher than Bush’s); Nader won 3 percent. The winner of the election was decided by the outcome of the vote in Florida. That outcome was the subject of legal proceedings for six weeks, and it had to be decided by the U.S. Supreme Court.

Initial returns in Florida gave that State’s electoral votes to Bush, which meant that he would become President. But the Supreme Court of Florida decided that election officials should violate Florida’s election laws and keep recounting the ballots in certain counties. Those counties were selected because they had more Democrats than Republicans, and so it was likely that recounts would favor Gore, the Democrat. The case finally went to the U.S. Supreme Court, which decided that the Florida Supreme Court was wrong. The U.S. Supreme Court ordered an end to the recounts, and Bush was declared the winner of Florida’s electoral votes.

George Walker Bush (1946 – ), a Republican, was the second son of a President to become President. (The first was John Quincy Adams, the sixth President, whose father, John Adams, was the second President. Also, Benjamin Harrison, the 23rd President, was the grandson of William Henry Harrison, the ninth President.) Bush won re-election in 2004, with 51 percent of the popular vote. He served as President from January 20, 2001, to January 20, 2009.

President Bush’s major accomplishment before September 11, 2001, was to get Congress to cut taxes. The tax cuts were necessary because the economy had been in a recession since 2000. The tax cuts gave people more money to spend and encouraged businesses to expand and create new jobs.

The terrorist attacks on September 11, 2001, caused President Bush to give most of his time and attention to the War on Terror. The invasion of Afghanistan, late in 2001, was part of a larger campaign to disrupt terrorist activities. Afghanistan was ruled by the Taliban, a group that gave support and shelter to al Qaeda terrorists. The U.S. quickly defeated the Taliban and destroyed al Qaeda bases in Afghanistan.

The invasion of Iraq, which took place in 2003, was also intended to combat al Qaeda, but in a different way. Iraq, under Saddam Hussein, had been an enemy of the U.S. since the Persian Gulf War of 1990-1991. Hussein was trying to acquire deadly weapons to use against the U.S. and its allies. Hussein was also giving money to terrorists and sheltering them in Iraq. The defeat of Hussein, which came quickly after the invasion of Iraq, was intended to establish a stable, friendly government in the Middle East.

The invasion of Iraq produced some of the intended results, but there was much unrest there because of long-standing animosity between Sunni Muslims and Shi’a Muslims. There was also much defeatist talk about Iraq — especially by Democrats and the media. That defeatist talk helped to encourage those who were creating unrest in Iraq. It gave them hope that the U.S. would abandon Iraq, just as it abandoned Vietnam more than 30 years earlier. The country had become almost uncontrollable until Bush authorized a military “surge” — enough additional troops to quell the unrest.

However, Bush, like his father, failed to take a strategically decisive course of action. He should have ended the pretense of “nation-building”, beefed up U.S. military presence, and installed a compliant Iraqi government. That would have created a U.S. stronghold in the Middle East and stifled Iran’s moves toward regional hegemony, just as the presence of U.S. forces in Europe for decades after World War II kept the USSR from seizing new territory and eventually wore it down.

With Iraq as a U.S. base of operations, it would have been easier to quell Afghanistan and to launch preemptive strikes on Iran’s nuclear-weapons program while it was still in its early stages.

But the early failures in Iraq — and the futility of the Afghan operation (also done on the cheap) — meant that Bush had no political backing for bolder military measures. Further, the end of his second term was blighted by a financial crisis that led to a stock-market crash, the failure of some major financial firms, the bailout of some others, and thence to the Great Recession.

The election of 2008 coincided with the economic downturn, and it was no surprise that the Democrat candidate handily beat the feckless Republican (in-name-only) candidate, John Sidney McCain III.

Barack Hussein Obama II (1961 – ) was the Democrat who defeated McCain. Obama, like most of his predecessors, was a professional politician, but most of his political experience was as a “community organizer” (i.e., rabble-rouser and shakedown artist) in Chicago. He was still serving in his first major office (as U.S. Senator from Illinois) when he vaulted ahead of Hillary Rodham Clinton and seized the Democrat nomination for the presidency. He served as President from January 20, 2009, until January 20, 2017.

Obama’s ascendancy was owed in large part to the perception of him as youthful and energetic. He was careful to seem moderate in his campaign rhetoric, though those in the know (party leaders and activists) were well aware of his strong left-wing leanings, which were revealed in his Senate votes and positions. Clinton, by contrast, was perceived as middle-of-the-road, but only because the road had shifted well to the left over the years. It was she, for example, who propounded the health-care nationalization scheme known as HillaryCare. The scheme was defeated in Congress, but it was responsible in large part for the massive swing of House seats in 1994, which returned the House to GOP control for the first time in 42 years.

Obama’s election was due also to a healthy dose of white “guilt”. Here was an opportunity for many voters to “prove” (and to brag about) their lack of racism. And so, given the experience of Iraq, the onset of the Great Recession, and a me-too Republican candidate, they did the easy thing by voting for Obama, and enjoyed the feel-good sensation that went with it.

At any rate, Obama served two terms (the second was secured by defeating Willard Mitt Romney, another feckless RINO). His presidency throughout both terms was marked by disastrous policies; for example:

  • Obamacare, which drastically raised health-care costs and insurance premiums and added millions of freeloaders to Medicaid
  • encouragement of illegal immigration, which imposes heavy burdens on middle-class taxpayers and is intended to swell the rolls of Democrat voters through amnesty schemes
  • increases in marginal tax rates for individuals and businesses
  • issuance of economically stultifying regulations at an unprecedented pace
  • nomination of dozens of left-wing judges and two left-wing Supreme Court Justices, partly to ensure “empathic” (leftist) rulings rather than rulings in accordance with the Constitution
  • sharp reductions in defense spending
  • meddling in Libya, which through Hillary Clinton’s negligence cost the lives of American diplomats
  • Clinton’s use of a private e-mail server, in which Obama was complicit, and which resulted in the compromise of sensitive, classified information
  • a drastic military draw-down in Iraq, with immediately dire consequences (and a just-in-time reversal by Obama)
  • persistent anti-white and anti-American rhetoric (the latter especially on foreign soil and at the UN)
  • persistent anti-business rhetoric that, together with tax increases and regulatory excesses, killed the recovery from the Great Recession and put the U.S. firmly on the road to economic stagnation.

It should therefore have been a simple matter for voters to reject Obama’s inevitable successor: Hillary Clinton. But the American public has been indoctrinated in leftism for decades by public schools, the mainstream media, and a plethora of TV shows and movies, with the result that Clinton acquired almost 3 million more popular votes, nationwide, than did her Republican opponent. The foresight of the Framers of the Constitution proved providential because her opponent carefully chose his battlegrounds and handily won in the electoral college. Thus …

Donald John Trump (1946 – ) succeeded Obama and was inaugurated as President on January 20, 2017. He is only in the third year of his presidency, but has accomplished much despite a “resistance” movement that began as soon as his election was assured in the early-morning hours of November 9, 2016. (The “resistance”, which I discuss here, is a continuation of political and social trends that are rooted in the 1960s.)

These are among Trump’s accomplishments, many of them the result of a successful collaboration with Congress, both houses of which Republicans controlled for the first two years of Trump’s presidency (the Senate remains under GOP control):

  • the end of Obamacare’s requirement to buy some form of health insurance or pay a “tax”, which penalized the healthy and forced many to do something that they would otherwise not have done
  • discouragement of illegal immigration through tougher enforcement (against a huge, left-wing-financed influx of illegals)
  • decreases in marginal tax rates for individuals and businesses
  • the repeal of many economically stultifying regulations and a drastic slowdown in the issuance of regulations
  • nomination of dozens of conservative judges and two conservative Supreme Court Justices
  • sharp increases in defense spending
  • the beginning of the end of foreign adventures that are unrelated to the interests of Americans (e.g., the drawdown in Syria)
  • relative stability in Iraq
  • pro-American rhetoric on foreign soil and at the UN
  • persistent pro-business rhetoric that, together with tax-rate cuts and regulatory reform, is helping to buoy the U.S. economy despite slowdowns elsewhere and Trump’s “trade war”, which is really aimed at creating a level playing field for American companies and workers.

This story will be continued.

More Presidential Trivia

The modern presidency began with the adored “activist”, Teddy Roosevelt. From TR to the present, there have been only four (of twenty) presidents who first competed in a general election as a candidate for the presidency: Taft, Hoover, Eisenhower, and Trump. Trump is alone in having had no previous governmental service before becoming president. There’s no moral to this story. Make of it what you will.

(See also “Presidents: Key Dates and Various Trivia”, to which this commentary has been added.)

That “Hurtful” Betsy Ross Flag

Fox News has the latest:

Two Democratic hopefuls have expressed their support for Nike after the sportswear company pulled sneakers featuring the Betsy Ross-designed American flag ahead of the Fourth of July holiday. The company did so after former NFL quarterback and Nike endorser Colin Kaepernick raised concerns about the shoes.

Former HUD Secretary Julián Castro told CBS News on Wednesday that he was “glad to see” Nike remove the shoes from the shelves, comparing the “painful” symbol to the Confederate flag.

“There are a lot of things in our history that are still very painful,” Castro explained. As an example, he cited “the Confederate flag that still flies in some places and is used as a symbol.”

Former Texas congressman Beto O’Rourke also approved of Nike’s decision, noting that “white nationalist groups” have “appropriated” the Betsy Ross flag.

“I think it’s really important to take into account the impression that kind of symbol would have for many of our fellow Americans,” he said, according to Jewish Insider senior political reporter Ben Jacobs.

As I understand it, the Betsy Ross flag, which became the symbol of the rebellious, united States (i.e., Colonies) in 1777, is “hurtful” because it dates from an era when slavery was legal in what became the United States. How that historical fact is “hurtful” to anyone is beyond me. The fact of slavery is reprehensible, but a flag that merely denotes America’s struggle for independence from Britain really has nothing to do with slavery, except in the slippery way that “social justice” warriors have just invented. (Clearly, they are running low on ideas.)

Well, if the Betsy Ross flag is “hurtful” to professional virtue-signalers and malcontents, it is certainly — and more legitimately — hurtful to me. I am a direct descendant of a man who, with three of his sons (from one of whom I am also directly descended), fought on the British side in the Revolutionary War. They had settled in the Colony of Pennsylvania in the 1750s and, perhaps not unwisely, chose to defend the Crown against presumptuous rebels like George Washington and like Samuel Adams, Thomas Jefferson, and the 54 other signatories of the Declaration of Independence — all of whom used to be called patriots. (Washington, Jefferson, and many other Founders owned slaves, but that wasn’t why they rebelled; slavery was then still legal throughout the British Empire.)

In any event, because my ancestors were Loyalists, they fled to Canada at the end of the war. And from then until the birth of my father in the United States more than 130 years later, the ancestors in my paternal line of descent were Canadian and therefore (nominally, at least) subjects of the British monarch.

So if anyone has a right to be offended by the Betsy Ross flag, it is I. But I am not offended by the flag, though I am deeply offended by the useless twits who profess to be offended by it.

The Fall of America

Victor Davis Hanson, like many others before him (and like me), sees the unraveling of America portended by Petronius’s The Satyricon (ca. 60 AD):

Certain themes … are timeless and still resonate today.

The abrupt transition from a society of rural homesteaders into metropolitan coastal hubs had created two Romes. One world was a sophisticated and cosmopolitan network of traders, schemers, investors, academics, and deep-state imperial cronies. Their seaside corridors were not so much Roman as Mediterranean. And they saw themselves more as “citizens of the world” than as mere Roman citizens.

In the novel, vast, unprecedented wealth had produced license. On-the-make urbanites suck up and flatter the childless rich in hopes of being given estates rather than earning their own money….

[The] novel’s accepted norms are pornography, gratuitous violence, sexual promiscuity, transgenderism, delayed marriage, childlessness, fear of aging, homelessness, social climbing, ostentatious materialism, prolonged adolescence, and scamming and conning in lieu of working.

The characters are fixated on expensive fashion, exotic foods, and pretentious name-dropping. They are the lucky inheritors of a dynamic Roman infrastructure that had globalized three continents. Rome had incorporated the shores of the Mediterranean under uniform law, science, institutions—all kept in check by Roman bureaucracy and the overwhelming power of the legions, many of them populated by non-Romans.

Never in the history of civilization had a generation become so wealthy and leisured, so eager to gratify every conceivable appetite—and yet so bored and unhappy.

But there was also a second Rome in the shadows. Occasionally the hipster antiheroes of the novel bump into old-fashioned rustics, shopkeepers, and legionaries. They are what we might now call the ridiculed “deplorables” and “clingers.”…

Globalization had enriched and united non-Romans into a world culture. That was an admirable feat. But such homogenization also attenuated the very customs, traditions, and values that had led to such astounding Roman success in the first place….

But the new empire also diluted a noble and unique Roman agrarianism. It eroded nationalism and patriotism. The empire’s wealth, size, and lack of cohesion ultimately diminished Roman unity, as well as traditional marriage, child-bearing, and autonomy….

[W]ide reading ensures erudition and sophistication, and helps science supplant superstition. But sometimes education is also ambiguous. Students become idle, pretentious loafers. Professors are no different from loud pedants. Writers are trite and boring. Elite pundits sound like gasbags.

Petronius seems to imply that whatever the Rome of his time was, it was likely not sustainable—but would at least be quite exciting in its splendid decline.

Petronius also argues that with too much rapid material progress comes moral regress. His final warning might be especially troubling for the current generation of Western Europeans and Americans. Even as we brag of globalizing the world and enriching the West materially and culturally, we are losing our soul in the process.

Getting married, raising families, staying in one place, still working with our hands, and postponing gratification may be seen as boring and out of date. But nearly 2,000 years later, all of that is what still keeps civilization alive.

Hanson omits — because Petronius’s prescience was limited — the end game, in which the glory that was Rome was extinguished by internal rot, military failure, and invasion. The first of those — internal rot — is well underway in the United States, “thanks” to the Democrat Party. The second — military failure — has become more or less a habit since the Korean War — a habit that will resume with the eventual return to power of the Democrat Party. The third — invasion — probably will be accomplished in bloodless form by the determination of China’s leadership, when a Democrat administration (having disarmed the country) accedes to military and economic coercion.

And, ironically (but blessedly) that will put paid to the kinds of excesses that Democrats have fostered in their zeal for (evanescent) power: pornography, gratuitous violence, sexual promiscuity, transgenderism, delayed marriage, childlessness, fear of aging, homelessness, social climbing, ostentatious materialism, prolonged adolescence, and scamming and conning in lieu of working.

America’s virtual state of servitude will also put paid to the last vestiges of liberty in the land, though they would have eventually disappeared under Democrat rule.

End of a Generation

The so-called greatest generation has died out in my family, as it soon will die out across the land. The recent death of my mother-in-law at age 98 removed from the scene the last of my wife’s and my parents and their siblings: 26 of them in all.

Their birth years ranged from 1903 to 1922. There were, oddly, 18 males as against only 8 females, and the disparity held for all four sets of siblings:

  • 7 to 3 for my mother’s set
  • 2 to 1 for my father’s set
  • 5 to 3 for my wife’s mother’s set
  • 4 to 1 for my wife’s father’s set.

Only one of the 26 died before reaching adulthood (my father’s younger brother at 18 months). Two others (also males) died relatively young. One of my mother’s brothers died just a few weeks before his 40th birthday as a result of a jeep accident (he was on active duty in the Coast Guard). One of my wife’s mother’s brothers died at age 48 as a long-delayed result of a blow to the head by a police truncheon.

The other 15 males lived to ages ranging from 65 to 96, with an average age at death of 77 years. The 8 females lived to ages ranging from 69 to 99, with an average age at death of 87 years. The longest-lived of the males was the only one to pass the 90 mark. Four of the females lived into their 90s, dying at ages 91, 96, 98, and 99.

All of the 25 who reached adulthood also married. Only two of them had a marriage end in divorce. All of them were raised in near-poverty or in somewhat comfortable circumstances that vanished with the onset of the Great Depression. All of them worked hard, whether in the home or outside of it; none of them went on welfare; most of the men and two of the women served in uniform during World War II.

Thus passeth a generation sui generis.

Where are Elmer, Herman, Bert, Tom and Charley,
The weak of will, the strong of arm, the clown, the boozer, the fighter?
All, all, are sleeping on the hill….

Where are Ella, Kate, Mag, Lizzie and Edith,
The tender heart, the simple soul, the loud, the proud, the happy one?
All, all, are sleeping on the hill.

Edgar Lee Masters, Spoon River Anthology (“The Hill”)

Peak Civilization

The fate of most human endeavors is that they reach a peak of attainment, which is then followed by a decline due to excess on the one hand and neglect on the other hand. “Classical” music is a favorite example of mine. The form peaked around the turn of the 20th century, then went over the top into — variously — cacophony, atonality, and arrhythmic confusion. The best of contemporary “classical” music is merely derivative of the form as it was at its peak.

So it is with myriad endeavors, the most important of which is the endeavor of rational inquiry. In the West, rational inquiry seems to have peaked in the early 1960s. I needn’t remind you of the subsequent descent: mobs, riots, the din of “entertainment”, quasi-religious movements from hippiedom to “climate change”, and on and on into the night.

It all makes me glad that I came of age in the 1950s, when civilized discourse was still possible and scientists were dedicated to the pursuit of truth, not the projection of their hopes, fears, and feelings.

Not with a Bang

This is the way the world ends
This is the way the world ends
This is the way the world ends
Not with a bang but a whimper.

T. S. Eliot, “The Hollow Men”

It’s also the way that America is ending. Yes, there are verbal fireworks aplenty, but there will not be a “hot” civil war. The country that my parents and grandparents knew and loved — the country of my youth in the 1940s and 1950s — is just fading away.

This would not necessarily be a bad thing if the remaking of America were a gradual, voluntary process, leading to time-tested changes for the better. But that isn’t the case. The very soul of America has been and is being ripped out by the government that was meant to protect that soul, and by movements that government not only tolerates but fosters.

Before I go further, I should explain what I mean by America, which is not the same thing as the geopolitical entity known as the United States, though the two were tightly linked for a long time.

America was a relatively homogeneous cultural order that fostered mutual respect, mutual trust, and mutual forbearance — or far more of those things than one might expect in a nation as populous and far-flung as the United States. Those things — conjoined with a Constitution that has been under assault since the New Deal — made America a land of liberty. That is to say, they fostered real liberty, which isn’t an unattainable state of bliss but an actual (and imperfect) condition of peaceful, willing coexistence and its concomitant: beneficially cooperative behavior.

The attainment of this condition depends on social comity, which depends in turn on (a) genetic kinship and (b) the inculcation and enforcement of social norms, especially the norms that define harm.

All of that is going by the boards because the emerging cultural order is almost diametrically opposite that which prevailed in America. The new dispensation includes:

  • casual sex
  • serial cohabitation
  • subsidized illegitimacy
  • abortion on demand
  • easy divorce
  • legions of non-mothering mothers
  • concerted (and deluded) efforts to defeminize females and to neuter or feminize males
  • gender-confusion as a burgeoning norm
  • “alternative lifestyles” that foster disease, promiscuity, and familial instability
  • normalization of drug abuse
  • forced association (with accompanying destruction of property and employment rights)
  • suppression of religion
  • rampant obscenity
  • identity politics on steroids
  • illegal immigration as a “right”
  • “free stuff” from government (Social Security was meant to be self-supporting)
  • America as the enemy
  • all of this (and more) as gospel to influential elites whose own lives are modeled mostly on old America.

As the culture has rotted, so have the ties that bound America.

The rot has occurred to the accompaniment of cacophony. Cultural coarsening begets loud and inconsiderate vulgarity. Worse than that is the cluttering of the ether with vehement and belligerent propaganda, most of it aimed at taking down America.

The advocates of the new dispensation haven’t quite finished the job of dismantling America. But that day isn’t far off. Complete victory for the enemies of America is only a few election cycles away. The squishy center of the electorate — as is its wont — will swing back toward the Democrat Party. With a Democrat in the White House, a Democrat-controlled Congress, and a few party switches in the Supreme Court (or the packing of it), the dogmas of the anti-American culture will become the law of the land; for example:

Billions and trillions of dollars will be wasted on various “green” projects, including but far from limited to the complete replacement of fossil fuels by “renewables”, with the resulting impoverishment of most Americans, except for the comfortable elites who press such policies.

It will be illegal to criticize, even by implication, such things as abortion, illegal immigration, same-sex marriage, transgenderism, anthropogenic global warming, or the confiscation of firearms. These cherished beliefs will be mandated for school and college curricula, and enforced by huge fines and draconian prison sentences (sometimes in the guise of “re-education”).

Any hint of Christianity or Judaism will be barred from public discourse, and similarly punished. Islam will be held up as a model of unity and tolerance.

Reverse discrimination in favor of females, blacks, Hispanics, gender-confused persons, and other “protected” groups will be required and enforced with a vengeance. But “protections” will not apply to members of such groups who are suspected of harboring libertarian or conservative impulses.

Sexual misconduct (as defined by the “victim”) will become a crime, and any male person may be found guilty of it on the uncorroborated testimony of any female who claims to have been the victim of an unwanted glance, touch (even if accidental), innuendo (as perceived by the victim), etc.

There will be parallel treatment of the “crimes” of racism, anti-Islamism, nativism, and genderism.

All health care in the United States will be subject to review by a national, single-payer agency of the central government. Private care will be forbidden, though ready access to doctors, treatments, and medications will be provided for high officials and other favored persons. The resulting health-care catastrophe that befalls most of the populace (like that of the UK) will be shrugged off as a residual effect of “capitalist” health care.

The regulatory regime will rebound with a vengeance, contaminating every corner of American life and regimenting all businesses except those daring to operate in an underground economy. The quality and variety of products and services will decline as their real prices rise as a fraction of incomes.

The dire economic effects of single-payer health care and regulation will be compounded by massive increases in other kinds of government spending (defense excepted). The real rate of economic growth will approach zero.

The United States will maintain token armed forces, mainly for the purpose of suppressing domestic uprisings. Given its economically destructive independence from foreign oil and its depressed economy, it will become a simulacrum of the USSR and Mao’s China — and not a rival to the new superpowers, Russia and China, which will largely ignore it as long as it doesn’t interfere in their pillaging of their respective spheres of influence. A policy of non-interference (i.e., tacit collusion) will be the order of the era in Washington.

Though it would hardly be necessary to rig elections in favor of Democrats, given the flood of illegal immigrants who will pour into the country and enjoy voting rights, a way will be found to do just that. The most likely method will be election laws requiring candidates to pass ideological purity tests by swearing fealty to the “law of the land” (i.e., abortion, unfettered immigration, same-sex marriage, freedom of gender choice for children, etc., etc., etc.). Those who fail such a test will be barred from holding any kind of public office, no matter how insignificant.

Are my fears exaggerated? I don’t think so, given what has happened in recent decades and the cultural revolutionaries’ tightening grip on the Democrat party. What I have sketched out can easily happen within a decade after Democrats seize total control of the central government.

Will the defenders of liberty rally to keep it from happening? Perhaps, but I fear that they will not have a lot of popular support, for three reasons:

First, there is the problem of asymmetrical ideological warfare, which favors the party that says “nice” things and promises “free” things.

Second, what has happened thus far — mainly since the 1960s — has happened slowly enough that it seems “natural” to too many Americans. They are like fish in water who cannot grasp the idea of life in a different medium.

Third, although change for the worse has accelerated in recent years, it has occurred mainly in forums that seem inconsequential to most Americans, for example, in academic fights about free speech, in the politically correct speeches of Hollywood stars, and in culture wars that are conducted mainly in the blogosphere. The unisex-bathroom issue seems to have faded as quickly as it arose, mainly because it really affects so few people. The latest gun-control mania may well subside — though it has reached new heights of hysteria — but it is only one battle in the broader war being waged by the left. And most Americans lack the political and historical knowledge to understand that there really is a civil war underway — just not a “hot” one.

Is a reversal possible? Possible, yes, but unlikely. The rot is too deeply entrenched. Public schools and universities are cesspools of anti-Americanism. The affluent elites of the information-entertainment-media-academic complex are in the saddle. Republican politicians, for the most part, are of no help because they are more interested in preserving their comfortable sinecures than in defending America or the Constitution.

On that note, I will take a break from blogging — perhaps forever. I urge you to read one of my early posts, “Reveries”, for a taste of what America means to me. As for my blogging legacy, please see “A Summing Up”, which links to dozens of posts and pages that amplify and support this post.

Il faut cultiver notre jardin.

Voltaire, Candide


Related reading:

Michael Anton, “What We Still Have to Lose”, American Greatness, February 10, 2019

Rod Dreher, “Benedict Option FAQ”, The American Conservative, October 6, 2015

Roger Kimball, “Shall We Defend Our Common History?”, Imprimis, February 2019

Joel Kotkin, “Today’s Cultural Engineers”, newgeography, January 26, 2019

Daniel Oliver, “Where Has All the Culture Gone?”, The Federalist, February 8, 2019

Malcolm Pollack, “On Civil War”, Motus Mentis, March 7, 2019

Fred Reed, “The White Man’s Burden: Reflections on the Custodial State”, Fred on Everything, January 17, 2019

Gilbert T. Sewall, “The Diminishing Authority of the Bourgeois Culture”, The American Conservative, February 4, 2019

Bob Unger, “Requiem for America”, The New American, January 24, 2019

A Summing Up

This post has been updated and moved to “Favorite Posts”.

“The Little Drummer Girl” and War

My wife and I recently watched a six-episode, made-for-TV adaptation of The Little Drummer Girl, a novel by John Le Carré that was published in 1983. The story

follows the manipulations of Martin Kurtz, an Israeli spymaster who intends to kill Khalil – a Palestinian terrorist who is bombing Jewish-related targets in Europe, particularly Germany – and Charlie, an English actress and double agent working on behalf of the Israelis….

Kurtz … recruits Charlie, a “21 or 22-year-old” radical left-wing English actress, as part of an elaborate scheme to discover the whereabouts of Khalil… Joseph is Charlie’s case officer. Khalil’s younger brother Salim is abducted, interrogated, and killed by Kurtz’s unit. Joseph impersonates Salim and travels through Europe with Charlie to make Khalil believe that Charlie and Salim are lovers. When Khalil discovers the affair and contacts Charlie, the Israelis are able to track him down.

Charlie is taken to Palestinian refugee camps to be trained as a bomber. She becomes more sympathetic to the Palestinian cause, and her divided loyalties bring her close to collapse. Charlie is sent on a mission to place a bomb at a lecture given by an Israeli moderate whose peace proposals are not to Khalil’s liking. She carries out the mission under the Israelis’ supervision. As a result, Joseph kills Khalil. Charlie subsequently has a mental breakdown caused by the strain of her mission and her own internal contradictions.

I recall that the 1984 feature-film version was widely thought to be pro-Palestinian and, therefore, anti-Israeli.

Neither my wife nor I have seen the 1984 film. She has read the novel, though she doesn’t remember much about it. I haven’t read the novel. I therefore came to the made-for-TV series with little baggage, though I feared that it might prove to be anti-Israeli propaganda. I will render a verdict later in this post, after considering some relevant evidence about the novel and feature film.

According to a piece in The New York Times, published soon after the release of the feature film, the novel and film were meant to be neutral:

The main problem in attempting to remain faithful to the book was dealing with what the filmmakers saw as its political balance – striving to be even-handed in the portrayal of Israelis and Palestinians engaged in a violent struggle for their respective causes and survival in the supercharged, highly sensitive arena of current history involving the ongoing agony of the Middle East.

“We weren’t making a political film,” said [director George Roy] Hill. “We have no political ax to grind. We were making a suspense story that happened to have a political background. But we wanted to be true to the book, which we believe to be even-handed. The book shows the Palestinians for the first time in a human light. Up until then, they were seen as bloodthirsty monsters.”…

Like the book, the film does humanize the Palestinians and, perhaps because of the medium itself which makes them and their ultimate decimation visually and painfully real to the audience, it seems likely that the film will engender even more controversy than did the book.

Mr. Le Carre thinks controversy arose because the Palestinians never had a fair hearing in the United States. “It is true,” he said, “that some people think that it is heretical, anti-Semitic and probably even anti-American to suggest that there is even anything to be said for the Palestinian side.”

The novelist has continued to arouse passions by publishing some articles sympathetic to the Palestinians after the Shatila massacre in 1982. Nevertheless, he denies that this makes him anti-Israeli. “It’s almost a vulgarity to confuse a balance of compassion with a want of sympathy for Israel,” he said. “If I had written the book later, after the full extent of the Israeli operation was known, I would have made it angrier. But I begin and I end, believe it or not, as a tremendous supporter of a concept of Israel.”…

Indeed, the movie does not proclaim itself explicitly on one side or the other. A catalog of the ills shown suffered by each side would probably add up to a fairly even score….

But still, making the movie called for tremendous amounts of surgery and, in some cases, amputation….

The change in Charlie’s character is interesting because Mr. Le Carre had specified in his original contract that Charlie be played by an English actress. “We were unable to find a suitable English actress,” Mr. Hill said. “When I first spoke to Diane about the part we discussed the possibility of playing it with an English accent. But then I saw the advantage of making her American – to isolate her even more from the European community. This difference, and her more advanced age, makes the whole ending scene more moving, gives it more impact. By the end she can no longer act, she can’t pretend. She has been destroyed.”…

While the changes in Charlie’s personality added a dimension, the changes in Kurtz’s removed an aspect of his character – a moral one.

In the book, Kurtz, the master-spy, has many of the same doubts as Joseph, the agent Charlie loves. The two resolve their doubts in different ways. Kurtz pushes past them by working to stop the Palestinians even if in the process he has to act against his own conscience….

In the movie Mr. Kinski, who has previously played many fierce and even demonic characters, plays Kurtz as a hard-liner. He becomes a super-efficient agent with a touch of fanaticism, who resolutely brushes away all moral qualms. The effect is to make the Israelis seem like a ruthlessly moving machine pitted against the more vulnerable Palestinians.

Mr. Le Carre originally objected to the casting of Mr. Kinski because “I thought he carried too much baggage with him.” He said he thinks his own Kurtz is probably “more Israeli” and not as harsh. Mr. Hill said the casting choice was made for dramatic reasons. It would have been boring, he maintains, to have on screen two characters as similar as Joseph and Kurtz. But it’s one example of how a change made for dramatic impact can subtly change the film’s psychological effect.

It would seem that the crucial casting of Kinski as Kurtz gave the film an anti-Israeli tone — intended or not — even if the novel was meant to be neutral, as Le Carré insists. The made-for-TV series struck me as truer to the spirit of the novel, as Le Carré describes it.

The TV series can be viewed superficially, as just another story with some compelling characters, suspenseful sequences, and a conclusive climax. The series can also seem pro-Israeli or pro-Palestinian, depending on the stance you bring to your viewing.

I admit to having been staunchly pro-Israeli for a long time, but on reflection I conclude that the TV series conveys a pro-Israeli message — and more.

Charlie’s pangs of conscience after the killings of Khalil and his henchpersons are short-lived. She retreats to a seaside resort, recovers quickly, and reconciles with Joseph. I see these anti-climactic events as indicative of a pro-Israeli slant. Although the anti-climactic events might have been contrived merely to give the series a happy ending, they rather obviously (though subtly) endorse the rightness of the cause to which Charlie was recruited.

The series also conveys, even more subtly, this crucial message: One cannot win a war — or stave off defeat — by being less than ruthless. It’s probably true that most Palestinians, like most Israelis, are just “ordinary people” trying to get on with daily life. But that doesn’t negate the reality of the unrelenting Arab-Muslim effort to terrorize and kill Israelis and to undermine Israel as a sovereign state.

The need for ruthlessness is a lesson that American leaders seemed to have learned in World War II, but which their successors failed to apply in the Korean War, the Vietnam War, the 1990-91 Gulf War, and the wars in Afghanistan and Iraq.


Related posts:
The Decision to Drop the Bomb
Delusions of Preparedness
Inside-Outside
A Moralist’s Moral Blindness
A Grand Strategy for the United States
Why We Should (and Should Not) Fight
Rating America’s Wars
Transnationalism and National Defense
Patience as a Tool of Strategy
The War on Terror, As It Should Have Been Fought
Preemptive War
Some Thoughts and Questions about Preemptive War
Defense as an Investment in Liberty and Prosperity
Defense Spending: One More Time
My Defense of the A-Bomb
Pacifism
Presidents and War
LBJ’s Dereliction of Duty
The Ken Burns Apology Tour Continues
Planning for the Last War
A Rearview Look at the Invasion of Iraq and the War on Terror
Preemptive War Revisited
The Folly of Pacifism (III)

GHWB

George Herbert Walker Bush (June 12, 1924 – November 30, 2018) is the new leader in the clubhouse. That is to say, he is now the oldest member of the Dead Presidents Club, at the age of 94.47 years.

GHWB replaced Gerald Ford, who made it to 93.45. Ford replaced Ronald Reagan, who made it to 93.33.

Jimmy (now 94.17) will replace GHWB if he lives to March 25, 2019.

John Adams (90.67) and Herbert Hoover (90.19) are the other occupants of the club’s exclusive 90+ room. As many members of the club (5) lived into their 90s as lived into their 80s.

Will Jimmy make it 6 to 5, or will he become the club’s first centenarian? He already holds the record for having outlived his presidency. He’s almost at the 38-year mark, well beyond Hoover’s 31.63. The only president of the past half-century who was worse than Carter is Obama, who will probably break Carter’s miserable record for post-presidential pestilence.
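The fractional ages above (94.47, 93.45, 93.33, and so on) are just whole years lived plus the elapsed fraction of the final year, from the last birthday to the date of death. A minimal Python sketch of that arithmetic (the function name is mine, not from any source; it ignores the Feb. 29 edge case, which doesn’t arise for these presidents):

```python
from datetime import date

def age_in_years(born: date, died: date) -> float:
    """Whole years lived, plus the elapsed fraction of the final year."""
    years = died.year - born.year
    if (died.month, died.day) < (born.month, born.day):
        years -= 1  # last birthday fell in the previous calendar year
    last_birthday = date(born.year + years, born.month, born.day)
    next_birthday = date(born.year + years + 1, born.month, born.day)
    fraction = (died - last_birthday).days / (next_birthday - last_birthday).days
    return years + fraction

# George H. W. Bush: June 12, 1924 - November 30, 2018
print(round(age_in_years(date(1924, 6, 12), date(2018, 11, 30)), 2))  # 94.47
```

The same computation reproduces Ford’s 93.45 and Reagan’s 93.33.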

For more in this vein, see the updated version of “Presidents: Key Dates and Various Trivia”.

55 Years Ago — And Today

From “Where Were You?”, which I posted seven years ago:

I have long since repented of my admiration for JFK (e.g., here). But my repentance is irrelevant to this story. The events in Dallas on November 22, 1963, burned into my brain a memory that will remain with me for the rest of my life….

I have come to see that the emotions that stirred in me 48 years ago were foolish ones. The greatest tragedy of JFK’s passing was LBJ’s succession to the presidency. LBJ’s cynical use of JFK’s memory helped him to unleash policies that have divided America and threaten to bankrupt it.

From “Who Shot JFK, and Why?”:

What about the thesis advanced by James B. Reston Jr. that Oswald’s real target was Connally? Possibly, inasmuch as Oswald wasn’t a sniper-class shooter. Here’s a scenario that’s consistent with the timing of events in Dealey Plaza: Oswald could tell that his first shot had missed his target. He got off a quick second shot, which hit JFK, who was in line with Connally; the bullet passed through JFK and hit Connally. There was no obvious, dramatic reaction from Connally, even though he was hit. So Oswald fired a quick third shot, which hit Kennedy in the back of the head instead of hitting Connally, who by that time had slumped into his wife’s lap. (Go here for the Warren Commission’s chronology of the shots and their effects.)…

Reston could be right, but we’ll never know if he is or isn’t. The truth of the matter died with Oswald on November 24, 1963. In any event, if Reston is right, it would mean that there was no conspiracy to murder JFK.

The only conspiracy theory that might still be worth considering is the idea that Oswald was gunning for JFK because he was somehow maneuvered into doing so by LBJ, the CIA, Fidel Castro, the Mafia, or the Russians. (See, for example, Philip Shenon’s “‘Maybe We Missed Something’: Warren Commission Insider Publicly Concedes That JFK Assassination Was Likely a Conspiracy,” The Washington Post, September 22, 2014, republished in The National Post.) The murder of Oswald by Ruby conveniently plays into that theory. But I say that the burden of proof is on conspiracy theorists, for whom the obvious is not titillating enough. The obvious is Oswald — a leftist loser and less-than-honorably discharged Marine with a chip on his shoulder, a domineering mother, an unhappy home life, and a menial job.

From “1963: The Year Zero”:

If, like me, you were an adult when John F. Kennedy was assassinated, you may think of his death as a watershed moment in American history. I say this not because I’m an admirer of Kennedy the man (I am not), but because American history seemed to turn a corner when Kennedy was murdered….

This petite histoire begins with the Vietnam War and its disastrous mishandling by LBJ, its betrayal by the media, and its spawning of the politics of noise. “Protests” in public spaces and on campuses are a main feature of the politics of noise. In the new age of instant and sympathetic media attention to “protests,” civil and university authorities often refuse to enforce order. The media portray obstructive and destructive disorder as “free speech.” Thus do “protestors” learn that they can, with impunity, inconvenience and cow the masses who simply want to get on with their lives and work….

LBJ’s “Great Society” marked the resurgence of FDR’s New Deal — with a vengeance — and the beginning of a long decline of America’s economic vitality….

The Civil Rights Act of 1964 unnecessarily crushed property rights, along with freedom of association. To what end? So that a violent, dependent, Democrat-voting underclass could arise from the Great Society? So that future generations of privilege-seekers could cry “discrimination” if anyone dares to denigrate their “lifestyles”?…

The war on defense has been accompanied by a war on science. The party that proclaims itself the party of science is anything but that. It is the party of superstitious, Luddite anti-science. Witness the embrace of extreme environmentalism, the arrogance of proclamations that AGW is “settled science,” unjustified fear of genetically modified foodstuffs, the implausible doctrine that race is nothing but a social construct, and on and on.

With respect to the nation’s moral well-being, the most destructive war of all has been the culture war, which assuredly began in the 1960s. Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-speech/filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the air waves, the internet, theater screens, and “entertainment” venues….

Then there is the campaign to curtail freedom of speech. The purported beneficiaries of the campaign are the gender-confused and the easily offended (thus “microaggressions” and “trigger warnings”). The true beneficiaries are leftists. Free speech is all right if it’s acceptable to the left. Otherwise, it’s “hate speech,” and must be stamped out. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty….

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity….


Related reading:

Victor Davis Hanson, “Did 1968 Win the Culture War?”, American Greatness, November 22, 2018

Will Lloyd, “How the Myth of JFK Tortured the Democratic Party for 55 Years”, Spectator USA, November 22, 2018

Jamie Palmer, “My Misspent Years of Conspiricism”, Quillette, November 22, 2018

Enough, Already!

In “Trump Defies Gravity” and other posts about electoral trends, I contrast President Trump’s approval ratings with those of his predecessor, over whom the media fawned ad nauseam. As I often note, Trump’s ratings are higher than Obama’s, despite the anti-Trump hysteria in which most of the media engage.

A new page at this blog, “Trump Coverage: A Chronology”, summarizes events related to Donald Trump’s presidency that have drawn media attention. The chronology is taken from Wikipedia’s pages about newsworthy events in the United States during 2016, 2017, and 2018.

The summary begins with the aftermath of the election of November 8, 2016. Not all of the events listed in Wikipedia’s chronologies occurred in the U.S., which leads me to wonder why the “migrant caravans” of 2018 aren’t included. They were and are clearly aimed at challenging Trump’s stance on immigration, and provoking incidents that cast Trump in a bad light.

At any rate, the tone of Wikipedia’s narratives — which I copied verbatim — reflects the one-sided, negative, and apocalyptic coverage that bombards those Americans who bother to read or view mainstream media outlets.

V-J Day Stirs Memories

V-J Day in the United States commemorates the official surrender of Japan to the Allied Forces, and the end of World War II. The surrender ceremony took place on September 2, 1945 (the date in Japan), beginning at 9:00 a.m. Tokyo time. The ceremony was held in Tokyo Bay, aboard USS Missouri, and was presided over by General Douglas MacArthur.

Though it was actually September 1 in the United States at the time of the ceremony, V-J Day is traditionally observed in the U.S. on September 2.

The Monday after the surrender was Labor Day in the U.S. And in those more civilized times (barbarous wars aside), school began on the day after Labor Day.

On September 4, 1945 (the day after Labor Day), I entered kindergarten at the age of 4-2/3 years. Here’s the school that I attended:

[photo: Polk School]

In those innocent days, students got to school and back home by walking. Here’s the route that I followed as a kindergartener:

[map: Route to Polk School]

A 4-year-old walking several blocks between home and school, usually alone most of the way? Unheard of today, it seems. But that was a different time, in many ways.

For more, see “The Passing of Red-Brick Schoolhouses and a Way of Life”.

Suicide or Destiny?

The list of related reading at the bottom of this post is updated occasionally.

The suicide to which I refer is the so-called suicide of the West, about which Jonah Goldberg has written an eponymous book. This is from Goldberg’s essay based on the book, “Suicide of the West” (National Review, April 12, 2018):

Almost everything about modernity, progress, and enlightened society emerged in the last 300 years. If the last 200,000 years of humanity were one year, nearly all material progress came in the last 14 hours. In the West, and everywhere that followed our example, incomes rose, lifespans grew, toil lessened, energy and water became ubiquitous commodities.

Virtually every objective, empirical measure that capitalism’s critics value improved with the emergence of Western liberal-democratic capitalism. Did it happen overnight? Sadly, no. But in evolutionary terms, it did….

Of course, material prosperity isn’t everything. But the progress didn’t stop there. Rapes, deaths by violence and disease, slavery, illiteracy, torture have all declined massively, while rights for women, minorities, the disabled have expanded dramatically. And, with the exception of slavery, which is a more recent human innovation made possible by the agricultural revolution, material misery was natural and normal for us. Then suddenly, almost overnight, that changed.

What happened? We stumbled into a different world. Following sociologist Robin Fox and historian Ernest Gellner, I call this different world “the Miracle.”…

Why stress that the Miracle was both unnatural and accidental? Because Western civilization generally, and America particularly, is on a suicidal path. The threats are many, but beneath them all is one constant, eternal seducer: human nature. Modernity often assumes that we’ve conquered human nature as much as we’ve conquered the natural world. The truth is we’ve done neither….

The Founders closely studied human nature, recognizing the dangers of despots and despotic majorities alike. They knew that humans would coalesce around common interests, forming “factions.” They also understood that you can’t repeal human nature. So, unlike their French contemporaries, they didn’t try. Instead, they established our system of separated powers and enumerated rights so that no faction, including a passionate majority, could use the state’s power against other factions.

But the Founders’ vision assumed many preconditions, the two most important of which were the people’s virtue and the role of civil society. “The general government . . . can never be in danger of degenerating into a monarchy, an oligarchy, an aristocracy, or any despotic or oppressive form so long as there is any virtue in the body of the people,” George Washington argued.

People learn virtue first and most importantly from family, and then from the myriad institutions family introduces them to: churches, schools, associations, etc. Every generation, Western civilization is invaded by barbarians, Hannah Arendt observed: “We call them children.” Civil society, starting with the family, civilizes barbarians, providing meaning, belonging, and virtue.

But here’s the hitch. When that ecosystem breaks down, people still seek meaning and belonging. And it is breaking down. Its corruption comes from reasons too numerous and complex to detail here, but they include family breakdown, mass immigration, the war on assimilation, and the rise of virtual communities pretending to replace real ones.

First, the market, as Joseph Schumpeter argued, maximizes efficiency with relentless rationality, tending to break down the sinews of tradition and the foundations of civil society that enable and instill virtue. Yet those pre-rational virtues make capitalism possible in the first place.

Second, capitalism also creates a mass class of resentful intellectuals, artists, journalists, and bureaucrats who are professionally, psychologically, and ideologically committed to undermining capitalism’s legitimacy (as noted by Schumpeter and James Burnham, the author of another book titled “Suicide of the West”). This adversarial elite is its own coalition.

Thus, people increasingly look to Washington and national politics for meaning and belonging they can’t find at home. As Mary Eberstadt recently argued, the rise in identity politics coincided with family breakdown, as alienated youth looked to the artificial tribes of racial or sexual solidarity for meaning. Populism, which always wants the national government to solve local problems, is in vogue on left and right precisely because local institutions and civil society generally no longer do their jobs. Indeed, populism is its own tribalism, because “We the People” invariably means “my people.” As Jan-Werner Müller notes in his book What Is Populism?: “Populism is always a form of identity politics.”

A video at the 2012 Democratic National Convention proclaimed that “government is the only thing we all belong to.” For conservatives, this was Orwellian. But for many Americans, it was an invitation to belong. That was the subtext of “The Life of Julia” and President Obama’s call for Americans to emulate SEAL Team Six and strive in unison — towards his goals….

The American Founding’s glory is that those English colonists took their cousins’ tradition, purified it into a political ideology, and extended it farther than the English ever dreamed. And they wrote it down, thank God. The Founding didn’t apply these principles as universally as its rhetoric implied. But that rhetoric was transformative. When the Declaration of Independence was written, some dismissed the beginning as flowery boilerplate; what mattered was the ending: Independence! But the boilerplate became a creed, and America’s story is the story of that creed — those mere words — unfolding to its logical conclusion….

It seems axiomatic to me that whatever words can create, they can destroy. And ingratitude is the destroyer’s form. We teach children that the moral of the Goose that Lays the Golden Egg is the danger of greed. But the real moral of the story is ingratitude. A farmer finds an animal, which promises to make him richer than he ever imagined. But rather than nurture and protect this miracle, he resents it for not doing more. In one version, the farmer demands two golden eggs per day. When the goose politely demurs, he kills it out of a sense of entitlement — the opposite of gratitude.

The Miracle is our goose. And rather than be grateful for it, our schools, our culture, and many of our politicians say we should resent it for not doing more. Conservatism is a form of gratitude, because we conserve only what we are grateful for. Our society is talking itself out of gratitude for the Miracle and teaching our children resentment. Our culture affirms our feelings as the most authentic sources of truth when they are merely the expressions of instincts, and considers the Miracle a code word for white privilege, greed, and oppression.

This is corruption. And it is a choice. Collectively, we are embracing entitlement over gratitude. That is suicidal.

I would put it this way: About 300 years ago there arose in the West the idea of innate equality and inalienable rights. At the same time, and not coincidentally, there arose the notion of economic betterment through free markets. The two concepts — political and economic liberty — are in fact inseparable. One cannot have economic liberty without political liberty; political liberty — the ownership of oneself — implies the ownership of the fruits of one’s own labor and the right to strive for prosperity. This latter striving, as Adam Smith pointed out, works not only for the betterment of the striver but also for the betterment of those who engage in trade with him. The forces of statism are on the march (and have been for a long time). The likely result is the loss of liberty and the vibrancy and prosperity that arises from it.

I want to be clear about liberty. It is not a spiritual state of bliss. It is, as I have written,

a modus vivendi, not the result of a rational political scheme, though a rational political scheme, such as the one laid out in the Constitution of the United States, could promote liberty.

The key to a libertarian modus vivendi is the evolutionary development and widespread observance of social norms that foster peaceful coexistence and mutually beneficial cooperation.

Liberty, in sum, is not an easy thing to attain or preserve because it depends on social comity: mutual trust, mutual respect, and mutual forbearance. These are hard to inculcate and sustain in the relatively small groupings of civil society (family, church, club, etc.). They are almost impossible to attain or sustain in a large, diverse nation-state. Interests clash and factions clamor and claw for ascendancy over other factions. (It is called tribalism, and even anti-tribalists are tribal in their striving to impose their values on others.) The Constitution, as Goldberg implies, has proved unequal to the task of preserving liberty, for reasons to which I will come.

I invoke the Constitution deliberately. This essay is about the United States, not the West in general. (Goldberg gets to the same destination after a while.) Much of the West has already committed “suicide” by replacing old-fashioned (“classical”) liberalism with oppressive statism. The U.S. is far down the same path. The issue at hand, therefore, is whether America’s “suicide” can be avoided.

Perhaps, but only if the demise of liberty is a choice. It may not be a choice, however, as Goldberg unwittingly admits when he writes about human nature.

On that point I turn to John Daniel Davidson, writing in “The West Isn’t Committing Suicide, It’s Dying of Natural Causes” (The Federalist, May 18, 2018):

Perhaps the Miracle, wondrous as it is, needs more than just our gratitude to sustain it. Perhaps the only thing that can sustain it is an older order, one that predates liberal democratic capitalism and gave it its vitality in the first place. Maybe the only way forward is to go back and rediscover the things we left behind at the dawn of the Enlightenment.

Goldberg is not very interested in all of that. He does not ask whether there might be some contradictions at the heart of the liberal order, whether it might contain within it the seeds of its undoing. Instead, Goldberg makes his stand on rather narrow grounds. He posits that the Enlightenment Miracle can be defended in purely secular, utilitarian terms, which he supposes are the only terms skeptics of liberal democratic capitalism will accept.

That forces him to treat the various illiberal ideologies that came out of Enlightenment thought (like communism) as nothing more than a kind of tribalism rather than a natural consequence of the hyper-rational scientism embedded in the liberal order itself. As Richard M. Reinsch II noted last week in an excellent review of Goldberg’s book over at Law and Liberty, “If you are going to set the Enlightenment Miracle as the standard of human excellence, one that we are losing, you must also clearly state the dialectic it introduces of an exaltation of reason, power, and science that can become something rather illiberal.”

That is to say, we mustn’t kid ourselves about the Miracle. We have to be honest, not just about its benefits but also its costs….

What about science and medical progress? What about the eradication of disease? What about technological advances? Isn’t man’s conquest of nature a good thing? Hasn’t the Enlightenment and the scientific revolution and the invention of liberal democratic capitalism done more to alleviate poverty and create wealth than anything in human history? Shouldn’t we preserve this liberal order and pass it on to future generations? Shouldn’t we inculcate in our children a profound sense of gratitude for all this abundance and prosperity?

This is precisely Goldberg’s argument. Yes, he says, man’s conquest of nature is a good thing. It’s the same species of argument raised earlier this year in reaction to Patrick Deneen’s book, “Why Liberalism Failed,” which calls into question the entire philosophical system that gave us the Miracle….

[Deneen] is not chiefly interested in the problems of the modern progressive era or the contemporary political Left. He isn’t alarmed merely by political tribalism and the fraying of the social order. Those things are symptoms, not the cause, of the illness he’s diagnosing. Even the social order at its liberal best—the Miracle itself—is part of the illness.

Deneen’s argument reaches back to the foundations of the liberal order in the sixteenth and seventeenth centuries—prior to the appearance of the Miracle, in Goldberg’s telling—when a series of thinkers embarked on a fundamentally revisionist project “whose central aim was to disassemble what they concluded were irrational religious and social norms in the pursuit of civil peace that might in turn foster stability and prosperity, and eventually individual liberty of conscience and action.”

The project worked, as Goldberg has chronicled at length, but only up to a point. Today, says Deneen, liberalism is a 500-year-old experiment that has run its course and now “generates endemic pathologies more rapidly and pervasively than it is able to produce Band-Aids and veils to cover them.”

Taking the long view of history, Deneen’s book could be understood as an extension of Lewis’s argument in “The Abolition of Man.” The replacement of moral philosophy and religion with liberalism and applied science has begun, in our lifetimes, to manifest the dangers that Lewis warned about. Deneen, writing more than a half-century after Lewis, declares that the entire liberal project manifestly has failed.

Yes, the Miracle gave us capitalism and democracy, but it also gave us hyper-individualism, scientism, and communism. It gave us liberty and universal suffrage, but it also gave us abortion, euthanasia, and transgenderism. The abolition of man was written into the Enlightenment, in other words, and the suicide of the West that Goldberg warns us about isn’t really a suicide at all, because it isn’t really a choice: we aren’t committing suicide, we’re dying of natural causes.

Goldberg is correct that we have lost our sense of gratitude, that we don’t really feel like things are as good as all that. But a large part of the reason is that the liberal order itself has robbed us of our ability to articulate what constitutes human happiness. We have freedom, we have immense wealth, but we have nothing to tell us what we should do with it, nothing to tell us what is good.

R.R. Reno, in “The Smell of Death” (First Things, May 31, 2018), comes at it this way:

At every level, our elites oppose traditional regulation of behavior based on clear moral norms, preferring a therapeutic and bureaucratic approach. They seek to decriminalize marijuana. They have deconstructed male and female roles for children. They correct anyone who speaks of “sex,” preferring to speak of “gender,” which they insist is “socially constructed.” They have ushered in a view of free speech that makes it impossible to prevent middle school boys from watching pornography on their smart phones. They insist upon a political correctness that rejects moral correctness.

The upshot is American culture circa 2018. Our ideal is a liquid world of self-definition, characterized by plenary acceptance and mutual affirmation. In practice, the children of our elites are fortunate: Their families and schools carefully socialize them into the disciplines of twenty-first-century meritocratic success while preaching openness, inclusion, and diversity. But the rest are not so fortunate. Most Americans gasp for air as they tread water. More and more drown….

Liberalism has always been an elite project of deregulation. In the nineteenth century, it sought to deregulate pre-modern economies and old patterns of social hierarchy. It worked to the advantage of the talented, enterprising, and ambitious, who soon supplanted the hereditary aristocracy.

In the last half-century, liberalism has focused on deregulating personal life. This, too, has been an elite priority. It makes options available to those with the resources to exploit them. But it has created a world in which disordered souls kill themselves with drugs and alcohol—and in which those harboring murderous thoughts feel free to act upon them.

The penultimate word goes to Malcolm Pollack (“The Magic Feather”, Motus Mentis, July 6, 2018):

Our friend Bill Vallicella quoted this, from Michael Anton, on Independence Day:

For the founders, government has one fundamental purpose: to protect person and property from conquest, violence, theft and other dangers foreign and domestic. The secure enjoyment of life, liberty and property enables the “pursuit of happiness.” Government cannot make us happy, but it can give us the safety we need as the condition for happiness. It does so by securing our rights, which nature grants but leaves to us to enforce, through the establishment of just government, limited in its powers and focused on its core responsibility.

Bill approves, and adds:

This is an excellent statement. Good government secures our rights; it does not grant them. Whether they come from nature, or from God, or from nature qua divine creation are further questions that can be left to the philosophers. The main thing is that our rights are not up for democratic grabs, nor are they subject to the whims of any bunch of elitists that manages to insinuate itself into power.

I agree all round. I hope that my recent engagement with Mr. Anton about the ontology of our fundamental rights did not give readers the impression that I doubt for a moment the importance of Americans believing they possess them, or of the essential obligation of government to secure them (or of the people to overthrow a government that won’t).

My concerns are whether the popular basis for this critically important belief is sustainable in an era of radical and corrosive secular doubt (and continuing assault on those rights), and whether the apparently irresistible tendency of democracy to descend into faction, mobs, and tyranny was in fact a “poison pill” baked into the nation at the time of the Founding. I am inclined to think it was, but historical contingency and inevitability are nearly impossible to parse with any certainty.

Arnold Kling (“Get the Story Straight”, Library of Economics and Liberty, July 9, 2018) is more succinct:

Lest we fall back into a state of primitive tribalism, we need to understand the story of the Miracle. We need to understand that it is unnatural, and we should be grateful for the norms and institutions that restrained human nature in order to make the Miracle possible.

All of the writers I have quoted are on to something, about which I have written in “Constitution: Myths and Realities”. I call it the Framers’ fatal error.

The Framers held a misplaced faith in the Constitution’s checks and balances (see Madison’s Federalist No. 51 and Hamilton’s Federalist No. 81). The Constitution’s wonderful design — containment of a strictly limited central government through horizontal and vertical separation of powers — worked rather well until the Progressive Era. The design then cracked under the strain of greed and the will to power, as the central government began to impose national economic regulation at the behest of muckrakers and do-gooders. It broke during the New Deal, which opened the floodgates to violations of constitutional restraint (e.g., Medicare, Medicaid, Obamacare, the vast expansion of economic regulation, and the destruction of civilizing social norms), as the Supreme Court has enabled the national government to impose its will in matters far beyond its constitutional remit.

In sum, the “poison pill” baked into the nation at the time of the Founding is human nature, against which no libertarian constitution is proof unless it is enforced resolutely by a benign power.

Barring that, it may be too late to rescue liberty in America. I am especially pessimistic because of the unraveling of social comity since the 1960s, and because of a related development: the frontal assault on freedom of speech, which is the final constitutional bulwark against oppression.

Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-speech/filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the airwaves, the internet, theater screens, and “entertainment” venues.

The 1960s and early 1970s were a tantrum-throwing time, and many of the tantrum-throwers moved into positions of power, influence, and wealth, having learned from the success of their main ventures: the end of the draft and the removal of Nixon from office. They schooled their psychological descendants well, and sometimes literally on college campuses. Their successors on the campuses of today — students, faculty, and administrators — carry on the tradition of reacting with violent hostility toward persons and ideas that they oppose, and supporting draconian punishments for infractions of their norms and edicts. (For myriad examples, see The College Fix.)

Adherents of the ascendant culture esteem protest for its own sake, and have stock explanations for all perceived wrongs (whether or not they are wrongs): racism, sexism, homophobia, Islamophobia, hate, white privilege, inequality (of any kind), Wall Street, climate change, Zionism, and so on. All of these are to be combated by state action that deprives citizens of economic and social liberties.

In particular danger are the freedoms of speech and association. The purported beneficiaries of the campaign to destroy those freedoms are “oppressed minorities” (women, Latinos, blacks, Muslims, the gender-confused, etc.) and the easily offended. The true beneficiaries are leftists. Free speech is speech that is acceptable to the left. Otherwise, it’s “hate speech”, and must be stamped out. Freedom of association is bigotry, except when it is practiced by leftists in anti-male, anti-conservative, pro-ethnic, and pro-racial causes. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty.

The organs of the state have been enlisted in an unrelenting campaign against civilizing social norms. We now have not just easy divorce, subsidized illegitimacy, and legions of non-mothering mothers, but also abortion, concerted (and deluded) efforts to defeminize females and to neuter or feminize males, forced association (with accompanying destruction of property and employment rights), suppression of religion, absolution of pornography, and the encouragement of “alternative lifestyles” that feature disease, promiscuity, and familial instability.

The state, of course, doesn’t act of its own volition. It acts at the behest of special interests — interests with a “cultural” agenda. They are bent on the eradication of civil society — nothing less — in favor of a state-directed Rousseauvian dystopia from which Judeo-Christian morality and liberty will have vanished, except in Orwellian doublespeak.

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity. The history of the United States since the 1960s supports the proposition that the nation is indeed going to hell in a handbasket.

In fact, the speed at which it is going to hell seems to have accelerated since the Charleston church shooting and the legal validation of same-sex “marriage” in 2015. It’s a revolution (e.g., this) piggy-backing on mass hysteria. Here’s the game plan:

  • Define opposition to illegal immigration, Islamic terrorism, same-sex marriage, transgenderism, and other kinds of violent and anti-social behavior as “hate”.
  • Associate “hate” with conservatism.
  • Watch as normally conservative politicians, business people, and voters swing left rather than look “mean” and put up a principled fight for conservative values. (Many of them can’t put up such a fight, anyway. Trump’s proper but poorly delivered refusal to pin all of the blame on neo-Nazis for the Charlottesville riot just added momentum to the left’s cause because he’s Trump and a “fascist” by definition.)
  • Watch as Democrats play the “hate” card to retake the White House and Congress.

With the White House in the hands of a left-wing Democrat (is there any other kind now?) and an aggressive left-wing majority in Congress, freedom of speech, freedom of association, and property rights will become not-so-distant memories. “Affirmative action” (a.k.a. “diversity”) will be enforced on an unprecedented scale of ferocity. The nation will become vulnerable to foreign enemies while billions of dollars are wasted on the hoax of catastrophic anthropogenic global warming and “social services” for the indolent. The economy, already buckling under the weight of statism, will teeter on the brink of collapse as the regulatory regime goes into high gear and entrepreneurship is all but extinguished by taxation and regulation.

All of that will be secured by courts dominated by left-wing judges — from here to eternity.

And most of the affluent white dupes who enabled the revolution will come to rue their actions. But they won’t be free to say so.

Thus will liberty — and prosperity — die in America.

And it is possible that nothing can prevent it because it is written in human nature; specifically, a penchant for the kind of mass hysteria that seems to dominate campuses, the “news” and “entertainment” media, and the Democrat Party.

Christopher Booker describes this phenomenon presciently in his book about England and America of the 1950s and 1960s, The Neophiliacs (1970):

[T]here is no dream so powerful as one generated and subscribed to by a whole mass of people simultaneously — one of those mass projections of innumerable individual neuroses which we may call a group fantasy. This is why the twentieth century has equally been dominated by every possible variety of collective make-believe — whether expressed through mass political movements and forms of nationalism, or through mass social movements….

Any group fantasy is in some sense a symptom of social disintegration, of the breaking down of the balance and harmony between individuals, classes, generations, the sexes, or even nations. For the organic relationships of a stable and secure community, in which everyone may unself-consciously exist in his own separate place and right, a group fantasy substitutes the elusive glamor of identification with a fantasy community, of being swept along as part of a uniform mass united in a common cause. But the individuals making up the mass are not, of course, united in any real sense, except through their common dress, catch phrases, slogans, and stereotyped attitudes. Behind their conformist exteriors they remain individually as insecure as ever — and indeed become even more so, for the collective dream, such as that expressed through mass advertising or the more hysterical forms of fashion, is continually aggravating their fantasy-selves and appealing to them through their insecurities to merge themselves in the mass ever more completely….

This was the phenomenon of mass psychology which was portrayed in an extreme version by George Orwell in his 1984…. But in fact the pattern described was that of every group fantasy; exactly the same that we can see, for instance, in the teen age subculture of the fifties and sixties, … or that of the left-wing progressive intellectuals, with their dream heroes such as D. H. Lawrence or Che Guevera and their ritual abuse of the “reactionaries”….

… Obviously no single development in history has done more to promote both social disintegration and unnatural conformity than the advance and ubiquity of machines and technology. Not only must the whole pressure of an industrialized, urbanized, mechanized society tend to weld its members into an ever more rootless uniform mass, by the very nature of its impersonal organization and of the processes of mass-production and standardization. But in addition the twentieth century has also provided two other factors to aggravate and to feed the general neurosis; the first being the image-conveying apparatus of films, radio, television, advertising, mass-circulation newspapers and magazines; the second the feverishly increased pace of life, from communications and transport to the bewildering speed of change and innovation, all of which has created a profound subconscious restlessness which neurotically demands to be assuaged by more speed and more change of every kind….

The essence of fantasy is that it feeds on a succession of sensations or unresolved images, each one of which arouses anticipation, followed by inevitable frustration, leading to the demand for a new image to be put in its place. But the very fact that each sensation is fundamentally unsatisfying means that the fantasy itself becomes progressively more jaded…. And so we arrive at the fantasy spiral.

Whatever pattern of fantasy we choose to look at … we shall find that it is straining through a spiral of increasingly powerful sensations toward some kind of climax…. What happens therefore is simply that, in its pursuit of the elusive image of life, freedom, and self-assertion, the fantasy pushes on in an ever-mounting spiral of demand, ever more violent, more dream-like and fragmentary, and ever more destructive of the framework of order. Further and further pushes the fantasy, always in pursuit of the elusive climax, always further from reality — until it is actually bringing about the very opposite of its aims.

That, of course, is what will happen when the left and its dupes bring down the Constitution and all that it was meant to stand for: the protection of citizens and their voluntary institutions and relationships from predators, including not least governmental predators and the factions they represent.

The Constitution, in short, was meant to shield Americans from human nature. But it seems all too likely that human nature will destroy the shield.

Thus my call for a “Preemptive (Cold) Civil War”.


Related reading:
Fred Reed, “The Symptoms Worsen”, Fred on Everything, March 15, 2015
Christopher Booker, Global Warming: A Case Study in Groupthink, Global Warming Policy Foundation, 2018
Michael Mann, “Have Wars and Violence Declined?”, Theory and Society, February 2018
John Gray, “Steven Pinker Is Wrong about Violence and War”, The Guardian, March 13, 2015
Nikita Vladimirov, “Scholar Traces Current Campus Intolerance to 60’s Radicals”, Campus Reform, March 14, 2018
Nick Spencer, “Enlightenment and Progress: Why Steven Pinker Is Wrong”, Mercatornet, March 19, 2018
Steven Hayward, “Deja Vu on Campus?”, PowerLine, April 15, 2018
William A. Nitze, “The Tech Giants Must Be Stopped”, The American Conservative, April 16, 2018
Steven Hayward, “Jonah’s Suicide Hotline, and All That Stuff”, PowerLine, May 15, 2018
Jeff Groom, “40 Years Ago Today: When Solzhenitsyn Schooled Harvard”, The American Conservative, June 8, 2018
Graham Allison, “The Myth of the Liberal Order: From Historical Accident to Conventional Wisdom”, Foreign Affairs, July/August 2018
Gilbert T. Sewall, “The America That Howard Zinn Made”, The American Conservative, July 10, 2018
Mary Eberstadt, “Two Nations, Revisited”, National Affairs, Summer 2018

Related posts and pages:
Constitution: Myths and Realities
Leftism
The Psychologist Who Played God
We, the Children of the Enlightenment
Society and the State
The Eclipse of “Old America”
Genetic Kinship and Society
The Fallacy of Human Progress
The Culture War
Ruminations on the Left in America
1963: The Year Zero
Academic Ignorance
The Euphemism Conquers All
Defending the Offensive
Superiority
Whiners
A Dose of Reality
Turning Points
God-Like Minds
Non-Judgmentalism as Leftist Condescension
An Addendum to (Asymmetrical) Ideological Warfare
Social Justice vs. Liberty
The Left and “the People”
Liberal Nostrums
Liberty and Social Norms Re-examined
Equality
Academic Freedom, Freedom of Speech, and the Demise of Civility
Leftism As Crypto-Fascism: The Google Paradigm
What’s Going On? A Stealth Revolution
Disposition and Ideology
Down the Memory Hole
“Tribalists”, “Haters”, and Psychological Projection
Mass Murder: Reaping What Was Sown
Utopianism, Leftism, and Dictatorship
The Framers, Mob Rule, and a Fatal Error
Abortion, the “Me” Generation, and the Left
Abortion Q and A
Whence Polarization?
Negative Rights, Etc.
Social Norms, the Left, and Social Disintegration
Order vs. Authority
Can Left and Right Be Reconciled?
Rage on the Left
Rights, Liberty, the Golden Rule, and Leviathan

Recommended Reading

Leftism, Political Correctness, and Other Lunacies (Dispatches from the Fifth Circle Book 1)

On Liberty: Impossible Dreams, Utopian Schemes (Dispatches from the Fifth Circle Book 2)

We the People and Other American Myths (Dispatches from the Fifth Circle Book 3)

Americana, Etc.: Language, Literature, Movies, Music, Sports, Nostalgia, Trivia, and a Dash of Humor (Dispatches from the Fifth Circle Book 4)

Lincoln Was Wrong

Michael Stokes Paulsen and his son Luke opine:

[A]t the heart of the Civil War, the crisis that triggered it, and the changes that it brought were enormous constitutional issues. Indeed, it is no exaggeration to say that the Civil War was fought over the meaning of the Constitution, and over who would have the ultimate power to decide that meaning. The Civil War decided—on the battlefields rather than in the courts—the most important constitutional questions in our nation’s history: the nature of the Union under the Constitution, the status and future of slavery, the powers of the national government versus the states, the supremacy of the Constitution, and the wartime powers of the president as commander in chief. It was the Civil War, not any subsequent judicial decision, that “overruled” the Supreme Court’s atrocious decision in Dred Scott v. Sandford creating a national constitutional right to own slaves….

The United States is the nation it is today because of Lincoln’s unwavering commitment to the Constitution as governing a single, permanent nation and forbidding secession. Lincoln’s vision of Union is so thoroughly accepted today that we forget how hotly disputed it was for the first seventy years of our nation’s history. The result was hardly inevitable. Lincoln’s vision and resolve saved the nation. Lincoln’s nationalist views have shaped every issue of federalism and sovereignty for the past one hundred fifty years. Compared with the constitutional issues over which the Civil War was fought, today’s disputes over federal-versus-state power are minor-league ball played out on a field framed by Lincoln’s prevailing constitutional vision of the United States as one nation, indivisible.

On the president’s constitutional duty: Lincoln understood his oath to impose an absolute personal moral and legal duty not to cave in to wrong, destructive views of the Constitution. He fought on the campaign trail for his understanding of Union and of the authority of the national government to limit the spread of slavery. Once in office, he understood his oath to impose on him an irreducible moral and legal duty of faithful execution of the laws, throughout the Union. It was a duty he could not abandon for any reason. [“The Great Interpreter”, University of St. Thomas (Minnesota) Research Paper No. 15-09, April 17, 2017]

Whence Lincoln’s view of the Union? This is from the Paulsens’ book, The Constitution: An Introduction:

Lincoln was firmly persuaded that secession was unconstitutional. Immediately upon taking office as President, in his First Inaugural Address, Lincoln— a careful constitutional lawyer— laid out in public his argument as to why secession was unconstitutional: The Constitution was the supreme law of the land, governing all the states. The Constitution did not provide that states could withdraw from the Union, and to infer such a right was contrary to the letter and spirit of the document. The Constitution’s Preamble announced the objective of forming a “more perfect Union” of the states than had existed under the Articles of Confederation, which themselves had said that the Union would be “perpetual.” Moreover, the Constitution created a true national government, not a mere “compact,” league, or confederacy— in fact, it explicitly forbade states from entering into alliances, confederacies, or treaties outside of national authority. The people of the United States, taken as a whole, were sovereign, not the states.

It followed from these views, Lincoln argued, that “no State upon its own mere motion can lawfully get out of the Union; that resolves and ordinances to that effect are legally void, and that acts of violence within any State or States against the authority of the United States are insurrectionary or revolutionary, according to circumstances.” Purported secession was simply an illegal— unconstitutional— rebellion against the Union.

Lincoln’s position, which the Paulsens seem to applaud, is flawed at its root. The Constitution did not incorporate the Articles of Confederation, it supplanted them. The “perpetual Union” of the Articles vanished into thin air upon the adoption of the Constitution. Moreover, the “more perfect Union” of the Constitution’s preamble is merely aspirational, as are the desiderata that follow it:

establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity.

“More perfect”, if it means anything, means that the Constitution created a central government where there was none before. The Constitution is silent about perpetuity. It is silent about secession. Therefore, one must turn elsewhere to find (or reject) a legal basis for secession, but not to the Civil War.

The Civil War “decided” the issue of secession in the same way that World War I “decided” the future of war. It was the “war to end all wars”, was it not? Therefore, tens of millions of deaths to the contrary notwithstanding, there have been no wars since the Armistice of 1918. By the same logic, the thief who steals your car or the vandal who defaces your home or the scam artist who takes your life savings has “decided” that you don’t own a car, or that your home should be ugly, or that your savings are really his. Thus does might make right, as the Paulsens would have it.

There is in fact a perfectly obvious and straightforward case for unilateral secession, which I have made elsewhere, including “A Resolution of Secession”. You should read all of it if you are a rabid secessionist — or a rabid anti-secessionist. Here are some key passages:

The Constitution is a contract — a compact in the language of the Framers. The parties to the compact are not only the States but also the central government….

Lest there be any question about the status of the Constitution as a compact, we turn to James Madison, who is often called the Father of the Constitution. Madison, in a letter to Daniel Webster dated March 15, 1833, addresses

the question whether the Constitution of the U.S. was formed by the people or by the States, now under a theoretic discussion by animated partizans.

Madison continues:

It is fortunate when disputed theories, can be decided by undisputed facts. And here the undisputed fact is, that the Constitution was made by the people, but as imbodied into the several states, who were parties to it and therefore made by the States in their highest authoritative capacity….

[I]n The Federalist No. 39, which informed the debates in the various States about ratification….

Madison leaves no doubt about the continued sovereignty of each State and its people. The remaining question is this: On what grounds, if any, may a State withdraw from the compact into which it entered voluntarily?

There is a judicial myth — articulated by a majority of the United States Supreme Court in Texas v. White (1869) — that States may not withdraw from the compact because the union of States is perpetual….

The Court’s reasoning is born of mysticism, not legality. Similar reasoning might have been used — and was used — to assert that the Colonies were inseparable from Great Britain. And yet, some of the people of the Colonies put an end to the union of the Colonies and Great Britain, on the moral principle that the Colonies were not obliged to remain in an abusive relationship. That moral principle is all the more compelling in the case of the union known as the United States, which — mysticism aside — is nothing more than the creature of the States, as authorized by the people thereof.

In fact, the Constitution supplanted the Articles of Confederation and Perpetual Union, by the will of only nine of the thirteen States….

[I]n a letter to Alexander Rives dated January 1, 1833, Madison says that

[a] rightful secession requires the consent of the others [other States], or an abuse of the compact, absolving the seceding party from the obligations imposed by it.

An abuse of the compact most assuredly necessitates withdrawal from it, on the principle of the preservation of liberty, especially if that abuse has been persistent and shows no signs of abating. The abuse, in this instance, has been and is being committed by the central government.

The central government is both a creature of the Constitution and a de facto party to it, as co-sovereign with the States and supreme in its realm of enumerated and limited powers. One of those powers enables the Supreme Court of the United States to decide “cases and controversies” arising under the Constitution, which alone makes the central government a responsible party. More generally, the high officials of the central government acknowledge the central government’s role as a party to the compact — and the limited powers vested in them — when they take oaths of office requiring them to uphold the Constitution.

Many of those high officials have nevertheless committed myriad abuses of the central government’s enumerated and limited powers. The abuses are far too numerous to list in their entirety. The following examples amply justify the withdrawal of the State of _______________ from the compact….

We, therefore, the representatives of the people of _______________ do solemnly publish and declare that this State ought to be free and independent; that it is absolved from all allegiance to the government of the United States; that all political connection between it and the government of the United States is and ought to be totally dissolved; and that as a free and independent State it has full power to levy war, conclude peace, contract alliances, establish commerce, and to do all other acts and things which independent States may of right do. And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our lives, our fortunes and our sacred honor.


See “The Constitution: Myths and Realities”.

The JFK Standard

I believe that the media’s treatment of a president has more to do with his party affiliation and various “cosmetic” factors than with his policies or executive competence. To test this hypothesis (unscientifically), I constructed a table listing six factors, and gave JFK a 10 for each of them. (He was the last president to enjoy an extended honeymoon with the media, and it had to have been based more on “cosmetic” factors than anything else.) I then quickly assigned relative scores to each of JFK’s successors — and didn’t go back to change any scores. I didn’t add the scores until I had assigned every president a score for all six factors. Here’s the result:

Given the media’s long-standing bias toward Democrats, I arbitrarily gave each Democrat 10 points for party affiliation, as against zero for the Republicans. “Looks,” “wit,” and “elegance” (as seen through the eyes of the media) should be self-explanatory. “Wife” and “children” refer to contemporaneous media perceptions of each president’s wife and child or children. Jackie was the perfect First Lady, from the standpoint of looks, poise, and “culture.” And Caroline and John Jr. epitomized “adorable,” unlike the older and often unattractive (or notorious) offspring of later presidents.

I’d say that the total scores are a good indication of the relative treatment — favorable, middling, and unfavorable — given each president by the media.


Related:
Facts about Presidents
Is Character Really an Issue?
Blindsided by the Truth
Rating the Presidents, Again
The Modern Presidency: A Tour of American History
Nonsense about Presidents, IQ, and War
1963: The Year Zero

Presidents and War

This post is prompted by a recent exchange with former think-tank colleagues about H.R. McMaster’s Dereliction of Duty: Johnson, McNamara, the Joint Chiefs of Staff. I’ve just started it, having until now steadfastly eschewed rehashes of the Vietnam War since its ignominious end. My assessment of LBJ’s handling of the Vietnam War is based entirely on my knowledge of the war as it unfolded and unraveled, and subsequent reflections on that knowledge. I’ll review McMaster’s book in a later post.

The president’s role as commander-in-chief is a two-edged sword. It was wielded ably by Lincoln and FDR (until the end-game in Europe), and badly by Truman, LBJ, Bush I, Bush II, and Obama.

Starting with Bush II, I believe that he made the right strategic decision, which was to bring the Middle East under control instead of leaving it hostage to the whims of Saddam. (Some will say that Saddam was contained, but — in my view — he was a threat to the Middle East if not to the U.S. as long as he was in power.) That may not have been what Bush intended, but it is what he could have achieved, and would have achieved had he committed the forces necessary to bring Iraq firmly under control. Instead, he followed Rumsfeld’s do-it-on-the-cheap advice for too long. Anyway, Bush got bogged down, much as LBJ had done with his “signaling” and gradualism in Vietnam. The 2007 surge might have turned things around, but Bush had run out of political capital and couldn’t commit the forces needed to stabilize Iraq for the long haul (and neutralize Iran), even if he had wanted to.

Obama then followed his anti-colonial impulses and converted potential stability into the mess that we see today.

Bush I set it all up when he declined the golden opportunity to depose Saddam in 1991.

Truman’s handling of the Korean War could be defended as making the best of a bad situation. But Truman’s decision to accept a stalemate instead of taking on the Chinese, as MacArthur urged, was a strategic miscalculation of the first order. It signaled to Russia and China the unwillingness of U.S. leaders to push back against Communist expansion. LBJ reinforced that signal in Vietnam. It took Reagan, who pursued a defense buildup in the face of chicken-little screams from the defeatist left, to push the USSR to its breaking point.

To paraphrase Andy Granatelli, you can pay now or pay later, but pay you will. I fear that the long-run price of the defense build-down under Obama will be high.