Month: June 2017

Terrorism Isn’t an Accident

Some writers have just caught on to what I have been saying for eleven years. For instance, Michael Brendan Dougherty says this about the London Bridge attacks earlier this month:

Imagine if Saturday’s three London Bridge killers had been British Nationalist party thugs, ramming their car through a Pakistani neighborhood. Would a single decent person have heard the news and immediately said, “Well, this number of dead people is statistically insignificant compared to those that die in car accidents. These punks can’t threaten our society!” Would anyone have asked, “Why are we talking about the killer’s politics? There are thousands of gun murders in America every year and those killers don’t have their politics talked about.” Would they have felt like singing John Lennon’s “Imagine” the next morning to conjure up a vision of a day when people of all political creeds can get along?

We all know the answer.

And yet, even before the victims on London Bridge had stopped bleeding, this was the reaction among society’s best, brightest and most morally self-assured members on social media. The pattern is by now familiar. Even as an Islamic terrorist killer’s proclamations about Allah’s will are still ringing in victims’ ears, these individuals are already declaring that the true danger from the attack is an Islamophobic backlash, and that you’re more likely to die by drowning in your own swimming pool than from a terrorist attack.

Do they know how callous that sounds? Do they not realize that sensible human beings react differently to a car accident than to a murder plot? Or that states and car manufacturers are constantly working to decrease the lethality of driving, while terrorists are constantly trying to improve the lethality of their enterprise? Terrorist acts have now become “the kind of thing that inevitably flares up and causes some damage before the experts put it out,” according to one media wit. Or consider Vox’s Will Wilkinson, who wrote that “If it is truly the case that the risk of death by Islamic terrorism can be reduced to approximately zero through official anti-terror zeal, that suggests the threat is manageable — indeed, that it is being managed.” [“Stop Comparing the London Attacks to Freak Accidents and Natural Disasters”, National Review, June 7, 2017]

Wilkinson proves, once again, that he’s a world-class asshole. But Dougherty doesn’t decisively squelch the fatuousness of the apologists for terrorism. Stefan Kanfer does a better job:

Sometimes the attacks are made with firearms. Sometimes they’re done with trucks on busy bridges or thronged thoroughfares or with bombs at rock performances or with pressure-cookers at marathons. Terrorists constantly revise their tactics, but the assurances from the media are unvarying.

Make America safe again? inquires a Newsweek headline. But America is already safe. The story goes on to explain that “the attacks in San Bernardino, California, and Orlando, Florida, certainly set people on edge, but Americans have a better chance at being killed by lightning or drowning in their own bathtubs than being killed by a terrorist.” In the New York Times, columnist Nicholas Kristof falls in lockstep. “Most years in the U.S., ladders kill far more Americans than Muslim terrorists do. Same with bathtubs. Ditto for stairs. And lightning.” Boston Globe columnist Michael A. Cohen is more statistical but just as banal: “Even unintentional drowning—in bathtubs, lakes and swimming pools—kills more than 3,500 people a year, a number approximately 1,800 times larger than the number of people killed by terrorism.”

These soothing statements ignore a fundamental fact: bathtubs, ladders, and lightning don’t have as their object the murder and maiming of human beings—jihadists do. Those who perish by accident are victims. Murdered people are prey. If it turned out that the ladder companies were putting Teflon on the top rungs so people would slip off and die, then we would expect the government to respond swiftly to the threat, even if only a few people were killed this way. [“Lightning, Bathtubs, and Banality”, City Journal, June 29, 2017]

Not bad. Could be better. Like this post of mine:

Glen Whitman, an associate professor of economics and co-proprietor of Agoraphilia, amply demonstrates muddleheadedness in “Perspective on Terrorism”, where he says this:

At Cato Unbound, in response to a lead essay by John Mueller, Clark Kent Ervin rejects the comparison of the death rate from terrorism to the death rates from bee stings, lightning, drowning, etc. Ervin’s argument is so unpersuasive (to me) that I think it deserves a fisking.

It is undoubtedly true that Americans are far more likely to die from “bee stings, lightning, or accident-causing deer” than terrorism, but so what? … This statistical argument implicitly equates deaths from bee stings, lightning or close encounters with marauding deer with deaths from terrorism.

They should be equated. It doesn’t make sense to spend a billion dollars to prevent one death by terrorism if the same billion dollars could prevent ten or a hundred deaths by other causes. Death is death. It can be sensible to give different treatment to deaths by different causes, but only if there’s some reason to think one cause of death is more easily deterred than another.

Ervin may have made his case badly, but he is right and Whitman is wrong. To see why, let’s go back to Mueller’s statement:

Although polls continue to show Americans notably concerned that they or members of their families might die at the hands of terrorists, astronomer Alan Harris has calculated that, at present rates and including the disaster of 9/11 in the consideration, the chances any individual resident of the globe will be killed by an international terrorist over the course of an 80-year lifetime is about 1 in 80,000, about the same likelihood of being killed over the same interval from the impact on the Earth of an especially ill-directed asteroid or comet. At present, Americans are vastly more likely to die from bee stings, lightning, or accident-causing deer than by terrorism within the country. That seems pretty safe.

That seems “pretty safe” only because the United States (and many other countries) have taken affirmative steps to detect and thwart terrorist attacks before they occur. We have seen the enemy’s successes. But we are unaware of many of his failures because it is stupid to give the enemy an inkling of how we have achieved all of our successes.

Comparing the ex post death rate from terrorism with such unpreventable and/or random events as asteroid strikes, bee stings, lightning strikes, or deer-caused accidents is a classic demonstration of academic cluelessness. Those unpreventable and/or random events will occur regardless of terrorism. Terrorism is an additional threat — not an alternative one. The worst mistake we can make is to underestimate that threat.

Now, Whitman would say that we can spend less money on the war on terror and more to prevent asteroid strikes, for example, and that we ought to determine the right balance of spending between the two activities. That’s fine, as far as it goes, but a correct determination of the balance of spending cannot be made by using probabilities of the type cited by Mueller.

Asteroids, bees, lightning, and deer — unlike terrorists — are not sentient enemies. Ignoring those “threats” will not enable them to increase their “attacks” on us; they will do what they will do, according to the “laws of nature.” Ignoring terrorists, on the other hand, certainly would enable them — and encourage them — to increase their attacks on us. The apparently low probability of being killed by a terrorist is low precisely because we have spent a lot of money to make it low.

Moreover, we must continue to spend a lot of money to keep that probability low. If we fail to do so, we will then find out what it is like to be besieged — to live lives that are markedly poorer, filled with anxiety, and isolated from that large part of the world in which Islamism will have triumphed. [“A Skewed Perspective on Terrorism”, Politics & Prosperity (originally at Liberty Corner), September 20, 2006]

Six years later:

Cato’s loony libertarians (on matters of defense) once again trot out Herr Doktor Professor John Mueller. He writes:

We have calculated that, for the 12-year period from 1999 through 2010 (which includes 9/11, of course), there was one chance in 22 million that an airplane flight would be hijacked or otherwise attacked by terrorists. [“Serial Innumeracy on Homeland Security”, Cato@Liberty, July 24, 2012]

Mueller’s “calculation” consists of a recitation of known terrorist attacks pre-Benghazi and speculation about the status of Al-Qaeda. Note to Mueller: It is the unknown unknowns that kill you. I refer Herr Doktor Professor to “Riots, Culture, and the Final Showdown” and “Mission Not Accomplished.” [“Not-So-Random Thoughts (VI)”, Politics & Prosperity, October 11, 2012]

It’s no wonder that good-old boys scorn pointy-headed idiots like Wilkinson, Kristof, Cohen, Whitman, and Mueller. They are to logic as “modern” music is to music: full of sound and fury, signifying nothing.

Supreme Court Page Updated

I have updated “U.S. Supreme Court: Lines of Succession and Ideological Alignment” to cover the recently completed October 2016 term. Observations:

The fairly harmonious 2014 term was succeeded by more typical (i.e., more divided) 2015 and 2016 terms. In fact, 2016 was even more polarized than 2015.

Kennedy’s long-standing propensity to defect more often than his “conservative” colleagues grew markedly in the 2014 and 2015 terms and receded a bit in the 2016 term. Kennedy probably should be counted as a member of the Court’s “liberal” wing, but I won’t make that call until the end of the 2017 term. Perhaps Kennedy will have done the right thing and retired by then.

Roberts is more in step with the “conservative” wing than he had been in the previous three terms, but he isn’t back to where he was in 2005-2011.

Academic Freedom, Freedom of Speech, and the Demise of Civility

A professor has been suspended after claiming that anyone witnessing white people in mortal danger should “let them fucking die.”

Daniel Payne, The College Fix (June 27, 2017)

Predictably, the suspension of the professor — one Johnny Eric Williams of Trinity College in Hartford, Connecticut — and a similar case at Essex County (New Jersey) College caused the usual (left-wing) suspects to defend the offenders and claim that their “academic freedom” and “freedom of speech” were being violated. (Boo hoo.) I will be unsurprised if the ACLU weighs in on the side of hate.

This is what happens when the law becomes an abstraction, separate and apart from social norms. There is no better example of the degradation of the law, and of public discourse, than the case of Snyder v. Phelps, which I addressed in “Rethinking the Constitution: ‘Freedom of Speech and of the Press’”. What follows is based on that post.

Contrary to the current state of constitutional jurisprudence, freedom of speech and freedom of the press — and, by implication, academic freedom — do not comprise an absolute license to “express” almost anything, regardless of the effects on the social fabric.

One example of misguided absolutism is found in Snyder v. Phelps, a case recently and wrongly decided by the U.S. Supreme Court. This is from “The Burkean Justice” (The Weekly Standard, July 18, 2011):

When the Supreme Court convened for oral argument in Snyder v. Phelps, judicial formalities only thinly veiled the intense bitterness smoldering among the parties and their supporters. At one table sat counsel for Albert Snyder, father of the late Marine Lance Corporal Matthew Snyder, who was killed in al Anbar Province, Iraq. At the other sat Margie Phelps, counsel for (and daughter of) Fred Phelps, whose notorious Westboro Baptist Church descended upon Snyder’s Maryland funeral, waving signs bearing such startlingly offensive slogans as “Thank God for IEDs,” “God Hates Fags,” and “Thank God for Dead Soldiers.” A federal jury had awarded Snyder nearly $11 million for the “severe depression” and “exacerbated preexisting health conditions” that Phelps’s protest had caused him.

In the Supreme Court, Phelps argued that the jury’s verdict could not stand because the First Amendment protected Westboro’s right to stage their protest outside the funeral. As the Court heard the case on a gray October morning, Westboro protesters marched outside the courthouse, informing onlookers that God still “Hates Fags” and advising them to “Pray for More Dead Soldiers.”

Amidst that chaos, the Court found not division, but broad agreement. On March 2, 2011, it held that Westboro’s slurs were protected by the First Amendment, and that Snyder would receive no compensation, let alone punitive damages, for the emotional injuries that he had suffered. Chief Justice John Roberts wrote the Court’s opinion, speaking for all of his brethren, conservatives and liberals alike—except one.

Justice Samuel Alito rejected the Court’s analysis and wrote a stirring lone dissent. “The Court now holds that the First Amendment protected respondents’ right to brutalize Mr. Snyder. I cannot agree.” Repeatedly characterizing Westboro’s protest as not merely speech but “verbal assaults” that “brutally attacked” the fallen Snyder and left the father with “wounds that are truly severe and incapable of healing themselves,” Justice Alito concluded that the First Amendment’s text and precedents did not bar Snyder’s lawsuit. “In order to have a society in which public issues can be openly and vigorously debated, it is not necessary to allow the brutalization of innocent victims. . . . I therefore respectfully dissent.”

There is more:

Snyder v. Phelps would not be the last time that Alito stood nearly alone in a contentious free speech case this term. Just weeks ago, as the Court issued its final decisions of the term, Alito rejected the Court’s broad argument that California could not ban the distribution of violent video games without parental consent. Although he shared the Court’s bottom-line conclusion that the particular statute at issue was unconstitutional, he criticized the majority’s analysis in Brown v. Entertainment Merchants Association as failing to give states and local communities latitude to promote parental control over children’s video-game habits. The states, he urged, should not be foreclosed from passing better-crafted statutes achieving that legitimate end.

Moreover, Alito’s opinions in those cases followed a solo dissent late in the previous term, in United States v. Stevens, where eight of the nine justices struck down a federal law barring the distribution of disturbing “crush videos” in which, for example, a woman stabs a kitten through the eye with her high heel, all for the gratification of anonymous home audiences.

The source of Alito’s positions:

[T]hose speculating as to the roots of Alito’s jurisprudence need look no further than his own words—in public documents, at his confirmation hearing, and elsewhere. Justice Alito is uniquely attuned to the space that the Constitution preserves for local communities to defend the vulnerable and to protect traditional values. In these three new opinions, more than any others, he has emerged as the Court’s Burkean justice….

A review of Alito’s Snyder, Brown, and Stevens opinions quickly suggests the common theme: Alito, more than any of his colleagues, would not allow broad characterizations of the freedom of speech effectively to immunize unlawful actions. He sharply criticized the Court for making generalized pronouncements on the First Amendment’s reach, when the Court’s reiterations of theory glossed over the difficult factual questions that had given rise to regulation in the first place—whether in grouping brutal verbal attacks with protected political speech; or in equating interactive Duke Nukem games with the text of Grimm’s Fairy Tales; or in extending constitutional protection to the video of women illegally crushing animals. And Alito was particularly sensitive to the Court’s refusal to grant at least a modicum of deference to the local communities and state officials who were attempting to protect their populations against actions that they found so injurious as to require state intervention.

A general and compelling case against the current reign of absolutism is made by David Lowenthal in No Liberty for License: The Forgotten Logic of the First Amendment. My copy is now in someone else’s hands, so I must rely on Edward J. Erler’s review of the book:

Liberty is lost when the law allows “freedom of speech, and of the press” to undermine the social norms that enable liberty. Liberty is not an abstraction; it is the scope of action that is allowed by socially agreed upon rights. It is that restrained scope of action which enables people to coexist willingly, peacefully, and cooperatively for their mutual benefit. Such coexistence depends greatly on mutual trust, respect, and forbearance. Liberty is therefore necessarily degraded when courts sunder social restraints in the name of liberty.

Other related posts:
On Liberty
Line-Drawing and Liberty
The Meaning of Liberty
Positive Liberty vs. Liberty
Facets of Liberty
Burkean Libertarianism
What Is Libertarianism?
True Libertarianism, One More Time
Human Nature, Liberty, and Rationalism
Liberty, Negative Rights, and Bleeding Hearts
Why Conservatism Works
Liberty and Society
Liberty as a Social Construct: Moral Relativism?
Defending Liberty against (Pseudo) Libertarians
Defining Liberty
The Futile Search for “Natural Rights”
The Pseudo-Libertarian Temperament
Parsing Political Philosophy (II)
Getting Liberty Wrong
Libertarianism and the State
“Liberalism” and Personal Responsibility
My View of Libertarianism
More About Social Norms and Liberty
The War on Conservatism
Social Justice vs. Liberty
Economically Liberal, Socially Conservative
The Harm Principle Revisited: Mill Conflates Society and State
Liberty and Social Norms Re-examined
Natural Law, Natural Rights, and the Real World
Rescuing Conservatism
If Men Were Angels
The Left and Evergreen State: Reaping What Was Sown

Suicidal Despair and the “War on Whites”

This entry is prompted by a recent spate of posts and articles about the rising mortality rate among non-Hispanic whites without a college degree (hereinafter working-class whites, for convenience). Thomas Lifson characterizes the trend as “a spiritual crisis”, after saying this:

White males, in large numbers, are simply losing their will to live, and as a result, they are dying so prematurely and in such large numbers that a startling demographic gap has emerged. [“Stunning Evidence that the Left Has Won its War on White Males“, American Thinker, March 26, 2017]

Later in the piece, Lifson gets to the “war” on white males:

For at least four decades, white males have been under continuous assault as bearers of “white privilege” and beneficiaries of sexism. Special preferences and privileges have been granted to other groups, but that is the least of it.  More importantly, the very basis of the psychological self-worth of white males have been under attack.  White males are frequently instructed by authority figures in education and the media that they are responsible for most of the evils of the modern world, that the achievements of Euro-American civilization are a net loss for humanity, stained by exploitation, racism, unfairness, and every other collective evil the progressive mind can manufacture.

Some white males are relatively unscathed by the psychological warfare, but others are more vulnerable. Those who have educational, financial, or employment achievements that have rewarded their efforts may be able to keep going as productive members of society, their self-esteem resting on tangible fruits of their work and social position. But other white males, especially those who work with their hands and have been seeing job opportunities contract or disappear, have been losing the basis for a robust sense of self-worth as their job opportunities disappear.

We now have statistical evidence that political correctness kills.

We have no such thing. The recent trend isn’t yet significant. But it is real, and government is the underlying cause.

To begin at the beginning, the source of the spate of articles about the rising mortality rate of working-class whites is Anne Case and Angus Deaton’s “Mortality and Morbidity in the 21st Century” (Brookings Institution, Brookings Papers on Economic Activity (conference edition), March 17, 2017). Three of the paper’s graphs set the scene. This one shows mortality trends in the United States:

The next figure indicates that the phenomenon isn’t unique to non-Hispanic whites in the age 50-54 bracket:

But the trend among American whites defies the trends in several other Western nations:

Whence the perverse trend? It seems due mainly to suicidal despair:

How do these recent trends stack up against the long view? I couldn’t find a long time series for drug, alcohol, and suicide mortality. But I did find a study by Feijun Luo et al. that traces suicide rates from just before the onset of the Great Depression to just before the onset of the Great Recession — “Impact of Business Cycles on US Suicide Rates, 1928–2007” (American Journal of Public Health, June 2011). Here are two key graphs from the report:

The graphs don’t reproduce well, so the following quotations will be of help:

The overall suicide rate fluctuated from 10.4 to 22.1 over the 1928–2007 period. It peaked in 1932, the last full year of the Great Depression, and bottomed in 2000. The overall suicide rate decreased from 18.0 in 1928 to 11.2 in 2007. However, most of the decline occurred before 1945; after that it fluctuated until the mid-1950s, and then it gradually moved up until the late 1970s. The overall suicide rate resumed its downward trend from the mid-1980s to 2000, followed by a trend reversal in the new millennium.

Figure 1a [top] shows that the overall suicide rate generally increased in recessions, especially in severe recessions that lasted longer than 1 year. The largest increase in the overall suicide rate occurred during the Great Depression (1929–1933), when it surged from 18.0 in 1928 to 22.1 (the all-time high) in 1932, the last full year of the Great Depression. [The Great Depression actually lasted until 1940: TEA.] This increase of 22.8% was the highest recorded for any 4-year interval during the study period. The overall suicide rate also rose during 3 other severe recessions: [the recession inside the Great Depression] (1937–1938), the oil crisis (1973–1975), and the double-dip recession (1980–1982). Not only did the overall suicide rate generally rise during recessions; it also mostly fell during expansions…. However, the overall suicide rate did not fall during the 1960s (i.e., 1961–1969), a notable phenomenon that will be explained by the different trends of age-specific suicide rates.

The age-specific suicide rates displayed more variations than did the overall suicide rate, and the trends of those age-specific suicide rates were largely different. As shown in Figure 1b [bottom], from 1928–2007, the suicide rates of the 2 elderly groups (65–74 years and 75 years and older) and the oldest middle-age group (55–64 years) experienced the most remarkable decline. The suicide rates of those groups declined in both pre- and postwar periods. The suicide rates of the other 2 middle-aged groups (45–54 years and 35–44 years) also declined from 1928–2007, which we attributed to the decrease during the war period more than offsetting the increase in the postwar period. In contrast with the declining suicide rates of the 2 elderly and 3 middle-age groups, the suicide rates of the 2 young groups (15–24 years and 25–34 years) increased or just marginally decreased from 1928–2007. The 2 young groups experienced a marked increase in suicide rates in the postwar period. The suicide rate of the youngest group (5–14 years) also increased from 1928–2007. However, because of its small magnitude, we do not include this increase in the subsequent discussion.

We noted that the suicide rate of the group aged 65–74 years, the highest of all age groups until 1936, declined the most from 1928 to 2007. That rate started at 41.2 in 1928 and dropped to 12.6 in 2007, peaking at 52.3 in 1932 and bottoming at 12.3 in 2004. By contrast, the suicide rate of the group aged 15–24 years increased from 6.7 in 1928 to 9.7 in 2007. That rate peaked at 13.7 in 1994 and bottomed at 3.8 in 1944, and it generally trended upward from the late 1950s to the mid-1990s. The suicide rate differential between the group aged 65–74 years and the group aged 15–24 years generally decreased until 1994, from 34.5 in 1928 to 1.6 in 1994.

All age groups experienced a substantial increase in their suicide rates during the Great Depression, and most groups (35–44 years, 45–54 years, 55–64 years, 65–74 years, and 75 years and older) set record-high suicide rates in 1932; but they reacted differently to many other recessions, including severe recessions such as the [1937-1938 recession] and the oil crisis. Their reactions were different during expansions as well, most notably in the 1960s, when the suicide rates of the 3 oldest groups (75 years and older, 65–74 years, and 55–64 years) declined moderately, and those of the 3 youngest groups (15–24 years, 25–34 years, and 35–44 years) rose noticeably….

[T]he overall suicide rate and the suicide rate of the group aged 45–54 years were associated with business cycles at the significance level of 1%; the suicide rates of the groups aged 25–34 years, 35–44 years, and 55–64 years were associated with business cycles at the significance level of 5%; and the suicide rates of the groups aged 15–24 years, 65–74 years, and 75 years and older were associated with business cycles at nonsignificant levels. To summarize, the overall suicide rate was significantly countercyclical; the suicide rates of the groups aged 25–34 years, 35–44 years, 45–54 years, and 55–64 years were significantly countercyclical; and the suicide rates of the groups aged 15–24 years, 65–74 years, and 75 years and older were not significantly countercyclical.

The following graph, obtained from the website of the American Foundation for Suicide Prevention, extends the age-related analysis to 2015:

And this graph, from the same source, shows that the rising suicide rate is concentrated among whites and American Indians:

Though this graph encompasses deaths from all causes, the opposing trends for blacks and whites suggest strongly that working-class whites in all age groups have become much more prone to suicidal despair in the past 20 years. Moreover, the despair has persisted through periods of economic decline and economic growth (slow as it has been).

Why? Case and Deaton opine:

[S]ome of the most convincing discussions of what has happened to working class whites emphasize a long-term process of decline, or of cumulative deprivation, rooted in the steady deterioration in job opportunities for people with low education…. This process … worsened over time, and caused, or at least was accompanied by, other changes in society that made life more difficult for less-educated people, not only in their employment opportunities, but in their marriages, and in the lives of and prospects for their children. Traditional structures of social and economic support slowly weakened; no longer was it possible for a man to follow his father and grandfather into a manufacturing job, or to join the union. Marriage was no longer the only way to form intimate partnerships, or to rear children. People moved away from the security of legacy religions or the churches of their parents and grandparents, towards churches that emphasized seeking an identity, or replaced membership with the search for connections…. These changes left people with less structure when they came to choose their careers, their religion, and the nature of their family lives. When such choices succeed, they are liberating; when they fail, the individual can only hold him or herself responsible….

As technical change and globalization reduced the quantity and quality of opportunity in the labor market for those with no more than a high school degree, a number of things happened that have been documented in an extensive literature. Real wages of those with only a high school degree declined, and the college premium increased….

Lower wages made men less marriageable, marriage rates declined, and there was a marked rise in cohabitation, then much less frowned upon than had been the case a generation before…. [B]eyond the cohort of 1940, men and women with less than a BA degree are less likely to have ever been married at any given age. Again, this is not occurring among those with a four-year degree. Unmarried cohabiting partnerships are less stable than marriages. Moreover, among those who do marry, those without a college degree are also much more likely to divorce than are those with a degree….

These accounts share much, though not all, with Murray’s … account [in Coming Apart] of decline among whites in his fictional “Fishtown.” Murray argues that traditional American virtues are being lost among working-class white Americans, especially the virtue of industriousness. The withdrawal of men from the labor force reflects this loss of industriousness; young men in particular prefer leisure—which is now more valuable because of video games … —though much of the withdrawal of young men is for education…. The loss of virtue is supported and financed by government payments, particularly disability payments….

In our account here, we emphasize the labor market, globalization and technical change as the fundamental forces, and put less focus on any loss of virtue, though we certainly accept that the latter could be a consequence of the former. Virtue is easier to maintain in a supportive environment. Yet there is surely general agreement on the roles played by changing beliefs and attitudes, particularly the acceptance of cohabitation, and of the rearing of children in unstable cohabiting unions.

These slow-acting and cumulative social forces seem to us to be plausible candidates to explain rising morbidity and mortality, particularly their role in suicide, and with the other deaths of despair, which share much with suicides. As we have emphasized elsewhere, … purely economic accounts of suicide have consistently failed to explain the phenomenon. If they work at all, they work through their effects on family, on spiritual fulfillment, and on how people perceive meaning and satisfaction in their lives in a way that goes beyond material success. At the same time, cumulative distress, and the failure of life to turn out as expected is consistent with people compensating through other risky behaviors such as abuse of alcohol, overeating, or drug use that predispose towards the outcomes we have been discussing….

What our data show is that the patterns of mortality and morbidity for white non-Hispanics without a college degree move together over lifetimes and birth cohorts, and that they move in tandem with other social dysfunctions, including the decline of marriage, social isolation, and detachment from the labor force…. Whether these factors (or factor) are “the cause” is more a matter of semantics than statistics. The factor could certainly represent some force that we have not identified, or we could try to make a case that declining real wages is more fundamental than other forces. Better, we can see globalization and automation as the underlying deep causes. Ultimately, we see our story as about the collapse of the white, high school educated, working class after its heyday in the early 1970s, and the pathologies that accompany that decline. [Op. cit., pp. 29-38]

The seemingly rigorous and well-reasoned analyses reported by Case and Deaton and by Luo et al. are seriously flawed, for these reasons:

  • Case and Deaton’s focus on events since 1990 is analogous to a search for lost keys under a street lamp because that’s where the light is. As shown in the graphs taken from Luo et al., suicide rates have at various times risen (and dropped) as sharply as they have in recent years.
  • Luo et al. address a much longer time span but miss an important turning point, which came during World War II. Because of that, they resort to a strained, non-parametric analysis of the relationship between the suicide rate and business cycles.
  • It is misleading to focus on age groups, as opposed to birth-year cohorts. For example, persons in the 50-54 age group in 1990 were born between 1936 and 1940, but in 2010 persons in the 50-54 age group were born between 1956 and 1960. The groups, in other words, don’t represent the same cohort. The only meaningful suicide rate for a span of more than a few years is the rate for the entire population.
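The cohort arithmetic behind that point can be made explicit in a few lines of Python (a sketch; the function name is mine):

```python
# Birth-year range for an age band observed in a given year:
# a person aged A in year Y was born in year Y - A.
def birth_cohort(year, age_low, age_high):
    """Return (earliest, latest) birth year for an age band."""
    return (year - age_high, year - age_low)

print(birth_cohort(1990, 50, 54))  # (1936, 1940)
print(birth_cohort(2010, 50, 54))  # (1956, 1960)
```

Two surveys of the same age band, twenty years apart, are thus looking at people born a generation apart.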

I took a fresh look at the overall suicide rate and its relationship to the state of the economy. First, I extended the overall, age-adjusted suicide rate for 1928-2007 provided by Luo et al. in Supplementary Table B (purchase required) by splicing it with a series for 1999-2014 from Centers for Disease Control and Prevention, National Center for Health Statistics, Data Brief 241. I then drew on the database at Measuring Worth to derive year-over-year changes in real GDP for 1928-2014. Here’s an overview of the two time series:

The suicide rate doesn’t drop below 15 until 1942. From 1943 through 2014 it fluctuates in a narrow range, between 10.4 (2000) and 13.6 (1975). Despite the rise since 2000, the overall rate still hasn’t returned to the 1975 peak, and only in recent years has it again reached figures that were common between 1943 and 1994.
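The splicing and the year-over-year GDP calculation described above can be sketched in a few lines of Python (the numbers here are made up for illustration; the actual figures come from Luo et al., the CDC data brief, and Measuring Worth):

```python
def splice(old, new):
    """Merge two annual series, letting the newer series override any overlap."""
    merged = dict(old)
    merged.update(new)
    return dict(sorted(merged.items()))

def yoy_change(series):
    """Year-over-year proportional change for each year with a predecessor."""
    return {y: (v - series[y - 1]) / series[y - 1]
            for y, v in series.items() if y - 1 in series}

# Illustrative values only
suicide_old = {1997: 11.8, 1998: 11.5, 1999: 10.7}
suicide_new = {1999: 10.5, 2000: 10.4}   # overlap year 1999 is overridden
rate = splice(suicide_old, suicide_new)

real_gdp = {1998: 100.0, 1999: 104.5, 2000: 108.7}
print(rate)
print(yoy_change(real_gdp))
```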

Moreover, the suicide rate from 1928 through 1942 is strongly correlated with changes in real GDP. But the rate from 1943 through 2014 is not:

Something happened during the war years to loosen the connection between the state of the economy and the suicide rate. That something was the end of the pervasive despair that the Great Depression inflicted on huge numbers of Americans. It’s as if America had a mood transplant, one which has lasted for more than 70 years. The recent uptick in the rate of suicide (and the accompanying rise in slow-motion suicide) is sad because it represents wasted lives. But it is within one standard deviation of the 1943-2014 average of 12.2 suicides per 100,000 persons:

(It seems to me that researchers ought to be asking why the rate was so low for about 20 years, beginning in the 1980s.)
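The two statistical checks above, a subperiod correlation and a one-standard-deviation test, can be sketched as follows (the rate values here are made up; the 12.2 mean and the actual series come from the sources already cited):

```python
import statistics as st

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = st.mean(xs), st.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * st.pstdev(xs) * st.pstdev(ys))

def within_one_sd(value, sample):
    """Is value within one (sample) standard deviation of the sample mean?"""
    return abs(value - st.mean(sample)) <= st.stdev(sample)

# Perfectly correlated toy series -> coefficient of 1.0
print(round(pearson([1, 2, 3], [2, 4, 6]), 6))   # 1.0

# Illustrative suicide rates; a "recent" value tested against the sample
rates = [12.0, 11.5, 10.4, 12.8, 13.6, 12.2, 11.9]
print(within_one_sd(13.0, rates))                # True
```

Running `pearson` separately over the 1928-1942 and 1943-2014 spans of the spliced series is what produces the contrast described above.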

Perhaps the recent uptick among working-class whites can be blamed, in part, on loss of “virtue”, technological change, and globalization — as Case and Deaton claim. But they fail to notice the bigger elephant in the room: the destructive role of government.

Technological change and globalization simply reinforce the disemployment effects of the long-term decline in the rate of economic growth. I’ve addressed the decline many times, most recently in “Presidents and Economic Growth“. The decline has three main causes, all attributable to government action, which I’ve assessed in “The Rahn Curve Revisited“: the rise in government spending as a fraction of GDP, the rise in the number of regulations on the books, and the (unsurprising) effect of those variables on private business investment. The only silver lining has been a decline in the rate of inflation, which is unsurprising in view of the general slow-down of economic growth. Many jobs may have disappeared because of technological change and many jobs may have been “shipped overseas”, but there would be a lot more jobs if government had kept its hands off the economy and out of Americans’ wallets.

Moreover, the willingness of Americans — especially low-skill Americans — to seek employment has been eroded by various government programs: Aid to Families with Dependent Children (a boon to unwed mothers and a bane to family stability), food stamps, disability benefits, the expansion of Medicaid, subsidized health care for “children” under the age of 26, and various programs that encourage women to work outside the home, thus fostering male unemployment.

Economist Edward Glaeser puts it this way:

The rise of joblessness—especially among men—is the great American domestic crisis of the twenty-first century. It is a crisis of spirit more than of resources. The jobless are far more prone to self-destructive behavior than are the working poor. Proposed solutions that focus solely on providing material benefits are a false path. Well-meaning social policies—from longer unemployment insurance to more generous disability diagnoses to higher minimum wages—have only worsened the problem; the futility of joblessness won’t be solved with a welfare check….

The New Deal saw the rise of public programs that worked against employment. Wage controls under the National Recovery Act made it difficult for wages to fall enough to equilibrate the labor market. The Wagner Act strengthened the hand of unions, which kept pay up and employment down. Relief efforts for the unemployed, including federal make-work jobs, eased the pressure on the jobless to find private-sector work….

… In 2011, more than one in five prime-age men were out of work, a figure comparable with the Great Depression. But while employment came back after the Depression, it hasn’t today. The unemployment rate may be low, but many people have quit the labor force entirely and don’t show up in that number. As of December 2016, 15.2 percent of prime-age men were jobless—a figure worse than at any point between World War II and the Great Recession, except during the depths of the early 1980s recession….

Joblessness is disproportionately a condition of the poorly educated. While 72 percent of college graduates over age 25 have jobs, only 41 percent of high school dropouts are working. The employment-rate gap between the most and least educated groups has widened from about 6 percent in 1977 to almost 15 percent today….

Both Franklin Roosevelt and Lyndon Johnson aggressively advanced a stronger safety net for American workers, and other administrations largely supported these efforts. The New Deal gave us Social Security and unemployment insurance, which were expanded in the 1950s. National disability insurance debuted in 1956 and was made far more accessible to people with hard-to-diagnose conditions, like back pain, in 1984. The War on Poverty delivered Medicaid and food stamps. Richard Nixon gave us housing vouchers. During the Great Recession, the federal government temporarily doubled the maximum eligibility time for receiving unemployment insurance.

These various programs make joblessness more bearable, at least materially; they also reduce the incentives to find work. Consider disability insurance. Industrial work is hard, and plenty of workers experience back pain. Before 1984, however, that pain didn’t mean a disability check for American workers. After 1984, though, millions went on the disability rolls. And since disability payments vanish if the disabled person starts earning more than $1,170 per month, the disabled tend to stay disabled…. Disability insurance alone doesn’t entirely explain the rise of long-term joblessness—only one-third or so of jobless males get such benefits. But it has surely played a role.

Other social-welfare programs operate in a similar way. Unemployment insurance stops completely when someone gets a job, … [thus] the unemployed tend to find jobs just as their insurance payments run out. Food-stamp and housing-voucher payments drop 30 percent when a recipient’s income rises past a set threshold by just $1. Elementary economics tells us that paying people to be or stay jobless will increase joblessness….

The rise of joblessness among the young has been a particularly pernicious effect of the Great Recession. Job loss was extensive among 25–34-year-old men and 35–44-year-old men between 2007 and 2009. The 25–34-year-olds have substantially gone back to work, but the number of employed 35–44-year-olds, which dropped by 2 million at the start of the Great Recession, hasn’t recovered. The dislocated workers in this group seem to have left the labor force permanently.

Unfortunately, policymakers seem intent on making the joblessness crisis worse. The past decade or so has seen a resurgent progressive focus on inequality—and little concern among progressives about the downsides of discouraging work. Advocates of a $15 minimum hourly wage, for example, don’t seem to mind, or believe, that such policies deter firms from hiring less skilled workers. The University of California–San Diego’s Jeffrey Clemens examined states where higher federal minimum wages raised the effective state-level minimum wage during the last decade. He found that the higher minimum “reduced employment among individuals ages 16 to 30 with less than a high school education by 5.6 percentage points,” which accounted for “43 percent of the sustained, 13 percentage point decline in this skill group’s employment rate.”

The decision to prioritize equality over employment is particularly puzzling, given that social scientists have repeatedly found that unemployment is the greater evil…. One recent study estimated that unemployment leads to 45,000 suicides worldwide annually. Jobless husbands have a 50 percent higher divorce rate than employed husbands. The impact of lower income on suicide and divorce is much smaller. The negative effects of unemployment are magnified because it so often becomes a semipermanent state.

Time-use studies help us understand why the unemployed are so miserable. Jobless men don’t do a lot more socializing; they don’t spend much more time with their kids. They do spend an extra 100 minutes daily watching television, and they sleep more. The jobless also are more likely to use illegal drugs….

Joblessness and disability are also particularly associated with America’s deadly opioid epidemic…. The strongest correlate of those deaths is the share of the population on disability. That connection suggests a combination of the direct influence of being disabled, which generates a demand for painkillers; the availability of the drugs through the health-care system; and the psychological misery of having no economic future.

Increasing the benefits received by nonemployed persons may make their lives easier in a material sense but won’t help reattach them to the labor force. It won’t give them the sense of pride that comes from economic independence. It won’t give them the reassuring social interactions that come from workplace relationships. When societies sacrifice employment for a notion of income equality, they make the wrong choice. [“The War on Work — And How to End It“, City Journal, special issue: The Shape of Work to Come 2017]

In sum, the rising suicide rate — whatever its significance — is a direct and indirect result of government policies. “We’re from the government and we’re here to help” is black humor, at best. The left is waging a war on white males. But the real war — the war that kills — is hidden from view behind the benign facade of governmental “compassion”.

Having said all of that, I will end on a cautiously positive note. There still is upward mobility in America. Not all working-class people are destined for suicidal despair, only those at the margin who have pocketed the fool’s gold of government handouts.

Other related posts:
Bubbling Along
Economic Mobility Is Alive and Well in America
H.L. Mencken’s Final Legacy
The Problem with Political Correctness
“They Deserve to Die”?
Mencken’s Pearl of Wisdom
Class in America
Another Thought or Two about Class
The Midwest Is a State of Mind

The Secret of a Happy Marriage

Most people marry young. Even though the average age at first marriage is creeping up, it is still below 30 as far as I know. And it was closer to 20 when I wed several decades ago.

A person who is in his early 20s has a lot of life and learning ahead. His political views are likely to change. Mine changed from idealistic “liberalism” to informed conservatism, with a few stops in between. (For more, go to “About” and scroll down to “Beliefs”.) If one’s political views are heritable, as this piece suggests, what happened to me is that nature — my parents’ innate conservatism — finally overcame nurture — the attitudes and ideas that I absorbed as a collegian.

I married my wife only two years after completing my undergraduate degree, still a naive “liberal” with simplistic views about such things as race (not a problem), markets (suspect), and government (more is better). Fast-forward more than 50 years to the conservative me, still wed to the “liberal” lass who views Donald Trump as unalloyed evil, daily expresses the hope that he will be shot (though that may stop after the shooting of Steve Scalise), cannot understand why Texas Republicans care about who uses which bathroom, favors abortion (in principle, not practice), supports gun control (though we have guns in the house), has swallowed the global-warming hoax, and bases most of her other views on the slants of NBC Nightly News and the Austin American-Statesman.

But she hates to pay taxes.

That, plus love, unites us despite our differences.

“Science” vs. Science: The Case of Evolution, Race, and Intelligence

If you were to ask those people who marched for science whether they believe in evolution, they would answer with a resounding “yes”. Ask them whether they believe that all branches of the human race evolved identically and you will be met with hostility. The problem, for them, is that an admission of the obvious — differential evolution, resulting in broad racial differences — leads to a fact that they don’t want to admit: there are broad racial differences in intelligence, differences that must have evolutionary origins.

“Science” — the cherished totem of left-wing ideologues — isn’t the same thing as science. The totemized version consists of whatever set of facts and hypotheses suit the left’s agenda. In the case of “climate change”, for example, the observation that in the late 1900s temperatures rose for a period of about 25 years coincident with a reported rise in the level of atmospheric CO2 occasioned the hypothesis that the generation of CO2 by humans causes temperatures to rise. This is a reasonable hypothesis, given the long-understood, positive relationship between temperature and so-called greenhouse gases. But it comes nowhere close to confirming what leftists seem bent on believing and “proving” with hand-tweaked models, which is that if humans continue to emit CO2, and do so at a higher rate than in the past, temperatures will rise to the point that life on Earth will become difficult if not impossible to sustain. There is ample evidence to support the null hypothesis (that “climate change” isn’t catastrophic) and the alternative view (that recent warming is natural and caused mainly by things other than human activity).

Leftists want to believe in catastrophic anthropogenic global warming because it suits the left’s puritanical agenda, as did Paul Ehrlich’s discredited thesis that population growth would outstrip the availability of food and resources, leading to mass starvation and greater poverty. Population control therefore became a leftist mantra, and remains one despite the generally rising prosperity of the human race and the diminution of scarcity (except where leftist governments, like Venezuela’s, create misery).

Why are leftists so eager to believe in problems that portend catastrophic consequences which “must” be averted through draconian measures, such as enforced population control, taxes on soft drinks above a certain size, the prohibition of smoking not only in government buildings but in all buildings, and decreed reductions in CO2-emitting activities (which would, in fact, help to impoverish humans)? The common denominator of such measures is control. And yet, by the process of psychological projection, leftists are always screaming “fascist” at libertarians and conservatives who resist control.

Returning to evolution, why are leftists so eager to embrace it or, rather, what they choose to believe about it? My answers are that (a) it’s “science” (though it’s only science when it’s spelled out in detail, uncertainties and all) and (b) it gives leftists (who usually are atheists) a stick with which to beat “creationists”.

But when it comes to race, leftists insist on denying what’s in front of their eyes: evolutionary disparities in such phenomena as skin color, hair texture, facial structure, running and jumping ability, cranial capacity, and intelligence.

Why? Because the urge to control others is of a piece with the superiority with which leftists believe they’re endowed because they are mainly white persons of European descent and above-average intelligence (just smart enough to be dangerous). Blacks and Hispanics who vote left do so mainly for the privileges it brings them. White leftists are their useful idiots.

Leftism, in other words, is a manifestation of “white privilege”, which white leftists feel compelled to overcome through paternalistic condescension toward blacks and other persons of color. (But not East Asians or the South Asians who have emigrated to the U.S., because the high intelligence of those groups is threatening to white leftists’ feelings of superiority.) What could be more condescending, and less scientific, than to deny what evolution has wrought in order to advance a political agenda?

Leftist race-denial, which has found its way into government policy, is akin to Stalin’s support of Lysenkoism, which its author cleverly aligned with Marxism. Lysenkoism

rejected Mendelian inheritance and the concept of the “gene”; it departed from Darwinian evolutionary theory by rejecting natural selection.

This brings me to Stephen Jay Gould, a leading neo-Lysenkoist and a fraudster of “science” who did much to deflect science from the question of race and intelligence:

[In The Mismeasure of Man] Gould took the work of a 19th century physical anthropologist named Samuel George Morton and made it ridiculous. In his telling, Morton was a fool and an unconscious racist — his project of measuring skull sizes of different ethnic groups conceived in racism and executed in same. Why, Morton clearly must have thought Caucasians had bigger brains than Africans, Indians, and Asians, and then subconsciously mismeasured the skulls to prove they were smarter.

The book then casts the entire project of measuring brain function — psychometrics — in the same light of primitivism.

Gould’s antiracist book was a hit with reviewers in the popular press, and many of its ideas about the morality and validity of testing intelligence became conventional wisdom, persisting today among the educated folks. If you’ve got some notion that IQ doesn’t measure anything but the ability to take IQ tests, that intelligence can’t be defined or may not be real at all, that multiple intelligences exist rather than a general intelligence, you can thank Gould….

Then, in 2011, a funny thing happened. Researchers at the University of Pennsylvania went and measured old Morton’s skulls, which turned out to be just the size he had recorded. Gould, according to one of the co-authors, was nothing but a “charlatan.”

The study itself couldn’t matter, though, could it? Well, recent work using MRI technology has established that descendants of East Asia have slightly more cranial capacity than descendants of Europe, who in turn have a little more than descendants of Africa. Another meta-analysis finds a mild correlation between brain size and IQ performance.

You see where this is going, especially if you already know about the racial disparities in IQ testing, and you’d probably like to hit the brakes before anybody says… what, exactly? It sounds like we’re perilously close to invoking science to argue for genetic racial superiority.

Am I serious? Is this a joke?…

… The reason the joke feels dangerous is that it incorporates a fact that is rarely mentioned in public life. In America, white people on average score higher than black people on IQ tests, by a margin of 12-15 points. And there’s one man who has been made to pay the price for that fact — the scholar Charles Murray.

Murray didn’t come up with a hypothesis of racial disparity in intelligence testing. He simply co-wrote a book, The Bell Curve, that publicized a fact well known within the field of psychometrics, a fact that makes the rest of us feel tremendously uncomfortable.

Nobody bears more responsibility for the misunderstanding of Murray’s work than Gould, who reviewed The Bell Curve savagely in the New Yorker. The IQ tests couldn’t be explained away — here he is acknowledging the IQ gap in 1995 — but the validity of IQ testing could be challenged. That was no trouble for the old Marxist.

Gould should have known that he was dead wrong about his central claim — that general intelligence, or g, as psychologists call it, was unreal. In fact, “Psychologists generally agree that the greatest success of their field has been in intelligence testing,” biologist Bernard D. Davis wrote in the Public Interest in 1983, in a long excoriation of Gould’s strange ideas.

Psychologists have found that performance on almost any test of cognition will have some correlation to other tests of cognition, even in areas that might seem distant from pure logic, such as recognizing musical notes. The more demanding tests have a higher correlation, or a high g load, as they term it.

IQ is very closely related to this measure, and turns out to be extraordinarily predictive not just for how well one does on tests, but on all sorts of real-life outcomes.

Since the publication of The Bell Curve, the data have demonstrated not just those points, but that intelligence is highly heritable (around 50 to 80 percent, Murray says), and that there’s little that can be done to permanently change the part that’s dependent on the environment….

The liberal explainer website Vox took a swing at Murray earlier this year, publishing a rambling 3,300-word hit job on Murray that made zero references to the scientific literature….

Vox might have gotten the last word, but a new outlet called Quillette published a first-rate rebuttal this week, which sent me down a three-day rabbit hole. I came across some of the most troubling facts I’ve ever encountered — IQ scores by country — and then came across some more reassuring ones from Thomas Sowell, suggesting that environment could be the main or exclusive factor after all.

The classic analogy from the environment-only crowd is of two handfuls of genetically identical seed corn, one planted in Iowa and the other in the Mojave Desert. One group flourishes; the other is stunted. While all of the variation within one group will be due to genetics, its flourishing relative to the other group will be strictly due to environment.

Nobody doubts that the United States is richer soil than Equatorial Guinea, but the analogy doesn’t prove the case. The idea that there exists a mean for human intelligence and that all racial subgroups would share it given identical environments remains a metaphysical proposition. We may want this to be true quite desperately, but it’s not something we know to be true.

For all the lines of attack, all the brutal slander thrown Murray’s way, his real crime is having an opinion on this one key issue that’s open to debate. Is there a genetic influence on the IQ testing gap? Murray has written that it’s “likely” genetics explains “some” of the difference. For this, he’s been crucified….

Murray said [in a recent interview] that the assumption “that everyone is equal above the neck” is written into social policy, employment policy, academic policy and more.

He’s right, of course, especially as ideas like “disparate impact” come to be taken as proof of discrimination. There’s no scientifically valid reason to expect different ethnic groups to have a particular representation in this area or that. That much is utterly clear.

The universities, however, are going to keep hollering about institutional racism. They are not going to accept Murray’s views, no matter what develops. [Jon Cassidy, “Mau Mau Redux: Charles Murray Comes in for Abuse, Again“, The American Spectator, June 9, 2017]

And so it goes in the brave new world of alternative facts, most of which seem to come from the left. But the left, with its penchant for pseudo-intellectualism (“science” vs. science), calls it postmodernism:

Postmodernists … eschew any notion of objectivity, perceiving knowledge as a construct of power differentials rather than anything that could possibly be mutually agreed upon…. [S]cience therefore becomes an instrument of Western oppression; indeed, all discourse is a power struggle between oppressors and oppressed. In this scheme, there is no Western civilization to preserve—as the more powerful force in the world, it automatically takes on the role of oppressor and therefore any form of equity must consequently then involve the overthrow of Western “hegemony.” These folks form the current Far Left, including those who would be described as communists, socialists, anarchists, Antifa, as well as social justice warriors (SJWs). These are all very different groups, but they all share a postmodernist ethos. [Michael Aaron, “Evergreen State and the Battle for Modernity“, Quillette, June 8, 2017]

Other related reading (listed chronologically):

Molly Hensley-Clancy, “Asians With “Very Familiar Profiles”: How Princeton’s Admissions Officers Talk About Race“, BuzzFeed News, May 19, 2017

Warren Meyer, “Princeton Appears To Penalize Minority Candidates for Not Obsessing About Their Race“, Coyote Blog, May 24, 2017

B. Wineguard et al., “Getting Voxed: Charles Murray, Ideology, and the Science of IQ“, Quillette, June 2, 2017

James Thompson, “Genetics of Racial Differences in Intelligence: Updated“, The Unz Review: James Thompson Archive, June 5, 2017

Raymond Wolters, “We Are Living in a New Dark Age“, American Renaissance, June 5, 2017

F. Roger Devlin, “A Tactical Retreat for Race Denial“, American Renaissance, June 9, 2017

Scott Johnson, “Mugging Mr. Murray: Mr. Murray Speaks“, Power Line, June 9, 2017

Related posts:
Race and Reason: The Victims of Affirmative Action
Race and Reason: The Achievement Gap — Causes and Implications
“Conversing” about Race
Evolution and Race
“Wading” into Race, Culture, and IQ
Round Up the Usual Suspects
Evolution, Culture, and “Diversity”
The Harmful Myth of Inherent Equality
Let’s Have That “Conversation” about Race
Affirmative Action Comes Home to Roost
The IQ of Nations
Race and Social Engineering
Some Notes about Psychology and Intelligence

The Left and Evergreen State: Reaping What Was Sown

Tiana Lowe writes with misguided enthusiasm at National Review:

In the past fortnight, the Evergreen State College mob has incited violence against a professor, gotten said professor, Bret Weinstein, to flee campus in fear for his physical safety, inflicted $10,000 in property damage on campus, shut down classes, and forced graduation to be held off-campus as a result.

… Prior to going quiet after receiving mass-murder threats, Weinstein wrote an editorial in the Wall Street Journal warning: “The Campus Mob Came for Me—and You, Professor, Could Be Next.”… [T]he New York Times has found a mob victim sympathetic enough in Weinstein, a liberal professor, to publicly lambaste the mobs at Evergreen, who counter every question, comment, and even a hand gesture by shouting, “RACIST.”

“It’s just the way discourse goes these days,” Evergreen president George Bridges told the Times’s Frank Bruni. Even the Seattle Times, which has previously let Bridges wax poetic on, “Why students need trigger warnings and safe places” in its editorial pages, condemned Evergreen as having “no safety, no learning, no future.”…

With the world witnessing Evergreen’s Mizzou-scale collapse in real time, perhaps the Left has finally woken up to its own tendency to eat its own. [“Evergreen State Faces Condemnation from [T]he Seattle Times and [T]he New York Times“, June 8, 2017]

Lowe links to a piece by Frank Bruni, an unsurprisingly left-wing columnist at The New York Times (“These Campus Inquisitions Must Stop“, June 3, 2017). Bruni opens with a morally relativistic, irrelevant, and sweeping statement:

Racism pervades our country. Students who have roiled college campuses from coast to coast have that exactly right.

Pervades? Perhaps Bruni is thinking of the attitude of blacks toward whites. Most American whites don’t have the time or inclination to be racist; they’re trying to get into universities and get hired and promoted despite the favoritism that’s showered on less-qualified blacks by their condescending, leftist “betters”. Yes, there is a hotbed of racism in the U.S., and it is located in the media, among the professoriate, and in the soul of every collegian of whatever color who sees life through the lens of “racism”.

Bruni, having shored up his left-wing credentials, actually says a few sensible things. After recounting the travails of Professor Weinstein, whose cause is laudable to leftists because Weinstein is a leftist, Bruni turns to

that awful moment … when one of the dozens of students encircling Nicholas Christakis, a professor [at Yale], shrieked at him: “You should not sleep at night! You are disgusting!”

He and his wife, Erika, were masters at one of Yale’s residential colleges, and she had circulated an email in which she raised questions about the university’s caution against any Halloween costumes that might be seen as examples of cultural appropriation or hurtful stereotyping.

“American universities were once a safe space not only for maturation but also for a certain regressive, or even transgressive, experience,” she wrote. “Increasingly, it seems, they have become places of censure and prohibition. And the censure and prohibition come from above, not from yourselves! Are we all O.K. with this transfer of power? Have we lost faith in young people’s capacity — in your capacity — to exercise self-censure?”

“Talk to each other,” she added. “Free speech and the ability to tolerate offense are the hallmarks of a free and open society.”

Agree or disagree with her, she was teeing up precisely the kind of contest of ideas that higher education should be devoted to. And she did so, if you read the whole of her email, in a considered, respectful fashion.

No matter: She was pushing back at something — the costume guideline — that was draped in the garb of racial sensitivity. And that made her, ipso facto, an enemy of illumination and agent of hate.

She and her husband were driven from their roles in the residential college, though he still teaches at Yale. He posted several sympathetic tweets last week about Weinstein’s vilification. In one he wrote that his wife “spent her whole career” working with “marginalized populations” and has a “deep, abiding humanity.”

“But still they came for her,” he added.

You would think that the Christakises, having been mugged by reality, would have changed their political stripes. Life is an IQ test, and they failed the mid-term.

Bruni continues:

Like plenty of adults across the political spectrum, they use slurs in lieu of arguments, looking for catharsis rather than constructive engagement. They ratchet up their language to a degree that weakens its currency for direr circumstances. And they undermine their goals — our goals — by pushing away good-hearted allies and handing ammunition to the very people who itch to dismiss them.

Right-wing media have had a field day with Evergreen, but not because they’ve faked a story. No, the story was given to them in ribbons and bows.

That’s the real problem. Bruni is afraid that Evergreen State will be used to discredit “progressivism”. But “progressivism” discredits itself, every day in every way. The riots at Evergreen State and other universities are merely the contemporary equivalent of Stalin’s purges and “show trials”.

Another piece linked to by Lowe is an unsigned editorial in The Seattle Times, “The Evergreen State College: No Safety, No Learning, No Future” (June 5, 2017). Here’s some of it:

The public state college near Olympia has become a national caricature of intolerant campus liberalism in both The New York Times and Fox News. At least one professor has been harangued and classes disrupted by shouting mobs of students accusing the famously progressive campus of “systemic racism.”

That coverage apparently has incited anonymous threats of mass murder, resulting in the campus being closed for three days. In the critical last week of school, students have been deprived of learning by extremes on the left and right.

Caricature? How can reality be a caricature? How did the “extreme” right get into the act? It’s news to me that there were and are rightists of any kind among the thugs who seized control of Evergreen.

Since the corrosive 2016 presidential election, Americans increasingly comprise a nation with citizens sealed in ideological bubbles; college campuses are often the most hermetically sealed of bubbles. When Weinstein, the professor, asked a yelling mob of students if they wanted to hear his answer, they shouted “No!”

Left-wing craziness at universities long predates the 2016 election. This is another transparent (but failed) attempt to spread some of the blame rightward.

Leftists like Bruni and the editorial board of The Seattle Times can’t see the real problem because they’re part of it. They’re like the never-say-die apologists for socialism who protest that “real socialism” has never been tried. What they can’t face up to — despite the failure of the too-long-lived Soviet experiment — is that “real socialism” necessarily leads to suppression and violence. The Soviet Union, Communist China, Castro’s Cuba, and other socialist regimes are real socialism in action, not failed substitutes for it.

Bruni and his ilk, past and present, are responsible for the turmoil at Evergreen and other campuses. Bruni and his ilk — too many parents, most school teachers, most professors of the soft stuff, most pundits, too many politicians — have been spoon-feeding leftism to the young people of this country for more than a century. That is to say, they’ve been spoon-feeding generations of young people an intolerant ideology which prevails only through violence or the clear threat of it. The particulars of the ideology shift with the winds of leftist fashion, but its main catch-words are these:

  • liberty — to do whatever one feels like doing, and to suppress whatever one doesn’t like
  • equality — which others will be forced to pay for, à la socialism, and bow to, as in “some are more equal than others”
  • fraternity — but only with the like-minded of the moment.

Bruni and his ilk seem surprised by the virulence of their intellectual offspring, but they shouldn’t be. Dr. Frankenstein was a mere amateur by comparison with his 20th and 21st century successors, who must be blamed for loosing the monsters — students, faculty, administrators — who are destroying universities. Far worse than that, they and their elders are destroying the institutions of civil society.

Related posts:
Intellectuals and Capitalism
“Intellectuals and Society”: A Review
The Left’s Agenda
The Left and Its Delusions
The Spoiled Children of Capitalism
Politics, Sophistry, and the Academy
Subsidizing the Enemies of Liberty
Are You in the Bubble?
The Culture War
Ruminations on the Left in America
Academic Ignorance
The Euphemism Conquers All
Defending the Offensive
A Dose of Reality
God-Like Minds
Non-Judgmentalism as Leftist Condescension
An Addendum to (Asymmetrical) Ideological Warfare
The Left and Violence
Four Kinds of “Liberals”
Leftist Condescension
A Word of Warning to Leftists (and Everyone Else)
Another Thought or Two about Class
The Vast Left-Wing Conspiracy

Another Case of Cultural Appropriation

Maverick Philosopher makes an excellent case for cultural appropriation. I am here to make a limited case against it.

There is an eons-old tradition that marriage is a union of man and woman, which was shared by all religions and ethnicities until yesterday, on the time-scale of human existence. Then along came some homosexual “activists” and their enablers (mainly leftists, always in search of “victims”), to claim that homosexuals can marry.

This claim ignores the biological and deep social basis of marriage, which is the procreative pairing of male and female and the resulting formation of the basic social unit: the biologically bonded family.

Homosexual “marriage” is, by contrast, a wholly artificial conception. It is the ultimate act of cultural appropriation. Its artificiality is underscored by the fact that a homosexual “marriage” seems to consist of two “wives” or two “husbands”, in a rather risible bow to traditional usage. Why not “wusbands” or “hives”?

Related posts:
In Defense of Marriage
The Myth That Same-Sex “Marriage” Causes No Harm
Getting “Equal Protection” Right
Equal Protection in Principle and Practice

The Vast Left-Wing Conspiracy

The following list of enemies of liberty is in no particular order, and is not a mutually exclusive set.

Everyone who appeals to the Constitution of the United States but doesn’t understand its principal premise, which is the co-sovereignty of a central government of enumerated and strictly limited powers (notwithstanding the purely aspirational Preamble, the widely misinterpreted General Welfare, Necessary and Proper, and Interstate Commerce clauses) and the States, which are in fact the creators of the Constitution — not the mythical “we the People”

About half of the elected officials of the federal government

A sizable chunk of the remaining half (who choose to go along rather than be portrayed as “mean”)

Varying percentages of senior appointed officials of the federal government, but about half on average

Probably more than half of judges at all levels of government

Vast numbers of elected and appointed officials of State and local governments

The overwhelming majority of civil servants at all levels of government, with the possible (but diminishing) exception of public-safety officers

Executives of large corporations who foster a cozy relationship with government, as rent-seekers, and who eagerly and visibly endorse government’s social meddling, as virtue-signalers

Almost all of the professoriate in the “liberal” arts and humanities, social “sciences”, and “education” disciplines (that is, indoctrination centers)

Almost all administrators at colleges and universities

Most public-school teachers and administrators (who are excretions of the collegiate cabals listed immediately above)

Most “human resources” specialists, of whatever rank, wherever they are found

Almost everyone who is employed by any kind of entertainment or news medium, from stars to back-room technicians (the exceptions are notable because they are so few)

Almost everyone who is directly or indirectly involved in the creation, performance, or presentation of “art” (musical, visual, plastic, performing, etc.), with the exception of some practitioners of “country” music

Almost everyone who is a patron or aficionado of the aforementioned “arts”

Most American Jews, who are well represented in many of the other categories

The vast majority of members of the various groups favored and supported by government officials, in a long-standing symbiotic relationship, including (but not limited to) blacks, Hispanics, women, homosexuals (and other members of the gender-confused community), and the aforementioned “artists”

“Activists” of most stripes, who wish to remake the world in whatever utopian image enthralls them

An alarming fraction of the clergy of “mainline” religious denominations, who have somehow come to believe that Christ’s exhortations regarding private charity should be enforced by government

The spoiled children of capitalism who populate the campuses of most colleges and universities

Affluent Americans (the more affluent, the more left-leaning), whose unfounded guilt and alienation from reality have caused them to lose sight of the connection between self-reliance and dignity, and government’s powerfully destructive effect on both

A residual but still very large fraction of white working-class persons who hope that government will make their lives better or at least come through with bigger handouts

Every voter who shares those hopes

If Men Were Angels

Libertarians, God bless them, are always looking for simple solutions to complex problems. Here, for example, is David Bernstein, writing at The Volokh Conspiracy:

I doubt [that] any two libertarians agree on the exact boundaries of libertarianism, but how’s this for a working definition: “A libertarian is someone who generally opposes government interference with and regulation of civil society, even when the result of such government action would be to clamp down on things the individual in question personally dislikes, finds offensive, or morally disapproves of.”

Thus, for example, a libertarian who hates smoking opposes smoking bans in private restaurants, a libertarian who thinks homosexual sodomy is immoral nevertheless opposes sodomy laws, a libertarian who finds certain forms of “hate speech” offensive still opposes hate speech laws, a libertarian who believes in eating natural foods opposes bans or special taxes on processed foods, and a libertarian who thinks that all employers should pay a living wage nevertheless opposes living wage legislation. It doesn’t matter whether the libertarian holds these positions because he believes in natural rights, for utilitarian reasons, or because he thinks God wants us to live in a libertarian society. [“How’s This for a Working Definition of ‘Libertarian’?,” February 26, 2015]

This reminds me of the title of a poem by A.E. Housman: “Terence, This Is Stupid Stuff.” Why is it stupid stuff? Because it omits an essential ingredient of liberty, which is line-drawing.

By Bernstein’s logic, one must conclude that anything goes; for example, a libertarian who hates murder, rape, theft, and fraud must oppose laws against such things. Bernstein, like many a libertarian, propounds a moral code that is devoid of morality.

Bernstein might argue that morality is supplied by prevailing social norms. Which, until the bandwagon effect produced by the Supreme Court’s decision in Obergefell v. Hodges, would have meant the non-recognition of homosexual “marriage”. But libertarians were prominent in the chorus of voices clamoring for the Supreme Court to make a national law recognizing homosexual “marriage”, even though the marriage laws still on the books in most parts of the nation — laws that defined marriage as the union of male and female — arose from prevailing social norms. Libertarians have a slippery way of proclaiming laissez faire while striving to enforce their own moral views through law.

Libertarianism is an ideology rooted in John Stuart Mill’s empty harm principle (a.k.a. the non-aggression principle), about which I’ve written many times (e.g., here). Regarding ideology, I turn to Jean-François Revel:

As an a priori construction, formulated without regard to facts or ethics, ideology is distinct from science and philosophy on the one hand, and from religion and ethics on the other. Ideology is not science — which it pretends to be. Science accepts the results of the experiments it devises, whereas ideology systematically rejects empirical evidence. It is not moral philosophy — which it claims to have a monopoly on, while striving furiously to destroy the source and necessary conditions of morality: the free will of the individual. Ideology is not religion — to which it is often, and mistakenly, compared: for religion draws its meaning from faith in a transcendent reality, while ideology aims to perfect the world here below.

Ideology — that malignant invention of the human spirit’s dark side, an invention which has cost us dearly — has the singular property of causing zealots to project the structural features of their own mentality onto others. Ideologues cannot imagine that an objection to their abstract systems could come from any source other than a competing system.

All ideologies are aberrations. A sound and rational ideology cannot exist. Falsehood is intrinsic to ideology by virtue of cause, motivation and objective, which is to bring into being a fictional version of the human self — the “self,” at least, that has resolved no longer to accept reality as a source of information or a guide to action. [Last Exit to Utopia, pp. 52-53]

A key aspect of ideology — libertarian ideology included — is its studied dismissal of human nature. Arnold Kling notes, for example,

that humans in large societies have two natural desires that frustrate libertarians.

1. A desire for religion, defined as a set of rituals, norms, and affirmations that are shared by a group and which the group believes it is wrong not to share….

2. A desire for war. I think that it is in human nature to fantasize about battles against tribal enemies….

If these desires were to disappear, I believe that humans could live without a state. However, given these desires, the best approach for a peaceful large society is that which was undertaken in the U.S. when it was founded: freedom of religion guaranteed by the government, and a political system designed for peaceful succession and limitations on the power of any one political office….

I think that it is fine for libertarians to warn of the dangers of religion and to oppose war…. On the other hand, when libertarians assume away the desire for religion and war, their thinking becomes at best irrelevant and at worst nihilistic. [“Libertarians vs. Human Nature,” askblog, February 17, 2017]

In Madison’s words:

If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. [The Federalist No. 51, February 6, 1788]

Related posts:
On Liberty
Line-Drawing and Liberty
Pseudo-Libertarian Sophistry vs. True Libertarianism
Bounded Liberty: A Thought Experiment
More Pseudo-Libertarianism
True Libertarianism, One More Time
Human Nature, Liberty, and Rationalism
The Myth That Same-Sex “Marriage” Causes No Harm
Defending Liberty against (Pseudo) Libertarians
The Pseudo-Libertarian Temperament
Parsing Political Philosophy (II)
Libertarianism and the State
My View of Libertarianism
More About Social Norms and Liberty
The Authoritarianism of Modern Liberalism, and the Conservative Antidote
The Harm Principle Revisited: Mill Conflates Society and State
Liberty and Social Norms Re-examined


Yesterday I posted about the hysterics who decry AGW but don’t act on their own to prevent its (non) occurrence. They remind me of wealthy advocates of big government who complain that their taxes are too low but don’t make voluntary donations to the U.S. Treasury.

In fact, I’m confident that most of the wealthy advocates of higher taxation are also rampant emitters of CO2. They’re foolishly consistent: hypocrites about taxation, hypocrites about AGW.

Oh, the Hysteria!

There’s little to add to the unfounded hysteria about Trump’s decision to pull out of the Paris climate agreement … but this:

If all of the hysterics truly believe that a failure to reduce CO2 emissions will result in catastrophic global warming, they have it within their power to reduce emissions drastically. They can start by getting rid of their cars in favor of bikes and horses, moving to smaller homes, doing without air conditioning, keeping their homes at 50 degrees in the winter, bathing and washing clothes in cold water, growing and raising their own foodstuffs (to eliminate transportation-based emissions), reading by candle light, and throwing out all of their electrical appliances — even including their smart phones, which rely on electrically powered systems.

Given the number of hysterics out there, I’m sure that the (non) CO2 problem would be solved in no time. If their grandparents, great-grandparents, and all who came before them could live a CO2-minimal life, why can’t a few billion true-blue saviors of the world do the same?