Abortion Q and A

A new entry at Realities. Using a Q&A format, this article summarizes my writings on abortion over the past 14 years, since I first voiced my opposition to it.

Even More about Names

In previous posts about first names, I addressed names that had shifted from male to female, presidents’ surnames used as first names, the changing popularity of my grandparents’ first names, and the changing popularity of my high-school classmates’ first names.

I was reminded of those older posts (published in 2006, 2008, and 2012) by an analysis of gender-fluidity in names, that is, of names that are increasingly given to both boys and girls rather than mainly to one sex. The author, one Nikhil Sonnad,

calculated a “genderedness score” for every American baby name—and for the country on the whole. The score goes from zero to one. A zero means a name is perfectly non-gendered. That is to say, exactly half of the babies with that name are boys, and the other half are girls. A one, meanwhile, means the name is used exclusively for one gender. So a lower score means a name is more gender-neutral, and less biased.

How is it “biased” to use a boy’s name for a boy and a girl’s name for a girl? That statement should be damned to PC hell, along with the idea that gender is “assigned” at birth. It’s not assigned, it just is, except in rare instances. And it’s immutable, regardless of what the PC witch-doctors profess to believe.

Anyway, Sonnad continues:

The overall genderedness score was 0.97 in 1920, meaning nearly every kid had a name that was used almost exclusively for just boys or just girls. The score is falling, though. It hit 0.946 in 2016, the most recent year the SSA has name data for.

I’m gratified to learn that the genderedness score is still almost 1. I’m further gratified to note that it has dropped in the past, and then rebounded.

Part of the apparent decline in genderedness may be due to the fact that the Social Security database used by the author covers only the top 1,000 names of each sex in each year. The variety of names grows with population, so the top 1,000 isn’t as inclusive now as it used to be (see below). And genderless or ambiguous names are undoubtedly rising in popularity (for now) because baby-naming is a faddish thing, like transgenderism.
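Sonnad doesn’t publish his formula in the passages quoted above, so the following is my own reconstruction, not his code. It assumes that the per-name score is simply the gap between a name’s boy share and girl share (so a 50-50 split scores zero and a single-sex name scores one), and that the national score is a birth-weighted average over all names in one of the SSA’s raw “yobYYYY.txt” files, which list name, sex, and count:

```python
# Minimal sketch of a "genderedness" score (an assumption about Sonnad's method,
# not his actual code). SSA national files ("yob2016.txt") have lines like "Mary,F,7065".
from collections import defaultdict

def genderedness(boys: int, girls: int) -> float:
    """0.0 = name split evenly between the sexes; 1.0 = name used for one sex only."""
    total = boys + girls
    return abs(boys - girls) / total if total else 1.0

def national_score(path: str) -> float:
    """Birth-weighted average genderedness for one year's SSA file."""
    counts = defaultdict(lambda: [0, 0])          # name -> [boy count, girl count]
    with open(path, encoding="utf-8") as f:
        for line in f:
            name, sex, n = line.strip().split(",")
            counts[name][0 if sex == "M" else 1] += int(n)
    births = sum(b + g for b, g in counts.values())
    return sum((b + g) * genderedness(b, g) for b, g in counts.values()) / births

# print(national_score("yob2016.txt"))  # Sonnad reports roughly 0.946 for 2016
```

If my guess at the formula is off, the broad picture (scores close to 1, drifting down slowly) shouldn’t change much, but the exact numbers would.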

The variety of names grows not only because of population growth but also because of its changing composition. There are many more Spanish names and Hispanic babies than 10, 20, or 30 years ago — and vastly more than there were 100 years ago. Also, blacks have increasingly branched out into African, pseudo-African, and black-redneck names — names that are identifiably black, but not always identifiably male or female (by whites, at least).

So the real news isn’t the rise of gender-ambiguous names, but the growing variety of names, which I will show in two ways. First, using the same Social Security database as Sonnad, I constructed the following tables. They list the 10 most popular baby names (male and female) for equidistant 17-year intervals spanning the 136 years between the first year (1880) and most recent year (2016) included in the database. The tables also show the percentages of male and female babies given those names in each of the years.


Source: Go here and scroll to the search tool at the bottom of the page (on the left).

Note the general decline in the percentages. Being in the top 10 these days is almost meaningless compared with being in the top 10 through the 1940s or 1950s.  Note also the generally lower percentages for girls’ names than for boys’ names until 2016. Girls have long had a greater variety of names, though boys are finally catching up.

The following graphs (derived from the same source) illustrate the same points. They also highlight the relative stability in the number of names until the 1950s and 1960s, when the percentages of babies with names in the top 10, 100, and 1,000 really began to dive.
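For readers who want to reproduce the gist of those percentages, here is a rough sketch of the computation. It works from the same raw SSA files as the sketch above, and the cutoffs (10, 100, 1,000) match the ones discussed here; everything else about it is my own construction, not the Social Security Administration’s search tool:

```python
# Rough sketch: percentage of babies of one sex whose names fall within the
# top 10, 100, and 1,000 names for a given year (SSA "yobYYYY.txt" format).
def top_name_shares(path: str, sex: str, cutoffs=(10, 100, 1000)) -> dict:
    counts = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            name, s, n = line.strip().split(",")
            if s == sex:
                counts.append(int(n))
    counts.sort(reverse=True)                      # most popular names first
    total = sum(counts)
    return {k: round(100 * sum(counts[:k]) / total, 1) for k in cutoffs}

# Example (file name is illustrative):
# top_name_shares("yob1950.txt", "M")  ->  {10: ..., 100: ..., 1000: ...}
```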

Returning to the lists of names, I note that many of the recent top-10 names are throwbacks; the 19th century is “in” again. There are some old top-10 names that have fallen from popularity for good reason — they are grating, if not ugly; for example, Frank (Frank is a wiener; Francis is okay), Henry (better as Henri or Enrico), George (fit only for a king), Walter (a plumber’s name), Donald (rhymes with Ronald, as in McDonald), Richard (you know the nickname), Minnie (little fish), Bertha (as in “Big Bertha”), Florence (with the pursed lips), Ethel (ditto), and Dorothy, Shirley, and Doris (so 1930s and 1940s).

Anyway, those are my prejudices. What are yours?

Recommended Reading

Leftism, Political Correctness, and Other Lunacies (Dispatches from the Fifth Circle Book 1)

 

On Liberty: Impossible Dreams, Utopian Schemes (Dispatches from the Fifth Circle Book 2)

 

We the People and Other American Myths (Dispatches from the Fifth Circle Book 3)

 

Americana, Etc.: Language, Literature, Movies, Music, Sports, Nostalgia, Trivia, and a Dash of Humor (Dispatches from the Fifth Circle Book 4)

Pronoun Profusion

I could have called this post “Pronoun Confusion”, given what I found at Wikipedia:

And here:

I suppose there are other variations, but I quit digging before I became terminally confused. (UPDATE: Here are some new variations hot off the web. Read ’em and weep.)

It used to be that a person didn’t care what he (the generic kind) was called, as long as he wasn’t called late for dinner. That’s a tired joke, of course. People do care what they’re called, but it’s usually when they’re called something demeaning (e.g., “hey, you” to a general, “Harvey” to a doctor you don’t know as a friend) or insulting (e.g., “jerk” and worse).

I guess it’s insulting to (some) persons who have “chosen” a sex other than the one that they were born with (not “assigned at birth”) to be mistaken for persons of that sex. But give me a break. How am I supposed to know that you’re “really” a man if you look like a woman who’s trying to look like a man, or you’re “really” a woman who looks like a man who’s trying to look like a woman?

Sane persons — which is about 98 percent of the population, aside from posturing leftists and the gender-confused — are by definition in touch with reality. The traditional pronouns given in the first part of the Wikipedia table reflect that reality. They cover everything. I therefore reject all that follows, in the name of accuracy, clarity, and simplicity. (I would invoke Occam’s razor, but that might be taken as an endorsement of genital mutilation.)

So here’s the deal. If you don’t want to be called “he” or “she”, or any of their cognates, I will comply politely and use “you”, “your”, “yours”, or “yourself” when speaking or writing  to you. When speaking or writing about one of you, I will use “it”, “its”, or “itself”; for more than one of you, I will use “they”, “them”, “their”, “theirs”, or “themselves”.

There’s absolutely nothing insulting about such neutral usages. If you believe that there is, please consider the possibility that you are nuts (whether or not you have any). But don’t insult my intelligence by trying to make me believe that you’ve acquired a gender other than the one you were born with — or none at all.


Related reading:
Gregory Cochran, “Transsexuals“, West Hunter, May 8, 2013
Gregory Cochran, “Internal Contradictions“, West Hunter, December 12, 2015


Related posts:
1963: The Year Zero
The Transgender Fad and Its Consequences
Some Notes about Psychology and Intelligence

 

Courtship or Molestation?

Several weeks ago I happened upon a statement by Keith Burgess-Jackson (KBJ) about a blog post that he published in November 2017, which became a cause célèbre:

(That is the entire blog post, as reproduced in Alex Macon’s “UT-Arlington Professor: ‘What’s the Big Deal’ About Adult Men Dating Underage Girls?“, Dmagazine.com, November 30, 2017.)

There is presumably a connection between that post and the demise of Keith Burgess-Jackson (KBJ’s eponymous blog), where it was posted. But it is to KBJ’s credit that he quickly resumed blogging at Just Philosophy, and wasn’t cowed by the notoriety resulting from his post.

But I must say that my own reaction was similar to that of KBJ’s detractors:

I was trying to find a way into Keith Burgess-Jackson’s eponymous blog, which seems to have been closed to public view since he defended Roy Moore’s courtship of a 14-year-old person. (Perhaps Moore might have been cut some slack by a segment of the vast left-wing conspiracy had the person been a male.)

That is to say, I read KBJ’s post as a defense of Roy Moore’s “courtship” of a 14-year-old girl (or young woman). KBJ argues strenuously in his statement that he wasn’t defending Moore, who had been accused of more than “courting” the young woman. This account, from Wikipedia, refers to reportage that predates KBJ’s post:

On November 9, 2017, The Washington Post outlined an account of a woman, Leigh Corfman, who said that Moore initiated a sexual encounter with her in 1979, when she was 14 and he was 32 years old.[18] Corfman said that Moore met her and her mother in the hallway of the county courthouse, where Moore was working as an assistant district attorney, and offered to sit with Corfman while her mother went into a courtroom to testify.[18] Corfman said that during that discussion he asked for her phone number, which she gave him, they later went on two dates, for each date he picked her up in his car around the corner from her house and drove her to his house, and on the first date he “told her how pretty she was and kissed her”. On a second date, Moore allegedly “took off her shirt and pants and removed his clothes … touched her over her bra and underpants … and guided her hand to touch him over his underwear”.[18]

The incident, as described by Ms. Corfman, doesn’t resemble courtship as I have understood it in my lifetime, and I am older than KBJ and Roy Moore. Christian minister Patricia Bootsma explains that

in contrast to the modern conception of dating, in “courtship, time together in groups with family or friends is encouraged, and there is oversight by and accountability to parents or mentors”.[7] She further states that with courtship, “commitment happens before intimacy”.[7]

That is courtship, and I’m surprised when an erudite man who uses language precisely (i.e., KBJ) doesn’t know the difference between it and “making out“, which is more or less what Moore was (allegedly) bent on doing. Perhaps KBJ picked up the term from another news story, or perhaps he chose to use it as a euphemism for the acts described in the Post‘s story (which were repeated throughout the news media).

But I can understand the objections to KBJ’s post because (a) the story wasn’t about “courtship” but about a 32-year-old man (allegedly) making sexual advances to a 14-year-old girl-woman, and (b) the alleged behavior took place in 1979, not in 1922, when KBJ’s maternal grandparents were “courting” or courting, as the case may be.

In 1922, legislative battles about age-of-consent laws had only recently been settled (for the most part):

While the general age of consent is now set between 16 and 18 in all U.S. states, the age of consent has widely varied across the country in the past. In 1880, the age of consent was set at 10 or 12 in most states, with the exception of Delaware where it was 7.[2] The ages of consent were raised across the U.S. during the late 19th century and the early 20th century.[3][4] By 1920 ages of consent generally rose to 16-18 and small adjustments to these laws occurred after 1920. As of 2015 the final state to raise its age of general consent was Hawaii, which changed it from 14 to 16 in 2001.[5]

By contrast, Alabama’s age of consent (which was 10 in 1880) had been 16 since 1920. Sexual behavior that might have been deemed acceptable in 1922, when old ways were a fresh memory, was surely beyond the pale in 1979 — 59 years after Alabama’s age of consent had been raised to 16.

So to answer KBJ’s question: It was a very big deal if Moore had in fact done the things that he is accused of having done with a 14-year-old girl-woman. Things were different in 1979 than in 1922. As someone who is older than KBJ, I will even say that attitudes in 1979 were nearly as far removed from those of 1922 as are the attitudes of 2017. The Mad Men days were by 1979 almost a faint memory. (Not in Hollywood, politics, or the upper echelons of the business world, but the Mad Men days never ended there — or not until recently, maybe. I’m talking about the workaday world of real people, where sexual harassment had by 1979 become widely frowned on, if not always suppressed.)

KBJ’s outrage about “people imposing their own moral standards on people of the past” is obviously misplaced. Because of that, any reasonable reader — even a leftist — could conclude that KBJ was attempting to excuse Moore’s alleged behavior.

I can’t quote portions of KBJ’s long, copyrighted statement because of the terms of the copyright (“Publishable in Its Entirety or Not at All”). I will just say that it struck me as an after-the-fact justification of a reflexive defense of Roy Moore (widely considered a conservative) by a conservative blogger who is (rightly) exasperated by the torrent of abuse that is heaped continuously on (actual and self-styled) conservatives.

Having said all of that, I should add that I am very much a fan of KBJ. I’m glad that he quickly resumed blogging, despite the barrage of criticism that was aimed at him — much of it, I’m sure, by leftists who attacked him reflexively because of his conservatism.

Better-than-Best Pictures, or Who Needs the Oscars?

As you know by now, The Shape of Water won the Oscar for Best Picture of 2017. But read this post before you rush to a theater to see it. Or, if you’ve already seen it, read it before you claim that it’s among the best movies ever.

Business Insider offers a ranking of Best-Picture winners in “All 89 Oscar Best-Picture Winners, Ranked from Worst to Best by Movie Critics“, which covers releases through 2016. (This year’s Best Picture award is for a film released in 2017.) Business Insider bases its ranking on critics’ reviews, as summarized at Rotten Tomatoes.

The Business Insider piece doesn’t help the viewer who’s in search of a better film than those that have been voted Best Picture.  A post at Political Calculations takes a stab at the problem by offering alternatives to the five worst-ever Best-Picture awardees. But the alternatives are limited to films that were nominated for Best Picture for the same five years.

Three years ago, in the wake of the Academy Awards for 2014, I posted “Another Trip to the Movies“.  There, I showed that of the 88 films which had then earned the Best-Picture award, only 14 were in fact the highest-rated among U.S.-made feature films released in the same year.

I based my comparison on ratings given by users at Internet Movie Database (IMDb). IMDb user ratings aren’t a sure guide to artistic merit — as the latter is judged by members of the Academy of Motion Picture Arts and Sciences (AMPAS), or by movie critics. But members of AMPAS and movie critics are notoriously wrong-headed about artistic merit. The Shape of Water exemplifies their wrong-headedness:

This Guillermo del Toro film has gotten rave reviews from critics, with a Rotten Tomatoes rating of 93%, and lots of awards-season buzz. And while some elements of the film are praiseworthy, … the film turns out to be little more than a collection of manipulative and ludicrous set-ups for social-justice lectures lacking any nuance or wit. The Shape of Water assumes its audience to be idiots, which makes this the kind of painful and unoriginal exercise that is all but certain to win awards throughout this winter in Hollywood….

The Shape of Water never allows the audience to get the message of tolerance from the central allegory of the love between Elisa and the creature. Instead, del Toro and the writers fill up every square inch with contrivances and lectures.

And those lectures come with all of the subtlety of a jackhammer. Giles lost his job in the advertising business for unexplained reasons, but which seem to be connected to his sexual orientation. He tries to reach out to a waiter at his favorite diner, who rejects him just as the waiter also gets a chance to demonstrate his racism by refusing service to a black couple, both of which are completely gratuitous to the film or to Amphibian Man’s fate. Shannon’s Strickland spouts religious nonsense to justify cruelty, and sexually oppresses his wife in another gratuitous scene, sticking his gangrenous fingers over her mouth to keep her from expressing pleasure…. The bad guys are the US space program (!) and the military, while the most sympathetic character apart from the four main protagonists is a Soviet spy. Strickland dismisses Elisa and Zelda as suspects, angrily lamenting his decision to “question the help,” just in case the class-warfare argument escaped the audience to that point. Oh, he’s also a major-league sexual harasser in the workplace.  And so on. [Ed Morrissey, “The Shape of Water: Subtle As a Jackhammer and Almost As Intelligent“, Hot Air, March 5, 2018]

If The Shape of Water is your kind of film, you’re at the wrong blog.

In any event, IMDb user ratings are a good guide to audience appeal, which certainly doesn’t preclude artistic merit. (I would argue that audience appeal is a better gauge of artistic merit than critical consensus.) For example, I have seen 10 of the 14 top-rated Oscar winners listed in the Business Insider article, but only 5 of the winners that I have seen are among my 14 top-rated Oscar winners.

The first table below lists all 91 of the Best Picture winners, ranked according to the average rating given each film by IMDb users. The second table lists the 100 features given the highest average ratings by IMDb users. (The list includes films released in the U.S. through 2017 that have been rated by at least 3,500 users, which is the approximate number for Cavalcade, the least-viewed of Oscar-winning pictures.) Only 16 of the 91 Oscar-winning films (highlighted in red) are among the top 100. (Lawrence of Arabia would be among the top 100, but IMDb categorizes it as a UK film.)
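IMDb now publishes bulk data files (title.basics.tsv.gz and title.ratings.tsv.gz) that make this kind of filtering easy to replicate. The sketch below is only an approximation of the comparison above; in particular, it omits the made-in-the-U.S. test, because country of origin isn’t in those two files:

```python
# Sketch: the 100 highest-rated feature films through 2017 with at least 3,500
# votes, from IMDb's public datasets. The U.S.-origin filter described in this
# post is omitted here (country isn't carried in these two files).
import pandas as pd

basics = pd.read_csv("title.basics.tsv.gz", sep="\t", na_values="\\N", low_memory=False)
ratings = pd.read_csv("title.ratings.tsv.gz", sep="\t")

films = basics[(basics.titleType == "movie") & (basics.startYear <= 2017)]
rated = films.merge(ratings, on="tconst")
top100 = (rated[rated.numVotes >= 3500]
          .sort_values("averageRating", ascending=False)
          .head(100)[["primaryTitle", "startYear", "averageRating", "numVotes"]])
print(top100.to_string(index=False))
```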

In short, there are many Better-Than-Best Pictures to choose from.

Top 100 films through 2017

 

See also my post “A Trip to the Movies“, and John Sexton’s “Oscar Ratings Likely to Set an All-Time Low” (Hot Air, March 5, 2018).

Trump vs. Obama on Taxes

In January 2013, Congress passed and Barack Obama jubilantly signed what The Wall Street Journal called “the largest tax increase in the past two decades”:

More than three-quarters of American households would see a tax increase from their 2012 tax levels, according to an analysis by the Tax Policy Center, a joint venture of the Brookings Institution and the Urban Institute.

In December 2017, Congress passed and Donald Trump jubilantly signed a bill that cut corporate income taxes and almost every taxpayer’s federal income taxes.

If you take the view that taxpayers’ money really belongs to the government — as “liberals” are wont to do — you would have to concede that Mr. Obama was niggardly toward taxpayers, in comparison with Mr. Trump.


Related posts:
Ignorance Abounds
Defending the Offensive

Modernism in Austin

Tout Austin (i.e., Austin’s civic and cultural elites) turned out for the grand opening of the late Ellsworth Kelly’s “Austin”, described and depicted here as a chapel of light for “contemplation”. Not contemplation, mind you, but “contemplation”, which must be something akin to “mindfulness“.

Kelly’s “chapel” has taken up residence at the Blanton Museum of Art, which belongs to the University of Texas at Austin. (The prepositional modifier is the official but unnecessary place designator for the intellectual and social blight known as UT).

What does “Austin” look like? This:

Impressive, no? No, not impressive.

About a mile away — an easy walk or bike ride for a contemplative student, faculty member, or taxpayer — is a real work of art, Austin’s Cathedral of St. Mary:

To quote myself:

In the early decades of the twentieth century, the various arts became an “inside game”. Painters, sculptors, composers (of “serious” music), and choreographers began to create works not for the enjoyment of audiences but for the sake of exploring “new” forms. Given that the various arts had been perfected by the early 1900s (at the outside), the only way to explore “new” forms was to regress toward primitive ones — toward a lack of structure…. Aside from its baneful influence on many true artists, the regression toward the primitive has enabled persons of inferior talent (and none) to call themselves “artists”. Thus modernism is banal when it is not ugly.

Painters, sculptors, etc., have been encouraged in their efforts to explore “new” forms by critics, by advocates of change and rebellion for its own sake (e.g., “liberals” and “bohemians”), and by undiscriminating patrons, anxious to be au courant. Critics have a special stake in modernism because they are needed to “explain” its incomprehensibility and ugliness to the unwashed.

The unwashed have nevertheless rebelled against modernism, and so its practitioners and defenders have responded with condescension, one form of which is the challenge to be “open minded” (i.e., to tolerate the second-rate and nonsensical). A good example of condescension is heard on Composers Datebook, a syndicated feature that runs on some NPR stations. Every Composers Datebook program closes by “reminding you that all music was once new.” As if to lump Arnold Schoenberg and John Cage with Johann Sebastian Bach and Ludwig van Beethoven.

All music, painting, sculpture, and dance were once new, but new doesn’t necessarily mean good. Much (most?) of what has been produced since 1900 (if not before) is inferior, self-indulgent crap.

This Is a Test

Scott McKay writes:

Thursday saw a media firestorm erupt over a Washington Post report that amid a White House meeting with several members of Congress working on a compromise having to do with the Obama-era Deferred Action for Childhood Arrivals program, or DACA, President Trump asked why America should have to take in so many immigrants from “s***hole countries” rather than people from places like Norway.

The Post article isn’t exactly the finest example of American journalism, identifying as its source no one actually in the room to confirm what Trump supposedly said but instead naming two anonymous people who were “briefed on the meeting.”

I won’t get into the truth or falsity of the reporting. I suspect that it’s true. And it doesn’t bother me in the least if President Trump characterized some countries as s***holes. They are, and for two very good reasons: the low intelligence of their populations and their anti-libertarian governments (which make the U.S. seem like an anarcho-capitalist’s paradise).

Why are so many people (leftists, that is) upset? Because calling a s***hole a s***hole is a sin against cant and hypocrisy, in which the left specializes.

Here’s the test: If you were forced to live in another country, would you choose Norway or Haiti? Any sensible person — and perhaps even a leftist — would choose Norway.


Related posts:
Ruminations on the Left in America
The Euphemism Conquers All
Superiority
Non-Judgmentalism as Leftist Condescension
Leftist Condescension

A Glimmer of Hope on the Education Front

Gregory Cochran (West Hunter) points to an item from 2014 that gives the annual distribution of bachelor’s degrees by field of study for 1970-2011. (I would say “major”, but many of the categories encompass several related majors.) I extracted the values for 1970, 1990, and 2011, and assigned a “hardness” value to each field of study:

The distribution of degrees seems to have been shifting away from “soft” fields to “middling” and “hard” ones:

The number of graduates has increased with time, of course, so there are still more soft bachelor’s degrees being granted now than in 1970. But the shift toward harder fields is comforting because soft fields seem to attract squishy-minded leftists in disproportionate numbers.

The graph suggests that the college-educated workforce of the future will be somewhat less dominated by squishy-minded leftists than it has been since 1970. It was around then that many of the flower-children and radicals of the 1960s graduated and went on to positions of power and prominence in the media, the academy, and politics.

It’s faint hope for a future that’s less dominated by leftists than the recent past and present — but it is hope.

CAVEATS:

1. The results shown in the graph are sensitive to my designation of each field’s level of “hardness”. If you disagree with any of those assignments, let me know and I’ll change the inputs and see what difference they make. The table and graph are in a spreadsheet, and changes in the table will instantly show up as changes in the graph.

2. The decline of “soft” fields is due mainly to the sharp decline of Education as a percentage of all bachelor’s degrees, which occurred between 1971 and 1985. To the extent that some Education majors migrated to STEM fields, the overall shift toward “hard” fields is overstated. A prospective teacher who happens to major in math is probably of less-squishy stock than a prospective teacher who happens to major in English, History, or similar “soft” fields — but he is likely to be more squishy than the math major who intends to pursue an advanced degree in his field, and to “do” rather than teach at any level.
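Caveat 1 invites readers to second-guess my “hardness” assignments; for anyone who prefers code to a spreadsheet, here is a bare-bones sketch of the same computation. The handful of fields and assignments shown are illustrative placeholders, not my full table; substitute your own judgments and the shares will shift accordingly:

```python
# Bare-bones sketch: share of bachelor's degrees by "hardness" category in one
# year. The field list and assignments below are placeholders, not the full
# table used for the graph in this post.
HARDNESS = {
    "Education": "soft",
    "English": "soft",
    "Psychology": "soft",
    "Business": "middling",
    "Social sciences and history": "middling",
    "Engineering": "hard",
    "Biological sciences": "hard",
    "Math and statistics": "hard",
}

def hardness_shares(degrees_by_field: dict) -> dict:
    """degrees_by_field maps a field of study to its number of degrees."""
    totals = {"soft": 0, "middling": 0, "hard": 0}
    for field, n in degrees_by_field.items():
        totals[HARDNESS[field]] += n
    grand_total = sum(totals.values())
    return {k: round(100 * v / grand_total, 1) for k, v in totals.items()}

# Example with made-up numbers (thousands of degrees):
# hardness_shares({"Education": 176, "Business": 115, "Engineering": 45, ...})
```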

“Capitalism” Is a Dirty Word

Dyspepsia Generation points to a piece at reason.com, which explains that capitalism is a Marxist coinage. In fact, capitalism

is what the Dutch call a geuzennaam—a word assigned by one’s sneering enemies, such as Quaker or Tory or Whig, but later adopted proudly by the victims themselves.

I have long viewed it that way. Capitalism conjures the greedy, coupon-clipping fat cat of Monopoly:

Thus did a board-game that vaulted to popularity during the Great Depression signify the identification of capitalism with another “bad thing”: monopoly. And, more recently, capitalism has been conjoined with yet another “bad thing”: income inequality.

 

In fact, capitalism

is a misnomer for the system of free markets that could deliver abundant prosperity and happiness, were markets left free. Free does not mean unfettered; competition for the favor of consumers exerts strong discipline on markets. And laws against theft, deception, and fraud would serve amply to keep markets honest, the worrying classes to the contrary notwithstanding.

What the defenders of capitalism are defending — or should be — is voluntary, market-based exchange. It doesn’t roll off the tongue, but that’s no excuse for continuing to use a Marxist smear-word for the best of all possible economic systems.


Related posts:
More Commandments of Economics (#13 and #19)
Monopoly and the General Welfare
Monopoly: Private Is Better than Public
Some Inconvenient Facts about Income Inequality
Mass (Economic) Hysteria: Income Inequality and Related Themes
Income Inequality and Economic Growth
A Case for Redistribution, Not Made
McCloskey on Piketty
Nature, Nurture, and Inequality
Diminishing Marginal Utility and the Redistributive Urge
Capitalism, Competition, Prosperity, and Happiness
Economic Mobility Is Alive and Well in America
The Essence of Economics
“Rent” Is Indispensable

Sexual Misconduct: A New Crime, A New Kind of Justice

Not all bad behavior is, or should be, the subject of official investigation, prosecution, and punishment. It should be enough, in the vast majority of cases, to stop bad behavior and discourage its repetition by simply saying “no”, administering a spanking, or subjecting the miscreant to social scorn.

These time-honored methods gave way decades ago to the sob-sister school of pseudo-psychology, which instructs all and sundry that it is harmful to young psyches to say “no” without a long explanation (couched in psychological rather than moral terms), to spank (or otherwise administer corporal punishment), or to squelch “creativity” (i.e., mischief-making) by any method of communication, from frowning to screaming.

It should therefore come as no surprise that several generations of persons born after World War II — which includes almost all of today’s practicing politicians, lawyers, judges, and celebrities — have lacked the benefit of moral guidance. What they seem to have learned is not to eschew bad behavior, but to feign contrition for it when caught. Pseudo-contrition can be made to seem genuine by a method-acting technique: converting mortification for being caught into sorrow for having committed the offending deed.

Meanwhile, the broader system of justice, which encompasses the kinds of social censure discussed above, is shifting away from the inculcation of traditional morality (which would reinforce “white privilege” and “patriarchy”) and becoming a delivery vehicle for socio-political vengeance. This perversion seemed to have peaked with the Obama-Holder regime’s penchant for launching federal investigations of shootings by police when the persons shot were black, under the rubric of “civil rights”, and with the refusal of campus and municipal officials to curb violence committed by leftists and their protégés (e.g., Antifa and BLM).

But the perversion of justice has reached a new low with the wave of public accusations of sexual misconduct fomented by the #MeToo campaign,

to denounce sexual assault and harassment, in the wake of sexual misconduct allegations against film producer and executive Harvey Weinstein. The phrase, long used in this sense by social activist Tarana Burke, was popularized by actress Alyssa Milano, who encouraged women to tweet it to publicize experiences to demonstrate the widespread nature of misogynistic behavior.

Dozens of prominent or high-ranking men in politics, entertainment, and business have been accused of various acts of sexual misconduct. Many of them have lost their jobs as a result of the accusations. Roy Moore probably lost the special election in Alabama because of the accusations. It is a widely held view on the left that Donald Trump should lose his job because of accusations that have been leveled against him, and also because he’s a creepy loud-mouth who mainly takes a conservative political stance and is a “racist” to boot. (“Racist” is the go-to word for leftists who want to open the southern border to more waves of future Democrat voters.)

In other words, there’s a new crime on the block: sexual misconduct. It consists not only of actual crimes — such as rape — that ought to be prosecuted, and have been prosecuted since long before the #MeToo campaign. It also consists of any perceived sexism or slight on the part of a male toward a female.

This new, ill-defined crime is in the mind of the beholder. She may perceive a crime simply because she hates men or finds it psychologically satisfying to think of them as the enemy — along with Republicans, Israelis, “the rich” (one of which she may well be), climate-change “deniers”, NASCAR fans, and on and on.

In fact, it’s the old double-standard at work: Misogyny (real or imagined) is bad, but man-hating is good. Or so it has become among many women (and their male sycophants) who, with unintentional irony, call themselves “liberal” and “progressive”.  It is illiberal in the extreme to deprive someone of life, liberty, property, or a job based on mere accusations, but that is what is happening. It is regressive in the extreme to wage war against half the population (minus the mental cuckolds who are their allies) when it is the half of the population that does the really hard and dangerous jobs that make it possible for them to live in a hypocritical state of comfort and security.

So, despite my schadenfreude about the comeuppance of many left-wing males (most of whom probably deserve it), I am unenthusiastic about this latest incarnation of the Salem witch-trials. It is too much of a piece with the many memes that have captured the fickle attentions of neurotic leftists in recent decades, years, months, weeks, and days; for example, eugenics, prohibition, repeal of prohibition, peace through unilateral disarmament, overpopulation, global cooling, peak oil, global warming, carbon footprints, recycling, income inequality, unconscious racism, white privilege, forced integration, forced segregation (if blacks want it), coeducation, mixed-sex dorms, single-sex schools, any reference to or image of a firearm, keeping score, winning, cultural appropriation, diversity, globalization, free speech (not), homophobia, same-sex “marriage”, smoking, gender “assignment” at birth, “free” college for all, “settled science”, collective guilt (but only of straight, white, conservative males of European descent, and Germans in 1933-1945), racial profiling and stereotyping (except when leftists do it), etc., etc., etc.

Each “good” can be attained and each “bad” averted simply by enacting laws, regulations, and punishments. Though nature and human nature are not so easily controlled (let alone changed), the neurotic appetite for action can be sated temporarily by the mere enactment of laws, regulations, and punishments. And when these have been piled one on top of the other for decades, the results are as predicted by conservatives and libertarians: the suppression of liberty and economic growth.

There’s real crime for you.


Related posts:
Greed, Cosmic Justice, and Social Welfare
Positive Rights and Cosmic Justice
Liberalism and Sovereignty
Fascism with a “Friendly” Face
Penalizing “Thought Crimes”
Democracy and Liberty
The Interest-Group Paradox
Inventing “Liberalism”
Civil Society and Homosexual “Marriage”
Fascism and the Future of America
The Indivisibility of Economic and Social Liberty
The Near-Victory of Communism
Tocqueville’s Prescience
Accountants of the Soul
In Defense of Marriage
The Left
Rationalism, Social Norms, and Same-Sex “Marriage”
Our Enemy, the State
“Intellectuals and Society”: A Review
The Left’s Agenda
The Left and Its Delusions
“Occupy Wall Street” and Religion
Merit Goods, Positive Rights, and Cosmic Justice
The Spoiled Children of Capitalism
Politics, Sophistry, and the Academy
Subsidizing the Enemies of Liberty
More about Merit Goods
The Morality of Occupying Private Property
Prohibition, Abortion, and “Progressivism”
Liberty, Negative Rights, and Bleeding Hearts
Our Perfect, Perfect Constitution
Liberty and Society
Tolerance on the Left
The Eclipse of “Old America”
The Fallacy of Human Progress
Fighting Modernity
Defining Liberty
The Culture War
Modern Liberalism as Wishful Thinking
Getting Liberty Wrong
Romanticizing the State
Governmental Perversity
The Pretence of Knowledge
“The Science Is Settled”
Ruminations on the Left in America
No Wonder Liberty Is Disappearing
Academic Ignorance
More About Social Norms and Liberty
The Euphemism Conquers All
Defending the Offensive
Superiority
The War on Conservatism
Whiners
A Dose of Reality
God-Like Minds
The Authoritarianism of Modern Liberalism, and the Conservative Antidote
Society, Polarization, and Dissent
Non-Judgmentalism as Leftist Condescension
An Addendum to (Asymmetrical) Ideological Warfare
The Rahn Curve Revisited
Social Justice vs. Liberty
The Left and “the People”
Why Conservatives Shouldn’t Compromise
Liberal Nostrums
Liberty and Social Norms Re-examined
Retrospective Virtue-Signalling
The Left and Violence
Four Kinds of “Liberals”
Leftist Condescension
If Men Were Angels
The Vast Left-Wing Conspiracy
The Left and Evergreen State: Reaping What Was Sown
Liberty in Chains
Leftism
Leftism As Crypto-Fascism: The Google Paradigm
What Is Going On? A Stealth Revolution
Libertarianism, Conservatism, and Political Correctness
“Liberalism” and Leftism
Disposition and Ideology
Much Ado about the Unknown and Unknowable
A (Long) Footnote about Science
Down the Memory Hole
The Dumbing-Down of Public Schools
Cakes and Liberty

The Dumbing-Down of Public Schools

You may have read stories about the difficulty of tests given to public-school students in the late 1800s and early 1900s. Some of the questions would have challenged even the brighter seniors in today’s schools. This anecdotal evidence suggests that educational standards were generally much higher in the public schools of yore than they are now. One reason, I suspect, is the dumbing-down of schools that probably accompanied the social and legal push to keep children in school through the 12th grade.

In Michigan, when my father was of school age, it wasn’t uncommon for children (especially boys) to drop out after the 8th grade. High school, in those days, seemed to be considered preparatory for college. Boys like my father, who was intelligent but of a poor family, weren’t considered college material and would often drop out after completing the 8th grade in order to go to work, perhaps even to learn a trade that would pay more than they could earn from manual labor. (When my father dropped out, the Great Depression was at its depth and it was all the more necessary for him to take whatever job he could get, to help support his family.)

The prep-school thesis came to me when I was browsing old yearbooks and found a yearbook for 1921 in which my high-school principal is pictured as a high-school senior. (It was the high school that my father would have attended had he gone beyond the 8th grade.) One page of the yearbook gives a list of the 1920 graduates and tells what each of them is doing (e.g., attending the University of Michigan, working at a particular factory, at home without a job). Here are some things I gleaned from the page:

The class of 1920 consisted of 28 males and 50 females. This is an improbable male-female ratio, which supports my thesis that dropping out to work was common among males. (Most male members of the class of 1920 would have been only 16 when World War I ended, so it is unlikely that the war had more than a minute effect on the number of males who reached graduation age.)

A year after graduation, two-thirds of the males (19 of the 28) were enrolled in college (mostly at the University of Michigan), and another one was attending a technical school in Chicago. That two-thirds of the males were in college in 1921, long before the insane push for universal higher education, supports the idea that most males who went to high school were considered college material.

Of the 50 females, only 8 were definitely in college, with 5 of them at teachers’ colleges (then called “normal schools”). Several others gave locations that might have indicated college attendance (e.g., Oberlin, Albion). But there were at most 14 collegians among the 50 females.

Two females were already teaching, presumably in small, rural schools. But the fact that they were teaching at the age of 19 (not uncommon in the “old days”) is testimony to the quality of high-school education in those days. It also says a lot about the needless inflation of standards for teaching young children.

It pains me to think of the tens of millions of young persons — male and female alike — who have been pushed into high school, and then into college, instead of being allowed to find their own way in life. They have been denied the opportunity to learn a trade through apprenticeship, or just by working hard; the opportunity to learn self-reliance and responsibility; and the opportunity to contribute more to the well-being of others than most of them will contribute by going to college.

Smaller high-school enrollments would also mean fewer public-school teachers and administrators to feed at the public trough and fuel the expensive (and largely fruitless) war among school systems to see which one can spend the most per pupil.  Fewer students pushed into college would also mean fewer college-professors and administrators to feed at the public trough, and to spew their pseudo-intellectual nonsense.

The best thing about smaller high-school enrollments would be the reduction in the number of impressionable young persons who are indoctrinated in left-wing views by high-school teachers, and then by college professors.


Related posts:
School Vouchers and Teachers’ Unions
Whining about Teachers’ Pay: Another Lesson about the Evils of Public Education
I Used to Be Too Smart to Understand This
The Higher-Education Bubble
The Public-School Swindle
Is College for Everyone?
A Sideways Glance at Public “Education”

Where’s the Outrage?

UPDATE (11/18/17): It’s only fair to note that in the three days since I posted this I haven’t seen any objections to the Wumo strip. It’s either too funny for outrage or not on the radar of the easily outraged — perhaps both.

This funny Wumo strip, though dated November 3, appeared in today’s papers:

I expect Wumo and/or many of the newspapers to apologize abjectly for giving offense to transgendered persons. Those who are secure in their adopted sexual identity will find it funny. But there will be howls of outrage from “liberals”, whose search for “victims” to defend is never-ending.

American Dialects

Some time ago I posted “A Guide to the Pronunciation of General American English“. Recently, I came across a useful adjunct: “North American English Dialects, Based on Pronunciation Patterns“.

If you scroll down the page, you’ll find a list of links to samples of the speech patterns of various regions. I was amused to hear Larry Page, a co-founder of Google, in this YouTube video clip. It’s almost like hearing myself — another native of the central and southeastern parts of Michigan’s lower peninsula.

What that clip and others illustrate is that a broad pattern of pronunciation (e.g., General American English) is overlaid with many sub-regional variations in the form of accents and emphases.

Larry Page’s accent, for example, has a nasal and breathy (or reedy) quality. He might say TOE-MAY-TOE instead of TOE-MAH-TOE, but the way in which he says it will be different from the way it sounds coming from the mouth of an Iowan.

Further, in many parts of the Upper Midwest (certainly including parts of Minnesota, Wisconsin, and Michigan) there is a tendency to end a sentence on an ascending tone, as if a question were being asked even though the sentence is declarative. Again, the sentence may end with TOE-MAY-TOE instead of TOE-MAH-TOE, but it will have a distinctly different sound than TOE-MAY-TOE placed elsewhere in a sentence and TOE-MAY-TOE pronounced at the end of a sentence by an Iowan.

Racism on Parade

There has been much ado about an article by lawprofs Amy Wax (University of Pennsylvania) and Larry Alexander (University of San Diego), “Paying the Price for the Breakdown of the Country’s Bourgeois Culture” (The Inquirer, August 9, 2017). Wax and Alexander say this:

Too few Americans are qualified for the jobs available. Male working-age labor-force participation is at Depression-era lows. Opioid abuse is widespread. Homicidal violence plagues inner cities. Almost half of all children are born out of wedlock, and even more are raised by single mothers. Many college students lack basic skills, and high school students rank below those from two dozen other countries.

The causes of these phenomena are multiple and complex, but implicated in these and other maladies is the breakdown of the country’s bourgeois culture.

That culture laid out the script we all were supposed to follow: Get married before you have children and strive to stay married for their sake. Get the education you need for gainful employment, work hard, and avoid idleness. Go the extra mile for your employer or client. Be a patriot, ready to serve the country. Be neighborly, civic-minded, and charitable. Avoid coarse language in public. Be respectful of authority. Eschew substance abuse and crime.

These basic cultural precepts reigned from the late 1940s to the mid-1960s. They could be followed by people of all backgrounds and abilities, especially when backed up by almost universal endorsement. Adherence was a major contributor to the productivity, educational gains, and social coherence of that period.

Did everyone abide by those precepts? Of course not. There are always rebels — and hypocrites, those who publicly endorse the norms but transgress them. But as the saying goes, hypocrisy is the homage vice pays to virtue. Even the deviants rarely disavowed or openly disparaged the prevailing expectations….

… The loss of bourgeois habits seriously impeded the progress of disadvantaged groups. That trend also accelerated the destructive consequences of the growing welfare state, which, by taking over financial support of families, reduced the need for two parents. A strong pro-marriage norm might have blunted this effect. Instead, the number of single parents grew astronomically, producing children more prone to academic failure, addiction, idleness, crime, and poverty.

This cultural script began to break down in the late 1960s. A combination of factors — prosperity, the Pill, the expansion of higher education, and the doubts surrounding the Vietnam War — encouraged an antiauthoritarian, adolescent, wish-fulfillment ideal — sex, drugs, and rock-and-roll — that was unworthy of, and unworkable for, a mature, prosperous adult society….

And those adults with influence over the culture, for a variety of reasons, abandoned their role as advocates for respectability, civility, and adult values. As a consequence, the counterculture made great headway, particularly among the chattering classes — academics, writers, artists, actors, and journalists — who relished liberation from conventional constraints and turned condemning America and reviewing its crimes into a class marker of virtue and sophistication.

All cultures are not equal. Or at least they are not equal in preparing people to be productive in an advanced economy. The culture of the Plains Indians was designed for nomadic hunters, but is not suited to a First World, 21st-century environment. Nor are the single-parent, antisocial habits, prevalent among some working-class whites; the anti-“acting white” rap culture of inner-city blacks; the anti-assimilation ideas gaining ground among some Hispanic immigrants. These cultural orientations are not only incompatible with what an advanced free-market economy and a viable democracy require, they are also destructive of a sense of solidarity and reciprocity among Americans. If the bourgeois cultural script — which the upper-middle class still largely observes but now hesitates to preach — cannot be widely reinstated, things are likely to get worse for us all….

… Among those who currently follow the old precepts, regardless of their level of education or affluence, the homicide rate is tiny, opioid addiction is rare, and poverty rates are low. Those who live by the simple rules that most people used to accept may not end up rich or hold elite jobs, but their lives will go far better than they do now. All schools and neighborhoods would be much safer and more pleasant. More students from all walks of life would be educated for constructive employment and democratic participation.

But restoring the hegemony of the bourgeois culture will require the arbiters of culture — the academics, media, and Hollywood — to relinquish multicultural grievance polemics and the preening pretense of defending the downtrodden. Instead of bashing the bourgeois culture, they should return to the 1950s posture of celebrating it.

There’s a nit-picky but not fundamentally damaging commentary here, which follows a positive commentary by Jonathan Haidt, whom I presume to be a neutral party given his political centrism and rigorous approach to the psychology of politics.

As for me, I am skeptical about the restoration of the hegemony of bourgeois culture. It’s my view that when constructive social norms (e.g., work rather than welfare, marriage before children) have been breached on a large scale (as in Charles Murray’s “Fishtown”), they can’t be put back together again. Not on a large scale among persons now living, at least.

It’s true that many aspiring escapees from “Fishtown” (and its equivalents among blacks and Hispanics) will emulate the social norms of the middle and upper-middle classes. Those who are steadfast in their emulation are more likely to escape their respective white, tan, and black “ghettos” than those who don’t try or give up.

But “ghettos” will persist for as long as government provides “freebies” to people for not working, for not marrying, and for having children out of wedlock. And I see no end to the “freebies” because (a) there are a lot of votes in the “ghettos” and (b) there are too many members of the middle and upper-middle classes — mainly but not exclusively “progressives” — who would rather give a man a fish every day than teach him how to fish.

That said, the heated controversy about the Wax-Alexander piece stems from its perceived racism — perceived by the usual, hyper-sensitive suspects. How dare Wax and Alexander drag blacks and Hispanics into their discussion by referring to

  • homicidal violence that plagues inner cities
  • the fact that almost half of all children are born out of wedlock, and even more are raised by single mothers
  • the anti-“acting white” rap culture of inner-city blacks
  • the anti-assimilation ideas gaining ground among some Hispanic immigrants

And how dare they assert (quite reasonably) that not all cultures are equal.

So the condemnation began. The thrust of it, of course, is that Wax and Alexander are “racist”.

For her sins, Wax was the target of an open letter of condemnation signed by 33 of her law school colleagues at UPenn. And for his sins, Alexander was singled out for criticism by the dean of USD’s law school.

Turnabout is fair play — or it will be as long as there are vestiges of free speech on college campuses. Tom Smith, a lawprof at USD who blogs at The Right Coast, is mightily miffed about his dean’s response to the Wax-Alexander piece. Smith and seven other USD lawprofs signed a letter which reads, in part:

Yesterday, Stephen Ferruolo, dean of the University of San Diego School of Law, sent to the entire law school community a lengthy email message entitled “Our Commitment to Diversity and Inclusion.” The message began by thanking those who have “expressed their concerns” about an op-ed written by our colleague Larry Alexander and University of Pennsylvania law professor Amy Wax and published last month in the Philadelphia Inquirer…. While acknowledging that Professor Alexander has a right to his views, the dean then declared, “I personally do not agree with those views, nor do I believe that they are representative of the views of our law school community.”…

The dean did not describe the contents of the Alexander-Wax op-ed, and he offered no specifics about what he disagreed with. In the context of the overall message, readers of the dean’s statement will inevitably infer that, at least in the dean’s view, Professor Alexander’s op-ed was in some sense supportive of exclusion or “racial discrimination or cultural subordination.” In effect, the dean adopted the extraordinary measure of singling out a colleague, by name, for a kind of public shaming through unsupported insinuation.

As colleagues of Professor Alexander, we write in response for two principal reasons.

First, the law school community and the interested public should know that Professor Alexander is an honorable, honest man who is not in any way racist…. Just last May, Dean Ferruolo along with the deans of the Yale Law School and the University of Illinois Law School praised Professor Alexander effusively at a conference convened at Yale Law School specifically to discuss and commemorate Professor Alexander’s scholarly contributions in a variety of fields. Considering this distinguished career and unparalleled contribution to the law school, we believe it is unconscionable for a law school dean to subject Professor Alexander to this sort of public shaming.

Second, we are concerned about the harmful effects of the dean’s message for the law school community. A law school and a university should be places where the free exchange of ideas is encouraged, not inhibited…. We have been grateful to study, teach, and write at USD, where in our experience civility and a commitment to freedom of discussion have prevailed. But this commitment is seriously undermined if faculty or students come to perceive that their expression of views disfavored by some may cause them to be singled out for public disapproval by university officials.

We understand that there are limits to the freedom of expression. Anyone, including colleagues and deans, should of course feel free to challenge on the merits the views expressed by other members of the community. As noted, Dean Ferruolo’s email made no attempt to do this. In addition, a member of the university who is shown to promote racist or bigoted views or practices may deserve public censure. However, we challenge the dean or other critics to identify anything in Professor Alexander’s op-ed that expresses or endorses bigotry or “racial discrimination or cultural subordination.”…

Smith continues, in his inimitable style:

I signed onto the letter and I’m grateful to find my name in such distinguished company. More emails and no doubt facebook posts, tweets, blog posts and so forth will no doubt issue in response to these letters. I am breaching my usual dirty bird principle (from the adage, “it’s a dirty bird who fouls his (or her!) own nest”) because this controversy sounds so directly on matters I blog about, sometimes humorously and usually carefully…. [A] man or woman should be entitled to express him or herself in the public prints without having a Dean rain down a ton of politically correct nonsense on his head, for heaven’s sake…. And also, I just have to say, what Larry is calling for (get up in the morning, go to your job, don’t take drugs, don’t have kids out of wedlock, etc., etc.) is rather in line with traditional Catholic teaching, is it not? So if someone says something that is “loudly dogma[tic]”, to coin a phrase, in a newspaper, or at least is consistent with that dogma, he runs the risk of being shamed by the administration of a nominally Catholic law school? That just ain’t rat. Larry of course is not Catholic, he’s a secular Jew, but he’s advocating things that are absolutely in line with what a good or even just sort of good Catholic person would do or practice.

I must say, I feel just a teensy bit neglected myself here. Have I not said things at least as politically incorrect as Larry? What am I, chopped liver? Or whatever the WASP equivalent of chopped liver is? Bologna and mayonnaise perhaps? Celery with peanut butter? Alas, we are but a small blog. But no matter. All in all, this is just a hellova way to thank Larry, who is nearing the end of his career and has given all of it to a small law school when, at least by professional lights, he should have been at a top ten school. And I don’t see how the situation can really be put right at this point. But who knows, perhaps somehow it will be. Meanwhile, the weather finally is beautiful again here today, for what that’s worth.

As for the “racist” label that has been so freely flung at Wax and Alexander, I’ll tell you what’s racist. It’s people like Dean Steve (which is as much of an honorific as he deserves) who assert that it’s racist to advise anyone (of any race, creed, color, national origin, sexual orientation, or whatever other identifying characteristics seem to matter these days) to get a job, stick to it, work hard at it, and take responsibility for yourself.

There are lots of blacks — undoubtedly a majority of them (and many of whom I worked with) — who don’t think such attitudes are racist. But Dean Steve and his ilk seem to believe that such attitudes are racist. Which means that Dean Steve and his ilk are racists, because they believe that all blacks either (a) don’t work hard, etc., and/or (b) are affronted by the idea that hard work, etc., are virtues. How racist can you get?


Related posts:
The Euphemism Conquers All
Superiority
Non-Judgmentalism as Leftist Condescension
Retrospective Virtue-Signalling
Leftist Condescension
Leftism As Crypto-Fascism: The Google Paradigm

Why ‘s Matters

I have added this to my page, “Writing: A Guide“, as section IV.B.4.

Most newspapers and magazines follow the convention of forming the possessive of a word ending in “s” by putting an apostrophe after the “s”; for example:

Dallas’ (for Dallas’s)

Texas’ (for Texas’s)

Jesus’ (for Jesus’s)

This may work on a page or screen, but it can cause ambiguity if carried over into speech*. (Warning: I am about to take liberties with the name of Jesus and the New Testament, about which I will write as if it were a contemporary document. Read no further if you are easily offended.)

What sounds like “Jesus walks on water” could mean just what it sounds like: a statement about a feat of which Jesus is capable or is performing. But if Jesus walks on the water more than once, it could refer to his plural perambulations: “Jesus’ walks on water”**, as it would appear in a newspaper.

The simplest and best way to avoid the ambiguity is to insist on “Jesus’s walks on water”** for the possessive case, and to inculcate the practice of saying it as it reads. How else can the ambiguity be avoided, in the likely event that the foregoing advice will be ignored?

If what is meant is “Jesus walks on water”, one could say “Jesus can [is able to] walk on water” or “Jesus is walking on water”, according to the situation.

If what is meant is that Jesus walks on water more than once, “Jesus’s walks on water” is unambiguous (assuming, of course, that one’s listeners have an inkling about the standard formation of a singular possessive). There’s no need to work around it, as there is in the non-possessive case. But if you insist on avoiding the ‘s formation, you can write or say “the water-walks of Jesus”.

I now take it to the next level.

What if there’s more than one Jesus who walks on water? Well, if they all can walk on water and the idea is to say so, it’s “The Jesuses walk on water”. And if they all walk on water and the idea is to refer to those outings as the outings of them all, it’s “The water-walks of the Jesuses”.

Why? Because the standard formation of the plural possessive of Jesus is Jesuses’. Jesuses’s would be too hard to say or comprehend. But Jesuses’ sounds the same as Jesuses, and must therefore be avoided in speech, and in writing intended to be read aloud. Thus “the water-walks of the Jesuses” instead of “the Jesuses’ walks on water”, which is ambiguous to a listener.
__________
* A good writer will think about the effect of his writing if it is read aloud.

** “Jesus’ walks on water” and “Jesus’s walks on water” misuse the possessive case, though it’s a standard kind of misuse that is too deeply entrenched to be eradicated. Strictly speaking, Jesus doesn’t own walks on water; he does them. The alternative construction, “the water-walks of Jesus”, is better; “the water-walks by Jesus” is best.

Self-Made Victims

The author of Imlac’s Journal quotes Malcolm Muggeridge on George Bernard Shaw:

He wanted to make a lot of money without being considered rich.

Here is Theodore Dalrymple, writing in the same vein:

[D]uring the early years of the AIDS epidemic … it was demanded of us that we should believe incompatible things simultaneously, for example that it was simply a disease like any other and that it was a disease of unprecedented importance and unique significance; that it could strike anybody but that certain groups were martyrs to it; that it must be normalized and yet treated differently…. It was a bit like living under a small version of a communist dictatorship, in which the law of noncontradiction had been abrogated in favor of dialectics, under which all contradictions were compatible, but which contradictions had to be accepted was a matter of the official policy of the moment….

The demand for recognition and nonrecognition at the same time is surely one of the reasons for the outbreak of mass self-mutilation in the Western world in an age of celebrity. A person who treats his face and body like an ironmongery store can hardly desire or expect that you fail to notice it, but at the same time demands that you make no comment about it, draw no conclusions from it, express no aversion toward it, and treat him no differently because of it. You must accept him as he is, however he is, because he has an inalienable right to such acceptance….

I think the same dynamic (if I may call it such) is at work in the current vogue for transsexualism: “You must recognize me and not recognize me at the same time.” In this way, people can simultaneously enjoy the fruits of being normal and very different. To be merely the same as others is a wound to the ego in an age of celebrity, and yet we are herd animals who do not want to wander too far from the herd. And in an age of powerlessness we want to exert power.

What will be the next attempted reconciliation of our incompatible desires? [“Everyday Snowflakes“, Taki’s Magazine, July 15, 2017]

Good question. I don’t have a ready answer, but I have some other examples of incompatible desiderata. Each entry in the list below has two parts: (on the left) an objective that most leftists would claim to support and (on the right) the left-wing policy that hinders attainment of the objective.

Ample employment opportunities for low-skill workers – Minimum wage

Vigorous economic growth – Regulation

Property rights* and freedom of association – Public-accommodation laws

Less crime – Strict gun control or confiscation of guns*

Peace – Less defense spending (and therefore lack of deterrence)

The result of each left-wing policy is to create victims, ranging from young black men to law-abiding citizens to most Americans. The left’s constant search for “victims” is evidently hindered by intellectual myopia: it fails to see the victims that its own policies create.

Moreover, in many cases leftists are actual or potential victims of their own policy preferences. But their magical thinking (unconstrained vision) blinds them to the incompatibility of their desires.


* There are many hypocrites on the left (like Shaw) who would vigorously defend their property rights while proclaiming their attachment to socialism, and who employ guards (with guns) to protect their property.


More posts about the left and magical thinking:
The Left and Its Delusions
A Keynesian Fantasy Land
The Keynesian Fallacy and Regime Uncertainty
America: Past, Present, and Future
IQ, Political Correctness, and America’s Present Condition
The Barbarians Within and the State of the Union
The Pretence of Knowledge
“The Science Is Settled”
The Harmful Myth of Inherent Equality
“And the Truth Shall Set You Free”
The Transgender Fad and Its Consequences

Death of a Nation

More than 50 years ago I heard a white woman say of blacks, “They’re not Americans.” I was appalled by that statement, for it contradicted what I had been taught to believe about America, namely, this:

“America is not just a country,” said the rock singer Bono, in Pennsylvania in 2004: “It’s an idea.”

That’s the opening of John O’Sullivan’s essay, “A People, Not Just an Idea” (National Review, November 19, 2015).

Bono is a decent, thoughtful, and public-spirited man. I didn’t choose his quotation to suggest that this view of America is a kind of pop opinion. It just happened that in my Google search his name came ahead of many others, from George Will to Irving Kristol to almost every recent presidential candidate, all of whom had described America either as an idea or as a “proposition nation,” to distinguish it from dynastic realms or “blood and soil” ethnicities. This philosophical definition of America is now the conventional wisdom of Left and Right, at least among people who write and talk of such things.

Indeed, we have heard variations on Bono’s formulation so many times that we probably fail to notice how paradoxical it is. But listen to how it sounds when reversed: “America is not just an idea; it is a nation.” Surely that version has much more of the ring of common sense. For a nation is plainly something larger, more complex, and richer than an idea. A nation may include ideas. It may have evolved under the influence of a particular set of ideas. But because it encompasses so many other things — notably the laws, institutions, language of the nation; the loyalties, stories, and songs of the people; and above all Lincoln’s “mystic chords of memory” — the nation becomes more than an idea with every election, every battle, every hero, every heroic tale, every historical moment that millions share.

That is not to deny that the United States was founded on some very explicit political ideas, notably liberty and equality, which Jefferson helpfully wrote down in the Declaration of Independence. To be founded on an idea, however, is not the same thing as to be an idea. A political idea is not a destination or a conclusion but the starting point of an evolution — and, in the case of the U.S., not really a starting point, either. The ideas in the Declaration on which the U.S. was founded were not original to this country but drawn from the Anglo-Scottish tradition of Whiggish liberalism. Not only were these ideas circulating well before the Revolution, but when the revolutionaries won, they succeeded not to a legal and political wasteland but to the institutions, traditions, and practices of colonial America — which they then reformed rather than abolished….

As John Jay pointed out, Americans were fortunate in having the same religion (Protestantism), the same language, and the same institutions from the first. Given the spread of newspapers, railways, and democratic debate, that broad common culture would intensify the sense of a common American identity over time. It was a cultural identity more than an ethnic one, and one heavily qualified by regional loyalties… And the American identity might have become an ethnic one in time if it had not been for successive waves of immigration that brought other ethnicities into the nation.

That early American identity was robust enough to absorb these new arrivals and to transform them into Americans. But it wasn’t an easy or an uncomplicated matter. America’s emerging cultural identity was inevitably stretched by the arrivals of millions of people from different cultures. The U.S. government, private industry, and charitable organizations all set out to “Americanize” them. It was a great historical achievement and helped to create a new America that was nonetheless the old America in all essential respects….

By World War II, … all but the most recent migrants had become culturally American. So when German commandos were wandering behind American lines in U.S. uniforms during the Battle of the Bulge, the G.I.s testing their identity asked not about … the First Amendment but questions designed to expose their knowledge (or ignorance) of American life and popular culture….

Quite a lot flows from this history. Anyone can learn philosophical Americanism in a civics class; for a deeper knowledge and commitment, living in America is a far surer recipe…. Americans are a distinct and recognizable people with their own history, culture, customs, loyalties, and other qualities that are wider and more various than the most virtuous summary of liberal values….

… If Americans are a distinct people, with their own history, traditions, institutions, and common culture, then they can reasonably claim that immigrants should adapt to them and to their society rather than the reverse. For most of the republic’s history, that is what happened. And in current circumstances, it would imply that Muslim immigrants should adapt to American liberty as Catholic immigrants once did.

If America is an idea, however, then Americans are not a particular people but simply individuals or several different peoples living under a liberal constitution.

For a long time the “particular people” were not just Protestants but white Protestants of European descent. As O’Sullivan points out, Catholics (of European descent) eventually joined the ranks of “particular people”. But there are others — mostly blacks and Hispanics — who never did and never will join those ranks. Whatever the law may say about equality, access to housing, access to public accommodations, and so on, membership in the ranks of “particular people” is up to those who are already members.

The woman who claimed that blacks weren’t Americans was a member. She was a dyed-in-the-wool Southerner, but her attitude wasn’t untypical of many white Americans — Northern and Southern, past and present. Like it or not, that attitude remains prevalent in the country. (Don’t believe polls that purport to demonstrate racial comity; there’s a well-known aversion to giving a “wrong” answer to a pollster.)

The revealed preference of most whites (a preference shared by most blacks) is for racial segregation. Aggregate statistics hide the real story, which is the gentrification of some parts of inner cities (i.e., the creation of white enclaves) and “white flight” from suburbs to which inner-city blacks are fleeing. (See this article, for instance.)

The taste for segregation shows up in statistics about public-school enrollment. (See this article, for instance.) White parents (and affluent blacks) are more often keeping their children out of local public schools with large “minority” enrollments by choosing one of the alternatives legally available to them (e.g., home schooling). (Presidents with school-age children — including Barack Obama — have done the same thing to avoid sending their children to the public schools of the District of Columbia, whose students are predominantly black and Hispanic.)

I have focused on voluntary racial segregation because it underscores the fact — not lost on the white, Southern woman of my acquaintance — that the United States was once built upon the “blood and soil” ethnicity of whites whose origins lay in Europe. Blacks can never be part of that nation. Neither can Hispanics, who now outnumber blacks in America. Blacks and Hispanics belong to the “proposition” nation.

They have been joined by the large numbers of Americans who no longer claim allegiance to the “blood and soil” nation, regardless of their race or ethnicity — leftists, in other words. Since the 1960s leftists have played an ever-larger, often dominant, role in the governance of America. They have rejected the “history, culture, customs, [and] loyalties” which once bound most Americans. In fact they are working daily — through the academy, the media, and the courts — to transform America fundamentally by erasing the “history, culture, customs, [and] loyalties” of Americans from the people’s consciousness and the nation’s laws.

Pat Buchanan, who is usually too strident for my taste, hits it on the head:

In Federalist No. 2, John Jay writes of them as “one united people . . . descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs . . .”

If such are the elements of nationhood and peoplehood, can we still speak of Americans as one nation and one people?

We no longer have the same ancestors. They are of every color and from every country. We do not speak one language, but rather English, Spanish and a host of others. We long ago ceased to profess the same religion. We are Evangelical Christians, mainstream Protestants, Catholics, Jews, Mormons, Muslims, Hindus and Buddhists, agnostics and atheists.

Federalist No. 2 celebrated our unity. Today’s elites proclaim that our diversity is our strength. But is this true or a tenet of trendy ideology?

After the attempted massacre of Republican Congressmen at that ball field in Alexandria, Fareed Zakaria wrote: “The political polarization that is ripping this country apart” is about “identity . . . gender, race, ethnicity, sexual orientation (and) social class.” He might have added — religion, morality, culture and history.

Zakaria seems to be tracing the disintegration of our society to that very diversity that its elites proclaim to be its greatest attribute: “If the core issues are about identity, culture and religion … then compromise seems immoral. American politics is becoming more like Middle Eastern politics, where there is no middle ground between being Sunni or Shiite.”

Among the issues on which we Americans are at war with one another — abortion, homosexuality, same-sex marriage, white cops, black crime, Confederate monuments, LGBT rights, affirmative action.

America is no longer a nation whose inhabitants are bound mainly by “blood and soil”. Worse than that, it was — until the election of 2016 — fast becoming a nation governed by the proposition that liberty is only what leftists say it is: the liberty not to contradict the left’s positions on climate, race, intelligence, economics, religion, marriage, the right to life, and government’s intrusive role in all of those things and more. The resistance to Donald Trump is fierce and unforgiving because his ascendancy threatens what leftists have worked so hard to achieve in the last 50 years: the de-Americanization of America.

Is all of this just the grumbling of white men of European descent? I think not. Measures of national unity are hard to come by. Opinion polls, aside from their relatively brief history (compared with the age of the Union), are notoriously unreliable. Presidential elections are more meaningful because (some degree of chicanery aside) they reflect voters’ feelings about the state of the Union. Regardless of the party affiliation of the winning candidate, a strong showing usually reflects broad satisfaction with the nation’s direction; a weak showing usually reflects the opposite.

Popular votes were first recorded in the election of 1824. Here is a graphical history of the winning candidate’s percentages of the vote in each election from 1824 through 2016 (with the exclusion of 1864, when the South wasn’t in the Union):


Derived from this table in this article at Wikipedia.

Election-to-election variations reflect the personal popularity of some candidates, the strength of third-party movements, and various other transitory factors. The 5-election average smooths those effects and reveals what is (to me) an obvious story: national disunity in the years before and after the Civil War; growing unity during the first half of the 20th century, peaking during the Great Depression and World War II; modest post-war decline followed by stability through the 1980s; and rapid decline since then because of the left’s growing power and the rapid rise of the Hispanic population.
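
For readers who want to reproduce the smoothing, here is a minimal sketch in Python. It assumes the winning candidates’ popular-vote shares have been transcribed from the Wikipedia table into a list of (year, percentage) pairs; the few values shown are illustrative placeholders, not the full 1824–2016 series.

# Sketch: trailing 5-election average of winning candidates' vote shares.
# The `results` list is a placeholder; fill it in from the Wikipedia table
# cited above (1824-2016, omitting 1864 as in the chart).
results = [
    (2000, 47.9), (2004, 50.7), (2008, 52.9), (2012, 51.1), (2016, 46.1),
]

def trailing_average(values, window=5):
    """Return trailing averages over `window` items (None until enough data)."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window : i + 1]) / window)
    return out

shares = [share for _, share in results]
for (year, share), avg in zip(results, trailing_average(shares)):
    label = f"{avg:.1f}%" if avg is not None else "n/a"
    print(f"{year}: winner {share:.1f}%, 5-election average {label}")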

The graph underscores what I already knew: The America in which I was born and raised — the America of the 1940s and 1950s — has been beaten down. It is more likely to die than it is to revive. And even if it revives to some degree, it will never be the same.


Related posts:
Academic Bias
Intellectuals and Capitalism
“Intellectuals and Society”: A Review
The Left’s Agenda
The Left and Its Delusions
The Spoiled Children of Capitalism
Politics, Sophistry, and the Academy
Subsidizing the Enemies of Liberty
Are You in the Bubble?
Liberty and Society
The Eclipse of “Old America”
Genetic Kinship and Society
The Culture War
Ruminations on the Left in America
Academic Ignorance
The Euphemism Conquers All
Defending the Offensive
Superiority
Whiners
A Dose of Reality
God-Like Minds
Non-Judgmentalism as Leftist Condescension
An Addendum to (Asymmetrical) Ideological Warfare
Retrospective Virtue-Signalling
The Left and Violence
Four Kinds of “Liberals”
Leftist Condescension
Class in America
A Word of Warning to Leftists (and Everyone Else)
Another Thought or Two about Class
The Vast Left-Wing Conspiracy
The Left and Evergreen State: Reaping What Was Sown

Academic Freedom, Freedom of Speech, and the Demise of Civility

A professor has been suspended after claiming that anyone witnessing white people in mortal danger should “let them fucking die.”

Daniel Payne, The College Fix (June 27, 2017)

Predictably, the suspension of the professor — one Johnny Eric Williams of Trinity College in Hartford, Connecticut — and a similar case at Essex County (New Jersey) College caused the usual (left-wing) suspects to defend the offenders and claim that their “academic freedom” and “freedom of speech” were being violated. (Boo hoo.) I won’t be surprised if the ACLU weighs in on the side of hate.

This is what happens when the law becomes an abstraction, separate and apart from social norms. There is no better example of the degradation of the law, and of public discourse, than the case of Snyder v. Phelps, which I addressed in “Rethinking the Constitution: ‘Freedom of Speech and of the Press’“. What follows is based on that post.

Contrary to the current state of constitutional jurisprudence, freedom of speech and freedom of the press — and, by implication, academic freedom — do not comprise an absolute license to “express” almost anything, regardless of the effects on the social fabric.

One example of misguided absolutism is found in Snyder v. Phelps, a case wrongly decided by the U.S. Supreme Court in 2011. This is from “The Burkean Justice” (The Weekly Standard, July 18, 2011):

When the Supreme Court convened for oral argument in Snyder v. Phelps, judicial formalities only thinly veiled the intense bitterness smoldering among the parties and their supporters. At one table sat counsel for Albert Snyder, father of the late Marine Lance Corporal Matthew Snyder, who was killed in al Anbar Province, Iraq. At the other sat Margie Phelps, counsel for (and daughter of) Fred Phelps, whose notorious Westboro Baptist Church descended upon Snyder’s Maryland funeral, waving signs bearing such startlingly offensive slogans as “Thank God for IEDs,” “God Hates Fags,” and “Thank God for Dead Soldiers.” A federal jury had awarded Snyder nearly $11 million for the “severe depression” and “exacerbated preexisting health conditions” that Phelps’s protest had caused him.

In the Supreme Court, Phelps argued that the jury’s verdict could not stand because the First Amendment protected Westboro’s right to stage their protest outside the funeral. As the Court heard the case on a gray October morning, Westboro protesters marched outside the courthouse, informing onlookers that God still “Hates Fags” and advising them to “Pray for More Dead Soldiers.”

Amidst that chaos, the Court found not division, but broad agreement. On March 2, 2011, it held that Westboro’s slurs were protected by the First Amendment, and that Snyder would receive no compensation, let alone punitive damages, for the emotional injuries that he had suffered. Chief Justice John Roberts wrote the Court’s opinion, speaking for all of his brethren, conservatives and liberals alike—except one.

Justice Samuel Alito rejected the Court’s analysis and wrote a stirring lone dissent. “The Court now holds that the First Amendment protected respondents’ right to brutalize Mr. Snyder. I cannot agree.” Repeatedly characterizing Westboro’s protest as not merely speech but “verbal assaults” that “brutally attacked” the fallen Snyder and left the father with “wounds that are truly severe and incapable of healing themselves,” Justice Alito concluded that the First Amendment’s text and precedents did not bar Snyder’s lawsuit. “In order to have a society in which public issues can be openly and vigorously debated, it is not necessary to allow the brutalization of innocent victims. .  .  . I therefore respectfully dissent.”

There is more:

Snyder v. Phelps would not be the last time that Alito stood nearly alone in a contentious free speech case this term. Just weeks ago, as the Court issued its final decisions of the term, Alito rejected the Court’s broad argument that California could not ban the distribution of violent video games without parental consent. Although he shared the Court’s bottom-line conclusion that the particular statute at issue was unconstitutional, he criticized the majority’s analysis in Brown v. Entertainment Merchants Association as failing to give states and local communities latitude to promote parental control over children’s video-game habits. The states, he urged, should not be foreclosed from passing better-crafted statutes achieving that legitimate end.

Moreover, Alito’s opinions in those cases followed a solo dissent late in the previous term, in United States v. Stevens, where eight of the nine justices struck down a federal law barring the distribution of disturbing “crush videos” in which, for example, a woman stabs a kitten through the eye with her high heel, all for the gratification of anonymous home audiences.

The source of Alito’s positions:

[T]hose speculating as to the roots of Alito’s jurisprudence need look no further than his own words—in public documents, at his confirmation hearing, and elsewhere. Justice Alito is uniquely attuned to the space that the Constitution preserves for local communities to defend the vulnerable and to protect traditional values. In these three new opinions, more than any others, he has emerged as the Court’s Burkean justice….

A review of Alito’s Snyder, Brown, and Stevens opinions quickly suggests the common theme: Alito, more than any of his colleagues, would not allow broad characterizations of the freedom of speech effectively to immunize unlawful actions. He sharply criticized the Court for making generalized pronouncements on the First Amendment’s reach, when the Court’s reiterations of theory glossed over the difficult factual questions that had given rise to regulation in the first place—whether in grouping brutal verbal attacks with protected political speech; or in equating interactive Duke Nukem games with the text of Grimm’s Fairy Tales; or in extending constitutional protection to the video of women illegally crushing animals. And Alito was particularly sensitive to the Court’s refusal to grant at least a modicum of deference to the local communities and state officials who were attempting to protect their populations against actions that they found so injurious as to require state intervention.

A general and compelling case against the current reign of absolutism is made by David Lowenthal in No Liberty for License: The Forgotten Logic of the First Amendment. My copy is now in someone else’s hands, so I must rely on Edward J. Erler’s review of the book:


Liberty is lost when the law allows “freedom of speech, and of the press” to undermine the social norms that enable liberty. Liberty is not an abstraction; it is the scope of action that is allowed by socially agreed-upon rights. It is that restrained scope of action which enables people to coexist willingly, peacefully, and cooperatively for their mutual benefit. Such coexistence depends greatly on mutual trust, respect, and forbearance. Liberty is therefore necessarily degraded when courts sunder social restraints in the name of liberty.


Other related posts:
On Liberty
Line-Drawing and Liberty
The Meaning of Liberty
Positive Liberty vs. Liberty
Facets of Liberty
Burkean Libertarianism
What Is Libertarianism?
True Libertarianism, One More Time
Human Nature, Liberty, and Rationalism
Liberty, Negative Rights, and Bleeding Hearts
Why Conservatism Works
Liberty and Society
Liberty as a Social Construct: Moral Relativism?
Defending Liberty against (Pseudo) Libertarians
Defining Liberty
The Futile Search for “Natural Rights”
The Pseudo-Libertarian Temperament
Parsing Political Philosophy (II)
Getting Liberty Wrong
Libertarianism and the State
“Liberalism” and Personal Responsibility
My View of Libertarianism
More About Social Norms and Liberty
The War on Conservatism
Social Justice vs. Liberty
Economically Liberal, Socially Conservative
The Harm Principle Revisited: Mill Conflates Society and State
Liberty and Social Norms Re-examined
Natural Law, Natural Rights, and the Real World
Rescuing Conservatism
If Men Were Angels
The Left and Evergreen State: Reaping What Was Sown