Regarding Napoleon Chagnon

Napoleon Alphonseau Chagnon (1938-2019) was a noted anthropologist to whom the label “controversial” was applied. Some of the story is told in this surprisingly objective New York Times article about Chagnon’s life and death. Matthew Blackwell gives a more complete account in “The Dangerous Life of an Anthropologist” (Quillette, October 5, 2019).

Chagnon’s sin was his finding that “nature” trumped “nurture”, as demonstrated by his decades-long ethnographic field work among the Yanomamö, indigenous Amazonians who live in the border area between Venezuela and Brazil. As Blackwell tells it,

Chagnon found that up to 30 percent of all Yanomamö males died a violent death. Warfare and violence were common, and duelling was a ritual practice, in which two men would take turns flogging each other over the head with a club, until one of the combatants succumbed. Chagnon was adamant that the primary causes of violence among the Yanomamö were revenge killings and women. The latter may not seem surprising to anyone aware of the ubiquity of ruthless male sexual competition in the animal kingdom, but anthropologists generally believed that human violence found its genesis in more immediate matters, such as disputes over resources. When Chagnon asked the Yanomamö shaman Dedeheiwa to explain the cause of violence, he replied, “Don’t ask such stupid questions! Women! Women! Women! Women! Women!” Such fights erupted over sexual jealousy, sexual impropriety, rape, and attempts at seduction, kidnap and failure to deliver a promised girl….

Chagnon would make more than 20 fieldwork visits to the Amazon, and in 1968 he published Yanomamö: The Fierce People, which became an instant international bestseller. The book immediately ignited controversy within the field of anthropology. Although it commanded immense respect and became the most commonly taught book in introductory anthropology courses, the very subtitle of the book annoyed those anthropologists who preferred to give their monographs titles like The Gentle Tasaday, The Gentle People, The Harmless People, The Peaceful People, Never in Anger, and The Semai: A Nonviolent People of Malaya. The stubborn tendency within the discipline was to paint an unrealistic façade over such cultures—although 61 percent of Waorani men met a violent death, an anthropologist nevertheless described this Amazonian people as a “tribe where harmony rules,” on account of an “ethos that emphasized peacefulness.”…

These anthropologists were made more squeamish still by Chagnon’s discovery that the unokai of the Yanomamö—men who had killed and assumed a ceremonial title—had about three times more children than others, owing to having twice as many wives. Drawing on this observation in his 1988 Science article “Life Histories, Blood Revenge, and Warfare in a Tribal Population,” Chagnon suggested that men who had demonstrated success at a cultural phenomenon, the military prowess of revenge killings, were held in higher esteem and considered more attractive mates. In some quarters outside of anthropology, Chagnon’s theory came as no surprise, but its implication for anthropology could be profound. In The Better Angels of Our Nature, Steven Pinker points out that if violent men turn out to be more evolutionarily fit, “This arithmetic, if it persisted over many generations, would favour a genetic tendency to be willing and able to kill.”…

Chagnon considered his most formidable critic to be the eminent anthropologist Marvin Harris. Harris had been crowned the unofficial historian of the field following the publication of his all-encompassing work The Rise of Anthropological Theory. He was the founder of the highly influential materialist school of anthropology, and argued that ethnographers should first seek material explanations for human behavior before considering alternatives, as “human social life is a response to the practical problems of earthly existence.” Harris held that the structure and “superstructure” of a society are largely epiphenomena of its “infrastructure,” meaning that the economic and social organization, beliefs, values, ideology, and symbolism of a culture evolve as a result of changes in the material circumstances of a particular society, and that apparently quaint cultural practices tend to reflect man’s relationship to his environment. For instance, prohibition on beef consumption among Hindus in India is not primarily due to religious injunctions. These religious beliefs are themselves epiphenomena to the real reasons: that cows are more valuable for pulling plows and producing fertilizers and dung for burning. Cultural materialism places an emphasis on “-etic” over “-emic” explanations, ignoring the opinions of people within a society and trying to uncover the hidden reality behind those opinions.

Naturally, when the Yanomamö explained that warfare and fights were caused by women and blood feuds, Harris sought a material explanation that would draw upon immediate survival concerns. Chagnon’s data clearly confirmed that the larger a village, the more likely fighting, violence, and warfare were to occur. In his book Good to Eat: Riddles of Food and Culture Harris argued that fighting occurs more often in larger Yanomamö villages because these villages deplete the local game levels in the rainforest faster than smaller villages, leaving the men no option but to fight with each other or to attack outside groups for meat to fulfil their protein macronutrient needs. When Chagnon put Harris’s materialist theory to the Yanomamö they laughed and replied, “Even though we like meat, we like women a whole lot more.” Chagnon believed that smaller villages avoided violence because they were composed of tighter kin groups—those communities had just two or three extended families and had developed more stable systems of borrowing wives from each other.

There’s more:

Survival International … has long promoted the Rousseauian image of a traditional people who need to be preserved in all their natural wonder from the ravages of the modern world. Survival International does not welcome anthropological findings that complicate this harmonious picture, and Chagnon had wandered straight into their line of fire….

For years, Survival International’s Terence Turner had been assisting a self-described journalist, Patrick Tierney, as the latter investigated Chagnon for his book, Darkness in El Dorado: How Scientists and Journalists Devastated the Amazon. In 2000, as Tierney’s book was being readied for publication, Turner and his colleague Leslie Sponsel wrote to the president of the American Anthropological Association (AAA) and informed her that an unprecedented crisis was about to engulf the field of anthropology. This, they warned, would be a scandal that, “in its scale, ramifications, and sheer criminality and corruption, is unparalleled in the history of Anthropology.” Tierney alleged that Chagnon and [the geneticist James] Neel had spread measles among the Yanomamö in 1968 by using compromised vaccines, and that Chagnon’s documentaries depicting Yanomamö violence were faked by using Yanomamö to act out dangerous scenes, in which further lives were lost. Chagnon was blamed, inter alia, for inciting violence among the Yanomamö, cooking his data, starting wars, and aiding corrupt politicians. Neel was also accused of withholding vaccines from certain populations of natives as part of an experiment. The media were not slow to pick up on Tierney’s allegations, and the Guardian ran an article under an inflammatory headline accusing Neel and Chagnon of eugenics: “Scientists ‘killed Amazon Indians to test race theory.’” Turner claimed that Neel believed in a gene for “leadership” and that the human genetic stock could be upgraded by wiping out mediocre people. “The political implication of this fascistic eugenics,” Turner told the Guardian, “is clearly that society should be reorganised into small breeding isolates in which genetically superior males could emerge into dominance, eliminating or subordinating the male losers.”

By the end of 2000, the American Anthropological Association announced a hearing on Tierney’s book. This was not entirely reassuring news to Chagnon, given their history with anthropologists who failed to toe the party line….

… Although the [AAA] taskforce [appointed to investigate Tierney’s accusations] was not an “investigation” concerned with any particular person, for all intents and purposes, it blamed Chagnon for portraying the Yanomamö in a way that was harmful and held him responsible for prioritizing his research over their interests.

Nonetheless, the most serious claims Tierney made in Darkness in El Dorado collapsed like a house of cards. Elected Yanomamö leaders issued a statement in 2000 stating that Chagnon had arrived after the measles epidemic and saved lives: “Dr. Chagnon—known to us as Shaki—came into our communities with some physicians and he vaccinated us against the epidemic disease which was killing us. Thanks to this, hundreds of us survived and we are very thankful to Dr. Chagnon and his collaborators for help.” Investigations by the American Society of Human Genetics and the International Genetic Epidemiology Society both found Tierney’s claims regarding the measles outbreak to be unfounded. The Society of Visual Anthropology reviewed the so-called faked documentaries, and determined that these allegations were also false. Then an independent preliminary report released by a team of anthropologists dissected Tierney’s book claim by claim, concluding that all of Tierney’s most important assertions were either deliberately fraudulent or, at the very least, misleading. The University of Michigan reached the same conclusion. “We are satisfied,” its Provost stated, “that Dr. Neel and Dr. Chagnon, both among the most distinguished scientists in their respective fields, acted with integrity in conducting their research… The serious factual errors we have found call into question the accuracy of the entire book [Darkness in El Dorado] as well as the interpretations of its author.” Academic journal articles began to proliferate, detailing the flawed inquiry and conclusions of the 2002 taskforce. By 2005, only three years later, the American Anthropological Association voted to withdraw the 2002 taskforce report, re-exonerating Chagnon.

A 2000 statement by the leaders of the Yanomamö and their Ye’kwana neighbours called for Tierney’s head: “We demand that our national government investigate the false statements of Tierney, which taint the humanitarian mission carried out by Shaki [Chagnon] with much tenderness and respect for our communities.” The investigation never occurred, but Tierney’s public image lay in ruins and would suffer even more at the hands of historian of science Alice Dreger, who interviewed dozens of people involved in the controversy. Although Tierney had thanked a Venezuelan anthropologist for providing him with a dossier of information on Chagnon for his book, the anthropologist told Dreger that Tierney had actually written the dossier himself and then misrepresented it as an independent source of information.

A “dossier” and its use to smear an ideological opponent. Where else have we seen that?

Returning to Blackwell:

Scientific American has described the controversy as “Anthropology’s Darkest Hour,” and it raises troubling questions about the entire field. In 2013, Chagnon published his final book, Noble Savages: My Life Among Two Dangerous Tribes—The Yanomamö and the Anthropologists. Chagnon had long felt that anthropology was experiencing a schism more significant than any difference between research paradigms or schools of ethnography—a schism between those dedicated to the very science of mankind, anthropologists in the true sense of the word, and those opposed to science; either postmodernists vaguely defined, or activists disguised as scientists who seek to place indigenous advocacy above the pursuit of objective truth. Chagnon identified Nancy Scheper-Hughes as a leader in the activist faction of anthropologists, citing her statement that we “need not entail a philosophical commitment to Enlightenment notions of reason and truth.”

Whatever the rights and wrongs of his debates with Marvin Harris across three decades, Harris’s materialist paradigm was a scientifically debatable hypothesis, which caused Chagnon to realize that he and his old rival shared more in common than they did with the activist forces emerging in the field: “Ironically, Harris and I both argued for a scientific view of human behavior at a time when increasing numbers of anthropologists were becoming skeptical of the scientific approach.”…

Both Chagnon and Harris agreed that anthropology’s move away from being a scientific enterprise was dangerous. And both believed that anthropologists, not to mention thinkers in other fields of social sciences, were disguising their increasingly anti-scientific activism as research by using obscurantist postmodern gibberish. Observers have remarked on how abstruse humanities research has become, and even a world-famous linguist like Noam Chomsky admits, “It seems to me to be some exercise by intellectuals who talk to each other in very obscure ways, and I can’t follow it, and I don’t think anybody else can.” Chagnon resigned his membership of the American Anthropological Association in the 1980s, stating that he no longer understood the “unintelligible mumbo jumbo of postmodern jargon” taught in the field. In his last book, Theories of Culture in Postmodern Times, Harris virtually agreed with Chagnon. “Postmodernists,” he wrote, “have achieved the ability to write about their thoughts in a uniquely impenetrable manner. Their neo-baroque prose style with its inner clauses, bracketed syllables, metaphors and metonyms, verbal pirouettes, curlicues and figures is not a mere epiphenomenon; rather, it is a mocking rejoinder to anyone who would try to write simple intelligible sentences in the modernist tradition.”…

The quest for knowledge of mankind has in many respects become unrecognizable in the field that now calls itself anthropology. According to Chagnon, we’ve entered a period of “darkness in cultural anthropology.” With his passing, anthropology has become darker still.

I recount all of this for three reasons. First, Chagnon’s findings testify to the immutable urge to violence that lurks within human beings, and to the dominance of “nature” over “nurture”. That dominance is evident not only in the urge to violence (pace Steven Pinker), but in the strong heritability of such traits as intelligence.

The second reason for recounting Chagnon’s saga is to underline the corruption of science in the service of left-wing causes. The underlying problem is always the same: When science — testable and tested hypotheses based on unbiased observations — challenges left-wing orthodoxy, left-wingers — many of them so-called scientists — go all out to discredit real scientists. And they do so by claiming, in good Orwellian fashion, to be “scientific”. (I have written many posts about this phenomenon.) Leftists are, in fact, delusional devotees of magical thinking.

The third reason for my interest in the story of Napoleon Chagnon is a familial connection of sorts. He was born in a village where his grandfather, also Napoleon Chagnon, was a doctor. My mother was one of ten children, most of them born and all of them raised in the same village. When the tenth child was born, he was given Napoleon as his middle name, in honor of Doc Chagnon.

More Presidential Trivia

The modern presidency began with the adored “activist”, Teddy Roosevelt. From TR to the present, there have been only four (of twenty) presidents who first competed in a general election as a candidate for the presidency: Taft, Hoover, Eisenhower, and Trump. Trump is alone in having had no previous governmental service before becoming president. There’s no moral to this story. Make of it what you will.

(See also “Presidents: Key Dates and Various Trivia”, to which this commentary has been added.)

Viewing Recommendations: TV Series and Mini-Series

My wife and I have watched many a series and mini-series. Some of them predate the era of VHS, DVD, and streaming, though much of the older fare is now available on DVD (and sometimes on streaming media). Our long list of favorites includes these (right-click a link to open it in a new tab):

Better Call Saul
Rumpole of the Bailey
Slings and Arrows
Pride and Prejudice
Cold Lazarus
Karaoke
Love in a Cold Climate
Oliver Twist
Bleak House
The Six Wives of Henry VIII
Danger UXB
Lonesome Dove
Sunset Song
Lillie
Vienna 1900
The Durrells in Corfu
The Wire
The Glittering Prizes
Bron/Broen
Wallander
Little Dorrit
Justified
Cracker
Pennies from Heaven
Mad Men
The Sopranos
Charters & Caldicott
Reckless
Our Mutual Friend
The First Churchills
The Unpleasantness at the Bellona Club
Murder Must Advertise
The Nine Tailors
Cakes and Ale
Madame Bovary
I, Claudius
Smiley’s People
Reilly: Ace of Spies
Prime Suspect
The Norman Conquests
Bramwell
Prime Suspect 2
Prime Suspect 3
Mystery!: Cadfael
Prime Suspect 5: Errors of Judgement
David Copperfield
Prime Suspect 6: The Last Witness
The Forsyte Saga
Elizabeth R
Jude the Obscure
Clouds of Witness
Country Matters
Notorious Woman
Five Red Herrings
Anna Karenina
Brideshead Revisited
To Serve Them All My Days

If you have more than a passing acquaintance with this genre, you will recognize that almost all of the fare is British. The Brits seem to have a near-lock on good acting and literate and clever writing.

Alas, of the series listed above, only Better Call Saul, Bron/Broen, and The Durrells in Corfu are still running. The Durrells comes to an end this fall for U.S. viewers (Brits have already seen the final season). The final season of Bron/Broen has also aired in Europe, but isn’t yet available in the U.S.

As for Better Call Saul, the fifth season of which will air in 2020, there are rumors of a sixth and final season to follow.

Enjoy!

Rooted in the Real World of Real People

I am far from nostalgic about my home town. But it’s still my home town, and I often revisit it in my mind’s eye.

The only places that I mentally revisit with pleasure are the first home that I can remember — where I lived from age 1 to age 7 — and the first of the three red-brick school houses that I attended.

I haven’t been to my home town in four years; my last visit was for the funeral of my mother, who lived to the age of 99.

I may not go back again. But it’s still my home town.

I think of it that way not only because I grew up there but also because it’s a “real” place: a small, mostly run-down, Midwestern city with a population of about 30,000 — the largest city in a county that lies beyond the fringes of the nearest metropolitan area.

Perhaps I’m nostalgic about it, after all, because “real” places like my home town seem to be vanishing from the face of America. By real, I mean places where (real) people still work with their hands; live in houses that are older than they are, and have fewer bathrooms than bedrooms; mow their own lawns, clean their own homes, and make their own meals (except when they partake of the American Legion fish fry or go to a Chick-Fil-A); bowl, hunt, fish, stroll their neighborhoods and know their neighbors (who have been their neighbors for decades); read Reader’s Digest, Popular Mechanics, and romance novels; go to bars that lack ferns and chrome; prefer Fox News and country music to NPR, CNN, MSNBC, and hip-hop; go to church and say grace before meals; and vote for politicians who don’t think of real people as racists, ignoramuses, gun nuts, or religious zealots (“deplorables”, in other words).

In fact, America is (or was) those real places with real people in them. And it is vanishing with them.

P.S. I have lived outside the real world of real people for a very long time, but the older I get, the more I miss it.

Organized?

I see ads on TV (with sound muted), at shopping websites, and in periodicals for organizing systems and services. And I wonder who buys such things. It can’t be persons who are organized; they don’t need them. So it must be persons who are disorganized, who benefit from them briefly and then go back to their old ways.

Sort of related, and worth a visit if you like trivia, is a post of mine from two years ago: “You Can Look That Up in Your Funk & Wagnall’s”.

Summer School?

Summer has long been my favorite season, not least because it meant summer vacation for many years. In those days of yore, school stayed in session until mid-June and didn’t resume until after Labor Day. (In fact, my college was on the quarter system, and school didn’t resume until late September.) Does anyone know why school (in large swaths of the country) now ends in early May and resumes in mid-to-late August? It doesn’t make sense to me because (1) there’s still cool, rainy weather in May, (2) there’s still a lot of summer left after school resumes, and (3) taxpayers must be paying for a lot of extra air-conditioning as a result of (1) and (2).

There are explanations for this idiocy (e.g., here), but I find them unpersuasive and rather like explanations of how the tail wags the dog.

Knot for Me

I was amused by a photo of Jeff Bezos sporting a Full Windsor knot.

(A compensatory device, perhaps?)

When I first learned to tie a necktie, more than 60 years ago, I used what is properly called a Half Windsor Knot (though it is often called a Windsor Knot). The Half Windsor is neater and more elegant than the Full Windsor, which looks like a chin-cushion.

But when I began working in a professional setting, where necktie wearing was then (early 1960s) de rigueur, I adopted the four-in-hand knot, which is faster and easier to tie than either of the Windsors. The article linked to in the preceding sentence alleges that the four-in-hand is “notably asymmetric”. But it isn’t if one is careful about pulling the knot up into the “notch” between collar points, and sticks to straight-collar shirts (which also lend a more professional appearance than spread collars and button-downs).

In fact, a properly tied four-in-hand is more elegant than its cumbersome Windsor rivals. For one thing, the four-in-hand knot doesn’t overwhelm the long part of the tie, which (if one has good taste in ties) is what one wants to show off.  In addition, the four-in-hand lends itself to a neat dimple, which can be achieved with the Half Windsor but not the Full Windsor.

The neat (centered) dimple says: “I am a fastidious person” — and I am.

End of a Generation

The so-called greatest generation has died out in my family, as it soon will die out across the land. The recent death of my mother-in-law at age 98 removed from the scene the last of my wife’s and my parents and their siblings: 26 of them in all.

Their birth years ranged from 1903 to 1922. There were, oddly, 18 males as against only 8 females, and the disparity held for all four sets of siblings:

7 to 3 for my mother’s set

2 to 1 for my father’s set

5 to 3 for my wife’s mother’s set

4 to 1 for my wife’s father’s set.

Only one of the 26 died before reaching adulthood (my father’s younger brother, at 18 months). Two others (also males) died relatively young. One of my mother’s brothers died just a few weeks before his 40th birthday as a result of a jeep accident (he was on active duty in the Coast Guard). One of my wife’s mother’s brothers died at age 48 as a long-delayed result of a blow to the head by a police truncheon.

The other 15 males lived to ages ranging from 65 to 96, with an average age at death of 77 years. The 8 females lived to ages ranging from 69 to 99, with an average age at death of 87 years. The longest-lived of the males was the only one to pass the 90 mark. Four of the females lived into their 90s, dying at ages 91, 96, 98, and 99.
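For what it’s worth, the sibling tallies above can be checked with a few throwaway lines of Python (the per-set counts are exactly those given in the text):

```python
# Males and females in each of the four sets of siblings, as given above.
sets = {
    "mother's set": (7, 3),
    "father's set": (2, 1),
    "wife's mother's set": (5, 3),
    "wife's father's set": (4, 1),
}

males = sum(m for m, _ in sets.values())
females = sum(f for _, f in sets.values())

print(males, females, males + females)  # 18 8 26
```

The per-set disparities do indeed sum to 18 males against 8 females, 26 siblings in all.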

All of the 25 who reached adulthood also married. Only two of them had a marriage end in divorce. All of them were raised in near-poverty or in somewhat comfortable circumstances that vanished with the onset of the Great Depression. All of them worked hard, whether in the home or outside of it; none of them went on welfare; most of the men and two of the women served in uniform during World War II.

Thus passeth a generation sui generis.

Where are Elmer, Herman, Bert, Tom and Charley,
The weak of will, the strong of arm, the clown, the boozer, the fighter?
All, all, are sleeping on the hill….

Where are Ella, Kate, Mag, Lizzie and Edith,
The tender heart, the simple soul, the loud, the proud, the happy one?
All, all, are sleeping on the hill.

Edgar Lee Masters, Spoon River Anthology (“The Hill”)

A Summing Up

This post has been updated and moved to “Favorite Posts”.

V-J Day Stirs Memories

V-J Day in the United States commemorates the official surrender of Japan to the Allied Forces, and the end of World War II. The surrender ceremony took place on September 2, 1945 (the date in Japan), beginning at 9:00 a.m. Tokyo time. The ceremony was held in Tokyo Bay, aboard the U.S.S. Missouri, and was presided over by General Douglas MacArthur.

Though it was actually September 1 in the United States at the time of the ceremony, V-J Day is traditionally observed in the U.S. on September 2.

The Monday after the surrender was Labor Day in the U.S. And in those more civilized times (barbarous wars aside), school began on the day after Labor Day.

On September 4, 1945 (the day after Labor Day), I entered kindergarten at Polk School, at the age of 4-2/3 years.

In those innocent days, students got to school and back home by walking. As a kindergartener, I walked several blocks each way.

A 4-year-old walking several blocks between home and school, usually alone most of the way? Unheard of today, it seems. But that was a different time, in many ways.

For more, see “The Passing of Red-Brick Schoolhouses and a Way of Life”.

Recommended Reading

Leftism, Political Correctness, and Other Lunacies (Dispatches from the Fifth Circle Book 1)

 

On Liberty: Impossible Dreams, Utopian Schemes (Dispatches from the Fifth Circle Book 2)

 

We the People and Other American Myths (Dispatches from the Fifth Circle Book 3)

 

Americana, Etc.: Language, Literature, Movies, Music, Sports, Nostalgia, Trivia, and a Dash of Humor (Dispatches from the Fifth Circle Book 4)

“Look That Up in Your Funk & Wagnall’s”

The title of this post is a catchphrase from Laugh-In (1968-1973), a weekly comedy show that I sometimes found funny, sometimes found amusing, and often found stupid. I bring it up because my parents owned a Funk & Wagnall’s encyclopedia. It was of a 1940s vintage, though I don’t remember perusing it until 1953, when we moved to a house with a built-in living-room bookcase, where the volumes occupied a prominent spot.

In any event, I looked at and into the encyclopedia so often from 1953 until I left for college in 1958 that I still remember the alphabetic divisions noted on the spines of the volumes:

I recall that there was also a final volume which contained a comprehensive index. And there were some “yearbook” updates.

Note the preponderance of words beginning with letters in the first half of the alphabet. Entries beginning with the letters “n” through “z” occupy only 7-plus of the 25 volumes.

What happened to the set? I don’t know. My mother moved out of the house in 1990, not long after the death of my father. Her next abode was much smaller, and the encyclopedia wasn’t in it. I never thought to ask her what happened to it. And now she is beyond asking — having died a few years ago at the age of 99.

Thank you for indulging this bit of nostalgia.

A Personality Test: Which Antagonist Do You Prefer?

1. Archangel Michael vs. Lucifer (good vs. evil)

2. David vs. Goliath (underdog vs. bully)

3. Alexander Hamilton vs. Aaron Burr (a slippery politician vs. a slippery politician-cum-traitor)

4. Richard Nixon vs. Alger Hiss (a slippery politician vs. a traitorous Soviet spy)

5. Sam Ervin vs. Richard Nixon (an upholder of the Constitution vs. a slippery politician)

6. Kenneth Starr vs. Bill Clinton (a straight arrow vs. a slippery politician)

7. Elmer Fudd vs. Bugs Bunny (a straight arrow with a speech impediment vs. a rascally rabbit)

8. Jerry vs. Tom (a clever mouse vs. a dumb but determined cat)

9. Tweety Bird vs. Sylvester the Cat (a devious bird vs. a predatory cat)

10. Road Runner vs. Wile E. Coyote (a devious bird vs. a stupid canine)

11. Rocky & Bullwinkle vs. Boris & Natasha (fun-loving good guys vs. funny bad guys)

12. Dudley Do-Right vs. Snidely Whiplash (a straight arrow vs. a stereotypical villain)

Summarize and explain your choices in the comments. Suggestions for other pairings are welcome.

The Midwest Is a State of Mind

I am a son of the Middle Border,* now known as the Midwest. I left the Midwest, in spirit, almost 60 years ago, when I matriculated at a decidedly cosmopolitan State university. It was in my home State, but not much of my home State.

Where is the Midwest? According to Wikipedia, the U.S. Census Bureau defines the Midwest as comprising 12 States.

They are, from north to south and west to east, North Dakota, South Dakota, Nebraska, Kansas, Minnesota, Iowa, Missouri, Wisconsin, Illinois, Michigan, Indiana, and Ohio.

In my experience, the Midwest really begins on the west slope of the Appalachians and includes much of New York State and Pennsylvania. I have lived and traveled in that region, and found it, culturally, to be much like the part of “official” Midwest where I was born and raised.

I am now almost 60 years removed from the Midwest (except for a three-year sojourn in the western part of New York State, near the Pennsylvania border). Therefore, I can’t vouch for the currency of a description that appears in Michael Dirda’s review of Jon K. Lauck’s From Warm Center to Ragged Edge: The Erosion of Midwestern Literary and Historical Regionalism, 1920-1965 (Iowa and the Midwest Experience). Dirda writes:

[Lauck] surveys “the erosion of Midwestern literary and historical regionalism” between 1920 and 1965. This may sound dull as ditch water to those who believe that the “flyover” states are inhabited largely by clodhoppers, fundamentalist zealots and loudmouthed Babbitts. In fact, Lauck’s aim is to examine “how the Midwest as a region faded from our collective imagination” and “became an object of derision.” In particular, the heartland’s traditional values of hard work, personal dignity and loyalty, the centrality it grants to family, community and church, and even the Jeffersonian ideal of a democracy based on farms and small land-holdings — all these came to be deemed insufferably provincial by the metropolitan sophisticates of the Eastern Seaboard and the lotus-eaters of the West Coast.

That was the Midwest of my childhood and adolescence. I suspect that the Midwest of today is considerably different. American family life is generally less stable than it was 60 years ago; Americans generally are less church-going than they were 60 years ago; and social organizations are less robust than they were 60 years ago. The Midwest cannot have escaped two generations of social and cultural upheaval fomented by the explosion of mass communications, the debasement of mass culture, the rise of the drugs-and-rock culture, the erasure of social norms by government edicts, and the creation of a culture of dependency on government.

I nevertheless believe that there is a strong, residual longing for and adherence to the Midwestern culture of 60 years ago — though it’s not really unique to the Midwest. It’s a culture that persists throughout America, in rural areas, villages, towns, small cities, and even exurbs of large cities.

The results of last year’s presidential election bear me out. Hillary Clinton represented the “sophisticates” of the Eastern Seaboard and the lotus-eaters of the West Coast. She represented the supposed superiority of technocracy over the voluntary institutions of civil society. She represented a kind of smug pluralism and internationalism that smirks at traditional values and portrays as clodhoppers and fundamentalist zealots those who hold such values. Donald Trump, on the other hand (and despite his big-city roots and great wealth), came across as a man of the people who hold such values.

What about Clinton’s popular-vote “victory”? Nationally, she garnered 2.9 million more votes than Trump. But the manner of Clinton’s “victory” underscores the nation’s cultural divide and the persistence of a Midwestern state of mind. Clinton’s total margin of victory in California, New York, and the District of Columbia was 6.3 million votes. That left Trump ahead of Clinton by 3.4 million votes in the other 48 States, and even farther ahead in non-metropolitan areas. Clinton’s “appeal” (for want of a better word) was narrow; Trump’s was much broader (e.g., winning a higher percentage than Romney did of the two-party vote in 39 States). Arguably, it was broader than that of every Republican presidential candidate since Ronald Reagan won a second term in 1984.

The Midwestern state of mind, however much it has weakened in the last 60 years, remains geographically dominant. In the following map, counties won by Clinton are shaded in blue; counties won by Trump are shaded in red:


Source: Wikipedia article about the 2016 presidential election.


* This is an allusion to Hamlin Garland's autobiographical work, A Son of the Middle Border. Garland, a native of Wisconsin, was himself a son of the Middle Border.


Related posts:
“Intellectuals and Society”: A Review
The Left’s Agenda
The Left and Its Delusions
The Spoiled Children of Capitalism
Politics, Sophistry, and the Academy
Subsidizing the Enemies of Liberty
Are You in the Bubble?
The Culture War
Ruminations on the Left in America
Academic Ignorance
The Euphemism Conquers All
Defending the Offensive
Superiority
Whiners
A Dose of Reality
God-Like Minds
An Addendum to (Asymmetrical) Ideological Warfare
Khizr Khan’s Muddled Logic
A Lesson in Election-Rigging
My Platform (which reflects a Midwestern state of mind)
Polarization and De-facto Partition
H.L. Mencken’s Final Legacy
The Shy Republican Supporters
Roundup (see “Civil War II”)
Retrospective Virtue-Signalling
The Left and Violence
Four Kinds of “Liberals”
Leftist Condescension
You Can’t Go Home Again
Class in America
A Word of Warning to Leftists (and Everyone Else)
Another Thought or Two about Class

You Can’t Go Home Again

You can’t go back home to your family, back home to your childhood … back home to a young man’s dreams of glory and of fame … back home to places in the country, back home to the old forms and systems of things which once seemed everlasting but which are changing all the time – back home to the escapes of Time and Memory.

— Thomas Wolfe, You Can’t Go Home Again

*      *      *

I have just begun to read a re-issue of Making It, Norman Podhoretz's memoir that stirred up the literati of New York City. According to Jennifer Schuessler (“Norman Podhoretz: Making Enemies,” Publishers Weekly, January 25, 1999), Podhoretz's

frank 1967 account of the lust for success that propelled him from an impoverished childhood in Brooklyn to the salons of Manhattan, … scandalized the literary establishment that once hailed him as something of a golden boy. His agent wouldn’t represent it. His publisher refused to publish it. And just about everybody hated it. In 1972, Podhoretz’s first high-profile personal squabble, with Random House’s Jason Epstein, went public when the New York Times Magazine published an article called “Why Norman and Jason Aren’t Talking.” By 1979, when Podhoretz published Breaking Ranks, a memoir of his conversion from radicalism to militant conservatism, it seemed just about everybody wasn’t talking to Norman.

Next month, Podhoretz will add another chapter to his personal war chronicle with the publication of Ex-Friends: Falling Out with Allen Ginsberg, Lionel and Diana Trilling, Lillian Hellman, Hannah Arendt, and Norman Mailer. In this short, sharp, unabashedly name-dropping book, Podhoretz revisits the old battles over communism and the counterculture, not to mention his bad reviews. But for all his talk of continued struggle against the “regnant leftist culture that pollutes the spiritual and cultural air we all breathe,” the book is a frankly nostalgic, even affectionate look back at the lost world of “the Family,” the endlessly quarreling but close-knit group of left-leaning intellectuals that gathered in the 1940s and ’50s around such magazines as the Partisan Review and Commentary.

Given this bit of background, you shouldn't be surprised that it was Podhoretz who said that a conservative is a liberal who has been mugged by reality. Nor should you be surprised that Podhoretz wrote this about Barack Obama (which I quote in “Presidential Treason”):

His foreign policy, far from a dismal failure, is a brilliant success as measured by what he intended all along to accomplish….

… As a left-wing radical, Mr. Obama believed that the United States had almost always been a retrograde and destructive force in world affairs. Accordingly, the fundamental transformation he wished to achieve here was to reduce the country’s power and influence. And just as he had to fend off the still-toxic socialist label at home, so he had to take care not to be stuck with the equally toxic “isolationist” label abroad.

This he did by camouflaging his retreats from the responsibilities bred by foreign entanglements as a new form of “engagement.” At the same time, he relied on the war-weariness of the American people and the rise of isolationist sentiment (which, to be sure, dared not speak its name) on the left and right to get away with drastic cuts in the defense budget, with exiting entirely from Iraq and Afghanistan, and with “leading from behind” or using drones instead of troops whenever he was politically forced into military action.

The consequent erosion of American power was going very nicely when the unfortunately named Arab Spring presented the president with several juicy opportunities to speed up the process. First in Egypt, his incoherent moves resulted in a complete loss of American influence, and now, thanks to his handling of the Syrian crisis, he is bringing about a greater diminution of American power than he probably envisaged even in his wildest radical dreams.

For this fulfillment of his dearest political wishes, Mr. Obama is evidently willing to pay the price of a sullied reputation. In that sense, he is by his own lights sacrificing himself for what he imagines is the good of the nation of which he is the president, and also to the benefit of the world, of which he loves proclaiming himself a citizen….

No doubt he will either deny that anything has gone wrong, or failing that, he will resort to his favorite tactic of blaming others—Congress or the Republicans or Rush Limbaugh. But what is also almost certain is that he will refuse to change course and do the things that will be necessary to restore U.S. power and influence.

And so we can only pray that the hole he will go on digging will not be too deep for his successor to pull us out, as Ronald Reagan managed to do when he followed a president into the White House whom Mr. Obama so uncannily resembles. [“Obama’s Successful Foreign Failure,” The Wall Street Journal, September 8, 2013]

Though I admire Podhoretz’s willingness to follow reality to its destination in conservatism — because I made the same journey myself — I am drawn to his memoir by another similarity between us. In the Introduction to the re-issue, Terry Teachout writes:

Making It is never more memorable than when it describes its author’s belated discovery of “the brutal bargain” to which he was introduced by “Mrs. K.,” a Brooklyn schoolteacher who took him in hand and showed him that the precocious but rough-edged son of working-class Jews from Galicia could aspire to greater things— so long as he turned his back on the ghettoized life of his émigré parents and donned the genteel manners of her own class. Not until much later did he realize that the bargain she offered him went even deeper than that:

She was saying that because I was a talented boy, a better class of people stood ready to admit me into their ranks. But only on one condition: I had to signify by my general deportment that I acknowledged them as superior to the class of people among whom I happened to have been born. . . . what I did not understand, not in the least then and not for a long time afterward, was that in matters having to do with “art” and “culture” (the “life of the mind,” as I learned to call it at Columbia), I was being offered the very same brutal bargain and accepting it with the wildest enthusiasm.

So he did, and he never seriously doubted that he had done the only thing possible by making himself over into an alumnus of Columbia and Cambridge and a member of the educated, art-loving upper middle class. At the same time, though, he never forgot what he had lost by doing so, having acquired in the process “a distaste for the surroundings in which I was bred, and ultimately (God forgive me) even for many of the people I loved.”

It’s not an unfamiliar story. But it’s a story that always brings a pang to my heart because it reminds me too much of my own attitudes and behavior as I “climbed the ladder” from the 1960s to the 1990s. Much as I regret the growing gap between me and my past, I have learned from experience that I can’t go back, and don’t want to go back.

What happened to me is probably what happened to Norman Podhoretz and tens of millions of other Americans. We didn’t abandon our past; we became what was written in our genes.

This mini-memoir is meant to illustrate that thesis. It is aimed at those readers who can’t relate to a prominent New Yorker, but who might see themselves in a native of flyover country.

My “ghetto” wasn’t a Jewish enclave like Podhoretz’s Brownsville, but an adjacent pair of small cities in the eastern flatlands of Michigan, both of them predominantly white and working-class. They are not suburbs of Detroit — as we used to say emphatically — nor of any other largish city. We were geographically and culturally isolated from the worst and best that “real” cities have to offer in the way of food, entertainment, and ethnic variety.

My parents’ roots, and thus my cultural inheritance, were in small cities, towns, and villages in Michigan and Ontario. Life for my parents, as for their forebears, revolved around making a living, “getting ahead” by owning progressively nicer (but never luxurious) homes and cars, socializing with friends over card games, and keeping their houses and yards neat and clean.

All quite unexceptional, or so it seemed to me as I was growing up. It only began to seem exceptional when I became the first scion of the family tree to “go away to college,” as we used to say. (“Going away” as opposed to attending a local junior college, as did my father’s younger half-brother about eight years before I matriculated.)

Soon after my arrival on the campus of a large university, whose faculty and students hailed from around the world, I began to grasp the banality of my upbringing in comparison to the cultural richness and sordid reality of the wider world. It was a richness and reality of which my home-town contemporaries and I knew little because we were raised in the days of Ozzie and Harriet — before the Beatles, Woodstock, bearded men with pony-tails, shacking up as a social norm, widespread drug use, and the vivid depiction of sex in all of its natural and unnatural variety.

My upbringing, like that of my home-town contemporaries, was almost apolitical. If we overheard our parents talking about politics, we overheard a combination of views that today seems unlikely: suspicion of government; skepticism about unions (my father had to join one in order to work); disdain for “fat cats”; sympathy for “the little guy”; and staunch patriotism. There was nothing about civil rights and state-enforced segregation, which were seen (mistakenly) as peculiarly Southern issues. Their own racism was seldom in evidence because blacks generally “knew their place” in our white-dominated communities.

And then, as a student at a cosmopolitan Midwestern university (that isn’t an oxymoron), I began to learn — in and out of class. The out-of-class lessons came through conversations with students whose backgrounds differed greatly from mine, including two who had been displaced persons in the wake of World War II. My first-year roommate was a mild-mannered Iranian doctoral student whose friends (some of them less mild-mannered) spoke openly about the fear in which Iranians lived because of SAVAK’s oppressive practices. In my final year as an undergraduate I befriended some married graduate students, one of whom (an American) had spent several years in Libya as a geologist for an American oil company and had returned to the States with an Italian wife.

One of the off-campus theaters specialized in foreign films, which I had never before seen, and which exposed me to people, places, attitudes, and ideas that were intellectually foreign to me, but which I viewed avidly and with acceptance. My musical education was advanced by a friendship with a music major, through whom I met other music majors and learned much about classical music and, of all things, Gilbert and Sullivan. One of the music majors was a tenor who had to learn The Mikado, and did so by playing a recording of it over and over. I became hooked, and to this day can recite large chunks of the libretto. I used to sing them, but my singing days are over.

Through my classes — and often through unassigned reading — I learned how to speak and read French (fluently, those many years ago), and ingested various-sized doses of philosophy, history (ancient and modern), sociology, accounting (the third of four majors), and several other things that escape me at the moment.

Through economics (my fourth and final major), I learned (but didn’t then question) the contradictory tenets of microeconomics (how markets work to allocate resources and satisfy wants efficiently) and macroeconomics (then dominated by the idea of government’s indispensable role in the economic order). I was drawn in by the elegance of economic theory, and mistook its spurious precision for deep understanding. I have since rejected macroeconomic orthodoxy (e.g., see this).

My collegiate “enlightenment” was mild, by today’s standards, but revelatory to a small-city boy. And I was among the still relatively small fraction of high-school graduates who went away to college. So my exposure to a variety of people, cultures, and ideas immediately set me apart — apart not only from my parents and the members of their generation, but also apart from most of the members of my own generation.

What set me apart more than anything was my loss of faith. In my second year I went from ostentatiously devout Catholicism to steadfast agnosticism in a span of weeks. I can’t reconstruct the transition at a remove of almost 60 years, but I suspect that it involved a mixture of delayed adolescent rebellion, a reckoning (based on things I had learned) that the roots of religion lay in superstition, and a genetic predisposition toward skepticism (my father was raised Protestant but scorned religion in his mild way). At any rate, when I walked out of church in the middle of Mass one Sunday morning, I felt as if I had relieved myself of a heavy burden and freed my mind for the pursuit of knowledge.

The odd thing is that, to this day, I retain feelings of loyalty to the Church of my youth — the Church of the Latin Mass (weekly on Sunday morning, not afternoon or evening), strict abstinence from meat on Friday, confession on Saturday, fasting from midnight on Sunday (if one were in a state of grace and fit for Holy Communion), and the sharp-tongued sisters with sharp-edged rulers who taught Catechism on Saturday mornings (parochial school wasn’t in my parents’ budget). I have therefore been appalled, successively, by Vatican Council II, most of the popes of the past 50 years, the various ways in which being a Catholic has become easier, and (especially) the egregious left-wing babbling of Francis. And yet I remain an agnostic who only in recent years has acknowledged the logical necessity of a Creator, but probably not the kind of Creator who is at the center of formal religion. Atheism — especially of the strident variety — is merely religion turned upside down: a belief in something that is beyond proof. I scorn it.

To complete this aside, I must address the canard peddled by strident atheists and left-wingers (there’s a lot of overlap) about the evil done in the name of religion. I say this: Religion doesn’t make fanatics; it attracts them (in relatively small numbers), though some Islamic sects seem to be bent on attracting and cultivating fanaticism. Far more fanatical and attractive to fanatics are the “religions” of communism, socialism (including Hitler’s version), and progressivism (witness the snowflakes and oppressors who now dominate the academy). I doubt that the number of murders committed in the name of religion amounts to one-tenth of the number of murders committed by three notable anti-religionists: Hitler (yes, Hitler), Stalin, and Mao. I also believe — with empirical justification — that religion is a bulwark of liberty; whereas the cult of libertarianism — usually practiced by agnostics and atheists — is not (e.g., this post and the others linked to therein).

It’s time to return to the chronological thread of my narrative. I have outlined my graduate-school and work experiences in “About.” The main thing to note here is what I learned during the early mid-life crisis which took me away from the D.C. rat race for about three years, as owner-operator of a (very) small publishing company in a rural part of New York State.

In sum, I learned to work hard. Before my business venture I had coasted along using my intelligence but not a lot of energy, but nevertheless earning praise and good raises. I was seldom engaged in what I was doing: the work seemed superficial and unconnected to anything real. That changed when I became a business owner. I had to meet a weekly deadline or lose advertisers (and my source of income), master several new skills involved in publishing a weekly “throwaway” (as the free Pennysaver was sometimes called), and work six days a week with only two brief respites in three years. Something clicked, and when I gave up the publishing business and returned to the D.C. area, I kept on working hard — as if my livelihood depended on it.

And it did. Much as I had loved being my own boss, I wanted to live and retire more comfortably than I could on the meager income that flowed uncertainly from the Pennysaver. Incentives matter. So in the 18 years after my return to the D.C. area I not only kept working hard and with fierce concentration, but I developed (or discovered) a ruthless streak that propelled me into the upper echelon of the think-tank.

And in my three years away from the D.C. area I also learned, for the first time, that I couldn’t go home again.

I was attracted to the publishing business because of its location in a somewhat picturesque village. The village was large enough to sport a movie theater, two supermarkets, and a variety of commercial establishments, including restaurants, shoe stores, clothing stores, jewelers, a Rite-Aid drug store, and even a J.J. Newberry dime store. It also had many large, well-kept homes. All in all, it appealed to me because, replete with a “real” main street, it reminded me of the first small city in which I grew up.

But after working and associating with highly educated professionals, and after experiencing the vast variety of restaurants, museums, parks, and entertainment of the D.C. area, I found the village and its natives dull. Not only dull, but also distant. They were humorless and closed to outsiders. It came to me that the small cities in which I had grown up were the same way. My memories of them were distorted because they were memories of a pre-college boy who had yet to experience life in the big city. They were memories of a boy whose life centered on his parents and a beloved grandmother (who lived in a small village of similarly golden memory).

You can’t go home again, metaphorically, if you’ve gone away and lived a different life. You can’t because you are a different person than you were when you were growing up. This lesson was reinforced at the 30-year reunion of my high-school graduating class, which occurred several years after my business venture and a few years after I had risen into the upper echelon of the think-tank.

There I was, with my wife and sister (who graduated from the same high school eight years after I did), happily anticipating an evening of laughter and shared memories. We were seated at a table with two fellows who had been good friends of mine (and their wives, whom I didn’t know). It was deadly boring; the silences yawned; we had nothing to say to each other. One of the old friends, who had been on the wagon, was so unnerved by the forced bonhomie of the occasion that he fell off the wagon. Attempts at mingling after dinner were awkward. My wife and sister readily agreed to abandon the event. We drove several miles to an elegant, river-front hotel where we had a few drinks on the deck. Thus the evening ended on a cheery note, despite the cool, damp drizzle. (A not untypical August evening in Michigan.)

I continued to return to Michigan for another 27 years, making what might be my final trip for the funeral of my mother, who lived to the age of 99. But I went just to see my parents and siblings, and then only out of duty.

The golden memories of my youth remain with me, but I long ago gave up the idea of reliving the past that is interred in those memories.

Three Now-Obscure Actors

The 1950s were as dull as they’re made out to be. (Oh, to have them back!) Among the landmarks of that dull decade were three actors who, between them, seemed to appear almost every night on one TV drama or another: Henry Jones (1912-1999), John Newland (1917-2000), and Harry Townes (1914-2001). All three had long acting careers, and Newland was also a producer and director. But you probably can’t put faces with the names. Here they are:

Henry Jones

John Newland

Harry Townes

Beyond the Far Horizon

Several years ago I began to track some celebrities who had attained the age of 90. The rather quirky list of notables now looks like this:

Luise Rainer 104, George Beverly Shea 104, Charles Lane 102, Irwin Corey 101, Herman Wouk 101, George Kennan 101, Olivia de Havilland 100 (on July 1*), Gloria Stuart 100, Eddie Albert 99, Michael DeBakey 99, Zsa Zsa Gabor 99, Vera Lynn 99, Mitch Miller 99, Max Schmeling 99, Risë Stevens 99, John Wooden 99, Tony Martin 98, Dale Messick 98, Eli Wallach 98, John Kenneth Galbraith 97, Ernest Gallo 97, Billy Graham 97, Estée Lauder 97, Art Linkletter 97, Al Lopez 97, Karl Malden 97, John Mills 97, Kitty Carlisle 96, Monte Irvin 96, Jack LaLanne 96, Kevin McCarthy 96, Harry Morgan 96, Fay Wray 96, Jane Wyatt 96, Joseph Barbera 95, Ernest Borgnine 95, Henri Cartier-Bresson 95, Herbert Lom 95, Peter Rodino Jr. 95, Sargent Shriver 95, Patty Andrews 94, Sammy Baugh 94, Constance Cummings 94, Lady Bird Johnson 94, Robert Mondavi 94, Byron Nelson 94, Les Paul 94, Ruth Hussey 93, Frankie Laine 93, Robert McNamara 93, Artie Shaw 93, Richard Widmark 93, Oleg Cassini 92, Ralph Edwards 92, Bob Feller 92, Ernie Harwell 92, Lena Horne 92, Julia Child 91, Archibald Cox 91, Geraldine Fitzgerald 91, Frances Langford 91, John Profumo 91, William Westmoreland 91, Jane Wyman 90.

I was reminded of this list by a name in the “Today’s Birthdays” feature of the newspaper: actress June Lockhart 91. Because only six members of my original list remain among the living, I’m adding Lockhart to the list, as well as these notables of interest to me: baseball player Bobby Doerr 98, justice John Paul Stevens 96, economist and secretary of state George Shultz 95, prince consort Philip Mountbatten 95, actress Betty White 94, secretary of defense Melvin Laird 93, baseball player Red Schoendienst 93, actress Rose Marie 92, physicist Freeman Dyson 92, president George H.W. Bush 92, and actor Hal Holbrook 91.

_________
* By my reckoning, of the dozens (or hundreds) of actors who starred in Hollywood films before World War II, only Olivia de Havilland survives. She attained star billing in 1935, at the age of 19, for her role in Captain Blood. Other pre-war films in which she starred include The Adventures of Robin Hood (1938) and Gone with the Wind (1939).