Images from the Past

I mentioned the hardier Americans of more than a century ago in the preceding post. I had in mind my own grandparents and the members of their generation and socioeconomic class. They grew up in hardscrabble times and never rose far above them. But they were a tough lot who would be appalled by the whiny Boomers and later generations who are ashamed of their own race and cringe in fear at the sight of a maskless person.

Here are my maternal grandparents in 1935, as the Great Depression dragged on. He was a caretaker at a resort in Michigan; she raised 10 children and lived to the age of 96. His body gave out at age 64, and he was buried on the day of my birth.

They were 59 and 55 when the photograph was taken. I inherited his hairline and nose and their hair pigmentation. (At the age of 80, my hair still isn’t completely gray.)

Here is my maternal grandfather with six of his seven sons, also in 1935. (All seven served in the armed forces, six of them during World War II.) All of his and my grandmother’s children lived to adulthood, with an aunt making it to 96 and my mother to 99.

Below is my paternal grandfather in 1917, with my father in his arms. My grandfather was a laborer all of his life and died at the age of 57 in a construction accident. His two wives bore him 13 children, of whom 12 survived to adulthood and three are still living. The house is typical of what he could afford on a laborer’s wages. The front door is probably open to provide “air conditioning” because the photo was taken in late August — a hot and humid time in the part of Michigan where he lived.

Despite my grandfather’s meager income and the cost of supporting a large family, thrift and hard work enabled him to buy farm property on which his second wife (and mother of 10 of his children) lived to the age of 97. The property is still in the family, and some of his children and their descendants have homes there.

Below is my paternal grandmother (left) with her younger sister, about 1919. The unpainted house in the background is probably the same shabby one glimpsed in the preceding photo. Grandmother died at the age of 25, leaving my grandfather with two surviving children, their youngest child having died at the age of 13 months. (Her grave went unmarked for 90 years until my sister and I had a headstone erected in 2015.)

I didn’t know my maternal grandfather and paternal grandmother, both of whom died before I was born. But I remember well my paternal grandfather, and especially well my maternal grandmother, who lived until I was 36 years old. It is her house that is the centerpiece of one of my earliest posts: “Reveries”.

“Tough” doesn’t begin to describe the generation of my grandparents or that of their children, who became known as the “greatest generation”.

For the departed:

Time, you old gipsy man,
Will you not stay,
Put up your caravan
Just for one day?….

Last week in Babylon,
Last night in Rome,
Morning and in the crush
Under Paul’s dome;
Under Paul’s dial
You tighten your rein —
Only a moment, and off once again;
Off to some city
Now blind in the womb,
Off to another
Ere that’s in the tomb.

From Time, You Old Gipsy Man, by Ralph Hodgson (1871-1962)

Speaking of Cultural Appropriation …

… as I was here, I have a serious bone to pick with the parents of yore who broke the gender barrier by giving boy names to girl babies. There was a time when Ashley, Beverly, Evelyn, Leslie, Marion, Meredith, Merle, Shirley, and Vivian were boy names (exclusively or predominantly). The Brits even had girl-name equivalents for Leslie (Lesley) and Vivian (Vivien).

That was in the dim past. Naming has since gotten out of hand:

In 1910, just 5% of American babies named “Charlie” were girls. Over 100 years later, girl Charlies took over their male counterparts for the first time in 2016—making up 51% of the share.

With little fuss or fanfare, Charlie has gone gender-neutral….

Quartz analyzed the Social Security Administration’s public data on baby names to find out whether what happened with “Charlie” is an exception, or part of a wider trend. Our results show that, on average, the country is slowly moving toward using more gender-neutral names. And a few popular names are leading the way.

To analyze the trend, we calculated a “genderedness score” for every American baby name—and for the country on the whole. The score goes from zero to one. A zero means a name is perfectly non-gendered. That is to say, exactly half of the babies with that name are boys, and the other half are girls. A one, meanwhile, means the name is used exclusively for one gender. So a lower score means a name is more gender-neutral, and less biased.

“Biased”? What’s biased about calling a boy by a boy’s name and a girl by a girl’s name? The PC brigade to the contrary notwithstanding, sex (a.k.a. gender) isn’t “assigned” at birth — it just is.
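Quartz doesn’t show its formula, but the description above pins it down: with B boys and G girls sharing a name, the score is |B − G| / (B + G). Here is a minimal sketch (my reconstruction, not Quartz’s actual code; the Charlie birth counts are back-of-the-envelope figures derived from the 51%/3,448 numbers quoted above):

```python
def genderedness(boys: int, girls: int) -> float:
    """Return a 0-to-1 score: 0.0 = a perfect 50/50 split, 1.0 = one sex only."""
    total = boys + girls
    return abs(boys - girls) / total

# "Charlie" in 2016: roughly 49% boys, 51% girls of 3,448 births
print(round(genderedness(boys=1690, girls=1758), 2))  # ~0.02, matching the table below
```

The same arithmetic reproduces the extremes Quartz reports: a name with no boys at all (like Scarlett or Victoria in 2016) scores 1.00.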

Anyway …

American parents have long had a strong preference for gendered names. The overall genderedness score was 0.97 in 1920, meaning nearly every kid had a name that was used almost exclusively for just boys or just girls. The score is falling, though. It hit 0.946 in 2016, the most recent year the SSA has name data for. The 1920 score is close to the historical average for names like “Billy,” “Selma,” and “Otis.” Names around the new—less gender-specific—number include “Jerry,” “Aden,” and “Orion.”

That’s another thing: Made-up names that have no historical roots. (And don’t get me started on “black” names.)

Continuing …

Several popular names, Charlie among them, are driving this trend [toward gender-neutrality]. No girls named “Blake” show up in the data at all until 1951. But today, one-quarter of American Blakes are female. And it’s not just boys’ names being given to girls, either. “Marion,” for example, has seen a major shift from girls to boys….

Many other popular names from the 2016 dataset are also gender-neutral, including “Finley,” “Justice,” and “Armani.” Here are the least-gendered 20, only including those with more than 500 babies with that name.

Name Gendered score Births
Charlie 0.02 3,448
Oakley 0.05 1,009
Justice 0.05 1,257
Landry 0.07 612
Armani 0.07 962
Skyler 0.09 1,667
Azariah 0.1 656
Finley 0.16 2,961
Royal 0.16 1,134
Lennon 0.19 1,095
Hayden 0.2 3,942
Casey 0.22 834
Emerson 0.23 3,163
Rowan 0.24 3,522
Baylor 0.24 548
Dakota 0.24 2,266
River 0.24 2,943
Remy 0.24 1,042
Emory 0.25 715
Phoenix 0.26 1,945

At the same time, some names are becoming more gendered. “Ashton” has gone from being pretty equal to primarily a boys’ name. “Harper” used to be more common for boys, but is now over 97% girls. And the most popular names from 2016 score high on the genderedness scale—Emma and Olivia at 0.99, and Scarlett and Victoria at 1.00, without a single boy.

Given that the average is moving the other way, though, it seems these mono-gendered choices are slowly becoming less popular. Gender-neutral options like Parker, Jordan, and Riley were among the top 100 in 2016.

Note the number of made-up names and names that (in saner times) would be thought of as masculine (e.g., Landry, Finley, Lennon, Casey, Emerson, Baylor, and Emory).

Unmentioned by the author is a phenomenon that would be obvious to an attentive reader: The appropriation of names (like cultural appropriation generally) is a one-way street. Girls get to do it (well, their parents do); boys just suffer in silence (or else) as their names become sissified.

You’ll know that the cultural revolution has succeeded when Emma, Scarlett, and Victoria become accepted as boys’ names.

I Knew (of) a Man Who Knew (of) a Man …

… who, etc., etc., knew (of) George Washington.

A bit of trivia for this, the 288th anniversary of Washington’s birth on February 22, 1732 (New Style date). Washington served as president from April 30, 1789 to March 4, 1797. He died on December 14, 1799.

Martin Van Buren (8th president) was born on December 5, 1782, and was 6 to 14 years old during Washington’s presidency. He therefore had memories of the first presidency. Van Buren served as president from March 4, 1837, to March 4, 1841. He died on July 24, 1862.

Rutherford B. Hayes (19th president) was born on October 4, 1822, and was 14 to 18 years old during Van Buren’s presidency. Hayes served as president from March 4, 1877, to March 4, 1881. He died on January 17, 1893.

William Howard Taft (27th president) was born on September 15, 1857, and was 19 to 23 years old during Hayes’s presidency. Taft served as president from March 4, 1909, to March 4, 1913. He died on March 8, 1930.

Dwight D. Eisenhower (34th president) was born on October 14, 1890, and was 18 to 22 years old during Taft’s presidency. Eisenhower served as president from January 20, 1953, to January 20, 1961. He died on March 28, 1969.

Donald J. Trump (45th president) was born on June 14, 1946, and was 6 to 14 years old during Eisenhower’s presidency. Trump’s presidency began on January 20, 2017.

From Washington to Trump: 5 degrees of separation.
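The chain above is easy to check mechanically. This short Python sketch recomputes each man’s age at the start and end of the relevant predecessor’s term, using only the dates given in the text:

```python
from datetime import date

def age_at(born: date, when: date) -> int:
    """Whole years of age on a given date."""
    return when.year - born.year - ((when.month, when.day) < (born.month, born.day))

# (successor, his birth date, remembered predecessor's term start and end)
links = [
    ("Van Buren",  date(1782, 12, 5),  date(1789, 4, 30), date(1797, 3, 4)),
    ("Hayes",      date(1822, 10, 4),  date(1837, 3, 4),  date(1841, 3, 4)),
    ("Taft",       date(1857, 9, 15),  date(1877, 3, 4),  date(1881, 3, 4)),
    ("Eisenhower", date(1890, 10, 14), date(1909, 3, 4),  date(1913, 3, 4)),
    ("Trump",      date(1946, 6, 14),  date(1953, 1, 20), date(1961, 1, 20)),
]

for name, born, start, end in links:
    print(f"{name}: {age_at(born, start)} to {age_at(born, end)} during the prior presidency")
```

Running it reproduces every age range stated above (6 to 14 for Van Buren, 14 to 18 for Hayes, and so on), confirming the five-link chain.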

(For more facts about presidents, see “Presidents: Key Dates and Various Trivia”.)

A Footnote to “Peak Civilization”

I ended that post with this:

Every line of human endeavor reaches a peak, from which decline is sure to follow if the things that caused it to peak are mindlessly rejected for the sake of novelty (i.e., rejection of old norms just because they are old). This is nowhere more obvious than in the arts.

It should be equally obvious to anyone who takes an objective look at the present state of American society and is capable of comparing it with American society of the 1940s and 1950s. For all of its faults it was a golden age. Unfortunately, most Americans now living (Noah Smith definitely included) are too young and too fixated on material things to understand what has been lost — irretrievably, I fear.

My point is underscored by Annabelle Timsit, writing at Quartz:

The endless stretch of a lazy summer afternoon. Visits to a grandparent’s house in the country. Riding your bicycle through the neighborhood after dark. These were just a few of the revealing answers from more than 400 Twitter users in response to a question: “What was a part of your childhood that you now recognize was a privilege to have or experience?”

That question, courtesy of writer Morgan Jerkins, revealed a poignant truth about the changing nature of childhood in the US: The childhood experiences most valued by people who grew up in the 1970s and 1980s are things that the current generation of kids are far less likely to know.

That’s not a reference to cassette tapes, bell bottoms, Blockbuster movies, and other items popular on BuzzFeed listicles. Rather, people are primarily nostalgic for a youthful sense of independence, connectedness, and creativity that seems less common in the 21st century. The childhood privileges that respondents seemed to appreciate most in retrospect fall into four broad categories:

“Riding my bike at all hours of the day into the evening throughout many neighborhoods without being stopped or asked what I was doing there,” was one Twitter user’s answer to Jerkins’ question. Another commenter was grateful for “summer days & nights spent riding bikes anywhere & everywhere with friends, only needing to come home when the streetlights came on,” while yet another recalled “having a peaceful, free-range childhood.” Countless others cited the freedom to explore—with few restrictions—as a major privilege of their childhood.

For many of today’s children, that privilege is disappearing. American children have less independence and autonomy today than they did a few generations ago. As parents have become increasingly concerned with safety, fewer children are permitted to go exploring beyond the confines of their own backyard. Some parents have even been prosecuted or charged with neglect for letting their children walk or play unsupervised. Meanwhile, child psychologists say that too many children are being ushered from one structured activity to the next, always under adult supervision—leaving them with little time to play, experiment, and make mistakes.

That’s a big problem. Kids who have autonomy and independence are less likely to be anxious, and more likely to grow into capable, self-sufficient adults. In a recent video for The Atlantic, Julie Lythcott-Haims, author of How to Raise an Adult, argues that so-called helicopter parents “deprive kids the chance to show up in their own lives, take responsibility for things and be accountable for outcomes.”

That message seems to be gaining traction. The state of Utah, for example, recently passed a “free-range” parenting law meant to give parents the freedom to send kids out to play on their own.

“Bravo!” to the government of Utah.

Transport yourself back three decades from the 1970s and 1980s to the 1940s and 1950s, when I was a child and adolescent, and the contrast between then and now is even more stark than the contrast noted by Timsit.

And it has a lot to do with the social ruin that has been visited upon America by the spoiled (cosseted) children of capitalism.


Other related posts:

Ghosts of Thanksgiving Past
The Passing of Red Brick Schoolhouses and a Way of Life
An Ideal World
‘Tis the Season for Nostalgia
Another Look into the Vanished Past
Whither (Wither) Classical Liberalism and America?

Another Look into the Vanished Past

In “Ghosts of Thanksgiving Past” I recall family gatherings of long ago. “The Passing of Red Brick Schoolhouses and a Way of Life” laments the passing of the schoolhouses of my childhood, along with the innocence that was once a hallmark of non-urban America. In “‘Tis the Season for Nostalgia” I recall Christmases past.

I was reminded of those trips into the past by a post at The Federalist by Nathaniel Blake, “What Good Is Cheaper Stuff If It Comes At The Expense Of Community?”. It prompted me to recall other things that meant much to me (in hindsight): the long-vanished locally-owned stores that provided groceries, meat, sundries, haircuts, baked goods, hobby supplies, and more. The owners worked in their stores. They knew you, and you knew them. Many of them were neighbors. Their livelihoods depended not only on providing products and services at reasonable prices — prices that saved you a trip to the big city — but on their friendliness and reputation for honesty.

Of the many stores of that ilk that I remember from early childhood until I went to college, 60 to 75 years ago, only one is still in business. It’s even at the same location, though in a new building, and it doesn’t carry the range of hobby supplies (e.g., model kits and collectible stamps) that it did when I shopped there eons ago.

Here are the sites as they look now (or looked recently), arrayed roughly in the order in which I first saw them (* indicates original building):

Grocery store and gas station*

Dairy store

Grocery store

Bakery

Grocery store and news stand

Grocery store with ice house in back

Meat market*

Meat market

Grocery store

Barber shop (left)* – Grocery store (right)*

Bakery (and owners’ residence)*

Grocery store (and owners’ residence)*

Grocery store

Hobby store*

Hobby store

Grocery store

Grocery store*

Barber shop – Drugstore (two separate buildings)

Grocery store

‘Tis the Season for Nostalgia

My hometown newspaper ran Charles Dickens’s A Christmas Carol every year, on or before Christmas, from 1940 through 1957. Here’s an image from the 1940 edition:

A Christmas Carol is a short book, and can easily fit onto six broadsheet pages, though it was spread over more, depending on the size and number of ads the paper was able to sell for the special section.

Another annual event, from 1934 through 1953, was Lionel Barrymore’s half-hour radio version of A Christmas Carol, which I recall hearing at least once. Barrymore (1878-1954) was supposed to play Ebenezer Scrooge in the 1938 film version of the book (my favorite version of the many that have been made), but severe arthritis and a broken hip precluded that.

Barrymore later essayed the role of Mr. Potter in the Christmas-time classic, It’s a Wonderful Life (1946). The Wikipedia article about Barrymore says, accurately, that “the role suggested that of the ‘unreformed’ stage of Barrymore’s ‘Scrooge’ characterization.”

I can dredge up many other wonderful bits of nostalgia about Christmas, but I’ll stop with this one:

We often visited my grandmother at Christmas, and I like to relive the Christmas Eve when we made the 90-mile trip as feathery snow slowly piled deeper on the deserted, lakeside highway we traversed through quiet villages: Lexington, Port Sanilac, Forester, Richmondville, Forestville, White Rock, Harbor Beach, Port Hope, Huron City, and — at last — Port Austin.

Snow at Christmas… It has been more than 16 years since I saw more than a dusting of snow on the ground (and then only twice). Snow at Christmas is the only thing I miss about living in the North, but I don’t miss it enough to go back.

What’s in a Name?

A lot, especially if it’s the name of a U.S. Navy ship. Take the aircraft carrier, for instance, which has been the Navy’s capital ship since World War II. The first aircraft carrier in the U.S. fleet was the USS Langley, commissioned in 1922. Including escort carriers, which were smaller than the relatively small carriers of World War II, a total of 154 carriers have been commissioned and put into service in the U.S. Navy. (During World War II, some escort carriers were transferred to the Royal Navy upon commissioning.)

As far as I am able to tell, not one of the 82 escort carriers was named for a person. Of the 72 “regular” carriers, which include 10 designated as light aircraft carriers, none was named for a person until CVB-42, the Franklin D. Roosevelt, was commissioned in 1945, several months after the death of its namesake. The next such naming came in 1947, with the commissioning of the Wright, named for Wilbur and Orville Wright, the aviation pioneers. There was then a hiatus of eight years, until the commissioning of the Forrestal in 1955, a ship named for the late James Forrestal, the first secretary of defense.

The dam burst in 1968, with the commissioning of John F. Kennedy. That carrier and the 11 commissioned since have been named for persons, only one of whom, Admiral of the Fleet Chester W. Nimitz, was a renowned naval person. In addition to Kennedy, the namesakes include former U.S. presidents (Eisenhower, T. Roosevelt, Lincoln, Washington, Truman, Reagan, Bush 41, and Ford), Carl Vinson (a long-serving chairman of the House Armed Services Committee), and John C. Stennis (a long-serving chairman of the Senate Armed Services Committee). Reagan and Bush were honored while still living (though Reagan may have been unaware of the honor because of the advanced state of his Alzheimer’s disease).

All but the Kennedy are on active service. And the Kennedy, which was decommissioned in 2007, is due to be replaced by a namesake next year. But that may be the end of it. Wisdom may yet prevail before the Navy becomes embroiled in nasty, needless controversies over the prospect of naming a carrier after Lyndon Johnson, Richard Nixon, Jimmy Carter, Bill Clinton, George Bush, Barack Obama, or Donald Trump.

The carrier after Kennedy (II) will be named Enterprise — the third carrier to be thus named. Perhaps future carriers will take the dashing names of those that I remember well from my days as a young defense analyst: Bon Homme Richard (a.k.a., Bonny Dick), Kearsarge, Oriskany, Princeton, Shangri-La, Lake Champlain, Tarawa, Midway, Coral Sea, Valley Forge, Saipan, Saratoga, Ranger, Independence, Kitty Hawk, Constellation, Enterprise (II), and America.

And while we’re at it, perhaps the likes of Admiral William McRaven (USN ret.) will do their duty, become apolitical, and shut up.

Regarding Napoleon Chagnon

Napoleon Alphonseau Chagnon (1938-2019) was a noted anthropologist to whom the label “controversial” was applied. Some of the story is told in this surprisingly objective New York Times article about Chagnon’s life and death. Matthew Blackwell gives a more complete account in “The Dangerous Life of an Anthropologist” (Quillette, October 5, 2019). UPDATE 11/27/19: Alice Dreger’s article, “Napoleon Chagnon Is Dead” (The Chronicle Review, October 23, 2019), reveals the inner man and underscores his integrity.

Chagnon’s sin was his finding that “nature” trumped “nurture”, as demonstrated by his decades-long ethnographic field work among the Yanomamö, indigenous Amazonians who live in the border area between Venezuela and Brazil. As Blackwell tells it,

Chagnon found that up to 30 percent of all Yanomamö males died a violent death. Warfare and violence were common, and duelling was a ritual practice, in which two men would take turns flogging each other over the head with a club, until one of the combatants succumbed. Chagnon was adamant that the primary causes of violence among the Yanomamö were revenge killings and women. The latter may not seem surprising to anyone aware of the ubiquity of ruthless male sexual competition in the animal kingdom, but anthropologists generally believed that human violence found its genesis in more immediate matters, such as disputes over resources. When Chagnon asked the Yanomamö shaman Dedeheiwa to explain the cause of violence, he replied, “Don’t ask such stupid questions! Women! Women! Women! Women! Women!” Such fights erupted over sexual jealousy, sexual impropriety, rape, and attempts at seduction, kidnap and failure to deliver a promised girl….

Chagnon would make more than 20 fieldwork visits to the Amazon, and in 1968 he published Yanomamö: The Fierce People, which became an instant international bestseller. The book immediately ignited controversy within the field of anthropology. Although it commanded immense respect and became the most commonly taught book in introductory anthropology courses, the very subtitle of the book annoyed those anthropologists, who preferred to give their monographs titles like The Gentle Tasaday, The Gentle People, The Harmless People, The Peaceful People, Never in Anger, and The Semai: A Nonviolent People of Malaya. The stubborn tendency within the discipline was to paint an unrealistic façade over such cultures—although 61 percent of Waorani men met a violent death, an anthropologist nevertheless described this Amazonian people as a “tribe where harmony rules,” on account of an “ethos that emphasized peacefulness.”…

These anthropologists were made more squeamish still by Chagnon’s discovery that the unokai of the Yanomamö—men who had killed and assumed a ceremonial title—had about three times more children than others, owing to having twice as many wives. Drawing on this observation in his 1988 Science article “Life Histories, Blood Revenge, and Warfare in a Tribal Population,” Chagnon suggested that men who had demonstrated success at a cultural phenomenon, the military prowess of revenge killings, were held in higher esteem and considered more attractive mates. In some quarters outside of anthropology, Chagnon’s theory came as no surprise, but its implication for anthropology could be profound. In The Better Angels of Our Nature, Steven Pinker points out that if violent men turn out to be more evolutionarily fit, “This arithmetic, if it persisted over many generations, would favour a genetic tendency to be willing and able to kill.”…

Chagnon considered his most formidable critic to be the eminent anthropologist Marvin Harris. Harris had been crowned the unofficial historian of the field following the publication of his all-encompassing work The Rise of Anthropological Theory. He was the founder of the highly influential materialist school of anthropology, and argued that ethnographers should first seek material explanations for human behavior before considering alternatives, as “human social life is a response to the practical problems of earthly existence.” Harris held that the structure and “superstructure” of a society are largely epiphenomena of its “infrastructure,” meaning that the economic and social organization, beliefs, values, ideology, and symbolism of a culture evolve as a result of changes in the material circumstances of a particular society, and that apparently quaint cultural practices tend to reflect man’s relationship to his environment. For instance, prohibition on beef consumption among Hindus in India is not primarily due to religious injunctions. These religious beliefs are themselves epiphenomena to the real reasons: that cows are more valuable for pulling plows and producing fertilizers and dung for burning. Cultural materialism places an emphasis on “-etic” over “-emic” explanations, ignoring the opinions of people within a society and trying to uncover the hidden reality behind those opinions.

Naturally, when the Yanomamö explained that warfare and fights were caused by women and blood feuds, Harris sought a material explanation that would draw upon immediate survival concerns. Chagnon’s data clearly confirmed that the larger a village, the more likely fighting, violence, and warfare were to occur. In his book Good to Eat: Riddles of Food and Culture Harris argued that fighting occurs more often in larger Yanomamö villages because these villages deplete the local game levels in the rainforest faster than smaller villages, leaving the men no option but to fight with each other or to attack outside groups for meat to fulfil their protein macronutrient needs. When Chagnon put Harris’s materialist theory to the Yanomamö they laughed and replied, “Even though we like meat, we like women a whole lot more.” Chagnon believed that smaller villages avoided violence because they were composed of tighter kin groups—those communities had just two or three extended families and had developed more stable systems of borrowing wives from each other.

There’s more:

Survival International … has long promoted the Rousseauian image of a traditional people who need to be preserved in all their natural wonder from the ravages of the modern world. Survival International does not welcome anthropological findings that complicate this harmonious picture, and Chagnon had wandered straight into their line of fire….

For years, Survival International’s Terence Turner had been assisting a self-described journalist, Patrick Tierney, as the latter investigated Chagnon for his book, Darkness in El Dorado: How Scientists and Journalists Devastated the Amazon. In 2000, as Tierney’s book was being readied for publication, Turner and his colleague Leslie Sponsel wrote to the president of the American Anthropological Association (AAA) and informed her that an unprecedented crisis was about to engulf the field of anthropology. This, they warned, would be a scandal that, “in its scale, ramifications, and sheer criminality and corruption, is unparalleled in the history of Anthropology.” Tierney alleged that Chagnon and Neel had spread measles among the Yanomamö in 1968 by using compromised vaccines, and that Chagnon’s documentaries depicting Yanomamö violence were faked by using Yanomamö to act out dangerous scenes, in which further lives were lost. Chagnon was blamed, inter alia, for inciting violence among the Yanomamö, cooking his data, starting wars, and aiding corrupt politicians. Neel was also accused of withholding vaccines from certain populations of natives as part of an experiment. The media were not slow to pick up on Tierney’s allegations, and the Guardian ran an article under an inflammatory headline accusing Neel and Chagnon of eugenics: “Scientists ‘killed Amazon Indians to test race theory.’” Turner claimed that Neel believed in a gene for “leadership” and that the human genetic stock could be upgraded by wiping out mediocre people. “The political implication of this fascistic eugenics,” Turner told the Guardian, “is clearly that society should be reorganised into small breeding isolates in which genetically superior males could emerge into dominance, eliminating or subordinating the male losers.”

By the end of 2000, the American Anthropological Association announced a hearing on Tierney’s book. This was not entirely reassuring news to Chagnon, given their history with anthropologists who failed to toe the party line….

… Although the [AAA] taskforce [appointed to investigate Tierney’s accusations] was not an “investigation” concerned with any particular person, for all intents and purposes, it blamed Chagnon for portraying the Yanomamö in a way that was harmful and held him responsible for prioritizing his research over their interests.

Nonetheless, the most serious claims Tierney made in Darkness in El Dorado collapsed like a house of cards. Elected Yanomamö leaders issued a statement in 2000 stating that Chagnon had arrived after the measles epidemic and saved lives, “Dr. Chagnon—known to us as Shaki—came into our communities with some physicians and he vaccinated us against the epidemic disease which was killing us. Thanks to this, hundreds of us survived and we are very thankful to Dr. Chagnon and his collaborators for help.” Investigations by the American Society of Human Genetics and the International Genetic Epidemiology Society both found Tierney’s claims regarding the measles outbreak to be unfounded. The Society of Visual Anthropology reviewed the so-called faked documentaries, and determined that these allegations were also false. Then an independent preliminary report released by a team of anthropologists dissected Tierney’s book claim by claim, concluding that all of Tierney’s most important assertions were either deliberately fraudulent or, at the very least, misleading. The University of Michigan reached the same conclusion. “We are satisfied,” its Provost stated, “that Dr. Neel and Dr. Chagnon, both among the most distinguished scientists in their respective fields, acted with integrity in conducting their research… The serious factual errors we have found call into question the accuracy of the entire book [Darkness in El Dorado] as well as the interpretations of its author.” Academic journal articles began to proliferate, detailing the mis-inquiry and flawed conclusions of the 2002 taskforce. By 2005, only three years later, the American Anthropological Association voted to withdraw the 2002 taskforce report, re-exonerating Chagnon.

A 2000 statement by the leaders of the Yanomamö and their Ye’kwana neighbours called for Tierney’s head: “We demand that our national government investigate the false statements of Tierney, which taint the humanitarian mission carried out by Shaki [Chagnon] with much tenderness and respect for our communities.” The investigation never occurred, but Tierney’s public image lay in ruins and would suffer even more at the hands of historian of science Alice Dreger, who interviewed dozens of people involved in the controversy. Although Tierney had thanked a Venezuelan anthropologist for providing him with a dossier of information on Chagnon for his book, the anthropologist told Dreger that Tierney had actually written the dossier himself and then misrepresented it as an independent source of information.

A “dossier” and its use to smear an ideological opponent. Where else have we seen that?

Returning to Blackwell:

Scientific American has described the controversy as “Anthropology’s Darkest Hour,” and it raises troubling questions about the entire field. In 2013, Chagnon published his final book, Noble Savages: My Life Among Two Dangerous Tribes—The Yanomamö and the Anthropologists. Chagnon had long felt that anthropology was experiencing a schism more significant than any difference between research paradigms or schools of ethnography—a schism between those dedicated to the very science of mankind, anthropologists in the true sense of the word, and those opposed to science; either postmodernists vaguely defined, or activists disguised as scientists who seek to place indigenous advocacy above the pursuit of objective truth. Chagnon identified Nancy Scheper-Hughes as a leader in the activist faction of anthropologists, citing her statement that we “need not entail a philosophical commitment to Enlightenment notions of reason and truth.”

Whatever the rights and wrongs of his debates with Marvin Harris across three decades, Harris’s materialist paradigm was a scientifically debatable hypothesis, which led Chagnon to realize that he and his old rival shared more in common than they did with the activist forces emerging in the field: “Ironically, Harris and I both argued for a scientific view of human behavior at a time when increasing numbers of anthropologists were becoming skeptical of the scientific approach.”…

Both Chagnon and Harris agreed that anthropology’s move away from being a scientific enterprise was dangerous. And both believed that anthropologists, not to mention thinkers in other fields of social sciences, were disguising their increasingly anti-scientific activism as research by using obscurantist postmodern gibberish. Observers have remarked on how abstruse humanities research has become, and even a world-famous linguist like Noam Chomsky admits, “It seems to me to be some exercise by intellectuals who talk to each other in very obscure ways, and I can’t follow it, and I don’t think anybody else can.” Chagnon resigned his membership in the American Anthropological Association in the 1980s, stating that he no longer understood the “unintelligible mumbo jumbo of postmodern jargon” taught in the field. In his last book, Theories of Culture in Postmodern Times, Harris virtually agreed with Chagnon. “Postmodernists,” he wrote, “have achieved the ability to write about their thoughts in a uniquely impenetrable manner. Their neo-baroque prose style with its inner clauses, bracketed syllables, metaphors and metonyms, verbal pirouettes, curlicues and figures is not a mere epiphenomenon; rather, it is a mocking rejoinder to anyone who would try to write simple intelligible sentences in the modernist tradition.”…

The quest for knowledge of mankind has in many respects become unrecognizable in the field that now calls itself anthropology. According to Chagnon, we’ve entered a period of “darkness in cultural anthropology.” With his passing, anthropology has become darker still.

I recount all of this for three reasons. First, Chagnon’s findings testify to the immutable urge to violence that lurks within human beings, and to the dominance of “nature” over “nurture”. That dominance is evident not only in the urge to violence (pace Steven Pinker), but in the strong heritability of such traits as intelligence.

The second reason for recounting Chagnon’s saga is to underline the corruption of science in the service of left-wing causes. The underlying problem is always the same: When science — testable and tested hypotheses based on unbiased observations — challenges left-wing orthodoxy, left-wingers — many of them so-called scientists — go all out to discredit real scientists. And they do so by claiming, in good Orwellian fashion, to be “scientific”. (I have written many posts about this phenomenon.) Leftists are, in fact, delusional devotees of magical thinking.

The third reason for my interest in the story of Napoleon Chagnon is a familial connection of sorts. He was born in a village where his grandfather, also Napoleon Chagnon, was a doctor. My mother was one of ten children, most of them born and all of them raised in the same village. When the tenth child was born, he was given Napoleon as his middle name, in honor of Doc Chagnon.

More Presidential Trivia

The modern presidency began with the adored “activist”, Teddy Roosevelt. From TR to the present, there have been only four (of twenty) presidents who first competed in a general election as candidates for the presidency: Taft, Hoover, Eisenhower, and Trump. Trump is alone in having had no previous governmental service before becoming president. There’s no moral to this story. Make of it what you will.

(See also “Presidents: Key Dates and Various Trivia”, to which this commentary has been added.)

Viewing Recommendations: TV Series and Mini-Series

My wife and I have watched many a series and mini-series. Some of them predate the era of VHS, DVD, and streaming, though much of the older fare is now available on DVD (and sometimes on streaming media). Our long list of favorites includes these (right-click a link to open it in a new tab):

Better Call Saul
Rumpole of the Bailey
Slings and Arrows
Pride and Prejudice
Cold Lazarus
Karaoke
Love in a Cold Climate
Oliver Twist
Bleak House
The Six Wives of Henry VIII
Danger UXB
Lonesome Dove
Sunset Song
Lillie
Vienna 1900
The Durrells in Corfu
The Wire
The Glittering Prizes
Bron/Broen
Wallander
Little Dorrit
Justified
Cracker
Pennies from Heaven
Mad Men
The Sopranos
Charters & Caldicott
Reckless
Our Mutual Friend
The First Churchills
The Unpleasantness at the Bellona Club
Murder Must Advertise
The Nine Tailors
Cakes and Ale
Madame Bovary
I, Claudius
Smiley’s People
Reilly: Ace of Spies
Prime Suspect
The Norman Conquests
Bramwell
Prime Suspect 2
Prime Suspect 3
Mystery!: Cadfael
Prime Suspect 5: Errors of Judgement
David Copperfield
Prime Suspect 6: The Last Witness
The Forsyte Saga
Elizabeth R
Jude the Obscure
Clouds of Witness
Country Matters
Notorious Woman
Five Red Herrings
Anna Karenina
Brideshead Revisited
To Serve Them All My Days

If you have more than a passing acquaintance with this genre, you will recognize that almost all of the fare is British. The Brits seem to have a near-lock on good acting and literate and clever writing.

Alas, of the series listed above, only Better Call Saul, Bron/Broen, and The Durrells in Corfu are still running. The Durrells comes to an end this fall for U.S. viewers (Brits have already seen the final season). The final season of Bron/Broen has also aired in Europe, but isn’t yet available in the U.S.

As for Better Call Saul, the fifth season of which will air in 2020, there are rumors of a sixth and final season to follow.

Enjoy!

Rooted in the Real World of Real People

I am far from nostalgic about my home town. But it’s still my home town, and I often revisit it in my mind’s eye.

The only places that I mentally revisit with pleasure are the first home that I can remember — where I lived from age 1 to age 7 — and the first of the three red-brick school houses that I attended.

I haven’t been to my home town in four years. The occasion was the funeral of my mother, who lived to the age of 99.

I may not go back again. But it’s still my home town.

I think of it that way not only because I grew up there but also because it’s a “real” place: a small, mostly run-down, Midwestern city with a population of about 30,000 — the largest city in a county that lies beyond the fringes of the nearest metropolitan area.

Perhaps I’m nostalgic about it, after all, because “real” places like my home town seem to be vanishing from the face of America. By real, I mean places where (real) people still work with their hands; live in houses that are older than they are, and have fewer bathrooms than bedrooms; mow their own lawns, clean their own homes, and make their own meals (except when they partake of the American Legion fish fry or go to a Chick-Fil-A); bowl, hunt, fish, stroll their neighborhoods and know their neighbors (who have been their neighbors for decades); read Reader’s Digest, Popular Mechanics, and romance novels; go to bars that lack ferns and chrome; prefer Fox News and country music to NPR, CNN, MSNBC, and hip-hop; go to church and say grace before meals; and vote for politicians who don’t think of real people as racists, ignoramuses, gun nuts, or religious zealots (“deplorables”, in other words).

In fact, America is (or was) those real places with real people in them. And it is vanishing with them.

P.S. I have lived outside the real world of real people for a very long time, but the older I get, the more I miss it.

Organized?

I see ads on TV (with sound muted), on shopping websites, and in periodicals for organizing systems and services. And I wonder who buys such things. It can’t be persons who are organized; they don’t need them. So it must be persons who are disorganized, and who benefit from them briefly and then go back to their old ways.

Sort of related, and worth a visit if you like trivia, is a post of mine from two years ago: “You Can Look That Up in Your Funk & Wagnall’s”.

Summer School?

Summer has long been my favorite season, not least because it meant summer vacation for many years. In those days of yore, school stayed in session until mid-June and didn’t resume until after Labor Day. (In fact, my college was on the quarter system, and school didn’t resume until late September.) Does anyone know why school (in large swaths of the country) now ends in early May and resumes in mid-to-late August? It doesn’t make sense to me because (1) there’s still cool, rainy weather in May, (2) there’s still a lot of summer left after school resumes, and (3) taxpayers must be paying for a lot of extra air-conditioning as a result of (1) and (2).

There are explanations for this idiocy (e.g., here), but I find them unpersuasive and rather like explanations of how the tail wags the dog.

Knot for Me

I was amused by this photo of Jeff Bezos sporting a Full Windsor knot:

(A compensatory device, perhaps?)

When I first learned to tie a necktie, more than 60 years ago, I used what is properly called a Half Windsor Knot (though it is often called a Windsor Knot). The Half Windsor is neater and more elegant than the Full Windsor, which looks like a chin-cushion.

But when I began working in a professional setting, where necktie wearing was then (early 1960s) de rigueur, I adopted the Four-in-hand knot, which is faster and easier to tie than either of the Windsors. The article linked to in the preceding sentence alleges that the four-in-hand is “notably asymmetric”. But it isn’t if one is careful about pulling the knot up into the “notch” between collar points, and sticks to straight-collar shirts (which also lend a more professional appearance than spread collars and button-downs).

In fact, a properly tied four-in-hand is more elegant than its cumbersome Windsor rivals. For one thing, the four-in-hand knot doesn’t overwhelm the long part of the tie, which (if one has good taste in ties) is what one wants to show off.  In addition, the four-in-hand lends itself to a neat dimple, which can be achieved with the Half Windsor but not the Full Windsor.

The neat (centered) dimple says: “I am a fastidious person” — and I am.

End of a Generation

The so-called greatest generation has died out in my family, as it soon will die out across the land. The recent death of my mother-in-law at age 98 removed from the scene the last of my wife’s and my parents and their siblings: 26 of them in all.

Their birth years ranged from 1903 to 1922. There were, oddly, 18 males as against only 8 females, and the disparity held for all four sets of siblings:

7 to 3 for my mother’s set

2 to 1 for my father’s set

5 to 3 for my wife’s mother’s set

4 to 1 for my wife’s father’s set.

Only one of the 26 died before reaching adulthood (my father’s younger brother at 18 months). Two others (also males) died relatively young. One of my mother’s brothers died just a few weeks before his 40th birthday as a result of a jeep accident (he was on active duty in the Coast Guard). One of my wife’s mother’s brothers died at age 48 as a long delayed result of a blow to the head by a police truncheon.

The other 15 males lived to ages ranging from 65 to 96, with an average age at death of 77 years. The 8 females lived to ages ranging from 69 to 99, with an average age at death of 87 years. The longest-lived of the males was the only one to pass the 90 mark. Four of the females lived into their 90s, dying at ages 91, 96, 98, and 99.

All of the 25 who reached adulthood also married. Only two of them had a marriage end in divorce. All of them were raised in near-poverty or in somewhat comfortable circumstances that vanished with the onset of the Great Depression. All of them worked hard, whether in the home or outside of it; none of them went on welfare; most of the men and two of the women served in uniform during World War II.

Thus passeth a generation sui generis.

Where are Elmer, Herman, Bert, Tom and Charley,
The weak of will, the strong of arm, the clown, the boozer, the fighter?
All, all, are sleeping on the hill….

Where are Ella, Kate, Mag, Lizzie and Edith,
The tender heart, the simple soul, the loud, the proud, the happy one?
All, all, are sleeping on the hill.

Edgar Lee Masters, Spoon River Anthology (“The Hill“)

A Summing Up

This post has been updated and moved to “Favorite Posts”.

V-J Day Stirs Memories

V-J Day in the United States commemorates the official surrender of Japan to the Allied Forces, and the end of World War II. The surrender ceremony took place on September 2, 1945 (the date in Japan), beginning at 9:00 a.m. Tokyo time. The ceremony was held in Tokyo Bay, aboard U.S.S. Missouri, and was presided over by General Douglas MacArthur.

Though it was actually September 1 in the United States at the time of the ceremony, V-J Day is traditionally observed in the U.S. on September 2.

The Monday after the surrender was Labor Day in the U.S. And in those more civilized times (barbarous wars aside), school began on the day after Labor Day.

On September 4, 1945 (the day after Labor Day), I entered kindergarten at the age of 4-2/3 years. Here’s the school that I attended:

[Photo: Polk School]

In those innocent days, students got to school and back home by walking. Here’s the route that I followed as a kindergartener:

[Map: route to Polk School]

A 4-year-old walking several blocks between home and school, usually alone most of the way? Unheard of today, it seems. But that was a different time, in many ways.

For more, see “The Passing of Red-Brick Schoolhouses and a Way of Life”.

Recommended Reading

Leftism, Political Correctness, and Other Lunacies (Dispatches from the Fifth Circle Book 1)

 

On Liberty: Impossible Dreams, Utopian Schemes (Dispatches from the Fifth Circle Book 2)

 

We the People and Other American Myths (Dispatches from the Fifth Circle Book 3)

 

Americana, Etc.: Language, Literature, Movies, Music, Sports, Nostalgia, Trivia, and a Dash of Humor (Dispatches from the Fifth Circle Book 4)

“Look That Up in Your Funk & Wagnall’s”

The title of this post is a catchphrase from Laugh-In (1968-1973), a weekly comedy show that I sometimes found funny, sometimes found amusing, and often found stupid. I bring it up because my parents owned a Funk & Wagnall’s encyclopedia. It was of a 1940s vintage, though I don’t remember perusing it until 1953, when we moved to a house with a built-in living-room bookcase, where the volumes occupied a prominent spot.

In any event, I looked at and into the encyclopedia so often from 1953 until I left for college in 1958 that I still remember the alphabetic divisions noted on the spines of the volumes.

I recall that there was also a final volume which contained a comprehensive index. And there were some “yearbook” updates.

Note the preponderance of words beginning with letters in the first half of the alphabet. Entries beginning with the letters “n” through “z” occupy only 7-plus of the 25 volumes.

What happened to the set? I don’t know. My mother moved out of the house in 1990, not long after the death of my father. Her next abode was much smaller, and the encyclopedia wasn’t in it. I never thought to ask her what happened to it. And now she is beyond asking — having died a few years ago at the age of 99.

Thank you for indulging this bit of nostalgia.