Scrooge buys a turkey and has it delivered anonymously to the Cratchits.
(Illustration from A Christmas Carol,
as published in my hometown newspaper on December 25, 1940.)
Realities presents the best of Politics & Prosperity in the form of polished and updated articles. Some of them consolidate material that was scattered among several related posts at this blog. Give it a look.
Published to date:
From the keyboard of Maverick Philosopher (“Is There a Defensible Sense in Which Human Beings Are Equal?”):
Empirical inequality cannot be denied: by the various empirical measures there is plenty of inequality among individuals and groups. (Trivial example: men on average are taller than women. Height is an example of an empirically measurable attribute.) So if human beings are taken solely in their empirical and material natures, or if human beings are nothing more than material beings, then talk of the equality of all human beings is either false or trivial. (That all human beings are equal in that they all have been born at or near the surface of the earth is empirically true, but trivially true.)….
Given the plain fact of empirical inequality, is there any defensible sense in which human beings could be said to be equal and in possession of equal rights?…
[A] person [in the descriptive sense] is a conscious and thus sentient individual, capable of self-consciousness, possessing feeling and will and memory and the capacity to reason and plan and seek the truth about itself and everything else…. A person in the normative sense is a rights-possessor which, in virtue of having rights, induces in other persons various duties. For example, my right to life induces in you the duty to refrain from taking my life, and your duty derives from my right. In this sense rights and duties are correlative….
My claim, then, is that we are all equal as persons in the descriptive sense, and therefore all equal in the normative sense. That is, if any one of us is a rights-possessor in virtue of being a descriptive person, then every one of us is a rights-possessor in virtue of being a descriptive person. And all of this regardless of sex, race, age, and any other empirical feature. We are equal as persons even if my will is stronger than yours and my intellect more penetrating. We are equal as persons even if you are more compassionate than me.
The point, then, is that equality is grounded in personhood, not in animal constitution….
The above definition of ‘person’ allows for persons that are not human beings and human beings (genetic humans) that are not persons, as well as persons that are human beings…. Examples of humans that are not persons, on my definition of ‘person,’ would be anencephalic human neonates. They would not be persons because of their lack of capacity to develop language and reasoning skills. (For more on the anencephalic business, see Potentiality and the Substance View of Persons, the comments to which were good.) But these anencephalic individuals are nonetheless genetically human as the offspring of human parents.
To repeat, our equality is grounded in our shared personhood despite our considerable empirical differences. Personhood cannot be understood in natural-scientific terms.
I will try to reduce this to a syllogism:
1. A person is a human being who is a conscious and thus sentient individual, capable of self-consciousness, possessing feeling and will and memory and the capacity to reason and plan and seek the truth about itself and everything else. (A human being who lacks the potential for becoming all of those things is not a person.)
2. All persons are equal, in the sense that they all possess or exhibit personhood, as defined in 1.
3. Given that all persons are equal, if any one of them is a rights-possessor, all of them possess the same rights by virtue of their inherent equality.
I am bothered by the distinction made in point 1 between human persons and human non-persons. This opens the door to the kinds of distinctions that are used to justify abortion and involuntary euthanasia.
Point 2 merely says that a person is a person, as defined in point 1. This is a trivial definition of equality. Are Hitler, Stalin, and Mao “equal” to Francis of Assisi, John Paul II, and Mother Teresa? By asking such a question am I proposing the kind of arbitrary distinction that I object to in point 1? (Arbitrary because it emerges from an a priori analysis rather than experience.) The answer is no, as discussed below.
The rights in point 3 seem to be free-floating Platonic entities, independent of the existence and socialization of human beings. But rights are not like that. Nor are they unitary, an all-or-nothing set bestowed on every person. Rights are complex and socially constructed*, and they arise from distinctions of the kind that I make between a Hitler and a Mother Teresa. There are persons who are so despicable that they should have no rights; unlike unwitting fetuses and helpless old people, they should be erased from the face of the earth for the good of humankind.
Social intercourse is capable of generating innumerable gradations of rights, from a positive right to be cared for in one’s old age to a negative right to be allowed to die in peace without the intervention of “life saving” measures. In between are such rights as the right to resume living among free human beings, working at gainful employment, enjoying normal social pleasures, and so on, after having been imprisoned for committing socially defined harms.
Equality, then, is the enjoyment of the same socially bestowed rights as others who are similarly situated (e.g., not incarcerated, eligible for care).
One way of defining liberty is to say that it is the scope of action that is allowed by socially agreed upon rights. Negative rights define what one may not do to others; positive rights define what others must do for the beneficiaries of such rights.
* In the best case, the state would enforce socially constructed negative rights (e.g., the right not to be murdered), and would not be a tool for the fabrication and enforcement of so-called positive rights. Such rights do arise from social intercourse, but when the state enforces them it imposes burdens on persons who are not party to the creation of such rights (e.g., the duty of care for others may vary considerably from culture to culture, even within a nation-state). State and society are synonymous only in small, cohesive, and kinship groups.
Rights, Liberty, the Golden Rule, and the Legitimate State
“Natural Rights” and Consequentialism
More about Consequentialism
Line-Drawing and Liberty
What Are “Natural Rights”?
The Golden Rule and the State
Bounded Liberty: A Thought Experiment
Evolution, Human Nature, and “Natural Rights”
The Meaning of Liberty
Positive Liberty vs. Liberty
On Self-Ownership and Desert
The Golden Rule as Beneficial Learning
Facets of Liberty
Rights: Source, Applicability, How Held
Human Nature, Liberty, and Rationalism
Merit Goods, Positive Rights, and Cosmic Justice
More about Merit Goods
Society and the State
Liberty, Negative Rights, and Bleeding Hearts
Liberty and Society
Genetic Kinship and Society
Liberty as a Social Construct: Moral Relativism?
The Social Animal and the “Social Contract”
The Futile Search for “Natural Rights”
Getting Liberty Wrong
The Harmful Myth of Inherent Equality
The Principles of Actionable Harm
More About Social Norms and Liberty
The Harm Principle Revisited: Mill Conflates Society and State
Liberty and Social Norms Re-examined
Natural Law, Natural Rights, and the Real World
Natural Law and Natural Rights Revisited
You can’t go back home to your family, back home to your childhood … back home to a young man’s dreams of glory and of fame … back home to places in the country, back home to the old forms and systems of things which once seemed everlasting but which are changing all the time – back home to the escapes of Time and Memory.
— Thomas Wolfe, You Can’t Go Home Again
* * *
I have just begun to read a re-issue of Making It, Norman Podhoretz‘s memoir that stirred up the literati of New York City. According to Jennifer Schuessler (“Norman Podhoretz: Making Enemies,” Publishers Weekly, January 25, 1999), Podhoretz’s
frank 1967 account of the lust for success that propelled him from an impoverished childhood in Brooklyn to the salons of Manhattan, … scandalized the literary establishment that once hailed him as something of a golden boy. His agent wouldn’t represent it. His publisher refused to publish it. And just about everybody hated it. In 1972, Podhoretz’s first high-profile personal squabble, with Random House’s Jason Epstein, went public when the New York Times Magazine published an article called “Why Norman and Jason Aren’t Talking.” By 1979, when Podhoretz published Breaking Ranks, a memoir of his conversion from radicalism to militant conservatism, it seemed just about everybody wasn’t talking to Norman.
Next month, Podhoretz will add another chapter to his personal war chronicle with the publication of Ex-Friends: Falling Out with Allen Ginsberg, Lionel and Diana Trilling, Lillian Hellman, Hannah Arendt, and Norman Mailer. In this short, sharp, unabashedly name-dropping book, Podhoretz revisits the old battles over communism and the counterculture, not to mention his bad reviews. But for all his talk of continued struggle against the “regnant leftist culture that pollutes the spiritual and cultural air we all breathe,” the book is a frankly nostalgic, even affectionate look back at the lost world of “the Family,” the endlessly quarreling but close-knit group of left-leaning intellectuals that gathered in the 1940s and ’50s around such magazines as the Partisan Review and Commentary.
Given this bit of background, you shouldn’t be surprised that it was Podhoretz who said that a conservative is a liberal who has been mugged by reality. Nor should you be surprised that Podhoretz wrote this about Barack Obama (which I quote in “Presidential Treason“):
His foreign policy, far from a dismal failure, is a brilliant success as measured by what he intended all along to accomplish….
… As a left-wing radical, Mr. Obama believed that the United States had almost always been a retrograde and destructive force in world affairs. Accordingly, the fundamental transformation he wished to achieve here was to reduce the country’s power and influence. And just as he had to fend off the still-toxic socialist label at home, so he had to take care not to be stuck with the equally toxic “isolationist” label abroad.
This he did by camouflaging his retreats from the responsibilities bred by foreign entanglements as a new form of “engagement.” At the same time, he relied on the war-weariness of the American people and the rise of isolationist sentiment (which, to be sure, dared not speak its name) on the left and right to get away with drastic cuts in the defense budget, with exiting entirely from Iraq and Afghanistan, and with “leading from behind” or using drones instead of troops whenever he was politically forced into military action.
The consequent erosion of American power was going very nicely when the unfortunately named Arab Spring presented the president with several juicy opportunities to speed up the process. First in Egypt, his incoherent moves resulted in a complete loss of American influence, and now, thanks to his handling of the Syrian crisis, he is bringing about a greater diminution of American power than he probably envisaged even in his wildest radical dreams.
For this fulfillment of his dearest political wishes, Mr. Obama is evidently willing to pay the price of a sullied reputation. In that sense, he is by his own lights sacrificing himself for what he imagines is the good of the nation of which he is the president, and also to the benefit of the world, of which he loves proclaiming himself a citizen….
No doubt he will either deny that anything has gone wrong, or failing that, he will resort to his favorite tactic of blaming others—Congress or the Republicans or Rush Limbaugh. But what is also almost certain is that he will refuse to change course and do the things that will be necessary to restore U.S. power and influence.
And so we can only pray that the hole he will go on digging will not be too deep for his successor to pull us out, as Ronald Reagan managed to do when he followed a president into the White House whom Mr. Obama so uncannily resembles. [“Obama’s Successful Foreign Failure,” The Wall Street Journal, September 8, 2013]
Though I admire Podhoretz’s willingness to follow reality to its destination in conservatism — because I made the same journey myself — I am drawn to his memoir by another similarity between us. In the Introduction to the re-issue, Terry Teachout writes:
Making It is never more memorable than when it describes its author’s belated discovery of “the brutal bargain” to which he was introduced by “Mrs. K.,” a Brooklyn schoolteacher who took him in hand and showed him that the precocious but rough-edged son of working-class Jews from Galicia could aspire to greater things— so long as he turned his back on the ghettoized life of his émigré parents and donned the genteel manners of her own class. Not until much later did he realize that the bargain she offered him went even deeper than that:
She was saying that because I was a talented boy, a better class of people stood ready to admit me into their ranks. But only on one condition: I had to signify by my general deportment that I acknowledged them as superior to the class of people among whom I happened to have been born. . . . what I did not understand, not in the least then and not for a long time afterward, was that in matters having to do with “art” and “culture” (the “life of the mind,” as I learned to call it at Columbia), I was being offered the very same brutal bargain and accepting it with the wildest enthusiasm.
So he did, and he never seriously doubted that he had done the only thing possible by making himself over into an alumnus of Columbia and Cambridge and a member of the educated, art-loving upper middle class. At the same time, though, he never forgot what he had lost by doing so, having acquired in the process “a distaste for the surroundings in which I was bred, and ultimately (God forgive me) even for many of the people I loved.”
It’s not an unfamiliar story. But it’s a story that always brings a pang to my heart because it reminds me too much of my own attitudes and behavior as I “climbed the ladder” from the 1960s to the 1990s. Much as I regret the growing gap between me and my past, I have learned from experience that I can’t go back, and don’t want to go back.
What happened to me is probably what happened to Norman Podhoretz and tens of millions of other Americans. We didn’t abandon our past; we became what was written in our genes.
This mini-memoir is meant to illustrate that thesis. It is aimed at those readers who can’t relate to a prominent New Yorker, but who might see themselves in a native of flyover country.
My “ghetto” wasn’t a Jewish enclave like Podhoretz’s Brownsville, but an adjacent pair of small cities in the eastern flatlands of Michigan, both of them predominantly white and working-class. They are not suburbs of Detroit — as we used to say emphatically — nor of any other largish city. We were geographically and culturally isolated from the worst and best that “real” cities have to offer in the way of food, entertainment, and ethnic variety.
My parents’ roots, and thus my cultural inheritance, were in small cities, towns and villages in Michigan and Ontario. Life for my parents, as for their forebears, revolved around making a living, “getting ahead” by owning progressively nicer (but never luxurious) homes and cars, socializing with friends over card games, and keeping their houses and yards neat and clean.
All quite unexceptional, or so it seemed to me as I was growing up. It only began to seem exceptional when I became the first scion of the family tree to “go away to college,” as we used to say. (“Going away” as opposed to attending a local junior college, as did my father’s younger half-brother about eight years before I matriculated.)
Soon after my arrival on the campus of a large university, whose faculty and students hailed from around the world, I began to grasp the banality of my upbringing in comparison to the cultural richness and sordid reality of the wider world. It was a richness and reality of which my home-town contemporaries and I knew little because we were raised in the days of Ozzie and Harriet — before the Beatles, Woodstock, bearded men with pony-tails, shacking up as a social norm, widespread drug use, and the vivid depiction of sex in all of its natural and unnatural variety.
My upbringing, like that of my home-town contemporaries, was almost apolitical. If we overheard our parents talking about politics, we overheard a combination of views that today seems unlikely: suspicion of government; skepticism about unions (my father had to join one in order to work); disdain for “fat cats”; sympathy for “the little guy”; and staunch patriotism. There was nothing about civil rights and state-enforced segregation, which were seen (mistakenly) as peculiarly Southern issues. Their own racism was seldom in evidence because blacks generally “knew their place” in our white-dominated communities.
And then, as a student at a cosmopolitan Midwestern university (that isn’t an oxymoron), I began to learn — in and out of class. The out-of-class lessons came through conversations with students whose backgrounds differed greatly from mine, including two who had been displaced persons in the wake of World War II. My first-year roommate was a mild-mannered Iranian doctoral student whose friends (some of them less mild-mannered) spoke openly about the fear in which Iranians lived because of SAVAK‘s oppressive practices. In my final year as an undergraduate I befriended some married graduate students, one of whom (an American) had spent several years in Libya as a geologist for an American oil company and had returned to the States with an Italian wife.
One of the off-campus theaters specialized in foreign films, which I had never before seen, and which exposed me to people, places, attitudes, and ideas that were intellectually foreign to me, but which I viewed avidly and with acceptance. My musical education was advanced by a friendship with a music major, through whom I met other music majors and learned much about classical music and, of all things, Gilbert and Sullivan. One of the music majors was a tenor who had to learn The Mikado, and did so by playing a recording of it over and over. I became hooked, and to this day can recite large chunks of the libretto. I used to sing them, but my singing days are over.
Through my classes — and often through unassigned reading — I learned how to speak and read French (fluently, those many years ago), and ingested various-sized doses of philosophy, history (ancient and modern), sociology, accounting (the third of four majors), and several other things that escape me at the moment.
Through economics (my fourth and final major), I learned (but didn’t then question) the contradictory tenets of microeconomics (how markets work to allocate resources and satisfy wants efficiently) and macroeconomics (then dominated by the idea of government’s indispensable role in the economic order). But I was drawn in by the elegance of economic theory, and mistook its spurious precision for deep understanding, though I have since rejected macroeconomic orthodoxy (e.g., see this).
My collegiate “enlightenment” was mild, by today’s standards, but revelatory to a small-city boy. And I was among the still relatively small fraction of high-school graduates who went away to college. So my exposure to a variety of people, cultures, and ideas immediately set me apart — apart not only from my parents and the members of their generation, but also apart from most of the members of my own generation.
What set me apart more than anything was my loss of faith. In my second year I went from ostentatiously devout Catholicism to steadfast agnosticism in a span of weeks. I can’t reconstruct the transition at a remove of almost 60 years, but I suspect that it involved a mixture of delayed adolescent rebellion, a reckoning (based on things I had learned) that the roots of religion lay in superstition, and a genetic predisposition toward skepticism (my father was raised Protestant but scorned religion in his mild way). At any rate, when I walked out of church in the middle of Mass one Sunday morning, I felt as if I had relieved myself of a heavy burden and freed my mind for the pursuit of knowledge.
The odd thing is that, to this day, I retain feelings of loyalty to the Church of my youth — the Church of the Latin Mass (weekly on Sunday morning, not afternoon or evening), strict abstinence from meat on Friday, confession on Saturday, fasting from midnight on Sunday (if one were in a state of grace and fit for Holy Communion), and the sharp-tongued sisters with sharp-edged rulers who taught Catechism on Saturday mornings (parochial school wasn’t in my parents’ budget). I have therefore been appalled, successively, by Vatican Council II, most of the popes of the past 50 years, the various ways in which being a Catholic has become easier, and (especially) the egregious left-wing babbling of Francis. And yet I remain an agnostic who only in recent years has acknowledged the logical necessity of a Creator, but probably not the kind of Creator who is at the center of formal religion. Atheism — especially of the strident variety — is merely religion turned upside down: a belief in something that is beyond proof. I scorn it.
To complete this aside, I must address the canard peddled by strident atheists and left-wingers (there’s a lot of overlap) about the evil done in the name of religion. I say this: Religion doesn’t make fanatics; it attracts them (in relatively small numbers), though some Islamic sects seem to be bent on attracting and cultivating fanaticism. Far more fanatical and attractive to fanatics are the “religions” of communism, socialism (including Hitler’s version), and progressivism (witness the snowflakes and oppressors who now dominate the academy). I doubt that the number of murders committed in the name of religion amounts to one-tenth of the number of murders committed by three notable anti-religionists: Hitler (yes, Hitler), Stalin, and Mao. I also believe — with empirical justification — that religion is a bulwark of liberty, whereas the cult of libertarianism — usually practiced by agnostics and atheists — is not (e.g., this post and the others linked to therein).
It’s time to return to the chronological thread of my narrative. I have outlined my graduate-school and work experiences in “About.” The main thing to note here is what I learned during the early mid-life crisis which took me away from the D.C. rat race for about three years, as owner-operator of a (very) small publishing company in a rural part of New York State.
In sum, I learned to work hard. Before my business venture I had coasted along using my intelligence but not a lot of energy, but nevertheless earning praise and good raises. I was seldom engaged in what I was doing: the work seemed superficial and unconnected to anything real to me. That changed when I became a business owner. I had to meet a weekly deadline or lose advertisers (and my source of income), master several new skills involved in publishing a weekly “throwaway” (as the free Pennysaver was sometimes called), and work six days a week with only two brief respites in three years. Something clicked, and when I gave up the publishing business and returned to the D.C. area, I kept on working hard — as if my livelihood depended on it.
And it did. Much as I had loved being my own boss, I wanted to live and retire more comfortably than I could on the meager income that flowed uncertainly from the Pennysaver. Incentives matter. So in the 18 years after my return to the D.C. area I not only kept working hard and with fierce concentration, but I developed (or discovered) a ruthless streak that propelled me into the upper echelon of the think-tank.
And in my three years away from the D.C. area I also learned, for the first time, that I couldn’t go home again.
I was attracted to the publishing business because of its location in a somewhat picturesque village. The village was large enough to sport a movie theater, two supermarkets, and a variety of commercial establishments, including restaurants, shoe stores, clothing stores, jewelers, a Rite-Aid drug store, and even a J.J. Newberry dime store. It also had many large, well-kept homes. All in all, it appealed to me because, replete with a “real” main street, it reminded me of the first small city in which I grew up.
But after working and associating with highly educated professionals, and after experiencing the vast variety of restaurants, museums, parks, and entertainment of the D.C. area, I found the village and its natives dull. Not only dull, but also distant. They were humorless and closed to outsiders. It came to me that the small cities in which I had grown up were the same way. My memories of them were distorted because they were memories of a pre-college boy who had yet to experience life in the big city. They were memories of a boy whose life centered on his parents and a beloved grandmother (who lived in a small village of similarly golden memory).
You can’t go home again, metaphorically, if you’ve gone away and lived a different life. You can’t because you are a different person than you were when you were growing up. This lesson was reinforced at the 30-year reunion of my high-school graduating class, which occurred several years after my business venture and a few years after I had risen into the upper echelon of the think-tank.
There I was, with my wife and sister (who graduated from the same high school eight years after I did), happily anticipating an evening of laughter and shared memories. We were seated at a table with two fellows who had been good friends of mine (and their wives, whom I didn’t know). It was deadly boring; the silences yawned; we had nothing to say to each other. One of the old friends, who had been on the wagon, was so unnerved by the forced bonhomie of the occasion that he fell off the wagon. Attempts at mingling after dinner were awkward. My wife and sister readily agreed to abandon the event. We drove several miles to an elegant, river-front hotel where we had a few drinks on the deck. Thus the evening ended on a cheery note, despite the cool, damp drizzle. (A not untypical August evening in Michigan.)
I continued to return to Michigan for another 27 years, making what might be my final trip for the funeral of my mother, who lived to the age of 99. But I went just to see my parents and siblings, and then only out of duty.
The golden memories of my youth remain with me, but I long ago gave up the idea of reliving the past that is interred in those memories.
I skipped the Academy Awards show, as I always do. This year’s show was especially skip-worthy, inasmuch as it featured self-flagellation by Hollywood types because of the lack of “diversity” (i.e., not enough blacks) among Oscar nominees. The self-flagellators dare not speak the truth, which is that (1) Hollywood makes movies for profit, (2) movies must therefore appeal to a wide audience, (3) the average black has less money to spend than the average white, and (4) blacks remain in the vast minority of the populace.
Anyway, movies aren’t what they used to be, and never will be. Thus the following diatribe, which I have borrowed from my book, Americana, Etc.
My inventory of modern films — those released in 1932 and later — comprises 2,369 titles, 2,067 of which I have rated, and 660 of those (32 percent) at 8, 9, or 10 on the Internet Movie Database (IMDb) scale. But those numbers mask vast differences in the quality of modern films, which were produced in three markedly different eras:
• Golden Age (1932-1942) — 237 films seen, 207 rated, 117 favorites (57 percent)
• Dreary Years (1943-1965) — 368 films seen, 287 rated, 110 favorites (38 percent)
• Abysmal Epoch (1966-present) — 1,764 films seen, 1,573 rated, 433 favorites (28 percent)
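The favorite-rate percentages above follow directly from the tallies. A minimal sketch (using only the counts quoted in the text) confirms the arithmetic:

```python
# Check the favorite-share percentages for each era, using the
# rated-film and favorite counts given in the tallies above.
eras = {
    "Golden Age (1932-1942)":     (207, 117),
    "Dreary Years (1943-1965)":   (287, 110),
    "Abysmal Epoch (1966-present)": (1573, 433),
}

for era, (rated, favorites) in eras.items():
    pct = round(100 * favorites / rated)  # e.g., 117/207 -> 57
    print(f"{era}: {favorites} favorites of {rated} rated ({pct} percent)")
```

Running it reproduces the quoted figures of 57, 38, and 28 percent (the overall figure, 660 of 2,067, likewise rounds to 32 percent).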
What made the Golden Age golden, and why did films go from golden to dreary to abysmal? Read on.
To understand what made the Golden Age golden, let’s consider what makes a great movie: a novel or engaging plot, characters who entice or excite, dialogue that is fresh (and witty or funny, if the film calls for it), strong performances (acting, singing, and/or dancing), and excellent production values (locations, cinematography, sets, costumes, etc.). The Golden Age was golden largely because the advent of sound fostered creativity — plots could be advanced through dialogue, actors could deliver real dialogue, and singers and orchestras could deliver the real thing.
It took a few years to fully realize the potential of sound, but movies hit their stride just as Americans were seeking respite from the cares of a lingering and deepening Depression. Studios vied with each other to entice movie-goers with new plots (or plots that seemed new when embellished with sound), fresh and often wickedly witty dialogue, and — perhaps most important of all — captivating performers. The generation of super-stars that came of age in the 1930s consisted mainly of handsome men and beautiful women, blessed with distinctive personalities, and equipped by their stage experience to deliver their lines vibrantly and with impeccable diction.
What were the great movies of the Golden Age, and who starred in them? Here’s a sample of the titles: 1932 — Grand Hotel; 1933 — Dinner at Eight, Flying Down to Rio, Morning Glory; 1934 — It Happened One Night, The Thin Man, Twentieth Century; 1935 — Mutiny on the Bounty, A Night at the Opera, David Copperfield; 1936 — Libeled Lady, Mr. Deeds Goes to Town, Show Boat; 1937 — The Awful Truth, Captains Courageous, Lost Horizon; 1938 — The Adventures of Robin Hood, Bringing up Baby, Pygmalion; 1939 — Destry Rides Again, Gunga Din, The Hunchback of Notre Dame, The Wizard of Oz, The Women; 1940 — The Grapes of Wrath, His Girl Friday, The Philadelphia Story; 1941 — Ball of Fire, The Maltese Falcon, Suspicion; 1942 — Casablanca, The Man Who Came to Dinner, Woman of the Year.
And who starred in the greatest movies of the Golden Age? Here’s a goodly sample of the era’s superstars, a few of whom came on the scene toward the end: Jean Arthur, Fred Astaire, John Barrymore, Lionel Barrymore, Ingrid Bergman, Humphrey Bogart, James Cagney, Claudette Colbert, Ronald Colman, Gary Cooper, Joan Crawford, Bette Davis, Irene Dunne, Nelson Eddy, Errol Flynn, Joan Fontaine, Henry Fonda, Clark Gable, Cary Grant, Jean Harlow, Olivia de Havilland (the last survivor among pre-World War II stars), Katharine Hepburn, William Holden, Leslie Howard, Allan Jones, Charles Laughton, Carole Lombard, Myrna Loy, Jeanette MacDonald, Joel McCrea, Merle Oberon, Laurence Olivier, William Powell, Ginger Rogers, Rosalind Russell, Norma Shearer, Barbara Stanwyck, James Stewart, and Spencer Tracy. There were other major stars, and many popular supporting players, but it seems that a rather small constellation of superstars commanded a disproportionate share of the leading roles in the best movies of the Golden Age.
Why did movies go into decline after 1942’s releases? World War II certainly provided an impetus. The war diverted resources from the production of major theatrical films; grade-A features gave way to low-budget fare. And some of the superstars of the Golden Age went off to war. (Two who remained civilians — Leslie Howard and Carole Lombard — were killed during the war.) With the resumption of full production in 1946, the surviving superstars who hadn’t retired were fading fast, though their presence still propelled many films of the Dreary Years.
Stars come and go, however, as they have done since Shakespeare’s day. The decline into the Dreary Years and Abysmal Epoch has deeper causes than the dimming of old stars:
• The Golden Age had deployed all of the themes that could be used without explicit sex, graphic violence, and crude profanity — none of which became an option for American movie-makers until the mid-1960s.
• Prejudice got significantly more play after World War II, but it’s a theme that can’t be used very often without becoming tiresome.
• Other attempts at realism (including film noir) resulted mainly in a lot of turgid trash laden with unrealistic dialogue and shrill emoting — keynotes of the Dreary Years.
• Hollywood productions often sank to the level of TV, apparently in a misguided effort to compete with that medium. The use of garish Technicolor — a hallmark of the 1950s — highlighted the unnatural neatness and cleanliness of settings that should have been rustic if not squalid.
• The transition from dreary to abysmal coincided with the cultural “liberation” of the mid-1960s, which saw the advent of the “f” word in mainstream films. Yes, the Abysmal Epoch brought more realistic plots and better acting (thanks mainly to the Brits). But none of that compensates for the anti-social rot that set in around 1966: drug-taking, drinking, and smoking are glamorized; profanity proliferates to the point of annoyance; sex is all about lust and little about love; violence is gratuitous and beyond the point of nausea; corporations and white, male Americans with money are evil; the U.S. government (when Republican-controlled) is in thrall to that evil; etc., etc., etc.
To be sure, there have been outbreaks of greatness since the Golden Age. But every excellent film produced during the Dreary Years and Abysmal Epoch has been surrounded by outpourings of dreck, schlock, and bile. The generally tepid effusions of the Dreary Years were succeeded by the excesses of the Abysmal Epoch: films that feature noise, violence, sex, and drugs for the sake of noise, violence, sex, and drugs; movies whose only “virtue” is their appeal to such undiscerning groups as teeny-boppers, wannabe hoodlums, resentful minorities, and reflexive leftists; movies filled with “bathroom” and other varieties of “humor” so low as to make the Keystone Cops seem paragons of sophisticated wit.
In sum, movies have become progressively worse since the end of the Golden Age. Here’s a case in point: Last year I tried to watch Birdman, which won the Oscar for Best Picture of 2014. It failed to rise above trendy quirkiness, foul language, and stilted (though improvised) dialogue. I turned it off. It’s the only Best Picture winner, of those that I’ve watched, that I couldn’t sit through.
There have now been 89 Best Picture winners, and I’ve seen 69 of them. (I include Birdman because the several minutes of it that I watched seemed like two hours.) Of the 89 winners, only 14 are the highest-rated of the feature films released in the same year: All Quiet on the Western Front (1930), It Happened One Night (1934), Casablanca (1942), Patton (1970), The Godfather (1972), The Sting (1973), The Godfather: Part II (1974), One Flew Over the Cuckoo’s Nest (1975), The Deer Hunter (1978), The Silence of the Lambs (1991), Schindler’s List (1993), Gladiator (2000), The Lord of the Rings: The Return of the King (2003), and The Departed (2006).
A movie-watcher in search of good entertainment will often find it in a film other than one from the Best Picture list. But don’t put too much stock in the relative ratings of films across the years. If you’re in search of a great comedy, for example, go with one of the top-rated choices from the 1930s — It Happened One Night, A Night at the Opera, or Bringing Up Baby, for example — as opposed to more recent fare, such as Toy Story, The Big Lebowski, or The Grand Budapest Hotel. And if you want sustained laughter without the bother of dialogue, look no further than the silent films of Harold Lloyd and Buster Keaton.
I have withdrawn my books from print publication so that I can make them available free to readers via Google Drive. Here are the links:
Hint: If you save a file in Word, you can then send it to your Kindle, though I don’t know how the formatting will look.
If you prefer a bona fide Kindle edition, they’re cheap:
Americana, Etc. ($1.99)
The Kindle books are free to Kindle Unlimited subscribers.
Americana, Etc., is Volume IV of the series Dispatches from the Fifth Circle. The first three volumes are
Leftism, Political Correctness, and Other Lunacies
On Liberty: Impossible Dreams, Utopian Schemes
“We the People” and Other American Myths
There are links to and descriptions of Volumes I, II, and III in the four preceding posts.
Here’s some of the Introduction to Volume IV:
Volumes I, II, and III of this series are rather deep. It’s time for a break. The entries in this volume are sometimes serious, but the mood of the volume is light. It’s also rather random, jumping from baseball to movies to classical music to nostalgia, and so on.
I’ve included a long, final entry, “On Writing,” for want of a better venue. “On Writing” incorporates some of the ideas advanced in a few earlier entries, but it goes well beyond them. I commend it to you if you’re serious about becoming a better writer.
An annotated table of contents will give you an idea of the broad range of topics covered in Volume IV:
Political Parlance — A translation of words and phrases often used in politics.
Some Management Tips — A quiz to find out if you’re the pointy-haired boss.
Ten-Plus Commandments of Liberalism, er, Progressivism — What to believe if you want to be a good progressive (oxymoron alert).
Pet Peeves — The things that get my goat (and should get yours, too).
To Pay or Not to Pay — “Shakespeare” on taxes.
The Ghost of Impeachments Past Presents: The Trials of William Jefferson Whatsit — How Clinton’s impeachment trial should have gone.
The Good Old Days — Nostalgia.
Getting It Perfect — A satirical look at the Constitution’s amendments.
His Life as a Victim — Bill Clinton’s biography reviewed.
Modernism and the Arts — Why classical music and art went to the dogs in the 20th century.
Reveries — A remembrance of places past.
Thinking Back — The good and bad of technological change.
Thoughts of Winter — A selection of poetry for enjoying while sitting by the fire on a snowy evening.
Baseball Nostalgia — The Detroit Tigers’ “real” ballpark and great players.
Comix, Past and Present — The comic strips and books of my youth, some of which survive.
PC Madness — Why aren’t Norwegians up in arms about the Minnesota “Vikings”?
The Seven Faces of Blogging — A different take on Shakespeare’s “Seven Ages of Man.”
Christmas Movies — The best of the bunch.
Mister Hockey — Gordie Howe beats Wayne Gretzky, hands down, and I have the numbers to prove it.
The Passing of Red-Brick Schoolhouses and a Way of Life — The end of the age of innocence.
My Old Sears Home — Sears used to sell houses, and I owned one of them.
Baseball Realignment — Adding spice to the game, cutting off the cold ends of the season.
Wordplay — The vagaries of English pronunciation in a few lines.
Nameplay — Fun facts about the waxing and waning popularity of first names, with some excursions into presidents’ names.
Pride and Prejudice on Film — My favorite version, and others.
September Songs — Autumnal melancholia.
Testing for Steroids — McGwire and Bonds, guilty by the numbers.
Baseball’s Losers — Three long-suffering franchises.
The War: A Final Grade — How to feel guilty about winning the “good” war.
Did Roger Do It? — Probably, but not by the numbers.
Stuff White (Liberal Yuppie) People Like — You’ll like it if you aren’t a white liberal yuppie.
Baseball and Groundhog Day — Arcane facts about baseball standings.
The Seven-Game World Series — Not as suspenseful as it could be.
Presidential Trivia — More arcana about names, heights, longevity, etc.
The American League’s Greatest Hitters — Was Ty Cobb really the greatest of them all?
Driving and Politics — What a person’s driving habits (might) say about his politics.
A Trip to the Movies — The quality of films over the decades, with some bows to the best.
Men’s Health — Remedying an oversight in this age of feminism.
Arm-Waving and Longevity — Do conductors really live longer, and is arm-waving the cause?
So, Who Made You Laugh? — A tribute to the many great Jewish comedians and comic actors whose performances I have enjoyed for almost seven decades.
Hopefully Arrives — Language debasement with a stamp of approval (not by me).
Why Prescriptivism? — The constructive role of language rules.
I’ve Got a Little List — My updating of Sir William S. Gilbert’s lyrics.
Speaking in Foreign Tongues — Why is it hard for adult Americans to speak foreign languages properly?
A Guide to the Pronunciation of General American English — For foreigners, Southerners, and New Englanders.
Home-Field Advantage — It’s real.
Looking Askance — Satirical takes on military strategy, cabinet positions, politicians’ memoirs, and public education.
Competitiveness in Major League Baseball — There’s a lot more of it than there used to be.
May the Best Team Lose — The meaninglessness of baseball’s post-season playoffs.
“Than I” or “Than Me”? — I have the answer.
The Hall of Fame Reconsidered — How to cull the riff-raff from baseball’s “shrine.”
On Writing — How to and how not to write right.
I have cut the prices of the Kindle editions of my books to $0.99 — that’s 99 cents each. Follow the links to take advantage of this deal:
My latest book is now available at Amazon.com
From the Preface:
I decided to title this volume “We the People” and Other American Myths because there are so many misconceptions about the governance of the United States, beginning with the fable that the Constitution is somehow a product of “the people.” Following closely upon that myth is the belief that the Supreme Court — which has violated the Constitution countless times — is the final and sole interpreter of its meaning.
Two other myths that I address in this volume are the illegality of secession and the idea that secession is “bad” because it’s associated with the defense of slavery. Secession is legal, and the South had good reason to secede, other than a desire to preserve slavery. Other myths addressed in the volume include:
• the constitutionality of the sacred cow known as Social Security
• freedom of the press, freedom of speech, and privacy as absolute rights under the Constitution
• feel-good attitudes, such as nation = society, active presidents are great presidents, and democracy is to die for.
There’s much more packed into the 49 essays that make up the volume.
My new book is now available at Amazon.com,
The paperback version is priced much too high at $16.95, though it’s just above the minimum dictated by Amazon. The Kindle edition is only $6.95.
What’s in it? An introductory chapter and 56 essays drawn from posts at Politics & Prosperity and Liberty Corner.
Here’s the text of the introductory chapter, “What Lies Ahead” (1. INTRODUCTION, 2. UNDERSTANDING LIBERTY, etc., refer to the five parts into which the book is divided):
The next two essays are “A Declaration and Defense of My Prejudices about Governance” and “Parsing Political Philosophy.” “A Declaration…” tells you where I’m coming from, if you haven’t already figured it out by reading the first volume, the preface to this one, or this introductory essay. “Parsing…” details my political philosophy (right-minarchism), puts it in perspective, and presages much of what follows in Parts 2 through 5 of this volume.
2. UNDERSTANDING LIBERTY
I begin Part 2 with essays which argue that liberty is a product of social intercourse, not abstract principles, and certainly not ratiocination. Liberty is a modus vivendi, not the result of a rational political scheme, though a rational political scheme, such as the one laid out in the Constitution of the United States, can promote liberty.
The key to a libertarian modus vivendi is the evolutionary development and widespread observance of social norms that foster peaceful coexistence and mutually beneficial cooperation. And that is liberty. The state’s sole legitimate role, other than procedural ones (e.g., the administration of voting) is the defense of liberty from foreign and domestic predators.
Is my claim that liberty is a modus vivendi based on social norms an endorsement of moral relativism? It is not, as I explain. There is also much in Part 2 about civil society, the institutions of which (family, church, club, etc.) are the keepers and transmitters of social norms. The second part also addresses the relation of liberty to science, religion, and democracy. There are several essays on the state of liberty in America (and many more in Volume I [my previous book]).
3. RIGHTS: NEGATIVE, POSITIVE, AND “NATURAL”
Liberty enables a person to exercise rights, which are the subject of Part 3. Those rights derive from social norms, which set the boundaries of permissible behavior. Social norms arise from the operation of the Golden Rule. Rights are “natural” only in the sense that they result naturally from social intercourse; they are not mysterious essences that inhere in human beings.
In a regime of liberty, rights are negative rather than positive; that is, they oblige others (including the state) to leave a person alone when his behavior is within the boundaries established by voluntarily evolved social norms. Positive rights, by contrast, entitle certain identifiable groups to benefits, the costs of which must be defrayed by everyone else. This is a fool’s game, of course, because it spurs the creation of additional positive rights for yet other groups, leaving almost everyone in the position of paying, indirectly, for benefits received. But it’s not a zero-sum game because the “house” — government — rakes in a percentage of the take.
4. LIBERTARIANISM, TRUE AND FALSE
In Part 4, I explain why traditional conservatism is true libertarianism. I also detail the vacuousness and fatuousness of the doctrines that commonly pass for libertarianism, anarchism among them.
Standard leave-me-alone libertarianism (based on the harm principle) is a form of rationalism: an undue reliance on pure reason, without regard for the realities of nature and human nature. Rationalists are fond of conjuring “perfect” political arrangements that simply won’t work.
Part 4 also exposes the essential authoritarianism of some so-called libertarians — oxymoronically called left-libertarians — who are intolerant of liberty when it yields the “wrong” results. In that respect, many so-called libertarians are like modern liberals (i.e., leftists). So, I end the Part 4 with some essays that trace the descent of modern liberalism from classical liberalism, and illuminate the parallels between modern liberalism and the “libertarian” left. (The sins of modern liberalism are treated at length in Volume I.)
5. SOME MORE “ISMs”
The final part explores Objectivism, anarchism, utilitarianism, and fascism. (I will tackle another prominent and relevant “ism” — “libertarian” paternalism — in a later volume.)
I address Objectivism because it is often confused with standard leave-me-alone libertarianism. Objectivism is a cult whose members swear fealty to “reality,” in the name of an unrealistic, sophomoric philosophy. It might as well be standard leave-me-alone libertarianism.
As long as I’m writing about unrealistic, sophomoric philosophy, there’s anarchism. I address it fleetingly in Parts 1 through 4. I return to it in Part 5 just to drive home my arguments against it.
Utilitarianism isn’t to be confused with consequentialism, which simply holds that liberty and its concomitant, negative rights, are desired (and thus desirable) because of the superior social and economic consequences of peaceful coexistence and mutually beneficial cooperation. (Liberty is desired and desirable on its own account, of course.) Utilitarians are wont to evaluate social and economic policies from the standpoint of a dictatorial actor (though utilitarians don’t seem to grasp this implication of their practice). The conceit of utilitarianism is the (implied or express) existence of a social-welfare function which (somehow) sums the happiness and unhappiness of a relevant portion of humanity (the portion in which a utilitarian is interested). A policy or program is favored if it yields a greater sum of happiness (“the greatest amount of happiness altogether”), even if that greater sum includes a rise in A’s happiness at the expense of B (who is unlikely to be amused by the outcome).
Finally, I come to fascism, which seems to be the inevitable fate of representative democracies. Popular imagery to the contrary notwithstanding, fascism isn’t jack-booted despotism; rather:
Fascism is a system in which the government leaves nominal ownership of the means of production in the hands of private individuals but exercises control by means of regulatory legislation and reaps most of the profit by means of heavy taxation. In effect, fascism is simply a more subtle form of government ownership than is socialism. Under fascism, producers are allowed to keep a nominal title to their possessions and to bear all the risks involved in entrepreneurship, while the government has most of the actual control and gets a great deal of the profit (and takes none of the risks). The U.S.A. is moving increasingly away from a free-market economy and toward fascist totalitarianism. [Linda and Morris Tannehill, The Market for Liberty, p. 18]
Fascism usually is described as a right-wing phenomenon, but with respect to liberty there’s no difference between the extreme right and the extreme left. They are merely different manifestations of despotism. In the United States, fascism takes the form of a “soft” despotism, one that is outwardly benign, but which suppresses liberty nonetheless.
The arrival of American fascism (“soft” despotism, if you prefer) was inevitable because representative democracy empowers government to act on behalf of “the people.” But government can do so only by stripping power from the people through taxation and regulation. Politicians hold onto their power by seeming to deliver special benefits to various segments of the populace.
That the benefits are largely illusory, as discussed earlier, matters little. The benefits are visible (to those who receive them), while the tax and regulatory burdens are diffuse. And so, “the people” keep asking for more, and the state keeps spending, taxing, and regulating to deliver it.
Thus does democracy destroy liberty.
On that cheery note…
On sale now at Amazon.com (excerpts below):
This is a retired blogger’s version of John Henry Newman’s Apologia Pro Vita Sua. It may seem immodest of me to suggest intellectual kinship with Cardinal Newman, but bloggers aren’t modest. If they were, they wouldn’t expose their innermost thoughts to the world.
It’s true that many bloggers choose to remain anonymous. But that doesn’t diminish their immodesty — or their credibility. Ideas should be judged on their own merits, not by their author’s reputation or rank.
If Alexander Hamilton, James Madison, and John Jay chose to remain anonymous (to all but a few keen observers) when they wrote as Publius to urge ratification of the Constitution, why can’t a blogger emulate them in urging policies that would restore constitutional governance (as I do in many of my posts)?…
* * *
When I lived in the D.C. area and subscribed to The Washington Post, I occasionally read a column by Leonard Pitts Jr. This masochistic practice served two purposes. First, it exercised my cardiovascular system (i.e., raised my heart rate and blood pressure). Second, it helped me to keep up with what passes for wisdom among the race-card-playing set.
Mr. Pitts, who is a syndicated columnist operating out of The Miami Herald, comes by his race-card-playing naturally, as a black and — given his age (about 50) — a likely beneficiary of reverse discrimination (a.k.a. affirmative action). I should note that Pitts plays the race-card game clumsily, probably because his mental warehouse is stocked with gross generalizations and logical fallacies.
I was provoked to write this post by a recent Pitts column, to which I will come, where (in passing) he defends the socialization of medicine because other things also have been socialized. By that logic, Pitts would excuse the murder of his wife because millions of murders already have been committed….
* * *
Some time back, Tom Smith of The Right Coast referred to the NYT columnist and pseudo-conservative David Brooks as “prissy little Miss Brooks.” Smith’s recycling of the appellation has not diminished its satirical effect — or its substantive accuracy.
Miss Brooks recently cringed when she contemplated an America without government, in the aftermath of a victorious Tea Party movement. Miss Brooks, it seems, is besotted with the manliness of limited-but-energetic governments
that used aggressive [emphasis added] federal power to promote growth and social mobility. George Washington used industrial policy, trade policy and federal research dollars to build a manufacturing economy alongside the agricultural one….
* * *
Thomas Sowell‘s Intellectuals and Society is a rewarding and annoying book.
The book is rewarding because it adds to the thick catalog of left-wing sins that Sowell has compiled and explicated in his long career as a public intellectual. When Sowell criticizes the anti-gun, soft-on-crime, peace-at-any-price, tax-spend-and-regulate crowd, he does it by rubbing their noses in the facts and figures about the messes that have been created by the policies they have promoted….
The left’s essential agenda is the repudiation of ordered liberty of the kind that arises from evolved social norms, and the replacement of that liberty by sugar-coated oppression. The bread and circuses of imperial Rome have nothing on Social Security, Medicaid, Medicare, Obamacare, and the many other forms of personal and corporate welfare that are draining America of its wealth and élan. All of that “welfare” has been bought at the price of economic and social liberty, which are indivisible.
Leftists like to say that there is a difference between opposition and disloyalty. But, in the case of the left, opposition arises from a fundamental kind of disloyalty. For, at bottom, the left pursues its agenda because it hates the idea of what America used to stand for: liberty with responsibility, strength against foreign and domestic enemies.
Most leftists are simply shallow-minded trend-followers, who believe in the power of government to do things that are “good,” “fair,” or “compassionate,” with no regard for the costs and consequences of those things. Shallow leftists know not what they do. But they do it. And their shallowness does not excuse them for having been accessories to the diminution of America. A rabid dog may not know that it is rabid, but its bite is no less lethal for that.
The leaders of the left — the office-holders, pundits, and intelligentsia — usually pay lip-service to “goodness,” “fairness,” and “compassion.” But their lip-service fails to conceal their brutal betrayal of liberty. Their subtle and not-so-subtle treason is despicable almost beyond words. But not quite….
* * *
The proximate cause of this post is a column by Nicholas Kristof, “Equality, a True Soul Food“ (The New York Times, January 1, 2011), in which Kristof pleads for less income inequality in the United States. His plea is based, in part, on the premise that persons of low status suffer because they envy persons of higher status (an assertion that’s based on research about monkeys)….
There is no theoretical or factual argument for income redistribution that cannot be met by a superior theoretical or factual argument against it. In the end, the case for (somehow) reducing income inequality turns on an emotional appeal for “social justice,” that is, for reshaping the world in a way that pleases the pleader. As if the pleader — in his or her pure, misguided arrogance — has superior wisdom about how the world should be shaped….
* * *
The U.S. Supreme Court’s finding for Wal-Mart in the case of Wal-Mart v. Dukes predictably set off a storm of criticism by Wal-Mart’s critics, who are legion. Those critics, predictably, are mainly upper-middle class professionals who do not shop at Wal-Mart, would not work at Wal-Mart, and fastidiously scorn the politics and religion of those who do shop and work at Wal-Mart….
I have news for Yuppies and other critics of Wal-Mart. There are no goon squads dragging unwilling people in from the streets to work in Wal-Mart stores. There are no Wal-Mart employees caged in their work areas. There are no secret prisons in Arkansas where they send Wal-Mart employees who elect to move on to more highly compensated jobs at other companies….
* * *
“Culture war” is a familiar term, but one that I hadn’t thought deeply about until a few days ago. I read something about abortion in which “culture war” occurred. The fog lifted, and I grasped what should have been obvious to me all along: The “culture war” isn’t about “culture,” it’s about morality and liberty….
“Thanks” to the signals sent by the state — many of them in the form of legislative, executive, and judicial dictates — we now have not just easy divorce, subsidized illegitimacy, and legions of non-mothering mothers, but also abortion, concerted (and deluded) efforts to defeminize females and to neuter or feminize males, forced association (with accompanying destruction of property and employment rights), suppression of religion, absolution of pornography, and the encouragement of “alternative lifestyles” that feature disease, promiscuity, and familial instability….
* * *
An image of the Battle Flag of the Army of Northern Virginia is displayed prominently in the sidebar of my blog. I do not display the flag to defend it, as one reader suggests. As I say in the text that accompanies the image of the flag, I display it to symbolize my hope for deliverance from an oppressive national government (the present one) and to signify my opposition to political correctness (of the kind that can’t tolerate the display of the flag for any purpose).
I certainly do not display the flag to defend the Confederacy’s central cause: the preservation of slavery…. But I do defend the legality of secession, as a constitutional right of States. Nor does the display signify racism on my part, because I am not racist. Clicking on the flag takes the reader to my “moral profile,” where the last entry strongly supports my claim of race-neutrality….
A lot of people just want to be offended, and they look for ways of achieving their aim. Take the controversies about the use of “niggardly.”…
Symbols of the Confederacy are the new “niggardly,” but on a grander scale. As suddenly and pervasively as the hula-hoop craze of the 1950s — and mainly because of a single act of violence in Charleston — it has become de rigueur to condemn persons, places, and things associated with the Confederacy. This is nothing but hysterical nonsense….
Clearly, the culture war has entered a new and dangerous phase, reminiscent of China’s Cultural Revolution under Mao.
I have posted only three times in August and September, and not at all since August 22. At first, I was occupied by moving my very old parents-in-law (ages 96 and 95), so that my father-in-law could receive proper care in a skilled nursing facility and my mother-in-law could have an assisted-living apartment in the same building. Just four weeks after placing my father-in-law in skilled nursing, he succumbed to his accumulated ailments. The planning of his funeral and the tying up of financial loose ends have taken up much of my time, and will continue to do so for a while longer.
On top of all that, it has been only three months since I dealt with my mother’s passing (at age 99) and closed out her (exceedingly modest) estate.
I shall return, but I can’t say when or with how much vigor.
Too much time has passed since my last substantive post. I have decided to suspend blogging, perhaps forever.
* * *
Related post: The Many-Sided Curse of Very Old Age
While Thomas is on a sabbatical from the blog, I will continue my occasional guest posts. — Postmodern Conservative
When William F. Buckley passed away in February, I found myself harboring mixed emotions. I probably wasn’t the only one. The man had quite a legacy, fostering a major movement that was an improvement on the conspiracy-obsessed and isolationist John Birch variety of right wing politics that had become a stereotype of conservative thinking in the mid-20th century. At the same time, I could not embrace Buckley as a hero. He believed in the legalization of marijuana and, more importantly, adopted the pose of an urbane sophisticate who winked at the seedier side of popular culture. What seemed to be his main gripe was not so much bad morals as a lack of panache. Thus he would write witty pieces for Penthouse magazine and his National Review rather infamously celebrated the fiftieth anniversary of Playboy in the 2003 article by Catherine Seipp (a fact which alienated social conservatives). Was this just fashionable posing? Even Buckley’s take on the issue was infuriatingly contrarian and ambiguous. There is something sanctimonious in that, like wanting to have your cake and eat it too.
I don’t think anyone believes conservatives must be puritans, but the obvious problem with Hugh Hefner’s Playboy is that it took a cultural sideline—eroticism and sexual irresponsibility—into the mainstream. The barrier was down and worse things would follow. It was not possible to keep things in little boxes, as the libertines (conservative or liberal) imagined. After all, Hefner and his lobby worked heavily to promote abortion and homosexuality. If nothing else, the whole STD dilemma that we are still grappling with is due in large part to the attitudes fostered by the Playboy lifestyle.
If it’s true that the conservative movement that came out of Buckley’s experience was an intellectual improvement, it was not necessarily a philosophical one. There is a difference. While it’s important to reach people through the common culture, that does not mean dumbing down beliefs in favor of short-term ideological gains. It is this glib attitude that, rightly or wrongly, caused many people to split off from Buckleyite conservatism into the paleo-con movement.
As John Henry Newman put it: “Knowledge is one thing, virtue is another; good sense is not conscience, refinement is not humility….” Newman was as much an intellectual as Buckley, but he knew that in the end people are converted from error not because an argument is clever but because it is right. And one has to wonder how long Buckley’s influence will last. Will it be as defining as that of Malcolm Muggeridge, Russell Kirk, and Thomas Sowell? Time will tell.
Go to “Favorite Posts” at Politics & Prosperity.
I have written many times about the dependence of liberty on traditional norms. (See, for example, “Social Norms and Liberty” and “‘Family Values,’ Liberty, and the State.”) Thanks to this article by Lee Harris, I have come across his long essay on “The Future of Tradition: Transmitting the visceral ethical code of civilization” (Policy Review, June & July 2005).
The essay is best read in its entirety, with all of its logical connections. However, I cannot resist quoting the passages that make Harris’s central point:
[I]t is a mistake to conflate the automatic with the irrational, since, as we have seen, an automatic and mindless response is precisely the mechanism by which the visceral code speaks to us. It triggers a rush of emotions because it is designed to do precisely this. Like certain automatic reflexes, such as jerking your hand off a burning stovetop, the sheer immediacy of our visceral response, far from being proof of its irrationality, demonstrates the critical importance, in times of peril and crisis, of not thinking before we act. If a man had to think before jumping out of the way of an onrushing car, or to meditate on his options before removing his hand from that hot stovetop, then reason, rather than being our help, would become our enemy. Some decisions are better left to reflexes — be these of our neurological system or of our visceral system….
Imagine a stranger coming up to you and asking if he can drive your eight-year-old daughter around town in his new car. Presumably, no matter how nicely the stranger asked this question, you would say no. But suppose he started to ask why you won’t let him take your little girl for a ride. What if he said, “Listen, tell you what. I’ll give her my cell phone and you can call her anytime you want”? What kind of obligation are you under to give a reason to a complete stranger for why he shouldn’t be allowed to drive off with your daughter?
None. A question that is out of order does not require or deserve an answer. The moment you begin to answer the question as if it were in order, it is too late to point out your original objection to the question in the first place, which really was: Over my dead body.
Marriage was something that, until only quite recently, seemed to be securely in the hands of married people. It was what married people had engaged in, and certainly not a special privilege that had been extended to them to the exclusion of other human beings…. Was [marriage] defined as between a man and a woman? Well, yes, but only in the sense that a cheese omelet is defined as an egg and some cheese — without the least intention of insulting either orange juice or toast by their omission from this definition. Orange juice and toast are fine things in themselves — you just can’t make an omelet out of them.
Those who are married now, and those thinking about getting married or teaching their children that they should grow up and get married, may all be perfect idiots, mindlessly parroting a message wired into them before they were old enough to know better. But they are passing on, through the uniquely reliable visceral code, the great postulate of transgenerational duty: not to beseech people to make the world a better place, but to make children whose children will leave it a better world and not merely a world with better abstract ideals….
Marriage must not be mocked or ridiculed. But can marriage keep its solemnity now? Who will tell the rising generation that there are standards they must not fail to meet if they wish to live in a way that their grandfathers could respect?
This is how those fond of abstract reasoning can destroy the ethical foundations of a society without anyone’s noticing it. They throw up for debate that which no one before ever thought about debating. They take the collective visceral code that has bound parents to grandchildren from time immemorial, in every culture known to man, and make of it a topic for fashionable intellectual chatter.
Ask yourself what is so secure about the ethical baseline of our current level of civilization that it might not be opened up for question, or what deeply cherished way of doing things will suddenly be cast in the role of a “residual personal prejudice.”
We are witnessing the triumph of a Newspeak in which those who simply wish to preserve their own way of life, to pass their core values down to their grandchildren more or less intact, no longer even have a language in which they can address their grievances. In this essay I have tried to produce the roughest sketch of what such language might look like and how it could be used to defend those values that represent what Hegel called the substantive class of community — the class that represents the ethical baseline of the society and whose ethical solidity and unimaginativeness permit the high-spirited experimentation of the reflective class to go forward without the risk of complete societal collapse.
If the reflective class, represented by intellectuals in the media and the academic world, continues to undermine the ideological superstructure of the visceral code operative among the “culturally backward,” it may eventually succeed in subverting and even destroying the visceral code that has established the common high ethical baseline of the average American — and it will have done all of this out of the insane belief that abstract maxims concerning justice and tolerance can take the place of a visceral code that is the outcome of the accumulated cultural evolution of our long human past.
Thus, in the name of “enlightenment,” the “reflective class” subverts liberty.
Harris, by the way, has no immediate, personal interest in the preservation of marriage as a heterosexual institution. He flatly states in the essay that he is homosexual.