Writing: A Guide (Introduction and Part I)

This series is aimed at writers of non-fiction works, but writers of fiction may also find it helpful. There are four parts:

I. Some Writers to Heed and Emulate

A. The Essentials: Lucidity, Simplicity, Euphony
B. Writing Clearly about a Difficult Subject
C. Advice from an American Master
D. Also Worth a Look

II. Step by Step

A. The First Draft

1. Decide — before you begin to write — on your main point and your purpose for making it.
2. Avoid wandering from your main point and purpose; use an outline.
3. Start by writing an introductory paragraph that summarizes your “story line”.
4. Lay out a straight path for the reader.
5. Know your audience, and write for it.
6. Facts are your friends — unless you’re trying to sell a lie, of course.
7. Momentum is your best friend.

B. From First Draft to Final Version

1. Your first draft is only that — a draft.
2. Where to begin? Stand back and look at the big picture.
3. Nit-picking is important.
4. Critics are necessary, even if not mandatory.
5. Accept criticism gratefully and graciously.
6. What if you’re an independent writer and have no one to turn to?
7. How many times should you revise your work before it’s published?

III. Reference Works

A. The Elements of Style
B. Eats, Shoots & Leaves
C. Follett’s Modern American Usage
D. Garner’s Modern American Usage
E. A Manual of Style and More

IV. Notes about Grammar and Usage

A. Stasis, Progress, Regress, and Language
B. Illegitimi Non Carborundum Lingo

1. Eliminate filler words.
2. Don’t abuse words.
3. Punctuate properly.
4. Why ‘s matters, or how to avoid ambiguity in possessives.
5. Stand fast against political correctness.
6. Don’t split infinitives.
7. It’s all right to begin a sentence with “And” or “But” — in moderation.
8. There’s no need to end a sentence with a preposition.

Some readers may conclude that I prefer stodginess to liveliness. That’s not true, as any discerning reader of this blog will know. I love new words and new ways of using words, and I try to engage readers while informing and persuading them. But I do those things within the expansive boundaries of prescriptive grammar and usage. Those boundaries will change with time, as they have in the past. But they should change only when change serves understanding, not when it serves the whims of illiterates and language anarchists.

This guide is long because it covers many topics beyond the essentials of clear writing. The best concise guide to clear writing that I’ve come across is by David Randall: “Twenty-five Guidelines for Writing Prose” (National Association of Scholars).

Parts II, III, and IV are here, here, and here.


I. SOME WRITERS TO HEED AND EMULATE

A. The Essentials: Lucidity, Simplicity, Euphony

I begin with the insights of a great writer, W. Somerset Maugham (English, 1874-1965). Maugham was a prolific and popular playwright, novelist, short-story writer, and author of non-fiction works. He reflected on his life and career as a writer in The Summing Up. It appeared in 1938, when Maugham was 64 years old and more than 40 years into his very long career. I first read The Summing Up about 40 years ago, and immediately became an admirer of Maugham’s candor and insight. This led me to become an avid reader of Maugham’s novels and short-story collections. And I have continued to consult The Summing Up for booster shots of Maugham’s wisdom.

Maugham’s advice to “write lucidly, simply, euphoniously and yet with liveliness” is well-supported by examples and analysis. I offer the following excerpts of the early pages of The Summing Up, where Maugham discusses the craft of writing:

I have never had much patience with the writers who claim from the reader an effort to understand their meaning…. There are two sorts of obscurity that you find in writers. One is due to negligence and the other to wilfulness. People often write obscurely because they have never taken the trouble to learn to write clearly. This sort of obscurity you find too often in modern philosophers, in men of science, and even in literary critics. Here it is indeed strange. You would have thought that men who passed their lives in the study of the great masters of literature would be sufficiently sensitive to the beauty of language to write if not beautifully at least with perspicuity. Yet you will find in their works sentence after sentence that you must read twice to discover the sense. Often you can only guess at it, for the writers have evidently not said what they intended.

Another cause of obscurity is that the writer is himself not quite sure of his meaning. He has a vague impression of what he wants to say, but has not, either from lack of mental power or from laziness, exactly formulated it in his mind and it is natural enough that he should not find a precise expression for a confused idea. This is due largely to the fact that many writers think, not before, but as they write. The pen originates the thought…. From this there is only a little way to go to fall into the habit of setting down one’s impressions in all their original vagueness. Fools can always be found to discover a hidden sense in them….

Simplicity is not such an obvious merit as lucidity. I have aimed at it because I have no gift for richness. Within limits I admire richness in others, though I find it difficult to digest in quantity. I can read one page of Ruskin with delight, but twenty only with weariness. The rolling period, the stately epithet, the noun rich in poetic associations, the subordinate clauses that give the sentence weight and magnificence, the grandeur like that of wave following wave in the open sea; there is no doubt that in all this there is something inspiring. Words thus strung together fall on the ear like music. The appeal is sensuous rather than intellectual, and the beauty of the sound leads you easily to conclude that you need not bother about the meaning. But words are tyrannical things, they exist for their meanings, and if you will not pay attention to these, you cannot pay attention at all. Your mind wanders….

But if richness needs gifts with which everyone is not endowed, simplicity by no means comes by nature. To achieve it needs rigid discipline…. To my mind King James’s Bible has been a very harmful influence on English prose. I am not so stupid as to deny its great beauty, and it is obvious that there are passages in it of a simplicity which is deeply moving. But the Bible is an oriental book. Its alien imagery has nothing to do with us. Those hyperboles, those luscious metaphors, are foreign to our genius…. The plain, honest English speech was overwhelmed with ornament. Blunt Englishmen twisted their tongues to speak like Hebrew prophets. There was evidently something in the English temper to which this was congenial, perhaps a native lack of precision in thought, perhaps a naive delight in fine words for their own sake, an innate eccentricity and love of embroidery, I do not know; but the fact remains that ever since, English prose has had to struggle against the tendency to luxuriance…. It is obvious that the grand style is more striking than the plain. Indeed many people think that a style that does not attract notice is not style…. But I suppose that if a man has a confused mind he will write in a confused way, if his temper is capricious his prose will be fantastical, and if he has a quick, darting intelligence that is reminded by the matter in hand of a hundred things he will, unless he has great self-control, load his pages with metaphor and simile….

Whether you ascribe importance to euphony … must depend on the sensitiveness of your ear. A great many readers, and many admirable writers, are devoid of this quality. Poets as we know have always made a great use of alliteration. They are persuaded that the repetition of a sound gives an effect of beauty. I do not think it does so in prose. It seems to me that in prose alliteration should be used only for a special reason; when used by accident it falls on the ear very disagreeably. But its accidental use is so common that one can only suppose that the sound of it is not universally offensive. Many writers without distress will put two rhyming words together, join a monstrous long adjective to a monstrous long noun, or between the end of one word and the beginning of another have a conjunction of consonants that almost breaks your jaw. These are trivial and obvious instances. I mention them only to prove that if careful writers can do such things it is only because they have no ear. Words have weight, sound and appearance; it is only by considering these that you can write a sentence that is good to look at and good to listen to.

I have read many books on English prose, but have found it hard to profit by them; for the most part they are vague, unduly theoretical, and often scolding. But you cannot say this of Fowler’s Dictionary of Modern English Usage. It is a valuable work. I do not think anyone writes so well that he cannot learn much from it. It is lively reading. Fowler liked simplicity, straightforwardness and common sense. He had no patience with pretentiousness. He had a sound feeling that idiom was the backbone of a language and he was all for the racy phrase. He was no slavish admirer of logic and was willing enough to give usage right of way through the exact demesnes of grammar. English grammar is very difficult and few writers have avoided making mistakes in it….

But Fowler had no ear. He did not see that simplicity may sometimes make concessions to euphony. I do not think a far-fetched, an archaic or even an affected word is out of place when it sounds better than the blunt, obvious one or when it gives a sentence a better balance. But, I hasten to add, though I think you may without misgiving make this concession to pleasant sound, I think you should make none to what may obscure your meaning. Anything is better than not to write clearly. There is nothing to be said against lucidity, and against simplicity only the possibility of dryness. This is a risk that is well worth taking when you reflect how much better it is to be bald than to wear a curly wig. But there is in euphony a danger that must be considered. It is very likely to be monotonous…. I do not know how one can guard against this. I suppose the best chance is to have a more lively faculty of boredom than one’s readers so that one is wearied before they are. One must always be on the watch for mannerisms and when certain cadences come too easily to the pen ask oneself whether they have not become mechanical. It is very hard to discover the exact point where the idiom one has formed to express oneself has lost its tang….

If you could write lucidly, simply, euphoniously and yet with liveliness you would write perfectly: you would write like Voltaire. And yet we know how fatal the pursuit of liveliness may be: it may result in the tiresome acrobatics of Meredith. Macaulay and Carlyle were in their different ways arresting; but at the heavy cost of naturalness. Their flashy effects distract the mind. They destroy their persuasiveness; you would not believe a man was very intent on ploughing a furrow if he carried a hoop with him and jumped through it at every other step. A good style should show no sign of effort. What is written should seem a happy accident…. [Pp. 23-32, passim, Pocket Book edition, 1967]

You should also study Maugham’s The Summing Up for its straightforward style. I return to these opening sentences of a paragraph:

Another cause of obscurity is that the writer is himself not quite sure of his meaning. He has a vague impression of what he wants to say, but has not, either from lack of mental power or from laziness, exactly formulated it in his mind and it is natural enough that he should not find a precise expression for a confused idea. This is due largely to the fact that many writers think, not before, but as they write. The pen originates the thought…. [Pp. 23-24]

This is a classic example of good writing (and it conveys excellent advice). The first sentence states the topic of the paragraph. The following sentences elaborate it. Each sentence is just long enough to convey a single, complete thought. Because of that, even the rather long second sentence in the second block quotation should be readily understood by a high-school graduate (a graduate of a small-city high school in the 1950s, at least).

B. Writing Clearly about a Difficult Subject

I offer a great English mathematician, G.H. Hardy, as a second exemplar. In particular, I recommend Hardy’s A Mathematician’s Apology. (It’s an apology in the sense of “a formal written defense of something you believe in strongly”, where the something is the pursuit of pure mathematics.) The introduction by C.P. Snow is better than Hardy’s long essay, but then Snow was a published novelist as well as a trained scientist; Hardy’s publications, other than the essay, are mathematical. The essay is notable for its accessibility, even to non-mathematicians. Of its 90 pages, only 23 (clustered near the middle) require a reader to cope with mathematics, but it’s mathematics that shouldn’t daunt a person who has taken and passed high-school algebra.

Hardy’s prose is flawed, to be sure. He overuses shudder quotes, and occasionally gets tangled in a too-long sentence. But I’m taken by his exposition of the art of doing higher mathematics, and the beauty of doing it well. Hardy, in other words, sets an example to be followed by writers who wish to capture the essence of a technical subject and convey that essence to intelligent laymen.

Here are some samples:

There are many highly respectable motives which may lead men to prosecute research, but three which are much more important than the rest. The first (without which the rest must come to nothing) is intellectual curiosity, desire to know the truth. Then, professional pride, anxiety to be satisfied with one’s performance, the shame that overcomes any self-respecting craftsman when his work is unworthy of his talent. Finally, ambition, desire for reputation, and the position, even the power or the money, which it brings. It may be fine to feel, when you have done your work, that you have added to the happiness or alleviated the sufferings of others, but that will not be why you did it. So if a mathematician, or a chemist, or even a physiologist, were to tell me that the driving force in his work had been the desire to benefit humanity, then I should not believe him (nor should I think any better of him if I did). His dominant motives have been those which I have stated and in which, surely, there is nothing of which any decent man need be ashamed. [Pp. 78-79, 1979 paperback edition]

*     *     *

A mathematician, like a painter or a poet, is a maker of patterns. If his patterns are more permanent than theirs, it is because they are made with ideas. A painter makes patterns with shapes and colors, a poet with words. A painting may embody an ‘idea’, but the idea is usually commonplace and unimportant. In poetry, ideas count for a good deal more; but, as Housman insisted, the importance of ideas in poetry is habitually exaggerated…

… A mathematician, on the other hand, has no material to work with but ideas, and his patterns are likely to last longer, since ideas wear less with time than words. [Pp. 84-85]

C. Advice from an American Master

A third exemplar is E.B. White, a successful writer of fiction who is probably best known for The Elements of Style. (It’s usually called “Strunk & White” or “the little book”.) It’s an outgrowth of a slimmer volume of the same name by William Strunk Jr. (Strunk had been dead for 13 years when White produced the first edition of Strunk & White.)

I’ll address the little book’s authoritativeness in a later section. Here, I’ll highlight White’s style of writing. This is from the introduction to the third edition (the last one edited by White):

The Elements of Style, when I re-examined it in 1957, seemed to me to contain rich deposits of gold. It was Will Strunk’s parvum opus, his attempt to cut the vast tangle of English rhetoric down to size and write its rules and principles on the head of a pin. Will himself had hung the tag “little” on the book; he referred to it sardonically and with secret pride as “the little book,” always giving the word “little” a special twist, as though he were putting a spin on a ball. In its original form, it was a forty-three-page summation of the case for cleanliness, accuracy, and brevity in the use of English…. [P. xi]

Vivid, direct, and engaging. And the whole book reads like that.

D. Also Worth a Look

Read Steven Pinker’s essay, “Why Academics Stink at Writing” (The Chronicle Review, September 26, 2014). You may not be an academic, but I’ll bet that you sometimes lapse into academese. (I know that I sometimes do.) Pinker’s essay will help you to recognize academese, and to understand why it’s to be avoided.

Pinker’s essay also appears in a booklet, “Why Academics Stink at Writing–and How to Fix It”, which is available here in exchange for your name, your job title, the name of your organization, and your e-mail address. (Whether you wish to give true information is up to you.) Of the four essays that follow Pinker’s, I prefer the one by Michael Munger.

Beyond that, pick and choose by searching on “writers on writing”. Google gave me 391,000 hits on the day that I published this post. Hidden among the dross, I found this, which led me to this gem: “George Orwell on Writing, How to Counter the Mindless Momentum of Language, and the Four Questions a Great Writer Must Ask Herself”. (“Herself”? I’ll say something about gender in part IV.)

Jerks and Psychopaths

Crudeness vs. subtlety.

Regarding jerks, here’s Eric Schwitzgebel, writing in “How to Tell if You’re a Jerk” (Nautilus, November 16, 2017):

Jerks are people who culpably fail to appreciate the perspectives of the people around them, treating others as tools to be manipulated or fools to be dealt with, rather than as moral and epistemic peers….

Jerks see the world through goggles that dim others’ humanity. The server at the restaurant is not a potentially interesting person with a distinctive personality, life story, and set of goals to which you might possibly relate. Instead, he is merely a tool by which to secure a meal or a fool on which you can vent your anger.

Why is it jerky to view a server as a “tool” (loaded word) by which to secure a meal? That’s his job, just as it’s the job of a clerk to ring up your order, the job of a service advisor to see that your car is serviced, etc. Pleasantness and politeness are called for in dealings with people in service occupations — as in dealings with everyone — though it may be necessary to abandon them in the face of incompetence, rudeness, or worse.

What’s not called for is a haughty or dismissive air, as if the waiter, clerk, etc., were a lesser being. I finally drew a line (mentally) through a long-time friendship when the friend — a staunch “liberal” who, by definition, didn’t view people as mere tools — was haughty and dismissive toward a waiter, and a black one at that. His behavior exemplified jerkiness. Whatever he thought about the waiter as a human being (and I have no way of knowing that), he acted the way he did because he sees himself as a superior being — an attitude to which I can attest by virtue of long acquaintance. (When haughtiness wasn’t called for, condescension was. Here’s a perfect example of it.)

That’s what makes a jerk a jerk: an overt attitude of superiority. It usually comes out as rudeness, pushiness, or loudness — in short, dominating a situation by assertive behavior.

Does the jerk have an inferiority complex for which he is compensating? Was he a spoiled child? Is he a neurotic who tries to conquer his insecurity by behaving more assertively than necessary? Does he fail to appreciate the perspectives of other people, as Schwitzgebel puts it?

Who knows? And why does it matter? When confronted with a jerk, I deal with the behavior — or ignore or avoid it. The cause would matter only if I could do something about it. But jerks (like the relatively poor) are always with us.

So are psychopaths, though they must be dealt with differently.

Schwitzgebel addresses the connection between jerkiness and psychopathy, but gets it wrong:

People with psychopathic personalities are selfish and callous, as is the jerk, but they also incline toward impulsive risk-taking, while jerks can be calculating and risk-averse.

A jerk doesn’t care (or think) about his treatment of other people in mundane settings. He is just getting away with what he can get away with at the moment; that is, he is being impulsive. Nor is jerky behavior necessarily risk-averse; it often invites a punch in the mouth.

A psychopath, by contrast, is often calculating and ingratiating — especially when he is setting up a victim for whatever he has in mind, be it setting up a co-worker for termination or setting up an innocent for seduction and murder.

It isn’t that a psychopath does such things because he is devoid of empathy. On the contrary, a successful psychopath is skilled at “reading” his victims — empathizing with them — in order to entice them into a situation where he gets what he wants from them.

In evidence, I turn to Paul Bloom’s “The Root of All Cruelty?” (The New Yorker, November 27, 2017):

The thesis that viewing others as objects or animals enables our very worst conduct would seem to explain a great deal. Yet there’s reason to think that it’s almost the opposite of the truth.

At some European soccer games, fans make monkey noises at African players and throw bananas at them. Describing Africans as monkeys is a common racist trope, and might seem like yet another example of dehumanization. But plainly these fans don’t really think the players are monkeys; the whole point of their behavior is to disorient and humiliate. To believe that such taunts are effective is to assume that their targets would be ashamed to be thought of that way—which implies that, at some level, you think of them as people after all.

Consider what happened after Hitler annexed Austria, in 1938. Timothy Snyder offers a haunting description in Black Earth: The Holocaust as History and Warning:

The next morning the “scrubbing parties” began. Members of the Austrian SA, working from lists, from personal knowledge, and from the knowledge of passersby, identified Jews and forced them to kneel and clean the streets with brushes. This was a ritual humiliation. Jews, often doctors and lawyers or other professionals, were suddenly on their knees performing menial labor in front of jeering crowds. Ernest P. remembered the spectacle of the “scrubbing parties” as “amusement for the Austrian population.” A journalist described “the fluffy Viennese blondes, fighting one another to get closer to the elevating spectacle of the ashen-faced Jewish surgeon on hands and knees before a half-dozen young hooligans with Swastika armlets and dog-whips.” Meanwhile, Jewish girls were sexually abused, and older Jewish men were forced to perform public physical exercise.

The Jews who were forced to scrub the streets—not to mention those subjected to far worse degradations—were not thought of as lacking human emotions. Indeed, if the Jews had been thought to be indifferent to their treatment, there would have been nothing to watch here; the crowd had gathered because it wanted to see them suffer. The logic of such brutality is the logic of metaphor: to assert a likeness between two different things holds power only in the light of that difference. The sadism of treating human beings like vermin lies precisely in the recognition that they are not.

As with jerkiness, I don’t care what motivates psychopathy. If jerks are to be avoided, psychopaths are to be punished — good and hard — by firing them (if they are workplace psychopaths) or jailing and executing them (if they are criminal psychopaths).

Come to think of it, if jerks were punched in the mouth more often, perhaps there would be less jerky behavior. And, for most of us, it is jerks — not psychopaths — who make life less pleasant than it could be.

1963: The Year Zero

From steady progress under sane governance to regress spurred by the counter-culture.

[A] long habit of not thinking a thing WRONG, gives it a superficial appearance of being RIGHT…. Time makes more converts than reason. — Thomas Paine, Common Sense

If ignorance and passion are the foes of popular morality, it must be confessed that moral indifference is the malady of the cultivated classes. The modern separation of enlightenment and virtue, of thought and conscience, of the intellectual aristocracy from the honest and common crowd is the greatest danger that can threaten liberty. — Henri Frédéric Amiel, Journal

The Summer of Love ignited the loose, Dionysian culture that is inescapable today. The raunch and debauchery, radical individualism, stylized non-conformity, the blitzkrieg on age-old authorities, eventually impaired society’s ability to function. — Gilbert T. Sewall, “Summer of Love, Winter of Decline”

If, like me, you were an adult when John F. Kennedy was assassinated, you may think of his death as a watershed moment in American history. I say this not because I’m an admirer of Kennedy the man (I am not), but because American history seemed to turn a corner after Kennedy was murdered. To take the metaphor further, the corner marked the juncture of a sunny, tree-lined street (America from the end of World War II to November 22, 1963) and a dingy, littered street (America since November 22, 1963).

Changing the metaphor, I acknowledge that the first 18 years after V-J Day were by no means halcyon, but they were the spring that followed the long, harsh winter of the Great Depression and World War II. Yes, there was the Korean War, but that failure of political resolve was only a rehearsal for later debacles. McCarthyism, a political war waged (however clumsily) on America’s actual enemies, was benign compared with the war on civil society that began in the 1960s and continues to this day. The threat of nuclear annihilation, which those of you who were schoolchildren of the 1950s will remember well, had begun to subside with the advent of JFK’s military policy of flexible response, and seemed to evaporate with JFK’s resolution of the Cuban Missile Crisis (however poorly he managed it). And for all of his personal faults, JFK was a paragon of grace, wit, and charm — a movie-star president — compared with his many successors, with the possible exception of Ronald Reagan, who had been a real movie star.

What follows is an impression of America since November 22, 1963, when spring became a long, hot summer, followed by a dismal autumn and another long, harsh winter — not of deprivation, and perhaps not of war, but of rancor and repression.

This petite histoire begins with the Vietnam War and its disastrous mishandling by LBJ, its betrayal by the media, and its spawning of the politics of noise. “Protests” in public spaces (which spill destructively onto private property) are a main feature of the politics of noise. In the new age of instant and sympathetic media attention to “protests”, civil and university authorities often refuse to enforce order. The media portray obstructive and destructive disorder as “free speech”. Thus do “protestors” learn that they can, with impunity, inconvenience and cow the masses who simply want to get on with their lives and work.

Whether “protestors” learned from rioters, or vice versa, they learned the same lesson. Authorities, in the age of Dr. Spock, lack the guts to use force, as necessary, to restore civil order. (LBJ’s decision to escalate gradually in Vietnam — “signaling” to Hanoi — instead of waging all-out war was of a piece with the “understanding” treatment of demonstrators and rioters.) Rioters learned another lesson — if a riot follows the arrest, beating, or death of a black person, it’s a “protest” against something (usually white-racist oppression, regardless of the facts), not wanton mayhem. After a hiatus of 21 years, urban riots resumed in 1964, and continue to this day.

LBJ’s “Great Society” marked the resurgence of FDR’s New Deal — with a vengeance — and the beginning of a long decline of America’s economic vitality. The combination of the Great Society (and its later extensions, such as Medicare Part D and Obamacare) with the rampant growth of regulatory activity has cut the rate of real economic growth from more than 4 percent to a fraction of that (and falling). Work has been discouraged; dependency has been encouraged. America since 1963 has been visited by a perfect storm of economic destruction that seems to have been designed by America’s enemies.

The Civil Rights Act of 1964 unnecessarily crushed property rights, along with freedom of association. To what end? So that a violent, dependent, Democrat-voting underclass could arise from the Great Society? So that future generations of privilege-seekers could cry “discrimination” if anyone dares to denigrate their “lifestyles”? There was a time when immigrants and other persons who seemed “different” had the good sense to strive for success and acceptance as good neighbors, employees, and merchants. But the Civil Rights Act of 1964 and its various offspring — State and local as well as federal — are meant to short-circuit that striving and to force acceptance, whether or not a person has earned it. The vast, silent majority is caught between empowered privilege-seekers and powerful privilege-granters. The privilege-seekers and privilege-granters are abetted by dupes who have, as usual, succumbed to the people’s romance — the belief that government represents society.

Presidents, above all, like to think that they represent society. What they represent, of course, are their own biases and the interests to which they are beholden. Truman, Ike, and JFK were imperfect presidential specimens, but they are shining idols by contrast with most of their successors. The downhill slide from the Vietnam War and the Great Society to Obamacare, lawlessness on immigration, the bugout from Afghanistan, and the feckless war on “climate change” has been punctuated by many shameful episodes; for example:

  • LBJ — the botched war in Vietnam, repudiation of property rights and freedom of association (the Civil Rights Act)

  • Nixon — price controls, Watergate

  • Carter — dispiriting leadership and impotence in the Iran hostage crisis

  • Reagan — bugout from Lebanon, rescue of Social Security

  • Bush I — failure to oust Saddam when it could have been done easily, the broken promise about taxes

  • Clinton — bugout from Somalia, push for an early version of Obamacare, budget-balancing at the cost of defense, and perjury

  • Bush II — No Child Left Behind Act, Medicare Part D, the initial mishandling of Iraq, and Wall Street bailouts

  • Obama — stimulus spending, Obamacare, reversal of Bush II’s eventual success in Iraq, naive backing for the “Arab spring”, acquiescence to Iran’s nuclear ambitions, unwillingness to acknowledge or do anything about the expansionist aims of Russia and China, neglect or repudiation of traditional allies (especially Israel), and refusal to take care that the immigration laws are executed faithfully

  • Trump — many good policies (e.g., immigration control, regulatory rollbacks, more defense spending, energy self-sufficiency) offset by reckless spending and a failure to conquer the “deep state”

  • Biden — Obama on steroids, with the reversal of Trump’s policies accompanied by the self-inflicted wound of energy dependence, socially divisive and wrong-headed policies, and the movement toward outlawing political opponents.

Only Reagan’s defense buildup and its result — victory in the Cold War — stand out as a great accomplishment. But the victory was squandered: The “peace dividend” should have been peace through continued strength, not unpreparedness for the post-9/11 wars and the resurgence of Russia and China.

The war on defense has been accompanied by a war on science. The party that proclaims itself the party of science is anything but that. It is the party of superstitious, Luddite anti-science. Witness the embrace of extreme environmentalism, the arrogance of proclamations that AGW is “settled science”, unjustified fear of genetically modified foodstuffs, the implausible doctrine that race is nothing but a social construct, and on and on.

With respect to the nation’s moral well-being, the most destructive war of all has been the culture war, which assuredly began in the 1960s. Almost overnight, it seems, the nation was catapulted from the land of Ozzie and Harriet, Father Knows Best, and Leave It to Beaver to the land of the free-/filthy-speech movement, Altamont, Woodstock, Hair, and the unspeakably loud, vulgar, and violent offerings that are now plastered all over the air waves, the internet, theater screens, and “entertainment” venues.

Adherents of the ascendant culture esteem protest for its own sake, and have stock explanations for all perceived wrongs (whether or not they are wrongs): racism, sexism, homophobia, Islamophobia, hate, white privilege, inequality (of any kind), Wall Street, climate change, Zionism, and so on.

Then there is the campaign to curtail freedom of speech. The purported beneficiaries of the campaign are the gender-confused and the easily offended (thus “microaggressions” and “trigger warnings”). The true beneficiaries are leftists. Free speech is all right if it’s acceptable to the left. Otherwise, it’s “hate speech,” and must be stamped out. This is McCarthyism on steroids. McCarthy, at least, was pursuing actual enemies of liberty; today’s leftists are the enemies of liberty.

There’s a lot more, unfortunately. The organs of the state have been enlisted in an unrelenting campaign against civilizing social norms. As I say here,

we now have not just easy divorce, subsidized illegitimacy, and legions of non-mothering mothers, but also abortion, concerted (and deluded) efforts to defeminize females and to neuter or feminize males, forced association (with accompanying destruction of property and employment rights), suppression of religion, absolution of pornography, and the encouragement of “alternative lifestyles” that feature disease, promiscuity, and familial instability. The state, of course, doesn’t act of its own volition. It acts at the behest of special interests — interests with a “cultural” agenda….  They are bent on the eradication of civil society — nothing less — in favor of a state-directed Rousseauvian dystopia from which morality and liberty will have vanished, except in Orwellian doublespeak.

If there are unifying themes in this petite histoire, they are the death of common sense and the rising tide of moral vacuity — thus the epigrams at the top of the post. The history of the United States since 1963 supports the proposition that the nation is indeed going to hell in a handbasket.


Related reading (a small sample of writings that attest to the decline of America and its civilizing norms):

The British Roots of the Founding, and of Liberty in America

Both have withered and are almost dead.

Before America became purely a “proposition nation”, it was also an “ethnic nation”. As Malcolm Pollack puts it here:

Once upon a time, an ordinary understanding of nationalism embraced all of this: love for, and loyalty to, not only shared beliefs, but also for one’s people, their common heritage and traditions, and their homeland. But in these withered times, we must pry it all apart and pare away everything, no matter how common and natural and healthy, that violates our new ideological orthodoxy. We have to be content, now, with what our grandparents would surely have seen as a sad and shriveled “patriotism”: all that is left for us to love about our nation is a handful of philosophical postulates….

There should be no doubt that the founding of the United States rests upon a set of propositions that articulate a theory of natural law and natural rights, chief among which is the proposition that no human being is by nature rightfully sovereign over any other. (This, and pretty much only this, is what the Founders meant when they said “created equal”.) So in that sense it is correct to call the United States a “proposition nation”.

The problem is that nowadays it is all too common to stop there: to declare the United States to be a “proposition nation” and nothing more….

The founders knew very well that for a society based on natural liberty and limited government to flourish would require civic virtue, and a sense of civic duty, and that these in turn required commonality: not just the commonality of assent to a set of political abstracta, but also the natural cohesion of a community of people who share history, culture, traditions, and a broad sense of actual kinship.

John Jay wrote about this in Federalist 2 (my emphasis):

It has often given me pleasure to observe that independent America was not composed of detached and distant territories, but that one connected, fertile, widespreading country was the portion of our western sons of liberty. Providence has in a particular manner blessed it with a variety of soils and productions, and watered it with innumerable streams, for the delight and accommodation of its inhabitants. A succession of navigable waters forms a kind of chain round its borders, as if to bind it together; while the most noble rivers in the world, running at convenient distances, present them with highways for the easy communication of friendly aids, and the mutual transportation and exchange of their various commodities.

With equal pleasure I have as often taken notice that Providence has been pleased to give this one connected country to one united people–a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs, and who, by their joint counsels, arms, and efforts, fighting side by side throughout a long and bloody war, have nobly established general liberty and independence.

This country and this people seem to have been made for each other, and it appears as if it was the design of Providence, that an inheritance so proper and convenient for a band of brethren, united to each other by the strongest ties, should never be split into a number of unsocial, jealous, and alien sovereignties.

… The American Founding could not have happened elsewhere: swap out the colonial population of 1776 with a random assortment of people from everywhere on Earth and it would quickly have failed. The particularities of the “matter” upon which the American propositions were to act were every bit as determining as the “form” — the propositions — themselves.

But the unique circumstances of the Founding could not be preserved against the onslaught of abstract legalisms, which put the “propositions” of the Founding ahead of its substance: the culture of America’s predominantly British Founders. Examine the lists of signatories of America’s Founding Documents in the table below and you will see, in addition to many duplicated names and family names, only a few obviously non-British persons among the 123 listed.

Substantive liberty in America — the true liberty of beneficial cooperation based on mutual trust, respect, and forbearance — could not withstand the onslaught of three forces: (1) cultural fragmentation, (2) the concomitant rise of legalistic abstraction (e.g., free-speech absolutism), and (3) the aggressive growth of the central government, which is both the beneficiary and initiator of the first and second forces.

Monarchs of Wessex, England, and the United Kingdom

From Cerdic (r. 519-534) to Charles III (r. 2022- )

Charles III is the 83rd monarch. His royal lineage goes back to Sweyn, the 34th monarch (b. 960, r. 1013-1014).


"Intelligence" as a Dirty Word …

… and other evasions of the truth.

Once upon a time I read a post, “The Nature of Intelligence”, at a now-defunct blog called MBTI Truths. (MBTI refers to a controversial personality test: Myers-Briggs Type Indicator.) Here is the entire text of the post:

A commonly held misconception within the MBTI community is that iNtuitives are smarter than Sensors. They are thought to have higher intelligence, but this belief is misguided. In an assessment of famous people with high IQs, the vast majority of them are iNtuitive. However, IQ tests measure only two types of intelligences: linguistic and logical-mathematical. In addition to these, there are six other types of intelligence: spatial, bodily-kinesthetic, musical, interpersonal, intrapersonal, and naturalistic. Sensors would probably outscore iNtuitives in several of these areas. Perhaps MBTI users should come to see iNtuitives, who make up 25 percent of the population, as having a unique type of intelligence instead of superior intelligence.

The use of “intelligence” with respect to traits other than brain-power is misguided. “Intelligence” has a clear and unambiguous meaning in everyday language; for example:

The capacity to acquire, understand, and use knowledge.

That is the way in which I use “intelligence” in “Intelligence, Personality, Politics, and Happiness”, and it is the way in which the word is commonly understood. The application of “intelligence” to other kinds of ability — musical, interpersonal, etc. — is a fairly recent development that smacks of anti-elitism. It is a way of saying that highly intelligent individuals (where “intelligence” carries its traditional meaning) are not necessarily superior in all respects. No kidding!

As to the merits of the post at MBTI Truths, it is mere speculation to say that “Sensors would probably outscore iNtuitives in several of these” other types of ability. (And what is “naturalistic intelligence”, anyway?)

Returning to a key point of “Intelligence, Personality, Politics, and Happiness”, the claim that iNtuitives are generally smarter than Sensors is nothing but a claim about the relative capacity of iNtuitives to acquire and apply knowledge. It is quite correct to say that iNtuitives are not necessarily better than Sensors at, say, sports, music, glad-handing, and so on. It is also quite correct to say that iNtuitives generally are more intelligent than Sensors, in the standard meaning of “intelligence”.

Other so-called types of intelligence are not types of intelligence, at all. They are simply other types of ability, each of them (perhaps) valuable in its own way. But calling them types of intelligence is a transparent effort to denigrate the importance of real intelligence, which is an important determinant of significant life outcomes: learning, job performance, income, health, and criminality (in the negative).

It is a sign of the times that an important human trait is played down in an effort to inflate the egos of persons who are not well endowed with respect to that trait. The attempt to redefine or minimize intelligence is of a piece with the use of genteelisms, which Wilson Follett defines as

soft-spoken expressions that are either unnecessary or too regularly used. The modern world is much given to making up euphemisms that turn into genteelisms. Thus newspapers and politicians shirk speaking of the poor and the crippled. These persons become, respectively, the underprivileged (or disadvantaged) and the handicapped [and now -challenged and -abled: ED]. (Modern American Usage (1966), p. 169)

Finally:

Genteelisms may be of … the old-fashioned sort that will not name common things outright, such as the absurd plural bosoms for breasts, and phrases that try to conceal accidental associations of ideas, such as back of for behind. The advertiser’s genteelisms are too numerous to count. They range from the false comparative (e.g., the better hotels) to the soapy phrase (e.g., gracious living), which is supposed to poeticize and perfume the proffer of bodily comforts. (Ibid., p. 170)

And so it is that such traits as athleticism, musical virtuosity, and garrulousness become kinds of intelligence. Why? Because it is somehow inegalitarian — and therefore unmentionable — that some persons are smarter than others.

Life just isn’t fair, so get over it.

A closely related matter is the use of euphemisms. A euphemism is

an innocuous word or expression used in place of one that is deemed offensive or suggests something unpleasant.

The market in euphemisms has been cornered by politically correct leftists, who can’t confront reality and wish to erect a fantasy in its place. A case in point is a “bias-free language guide” that was posted on the website of the University of New Hampshire in 2013 and stayed there for a few years. The guide disappeared after Mark Huddleston, the university’s president, issued this statement:

While individuals on our campus have every right to express themselves, I want to make it absolutely clear that the views expressed in this guide are NOT the policy of the University of New Hampshire. I am troubled by many things in the language guide, especially the suggestion that the use of the term ‘American’ is misplaced or offensive. The only UNH policy on speech is that it is free and unfettered on our campuses. It is ironic that what was probably a well-meaning effort to be ‘sensitive’ proves offensive to many people, myself included. [Quoted in “University President Offended by Bias-Free Language Guide,” an Associated Press story published in USA Today, July 30, 2015]

The same story adds some detail about the contents of the guide:

One section warns against the terms “older people, elders, seniors, senior citizens.” It suggests “people of advanced age” as preferable, though it notes that some have “reclaimed” the term “old people.” Other preferred terms include “person of material wealth” instead of rich, “person who lacks advantages that others have” instead of poor and “people of size” to replace the word overweight.

There’s more from another source:

Saying “American” to reference Americans is also problematic. The guide encourages the use of the more inclusive substitutes “U.S. citizen” or “Resident of the U.S.”

The guide notes that “American” is problematic because it “assumes the U.S. is the only country inside [the continents of North and South America].” (The guide doesn’t address whether or not the terms “Canadians” and “Mexicans” should be abandoned in favor of “Residents of Canada” and “Residents of Mexico,” respectively.)

The guide clarifies that saying “illegal alien” is also problematic. While “undocumented immigrant” is acceptable, the guide recommends saying “person seeking asylum,” or “refugee,” instead. Even saying “foreigners” is problematic; the preferred term is “international people.”

Using the word “Caucasian” is considered problematic as well, and should be discontinued in favor of “European-American individuals.” The guide also states that the notion of race is “a social construct…that was designed to maintain slavery.”

The guide also discourages the use of “mothering” or “fathering,” so as to “avoid gendering a non-gendered activity.”

Even saying the word “healthy” is problematic, the university says. The “preferred term for people without disabilities,” the university says, is “non-disabled.” Similarly, saying “handicapped” or “physically-challenged” is also problematic. Instead, the university wants people to use the more inclusive “wheelchair user,” or “person who is wheelchair mobile.”

Using the words “rich” or “poor” is also frowned upon. Instead of saying “rich,” the university encourages people to say “person of material wealth.” Rather than saying a person is “poor,” the university encourages its members to substitute “person who lacks advantages that others have” or “low economic status related to a person’s education, occupation and income.”

Terms also considered problematic include: “elders,” “senior citizen,” “overweight” (which the guide says is “arbitrary”), “speech impediment,” “dumb,” “sexual preference,” “manpower,” “freshmen,” “mailman,” and “chairman,” in addition to many others. [Peter Hasson, “Bias-Free Language Guide Claims the Word ‘American’ Is ‘Problematic’,” Campus Reform, July 28, 2015]

And more, from yet another source:

Problematic: Opposite sex. Preferred: Other sex.

Problematic: Homosexual. Preferred: Gay, Lesbian, Same Gender Loving

Problematic: Normal … healthy or whole. Preferred: Non-disabled.

Problematic/Outdated: Mothering, fathering. Preferred: Parenting, nurturing. [Jennifer Kabbany, “University’s ‘Bias-Free Language Guide’ Criticizing Word ‘American’ Prompts Shock, Anger,” The College Fix, July 30, 2015]

The UNH students who concocted the guide — and the thousands (millions?) at other campuses who think similarly — must find it hard to express themselves clearly. Every word must be weighed before it is written or spoken, for fear of giving offense to a favored group or implying support of an idea, cause, institution, or group of which the left disapproves. (But it’s always open season on “fascist, capitalist pigs”.)

Gee, it must be nice to live in a fantasy world, where reality can be obscured or changed just by saying the right words. Here’s a thought for the fantasists of the left: You don’t need to tax, spend, and regulate Americans until they’re completely impoverished and subjugated. Just say that it’s so — and leave the rest of us alone.

Getting It Perfect

Wry commentary about the Constitution and other things.

Have you ever noticed that Americans are perfectionists? It’s true.

It all began with the U.S. Constitution. The preamble to the Constitution says it was “ordained and established” (a ringing phrase, that) “in order to form a more perfect union” — among other things. It’s been all downhill since then.

The Federalists (pro-Constitution) and anti-Federalists (anti-Constitution) continued to squabble for a decade or so after ratification of the Constitution. The anti-Federalists believed the union to be perfect enough under the Articles of Confederation. But those blasted perfectionist Federalists won the debate. So here we are.

The Federalists were such perfectionists that they left room in the Constitution for amending it. After all, a “more perfect union” can’t be attained in a day. Thus, in our striving toward perfection — Constitution-wise, that is — we have now amended it 27 times. We even adopted an amendment (XVIII, the Prohibition amendment, 1919) and only 14 years later amended it out of existence (XXI, the Repeal amendment, 1933).

But we can be very patient when it comes to perfecting the Constitution through amendments. Amendment XXVII (the most recent amendment) was submitted to the States on September 25, 1789, as part of the proposed Bill of Rights. It wasn’t ratified until May 7, 1992. Not to worry, though: Amendment XXVII isn’t about rights; it merely prevents a sitting Congress from raising its own pay:

No law, varying the compensation for the services of the Senators and Representatives, shall take effect, until an election of Representatives shall have intervened.

So now the only group of public servants that can vote itself a pay raise must wait out an election before a raise takes effect. Big deal. Most members of Congress get re-elected, anyway.

Where was I? Oh, yes, perfectionism. Well, after the Constitution was ratified, the next big squabble was about States’ rights. Some politicians from the North preferred to secede rather than remain in a union that permitted slavery. Some politicians from the South said the slavery issue was just a Northern excuse to bully the South; the South, they said, should secede from the union. The union, it seems, just wasn’t perfect enough for either the North or the South. Well, the South won that squabble by seceding first, which so ticked off the North that it dragged the South back into the union, kickin’ and hollerin’. The North had decided that the only perfect union was a whole union, rednecks and all.

The union didn’t get noticeably more perfect with the South back in the fold. Things just went squabbling along through the Spanish-American War and World War I. There was a lot more prosperity in the Roaring ’20s, but that was spoiled by Prohibition. It wasn’t hard to find a drink, but you never knew when your local speakeasy might be raided by Eliot Ness or when you might get caught in a shoot-out between rival bootleggers.

The Great Depression put an end to the Roaring ’20s, and that threw perfection for a real loop. But Franklin D. Roosevelt got the idea that he could help us out of the Depression by creating a bunch of alphabet-soup agencies, including the CCC, the PWA, the FSA, and the WPA. I guess he got his idea from his older cousin, Teddy, who created his own alphabet-soup agencies back in the early 1900s.

Well, Franklin really got the ball rolling, and it seems like almost every president since him has added a bunch of alphabet-soup agencies to the executive branch. And when a president has been unable to think of new alphabet-soup agencies, Congress has stepped in and helped him out. (It’s not yet necessary to say “him, her, or it”.) It seems that our politicians think we’ll attain perfection when there are enough agencies to use every possible combination of three letters from the alphabet. (That’s only 17,576 agencies; we must be getting close to perfection by now.)

During the Great Depression some people began to think that criminals (especially juvenile delinquents) weren’t naturally bad. Nope, their criminality was “society’s fault” (not enough jobs), and juvenile delinquents could be rehabilitated — made more perfect, if you will — through “understanding”, which would make model citizens of psychopaths. That idea was put on hold during World War II because we needed those former juvenile delinquents and their younger brothers to kill Krauts and Japs. (Oops, spell-checker doesn’t like “Japs”; “Nips” is okay, though.)

The idea of rehabilitating juvenile delinquents through “understanding” took hold after the war. (There was no longer a Great Depression, but there still weren’t enough jobs because of the minimum wage, another great advance toward perfection.) In fact, the idea of “understanding” miscreants spread beyond the ranks of juvenile delinquents to encompass every tot and pre-adolescent in the land. Corporal punishment became a no-no. Giving in to Johnny and Jill’s every whim became a yes-yes. Guess what? Johnny and Jill grew up to be voters. Politicians quickly learned not to say “no” to Johnny and Jill’s demands for — whatever — otherwise Johnny and Jill would throw a fit (and throw a politician out of office). So, politicians just got in the habit of approving things Johnny and Jill asked for. In fact, they even got in the habit of approving things Johnny and Jill might ask for. (Better safe than out of office.) A perfect union, after all, is one that grants our every wish — isn’t it? We’re not there yet, but we’re trying like hell.

Sometimes you can’t attain perfection through legislation. Then you go to court. Remember a few years ago when an Alabama jury awarded millions (millions!) of dollars to the purchaser of a new BMW who discovered that its paint job was not pristine? Or how about the small machine-tool company that was sued by a workman who lost three fingers while using (or misusing) the company’s product, even though the machine had been rebuilt at least once and had changed hands four times? (Somebody’s gotta pay for my stupidity.) Then there was the infamous case in which a jury found in favor of a woman who had burned herself with hot coffee (what did she expect?) dispensed by a fast-food chain.

The upshot of our litigiousness? The politicians elected by Johnny and Jill — ever in the pursuit of more perfection — have mandated warning labels for everything. THIS SAW IS SHARP. THIS COFFEE IS HOT. DON’T PUT THIS PLASTIC BAG OVER YOUR HEAD, STUPID. DON’T STICK YOUR HAND DOWN THIS GARBAGE DISPOSAL, YOU MORON. THIS TOY GUN WON’T KILL AN ARMED INTRUDER (HA, HA, HA, YOU GUN NUT!).

You may have noticed a trend in my tale. Politicians quit trying some years ago to perfect the union; their aim is to perfect US (We the People). That’s why they keep raising cigarette and gasoline taxes. Everyone knows that smoking is a slovenly redneck habit (movie stars excepted, of course).

As for gasoline, it’s a fossil fuel. (Think of all the dinosaurs who gave their lives so that you can guzzle gas.) But lots of people — especially politicians — “know” (because “the science” says so) that the use of fossil fuels has caused a (rather erratic) rise in Earth’s average temperature (as measured mainly by thermometers situated in urban heat-islands) amounting to 2 degrees in 170 years. (You can get the same effect by sitting in the same place for a few minutes after the sun rises.)

According to “the science” this rather puny (and mostly phony) phenomenon is due to something called the “greenhouse effect”, in which atmospheric gases capture heat and prevent it from escaping to outer space. One of the gases (a very minor one) is CO2, some of which is emitted by human activity (e.g., guzzling gas), although the amount of CO2 in the atmosphere keeps rising even when human activity slows down (as during economic recessions and pandemics). Nevertheless, politicians believe “the science” and therefore believe that the use of fossil fuels must be stopped (STOPPED) even if it means another Great Depression and mass starvation. Or, even better, stopping the use of fossil fuels will result in the near-extinction of the human race so that evil human beings will no longer be able to use fossil fuels, and good ones (who also use them but are excused because they believe in “the science”) will have all the fossil fuels to themselves.

Ah, perfection at last.

Cass Sunstein, Part 6

Farewell — for now — to the plausible authoritarian.

This is the sixth (and perhaps final) installment of a series about Cass Sunstein, whom I have dubbed the plausible authoritarian because of his ability to make authoritarian measures seem like reasonable ways of advancing democratic participation and social comity. The first five installments are here, here, here, here, and here.

Cass Sunstein (CS) became Barack Obama’s “regulatory czar” (Administrator of the Office of Information and Regulatory Affairs), following a prolonged delay in action on his nomination to the office because of his controversial views. This post draws on posts that I wrote during and after CS’s “czarship”, which lasted from September 2009 to August 2012.

Alec Rawls, writing at his blog, Error Theory, found CS on the wrong side of history (to borrow one of his boss’s favorite slogans):

As Congress considers vastly expanding the power of copyright holders to shut down fair use of their intellectual property, this is a good time to remember the other activities that Obama’s “regulatory czar” Cass Sunstein wants to shut down using the tools of copyright protection. For a couple of years now, Sunstein has been advocating that the “notice and take down” model from copyright law should be used against rumors and conspiracy theories, “to achieve the optimal chilling effect.”

Why?

What Sunstein seems most intent on suppressing is the accusation, leveled during the 2008 election campaign, that Barack Obama “pals around with terrorists.” (“Look Inside” page 3.) Sunstein fails to note that the “palling around with terrorists” language was introduced by the opposing vice presidential candidate, Governor Sarah Palin (who was implicating Obama’s relationship with domestic terrorist Bill Ayers). Instead Sunstein focuses his ire on “right wing websites” that make “hateful remarks about the alleged relationship between Barack Obama and the former radical Bill Ayers,” singling out Sean Hannity for making hay out of Obama’s “alleged associations” (op. cit., pages 13-14, no longer displayed).

What could possibly be more important than whether a candidate for president does indeed “pal around with terrorists”? Of all the subjects to declare off limits, this one is right up there with whether the anti-CO2 alarmists who are trying to unplug the modern world are telling the truth. And Sunstein’s own bias on the matter could hardly be more blatant. Bill Ayers is a “former” radical? Bill “I don’t regret setting bombs” Ayers? Bill “we didn’t do enough” Ayers?

For the facts of the Obama-Ayers relationship, Sunstein apparently accepts Obama’s campaign dismissal of Ayers as just “a guy who lives in my neighborhood.” In fact their relationship was long and deep. Obama’s political career was launched via a fundraiser in Bill Ayers’ living room; Obama was appointed the first chairman of the Ayers-founded Annenberg Challenge, almost certainly at Ayers’ request [link broken]; Ayers and Obama served together on the board of the Woods Foundation, distributing money to radical left-wing causes; and it has now been reported by full-access White House biographer Christopher Andersen (and confirmed by Bill Ayers) that Ayers actually ghost wrote Obama’s first book Dreams from My Father.

Whenever free speech is attacked, the real purpose is to cover up the truth. Not that Sunstein himself knows the truth about anything. He just knows what he wants to suppress, which is exactly why government must never have this power.

As Rawls further noted, CS also wanted to protect “warmists” from their critics, that is, to suppress science in the name of science:

In climate science, there is no avoiding “reference to the machinations of powerful people, who have also managed to conceal their role.” The Team has always been sloppy about concealing its machinations, but that doesn’t stop Sunstein from using climate skepticism as an exemplar of pernicious conspiracy theorizing, and his goal is perfectly obvious: he wants the state to take aggressive action that will make it easier for our powerful government funded scientists to conceal their machinations.

After CS returned to academe, his spirit lived on in the White House, particularly with regard to CS’s advocacy of thought control, which I exposed at length in part 4 of this series.

Thus:

[Obama] Administration officials have asked YouTube to review a controversial video that many blame for spurring a wave of anti-American violence in the Middle East.

The administration flagged the 14-minute “Innocence of Muslims” video and asked that YouTube evaluate it to determine whether it violates the site’s terms of service, officials said Thursday. The video, which has been viewed by nearly 1.7 million users, depicts Muhammad as a child molester, womanizer and murderer — and has been decried as blasphemous and Islamophobic.

“Review” it, or else. When the 500-pound gorilla speaks, you say “yes, sir.”

Way to go, O-blame-a. Do not stand up for Americans. Suppress them instead. It’s the CS way.

CS later regaled his followers with this:

Suppose that an authoritarian government decides to embark on a program of curricular reform, with the explicit goal of indoctrinating the nation’s high school students. Suppose that it wants to change the curriculum to teach students that their government is good and trustworthy, that their system is democratic and committed to the rule of law, and that free markets are a big problem.

Will such a government succeed? Or will high school students simply roll their eyes?

Questions of this kind have long been debated, but without the benefit of reliable evidence. New research, from Davide Cantoni of the University of Munich and several co-authors, shows that recent curricular reforms in China, explicitly designed to transform students’ political views, have mostly worked….

… [G]overnment planners were able to succeed in altering students’ views on fundamental questions about their nation. As Cantoni and his co-authors summarize their various findings, “the state can effectively indoctrinate students.” To be sure, families and friends matter, as do economic incentives, but if an authoritarian government is determined to move students in major ways, it may well be able to do so.

Is this conclusion limited to authoritarian nations? In a democratic country with a flourishing civil society, a high degree of pluralism, and ample room for disagreement and dissent — like the U.S. — it may well be harder to use the curriculum to change the political views of young people. But even in such societies, high schools probably have a significant ability to move students toward what they consider “a correct worldview, a correct view on life, and a correct value system.” That’s an opportunity, to be sure, but it is also a warning. [“Open Brain, Insert Ideology,” Bloomberg View, May 20, 2014]

Where had CS been? He seemed unaware of the left-wing ethos that has long prevailed in most of America’s so-called institutions of learning. It doesn’t take an authoritarian government (well, not one as authoritarian as China’s) to indoctrinate students in “a correct worldview, a correct view on life, and a correct value system”. All it takes is the spread of left-wing “values” by the media and legions of pedagogues, most of them financed (directly and indirectly) by a thoroughly subverted government. It’s almost a miracle — and something of a moral victory — that there are still tens of millions of Americans who resist and oppose left-wing “values”.

Moving on, I found CS arguing circularly in his contribution to a collection of papers entitled “Economists on the Welfare State and the Regulatory State: Why Don’t Any Argue in Favor of One and Against the Other?” (Econ Journal Watch, Volume 12, Issue 1, January 2015):

… [I]t seems unhelpful, even a recipe for confusion, to puzzle over the question whether economists (or others) ‘like,’ or ‘lean toward,’ both the regulatory state and the welfare state, or neither, or one but not the other. But there is a more fine-grained position on something like that question, and I believe that many (not all) economists would support it. The position is this: The regulatory state should restrict itself to the correction of market failures, and redistributive goals are best achieved through the tax system. Let’s call this (somewhat tendentiously) the Standard View….

My conclusion is that it is not fruitful to puzzle over the question whether economists and others ‘favor’ or ‘lean’ toward the regulatory or welfare state, and that it is better to begin by emphasizing that the first should be designed to handle market failures, and that the second should be designed to respond to economic deprivation and unjustified inequality…. [Sunstein, “Unhelpful Abstractions and the Standard View,” op cit.]

“Market failures” and “unjustified inequality” are the foundation stones of what passes for economic and social thought on the left. Every market outcome that falls short of the left’s controlling agenda is a “failure”. And market and social outcomes that fall short of the left’s illusory egalitarianism are “unjustified”. CS, in other words, couldn’t (and probably still can’t) see that he is a typical leftist who (implicitly) favors both the regulatory state and the welfare state. He is like a fish in water.

Along came a writer who seemed bent on garnering sympathy for CS. I am referring to Andrew Marantz, who wrote “How a Liberal Scholar of Conspiracy Theories Became the Subject of a Right-Wing Conspiracy Theory” (New Yorker, December 27, 2017):

In 2010, Marc Estrin, a novelist and far-left activist from Vermont, found an online version of a paper by Cass Sunstein, a professor at Harvard Law School and the most frequently cited legal scholar in the world. The paper, called “Conspiracy Theories,” was first published in 2008, in a small academic journal called the Journal of Political Philosophy. In it, Sunstein and his Harvard colleague Adrian Vermeule attempted to explain how conspiracy theories spread, especially online. At one point, they made a radical proposal: “Our main policy claim here is that government should engage in cognitive infiltration of the groups that produce conspiracy theories.” The authors’ primary example of a conspiracy theory was the belief that 9/11 was an inside job; they defined “cognitive infiltration” as a program “whereby government agents or their allies (acting either virtually or in real space, and either openly or anonymously) will undermine the crippled epistemology of believers by planting doubts about the theories and stylized facts that circulate within such groups.”

Nowhere in the final version of the paper did Sunstein and Vermeule state the obvious fact that a government ban on conspiracy theories would be unconstitutional and possibly dangerous. (In a draft that was posted online, which remains more widely read, they emphasized that censorship is “inconsistent with principles of freedom of expression,” although they “could imagine circumstances in which a conspiracy theory became so pervasive, and so dangerous, that censorship would be thinkable.”)* “I was interested in the mechanisms by which information, whether true or false, gets passed along and amplified,” Sunstein told me recently. “I wanted to know how extremists come to believe the warped things they believe, and, to a lesser extent, what might be done to interrupt their radicalization. But I suppose my writing wasn’t very clear.”

On the contrary, CS’s writing was quite clear. So clear that even leftists were alarmed by it. Returning to Marantz’s account:

When Barack Obama became President, in 2009, he appointed Sunstein, his friend and former colleague at the University of Chicago Law School, to be the administrator of the Office of Information and Regulatory Affairs. The O.I.R.A. reviews drafts of federal rules, and, using tools such as cost-benefit analysis, recommends ways to make them more efficient. O.I.R.A. administrator is the sort of in-the-weeds post that even lifelong technocrats might find unglamorous; Sunstein had often described it as his “dream job.” He took a break from academia and moved to Washington, D.C. It soon became clear that some of his published views, which he’d thought of as “maybe a bit mischievous, but basically fine, within the context of an academic journal,” could seem far more nefarious in the context of the open Internet.

Estrin, who seems to have been the first blogger to notice the “Conspiracy Theories” paper, published a post in January, 2010, under the headline “Got Fascism?” “Put into English, what Sunstein is proposing is government infiltration of groups opposing prevailing policy,” he wrote on the “alternative progressive” Web site the Rag Blog. Three days later, the journalist Daniel Tencer (Twitter bio: “Lover of great narratives in all their forms”) expanded on Estrin’s post, for Raw Story. Two days after that, the civil-libertarian journalist Glenn Greenwald wrote a piece for Salon headlined “Obama Confidant’s Spine-Chilling Proposal.” Greenwald called Sunstein’s paper “truly pernicious,” concluding, “The reason conspiracy theories resonate so much is precisely that people have learned—rationally—to distrust government actions and statements. Sunstein’s proposed covert propaganda scheme is a perfect illustration of why that is.” Sunstein’s “scheme,” as Greenwald put it, wasn’t exactly a government action or statement. Sunstein wasn’t in government when he wrote it, in 2008; he was in the academy, where his job was to invent thought experiments, including provocative ones. But Greenwald was right that not all skepticism is paranoia.

And then:

Three days after Estrin’s post was published on the Rag Blog, the fire jumped to the other side of the road. Paul Joseph Watson, writing for the libertarian conspiracist outfit InfoWars, linked to Estrin’s post and riffed on it, in a free-associative mode, for fifteen hundred words. “It is a firmly established fact that the military-industrial complex which also owns the corporate media networks in the United States has numerous programs aimed at infiltrating prominent Internet sites and spreading propaganda to counter the truth,” Watson wrote. His boss at InfoWars, Alex Jones, began expanding on this talking point on his daily radio show: “Cass Sunstein says ban conspiracy theories, and that’s whatever he says it is. That’s on record.”

At the time, Glenn Beck hosted both a daily TV show on Fox News and a syndicated radio show; according to a Harris poll, he was the country’s second-favorite TV personality, after Oprah Winfrey. Beck had been delivering impassioned rants against Sunstein for months, calling him “the most dangerous man in America.” Now he added the paper about conspiracy theories to his litany of complaints. In one typical TV segment, in April of 2010, he devoted several minutes to a close reading of the paper, which lists five possible ways that a government might respond to conspiracy theories, including banning them outright. “The government should ban them,” Beck said, over-enunciating to express his incredulity. “How a government with an amendment guaranteeing freedom of speech bans a conspiracy theory is absolutely beyond me, but it’s not beyond a great mind and a great thinker like Cass Sunstein.” In another show, Beck insinuated that Sunstein had been inspired by Edward Bernays, the author of a 1928 book called “Propaganda.” “I got a flood of messages that night, saying, ‘You should be ashamed of yourself, you’re a disciple of Bernays,’ ” Sunstein recalled. “The result was that I was led to look up this interesting guy Bernays, whom I might not have heard of otherwise.”

For much of 2010 and 2011, Sunstein was such a frequent target on right-wing talk shows that some Tea Party-affiliated members of Congress started to invoke his name as a symbol of government overreach. Earlier in the Obama Administration, Beck had targeted Van Jones, now of CNN, who was then a White House adviser on green jobs. After a few weeks of Beck’s attacks, Jones resigned. “Then Beck made it sort of clear that he wanted me to be next,” Sunstein said. “It wasn’t a pleasant fact, but I didn’t see what I could do about it. So I put it out of my mind.”

Sunstein was never asked to resign. He served as the head of O.I.R.A. for three years, then returned to Harvard, in 2012. Two years later, he published an essay collection called “Conspiracy Theories and Other Dangerous Ideas.” The first chapter was a revised version of the “Conspiracy Theories” paper, with several qualifications added and with Vermeule’s name removed. But the revisions did nothing to improve Sunstein’s standing on far-right talk shows, where he had already earned a place, along with Saul Alinsky and George Soros and Al Gore, in the pantheon of globalist bogeymen. Beck referred to Sunstein as recently as last year, on his radio show, while discussing the Obama Administration’s “propaganda” in favor of the Iran nuclear deal. “We no longer have Jefferson and Madison leading us,” Beck said. “We have Saul Alinsky and Cass Sunstein. Whatever it takes to win, you do.” Last December, Alex Jones—who is, improbably, now taken more seriously than Beck by many conservatives, including some in the White House—railed against a recent law, the Countering Foreign Propaganda and Disinformation Act, claiming, speciously, that it would “completely federalize all communications in the United States” and “put the C.I.A. in control of media.” According to Jones, blame for the law rested neither with the members of Congress who wrote it nor with President Obama, who signed it. “I was sitting here this morning . . . And I keep thinking, What are you looking at that’s triggered a memory here?” Jones said. “And then I remembered, Oh, my gosh! It’s Cass Sunstein.”

Cue the tears for Sunstein:

Recently, on the Upper East Side, Sunstein stood behind a Lucite lectern and gave a talk about “#Republic.” Attempting to end on a hopeful note, he quoted John Stuart Mill: “It is hardly possible to overrate the value . . . of placing human beings in contact with persons dissimilar to themselves.” He then admitted, with some resignation, that this describes the Internet we should want, not the Internet we have.

After the talk, we sat in a hotel restaurant and ordered coffee. Sunstein has a sense of humor about his time in the spotlight—what he calls not his fifteen minutes of fame but his Two Minutes Hate, an allusion to “1984”—and yet he wasn’t sure what lessons he had learned from the experience, if any. “I can’t say I spent much time thinking about it, then or now,” he said. “The rosy view would be that it says something hopeful about us—about Americans, that is. We’re highly distrustful of anything that looks like censorship, or spying, or restriction of freedom in any way. That’s probably a good impulse.” He folded his hands on the table, as if to signal that he had phrased his thoughts as diplomatically as possible.

I’m not buying it. CS deserved (and deserves) every bit of blame that has come his way, and I certainly wouldn’t buy a car or house from him. He was attacked from the left and right for good reason, and portraying his attackers as kooks and extremists doesn’t change the facts of the matter. Sunstein’s 2010 article wasn’t a one-off thing. Six years earlier he published “The Future of Free Speech”, which I quoted from and analyzed in part 4 of this series. I ended with this:

[T]he fundamental reason to reject [CS’s] scheme is its authoritarianism. It would effectively bring the broadcast media and the internet under the control of a government bureaucracy. Any bureaucracy that is empowered to insist upon “completeness”, “fairness”, and “balance” in the exposition of ideas is thereby empowered to define and enforce its conception of those attributes. It is easy to imagine how a bureaucracy that is dominated by power-crazed zealots who espouse socialism, gender fluidity, “equity”, etc., etc., would deploy its power.

In an earlier post I said that Cass Sunstein is to the integrity of constitutional law as Pete Rose was to the integrity of baseball. It’s worse than that: Sunstein’s willingness to abuse constitutional law in the advancement of a statist agenda reminds me of Hitler’s abuse of German law to advance his repugnant agenda.

There is remorse for having done something wrong, and there is chagrin at having been caught doing something wrong. CS’s conversation-over-coffee with Marantz reads very much like the latter.

It remains a mystery to me why CS has been called a “legal Olympian.” Then again, if there were a legal Olympics, its main events would be Obfuscation and Casuistry, and CS would be a formidable contestant in both events.

How’s Your Implicit Attitude?

Mine’s just fine, thank you.

I was unaware of the Implicit Association Test (IAT) until a few years ago, when I took a test at YourMorals.Org that purported to measure my implicit racial preferences. I’ll say more about that after discussing IAT, which has been exposed as junk. That’s what John J. Ray calls it:

Psychologists are well aware that people often do not say what they really think.  It is therefore something of a holy grail among them to find ways that WILL detect what people really think. A very popular example of that is the Implicit Associations test (IAT).  It supposedly measures racist thoughts whether you are aware of them or not.  It sometimes shows people who think they are anti-racist to be in fact secretly racist.

I dismissed it as a heap of junk long ago (here and here) but it has remained very popular and is widely accepted as revealing truth.  I am therefore pleased that a very long and thorough article has just appeared which comes to the same conclusion that I did. [“Psychology’s Favorite Tool for Measuring Racism Isn’t Up to the Job“, Political Correctness Watch, September 6, 2017]

The article in question (which has the same title as Ray’s post) is by Jesse Singal. It appeared at Science of Us on January 11, 2017. Here are some excerpts:

Perhaps no new concept from the world of academic psychology has taken hold of the public imagination more quickly and profoundly in the 21st century than implicit bias — that is, forms of bias which operate beyond the conscious awareness of individuals. That’s in large part due to the blockbuster success of the so-called implicit association test, which purports to offer a quick, easy way to measure how implicitly biased individual people are….

Since the IAT was first introduced almost 20 years ago, its architects, as well as the countless researchers and commentators who have enthusiastically embraced it, have offered it as a way to reveal to test-takers what amounts to a deep, dark secret about who they are: They may not feel racist, but in fact, the test shows that in a variety of intergroup settings, they will act racist….

[The] co-creators are Mahzarin Banaji, currently the chair of Harvard University’s psychology department, and Anthony Greenwald, a highly regarded social psychology researcher at the University of Washington. The duo introduced the test to the world at a 1998 press conference in Seattle — the accompanying press release noted that they had collected data suggesting that 90–95 percent of Americans harbored the “roots of unconscious prejudice.” The public immediately took notice: Since then, the IAT has been mostly treated as a revolutionary, revelatory piece of technology, garnering overwhelmingly positive media coverage….

Maybe the biggest driver of the IAT’s popularity and visibility, though, is the fact that anyone can take the test on the Project Implicit website, which launched shortly after the test was unveiled and which is hosted by Harvard University. The test’s architects reported that, by October 2015, more than 17 million individual test sessions had been completed on the website. As will become clear, learning one’s IAT results is, for many people, a very big deal that changes how they view themselves and their place in the world.

Given all this excitement, it might feel safe to assume that the IAT really does measure people’s propensity to commit real-world acts of implicit bias against marginalized groups, and that it does so in a dependable, clearly understood way….

Unfortunately, none of that is true. A pile of scholarly work, some of it published in top psychology journals and most of it ignored by the media, suggests that the IAT falls far short of the quality-control standards normally expected of psychological instruments. The IAT, this research suggests, is a noisy, unreliable measure that correlates far too weakly with any real-world outcomes to be used to predict individuals’ behavior — even the test’s creators have now admitted as such.

How does IAT work? Singal summarizes:

You sit down at a computer where you are shown a series of images and/or words. First, you’re instructed to hit ‘i’ when you see a “good” term like pleasant, or to hit ‘e’ when you see a “bad” one like tragedy. Then, hit ‘i’ when you see a black face, and hit ‘e’ when you see a white one. Easy enough, but soon things get slightly more complex: Hit ‘i’ when you see a good word or an image of a black person, and ‘e’ when you see a bad word or an image of a white person. Then the categories flip to black/bad and white/good. As you peck away at the keyboard, the computer measures your reaction times, which it plugs into an algorithm. That algorithm, in turn, generates your score.

If you were quicker to associate good words with white faces than good words with black faces, and/or slower to associate bad words with white faces than bad words with black ones, then the test will report that you have a slight, moderate, or strong “preference for white faces over black faces,” or some similar language. You might also find you have an anti-white bias, though that is significantly less common. By the normal scoring conventions of the test, positive scores indicate bias against the out-group, while negative ones indicate bias against the in-group.

The rough idea is that, as humans, we have an easier time connecting concepts that are already tightly linked in our brains, and a tougher time connecting concepts that aren’t. The longer it takes to connect “black” and “good” relative to “white” and “good,” the thinking goes, the more your unconscious biases favor white people over black people.
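The scoring logic Singal describes can be sketched in a few lines. This is a simplified illustration, not the official Project Implicit algorithm: it captures the gist of Greenwald’s “D” measure (difference in mean reaction times between the two pairing blocks, scaled by the pooled variability of all trials), and the function name and sample latencies are my own invention.

```python
import statistics

def iat_style_score(compatible_rts, incompatible_rts):
    """Toy IAT-style score: the difference in mean response latency
    (in milliseconds) between the two pairing blocks, divided by the
    pooled standard deviation of all trials. Positive values mean the
    'incompatible' block (e.g., black/good, white/bad) was slower."""
    mean_diff = statistics.mean(incompatible_rts) - statistics.mean(compatible_rts)
    pooled_sd = statistics.stdev(compatible_rts + incompatible_rts)
    return mean_diff / pooled_sd

# Hypothetical latencies: slower responses on the black/good pairing
# would yield a positive score, reported as a "preference for white faces".
score = iat_style_score([650, 700, 680], [720, 760, 740])
```

Note how fragile this is: the score rests entirely on small differences in keystroke latency, which is why practice effects, block order, and sheer reflex speed can swamp whatever the test claims to measure.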

Singal continues (at great length) to pile up the mountain of evidence against IAT, and to caution against reading anything into the results it yields.

Having become aware of the debunking of IAT, I went to the website of Project Implicit. I was surprised to learn that I could not only find out whether I’m a closet racist but also whether I prefer dark or light skin tones, Asians or non-Asians, Trump or a previous president, and several other things or their opposites. I chose to discover my true feelings about Trump vs. a previous president, and was faced with a choice between Trump and Clinton.

What was the result of my several minutes of tapping “e” and “i” on the keyboard of my PC? This:

Your data suggest a moderate automatic preference for Bill Clinton over Donald Trump.

Balderdash! Though Trump is obviously not of better character than Clinton, he’s obviously not of worse character. And insofar as policy goes, the difference between Trump and Clinton is somewhat like the difference between a non-silent Calvin Coolidge and an FDR without the patriotism. (With apologies to the memory of Coolidge, my favorite president.)

Now, what did IAT say about my racism, or lack thereof? For years I proudly posted these results at the bottom of my “About” page and in the accompanying moral profile:

The study you just completed is an Implicit Association Test (IAT) that compares the strength of automatic mental associations. In this version of the IAT, we investigated positive and negative associations with the categories of “African Americans” and “European Americans”.

The idea behind the IAT is that concepts with very closely related (vs. unrelated) mental representations are more easily and quickly responded to as a single unit. For example, if “European American” and “good” are strongly associated in one’s mind, it should be relatively easy to respond quickly to this pairing by pressing the “E” or “I” key. If “European American” and “good” are NOT strongly associated, it should be more difficult to respond quickly to this pairing. By comparing reaction times on this test, the IAT gives a relative measure of how strongly associated the two categories (European Americans, African Americans) are to mental representations of “good” and “bad”. Each participant receives a single score, and your score appears below.

Your score on the IAT was 0.07.

Positive scores indicate a greater implicit preference for European Americans relative to African Americans, and negative scores indicate an implicit preference for African Americans relative to European Americans.

Your score appears in the graph below in green. The score of the average Liberal visitor to this site is shown in blue and the average Conservative visitor’s score is shown in red.

[Graph: moral profile — Implicit Association Test scores]

It should be noted that my slightly positive score probably was influenced by the order in which choices were presented to me. Initially, pleasant concepts were associated with photos of European-Americans. I became used to that association, and so found that it affected my reaction time when I was faced with pairings of pleasant concepts and photos of African-Americans. The bottom line: My slight preference for European-Americans probably is an artifact of test design.

In other words, I believed that my very low score, despite the test set-up, “proved” that I am not a racist. But thanks (or no thanks) to John Ray and Jesse Singal, I must conclude, sadly, that I have no “official” proof of my non-racism.

I suspect that I am not a racist. I don’t despise blacks as a group, nor do I believe that they should have fewer rights and privileges than whites. (Neither do I believe that they should have more rights and privileges than whites or persons of Asian or Ashkenazi Jewish descent — but they certainly do when it comes to college admissions, hiring, and firing.) It isn’t racist to understand that race isn’t a social construct (except in a meaningless way) and that there are general differences between races (see many of the posts listed here). That’s just a matter of facing facts, not ducking them, as leftists are wont to do.

What have I learned from the IAT? I must have very good reflexes. A person who processes information rapidly and then almost instantly translates it into a physical response should be able to “beat” the IAT. And that’s probably what I did in the Trump vs. Clinton test, if not in the racism test. I’m a fast typist and very quick at catching dropped items before they hit the floor. (My IQ, or what’s left of it, isn’t bad either; go here and scroll down to the section headed “Intelligence, Temperament, and Beliefs”.)

Perhaps the IAT for racism could be used to screen candidates for fighter-pilot training. Only “non-racists” would be admitted. Anyone who isn’t quick enough to avoid the “racist” label isn’t quick enough to win a dogfight.

The Iraq War in Retrospect

Hold your moral judgments.

The Iraq War has been called many things, “immoral” being among the leading adjectives for it. Was it altogether immoral? Was it immoral to remain in favor of the war after it was (purportedly) discovered that Saddam Hussein didn’t have an active program for the production of weapons of mass destruction? Or was the war simply misdirected from its proper — and moral — purpose: the service of Americans’ interests by stabilizing the Middle East? I address those and other questions about the war in what follows.

THE WAR-MAKING POWER AND ITS PURPOSE

The sole justification for the United States government is the protection of Americans’ interests. Those interests are spelled out broadly in the Preamble to the Constitution: justice, domestic tranquility, the common defense, the general welfare, and the blessings of liberty.

Contrary to leftist rhetoric, the term “general welfare” in the Preamble (and in Article I, Section 8) doesn’t grant broad power to the national government to do whatever it deems to be “good”. “General welfare” — general well-being, not the well-being of particular regions or classes — is merely one of the intended effects of the enumerated and limited powers granted to the national government by conventions of the States.

One of the national government’s specified powers is the making of war. In the historical context of the adoption of the Constitution, it is clear that the purpose of the war-making power is to defend Americans and their legitimate interests: liberty generally and, among other things, the free flow of trade between American and foreign entities. The war-making power carries with it the implied power to do harm to foreigners in the course of waging war. I say that because the Framers, many of whom fought for independence from Britain, knew from experience that war, of necessity, must sometimes cause damage to the persons and property of non-combatants.

In some cases, the only way to serve the interests of Americans is to inflict deliberate damage on non-combatants. That was the case, for example, when U.S. air forces dropped atomic bombs on Hiroshima and Nagasaki to force Japan’s surrender and avoid the deaths and injuries of perhaps a million Americans. Couldn’t Japan have been “quarantined” instead, once its forces had been driven back to the homeland? Perhaps, but at great cost to Americans. Luckily, in those days American leaders understood that the best way to ensure that an enemy didn’t resurrect its military power was to defeat it unconditionally and to occupy its homeland. You will have noticed that as a result, Germany and Japan are no longer military threats to the U.S., whereas Iraq remained one after the Gulf War of 1990-1991 because Saddam wasn’t deposed. Russia, which the U.S. didn’t defeat militarily — only symbolically — is resurgent militarily. China, which wasn’t even defeated symbolically in the Cold War, is similarly resurgent, and bent on regional if not global hegemony, necessarily to the detriment of Americans’ interests. To paraphrase: There is no substitute for unconditional military victory.

That is a hard and unfortunate truth, but it eludes many persons, especially those of the left. They suffer under dual illusions, namely, that the Constitution is an outmoded document and that “world opinion” trumps the Constitution and the national sovereignty created by it. Neither illusion is shared by Americans who want to live in something resembling liberty and to enjoy the advantages pertaining thereto, including prosperity.

CASUS BELLI

The invasion of Iraq in 2003 by the armed forces of the U.S. government (and those of other nations) had explicit and implicit justifications. The explicit justifications for the U.S. government’s actions are spelled out in the Authorization for Use of Military Force Against Iraq of 2002 (AUMF). It passed the House by a vote of 296 – 133 and the Senate by a vote of 77 – 23, and was signed into law by President George W. Bush on October 16, 2002.

There are some who focus on the “weapons of mass destruction” (WMD) justification, which figures prominently in the “whereas” clauses of the AUMF. But the war, as it came to pass when Saddam failed to respond to legitimate demands spelled out in the AUMF, had a broader justification than whatever Saddam was (or wasn’t) doing with WMD. The final “whereas” puts it succinctly: it is in the national security interests of the United States to restore international peace and security to the Persian Gulf region.

An unstated but clearly understood implication of “peace and security in the Persian Gulf region” was the security of the region’s oil supply against Saddam’s capriciousness. The mantra “no blood for oil” to the contrary notwithstanding, it is just as important to defend the livelihoods of Americans as it is to defend their lives — and in many instances it comes to the same thing.

In sum, I disregard the WMD rationale for the Iraq War. The real issue is whether the war secured the stability of the Persian Gulf region (and the Middle East in general). And if it didn’t, why did it fail to do so?

ROADS TAKEN AND NOT TAKEN

One can only speculate about what might have happened in the absence of the Iraq War. For instance, how many more Iraqis might have been killed and tortured by Saddam’s agents? How many more terrorists might have been harbored and financed by Saddam? How long might it have taken him to re-establish his WMD program or build a nuclear weapons program? Saddam, who started it all with the invasion of Kuwait, wasn’t a friend of the U.S. or the West in general. The U.S. isn’t the world’s policeman, but the U.S. government has a moral obligation to defend the interests of Americans, preemptively if necessary.

By the same token, one can only speculate about what might have happened if the U.S. government had prosecuted the war differently than it did, which was “on the cheap”. There weren’t enough boots on the ground to maintain order in the way that it was maintained by the military occupations in Germany and Japan after World War II. Had there been, there wouldn’t have been a kind of “civil war” or general chaos in Iraq after Saddam was deposed. (It was those things, as much as the supposed absence of a WMD program, that turned many Americans against the war.)

Speculation aside, I supported the invasion of Iraq, the removal of Saddam, and the rout of Iraq’s armed forces with the following results in mind:

  • A firm military occupation of Iraq, for some years to come.

  • The presence in Iraq and adjacent waters and airspace of U.S. forces in enough strength to control Iraq and deter misadventures by other nations in the region (e.g., Iran and Syria) and prospective interlopers (e.g., Russia).

  • Israel’s continued survival and prosperity under the large shadow cast by U.S. forces in the region.

  • Secure production and shipment of oil from Iraq and other oil-producing nations in the region.

All of that would have happened but for (a) too few boots on the ground (later remedied in part by the “surge”); (b) premature “nation-building”, which helped to stir up various factions in Iraq; (c) Obama’s premature surrender, which he was shamed into reversing; and (d) Obama’s deal with Iran, with its bundles of cash and blind-eye enforcement that supported Iran’s rearmament and growing boldness in the region. (The idea that Iraq, under Saddam, had somehow contained Iran is baloney; Iran was contained only until its threat to go nuclear found a sucker in Obama.)

In sum, the war was only a partial success because (once again) U.S. leaders failed to wage it fully and resolutely. This was due in no small part to incessant criticism of the war, stirred up and sustained by Democrats and the media.

WHO HAD THE MORAL HIGH GROUND?

In view of the foregoing, the correct answer is: the U.S. government, or those of its leaders who approved, funded, planned, and executed the war with the aim of bringing peace and security to the Persian Gulf region for the sake of Americans’ interests.

The moral high ground was shared by those Americans who, understanding the war’s justification on grounds broader than WMD, remained steadfast in support of the war despite the tumult and shouting that arose from its opponents.

There were Americans whose support of the war was based on the claim that Saddam had or was developing WMD, and whose support ended or became less ardent when WMD seemed not to be in evidence. I wouldn’t presume to judge them harshly for withdrawing their support, but I would judge them myopic for basing it solely on the WMD predicate. And I would judge them harshly if they joined the outspoken opponents of the war, whose opposition I address below.

What about those Americans who supported the war simply because they believed that President Bush and his advisers “knew what they were doing” or out of a sense of patriotism? That is to say, they had no particular reason for supporting the war other than a general belief that its successful execution would be a “good thing”. None of those Americans deserves moral approbation or moral blame. They simply had better things to do with their lives than to parse the reasons for going to war and for continuing it. And it is no one’s place to judge them for not having wasted their time in thinking about something that was beyond their ability to influence. (See the discussion of “public opinion” below.)

What about those Americans who publicly opposed the war, either from the beginning or later? I cannot fault all of them for their opposition — and certainly not those who considered the costs (human and monetary) and deemed them not worth the possible gains.

But there were (and are) others whose opposition to the war was and is problematic:

  • Critics of the apparent absence of an active WMD program in Iraq, who seized on the WMD justification and ignored (or failed to grasp) the war’s broader justification.

  • Political opportunists who simply wanted to discredit President Bush and his party; their ranks eventually included most Democrats, effete elites generally, and particularly most members of the academic-media-information technology complex.

  • An increasingly large share of the impressionable electorate who could not (and cannot) resist a bandwagon.

  • The young, with their reflexive pro-peace/anti-war posturing, prone to oppose “the establishment” loudly and often violently.

The moral high ground isn’t gained by misguided criticism, posturing, joining a bandwagon, or hormonal emotionalism.

WHAT ABOUT “PUBLIC OPINION”?

Suppose you had concluded that the Iraq War was wrong because the WMD justification seemed to have been proven false as the war went on. Perhaps even worse than false: a fraud perpetrated by officials of the Bush administration, if not by the president himself, to push Congress and “public opinion” toward support for an invasion of Iraq.

If your main worry about Iraq, under Saddam, was the possibility that WMD would be used against Americans, the apparent falsity of the WMD claim — perhaps fraudulent falsity — might well have turned you against the war. Suppose that there were many millions of Americans like you, whose initial support of the war turned to disillusionment as evidence of an active WMD program failed to materialize. Would voicing your opinion on the matter have helped to end the war? Did you have a moral obligation to voice your opinion? And, in any event, should wars be ended because of “public opinion”? I will try to answer those questions in what follows.

The strongest case to be made for the persuasive value of voicing one’s opinion might be found in the median-voter theorem. According to Wikipedia, the median-voter theorem

states that “a majority rule voting system will select the outcome most preferred by the median voter”….

The median voter theorem rests on two main assumptions, with several others detailed below. The theorem is assuming [sic] that voters can place all alternatives along a one-dimensional political spectrum. It seems plausible that voters could do this if they can clearly place political candidates on a left-to-right continuum, but this is often not the case as each party will have its own policy on each of many different issues. Similarly, in the case of a referendum, the alternatives on offer may cover more than one issue. Second, the theorem assumes that voters’ preferences are single-peaked, which means that voters have one alternative that they favor more than any other. It also assumes that voters always vote, regardless of how far the alternatives are from their own views. The median voter theorem implies that voters have an incentive to vote for their true preferences. Finally, the median voter theorem applies best to a majoritarian election system.

The article later specifies seven assumptions underlying the theorem. None of the assumptions is satisfied in the real world of American politics. Complexity never favors the truth of any proposition; it simply allows the proposition to be wrong in more ways if all of the assumptions must be true, as is the case here.

There is a weak form of the theorem, which says that

the median voter always casts his or her vote for the policy that is adopted. If there is a median voter, his or her preferred policy will beat any other alternative in a pairwise vote.

That still leaves the crucial assumption that voters are choosing between two options. This is superficially true in the case of a two-person race for office or a yes-no referendum. But, even then, a binary option usually masks non-binary ramifications that voters take into account.

In any case, it is trivially true to say that the preference of the median voter foretells the outcome of a binary election, if the outcome is decided by majority vote and there isn’t a complicating factor like the electoral college. One could say, with equal banality, that the stronger man wins the weight-lifting contest, the outcome of which determines who is the stronger man.
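The weak form is, in fact, easy to demonstrate in a few lines of code. The sketch below is a toy illustration, with made-up ideal points: voters sit on a one-dimensional spectrum, and each is assumed to prefer whichever of two alternatives lies closer to his ideal point. Under those assumptions, the alternative nearer the median voter wins every pairwise majority vote — which is all the weak form claims.

```python
import statistics

def pairwise_winner(ideal_points, a, b):
    """Each voter backs whichever alternative lies closer to his
    ideal point on the one-dimensional spectrum; majority wins."""
    votes_a = sum(1 for x in ideal_points if abs(x - a) < abs(x - b))
    votes_b = sum(1 for x in ideal_points if abs(x - b) < abs(x - a))
    return a if votes_a > votes_b else b

# Hypothetical ideal points on a 0-to-1 spectrum.
voters = [0.1, 0.3, 0.4, 0.6, 0.9]
median = statistics.median(voters)   # 0.4

# In any pairwise contest, the alternative closer to the median wins.
print(pairwise_winner(voters, 0.3, 0.8))    # 0.3 (closer to 0.4)
print(pairwise_winner(voters, 0.5, 0.35))   # 0.35 (closer to 0.4)
```

Note what the demonstration smuggles in: a single dimension, single-peaked (distance-based) preferences, and universal turnout — exactly the assumptions that, as discussed above, fail in the real world of American politics.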

Why am I giving so much attention to the median-voter theorem? Because, according to a blogger whose intellectual prowess I respect, if enough Americans believe a policy of the U.S. government to be wrong, the policy might well be rescinded if the responsible elected officials (or, presumably, their prospective successors) believe that the median voter wants the policy rescinded. How would that work?

The following summary of the blogger’s case is what I gleaned from his original post on the subject and several comments and replies. I have inserted parenthetical commentary throughout.

  • The pursuit of the Iraq War after the WMD predicate for it was (seemingly) falsified — hereinafter policy X — was immoral because X led unnecessarily to casualties, devastation, and other costs. (As discussed above, there were other predicates for X and other consequences of X, some of them good, but they don’t seem to matter to the blogger.)

  • Because X was immoral (in the blogger’s reckoning), X should have been rescinded.

  • Rescission would have (might have?/should have?) occurred through the operation of the median-voter theorem if enough persons had made known their opposition to X. (How might the median-voter theorem have applied when X wasn’t on a ballot? See below.)

  • Any person who had taken the time to consider X (taking into account only the WMD predicate and unequivocally bad consequences) could only have deemed it immoral. (The blogger originally excused persons who deemed X proper, but later made a statement equivalent to the preceding sentence. This is a variant of “heads, I win; tails, you lose”.)

  • Having deemed X immoral, a person (i.e., a competent, adult American) would have been morally obliged to make known his opposition to X. Even if the person didn’t know of the spurious median-voter theorem, his opposition to X (which wasn’t on a ballot) would somehow have become known and counted (perhaps in a biased opinion poll conducted by an entity opposed to X) and would therefore have helped to move the median stance of the (selectively) polled fragment of the populace toward opposition to X, whereupon X would be rescinded, according to the median-voter theorem. (Or perhaps vociferous opposition, expressed in public protests, would be reported by the media — especially by those already opposed to X — as indicative of public opinion, whether or not it represented a median view of X.)

  • Further, any competent, adult American who didn’t bother to take the time to evaluate X would have been morally complicit in the continuation of X. (This must be the case because the blogger says so, without knowing each person’s assessment of the slim chance that his view of the matter would affect X, or the opportunity costs of evaluating X and expressing his view of it.)

  • So the only moral course of action, according to the blogger, was for every competent, adult American to have taken the time to evaluate X (in terms of the WMD predicate), to have deemed it immoral (there being no other choice given the constraint just mentioned), and to have made known his opposition to the policy. (This despite the fact that most competent, adult Americans know viscerally or from experience that the median-voter theorem is hooey — more about that below — and that it would therefore have been a waste of their time to get worked up about a policy that wasn’t unambiguously immoral. Further, they were and are rightly reluctant to align themselves with howling mobs and biased media — even by implication, as in a letter to the editor — in protest of a policy that wasn’t unambiguously immoral.)

  • Then, X (which wasn’t on a ballot) would have been rescinded, pursuant to the median-voter theorem (or, properly, the outraged/vociferous-pollee/protester-biased pollster/media theorem). (Except that X wasn’t, in fact, rescinded despite massive outpourings of outrage by small fractions of the populace, which were gleefully reflected in biased polls and reported by biased media. Nor was it rescinded by implication when President Bush was up for re-election — he won. It might have been rescinded by implication when Bush was succeeded by Obama — an opponent of X — but there were many reasons other than X for Obama’s victory: mainly the financial crisis, McCain’s lame candidacy, and a desire by many voters to signal — to themselves, at least — their non-racism by voting for Obama. And X wasn’t doing all that badly at the time of Obama’s election because of the troop “surge” authorized by Bush. Further, Obama’s later attempt to rescind X had consequences that caused him to reverse his attempted rescission, regardless of any lingering opposition to X.)

What about other salient, non-ballot issues? Does “public opinion” make a difference? Sometimes yes, sometimes no. Obamacare, for example, was widely opposed until it was enacted by Congress and signed into law by Obama. It suddenly became popular because much of the populace wants to be on the “winning side” of an issue. (So much for the moral value of public opinion.) Similarly, abortion was widely deemed to be immoral until the Supreme Court legalized it. Suddenly, it began to become acceptable according to “public opinion”. I could go on and on, but you get the idea: Public opinion often follows policy rather than leading it, and its moral value is dubious in any event.

But what about cases where government policy shifted in the aftermath of widespread demonstrations and protests? Did demonstrations and protests lead to the enactment of the Civil Rights Acts of the 1960s? Did they cause the U.S. government to surrender, in effect, to North Vietnam? No and no. From where I sat — and I was a politically aware, voting-age, adult American of the “liberal” persuasion at the time of those events — public opinion had little effect on the officials who were responsible for the Civil Rights Acts or the bug-out from Vietnam.

The civil-rights movement of the 1950s and 1960s and the anti-war movement of the 1960s and 1970s didn’t yield results until years after their inception. And those results didn’t (at the time, at least) represent the views of most Americans who (I submit) were either indifferent or hostile to the advancement of blacks and to the anti-patriotic undertones of the anti-war movement. In both cases, mass protests were used by the media (and incited by the promise of media attention) to shame responsible officials into acting as media elites wanted them to.

Further, it is a mistake to assume that the resulting changes in law (writ broadly to include policy) were necessarily good changes. The stampede to enact civil-rights laws in the 1960s, which hinged not so much on mass protests as on LBJ’s “white guilt” and powers of persuasion, resulted in the political suppression of an entire region, the loss of property rights, and the denial of freedom of association. (See, for example, Christopher Caldwell’s “The Roots of Our Partisan Divide”, Imprimis, February 2020.)

The bug-out from Vietnam foretold the U.S. government’s fecklessness in the Iran hostage crisis; the withdrawal of U.S. forces from Lebanon after the bombing of the Marine barracks there; the failure of G.H.W. Bush to depose Saddam when it would have been easy to do so; the legalistic response to the World Trade Center bombing; the humiliating affair in Somalia; Clinton’s failure to take out Osama bin Laden; Clinton’s tepid response to Saddam’s provocations; nation-building (vice military occupation) in Iraq; and Obama’s attempt to pry defeat from the jaws of something resembling victory in Iraq.

All of that, and more, is symptomatic of the influence that “liberal” elites came to exert on American foreign and defense policy after World War II. Public opinion has been a side show, and protestors have been useful idiots to the cause of “liberal internationalism”, that is, the surrender of Americans’ economic and security interests for the sake of various rapprochements toward “allies” who scorn America when it veers ever so slightly from the road to serfdom, and enemies — Russia and China — who have never changed their spots, despite “liberal” wishful thinking. Handing America’s manufacturing base to China in the name of free trade is of a piece with all the rest.

IN CONCLUSION . . .

It is irresponsible to call a policy immoral without evaluating all of its predicates and consequences. One might as well call the Allied leaders of World War II immoral because they chose war — with all of its predictably terrible consequences — rather than abject surrender.

It is fatuous to ascribe immorality to anyone who was supportive of or indifferent to the war. One might as well ascribe immorality to the economic and political ignoramuses who failed to see that FDR’s policies would prolong the Great Depression, that Social Security and its progeny (Medicare and Medicaid) would become entitlements that paved the way for the central government’s commandeering of vast portions of the economy, or that the so-called social safety net would discourage work and permanently depress economic growth in America.

If I were in the business of issuing moral judgments about the Iraq War, I would condemn the strident anti-war faction for its perfidy.

I've Got a Little List …

… of irritating persons to be taken out and shot,
And who never would be missed — who never would be missed!
There’s the pestilential nuisances who shout into their phones,
Baring inner secrets at the volume of trombones —
All people who wear stubbly beards and iridescent tats —
All children who are petulant and whiny little brats —
All drivers who in changing lanes do so without a glance —
And others who stare at green lights as if lost in a trance —
They’d none of ‘em be missed — they’d none of ‘em be missed!

CHORUS. He’s got ‘em on the list — he’s got ‘em on the list;
And they’ll none of ‘em be missed — they’ll none of
‘em be missed.

There’s the rap and hip-hop devotee, and the others of his ilk,
And the break-dance enthusiast — I’ve got him on the list!
And the people who eat a sushi roll and puff it in your face,
They never would be missed — they never would be missed!
Then the idiot who praises, with enthusiastic tone,
Films that don’t have endings, and all races but his own;
And the “lady” in the leotard, who looks just like a guy,
And who doesn’t need to marry, but would rather like to
try;
And that singular anomaly, the wealthy socialist —
I don’t think he’d be missed — I’m sure he’d not be missed!

CHORUS. He’s got him on the list — he’s got him on the list;
And I don’t think he’ll be missed — I’m sure
he’ll not be missed!

And that jurisprudential malcontent, who just now is rather rife,
The loose constructionist — I’ve got him on the list!
All perfumed fellows, girly men, and dykes who seek a “wife” —
They’d none of ‘em be missed — they’d none of ‘em be missed.
And apologetic statesmen of a compromising kind,
Such as — What d’ye call him — Thing’em-bob, and
likewise — Never-mind,
And ‘St–’st–’st–and What’s-his-name, and also You-know-who —
The task of filling up the blanks I’d rather leave to you.
But it really doesn’t matter whom you put upon the list,
For they’d none of ‘em be missed — they’d none of ‘em be
missed!

CHORUS. You may put ‘em on the list — you may put ‘em on the list;
And they’ll none of ‘em be missed — they’ll none of
‘em be missed!
__________
Adapted from W.S. Gilbert’s lyrics for “I’ve Got a Little List,” which is sung by the character Ko-Ko in Gilbert and Sullivan’s The Mikado (1885). The original lyrics, with annotations, may be found here. The last seven lines of the final verse are unchanged, Gilbert’s observations about “statesmen” being timeless.

Driving and Politics

Reflections on my years in Austin.

Among the many reasons for my hatred of flying is that I am usually seated behind someone who fails to heed the notice to return his or her seat-back to the upright position. This is a mild annoyance, compared with the severe annoyances and outright dangers that go with driving in Austin. Austiners (a moniker that I prefer to the pretentiousness of “Austinites”) exhibit a variety of egregious driving habits, the number of which exceeds the number of Willie (The Actor) Sutton’s convictions for bank robbery.

Without further ado, I give you driving in Austin:

First on the list, because I see it so often in my neck of Austin, is driving in the middle of an unstriped, residential street, even as another vehicle approaches. This practice might be excused as a precaution, because Austiners often exit parked cars by opening doors and stepping out, heedless of traffic. But middle-of-the-road driving occurs spontaneously and is of a piece with the following self-centered habits.

Next is waiting until the last split-second to turn onto a street. This practice — which prevails along Florida’s Gulf Coast because of the age of the population there — is indulged in by drivers of all ages in Austin. It is closely related to the habit of ignoring stop signs, not just by failing to stop at them but also (and quite typically) by failing to look before not stopping. Ditto — and more dangerously — red lights.

Not quite as dangerous, but mightily annoying, is the Austin habit of turning abruptly without giving a signal. And when the turn is to the right, it often is accompanied by a loop to the left, which thoroughly confuses the driver of the following vehicle and can cause him to veer into danger.

Loopy driving reaches new heights when an Austiner changes lanes or crosses lanes of traffic without looking. A signal, rarely given, occurs after the driver has made his or her move, and it means “I’m changing/crossing lanes because it’s my God-given right to do so whenever I feel like it, and it’s up to other drivers to avoid hitting my vehicle.”

The imperial prerogative — I drive where I please — also manifests itself in the form of crossing the center line while taking a curve. That this is done by drivers of all types of vehicle, from itsy-bitsy cars to hulking SUVs, indicates that the problem is sloppy driving habits, not unresponsive steering mechanisms. Other, closely related practices are taking a corner by cutting across the oncoming lane of traffic and zipping through a parking lot as if no child, other pedestrian, or vehicle might suddenly appear in the traffic lane.

At the other end of the spectrum, but just as indicative of thoughtlessness, is the practice of yielding the right of way when it’s yours. This perverse courtesy only confuses the driver who doesn’t have the right of way and causes traffic to back up (needlessly) behind the yielding driver.

Then there is the seeming inability of most Austiners to park approximately in the middle of a head-in parking space and parallel to the stripes that delineate it. The ranks of the parking-challenged seem to be filled with yuppie women in small BMWs, Infinitis, and Lexi; older women in almost any kind of vehicle; and (worst of all) drivers of SUVs — of which “green” Austin has far more than its share on its antiquated street grid. It should go without saying that most of Austin’s SUV drivers are obnoxious, tail-gating jerks when they’re on the road.

Contributing to the preceding practices — and compounding the dangers of the many dangerous ones — is the evidently inalienable right of an Austiner to talk on a cell phone while driving, everywhere and (it seems) always. Yuppie women in SUVs are the worst offenders, and the most dangerous of the lot because of their self-absorption and the number of tons they wield with consummate lack of skill. Austin, it should also go without saying, has more than its share of yuppie women.

None of the above is unique to Austin. But inconsiderate and dangerous driving habits seem much more prevalent in Austin than in other places where I have driven — even including the D.C. area, where I spent 37 years.

My theory is that the prevalence of bad-driving behavior in Austin — where “liberalism” is hard-left and dominant — reflects the essentially anti-social character of “liberalism”. Despite the lip-service that “liberals” give to such things as compassion, community, and society, they worship the state and use its power to do their will — without thought or care for the lives and livelihoods thus twisted and damaged.

Cass Sunstein, Part 5

The plausible authoritarian sits down for an (imaginary) interview.

Loquitur Veritatem (LV): Apropos my previous post, I wish you would quit beating around the bush. If you want something, you have to spell it out. Don’t be coy, Cass, tell us how you would amend the Constitution to ensure that all internet users are exposed to points of view that they would otherwise eschew.

Cass Sunstein (CS): Let’s start with the First Amendment, which deals with freedom of speech and of the press, among other things. I’m suggesting that we simply recognize that not all speech is protected and use that fact to force the purveyors of extreme points of view to acknowledge opposing points of view.

LV: Tell us how you would restate the First Amendment so that it does the right thing.

CS: I would add the following codicil: Congress, in order to promote a more efficacious deliberative democracy, may require persons to acknowledge opposing points of view when they communicate on a subject. Further, Congress may require communications media to assist in that endeavor and to transmit points of view other than those which they might willingly transmit.

LV: So, in the name of political freedom you would curtail freedom?

CS: I don’t think of it that way. We’re all more free, in an intellectual way, when we’re exposed to a diversity of experiences and points of view. Besides, freedom is something we receive from government; government may therefore withdraw some freedom from us when it’s for our good.

LV: Let’s assume, for the sake of this discussion, that people desire political freedom, and the social and economic freedoms that flow from it. Would we really be more free if government forced us to hear, or at least take part in the transmission of, views with which we disagree, or would we simply be encumbered with more rules about how to live our lives?

CS: That’s a negative way of looking at it.

LV: Let me draw an analogy from fiction. Have you read Portnoy’s Complaint?

CS: You aren’t about to slur my ethnicity, are you?

LV: No, not at all. It’s just that the novel’s protagonist, Alex Portnoy, has an experience that reminds me of your proposed codicil to the First Amendment. His mother stood over him with a knife in an effort to make him eat his dinner. Do you think government should act like Alex Portnoy’s mother?

CS: Well, she didn’t need to pull a knife on Alex, but she obviously needed to exert her maternal authority.

LV: You don’t think Alex would have voluntarily eaten his dinner in a day or two rather than starve?

CS: Why take chances? Alex’s mother obviously suffered from anxiety caused by Alex’s refusal to eat his dinner.

LV: But Alex’s mother — being older and larger than Alex, though evidently not wiser — might have reflected on the ramifications of her threat. She didn’t really save Alex from starvation, but she did cause him to disrespect and hate her.

CS: What does that have to do with my version of the First Amendment?

LV: It has a lot to do with what happens to the cohesiveness of society, which you seem to value, when government forces people to behave in certain ways. Consider blacks, for example, who in many cases have been disrespected because they are seen as “affirmative action” doctors, lawyers, teachers, etc. But let’s move on. What about the rules that would require the acknowledgement of opposing points of view? Who would make those rules? In particular, with respect to web sites, who would select those “sites that deal with substantive issues in a serious way”? And who would identify “highly partisan” web sites that “must carry” icons pointing to those “sites that deal with substantive issues in a serious way”?

CS: An agency authorized by Congress to do such things.

LV: Let’s assume it’s the FCC, whose members are appointed by the president, subject to confirmation by the Senate. The FCC is essentially a political body, composed of some mix of Democrats and Republicans.

CS: That’s inevitably the case with any regulatory agency.

LV: Right you are. So the FCC, or any agency newly created for the purpose, wouldn’t be neutral about such issues as what constitutes an opposing point of view, which sites deal with substantive issues in a serious way, and which sites are highly partisan.

CS: You have to rely on the judgment of those appointed to perform the task of making such evaluations.

LV: But not the judgment — or preferences — of purveyors of news and views?

CS: No, because they’re likely to be wedded to their positions and not open to opposing ideas.

LV: Unlike the political appointees on the FCC and the minions who would actually devise and execute the agency’s rules?

CS: Well, those political appointees would be scrutinized by Congress, and they would be responsible for the actions of their subordinates.

LV: Members of Congress, of course, are always balanced and neutral in their views, and never try to inflict particular points of view on regulatory agencies. Ditto the political appointees and their subordinates, who are experts at “gaming” their bosses and who outlast them by decades.

CS: You’re trying to get me to say that my version of the First Amendment would impose the judgment of politicians and bureaucrats on the news and views of corporate and individual communicators.

LV: Isn’t that exactly what would happen?

CS: But we’re better off when our duly elected representatives and their agents make such decisions. That’s how deliberative democracy is supposed to work.

LV: Oh, we elect them to tell us how to live our lives?

CS: If that’s what it takes to make us better citizens, yes.

LV: You think coercion of that sort would make us a more cohesive society and would make us more appreciative of points of view that differ from our own?

CS: It’s worth a try.

LV: And where do you stop?

CS: What do you mean?

LV: How do you know when society is sufficiently cohesive and that an acceptable fraction of its members have become appreciative of differing points of view? What do you do if communicators simply refused to cooperate with your program?

CS: Well, as to your first question, the FCC would simply monitor the content of broadcasts and web sites. As to your second question, the FCC might shut down uncooperative outlets or place them in the hands of an appointed operator, much as bankruptcy courts use court-appointed receivers to handle the affairs of bankrupt businesses. In the extreme, the FCC might have to resort to criminal sanctions — fines and imprisonment. But that probably wouldn’t happen more than a few times before communicators began to comply with the law.

LV: What you really mean is that the punishment wouldn’t stop until communicators began to comply with the views of the agency’s bureaucrats and those members of Congress who have leverage on the agency because of their oversight roles. Suppose the FCC were composed entirely of members who had a peculiar regard for the original meaning of the Constitution. Suppose, further, that we had, at the same time, a president who felt the same way about the Constitution, and that Congress was in the hands of a sympathetic majority. Now, in the course of monitoring web sites the FCC comes across your essay on “The Future of Free Speech” and deems it an extremist screed, subversive of the Constitution. What do you suppose would happen?

CS: The FCC should order The Little Magazine to post a link to your commentary on my essay. Or it might order The Little Magazine to remove my essay from its site.

LV: Suppose the FCC did neither. Suppose the FCC gave the matter some thought and concluded that it would do nothing about your essay. Instead, it would hew to the original meaning of the Constitution and let you bloviate to your heart’s content.

CS: I would turn myself in to the FCC and demand to be sanctioned to the letter of the law.

LV: Oh, really? Can I count on that? I just want to be sure that you’re willing to live by the rules that you would impose on others.

CS: Most assuredly.

LV: Thank you very much for your (imaginary) time. That’s all for now. But don’t worry, I’ll be keeping an eye on you.

Cass Sunstein, Part 4

The plausible authoritarian’s dangerous mind.

Cass Sunstein’s blatherings at The Volokh Conspiracy about FDR’s “Second Bill of Rights” (addressed here, here, and here) made me want to find out more about his understanding of the proper role of government. I Googled the eminent professor and hit upon “The Future of Free Speech”, which appeared in The Little Magazine, a South Asian journal. Hold your nose and read the whole thing or spare yourself and get the gist of Sunstein’s argument in these excerpts:

My purpose here is to cast some light on the relationship between democracy and new communications technologies. I do so by emphasising the most striking power provided by emerging technologies: the growing power of consumers to “filter” what it is that they see. In the extreme case, people will be fully able to design their own communications universe. They will find it easy to exclude, in advance, topics and points of view that they wish to avoid. I will also provide some notes on the constitutional guarantee of freedom of speech.

An understanding of the dangers of filtering permits us to obtain a better sense of what makes for a well-functioning system of free expression. Above all, I urge that in a heterogeneous society, such a system requires something other than free, or publicly unrestricted, individual choices. On the contrary, it imposes two distinctive requirements. First, people should be exposed to materials that they would not have chosen in advance…. Second, many or most citizens should have a range of common experiences. Without shared experiences, a heterogeneous society will have a much more difficult time addressing social problems; people may even find it hard to understand one another…. [emphasis added]

Imagine … a system of communications in which each person has unlimited power of individual design…. Our communications market is moving rapidly toward this apparently utopian picture….

A distinctive feature of [the Supreme Court’s public forum doctrine] is that it creates a right of speakers’ access, both to places and to people. Another distinctive feature is that the public forum doctrine creates a right, not to avoid governmentally imposed penalties on speech, but to ensure government subsidies of speech…. Thus the public forum represents one place in which the right to free speech creates a right of speakers’ access to certain areas and also demands public subsidy of speakers….

Group polarisation is highly likely to occur on the Internet. Indeed, it is clear that the Internet is serving, for many, as a breeding ground for extremism, precisely because like-minded people are deliberating with one another, without hearing contrary views….

The most reasonable conclusion is that it is extremely important to ensure that people are exposed to views other than those with which they currently agree, in order to protect against the harmful effects of group polarisation on individual thinking and on social cohesion….

The phenomenon of group polarisation is closely related to the widespread phenomenon of ‘social cascades’. No discussion of social fragmentation and emerging communications technologies would be complete without a discussion of that phenomenon….

[O]ne group may end up believing something and another the exact opposite, because of rapid transmission of information within one group but not the other. In a balkanised speech market, this danger takes on a particular form: different groups may be led to dramatically different perspectives, depending on varying local cascades.

I hope this is enough to demonstrate that for citizens of a heterogeneous democracy, a fragmented communications market creates considerable dangers. There are dangers for each of us as individuals; constant exposure to one set of views is likely to lead to errors and confusions. And to the extent that the process makes people less able to work cooperatively on shared problems, there are dangers for society as a whole.

In a heterogeneous society, it is extremely important for diverse people to have a set of common experiences….

The points thus far raise questions about whether a democratic order is helped or hurt by a system of unlimited individual choice with respect to communications. It is possible to fear that such a system will produce excessive fragmentation, with group polarisation as a frequent consequence. It is also possible to fear that such a system will produce too little by way of solidarity goods, or shared experiences….

If the discussion thus far is correct, there are three fundamental concerns from the democratic point of view. These include:
• the need to promote exposure to materials, topics, and positions that people would not have chosen in advance, or at least enough exposure to produce a degree of understanding and curiosity;
• the value of a range of common experiences;
• the need for exposure to substantive questions of policy and principle, combined with a range of positions on such questions.

Of course, it would be ideal if citizens were demanding, and private information providers were creating, a range of initiatives designed to alleviate the underlying concerns…. But to the extent that they fail to do so, it is worthwhile to consider government initiatives designed to pick up the slack….

1. Producers of communications might be subject … to disclosure requirements…. On a quarterly basis, they might be asked to say whether and to what extent they have provided educational programming for children, free airtime for candidates, and closed captioning for the hearing impaired. They might also be asked whether they have covered issues of concern to the local community and allowed opposing views a chance to be heard…. Websites might be asked to say if they have allowed competing views a chance to be heard….

2. Producers of communications might be asked to engage in voluntary self-regulation…. [T]here is growing interest in voluntary self-regulation for both television and the Internet…. Any such code could, for example, call for an opportunity for opposing views to speak, or for avoiding unnecessary sensationalism, or for offering arguments rather than quick ‘sound-bytes’ whenever feasible.

3. The government might subsidise speech, as, for example, through publicly subsidised programming or Websites…. Perhaps government could subsidise a ‘public.net’ designed to promote debate on public issues among diverse citizens — and to create a right of access to speakers of various sorts.

4. If the problem consists in the failure to attend to public issues, the government might impose “must carry” rules on the most popular Websites, designed to ensure more exposure to substantive questions. Under such a program, viewers of especially popular sites would see an icon for sites that deal with substantive issues in a serious way…. Ideally, those who create Websites might move in this direction on their own. If they do not, government should explore possibilities of imposing requirements of this kind, making sure that no program draws invidious lines in selecting the sites whose icons will be favoured….

5. The government might impose “must carry” rules on highly partisan Websites, designed to ensure that viewers learn about sites containing opposing views…. Here too the ideal situation would be voluntary action. But if this proves impossible, it is worth considering regulatory alternatives….

This is “libertarian paternalism” on steroids, as it should be, given Sunstein’s seminal role in that morally bankrupt endeavor. There are many reasons to reject Sunstein’s scheme out of hand. Not the least of them is its administrative cost and intrusiveness.

But the fundamental reason to reject the scheme is its authoritarianism. It would effectively bring the broadcast media and the internet under the control of a government bureaucracy. Any bureaucracy that is empowered to insist upon “completeness”, “fairness”, and “balance” in the exposition of ideas is thereby empowered to define and enforce its conception of those attributes. It is easy to imagine how a bureaucracy that is dominated by power-crazed zealots who espouse socialism, gender fluidity, “equity”, etc., etc., would deploy its power.

In an earlier post I said that Cass Sunstein is to the integrity of constitutional law as Pete Rose was to the integrity of baseball. It’s worse than that: Sunstein’s willingness to abuse constitutional law in the advancement of a statist agenda reminds me of Hitler’s abuse of German law to advance his repugnant agenda.

Cass Sunstein, Part 3

The plausible authoritarian adopts Sen(seless) economics.

In part 1 and part 2 I addressed Cass Sunstein’s laudatory exposition of FDR’s so-called Second Bill of Rights in two posts at The Volokh Conspiracy. CS continued in that vein with another post that invoked economist Amartya Sen:

Randy [Barnett] asks whether the Second Bill should be seen as protecting “natural rights.” To say the least, the natural rights tradition has multiple strands; a good contemporary version is elaborated by Amartya Sen (see his Development as Freedom).

Here’s Dr. Sen (the 1998 Nobel laureate in Economics and a professor at Trinity College, Cambridge) to explain what he means by economic freedom (from an online essay entitled “Development as Freedom”):

We … live in a world with remarkable deprivation, destitution, and oppression….

Overcoming these problems is a central part of the exercise of development. We have to recognize the role of different freedoms in countering these afflictions. Indeed, individual agency is, ultimately, central to addressing these deprivations. On the other hand, the freedom of agency that we have is inescapably constrained by our social, political, and economic opportunities. We need to recognize the centrality of individual freedom and the force of social influences on the extent and reach of individual freedom. To counter the problems we face, we have to see individual freedom as a social commitment….

I view the expansion of freedom both as the primary end and as the principal means of development. Development consists of removing various types of unfreedoms that leave people with little choice and little opportunity of exercising their reasoned agency….

Development requires the removal of major sources of unfreedom: poverty as well as tyranny, poor economic opportunities as well as systemic social deprivation, neglect of public facilities as well as intolerance or overactivity of repressive states.

What’s wrong with this picture? Sen, Sunstein, and their ilk — clever arguers, all — equate economic freedom (delivered in this country through make-work jobs, welfare, the minimum wage, social security, subsidized housing, free medical care, legalized extortion of employers through unionization, etc., etc.) with political freedom (or liberty as it’s better known). The two things are incommensurate. Indeed, they are incompatible.

In order for some persons to enjoy the kind of economic freedom envisioned by FDR and his acolytes, government must impose what Sen would call economic unfreedom on other persons, through taxation and regulation. “Robbing Peter to pay Paul” still says it best.

Political freedom (liberty) works the other way around. One person’s political freedom — the freedom to speak out, to publish a newspaper, to cast a vote, and so on — doesn’t diminish another person’s political freedom.

True economic freedom flows from political freedom. True economic freedom encompasses such things as staying in school, studying, and graduating honorably; finding and keeping a job, without paying off a union or invoking “minority” status; starting a business of one’s own and running it freely, without extorting or cheating others; and saving for one’s old age in real investments (not the Social Security Ponzi scheme). These are just a few of the many economic freedoms that government has circumscribed in its typically Orwellian effort to improve us by making us less free.

More important, from the Sunstein-Sen point of view, FDR-style economic freedom reduces the range of options available to individuals by significantly diminishing the economy (see “The Bad News about Economic Growth”). If the economy hadn’t been stunted by FDR-style economic freedom, and if FDR-style economic freedom hadn’t discouraged the habit of private charity, the poor, the infirm, the aged, and the various “minority” groups of this land would be far better off than they are today.

The irony of Sen(seless) economics would be amusing if it weren’t tragic.