I Hate to Kick a Guy When He’s Already Down . . .

. . . but Steven Edwards of Wired News cannot be excused for his support of the Christopher Reeve Paralysis Act, even though Edwards is paralyzed from the shoulders down. His condition gives him no special standing to hijack taxpayers’ money. Let’s begin with this, from Edwards (whose material I italicize below):

With all the noise over stem-cell research, few people seem to have heard about or understood the importance of the Christopher Reeve Paralysis Act. This noncontroversial bill would improve the collaboration and coordination of federally funded paralysis research. . . .

If it’s noncontroversial, why is it “languishing” in Congress (Edwards’s term)?

Once you read the Paralysis Act, it’s almost impossible to oppose it. . . .

Impossible, eh? I guess that’s why it’s sailing through Congress. Or perhaps members of Congress can’t read — which is a possibility.

I believe embryonic stem cells hold tremendous promise as a research tool, and I support such research. . . .

As do I, on both counts, though I have strong reservations about the creation of life in order to destroy it. Such actions are on the slippery slope toward state control of human destiny. (Go here for more on that score.)

In any event, if a thing is worth doing, the private sector will do it, and do it better than government. (I exempt justice and defense only because of the danger that warlords might arise — or get stronger than they are.)

The Stem Cell Research Enhancement Act would loosen restrictions on federal funding of embryonic stem-cell research and establish guidelines giving infertile couples the option of donating their cryopreserved embryos for the specific purpose of deriving new embryonic stem-cell lines that would be eligible for federally funded research.

Loosening restrictions on federal funding is precisely the problem — and it doesn’t matter to me what’s being restricted. It could be free apple pie for all on Mother’s Day and I would still oppose it. Why? Because it puts politicians and bureaucrats in charge of deciding how we should spend our money. Yes, it’s our money, not theirs, and decidedly not Steven Edwards’s.

Many people have many different kinds of health problems. And those people are hurt when government hijacks their money just because some celebrity (or former president) happened to suffer from a particular disability or disease. Or do you believe in free lunches, Santa Claus, and the tooth fairy, Mr. Edwards?

Allan Bloom’s Mind, Revisited

I wrote recently and critically (in “Allan Bloom’s Mind”) about Jim Sleeper’s reconsideration of Allan Bloom’s work. For an authoritative trashing of Sleeper’s NYT article — and of Sleeper’s sleazy brand of Leftism — read this post by Roger Kimball at Armavirumque, the weblog of The New Criterion.

Roberts for Chief, Then What?

Very clever. . .

One suspects that the White House knew or surmised that Chief Justice Rehnquist was nearing the end when Judge Roberts was picked to succeed O’Connor. There may be a bit more rancor from the Left about Roberts, now that he’s moving up to CJ, but he’s still a very good bet for confirmation. (Unless a bigger skeleton than his opposition to “comparable worth” emerges from someone’s closet.)

Now the question is, who’s the pick for O’Connor’s seat? I stand by what I said here:

[W]ill Bush . . . nominate a limited-government conservative-libertarian like Janice Rogers Brown? If he doesn’t, Bush’s slide toward the accommodationist policies of his father will be confirmed. A sad waste of a Republican majority in Congress.

It’s showdown time. If Bush fails to nominate someone in the mold of Judge Brown I will withdraw everything I have ever said about the GOP being the last, best hope for the restoration of limited government.

My Labor Day Message

I posted this on Labor Day 2004. I stand by it.

Labor Day gives most workers a day off. That’s good because an extra day off now and then is a pause that refreshes. A longish trek to a park or a beach on a hot day with a car full of kids isn’t a refreshing way to spend Labor Day, but those workers who spend the day at home, perhaps reading a book and listening to music, will find their souls somewhat restored.

Now let us consider the significance of Labor Day as a holiday. According to Wikipedia:

The origins of Labor Day can be traced back to the Knights of Labor in the United States, and a parade organized by them at that time on September 5, 1882 in New York City. In 1884 another parade was held, and the Knights passed resolutions to make this an annual event. Other labour organizations (and there were many), but notably the affiliates of the International Workingmen’s Association who were seen as a hotbed of socialists and anarchists, favoured a May 1 holiday. With the event of Chicago’s Haymarket riots in early May of 1886, president Grover Cleveland believed that a May 1 holiday could become an opportunity to commemorate the riots. But fearing it may strengthen the socialist movement, he quickly moved in 1887 to support the position of the Knights of Labor and their date for Labor Day. The date was adopted in Canada in 1894 by the government of Prime Minister John Thompson, although the concept of a Labour Day actually originated with marches in both Toronto and Ottawa in 1872. On the other hand, socialist delegates in Paris in 1889 appointed May 1 as the official International Labour Day.

Labor Day has been celebrated on the first Monday in September in the United States and Canada since the 1880s. The September date has remained unchanged, even though the two governments were encouraged to adopt May 1 as Labor Day, the date celebrated by the majority of the world. Moving the holiday, in addition to violating U.S. tradition, could have been viewed as aligning U.S. labor movements with internationalist sympathies.

In summary (for those of you who didn’t grow up in the North), Labor Day is an invention of organized labor, and the historical roots of organized labor are socialistic.

Labor Day also serves to remind us of one of the “monuments” of FDR’s New Deal (quoting again from Wikipedia):

The National Labor Relations Act of 1935 (or Wagner Act) protects the rights of workers in the private sector of the United States to organize unions, to engage in collective bargaining over wages, hours, and terms and conditions of employment, and to take part in strikes and other forms of concerted activity in support of their demands….

In the first few years of the Wagner Act, however, many employers simply refused to recognize it as law. The United States Supreme Court had already struck down a number of other statutes passed during the New Deal on the grounds that Congress did not have the constitutional authority to enact them under its power to regulate interstate commerce. Most of the initial appellate court decisions reached the same conclusion, finding the Act unconstitutional and therefore unenforceable. It was not until the Supreme Court upheld the constitutionality of the statute in 1937 in National Labor Relations Board v. Jones & Laughlin Steel Corp. that the Wagner Act became law in practical terms as well.

Thus Labor Day, in its way, commemorates legislative and judicial infamy. The Wagner Act, at one stroke, deprived business owners of their property rights and thus discouraged investment and business formation; invalidated the freedom of employers to contract with employees on terms acceptable to employers as well as employees; caused artificially high wages and benefits that harmed American workers by making American industry less and less competitive with foreign industry; and set the stage for the use of the Commerce Clause as an excuse for the federal government’s interference in all aspects of business.

So, if you are a worker, enjoy your Labor Day holiday, but don’t thank organized labor or the New Deal for your material blessings.

Something Snapped

A portion of the bio of a contributor to the Blogger News Network, in which she notes that “something snapped inside”:

My mother was a Civil Rights activist and a teacher. She passed away in 1998. My father was an Army intel op in the Second World War. He passed away in 1985. I have been writing since I was very young. I have been involved in politics, the civil rights movement, and the anti-war movement since I was a child. My mother founded the first integrated pre-school for black and white children in Roxbury, Massachusetts in 1941. That was 13 years before the official beginning of the Civil Rights movement in America. Every weekend for our coming up years, my mother brought us into Boston for rallies and teach-ins. My early life was filled with the speeches of Martin Luther King. I heard them live, and I read them over and over. His writing had a profound effect on me. Later in life I read about Mahatma Ghandi. I think he might be my vote of the greatest political and religious leader who ever lived. My mother would have told us stories about Mahatma Ghandi and the Salt Marches. When war was declared against Iraq in 2003 I had been living out of the United States for 10 years or more. I lived an idyllic life in Ireland, in a beautiful cottage, with a lovely boyfriend who was one of the greatest musicians in all of Ireland. I played fiddle badly, but I had a supremely happy life. When I heard George W. Bush’s State of the Union Address in 2003, when I heard him outline the “Axis of Evil,” and when I heard him boast that [he] had sanctioned the summary execution of 3,000 Afghani prisoners, something snapped inside. When war was declared on Iraq I reached a turning point. For years I had been contributing 20% of everything I earned through my painting and writing to Medicins Sans Frontieres. For years I had enjoyed a life that few people could imagine. But it ended when war was declared on Iraq. I had many Iraqi friends, and because of the art and literature and antiquities in Iraq, I just could not countenance any war of agression against that country.

Obviously something “snapped inside” her, but it had snapped long before she heard George Bush inveigh against the “axis of evil.” Listen lady, if you can’t distinguish between enemy states and their people (most of whom are not our enemies), you are too stupid to be taken seriously about anything. If you’re defending the “axis” states of North Korea, Iran, and pre-invasion Iraq, you have forfeited your right to judge anyone else’s morality. And if you simply think that war is inherently “bad” because “it just is” or because civilians sometimes get caught in the crossfire, then you dishonor your father’s memory.

With company like that (and several dozen other nutcases and “liberal” statists), it’s no wonder I recently resigned from BNN. Something snapped.

Common Ground for Conservatives and Libertarians?

I am interested here in addressing Burkean conservatives — as opposed to yahoos, opportunistic Republicans, neoconservatives, protectionists, and isolationists, for example. Wikipedia says this about Burkean conservatism:

Edmund Burke [link added: ED] developed his ideas in reaction to the Enlightenment, and the idea of a society guided by abstract “Reason.” . . .

Some men, argued Burke, have more reason than others, and thus some men will make worse governments if they rely upon reason than others. To Burke, the proper formulation of government came not from abstractions such as “Reason,” but from time-honoured development of the state and of other important societal institutions such as the family and the Church. . . .

Burke argued that tradition is a much sounder foundation than “reason”. The conservative paradigm he established emphasises the futility of attempting to ground human society based on pure abstractions (such as “reason,” “equality,” or, more recently, “diversity”), and the necessity of humility in the face of the unknowable. Existing institutions have virtues that cannot be fully grasped by any single person or interest group or, in Burke’s view, even any single generation. . . .

Tradition draws on the wisdom of many generations and the tests of time, while “reason” may be a mask for the preferences of one man, and at best represents only the untested wisdom of one generation. In the conservative view, an attempt to modify the complex web of human interactions that form human society for the sake of some doctrine or theory runs the risk of running afoul of the iron law of unintended consequences. Burke advocates vigilance against the possibility of moral hazards. For Burkean conservatives, human society is something rooted and organic; to try to prune and shape it according to the plans of an ideologue is to invite unforeseen disaster.

Burkean conservatives are inherently skeptical of plans to re-model human society after an ideological model. They emphasise ‘continuity with tradition’, which does [not] exclude changes within the framework of that tradition. They insist that political change should come about through legitimate political process, and oppose interference with that process, including extra-constitutional reactionary changes. So long as rule of law is upheld, and so long as change is effected gradually and constitutionally rather than [through] revolution, they are, in theory, content. Burkean conservatism is in principle neither revolutionary nor counter-revolutionary.

Now, if this seems familiar to libertarians, it should. Friedrich Hayek takes much the same tack in many of his writings. In “The Use of Knowledge in Society” (1945), Hayek says:

If it is fashionable today to minimize the importance of the knowledge of the particular circumstances of time and place, this is closely connected with the smaller importance which is now attached to change as such. Indeed, there are few points on which the assumptions made (usually only implicitly) by the “planners” differ from those of their opponents as much as with regard to the significance and frequency of changes which will make substantial alterations of production plans necessary. Of course, if detailed economic plans could be laid down for fairly long periods in advance and then closely adhered to, so that no further economic decisions of importance would be required, the task of drawing up a comprehensive plan governing all economic activity would be much less formidable. . . .

If we can agree that the economic problem of society is mainly one of rapid adaptation to changes in the particular circumstances of time and place, it would seem to follow that the ultimate decisions must be left to the people who are familiar with these circumstances, who know directly of the relevant changes and of the resources immediately available to meet them. . . .

The problem which we meet here is by no means peculiar to economics but arises in connection with nearly all truly social phenomena, with language and with most of our cultural inheritance, and constitutes really the central theoretical problem of all social science. As Alfred Whitehead has said in another connection, “It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them.” This is of profound significance in the social field. We make constant use of formulas, symbols, and rules whose meaning we do not understand and through the use of which we avail ourselves of the assistance of knowledge which individually we do not possess. We have developed these practices and institutions by building upon habits and institutions which have proved successful in their own sphere and which have in turn become the foundation of the civilization we have built up.

Hayek sums it up in The Constitution of Liberty (1960):

[B]efore we can try to remould society intelligently, we must understand its functioning; we must realise that, even when we believe that we understand it, we may be mistaken. What we must learn to understand is that human civilisation has a life of its own, that all our efforts to improve things must operate within a working whole which we cannot entirely control, and the operation of whose forces we can hope merely to facilitate and assist so far as we can understand them. [Chapter 4, pp. 69-70]

In a postscript to The Constitution of Liberty (“Why I Am Not a Conservative”), Hayek tries to distinguish his brand of liberalism (that is, classical liberalism or what is now minimal-state libertarianism) from conservatism:

This difference between liberalism and conservatism must not be obscured by the fact that in the United States it is still possible to defend individual liberty by defending long-established institutions. To the liberal they are valuable not mainly because they are long established or because they are American but because they correspond to the ideals which he cherishes. . . .

In the last resort, the conservative position rests on the belief that in any society there are recognizably superior persons whose inherited standards and values and position ought to be protected and who should have a greater influence on public affairs than others. The liberal, of course, does not deny that there are some superior people – he is not an egalitarian – but he denies that anyone has authority to decide who these superior people are. . . .

Closely connected with this is the usual attitude of the conservative to democracy. I have made it clear earlier that I do not regard majority rule as an end but merely as a means, or perhaps even as the least evil of those forms of government from which we have to choose. But I believe that the conservatives deceive themselves when they blame the evils of our time on democracy. The chief evil is unlimited government, and nobody is qualified to wield unlimited power. . . . The powers which modern democracy possesses would be even more intolerable in the hands of some small elite.

Admittedly, it was only when power came into the hands of the majority that further limitation of the powers of government was thought unnecessary. In this sense democracy and unlimited government are connected. But it is not democracy but unlimited government that is objectionable, and I do not see why the people should not learn to limit the scope of majority rule as well as that of any other form of government. At any rate, the advantages of democracy as a method of peaceful change and of political education seem to be so great compared with those of any other system that I can have no sympathy with the antidemocratic strain of conservatism. It is not who governs but what government is entitled to do that seems to me the essential problem.

I must point out that Hayek’s disparagement of conservatism is not aimed at Burkean conservatism. One account of Hayek’s life and thought explains that his criticism of conservatism

was aimed primarily at the European-style conservatism, which has often opposed capitalism as a threat to social stability and traditional values. Hayek identified himself as a classical liberal, but noted that in the United States it had become almost impossible to use “liberal” in the older sense that he gave to the term. In the U.S., Hayek is usually described as a “libertarian”, but the denomination that he preferred was “Old Whig” (a phrase borrowed from Edmund Burke).

Burkean conservatism, contra other forms of conservatism, doesn’t insist on the political dominance of a certain class. It insists on a rule of law that doesn’t allow the state to impose change on society but, rather, allows change to come from within society. (A quaint notion that held sway in the United States until the advent of the New Deal.)

Having apologized for Hayek’s position on conservatism, I must object to Hayek’s defense of democracy, which is now quaint. Hayek was writing 45 years ago, when it still seemed possible that we might return to the limited government of the written Constitution. But the forces of democracy-for-its-own-sake have since prevailed. Democracy and unlimited government are now bound together so tightly that Hayek’s fine distinction between the two is no longer valid, if ever it was. With unlimited government now in the saddle, the affairs of Americans are run by inferior men and women who are able — through demagoguery, bread, and circuses — to capture the allegiance of the inferior masses.

The limited government designed by the Framers was conservative, in a Burkean way. It was meant to enable superior persons to thrive — for the benefit of all — not through political dominance so much as through social and economic leadership. That design has long since given way, through extra-constitutional legislation and adjudication, to unlimited government, which disables superiority — to the detriment of all.

Is there not now a viable conservative movement in the United States? How else could Republicans have won seven of the last ten presidential elections and prevailed in the last six congressional elections? The problem is that Republicanism, which was more or less Burkean until the middle of the 20th century, sold its soul when it chose Dwight Eisenhower and Richard Nixon as its standard-bearers in 1952. From that point on, the GOP began to attract more than its share of yahoos, other cranks, and opportunists.

Barry Goldwater, the penultimate major politician of Burkean mien, lost resoundingly in the presidential election of 1964. And it has been downhill ever since. Nixon, an opportunist of the first rank, courted yahoos. Ronald Reagan, the last major politician of Burkean mien, was hampered by a Democrat-controlled Congress and his big-tent view of Republicanism. About Bush 41 and Bush 43, perhaps the best thing one can say is that they are not Michael Dukakis, Al Gore, or John Kerry.

The GOP is no longer reliably Burkean, though it certainly must attract far more Burkeans than the Democrat Party. The question is whether there are still enough Burkean conservatives (of any party) to constitute a viable political movement, one which might be influential if allied with those of us who choose to identify ourselves by other terms, such as free-market capitalist or minarchist (minimal-state, Hayekian libertarian). As Austin Bramwell argues at The American Conservative,

one would think that [Russell] Kirk, Hayek, and others (including eccentric outsiders such as R.J. Rushdoony, L. Brent Bozell, and Ayn Rand) had left behind a commanding legacy. One would expect that, like Burke, they had articulated ideas so powerful that they can only be contended with, not refuted. . . .

Has conservatism achieved this exalted stature? If we are honest, we must answer no.

In the 1950s and ‘60s, conservatives sought not just to refute modern liberalism but to obliterate it. . . . Each conservative writer claimed to have uncovered the Holy Grail—the argument or principle that would expose the errors of liberalism (and communism, socialism, feminism, etc.) once and for all. . . .

Yet the Holy Grail has not been found. One can still find lapel-grabbing right-wingers who will argue late into the night that their favorite thinker has figured everything out for all time. (My personal favorite: certain libertarians believe that Alan Gewirth, a now forgotten philosopher of the 1970s, showed how the rightness of limited government derives ultimately from Aristotle’s law of non-contradiction.) This is not the place to take up the argument with them. I only wish to observe, as an empirical matter, that no one person’s ideas actually define American conservatism. If English conservatism is nothing other than Burkeanism, American conservatism is not Rothbardianism, Randianism, Jaffaism, or Hayekianism.

But Bramwell goes on to say that

[o]n the libertarian side, a small group of academics affiliated with the journal Critical Review [link added: ED] is quietly working a revolution. They forthrightly acknowledge that neither free-market economics nor moral philosophy have produced a comprehensive argument for libertarianism. Nonetheless, they argue, limited government is still preferable because it mitigates the problem of public ignorance.

The majority of voters in a mass democracy, they reason, are stunningly ignorant of even the most basic political information. Moreover, to the extent that their voting behavior can be rationalized, they employ heuristics of the most obtuse sort: “Candidate X cares about people like me.” As for the tiny but relatively well-informed elite, they too have limited intellectual resources for understanding current politics. Hence, they rely on naïve heuristics such as “Republicans are greedy, religious fanatics” or “liberals are hypocrites who only care about making themselves feel better.”

The reliance on such heuristics can perhaps be explained in terms of rational economic decision-making—in that there is not enough time in the day to bother to learn much about politics—but, more deeply, in terms of evolutionary psychology. The human mind is too primitive to understand the complexities of modern politics. Democratic politics thus present a choice between the ideological rigidity of the elites and the sheer incompetence of the masses. We can escape this predicament only by reducing the role of government in our lives.

In sum, if Austin Bramwell is a harbinger, the American conservative movement — the thoughtful branch of that movement, at least — is moving toward its natural ally: minarchist libertarianism. For, as I have tried to show here, Burkean conservatism and minarchism amount to the same thing. Would a working alliance of Burkeans and minarchists constitute an influential critical mass? That remains to be seen, but it’s a possibility to be encouraged.

Rehnquist’s Successor: A Test of Bush’s Political Philosophy and Resolve

John Roberts may be a “stealth nominee” for the Supreme Court, but it’s unlikely that Bush’s nominee to replace Rehnquist can be as stealthy as Roberts. For one thing, conservative Senators surely will try to ferret out, if not block, another pseudo-conservative like O’Connor or Kennedy (not to mention Souter).

Knowing that, will Bush decide to placate his conservative base and nominate a limited-government conservative-libertarian like Janice Rogers Brown? If he doesn’t, Bush’s slide toward the accommodationist policies of his father will be confirmed. A sad waste of a Republican majority in Congress.

It’s showdown time. If Bush fails to nominate someone in the mold of Judge Brown I will withdraw everything I have ever said about the GOP being the last, best hope for the restoration of limited government.

Allan Bloom’s Mind

Remember academic Allan Bloom, who rode to sudden fame in 1987 on the back of his book, The Closing of the American Mind? It’s been years since I read the book, so I must rely on Jim Sleeper’s essay at NYTimes.com for a refresher:

Who on an American campus could ignore Bloom’s accounts of Cornell faculty groveling before black-power student poseurs, or his sketches of politically correct administrator-mandarins and ditzy pomo professors? What dedicated teacher could dismiss his self-described “meditation on the state of our souls, particularly those of the young, and their education”? Some thoughtful liberals found themselves reading “The Closing” under their bedcovers with flashlights, unable either to endorse or repudiate it but sensing that some reckoning was due. Conservatives championed Bloom then, of course, and they invoke him still.

But, on closer inspection, it seems that

[f]ar from being a conservative ideologue, Bloom, a University of Chicago professor of political philosophy who died in 1992, was an eccentric interpreter of Enlightenment thought who led an Epicurean, quietly gay life. He had to be prodded to write his best-selling book by his friend Saul Bellow, whose novel “Ravelstein” is a wry tribute to Bloom. Far more than liberal speech codes and diversity regimens, the bêtes noires of the intellectual right, darkened Bloom’s horizons: He also mistrusted modernity, capitalism and even democracy so deeply that he believed the university’s culture must be adversarial (or at least subtly subversive) before America’s market society, with its vulgar blandishments, religious enthusiasms and populist incursions.

In fact, a mistrust of modernity, capitalism, and democracy isn’t an unusual paleoconservative trait. Be that as it may, Bloom was right about the dangers of political correctness, and so Closing became — and still is — a rallying point for those conservatives, libertarians, and (true) liberals who oppose it.

Whatever else Allan Bloom might have opposed is of little moment. He was right about at least one thing, and his rightness about that thing has served us well.

"The Private Sector Isn’t Perfect"

Stephen Bainbridge (ProfessorBainbridge.com) actually says that in this post. Well, “So what?” you may say: No system for organizing human activity is perfect, except in such dream-worlds as anarcho-capitalism (where market forces defeat bullies by the sheer force of theory), Objectivism (which talks a good game about reality but seems unable to grasp it), and socialism (which promises free lunches and destroys incentives).

What Bainbridge goes on to say is almost right, however:

. . . but we’ve known since Adam Smith that economic incentives work. . . .

We also know that the modern public corporation is the greatest engine of prosperity the world has ever seen.[*] In The Company: A Short History of a Revolutionary Idea, John Micklethwait and Adrian Wooldridge demonstrate that the corporation is “the basis of the prosperity of the West and the best hope for the future of the rest of the world.”

The capital, product, and labor markets give corporate managers [and] directors incentives to produce goods and services efficiently. What defenders of government regulation often overlook is that regulators are also actors with their own self-interested motivations. The trouble is that the incentives to which regulators and legislators respond are often contrary to the public interest. The incentives of legislators and regulators are driven by rent-seeking and interest group politics, which have no necessary correlation to corporate profit-maximization. Accordingly, government preparation for and response to disasters is likely to be driven by the political concerns of the governmental actors rather than the public good.

In sum, it may be time to try Adam Smith’s invisible hand by outsourcing disaster relief.

But the best way to outsource disaster relief isn’t to have government use our tax dollars to hire private disaster-relief specialists. No, the best way to “outsource” disaster relief is this:

  • Leave tax dollars in the hands of the private sector.
  • Tell the private sector: when it comes to disasters, it’s your responsibility to plan prudently — and to bear the consequences of your planning.
  • Get government out of the insurance business and let the private sector respond (without restraint) to consumers’ demands for disaster insurance.

The best way to ensure that people make prudent decisions is to let them know that they’re responsible for themselves, require them to “play” with their own money, and allow them to spend their money where they think it will do them the most good.

Though it’s meant to bash the Bush administration, this headline (from Slate) captures the essence of the problem:

$41 Billion, and Not a Penny of Foresight
Why is the New Orleans recovery going so badly? Just look at the DHS budget.

As if government could ever take taxpayers’ money away from them and then spend it better than taxpayers could.** So Bush spends the money on the war (according to the Bush-bashers). Well, Clinton would have spent the money on his pet projects. That’s what happens when politicians get into your wallet. They decide what’s most important to you.

The private sector (i.e., free-market capitalism) is less perfect than everything, except all of the alternatives to it.

P.S. Ignoramuses and die-hard statists will say that free-market capitalism leaves everyone on his or her own. (Oh, how I hate the awkwardness that results from gender-correct writing.) In fact, free-market capitalism is the best vehicle for large-scale cooperation that has ever emerged from human endeavor. Free-market capitalism, among many other things, allows for insurance against risk and provides the wherewithal to combat the elements (e.g., plywood for boarding up windows, concrete for deep footings). What it doesn’t do is offer the illusion that “someone else” will protect you from all harm and immediately make you whole when you come to harm.

Related post: Katrina’s Aftermath: Who’s to Blame?
__________
* Apologists for the state like to say that public corporations couldn’t exist without the state’s blessing. Balderdash! Insurance markets would do the job of protecting shareholders quite nicely, thank you.

** I argue in “But Wouldn’t Warlords Take Over?” that government should take taxpayers’ money in order to provide for criminal justice and national defense. But that’s for the prudential reason suggested by the title of the post, not because government can necessarily provide such services more efficiently than free-market capitalism.

The Seven Faces of Blogging

Shakespeare’s wisdom about the seven stages of man:

All the world’s a stage,
And all the men and women merely players:
They have their exits and their entrances;
And one man in his time plays many parts,
His acts being seven ages. At first the infant,
Mewling and puking in the nurse’s arms.
And then the whining school-boy, with his satchel
And shining morning face, creeping like snail
Unwillingly to school. And then the lover,
Sighing like furnace, with a woeful ballad
Made to his mistress’ eyebrow. Then a soldier,
Full of strange oaths and bearded like the pard,
Jealous in honour, sudden and quick in quarrel,
Seeking the bubble reputation
Even in the cannon’s mouth. And then the justice,
In fair round belly with good capon lined,
With eyes severe and beard of formal cut,
Full of wise saws and modern instances;
And so he plays his part. The sixth age shifts
Into the lean and slipper’d pantaloon,
With spectacles on nose and pouch on side,
His youthful hose, well saved, a world too wide
For his shrunk shank; and his big manly voice,
Turning again toward childish treble, pipes
And whistles in his sound. Last scene of all,
That ends this strange eventful history,
Is second childishness and mere oblivion,
Sans teeth, sans eyes, sans taste, sans everything.

As You Like It, Act II, Scene VII
(Source: Shakespeare Online)

My take on the seven faces of blogging:

All the world’s a Web,
And all the men and women merely bloggers:
They have their posts and their comments;
And one man in his time plays many parts,
His blogs having seven faces. There’s the newbie,
Mewling and puking his embarrassing secrets.
And then the whining bloviator, with his hatreds
And roaring boldface caps, screeching like a cat
Thrown into a stream. And next the argufier,
Making like Quixote, with lance atilt
Charging anon at orthodoxies. Then the academician,
Full of theories and bearded for the part,
Jealous of his peers, sudden and quick in quarrel,
Seeking the savant’s reputation
But also the rabble’s roar. And then the pundit,
In pajamas clad and with good port plied,
With eyes alert for oddities to seize upon,
Full of eclectic wisdom and clever phrases;
And so he plays his part. Now we gaze upon
The would-be poet and belle-lettrist,
With drafts propped aside his monitor,
His fingers tapping dizzily at the keyboard,
His eyes ablaze with creative fervor; not wanting
To end his labors even as the bell tolls three,
He dozes in his lonely den. At last we come
To the dispirited burnt-out blogger who will not
Quit his blogging habit until, like Yorick, he is
Sans teeth, sans eyes, sans taste, sans everything.

Katrina’s Aftermath: Who’s to Blame?

My heart goes out to those who are injured, sick, homeless, jobless, and hungry in the aftermath of Hurricane Katrina. My heart does not go out to those who helped to make the disaster as vast as it is turning out to be. And who are those culprits?

Let’s start with government, which made it possible for people to live in low-lying areas by erecting dikes, levees, and pumping stations — not at the expense of the direct “beneficiaries” of such facilities but at the expense of taxpayers.

Let’s continue with government, which insists on taxing the rest of us so that the victims of disasters such as Katrina can “rebuild their lives and businesses” in the very same vulnerable places.

Let’s continue some more with government, which insists on taxing the rest of us to entice residents and businesses to remain in vulnerable areas — in the name of urban pride and “job creation” — through various forms of personal and corporate welfare.

And let’s end with voters, business owners, labor unions, and others who support the politicians who perpetuate all such government programs because those programs are “humane,” “compassionate,” or “essential.” Those voters, business owners, workers, and others who are victims of Katrina are, in fact, victims of their own willingness to extort taxpayers to pay for inadequate protection against foreseeable disasters, such as major hurricanes along the Gulf Coast. They would make better decisions if they had, instead, to choose between spending their own money for adequate protection from foreseeable disasters or exerting themselves to make a life or run a business out of the reach of such disasters.

Programs such as those I mention above create an expectation that government will take care of people who expose themselves to danger, thus making it likely that people will make decisions that do indeed expose them to danger. The price? Death, disease, homelessness, joblessness, and hunger. And the waste of billions of dollars of taxpayers’ money.

Katrina is just one of the many natural disasters that government, acting at the prompting of voters and other interested parties, has converted to a vast human tragedy. And there will be many more such tragedies, I fear.

Science, Evolution, Religion, and Liberty

Excerpts of a long post at Liberty Corner II:

If a man will begin with certainties, he shall end in doubts, but if he will be content to begin with doubts, he shall end in certainties.

Francis Bacon (1561–1626),
British philosopher, essayist, statesman.
The Advancement of Learning, bk. 1, ch. 5, sct. 8 (1605).
(Source: Bartleby.com)

Science begins with doubts — questions about the workings of the world around us — and moves bit by bit toward greater certainty, without ever reaching complete certainty. Philosophy and religion begin with certainties — a priori explanations about the workings of the world — and end in doubts because the world cannot be explained by pure faith or pure reason. But philosophy and religion can tell us how to live life morally, whereas science can only help us live life more comfortably, if that is what we wish to do.

Scientists — when they are being scientists — begin with questions (doubts), which lead them to make observations, and from those observations they derive theories about the workings of the universe or some aspect of it. Those theories can then be tested to see if they have predictive power, and revised when they are found wanting, that is, when new observations (facts) cast doubt on their validity. Scientific facts may sometimes be beyond doubt (e.g., the temperature at which water freezes under specified conditions), but scientific theories — which are generalizations from facts — are never beyond doubt. Or they never should be. . . .

Einstein stands as a paragon among scientists: unwilling to run with the herd, unwilling to “follow any fad or popular direction,” as Smolin puts it elsewhere in the essay quoted above. Now we seem to have herds of so-called scientists who cling to certain theories because those theories are popular and dominant. They may be great scientists — or hacks — who have come to a certain worldview and are loath to abandon it, or they may be followers of renowned scientists who lack the imagination to see alternative explanations of phenomena. Whatever the case, a “scientist” who insists on the truth of his worldview has abandoned science for something that might as well be called religion or philosophy.

In the case of global warming, we’ve seen the herd instinct at work for many years. It has become an article of faith among academic and government scientists not only that global warming is due mainly to human activity but also that it is “bad.” . . .

Now we come to evolution. I have written elsewhere about the tendency of evolutionary biologists (and their hangers-on at places like The Panda’s Thumb) to act like priests of a secular religion. . . .

[T]he scientific consensus seems to be that any scientist who even entertains intelligent design (ID) as a supplementary explanation of the development of life forms has somehow become a non-scientist. . . .

I think it really boils down to this: Anti-ID scientists cannot prove that ID is unscientific; pro-ID scientists cannot prove that ID is anything more than a convenient explanation for currently unexplained phenomena. It’s the scientific (or non-scientific) version of a Mexican standoff. . . .

It is impossible to eliminate any explanation of the origin of life or the development of life forms, as long as that explanation doesn’t conflict with facts. Similarly, it is impossible to eliminate any explanation of the origin of the universe, as long as that explanation doesn’t conflict with facts. Staunch evolutionists — those who resist Creationism, intelligent design, or any other unfalsifiable or unfalsified explanation for the origin of the universe, the origin of life, or the development of life forms — are merely invoking their preferred worldview — not facts.

The best that science can do, under any foreseeable circumstances, is to investigate how life developed from the point in the known history of the universe at which there is evidence of life. But many (perhaps most) evolutionists and their hangers-on aren’t content to pursue that scientific agenda. . . .

Scientific elites and their hangers-on, like paternalists of all kinds, would like to tell us how to live our lives — for our own good, of course — because they think they have the answers, or can find them. (They would be benign technocrats, of course, unlike their counterparts in the old USSR.) And when they are thwarted, they get in a snit and issue manifestos.

But, as I said at the outset, science isn’t about how to live morally, it’s about how to live life more comfortably, if that is what we wish to do. To know how to live life morally we must turn to a philosophy that promotes liberty, and we must not reject the moral code of the Judeo-Christian tradition, in which one finds much support for liberty.

I’m very much for science, properly understood, which is the increase of knowledge. I’m very much against the misuse of science by scientists (and others) who invoke it to advance an extra-scientific agenda. Science, properly done, begins with doubts and ends in certainties, but those certainties extend only to the realm of observable, documented facts. Science has no claim to superiority over philosophy or religion in the extra-factual realm of morality.

I close by paraphrasing my son’s comment about my post on “Religion and Liberty”:

The basis of liberty is extra-scientific; thus the need for non-scientific moral institutions.


CLICK HERE TO READ THE FULL POST.

Science, Evolution, Religion, and Liberty

If a man will begin with certainties, he shall end in doubts, but if he will be content to begin with doubts, he shall end in certainties.

Francis Bacon (1561–1626),
British philosopher, essayist, statesman.
The Advancement of Learning, bk. 1, ch. 5, sct. 8 (1605).
(Source: Bartleby.com)

Science begins with doubts — questions about the workings of the world around us — and moves bit by bit toward greater certainty, without ever reaching complete certainty. Philosophy and religion begin with certainties — a priori explanations about the workings of the world — and end in doubts because the world cannot be explained by pure faith or pure reason. But philosophy and religion can tell us how to live life morally, whereas science can only help us live life more comfortably, if that is what we wish to do.

Scientists — when they are being scientists — begin with questions (doubts), which lead them to make observations, and from those observations they derive theories about the workings of the universe or some aspect of it. Those theories can then be tested to see if they have predictive power, and revised when they are found wanting, that is, when new observations (facts) cast doubt on their validity. Scientific facts may sometimes be beyond doubt (e.g., the temperature at which water freezes under specified conditions), but scientific theories — which are generalizations from facts — are never beyond doubt. Or they never should be.

Consider Albert Einstein, arguably the greatest scientist who has yet lived. According to physicist Lee Smolin,

[a]lthough Einstein was . . . the discoverer of quantum phenomena, he became in time the main opponent of the theory of quantum mechanics. By his own account, he spent far more time thinking about quantum theory than he did about relativity. But he never found a theory of quantum physics that satisfied him. . . .

Quantum theory was not the only theory that bothered Einstein. Few people have appreciated how dissatisfied he was with his own theories of relativity. Special relativity grew out of Einstein’s insight that the laws of electromagnetism cannot depend on relative motion and that the speed of light therefore must be always the same, no matter how the source or the observer moves. . . . Special relativity was the result of 10 years of intellectual struggle, yet Einstein had convinced himself it was wrong within two years of publishing it. He rejected his theory, even before most physicists had come to accept it, for reasons that only he cared about. For another 10 years, as the world of physics slowly absorbed special relativity, Einstein pursued a lonely path away from it.

Why? The main reason was that he wanted to extend relativity to include all observers, whereas his special theory postulates only an equivalence among a limited class of observers—those who aren’t accelerating. A second reason was to incorporate gravity, making use of a new principle he called the equivalence principle. This postulates that observers can never distinguish the effects of gravity from those of acceleration so long as they observe phenomena only in their immediate neighborhood. By this principle [general relativity] he linked the problem of gravity with the problem of extending relativity to all observers. . . .

[I]n spite of the great triumph general relativity represented, Einstein did not linger long over it. For Einstein, quantum physics was the essential mystery, and nothing could be really fundamental that was not part of the solution to that problem. As general relativity didn’t explain quantum theory, it had to be provisional as well. It could only be a step towards Einstein’s goal, which was to find a theory of quantum phenomena that would agree with all the experiments, but satisfy his demand for clarity and completeness.

Einstein imagined for a time that such a theory could come from an extension of general relativity. Thus he entered into the final period of his scientific life, his search for a unified field theory. He sought an extension of general relativity that would incorporate electromagnetism, thereby wedding the large-scale world where gravity dominates with the small-scale world of quantum physics. . . .

[B]y the end of his life Einstein had to some extent abandoned his search for a unified field theory. He had failed to find a version of the theory that did what was most important to him, which is to explain quantum phenomena in a way that involved neither measurements nor statistics. In his last years he was moving on to something even more radical. He proposed to give up the idea that space and time are continuous. . . .

I think a sober assessment is that up until now, almost all of us who work in theoretical physics have failed to live up to Einstein’s legacy. His demand for a coherent theory of principle was uncompromising. It has not been reached—not by quantum theory, not by special or general relativity, not by anything invented since. Einstein’s moral clarity, his insistence that we should accept nothing less than a theory that gives a completely coherent account of individual phenomena, cannot be followed unless we reject almost all contemporary theoretical physics as insufficient. . . .

In my whole career as a theoretical physicist, I have known only a handful of colleagues of whom it can truly be said have followed Einstein’s path. They are driven, as Einstein was, by a moral need for clear understanding. In everything they do, these few strive continually to invent a new theory of principle that could satisfy the strictest demands of coherence and consistency without regard to fashion or the professional consequences. Most have paid for their independence in a harder career path than equally talented scientists who follow the research agendas of the big professors.

I have quoted Smolin at length because he reveals two key facets of Einstein, the scientist: a willingness to abandon a theory, and a stubbornness about challenging the conventional wisdom, even though its proponents were equally eminent scientists.

Einstein stands as a paragon among scientists: unwilling to run with the herd, unwilling to “follow any fad or popular direction,” as Smolin puts it elsewhere in the essay quoted above. Now we seem to have herds of so-called scientists who cling to certain theories because those theories are popular and dominant. They may be great scientists — or hacks — who have come to a certain worldview and are loath to abandon it, or they may be followers of renowned scientists who lack the imagination to see alternative explanations of phenomena. Whatever the case, a “scientist” who insists on the truth of his worldview has abandoned science for something that might as well be called religion or philosophy.

In the case of global warming, we’ve seen the herd instinct at work for many years. It has become an article of faith among academic and government scientists not only that global warming is due mainly to human activity but also that it is “bad.” Dr. Roy Spencer, an atmospheric scientist, stands back from the fray in “Let’s Be Honest about the Real Consensus” (link added):

“Consensus” among scientists is not definitive, and some have even argued that in science it is meaningless or counterproductive. After all, even scientific “laws” have been disproved in the past (e.g. the Law of Parity in nuclear physics). Global warming is a process that can not be measured in controlled lab experiments, and so in many respects it can not be tested or falsified in the traditional scientific sense. Nevertheless, I’m willing to admit that in the policymakers’ realm, scientific consensus might have some limited value. But let’s be honest about what that consensus refers to: that “humans influence the climate”. Not that “global warming is a serious threat to mankind”.

Moreover, it’s certainly not clear that the scientific consensus about global warming is correct. (See, for example, this earlier post.)

Now we come to evolution. I have written elsewhere about the tendency of evolutionary biologists (and their hangers-on at places like The Panda’s Thumb) to act like priests of a secular religion. But just how firm is the ground on which their temple is built? Not all that firm, according to a recent report in ScienceDaily:

Contrary to inheritance laws the scientific world has accepted for more than 100 years, some plants revert to normal traits carried by their grandparents, bypassing genetic abnormalities carried by both parents. These mutant parent plants apparently have hidden templates containing genetic information from the preceding generation that can be transferred to their offspring, even though the traits aren’t evident in the parents, according to Purdue University researchers. This discovery flies in the face of the scientific laws of inheritance first described by Gregor Mendel in the mid-1800s and still taught in classrooms around the world today.

“This means that inheritance can happen more flexibly than we thought in the past,” said Robert Pruitt, a Purdue Department of Botany and Plant Pathology molecular geneticist. “While Mendel’s laws that we learned in high school still are fundamentally correct, they’re not absolute.

“If the inheritance mechanism we found in the research plant Arabidopsis exists in animals, too, it’s possible that it will be an avenue for gene therapy to treat or cure diseases in both plants and animals.”

The study is published in the March 24 issue of the journal Nature. . . .

Such findings don’t discredit evolutionary theory, but they do underscore two points:

  • Evolutionary theory is still very much in flux.
  • Prevailing scientific theories are never as secure as they seem to be — or as many of their adherents would like them to be.

Nevertheless, the scientific consensus seems to be that any scientist who even entertains intelligent design (ID) as a supplementary explanation of the development of life forms has somehow become a non-scientist. Consider the recent controversy surrounding Dr. Richard Sternberg, as described in The Washington Post of August 19:

Evolutionary biologist Richard Sternberg made a fateful decision a year ago.

As editor of the hitherto obscure Proceedings of the Biological Society of Washington, Sternberg decided to publish a paper making the case for “intelligent design,” a controversial theory that holds that the machinery of life is so complex as to require the hand — subtle or not — of an intelligent creator.

Within hours of publication, senior scientists at the Smithsonian Institution — which has helped fund and run the journal — lashed out at Sternberg as a shoddy scientist and a closet Bible thumper.

“They were saying I accepted money under the table, that I was a crypto-priest, that I was a sleeper cell operative for the creationists,” said Sternberg, 42, who is a Smithsonian research associate. “I was basically run out of there.”

An independent agency has come to the same conclusion, accusing top scientists at the Smithsonian’s National Museum of Natural History of retaliating against Sternberg by investigating his religion and smearing him as a “creationist.”

The U.S. Office of Special Counsel, which was established to protect federal employees from reprisals, examined e-mail traffic from these scientists and noted that “retaliation came in many forms . . . misinformation was disseminated through the Smithsonian Institution and to outside sources. The allegations against you were later determined to be false.”

“The rumor mill became so infected,” James McVay, the principal legal adviser in the Office of Special Counsel, wrote to Sternberg, “that one of your colleagues had to circulate [your résumé] simply to dispel the rumor that you were not a scientist.” . . .

A small band of scientists argue for intelligent design, saying evolutionary theory’s path is littered with too many gaps and mysteries, and cannot account for the origin of life.

Most evolutionary biologists, not to mention much of the broader scientific community, dismiss intelligent design as a sophisticated version of creationism. . . .

Sternberg’s case has sent ripples far beyond the Beltway. The special counsel accused the National Center for Science Education, an Oakland, Calif.-based think tank that defends the teaching of evolution, of orchestrating attacks on Sternberg.

“The NCSE worked closely with” the Smithsonian “in outlining a strategy to have you investigated and discredited,” McVay wrote to Sternberg. . . .

Sternberg is an unlikely revolutionary. He holds two PhDs in evolutionary biology, his graduate work draws praise from his former professors, and in 2000 he gained a coveted research associate appointment at the Smithsonian Institution.

Not long after that, Smithsonian scientists asked Sternberg to become the unpaid editor of Proceedings of the Biological Society of Washington, a sleepy scientific journal affiliated with the Smithsonian. Three years later, Sternberg agreed to consider a paper by Stephen C. Meyer, a Cambridge University-educated philosopher of science who argues that evolutionary theory cannot account for the vast profusion of multicellular species and forms in what is known as the Cambrian “explosion,” which occurred about 530 million years ago.

Scientists still puzzle at this great proliferation of life. But Meyer’s paper went several long steps further, arguing that an intelligent agent — God, according to many who espouse intelligent design — was the best explanation for the rapid appearance of higher life-forms.

Sternberg harbored his own doubts about Darwinian theory. He also acknowledged that this journal had not published such papers in the past and that he wanted to stir the scientific pot.

“I am not convinced by intelligent design but they have brought a lot of difficult questions to the fore,” Sternberg said. “Science only moves forward on controversy.” . . .

When the article appeared, the reaction was near instantaneous and furious. Within days, detailed scientific critiques of Meyer’s article appeared on pro-evolution Web sites. “The origin of genetic information is thoroughly understood,” said Nick Matzke of the NCSE. “If the arguments were coherent this paper would have been revolutionary — but they were bogus.”

A senior Smithsonian scientist wrote in an e-mail: “We are evolutionary biologists and I am sorry to see us made into the laughing stock of the world, even if this kind of rubbish sells well in backwoods USA.”

An e-mail stated, falsely, that Sternberg had “training as an orthodox priest.” Another labeled him a “Young Earth Creationist,” meaning a person who believes God created the world in the past 10,000 years.

This latter accusation is a reference to Sternberg’s service on the board of the Baraminology Study Group, a “young Earth” group. Sternberg insists he does not believe in creationism. “I was rather strong in my criticism of them,” he said. “But I agreed to work as a friendly but critical outsider.” . . .

“I loathe careerism and the herd mentality,” [Sternberg says]. “I really think that objective truth can be discovered and that popular opinion and consensus thinking does more to obscure than to reveal.”

At the core of ID is the hypothesis of irreducible complexity, which is the subject of a Wikipedia article that also provides many links to various aspects of the controversy about irreducible complexity and ID. To quote from that article: “Irreducible complexity is not an argument that evolution does not occur, but rather an argument that it is incomplete.”

Is irreducible complexity an unscientific proposition (an unfalsifiable hypothesis), as many of its critics charge? And if it is a falsifiable hypothesis, where does it stand? The answers to those questions shift so rapidly that the best I can do here is quote from the Wikipedia article:

Some critics [of irreducible complexity], such as Jerry Coyne (professor of evolutionary biology at the University of Chicago) and Eugenie Scott (a physical anthropologist and executive director of the National Center for Science Education) have argued that the concept of irreducible complexity, and more generally, the theory of Intelligent Design is not falsifiable, and therefore, not scientific.

[Michael] Behe [a leading proponent of ID] argues that the theory that irreducibly complex systems could not have been evolved can be falsified by an experiment where such systems are evolved. For example, he posits taking bacteria with no flagella and imposing a selective pressure for mobility. If, after a few thousand generations, the bacteria evolved the bacterial flagellum, then Behe believes that this would refute his theory.

Other critics take a different approach, pointing to experimental evidence that they believe falsifies the argument for Intelligent Design from irreducible complexity. For example, Kenneth Miller cites the lab work of Barry Hall on E. coli, which he asserts is evidence that “Behe is wrong.”

The problem is that as each pro-ID hypothesis is falsified (assuming that it eventually is), another pro-ID hypothesis can be produced, for there must be a very large number of biological manifestations that have not yet been explained by documented facts. Until such facts are produced, a proper scientist would keep irreducible complexity on the table as a possible explanation of an unexplained manifestation. But I have noticed a tendency among die-hard evolutionists — those for whom evolution is a religion — to extrapolate from documented facts and argue that evolution could explain such-and-such, if only the necessary facts weren’t inconveniently missing. In a word, they are cheaters. (For more, see this post.)

I think it really boils down to this: Anti-ID scientists cannot prove that ID is unscientific; pro-ID scientists cannot prove that ID is anything more than a convenient explanation for currently unexplained phenomena. It’s the scientific (or non-scientific) version of a Mexican standoff.

Where does that leave us? It leaves us here:

When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth. (“Sherlock Holmes” in The Adventure of the Blanched Soldier)

What are the possibilities with which we must begin? In addition to the evolution of evolutionary biology, there are these alternatives, taken from the Wikipedia article on irreducible complexity:

  • Intelligent Design, the argument that irreducible complexity occurs through the input of some “intelligent designer”. One example of an Intelligent Design theory is Creationism (although it can be argued that this begs the question, as it does not say how or what created the Creator, and, if no creator was necessary to create the Creator, why creators should be needed for all other entities).
  • Francis Crick‘s suggestion that life on Earth may have been seeded by aliens (although it can be argued that this begs the question, as it does not say how the alien life arose).

You may have noticed that the list conflates two entirely different issues. There is the question of how life arose — which, I submit, can only be a matter of faith or conjecture — and there is the question of how life has developed, regardless of how it arose — which can be a matter for scientific investigation. Therein lies the crux of the problem. It is impossible to eliminate any explanation of the origin of life or the development of life forms, as long as that explanation doesn’t conflict with facts. Similarly, it is impossible to eliminate any explanation of the origin of the universe, as long as that explanation doesn’t conflict with facts. Staunch evolutionists — those who resist Creationism, intelligent design, or any other unfalsifiable or unfalsified explanation for the origin of the universe, the origin of life, or the development of life forms — are merely invoking their preferred worldview — not facts.

The best that science can do, under any foreseeable circumstances, is to investigate how life developed from the point in the known history of the universe at which there is evidence of life. But many (perhaps most) evolutionists and their hangers-on aren’t content to pursue that scientific agenda. As Frederick Turner puts it:

In many cases it is clear that the beautiful and hard-won theory of evolution, now proved beyond reasonable doubt, is being cynically used by some — who do not much care about it as such — to support an ulterior purpose: a program of atheist indoctrination, and an assault on the moral and spiritual goals of religion. A truth used for unworthy purposes is quite as bad as a lie used for ends believed to be worthy. If religion can be undermined in the hearts and minds of the people, then the only authority left will be the state, and, not coincidentally, the state’s well-paid academic, legal, therapeutic and caring professions. If creationists cannot be trusted to give a fair hearing to evidence and logic because of their prior commitment to religious doctrine, some evolutionary partisans cannot be trusted because they would use a general social acceptance of the truth of evolution as a way to set in place a system of helpless moral license in the population and an intellectual elite to take care of them.

“Mainstream” evolutionists might be willing to consider alien origins, complexity theory, and quantum evolution, given the provenance of those theories. But those same evolutionists are unlikely to back down from their resistance to intelligent design. Why? Because ID threatens their underlying agenda, which — as Turner suggests — is the ascendancy of scientism, scientific elites, and the strident atheists who support them. Another case in point is the strong vein of resistance to the Big Bang theory, because it’s consistent with a Creation. (Sample the results of this Google search, for example.) The irony of it all is that atheism is an unscientific belief in an unfalsifiable proposition, namely, that there is no God. Moreover, if there is a God, He doesn’t need to rely on Big Bangs or other such pyrotechnics to work His will.

Am I going too far when I join Frederick Turner in his distrust of “evolutionary partisans”? I think not. Peruse The Panda’s Thumb, where, for example, one contributor wrote approvingly of an article arguing that the teaching of intelligent design should be ruled unconstitutional because it is unscientific. As I wrote at the time,

[t]hink of the fine mess we’d be in if the courts were to rule against the teaching of intelligent design not because it amounts to an establishment of religion but because it’s unscientific. That would open the door to all sorts of judicial mischief. The precedent could — and would — be pulled out of context and used in limitless ways to justify government interference in matters where government has no right to interfere.

It’s bad enough that government is in the business of funding science — though I can accept such funding where it actually aids our defense effort. But, aside from that, government has no business deciding for the rest of us what’s scientific or unscientific. When it gets into that business, you had better be ready for a rerun of the genetic policies of the Third Reich.

Scientific elites and their hangers-on, like paternalists of all kinds, would like to tell us how to live our lives — for our own good, of course — because they think they have the answers, or can find them. (They would be benign technocrats, of course, unlike their counterparts in the old USSR.) And when they are thwarted, they get in a snit and issue manifestos.

But, as I said at the outset, science isn’t about how to live morally; it’s about how to live more comfortably, if that is what we wish to do. To know how to live morally, we must turn to a philosophy that promotes liberty, and we must not reject the moral code of the Judeo-Christian tradition, in which one finds much support for liberty.

I’m very much for science, properly understood, which is the increase of knowledge. I’m very much against the misuse of science by scientists (and others) who invoke it to advance an extra-scientific agenda. Science, properly done, begins with doubts and ends in certainties, but those certainties extend only to the realm of observable, documented facts. Science has no claim to superiority over philosophy or religion in the extra-factual realm of morality.

I close by paraphrasing my son’s comment about my post on “Religion and Liberty”:

The basis of liberty is extra-scientific; thus the need for non-scientific moral institutions.

Further reading:
Evolution (Wikipedia article)
Intelligent Design (Wikipedia article)
Intelligent Design: A Special Report from History Magazine
The Little Engine That Could…Undo Darwinism (The American Spectator article)
Faith-Based Evolution (Tech Central Station article)
Darwin and Design: The Evolution of a Flawed Debate (Tech Central Station article)
Intelligent Decline, Revisited (Tech Central Station article)
The Real Intelligent Designers (Tech Central Station article)
Divine Evolution (Tech Central Station article)
The Case Against Intelligent Design (The New Republic article)
Discovery Institute (the leading proponents of ID)
The Talk.Origins Archive (a collection of articles and essays that explore the creationism/evolution controversy from a mainstream scientific perspective)
Show Me the Science (anti-ID by noted philosopher Daniel C. Dennett)
Intelligent Design Has No Place in the Science Curriculum (The Chronicle of Higher Education article)

Related posts:

Hemibel Thinking (07/16/04)
Climatology (07/16/04)
Global Warming: Realities and Benefits (07/18/04)
Words of Caution for the Cautious (07/21/04)
Scientists in a Snit (08/04/04)
Another Blow to Climatology? (08/21/04)
Bad News for Politically Correct Science (10/18/04)
Another Blow to Chicken-Little Science (10/27/04)
Bad News for Enviro-Nuts (11/27/04)
Going Too Far with the First Amendment (01/01/05)
Atheism, Religion, and Science (01/03/05)
The Limits of Science (01/05/05)
Three Perspectives on Life: A Parable (01/15/05)
Beware of Irrational Atheism (01/22/05)
The Hockey Stick Is Broken (01/31/05)
The Creation Model (02/23/05)
The Thing about Science (03/24/05)
Religion and Personal Responsibility (04/08/05)
Science in Politics, Politics in Science (05/11/05)
Global Warming and Life (07/18/05)
Evolution and Religion (07/25/05)
Speaking of Religion (07/26/05)
Words of Caution for Scientific Dogmatists (08/19/05)
Religion and Liberty (08/25/05)

A Footnote . . .

. . . to the preceding post, in which I quote at length from a recent article by Charles Murray, co-author of The Bell Curve (1994). In the article, Murray reviews the evidence about race and IQ and concludes

that we know two facts beyond much doubt. First, the conventional environmental explanation of the black-white difference [in IQ] is inadequate. Poverty, bad schools, and racism, which seem such obvious culprits, do not explain it. Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of. Second, regardless of one’s reading of the competing arguments, we are left with an IQ difference that has, at best, narrowed by only a few points over the last century. I can find nothing in the history of this difference, or in what we have learned about its causes over the last ten years, to suggest that any faster change is in our future.

I want to emphasize this point:

Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of.

That’s entirely consistent with what has been said by Thomas Sowell (a noted black scholar of conservative-libertarian persuasion), both in his commentary on The Bell Curve and in his recent writings about race and culture. Here’s what Sowell said about The Bell Curve soon after its publication:

Whatever innate potential various groups may have, what they actually do will be done within some particular culture. That intractable reality cannot be circumvented by devising “culture-free” tests, for such tests would also be purpose-free in a world where there is no culture-free society.

Perhaps the strongest evidence against a genetic basis for intergroup differences in IQ is that the average level of mental test performance has changed very significantly for whole populations over time and, moreover, particular ethnic groups within the population have changed their relative positions during a period when there was very little intermarriage to change the genetic makeup of these groups.

While The Bell Curve cites the work of James R. Flynn, who found substantial increases in mental test performances from one generation to the next in a number of countries around the world, the authors seem not to acknowledge the devastating implications of that finding for the genetic theory of intergroup differences. . . .

Even before Professor Flynn’s studies, mental test results from American soldiers tested in World War II showed that their performances on these tests were higher than the performances of American soldiers in World War I by the equivalent of about 12 IQ points. Perhaps the most dramatic changes were those in the mental test performances of Jews in the United States. The results of World War I mental tests conducted among American soldiers born in Russia–the great majority of whom were Jews–showed such low scores as to cause Carl Brigham, creator of the Scholastic Aptitude Test, to declare that these results “disprove the popular belief that the Jew is highly intelligent.” Within a decade, however, Jews in the United States were scoring above the national average on mental tests, and the data in The Bell Curve indicate that they are now far above the national average in IQ.

. . . For Jews, it is clear that later tests showed radically different results–during an era when there was very little intermarriage to change the genetic makeup of American Jews.

My own research of twenty years ago showed that the IQs of both Italian-Americans and Polish-Americans also rose substantially over a period of decades. Unfortunately, there are many statistical problems with these particular data, growing out of the conditions under which they were collected. However, while my data could never be used to compare the IQs of Polish and Italian children, whose IQ scores came from different schools, nevertheless the close similarity of their general patterns of IQ scores rising over time seems indicative–especially since it follows the rising patterns found among Jews and among American soldiers in general between the two world wars, as well as rising IQ scores in other countries around the world. . . .

Herrnstein and Murray openly acknowledge such rises in IQ and christen them “the Flynn effect,” in honor of Professor Flynn who discovered it. But they seem not to see how crucially it undermines the case for a genetic explanation of interracial IQ differences. They say:

The national averages have in fact changed by amounts that are comparable to the fifteen or so IQ points separating blacks and whites in America. To put it another way, on the average, whites today differ from whites, say, two generations ago as much as whites today differ from blacks today. Given their size and speed, the shifts in time necessarily have been due more to changes in the environment than to changes in the genes.

While this open presentation of evidence against the genetic basis of interracial IQ differences is admirable, the failure to draw the logical inference seems puzzling. Blacks today are just as racially different from whites of two generations ago as they are from whites today. Yet the data suggest that the number of questions that blacks answer correctly on IQ tests today is very similar to the number answered correctly by past generations of whites. If race A differs from race B in IQ, and two generations of race A differ from each other by the same amount, where is the logic in suggesting that the IQ differences are even partly racial? . . .

. . . When any factor differs as much from A1 to A2 as it does from A2 to B2, why should one conclude that this factor is due to the difference between A in general and B in general? That possibility is not precluded by the evidence, but neither does the evidence point in that direction. (2)

In footnote 2 Sowell concedes that “rising IQs over time do not refute the belief that races differ in IQ for genetic reasons, though it ought to at least raise a question about that belief.”
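
To make the arithmetic behind Sowell’s point concrete, here is a minimal sketch with purely hypothetical numbers; they are chosen only to mirror the roughly 15-point shifts discussed above, not taken from The Bell Curve or from Sowell:

```python
# Hypothetical averages, for illustration only.
a_past = 85.0   # group A, two generations ago
a_now = 100.0   # group A, today
b_now = 85.0    # group B, today

within_group_change = a_now - a_past   # group A's change over time
between_group_gap = a_now - b_now      # gap between the groups today

# Genes change negligibly in two generations, so group A's own shift must be
# largely environmental. If that environmental shift is as large as today's
# gap between the groups, the gap by itself cannot establish a genetic cause.
print(f"Within-group change over time: {within_group_change:.0f} points")
print(f"Between-group gap today:       {between_group_gap:.0f} points")
```

The point of the sketch is simply that the same-sized difference shows up within one group over time, where genetics cannot be the explanation, as shows up between the two groups today.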

But let us continue with Sowell’s main theme, which is that persistent inter-racial differences in IQ can be attributed to persistent cultural differences. Writing recently in OpinionJournal, Sowell paraphrases his essay “Black Rednecks and White Liberals,” from the eponymous book. Here’s some of what he has to say:

There have always been large disparities, even within the native black population of the U.S. Those blacks whose ancestors were “free persons of color” in 1850 have fared far better in income, occupation, and family stability than those blacks whose ancestors were freed in the next decade by Abraham Lincoln.

What is not nearly as widely known is that there were also very large disparities within the white population of the pre-Civil War South and the white population of the Northern states. Although Southern whites were only about one-third of the white population of the U.S., an absolute majority of all the illiterate whites in the country were in the South.

The North had four times as many schools as the South, attended by more than four times as many students. Children in Massachusetts spent more than twice as many years in school as children in Virginia. Such disparities obviously produce other disparities. Northern newspapers had more than four times the circulation of Southern newspapers. Only 8% of the patents issued in 1851 went to Southerners. Even though agriculture was the principal economic activity of the antebellum South at the time, the vast majority of the patents for agricultural inventions went to Northerners. Even the cotton gin was invented by a Northerner.

Disparities between Southern whites and Northern whites extended across the board from rates of violence to rates of illegitimacy. American writers from both the antebellum South and the North commented on the great differences between the white people in the two regions. So did famed French visitor Alexis de Tocqueville.

None of these disparities can be attributed to either race or racism. Many contemporary observers attributed these differences to the existence of slavery in the South, as many in later times would likewise attribute both the difference between Northern and Southern whites, and between blacks and whites nationwide, to slavery. But slavery doesn’t stand up under scrutiny of historical facts any better than race or racism as explanations of North-South differences or black-white differences. The people who settled in the South came from different regions of Britain than the people who settled in the North–and they differed as radically on the other side of the Atlantic as they did here–that is, before they had ever seen a black slave.

Slavery also cannot explain the difference between American blacks and West Indian blacks living in the United States because the ancestors of both were enslaved. When race, racism, and slavery all fail the empirical test, what is left?

Culture is left.

The culture of the people who were called “rednecks” and “crackers” before they ever got on the boats to cross the Atlantic was a culture that produced far lower levels of intellectual and economic achievement, as well as far higher levels of violence and sexual promiscuity. That culture had its own way of talking, not only in the pronunciation of particular words but also in a loud, dramatic style of oratory with vivid imagery, repetitive phrases and repetitive cadences.

Although that style originated on the other side of the Atlantic in centuries past, it became for generations the style of both religious oratory and political oratory among Southern whites and among Southern blacks–not only in the South but in the Northern ghettos in which Southern blacks settled. It was a style used by Southern white politicians in the era of Jim Crow and later by black civil rights leaders fighting Jim Crow. Martin Luther King’s famous speech at the Lincoln Memorial in 1963 was a classic example of that style.

While a third of the white population of the U.S. lived within the redneck culture, more than 90% of the black population did. Although that culture eroded away over the generations, it did so at different rates in different places and among different people. It eroded away much faster in Britain than in the U.S. and somewhat faster among Southern whites than among Southern blacks, who had fewer opportunities for education or for the rewards that came with escape from that counterproductive culture.

Nevertheless the process took a long time. As late as the First World War, white soldiers from Georgia, Arkansas, Kentucky and Mississippi scored lower on mental tests than black soldiers from Ohio, Illinois, New York and Pennsylvania. Again, neither race nor racism can explain that–and neither can slavery.

The redneck culture proved to be a major handicap for both whites and blacks who absorbed it. Today, the last remnants of that culture can still be found in the worst of the black ghettos, whether in the North or the South, for the ghettos of the North were settled by blacks from the South. The counterproductive and self-destructive culture of black rednecks in today’s ghettos is regarded by many as the only “authentic” black culture–and, for that reason, something not to be tampered with. Their talk, their attitudes, and their behavior are regarded as sacrosanct.

The people who take this view may think of themselves as friends of blacks. But they are the kinds of friends who can do more harm than enemies.

If East Asians and Ashkenazic Jews could rise to the top of the IQ charts, as they have, why can’t blacks rise too? Sowell would answer that they could rise, if only they would break the bonds of the “black redneck” culture, which hinders so many of them. The law cannot break those bonds, for, as Sowell argues, the law only reinforces them by making blacks dependent on affirmative action, welfare programs, and other “white liberal” contrivances.

If culture is the enemy of black advancement, the only way blacks can advance is to abandon the culture that many of them have transported from inner cities to suburbia, where they encounter white culture in many places, including public schools. There, the cultural divide becomes obvious in the phenomenon known as “acting white,” the subject of an article by Harvard economist Roland G. Fryer Jr. and graduate student Paul Torelli, “An Empirical Analysis of ‘Acting White’.” The Washington Post’s Richard Morin summarizes:

As commonly understood, acting white is a pejorative term used to describe black students who engage in behaviors viewed as characteristic of whites, such as making good grades, reading books or having an interest in the fine arts.

The phenomenon is one reason some social thinkers give to help explain at least a portion of the persistent black-white achievement gap in school and in later life. Popularity-conscious young blacks, afraid of being seen as acting white, steer clear of behaviors that could pay dividends in the future, including doing well in school. . . .

No one can change such attitudes but blacks themselves.

If “black redneck” culture is the cause of the inter-racial gap in IQ, and if blacks choose to perpetuate that culture, then the perpetuation of the IQ gap might as well be genetic, for it will be the result of blacks’ self-imposed servitude to the forces of ignorance.

Recommended reading: Race and Intelligence (a Wikipedia article with many links to sources and opposing views)

Related posts: Affirmative Action and Race (a collection of links)

After the Bell Curve

Charles Murray, writing in Commentary, reviews what has been learned about gender, race, and IQ since the publication of his (and the late Richard Herrnstein’s) The Bell Curve eleven years ago. Why is he writing now?

The Lawrence Summers affair last January made me rethink my silence. The president of Harvard University offered a few mild, speculative, off-the-record remarks about innate differences between men and women in their aptitude for high-level science and mathematics, and was treated by Harvard’s faculty as if he were a crank. The typical news story portrayed the idea of innate sex differences as a renegade position that reputable scholars rejected.

It was depressingly familiar. In the autumn of 1994, I had watched with dismay as The Bell Curve’s scientifically unremarkable statements about black IQ were successfully labeled as racist pseudoscience. At the opening of 2005, I watched as some scientifically unremarkable statements about male-female differences were successfully labeled as sexist pseudoscience.

His target:

[S]pecific [social] policies based on premises that conflict with scientific truths about human beings tend not to work. Often they do harm.

One such premise is that the distribution of innate abilities and propensities is the same across different groups. . . . The assumption of no innate differences among groups suffuses American social policy. That assumption is wrong.

When the outcomes that these policies are supposed to produce fail to occur, with one group falling short, the fault for the discrepancy has been assigned to society. It continues to be assumed that better programs, better regulations, or the right court decisions can make the differences go away. That assumption is also wrong.

About gender:

[F]or reasons embedded in the biochemistry and neurophysiology of being female, many women with the cognitive skills for achievement at the highest level also have something else they want to do in life: have a baby. In the arts and sciences, forty is the mean age at which peak accomplishment occurs, preceded by years of intense effort mastering the discipline in question. These are precisely the years during which most women must bear children if they are to bear them at all. . . .

[W]omen with careers were four-and-a-half times more likely than men to say they preferred to work fewer than 40 hours per week. The men placed greater importance on “being successful in my line of work” and “inventing or creating something that will have an impact,” while the women found greater value in “having strong friendships,” “living close to parents and relatives,” and “having a meaningful spiritual life.” As the authors concluded, “these men and women appear to have constructed satisfying and meaningful lives that took somewhat different forms.” The different forms, which directly influence the likelihood that men will dominate at the extreme levels of achievement, are consistent with a constellation of differences between men and women that have biological roots.

I have omitted perhaps the most obvious reason why men and women differ at the highest levels of accomplishment: men take more risks, are more competitive, and are more aggressive than women. The word “testosterone” may come to mind, and appropriately. Much technical literature documents the hormonal basis of personality differences that bear on sex differences in extreme and venturesome effort, and hence in extremes of accomplishment—and that bear as well on the male propensity to produce an overwhelming proportion of the world’s crime and approximately 100 percent of its wars. But this is just one more of the ways in which science is demonstrating that men and women are really and truly different, a fact so obvious that only intellectuals could ever have thought otherwise.

As for race, Murray reviews the evidence at length and concludes

that we know two facts beyond much doubt. First, the conventional environmental explanation of the black-white difference [in IQ] is inadequate. Poverty, bad schools, and racism, which seem such obvious culprits, do not explain it. Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of. Second, regardless of one’s reading of the competing arguments, we are left with an IQ difference that has, at best, narrowed by only a few points over the last century. I can find nothing in the history of this difference, or in what we have learned about its causes over the last ten years, to suggest that any faster change is in our future.

The implications:

Elites throughout the West are living a lie, basing the futures of their societies on the assumption that all groups of people are equal in all respects. Lie is a strong word, but justified. It is a lie because so many elite politicians who profess to believe it in public do not believe it in private. It is a lie because so many elite scholars choose to ignore what is already known and choose not to inquire into what they suspect. We enable ourselves to continue to live the lie by establishing a taboo against discussion of group differences. . . .

The taboo arises from an admirable idealism about human equality. If it did no harm, or if the harm it did were minor, there would be no need to write about it. But taboos have consequences. . . .

How much damage has the taboo done to the education of children? Christina Hoff Sommers has argued that willed blindness to the different developmental patterns of boys and girls has led many educators to see boys as aberrational and girls as the norm, with pervasive damage to the way our elementary and secondary schools are run. . . .

How much damage has the taboo done to our understanding of America’s social problems? The part played by sexism in creating the ratio of males to females on mathematics faculties is not the ratio we observe but what remains after adjustment for male-female differences in high-end mathematical ability. The part played by racism in creating different outcomes in black and white poverty, crime, and illegitimacy is not the raw disparity we observe but what remains after controlling for group characteristics. . . .

Even to begin listing the topics that could be enriched by an inquiry into the nature of group differences is to reveal how stifled today’s conversation is. Besides liberating that conversation, an open and undefensive discussion would puncture the irrational fear of the male-female and black-white differences I have surveyed here. We would be free to talk about other sexual and racial differences as well, many of which favor women and blacks, and none of which is large enough to frighten anyone who looks at them dispassionately. . . .

The law should not prevent individuals from doing their best. Reverse discrimination — which is the law — pushes some people toward pursuits for which they are not best suited and it pushes other people away from pursuits for which they are best suited. In sum, reverse discrimination prevents individuals from doing their best. That’s bad social policy. But we mustn’t talk about it.

Related posts:

Affirmative Action and Race (a collection of links)
I Missed This One (08/12/04)
A Century of Progress? (01/30/05)
Feminist Balderdash (02/19/05)

A Values-Free Government?

Barry Lynn, executive director of Americans United for the Separation of Church and State, is quoted in an L.A. Times article* as saying, “We are not to turn the Holy Scriptures of any group into public policy.” That says a lot about the depth of religiosity to be found in an organization like Americans United, which is to the defense of religion as the American Civil Liberties Union is to the defense of liberty.

Now, I wouldn’t expect Americans United to endorse the first four of the Ten Commandments, which are about God, or even numbers 5 (honor parents), 7 (eschew adultery), 9 (don’t lie), or 10 (don’t covet others’ possessions). But you’d think that even Americans United would be in favor of laws that forbid (if not punish) murder (number 6) and outright theft (number 8). I guess not.

Related post: Religion and Liberty (with links to many other related posts)
__________
* Requires a free subscription. Or get an e-mail ID and password at bugmenot.com, then search for “Grooming Politicians for Christ” (the title of the article).

Fuel for Thought

The charts below come from Chart of the Day. Ignore the rather strained effort to correlate spikes in the indices with recessions; focus on the fact that things have been “worse” and the world hasn’t come to an end.

Then ask yourself if there’s any reason to believe that market forces will allow the real price of a particular item to rise indefinitely. The correct answer: of course not. Substitutes will become available at attractive prices, without government subsidization of those substitutes. And people will buy the substitutes.

Religion and Liberty

Excerpts of a long post at Liberty Corner II:

Many libertarians — especially the strident atheists among them — are quick to say that religious morality is unnecessary because morality — standards of right and wrong — can be supplied by other sources: libertarianism, for example. There’s something to that, if you can bring yourself to believe that the gospel of Adam Smith, John Stuart Mill, and Friedrich Hayek could attract a much wider audience than its present, minuscule market share.

For libertarianism to grow and thrive, it must be planted in fertile ground. As Jennifer Roback Morse wrote in “Marriage and the Limits of Contract,”

[l]ibertarians recognize that a free market needs a culture of law-abidingness, promise-keeping, and respect for contracts. . . . A culture full of people who violate their contracts at every possible opportunity cannot be held together by legal institutions, as the experience of post-communist Russia plainly shows.

Neither the state nor the stateless Utopia of anarcho-capitalist dreams can ensure a moral society, that is, one in which there is law-abidingness, promise-keeping, and respect for contracts. Where, then, do we turn for moral education? To the public schools, whose unionized teachers preach the virtues of moral relativism, big government, income redistribution, and non-judgmentalism (lack of personal responsibility)? I hardly think so.

That leaves us with religion, especially religion in the Judeo-Christian tradition. . . .

The weakening of Judeo-Christianity in America is owed to enemies within (established religions trying in vain to be “relevant”) and to enemies without (Leftists and nihilistic libertarians who seek every opportunity to denigrate religion). . . .

I believe that incessant attacks on religion have helped to push people — especially young adults — away from religion, to the detriment of liberty. It’s not surprising that modern liberals tend to be anti-religious, for they disdain the tenets of personal responsibility and liberty that are contained in the last six of the Ten Commandments. It is disheartening, however, when libertarians join the anti-religious chorus. They know not what they do when they join the Left in tearing down a bulwark of civil society, without which liberty cannot prevail.

Humans need no education in aggression and meddling; those come to us naturally. But we do need to learn to take responsibility for our actions and to leave others alone — and we need to learn those things when we are young. Public schools can’t foster that learning, nor can a relative handful of libertarians. Parents can do it, if they have the right background for it; that background is to be found in the Judeo-Christian tradition. Most importantly, children can learn for themselves, if they are raised in the Judeo-Christian tradition. . . .

Rather than join the Left in attacking the Judeo-Christian tradition, libertarians ought to accommodate themselves to it and even encourage its acceptance — for liberty’s sake. There is much to gain and — given the separation of church and state, which most religionists prefer — almost nothing to lose.

CLICK HERE TO READ THE FULL POST.

Religion and Liberty

Before you read the following post, you should know that I am an agnostic, not a person of religion. I was raised in the Roman Catholic faith but abandoned that faith more than two-thirds of a lifetime ago.

Many libertarians — especially the strident atheists among them — are quick to say that religious morality is unnecessary because morality — standards of right and wrong — can be supplied by other sources: libertarianism, for example. There’s something to that, if you can bring yourself to believe that the gospel of Adam Smith, John Stuart Mill, and Friedrich Hayek could attract a much wider audience than its present, minuscule market share.

For libertarianism to grow and thrive, it must be planted in fertile ground. As Jennifer Roback Morse wrote in “Marriage and the Limits of Contract,”

[l]ibertarians recognize that a free market needs a culture of law-abidingness, promise-keeping, and respect for contracts. . . . A culture full of people who violate their contracts at every possible opportunity cannot be held together by legal institutions, as the experience of post-communist Russia plainly shows.

Neither the state nor the stateless Utopia of anarcho-capitalist dreams can ensure a moral society, that is, one in which there is law-abidingness, promise-keeping, and respect for contracts. Where, then, do we turn for moral education? To the public schools, whose unionized teachers preach the virtues of moral relativism, big government, income redistribution, and non-judgmentalism (lack of personal responsibility)? I hardly think so.

That leaves us with religion, especially religion in the Judeo-Christian tradition. As the Catholic Encyclopedia puts it:

The precepts [of the last six of the Commandments] are meant to protect man in his natural rights against the injustice of his fellows.

  • His life is the object of the Fifth;
  • the honour of his body as well as the source of life, of the Sixth;
  • his lawful possessions, of the Seventh;
  • his good name, of the Eighth;
  • And in order to make him still more secure in the enjoyment of his rights, it is declared an offense against God to desire to wrong him, in his family rights by the Ninth;
  • and in his property rights by the Tenth.

I am neither a person of faith nor a natural-rights libertarian, but I would gladly live in a society in which the majority of my fellow citizens believed in and adhered to the Ten Commandments, especially the last six of them. I reject the currently fashionable notion that religion per se breeds violence. In fact, a scholarly, non-sectarian paper offers good evidence that religiosity leads to good behavior:

. . . We will define religious activities as[:] (1) Attendance to religious activities, (2) Salience or importance of God to one’s self, (3) Denomination, (4) Frequency of prayer, (5) Bible studies, and (6) Religious activities outside of church. . . .

Some of the studies reported in this speculative review used multidimensional means of measuring religiosity with consistency. Of these reports nearly all found that there was a significant negative correlation between religiosity and delinquency. This was further substantiated by studies using longitudinal and operationally reliable definitions. Of the early reports which were either inconclusive or found no statistical correlation, not one utilized a multidimensional definition or any sort of reliability factor. We maintain that the cause of this difference in findings stemmed from methodological factors as well as different and perhaps flawed research strategies that were employed by early sociological and criminological researchers.

The studies that we reviewed were of high research caliber and showed that the inverse relationship [between religiosity and delinquency] does in fact exist. It therefore appears that religion is both a short term and long term mitigat[o]r of delinquency.

But a society in which behavior is guided by the Ten Commandments seems to be receding into the past. Consider these statistics, from InfoPlease: Between 1990 and 2001

  • the fraction of American adults claiming to belong to a Christian religion dropped from 86.4 percent to 76.7 percent, and
  • the fraction of American adults claiming to be of the Jewish faith dropped from 1.8 percent to 1.4 percent.

What’s noteworthy about those figures is the degree of slippage in a span of 11 years. The absolute values, of course, overstate the degree of adherence to formal religion because respondents tend to say the “right” thing, which (oddly enough) continues to be a profession of religious faith. If Bill Clinton (among others) can claim to be a “religious” person, who could not?
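
As a back-of-the-envelope check on the size of that slippage in relative terms, here is a small sketch; the percentages are the InfoPlease figures quoted above, and the computation is mine:

```python
# InfoPlease figures quoted above: share of American adults claiming each
# affiliation in 1990 and in 2001 (percent).
groups = {
    "Christian": (86.4, 76.7),
    "Jewish": (1.8, 1.4),
}

for name, (share_1990, share_2001) in groups.items():
    point_drop = share_1990 - share_2001
    relative_drop = point_drop / share_1990 * 100.0
    print(f"{name}: down {point_drop:.1f} percentage points "
          f"({relative_drop:.0f}% relative decline) over 11 years")
```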

The good news is that most of the slippage in stated adherence is among the major, old-line denominations: the post-Vatican II Roman Catholic Church and the Baptists, Methodists, Lutherans, and Presbyterians. Those denominations, or large segments of them, have slid away from the Ten Commandments in order to be more “relevant” — thus evidently becoming less “relevant.”

The bad news is that claiming adherence to a religion and receiving religious “booster shots” through regular church attendance are two entirely different things. Consider this excerpt of the cover story (“In Search of the Spiritual“) in the August 29 – September 5 issue of Newsweek:

Of 1,004 respondents to the NEWSWEEK/Beliefnet Poll, 45 percent said they attend worship services weekly, virtually identical to the figure (44 percent) in a Gallup poll cited by Time in 1966. Then as now, however, there is probably a fair amount of wishful thinking in those figures; researchers who have done actual head counts in churches think the figure is probably more like 20 percent [link added: ED]. There has been a particular falloff in attendance by African-Americans, for whom the church is no longer the only respectable avenue of social advancement, according to Darren Sherkat, a sociologist at Southern Illinois University. The fastest-growing category on surveys that ask people to give their religious affiliation, says Patricia O’Connell Killen of Pacific Lutheran University in Tacoma, Wash., is “none.” But “spirituality,” the impulse to seek communion with the Divine, is thriving. The NEWSWEEK/Beliefnet Poll found that more Americans, especially those younger than 60, described themselves as “spiritual” (79 percent) than “religious” (64 percent). Almost two thirds of Americans say they pray every day, and nearly a third meditate.

But what does “spirituality” have to do with morality? Prayer and meditation may be useful and even necessary to religion, but they do not teach morality. Substituting “spirituality” for Judeo-Christian religiosity is like watching golf matches on TV instead of playing golf; a watcher can talk a good game but cannot play the game very well, if at all.

Historian Niall Ferguson, a Briton, writes about the importance of religiosity in “A loss of faith fans the fire of fanaticism“:

I am not sure British people are necessarily afraid of religion, but they are certainly not much interested in it these days. Indeed, the decline of Christianity — not just in Britain but across Europe — stands out as one of the most remarkable phenomena of our times.

There was a time when Europe would justly refer to itself as “Christendom.” Europeans built the Continent’s loveliest edifices to accommodate their acts of worship. They quarreled bitterly over the distinction between transubstantiation and consubstantiation. As pilgrims, missionaries and conquistadors, they sailed to the four corners of the Earth, intent on converting the heathen to the true faith.

Now it is Europeans who are the heathens. . . .

The exceptionally low level of British religiosity was perhaps the most striking revelation of a recent ICM poll [link added: ED]. One in five Britons claim to “attend an organized religious service regularly,” less than half the American figure. [In light of the relationship between claimed and actual church attendance, discussed above, the actual figure for Britons is probably about 10 percent: ED.] Little more than a quarter say that they pray regularly, compared with two-thirds of Americans and 95 percent of Nigerians. And barely one in 10 Britons would be willing to die for our God or our beliefs, compared with 71 percent of Americans. . . .

Chesterton feared that if Christianity declined, “superstition” would “drown all your old rationalism and skepticism.” When educated friends tell me that they have invited a shaman to investigate their new house for bad juju, I see what Chesterton meant. Yet it is not the spread of such mumbo-jumbo that concerns me as much as the moral vacuum that de-Christianization has created. Sure, sermons are sometimes dull and congregations often sing out of tune. But, if nothing else, a weekly dose of Christian doctrine helps to provide an ethical framework for life. And it is not clear where else such a thing is available in modern Europe.

Over the last few weeks [since the terrorist attacks of 7/7: ED], Britons have heard a great deal from Tony Blair and others about the threat posed to their “way of life” by Muslim extremists such as Muktar Said Ibrahim. But how far has their own loss of religious faith turned Britain into a soft target — not so much for the superstition Chesterton feared, but for the fanaticism of others?

Yes, what “way of life” is being threatened — and is therefore deemed worth defending — when people do not share a strong moral bond?

That the moral bond of Judeo-Christianity also has weakened on this side of the Atlantic is evidenced by the rising tide of “foxhole rats” in our midst: post-patriotic and undoubtedly anti-religious Leftists for whom America is just an arbitrary geopolitical entity.

The weakening of Judeo-Christianity in America is owed to enemies within (established religions trying in vain to be “relevant”) and to enemies without (Leftists and nihilistic libertarians who seek every opportunity to denigrate religion). Thus the opponents of religiosity seized on the homosexual scandals in the Catholic Church not to attack homosexuality (which would go against the attackers’ party line) but to attack the Church, which teaches that acts of the kind that were committed by a relatively small number of priests are, in fact, immoral.

Then there is the relentless depiction of Catholicism as an accomplice to Hitler’s brutality, about which my son writes in his review of Rabbi David G. Dalin’s The Myth of Hitler’s Pope: How Pius XII Rescued Jews from the Nazis:

Despite the misleading nature of the controversy — one which Dalin questions from the outset — the first critics of the wartime papacy were not Jews. Among the worst attacks were those of leftist non-Jews, such as Carlo Falconi (author of The Silence of Pius XII), not to mention German liberal Rolf Hochhuth, whose 1963 play, The Deputy, set the tone for subsequent derogatory media portrayals of wartime Catholicism. By contrast, says Dalin, Pope Pius XII “was widely praised [during his lifetime] for having saved hundreds of thousands of Jewish lives during the Holocaust.” He provides an impressive list of Jews who testified on the pope’s behalf, including Albert Einstein, Golda Meir and Chaim Weizmann. Dalin believes that to “deny and delegitimize their collective memory and experience of the Holocaust,” as some have done, “is to engage in a subtle yet profound form of Holocaust denial.”

The most obvious source of the black legend about the papacy emanated from Communist Russia, a point noted by the author. There were others with an axe to grind. As revealed in a recent issue of Sandro Magister’s Chiesa, liberal French Catholic Emmanuel Mounier began implicating Pius XII in “racist” politics as early as 1939. Subsequent detractors have made the same charge, working (presumably) from the same bias.

While the immediate accusations against Pius XII lie at the heart of Dalin’s book, he takes his analysis a step further. The vilification of the pope can only be understood in terms of a political agenda — the “liberal culture war against tradition.” . . .

Rabbi Dalin sums it up best for all people of traditional moral and political beliefs when he urges us to recall the challenges that faced Pius XII in which the “fundamental threats to Jews came not from devoted Christians — they were the prime rescuers of Jewish lives in the Holocaust — but from anti-Catholic Nazis, atheistic Communists, and… Hitler’s mufti in Jerusalem.”

I believe that incessant attacks on religion have helped to push people — especially young adults — away from religion, to the detriment of liberty. It’s not surprising that modern liberals tend to be anti-religious, for they disdain the tenets of personal responsibility and liberty that are contained in the last six of the Ten Commandments. It is disheartening, however, when libertarians join the anti-religious chorus. They know not what they do when they join the Left in tearing down a bulwark of civil society, without which liberty cannot prevail.

Humans need no education in aggression and meddling; those come to us naturally. But we do need to learn to take responsibility for our actions and to leave others alone — and we need to learn those things when we are young. Public schools can’t foster that learning, nor can a relative handful of libertarians. Parents can do it, if they have the right background for it; that background is to be found in the Judeo-Christian tradition. Most importantly, children can learn for themselves, if they are raised in the Judeo-Christian tradition.

Am I being hypocritical because I am unchurched and my children were not taken to church? Perhaps, but my religious upbringing imbued me with a strong sense of morality, which I tried — successfully, I think — to convey to my children. But as time passes the moral lessons we older Americans learned through religion will attenuate unless those lessons are taught, anew, to younger generations.

Rather than join the Left in attacking the Judeo-Christian tradition, libertarians ought to accommodate themselves to it and even encourage its acceptance — for liberty’s sake. There is much to gain and — given the separation of church and state, which most religionists prefer — almost nothing to lose.

Related posts:

More Things a Libertarian Can Believe In (07/11/04)
Libertarian Conservative or Conservative Libertarian (07/29/04)
Hobbesian Libertarianism (10/08/04)
The State of Nature (12/05/04)
Libertarianism and Conservatism (12/05/04)
Going Too Far with the First Amendment? (01/01/05)
Atheism, Religion, and Science (01/03/05)
The Limits of Science (01/05/05)
Three Perspectives on Life: A Parable (01/15/05)
Beware of Irrational Atheism (01/22/05)
Judeo-Christian Values and Liberty (02/20/05)
The Creation Model (02/23/05)
Libertarianism, Marriage, and the True Meaning of Family Values (04/06/05)
Religion and Personal Responsibility (04/08/05)
Free Will: A Proof by Example? (04/09/05)
Where Conservatism and (Sensible) Libertarianism Come Together (04/19/05)
A Renewed Respect? (04/19/05)
Conservatism, Libertarianism, and Public Morality (04/25/05)
Evolution and Religion (07/25/05)
Moral Issues (07/26/05)
Shall We All Hang Separately? (08/13/05)
Foxhole Rats (08/14/05)
Words of Caution for Scientific Dogmatists (08/19/05)
Foxhole Rats, Redux (08/22/05)

Guilty Until Proven Innocent

Excerpt of an e-mail from the law firm of McGuireWoods (“No Good Deed Goes Unpunished? Seventh Circuit Rules That No Adverse ‘Employment’ Action is Necessary to Sustain Title VII Retaliation Claims”):

Executive Secretary Chrissy Washington worked for the Illinois Department of Revenue on a flexible schedule from 7 a.m. to 3 p.m., instead of the standard 9-5 schedule, allowing her to care for her son with Down Syndrome. When some of her duties were reassigned to others, she filed charges with state and federal agencies alleging race discrimination. Subsequently, a senior manager required that she work from 9 to 5, and when she refused, her position was abolished. Washington was assigned to another Executive Secretary post with a different supervisor and was required to apply anew for a flextime schedule, which was refused. Washington maintained that it was her prior discrimination charge that led supervisors to rescind the flextime schedule on which her son depended. . . .

. . . [The Seventh Circuit Court of Appeals] concluded (with a highly entertaining reference to the comic strip Dilbert) that where an employer retaliates for protected activity by exploiting an employee’s known vulnerability, such as Washington’s reliance on flextime to care for her disabled son, the action can be a material change sufficient to sustain a retaliation claim under Title VII [of the Civil Rights Act of 1964]. The standard for materiality, the court noted, is whether the employer’s action has the “potential” to dissuade an employee (and, by logical extension, other employees) from pursuing her rights under Title VII.

Although this opinion does not reflect a uniform view among the jurisdictions on the ultimate issue, it should serve to alert employers to some of the potential problems that can arise from the implementation of flextime schedules and other employee-friendly initiatives. The court clearly says that once these admittedly optional benefits are in place for an employee, their removal can serve as a basis for retaliation claims.

Lesson 1: A benefit, once bestowed, can become an entitlement.

Lesson 2: An employee who has filed an Equal Employment Opportunity (EEO) claim against an employer may become immune to otherwise defensible business decisions by that employer.

As my HR director used to say whenever a disgruntled employee or former employee filed an EEO claim: “We (the company) are guilty until proven innocent.” Because that’s how the EEO racket works.