Charles Murray’s Grand Plan

Charles Murray — he of The Bell Curve fame — recently unveiled his grand plan to overhaul the welfare state. His plan, which Murray outlines in his new book, In Our Hands: A Plan to Replace the Welfare State, amounts to this: Cut out the government middleman and give everyone who is 21 or older and not in jail $10,000 a year (or less, depending on income). The idea, I guess, is to accomplish three things:

  • Eliminate the “house cut,” that is, the cost of maintaining the multitude of bureaucracies, consultants, and contractors, which, in addition to wasting money, are often effective self-promoters.
  • Eliminate myriad special-interest programs — each of which has a vocal constituency — because these are seldom cut or eliminated individually, in spite of an aggregate cost to which most taxpayers object. Make the welfare state an all-or-nothing proposition in which every free adult has an equal stake.
  • Let individuals decide for themselves how best to use their “gift” from other taxpayers. On balance, they will make better decisions than bureaucrats, and those decisions (e.g., more education) will yield higher incomes. Thus the cost of the program will go down in the long run, and support for its expansion will be harder to come by.

Here are excerpts of Murray’s interview with Kathryn Jean Lopez, editor of National Review Online:

Kathryn Jean Lopez: First things first. $10,000? Who’s getting and when? And can I use it on my credit-card debt?

Charles Murray: If you’ve reached your 21st birthday, are a United States citizen, are not incarcerated, and have a pulse, you get the grant, electronically deposited in monthly installments in an American bank of your choice with an ABA routing number. If you make more than $25,000, you pay part of it back in graduated amounts. At $50,000, the surtax maxes out at $5,000. I also, reluctantly but with good reason, specify that $3,000 has to be devoted to health care. Apart from that, you can use the grant for whatever you want. Enjoy. . . .

Lopez: How can even low-income folks have a “comfortable retirement” under your plan? Is that foolproof?

Murray: Someone turning 21 has about 45 years before retirement. The lowest average real return for the U.S. stock market for any 45-year period since 1801 is 4.3 percent. Round that down to 4 percent and work the magic of compound interest. Just a $2,000 contribution a year amounts to about $253,000 at retirement. A low-income couple that has followed that strategy retires with more than half a million dollars in the bank plus $20,000 continuing annual income from the grant. Sounds comfortable to me. As for “foolproof,” think of it this way: All of the government’s guarantees for Social Security depend on the U.S. economy growing at a rate that, at the very least, is associated with an historically worst average return of 4 percent in the stock market (actually, it needs a much stronger economy than that). Absent economic growth, no plan is foolproof. With economic growth, mine is. . . .
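(An aside: Murray’s retirement arithmetic is easy to check. The sketch below assumes a level $2,000 contribution at the start of each year, compounding at 4 percent — which is roughly what his “about $253,000” figure implies.)

```python
def future_value(payment, rate, years, contribute_at_start=True):
    """Future value of a level annual contribution compounding at `rate`."""
    fv = payment * ((1 + rate) ** years - 1) / rate  # ordinary annuity
    if contribute_at_start:
        fv *= 1 + rate  # annuity due: each deposit earns one extra year
    return fv

single = future_value(2_000, 0.04, 45)  # about $252,000 per person
couple = 2 * single                     # over $500,000 for a couple
```

At 4 percent, the single-earner figure comes to roughly $252,000, and a couple following the same strategy retires with just over half a million dollars — matching Murray’s numbers.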

Lopez: Under your plan, the government spends more first, but saves money in the long run, right? But is there any guarantee folks in the future abide by the plan? Can’t a few pols wanting to restore an entitlement here or there ruin things?

Murray: I leave the size of the grant to the political process, but there is a built-in brake. Congress can pass hundreds of billions of dollars in favors for special groups, because no single allocation is large enough to mobilize the opposition of a powerful coalition opposing it. A change in the size of the grant directly affects everyone over the age of 21. Every time Congress talks about changing the size of the grant, it will be the biggest story in the country.

The one thing that can’t be left to the political process is the requirement that the grant replace all other transfers. That has to be a constitutional requirement, written in language that even Supreme Court justices can’t ignore. Assuming such a thing is possible.

And there’s the rub. Coalitions of special-interest groups will band together in defense of the status quo because each of them will seek to preserve “its” program. The fact that they and their constituencies are paying each other’s freight won’t matter. They’ll believe (or pretend to believe) that they’re soaking the rich and “big business,” when — in reality — they are burdening the poor by disincentivizing the inventors, innovators, and entrepreneurs who are the mainspring of economic growth. Murray’s grand plan is therefore more likely to be implemented as an add-on to the welfare state than as a substitute for it.

Nevertheless, unlike the anarcho-capitalist contingent, I won’t characterize his proposal as unlibertarian. If it were adopted as an alternative to the present system it probably would lighten the weight that government places on us. That would be great progress, but anything short of the abolition of government is unacceptable to Rothbardians, for they dwell in a wonderland of impossibility. (See this post, for example, and follow the links therein.)

One commenter — a columnist at Bloomberg.com by the name of Andrew Ferguson — has a different objection to Murray’s plan:

His larger goal is to revive those social institutions, particularly the family, the workplace and the local community, which the welfare state has weakened and supplanted and “through which people live satisfying lives.”

If you want to see the enervating effects of the all-encompassing welfare state, he says, look at Europe, where marriage and birth rates have plunged and work and religion have lost their traditional standing as sources of happiness and personal satisfaction. . . .

In Europe, he says with evident disdain, “the purpose of life is to while away the time as pleasantly as possible.”

Here the reader of “In Our Hands” may suddenly pull up short. What began as a wonkish policy tract enlarges into an exploration of how people live lives of meaning and purpose.

Who knew? It turns out that Charles Murray, the nation’s foremost libertarian philosopher, is a moralist.

In the end, though, moralizing and libertarianism make for an uncomfortable fit.

On the one hand, Murray says he wants to liberate citizens from the welfare state so they can live life however they choose. On the other hand, by liberating citizens from the welfare state, he hopes to force them back into lives of traditional bourgeois virtue.

Mr. Ferguson once wrote speeches for President George H.W. Bush. And it shows in the shallowness of his analysis. Murray is not “moralizing.” Murray is explaining that when individuals are liberated from the welfare state they are more likely to adopt — voluntarily — those mores that keep the welfare state at bay. Murray isn’t hoping to “force” people “back into lives of traditional bourgeois virtue” (the condescension drips from that phrase), he is saying that liberty rests on what Ferguson chooses to call “traditional bourgeois virtue.” (For an extended analysis of that proposition, read this, and especially this segment.)

More Anti-Black Bigotry from the Left

In a recent post, “A Black Bigot Speaks,” I noted the tendency of Leftists (even black ones) to denigrate black Republicans and conservatives. The Leftists say, in so many words, that

black conservatives are too “dumb” to know that conservatism is bad for them. And/or they’re just power-seeking Uncle Toms and Aunt Jemimas who suck up to powerful whites in return for access to power and the perks of high office. [The Left] is unwilling to credit . . . black conservatives with having a principled attachment to conservatism.

Now comes today’s New York Times Magazine, with a story about black Republican Michael Steele. Steele, who is Maryland’s lieutenant governor, is running for the U.S. Senate. The headline of the story:

Why Is Michael Steele a Republican Candidate?

I need say no more, except that it’s long past time for blacks to tell the Left where to put its condescension.

(Hat tip to Betsy’s Page.)

A Black Bigot Speaks

If anything exemplifies Leftists’ condescension to blacks it’s this op-ed piece* in the L.A. Times by Erin Aubry Kaplan (right). The op-ed is about former White House staffer (and black Republican) Claude Allen, who recently was charged with theft. The most telling bits:

I don’t support conservatism in its current iteration, and I support black conservatives even less . . . .

Here is a man who, like most black conservatives, has had to do an awful lot of personal and political rationalizing to pay dues . . . .

In so many words, Allen and other black conservatives are too “dumb” to know that conservatism is bad for them. And/or they’re just power-seeking Uncle Toms and Aunt Jemimas who suck up to powerful whites in return for access to power and the perks of high office. Kaplan (like her compatriots on the Left) is unwilling to credit Allen and other black conservatives with having a principled attachment to conservatism.

Kaplan’s own blackness doesn’t excuse her profound bigotry. It merely underscores her status as a “house black” at the Left-wing L.A. Times, where she spouts the party line in the hope of keeping blacks “in line” — that is, voting for Democrats in order to perpetuate the regulatory-welfare state that has done so much, for so long, to undermine black families and stifle the initiative of young blacks.
__________
* Free registration required. Try latimes@fastchevy.com as the username and “password” as the password.

The Cultural Divide

In Chicago, “Diversity lacking in crowds at large museums.” And blah, blah, blah. The museums — operating in their liberal-guilt mode — are shouldering the blame for low attendance by minorities:

At the Museum of Science and Industry, officials already have a name for the phenomenon — “the Glenview effect,” after the largely white suburb that represents its highest single ZIP code attendance, said Valerie Waller, the museum’s vice president of marketing.

“We see it on the floor — our audience is not as diverse as what we see in the city of Chicago or the surrounding area,” said Waller.

Speaking Wednesday at the Cultural Center, where the study was unveiled, Waller said, “The number of people not engaged in our institutions, with all the variety of programming and opportunities we have for them, is shocking.”

Waller wondered if minorities and the poor aren’t aware of the institutions or not interested. “Is price a factor? Many of our institutions were free 15 years ago,” she said. “Is it the hours we’re open? [Are people] overscheduled with soccer practices and everything else?”

The real culprit — which dare not speak its name — is the bias within minority cultures against “acting white.”

(Thanks to Tongue Tied for the pointer to the article about attendance at Chicago’s museums.)

K-K-Katrina

UPDATED TWICE

It’s the hurricane that won’t go away. Now we are being told, in so many words, that — with a hurricane bearing down on a New Orleans that was “protected” by levees that were built for failure (many years earlier) because of political graft and bureaucratic ineptness, and with feckless State and local government officials cluttering up the scene* — President Bush was either supposed to divert the hurricane (perhaps through prayer) or leap from his chair, fly to NO and put his finger in the dike, so to speak. (Is it still all right to say “dike”?)

For a sensible view of the hurricane that won’t quit, read this post by Capital Freedom. UPDATE: For more, read this post at Wizbang. SECOND UPDATE: See also the Popular Mechanics article “Now What? The Lessons of Katrina.”

Then go here:

Katrina’s Aftermath: Who’s to Blame? (09/01/05)
“The Private Sector Isn’t Perfect” (09/02/05)
A Modest Proposal for Disaster Preparedness (09/07/05)
No Mention of Opportunity Costs (09/08/05)
Whose Incompetence Do You Trust? (09/10/05)
An Open Letter to Michael Moore (09/13/05)
Enough of Amateur Critics (09/13/05)
__________
* E.g., New Video Shows Blanco Saying Levees Safe (AP, via Yahoo! News)

Time on the Cross, Re-revisited

There has been endless debate as to whether or not the American Civil War was fought over slavery. My own view is that the Civil War was about slavery, in a roundabout way:

  • The mainly agrarian South wanted low tariffs on manufactured goods because high tariffs meant that Southerners had to pay higher prices for manufactured goods. The North wanted high tariffs to protect its new manufacturing industries.
  • Slave labor was fundamental to Southern agrarianism. Abolition was largely a Northern phenomenon.
  • Anti-Northern feelings among Southern elites had been running high for decades. With the rise of the Republican Party, Southerners faced not only the continued prospect of Northern economic dominance but also the prospect that slavery would be abolished. In sum, the election of Abraham Lincoln posed an imminent threat to the Southern “way of life.”
  • War then was inevitable, given the South’s aversion to the North’s economic and abolitionist agenda, on the one hand, and Lincoln’s determination to preserve the Union, on the other hand.

The North’s victory in the Civil War meant an end to slavery in the United States, even though ending slavery was, in Lincoln’s view, secondary to preserving the Union. According to one account of a failed peace parley in January 1865 — an account that is somewhat disingenuous about the South’s interest in preserving slavery — Lincoln

stated that it was never his intention to interfere with slavery in the states where it already existed and he would not have done so during the war, except that it became a military necessity. He had always been in favor of prohibiting the extension of slavery into the territories but never thought immediate emancipation in the states where it already existed was practical. He thought there would be “many evils attending” the immediate ending of slavery in those states.

Be that as it may, the government of the United States did take advantage of the Civil War to eradicate slavery, first partially through the Emancipation Proclamation, then fully through the Thirteenth, Fourteenth, and Fifteenth Amendments to the Constitution.

Slavery’s demise, as a byproduct of the Civil War, raises two questions:

  • Would slavery have been ended peacefully?
  • Is slavery an indelible stain on American history?

Would Slavery Have Been Ended Peacefully?

There are those who argue that if the North fought the Civil War over slavery, it fought an unnecessary war, because economic forces would eventually have put an end to slavery. There are others who argue that slavery would not have succumbed to economic forces. Crucial to the debate between the two camps is the validity (or invalidity) of Robert Fogel and Stanley Engerman’s cliometric study, Time on the Cross: The Economics of American Negro Slavery (1974), which makes a case that slavery would not have succumbed to economic forces. Fogel and Engerman’s study, however, is fraught with errors. Thomas J. DiLorenzo explains some of those errors:

. . . Fogel and Engerman’s . . . reliance on . . . the price of slaves . . . as “evidence” that slavery could not have been ended peacefully is poor economics. . . . For one thing, the Fugitive Slave Act socialized the enforcement costs of slavery, thereby artificially inflating slave prices. Abolition of the Act, as would have been the reality had the Southern states been allowed to leave in peace, would have caused slave prices to plummet and quickened the institution’s demise. That, coupled with a serious effort to do what every nation on the face of the earth did to end slavery during the nineteenth century — compensated emancipation — could have ended slavery peacefully. Great Britain did it in just six years’ time, and Americans could have followed their lead. . . .

[T]he high price of slaves . . . in 1860 created strong incentives for Southern farmers to find substitutes in the form of free labor and mechanized agriculture. It also increased the expected profitability of mechanized agriculture, so that the producers of that equipment were motivated to develop and market it in the South. This is what happens in any industry where there are rapidly-rising prices of factors of production of any kind. As Mark Thornton wrote in “Slavery, Profitability, and the Market Process” (Review of Austrian Economics, vol. 7, No. 2, 1994), by 1860 “slavery was fleeing from both the competition of free labor and urbanization towards the isolated virgin lands of the Southwest.” Gunderson does not cite any literature past 1974 on this point, so he is probably unaware of such facts.

[T]here is a difference between slave labor being “efficient” for the slave owner and its effect on society as a whole. Of course slavery was profitable to slave owners. This government-supported system helped them confiscate the fruits of the slaves’ labor. But since slave labor is inherently less efficient than free labor, and since so many resources had to be devoted to enforcing the system — most of which were the result of government interventions such as the Fugitive Slave Act, mandatory slave patrol laws, and laws that prohibited manumission — the system imposed huge burdens (“dead weight loss,” in the language of economics) on the rest of society. Free laborers and non-slave owners in the South (at least 80 percent of the adult population) were the primary victims of these government-imposed costs, and would have been a natural political constituency for their eventual abolition. As Hummel concluded, “In real terms, the entire southern economy, including both whites and blacks, was less prosperous” overall because of slavery.

There was net internal migration from South to North, confirming the fact that free laborers in the South were also indirectly exploited by the slave system which forced them into lower-paying jobs. . . .

DiLorenzo — an anarcho-libertarian who despises Abraham Lincoln and is rabidly pro-secession (column archive) — may strike you as a biased source, even though he seems to have facts and logic on his side, in this instance. But we need not rely on DiLorenzo. Fogel and Engerman’s thesis has been attacked, on its merits, from many quarters. Here, for example, are excerpts of a review essay by Thomas L. Haskell, “The True and Tragical History of ‘Time on the Cross’ ” (fee required), from The New York Review of Books (October 2, 1975):

The flaws of Time on the Cross are not confined to its parts but extend to its conceptual heart: the efficiency calculation. No finding raised more eyebrows than the dramatic claim that slaves, through their personal diligence and enthusiastic commitment to the work ethic, made southern agriculture 35 percent more efficient than the family farms of the North. My own nonspecialist’s doubts about this contention . . . have been amply confirmed (and superseded in expertise and weight of evidence) by the work of a half-dozen economic historians.

Fogel and Engerman should have known from the beginning that any comparison of regional efficiency in the antebellum period was fraught with breathtaking difficulties. The basis for their comparison, a rather controversial economist’s tool known as the “geometric index of total factor productivity,” gives results whose interpretation is debatable in even the most conventional applications. . . .

Since the index is based on market value it reflects not only the performance of producers (which is what we have in mind when we talk about productive efficiency) but also the behavior of consumers, whose eagerness for the product helps to determine its market value. Consumer behavior is clearly irrelevant to productive efficiency and the index is misleading to the extent that it is influenced by this factor.

In short, the index is sensitive to demand: if two producers organize their work in equally rational ways, work equally hard, and even produce equal amounts of physical output, the so-called “efficiency” index may nonetheless rank one producer more “efficient” than the other because his product is in greater demand. As David and Temin observe, this is not the accepted meaning of “efficiency.”

Given the sensitivity of the index to demand and the heavy demand for the South’s principal crop, cotton, the index by itself is utterly incapable of justifying the chief inference that Fogel and Engerman drew from it—that slaves must have been hard-working Horatio Alger types and their masters skilled scientific managers. Gavin Wright confirms that the efficiency gap has more to do with voracious consumer demand for cotton than with any Herculean feats of productivity by southern producers. . . .

The bias introduced by cotton demand is only the most obvious of the flaws in the efficiency calculation. Even apart from the inherent frailties of the index in this especially difficult application, Fogel and Engerman’s use of it rests on some extremely dubious assumptions. The choice of 1860 as a typical year for measurement has been sharply questioned. So has the authors’ proposition that an acre of northern farmland was on average 2.5 times better in quality than southern farmland. This extraordinary assumption alone is enough to guarantee a finding of southern superiority in productivity. . . .

Lance Davis of the California Institute of Technology, a prominent cliometrician, singled out the efficiency calculation as the least plausible argument of a generally unpersuasive book. He estimated that Fogel and Engerman’s chances of successfully defending the efficiency finding were about one in ten. This is a telling judgment from the man who introduced the term “New Economic History,” who once called Fogel’s railroad study a “great book,” and who even crowned Fogel himself as “the best” of the cliometricians nine years ago. The efficiency calculation has been closely scrutinized not only by Davis, Wright, Temin, and Paul David, but also by Stanley Lebergott of Wesleyan, Harold Woodman of Purdue, Jay Mandle of Temple, and Frank B. Tipton, Jr. and Clarence E. Walker, both of Wesleyan. No one has a kind word to say for it.
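Haskell’s central objection — that an index based on market value confounds consumer demand with productive efficiency — can be illustrated with a toy calculation. (All numbers below are hypothetical; the weights are illustrative, not Fogel and Engerman’s.)

```python
def tfp_index(revenue, labor, capital, land, weights=(0.6, 0.2, 0.2)):
    """Geometric index of total factor productivity: output value
    divided by a geometrically weighted bundle of inputs."""
    wl, wk, wt = weights
    input_bundle = (labor ** wl) * (capital ** wk) * (land ** wt)
    return revenue / input_bundle

# Two farms: identical inputs and identical physical output (100 bushels),
# but farm B's crop is in heavy demand and fetches $1.35 per bushel
# instead of $1.00.
farm_a = tfp_index(revenue=100 * 1.00, labor=10, capital=5, land=20)
farm_b = tfp_index(revenue=100 * 1.35, labor=10, capital=5, land=20)
```

Two producers with identical inputs and identical physical output differ in measured “efficiency” by exactly the ratio of their product prices — which is Haskell’s point: the index rewards selling a crop that is in demand, not producing it more ably.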

Haskell certainly wasn’t offering an apology for slavery or for any other form of oppression. Nor am I. Slavery was evil, but it existed. The question facing our forebears was how best to eradicate it and then improve the lot of those who had been enslaved. With the advantage of hindsight a case can be made that America’s blacks would be better off today if their ancestors had been freed and integrated into society voluntarily — through economic forces if not social ones. But that is merely hindsight. Regardless of Lincoln’s motivation for prosecuting the Civil War, that war brought an end to slavery. And that — thankfully — is that.

Moreover, Lincoln-hater DiLorenzo gives us good reason to believe that slavery would have died hard in the South. DiLorenzo wants the best of both worlds. He wants to prove that the Civil War was not fought (by the North) because of slavery, and also to prove that the Civil War was fought (by the North) unnecessarily because economic forces would have put an (eventual) end to slavery. The second proposition is inconsistent with the first. DiLorenzo’s inconsistency arises because he is a pro-secessionist who also has the good grace to oppose slavery. He must therefore resort to alternative history in order to justify his secessionist views. His alternative history (sampled above) is that economic forces would have brought an end to slavery in the South, absent the Civil War. But would they have done so? Perhaps eventually, but not for an unconscionably long time.

Economic forces arise from human nature. One facet of human nature is a “taste” that manifests itself in the oppression of “inferior” races (e.g., blacks, Jews, Tutsis, Hutus). Such a “taste” can override “rational” (i.e., wealth-maximizing) forces. The post-Civil War history of race in the South suggests very strongly that slavery would have died hard in the South. Thomas Sowell examines a slice of that history:

The death of Rosa Parks has reminded us of her place in history, as the black woman whose refusal to give up her seat on a bus to a white man, in accordance with the Jim Crow laws of Alabama, became the spark that ignited the civil rights movement of the 1950s and 1960s.

Most people do not know the rest of the story, however. Why was there racially segregated seating on public transportation in the first place? “Racism” some will say — and there was certainly plenty of racism in the South, going back for centuries. But racially segregated seating on streetcars and buses in the South did not go back for centuries.

Far from existing from time immemorial, as many have assumed, racially segregated seating in public transportation began in the South in the late 19th and early 20th centuries.

Those who see government as the solution to social problems may be surprised to learn that it was government which created this problem. Many, if not most, municipal transit systems were privately owned in the 19th century and the private owners of these systems had no incentive to segregate the races.

These owners may have been racists themselves but they were in business to make a profit — and you don’t make a profit by alienating a lot of your customers. There was not enough market demand for Jim Crow seating on municipal transit to bring it about.

It was politics that segregated the races because the incentives of the political process are different from the incentives of the economic process. Both blacks and whites spent money to ride the buses but, after the disenfranchisement of black voters in the late 19th and early 20th century, only whites counted in the political process.

It was not necessary for an overwhelming majority of the white voters to demand racial segregation. If some did and the others didn’t care, that was sufficient politically, because what blacks wanted did not count politically after they lost the vote.

The incentives of the economic system and the incentives of the political system were not only different, they clashed. . . .

The “incentives of the political system” — a “taste” for racial oppression, in other words — dominated Southern politics until the 1960s. And that was in a defeated South. The determination of Southern political leaders to defend slavery in the first place, and then to salvage the remnants of slavery through Jim Crow, is strong evidence that economic forces might not have been allowed to operate freely in the South, at least not for a long time. The evil (take note, Mr. DiLorenzo) was to be found in Southern political leaders, not in the White House.

Opponents of slavery, unarmed as they were with “sophisticated” (and flawed) cliometric techniques, saw the evil in slavery and eradicated it when they had the opportunity to do so. Uncertain gradualism in the defense of liberty is no virtue. Opportunistic abolitionism in the defense of liberty is far from a vice.

The Stain of Slavery

The fact that slavery existed in the United States for so long is taken by some — especially those of the Left, here and abroad — as evidence that white-male-capitalist-dominated-America is evil incarnate. But slavery in the United States was ended when white, male capitalists still dominated America, whereas slavery still exists in non-white areas of the world.

Strident critics of the United States nevertheless persist in saying that the existence in the United States of slavery (or any other “evil,” real or imagined) means that the U.S. was and is no better than, say, the fascistic Third Reich. (Leftists don’t like to remind us about the longer-lived and equally fascistic USSR.) Such assertions studiously ignore the fact that most Americans always have been freer than the subjects of Hitler and Stalin. The economic forces that could eventually have brought an end to slavery in the United States would not have been allowed to operate in Nazi Germany or the Soviet Union — or in Communist China, Cuba, Saddam’s Iraq, North Korea, and other dictatorial regimes of the kind that Leftists often have defended and even idealized as “progressive” and even “freedom-loving.” Nor should it go without notice that Nazi Germany and the USSR met their demise at the hands of the “militaristic” United States.

It is supremely ironic that Leftists — who like to attack the United States as “fascistic” and “militaristic” — are proponents of government interventions in private affairs that are confiscatory and stultifying in their effects on economic output. All working persons in the United States — and all who depend on them — are in thrall to the “plantation owners” who run our affairs from the Capitol in Washington, the various State capitols, and sundry municipal buildings. The Left applauds that thralldom and agitates for its intensification.

Yes, the fact that slavery existed in the United States for so long is a stain on the history of the United States, but it is not an indelible stain. To err is human, which must come as news to the Left, with its penchant for judging its enemies (mainly conservative, white, American males) by superhuman standards of conduct, while seeking to impose its utopian social and economic order through the power of the state. The Left’s cynicism stands in stark contrast to the vision of the Framers, who sought “a more perfect Union” by enabling the free exchange of ideas and goods.

Remedial Vocabulary Training

David Bernstein, writing at TCS Daily a few years ago, recounted tales from the department of politically correct speech. This one struck close to home:

One especially merit-less [hostile work environment] claim that led to a six-figure verdict involved Allen Fruge, a white Department of Energy employee based in Texas. Fruge unwittingly spawned a harassment suit when he followed up a southeast Texas training session with a bit of self-deprecating humor. He sent several of his colleagues who had attended the session with him gag certificates anointing each of them as an honorary “Coon Ass” — usually spelled “coonass” — a mildly derogatory slang term for a Cajun. The certificate stated that “[y]ou are to sing, dance, and tell jokes and eat boudin, cracklins, gumbo, crawfish etouffe and just about anything else.” The joke stemmed from the fact that southeast Texas, the training session location, has a large Cajun population, including Fruge himself.

An African American recipient of the certificate, Sherry Reid, chief of the Nuclear and Fossil Branch of the DOE in Washington, D.C., apparently missed the joke and complained to her supervisors that Fruge had called her a “coon.” Fruge sent Reid a formal (and humble) letter of apology for the inadvertent offense, and explained what “Coon Ass” actually meant. Reid nevertheless remained convinced that “Coon Ass” was a racial pejorative, and demanded that Fruge be fired. DOE supervisors declined to fire Fruge, but they did send him to diversity training. They also reminded Reid that the certificate had been meant as a joke, that Fruge had meant no offense, that “Coon Ass” was slang for Cajun, and that Fruge sent the certificates to people of various races and ethnicities, so he clearly was not targeting African Americans. Reid nevertheless sued the DOE, claiming that she had been subjected to a racial epithet that had created a hostile environment, a situation made worse by the DOE’s failure to fire Fruge.

Reid’s case was seemingly frivolous. The linguistics expert her attorney hired was unable to present evidence that “Coon Ass” meant anything but Cajun, or that the phrase had racist origins, and Reid presented no evidence that Fruge had any discriminatory intent when he sent the certificate to her. Moreover, even if “Coon Ass” had been a racial epithet, a single instance of being given a joke certificate, even one containing a racial epithet, by a non-supervisory colleague who works 1,200 miles away does not seem to remotely satisfy the legal requirement that harassment must be severe and pervasive for it to create hostile environment liability. Nevertheless, a federal district court allowed the case to go to trial, and the jury awarded Reid $120,000, plus another $100,000 in attorneys’ fees. The DOE settled the case before its appeal could be heard for a sum very close to the jury award.

In a meeting with a group of employees, in which I discussed our company’s budget, I used the word “niggardly” (meaning stingy or penny-pinching). The next day a fellow VP informed me that some of the black employees of her division had been offended by my use of the word “niggardly.” My reaction was to suggest that she give her employees remedial training in English vocabulary. That should have been the verdict in the Reid case.

Workplace Whiners

I was thinking earlier today about the prevalence of whining in the workplace. Then I came across this, from Suits in the Workplace: An Employment Law Blog:

In this age of easily hurt feelings and heightened sensitivity to just about everything, it’s nice to see a common sense decision in an employment case. The Tenth Circuit just ruled – brace yourself – that a supervisor who “set goals and deadlines for an ongoing project, requested that [an employee] 1) keep track of her daily activities in fifteen-minute intervals for seven days, 2) work in her cubicle so [the supervisor] could more closely supervise her, and 3) inform [the supervisor] of dates she would be out of the office” did not constructively discharge the employee she was managing. Turnwall v. Trust Co. of America, No. 04-1303 (10th Cir. 2005). . . .

Thankfully, the Tenth Circuit held that the working conditions weren’t objectively intolerable, and that there was no outrageous conduct, so they got it right. But what does this lawsuit say about the average supervisor’s ability to manage an employee who admittedly had problems prioritizing her work? Goal setting and regular monitoring of progress are textbook management techniques, and were perfectly appropriate under these circumstances. Nevertheless, the employer here had to defend a federal lawsuit, and a subsequent appeal, at no small cost, essentially because somebody couldn’t handle criticism from a supervisor.

We are blessed with excellent Federal Judges here in the Eastern District of Virginia. . . .

Yes, you are blessed. I speak from experience. The experience of putting up with workplace whiners like Ms. Turnwall, and the experience of having been backed up by the Eastern District of Virginia whenever one of those whiners went to court.

Metaphor du Jour

Groucho Marx is supposed to have said “I wouldn’t join any club that would have me as a member” — or something in that vein. In other words, when an exclusive club loses its exclusivity, membership in the club becomes less valuable, both to the members who joined it when it was still exclusive and to prospective members (if they are as astute as Groucho Marx).

But there’s more to it than that. Suppose that the exclusive club has stringent standards of conduct, which aren’t always observed but which most of its members strive to honor. Suppose that by changing its rules of admission — by admitting Groucho Marx, for instance — the club also seems to signal that it has lowered its standards of conduct. What is likely to happen as a result? At the margin, even some of those members who had joined the club when it was exclusive will adapt to the lower standards of conduct. Moreover, many persons who would have sought membership in the club when it was exclusive will simply decline to join it, with the result that, at the margin, some of them will not rise to the standards of conduct to which they would have risen had they joined the club.

Human beings respond to social norms in ways that might seem “irrational” to those who think that humans are nothing but wealth-maximizing automata. Humans are much more complex than that, however, which is why it’s important to have exclusive “clubs” with high standards of conduct. If abstractions such as “honor” and “duty” were meaningless, soldiers wouldn’t join a “club” whose unwritten rules sometimes require them to throw themselves onto hand grenades; firemen wouldn’t join a “club” whose unwritten rules sometimes require them to risk almost-certain death on the slim odds of rescuing a person from an inferno.

There is a lesson in this for those who are ineligible for certain exclusive clubs. They should — in the interest of society’s well-being — form their own exclusive clubs instead of trying to force their way into those clubs that already exist. They can call their clubs whatever they wish — and they can set very high standards for membership in those clubs — but they should not devalue the clubs that already exist by trying to change the rules of admission to those clubs.

Schelling and Segregation

Tyler Cowen of Marginal Revolution, who was mentored by Thomas Schelling at Harvard, praises Schelling’s Nobel prize by noting, among other things, Schelling’s analysis of the economics of segregation:

Tom showed how communities can end up segregated even when no single individual cares to live in a segregated neighborhood. Under the right conditions, it only need be the case that the person does not want to live as a minority in the neighborhood, and will move to a neighborhood where the family can be in the majority. Try playing this game with white and black chess pieces, I bet you will get to segregation pretty quickly.
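The chess-piece exercise Cowen describes is Schelling’s checkerboard model, and it takes only a few lines to simulate. The sketch below is purely illustrative; the grid size, vacancy rate, and 50-percent “like neighbor” threshold are arbitrary choices of mine, not parameters drawn from Schelling’s work:

```python
import random

def simulate_schelling(width=20, height=20, empty_frac=0.1,
                       threshold=0.5, max_rounds=200, seed=0):
    """Minimal Schelling checkerboard sketch: two groups ('W', 'B') on a
    wraparound grid. Each round, every agent with fewer than `threshold`
    like neighbors relocates to a random empty cell. Returns the final
    mean fraction of like neighbors, a crude segregation measure."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(width) for y in range(height)]
    n_empty = int(len(cells) * empty_frac)
    agents = ['W', 'B'] * ((len(cells) - n_empty) // 2)
    agents += [None] * (len(cells) - len(agents))  # None = vacant cell
    rng.shuffle(agents)
    grid = dict(zip(cells, agents))

    def neighbors(x, y):
        # Eight surrounding cells, wrapping at the edges (a torus).
        return [grid[(x + dx) % width, (y + dy) % height]
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]

    def like_frac(x, y):
        group = grid[(x, y)]
        occupied = [n for n in neighbors(x, y) if n is not None]
        if not occupied:
            return 1.0  # no neighbors at all counts as content
        return sum(n == group for n in occupied) / len(occupied)

    for _ in range(max_rounds):
        unhappy = [c for c in cells
                   if grid[c] is not None and like_frac(*c) < threshold]
        if not unhappy:
            break
        empties = [c for c in cells if grid[c] is None]
        for c in unhappy:
            dest = rng.choice(empties)
            grid[dest], grid[c] = grid[c], None
            empties.remove(dest)
            empties.append(c)  # the vacated cell becomes available

    occupied = [c for c in cells if grid[c] is not None]
    return sum(like_frac(*c) for c in occupied) / len(occupied)
```

Starting from a random mixture — in which roughly half of each agent’s neighbors belong to its own group — the like-neighbor fraction typically climbs well above that level within a few rounds, which is Schelling’s “unintended segregation” result in miniature.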

True, but trivial. For, like many game-theoretic tricks, Schelling’s segregation gambit omits much important detail.

To begin with, blacks are not culturally homogeneous. Thomas Sowell argues, rather persuasively to this native of the North, that

[t]here have always been large disparities, even within the native black population of the U.S. Those blacks whose ancestors were “free persons of color” in 1850 have fared far better in income, occupation, and family stability than those blacks whose ancestors were freed in the next decade by Abraham Lincoln. . . .

The redneck culture [prevalent in the South] proved to be a major handicap for both whites and blacks who absorbed it. Today, the last remnants of that culture can still be found in the worst of the black ghettos, whether in the North or the South, for the ghettos of the North were settled by blacks from the South. The counterproductive and self-destructive culture of black rednecks in today’s ghettos is regarded by many as the only “authentic” black culture–and, for that reason, something not to be tampered with. Their talk, their attitudes, and their behavior are regarded as sacrosanct.

The people who take this view may think of themselves as friends of blacks. But they are the kinds of friends who can do more harm than enemies.

As Sowell explains more fully in his essay “Black Rednecks and White Liberals” (from the eponymous book), Northerners were rather accepting of the blacks in their midst until the great migrations of Southern blacks to the North from the 1930s onward. Then whites began to flee the neighborhoods into which Southern blacks were moving. The “old line” blacks sought to do the same, but they had less success than whites because the “old line” blacks became identified with the uncouth intruders from the South.

It is therefore meaningless to treat segregation as a game in which all whites are willing to live with black neighbors as long as they (the whites) aren’t in the minority. Most whites (including most liberals) do not want to live anywhere near any black rednecks if they can help it. Living in relatively safe, quiet, and attractive surroundings comes far ahead of whatever value there might be in “diversity.”

“Diversity” for its own sake is nevertheless a “good thing” in the liberal lexicon. The Houston Chronicle notes Schelling’s Nobel by saying that Schelling’s work

helps explain why housing segregation continues to be a problem, even in areas where residents say they have no extreme prejudice to another group.

Segregation isn’t a “problem,” it’s the solution to a potential problem. Segregation today is mainly a social phenomenon, not a legal one. It reflects a rational aversion on the part of whites to having neighbors whose culture breeds crime and other types of undesirable behavior.

As for what people say about their racial attitudes: Believe what they do, not what they say. Most well-to-do liberals choose to segregate themselves and their children from black rednecks. That kind of voluntary segregation, aside from demonstrating liberal hypocrisy about black redneck culture, also demonstrates the rationality of choosing to live in safer and more decorous surroundings.

Nor is segregation confined to cities. It has spread to the suburbs, as well, because black redneck culture has — too commonly — followed blacks there.

Related posts: Affirmative Action and Race (a collection of links)

Thoughts That Liberals Should Be Thinking

If women are the same as men, except for certain anatomical features, there’s no reason to favor women candidates for office because they might possess more “compassion.”

If a woman’s place is outside the home, whose place is inside the home, where children need the kind of moral education best given by a parent?

Excluding the inculcation of immoral socialistic ideals, public schools cannot venture very far into moral guidance without offending someone’s “sensibilities.” Nor can public schools enforce moral guidance with a quick swat.

And what’s wrong with a quick swat as a way of imprinting a moral lesson? It’s a lot more effective than “Joshua, I’m telling you for the last (100th) time not to do that.”

If the black poor are poor because they’ve been “kept down” by discrimination, then affirmative action isn’t of much use to them, except as a way of victimizing whites. (Moral lesson: Two wrongs don’t make a right.)

If the black poor are poor in spite of generations of welfare programs aimed at them, perhaps the problem is that such programs have created a form of dependency that destroys initiative.

If, as Thomas Sowell argues, “black” (redneck) culture is largely responsible for both the perpetuation of black poverty and racial prejudice, doesn’t that make a strong case for “acting white” instead of clinging to a culture that isn’t even authentically “black”?

And where’s the “compassion” for poor, inner-city blacks when they cannot obtain a better education through school vouchers because of resistance to vouchers by their own “educators” and white “liberals” in adjacent suburbs?

If it’s good to have racial and ethnic diversity in housing and jobs, why isn’t it good to have intellectual diversity (a.k.a. free speech) on campuses?

Homosexuality has driven many Catholic priests to molest boys. The Church wants to protect boys by banning the ordination of homosexuals. Liberals must choose between their reflexive defense of homosexuals and their purported desire to protect children.

That purported desire should also cause them to rethink where they want mothers to spend their days and where they want children to go to school.

An Open Letter to Michael Moore

Hey Mikey,

I understand that you’ve written an open letter to all who voted for George W. Bush in 2004. Something about how Katrina is all Bush’s fault — from start to finish. Well, I guess you’d know about such things, if anyone does. After all, your resume is quite impressive. Among other things,

  • You’ve told the CEO of General Motors how to run his vast company, which is a tad bit more difficult than making movies.
  • You’ve revealed the widespread suppression of dissent in the country, which obviously has prevented you from making millions of dollars from your movies.
  • You’ve explained how America’s bad karma — which is so evident in the outpouring of donations and aid in the aftermath of Katrina — has driven a few dozen high-school students to kill some of their fellow students.
  • Although you haven’t explained how fundamentalist Islam’s bad karma drove 19 young men to kill 3,000 Americans on a sunny morning in September, you have found a way to put the blame on the Bush family.

So, it’s obvious that you know a lot about how the world works. In fact, you know so much that I’ve begun to wonder about your involvement in Katrina. Given your wealth, the combined wealth of your Lefty pals in Hollywood, and the immense wealth of Lefty sympathizers like George Soros, I think I know what happened.

You and your buddies didn’t cause Hurricane Katrina. I don’t think you’re up to that task, yet. But you knew it was developing and knew precisely where it was headed, long before the National Weather Service did. So, you got to Mayor Noggin and Governor Blank-o and made it worth their while to screw up the evacuation of New Orleans and surrounding areas. (Governor Barbour of Mississippi couldn’t be bought off, for obvious reasons, so you saved some bucks there.)

After the hurricane struck, and before everyone realized the full extent of the death and destruction it had caused, you got to CNN, NBC, ABC, CBS, and MSNBC and fed them the lie that Bush was responsible for the destruction of New Orleans because he piddled the money away in Iraq. (FoxNews couldn’t be bought off, either, but five out of six ain’t bad.) You also concocted the fable that poor blacks were disproportionately affected by Katrina because Bush doesn’t care about blacks. That’s all it took. Those stories had legs, man; now they’re gospel in most quarters. And your pet pollsters are having a field day spinning the results.

So, Mikey, I have to hand it to you. Your deeply felt empathy for the “common man” has served him well. I mean, what’s a few thousand deaths if that’s what it takes to help open Americans’ eyes to the evil that is Bush.

Of course, I’m sure you’ll be well served, too. I can envision the title of your next hit movie: Fahrenheit 212: Bush in Hot (Flood) Water.

Yours in paranoia forever,
LC

P.S. Are you still at the fat farm? It’s a shame you got so grossly overweight. But I know it wasn’t your fault, because you’re not one of the stupid white guys. I remember when a younger George Bush forced those Big Macs down your throat. You were hooked for life, and it’s all Bush’s fault.

P.P.S. I see that CNN has a story in which every level of government is taking heat for what happened in New Orleans. You know what that means, of course. The big government that you love so much — not the one that fights to defend your right to make a rather nice living, but the other one that thinks more money is always the answer, regardless of the question — that big government is going to get bigger.

That’s the American way, isn’t it Mikey? Put all responsibility on government, praise it when it’s in Democrat hands, blame it when it’s in Republican hands, and keep on spending, no matter how much it screws up. It sure beats giving individuals back their tax money, along with the responsibility for choosing safe places to live or protecting themselves when they decide to live in unsafe places. (Oh, I almost forgot about the poor, untaxed people who are poor mostly because they’ve never been weaned from the government tit or who can’t find jobs because taxation and regulation destroy jobs.)

Anyway, if you make people responsible for themselves they might do something stupid like getting grossly fat, as you did. But it wouldn’t be their fault, of course. So, as long as we’re going to have a federal czar for disaster-prevention-against-all-odds, instant-response-at-all-costs, and rebuilding-bigger-and-better-in-dangerous-places, we might as well have a federal czar for forcing-fat-boys-to-run-two-miles-a-day. How’s that strike you?

Something Snapped

A portion of the bio of a contributor to the Blogger News Network, in which she notes that “something snapped inside”:

My mother was a Civil Rights activist and a teacher. She passed away in 1998. My father was an Army intel op in the Second World War. He passed away in 1985. I have been writing since I was very young. I have been involved in politics, the civil rights movement, and the anti-war movement since I was a child. My mother founded the first integrated pre-school for black and white children in Roxbury, Massachusetts in 1941. That was 13 years before the official beginning of the Civil Rights movement in America. Every weekend for our coming up years, my mother brought us into Boston for rallies and teach-ins. My early life was filled with the speeches of Martin Luther King. I heard them live, and I read them over and over. His writing had a profound effect on me. Later in life I read about Mahatma Gandhi. I think he might be my vote for the greatest political and religious leader who ever lived. My mother would have told us stories about Mahatma Gandhi and the Salt Marches. When war was declared against Iraq in 2003 I had been living out of the United States for 10 years or more. I lived an idyllic life in Ireland, in a beautiful cottage, with a lovely boyfriend who was one of the greatest musicians in all of Ireland. I played fiddle badly, but I had a supremely happy life. When I heard George W. Bush’s State of the Union Address in 2003, when I heard him outline the “Axis of Evil,” and when I heard him boast that he had sanctioned the summary execution of 3,000 Afghani prisoners, something snapped inside. When war was declared on Iraq I reached a turning point. For years I had been contributing 20% of everything I earned through my painting and writing to Médecins Sans Frontières. For years I had enjoyed a life that few people could imagine. But it ended when war was declared on Iraq. I had many Iraqi friends, and because of the art and literature and antiquities in Iraq, I just could not countenance any war of aggression against that country.

Obviously something “snapped inside” her, but it had snapped long before she heard George Bush inveigh against the “axis of evil.” Listen lady, if you can’t distinguish between enemy states and their people (most of whom are not our enemies), you are too stupid to be taken seriously about anything. If you’re defending the “axis” states of North Korea, Iran, and pre-invasion Iraq, you have forfeited your right to judge anyone else’s morality. And if you simply think that war is inherently “bad” because “it just is” or because civilians sometimes get caught in the crossfire, then you dishonor your father’s memory.

With company like that (and several dozen other nutcases and “liberal” statists), it’s no wonder I recently resigned from BNN. Something snapped.

A Footnote . . .

. . . to the preceding post, in which I quote at length from a recent article by Charles Murray, co-author of The Bell Curve (1994). In the article, Murray reviews the evidence about race and IQ and concludes

that we know two facts beyond much doubt. First, the conventional environmental explanation of the black-white difference [in IQ] is inadequate. Poverty, bad schools, and racism, which seem such obvious culprits, do not explain it. Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of. Second, regardless of one’s reading of the competing arguments, we are left with an IQ difference that has, at best, narrowed by only a few points over the last century. I can find nothing in the history of this difference, or in what we have learned about its causes over the last ten years, to suggest that any faster change is in our future.

I want to emphasize this point:

Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of.

That’s entirely consistent with what has been said by Thomas Sowell (a noted black scholar of conservative-libertarian persuasion), both in his commentary on The Bell Curve and in his recent writings about race and culture. Here’s what Sowell said about The Bell Curve soon after its publication:

Whatever innate potential various groups may have, what they actually do will be done within some particular culture. That intractable reality cannot be circumvented by devising “culture-free” tests, for such tests would also be purpose-free in a world where there is no culture-free society.

Perhaps the strongest evidence against a genetic basis for intergroup differences in IQ is that the average level of mental test performance has changed very significantly for whole populations over time and, moreover, particular ethnic groups within the population have changed their relative positions during a period when there was very little intermarriage to change the genetic makeup of these groups.

While The Bell Curve cites the work of James R. Flynn, who found substantial increases in mental test performances from one generation to the next in a number of countries around the world, the authors seem not to acknowledge the devastating implications of that finding for the genetic theory of intergroup differences. . . .

Even before Professor Flynn’s studies, mental test results from American soldiers tested in World War II showed that their performances on these tests were higher than the performances of American soldiers in World War I by the equivalent of about 12 IQ points. Perhaps the most dramatic changes were those in the mental test performances of Jews in the United States. The results of World War I mental tests conducted among American soldiers born in Russia–the great majority of whom were Jews–showed such low scores as to cause Carl Brigham, creator of the Scholastic Aptitude Test, to declare that these results “disprove the popular belief that the Jew is highly intelligent.” Within a decade, however, Jews in the United States were scoring above the national average on mental tests, and the data in The Bell Curve indicate that they are now far above the national average in IQ.

. . . For Jews, it is clear that later tests showed radically different results–during an era when there was very little intermarriage to change the genetic makeup of American Jews.

My own research of twenty years ago showed that the IQs of both Italian-Americans and Polish-Americans also rose substantially over a period of decades. Unfortunately, there are many statistical problems with these particular data, growing out of the conditions under which they were collected. However, while my data could never be used to compare the IQs of Polish and Italian children, whose IQ scores came from different schools, nevertheless the close similarity of their general patterns of IQ scores rising over time seems indicative–especially since it follows the rising patterns found among Jews and among American soldiers in general between the two world wars, as well as rising IQ scores in other countries around the world. . . .

Herrnstein and Murray openly acknowledge such rises in IQ and christen them “the Flynn effect,” in honor of Professor Flynn, who discovered it. But they seem not to see how crucially it undermines the case for a genetic explanation of interracial IQ differences. They say:

The national averages have in fact changed by amounts that are comparable to the fifteen or so IQ points separating blacks and whites in America. To put it another way, on the average, whites today differ from whites, say, two generations ago as much as whites today differ from blacks today. Given their size and speed, the shifts in time necessarily have been due more to changes in the environment than to changes in the genes.

While this open presentation of evidence against the genetic basis of interracial IQ differences is admirable, the failure to draw the logical inference seems puzzling. Blacks today are just as racially different from whites of two generations ago as they are from whites today. Yet the data suggest that the number of questions that blacks answer correctly on IQ tests today is very similar to the number answered correctly by past generations of whites. If race A differs from race B in IQ, and two generations of race A differ from each other by the same amount, where is the logic in suggesting that the IQ differences are even partly racial? . . .

. . . When any factor differs as much from A1 to A2 as it does from A2 to B2, why should one conclude that this factor is due to the difference between A in general and B in general? That possibility is not precluded by the evidence, but neither does the evidence point in that direction. (2)

In footnote 2 Sowell concedes that “rising IQs over time do not refute the belief that races differ in IQ for genetic reasons, though it ought to at least raise a question about that belief.”

But let us continue with Sowell’s main theme, which is that persistent inter-racial differences in IQ can be attributed to persistent cultural differences. Writing recently in OpinionJournal, Sowell paraphrases his essay “Black Rednecks and White Liberals,” from the eponymous book. Here’s some of what he has to say:

There have always been large disparities, even within the native black population of the U.S. Those blacks whose ancestors were “free persons of color” in 1850 have fared far better in income, occupation, and family stability than those blacks whose ancestors were freed in the next decade by Abraham Lincoln.

What is not nearly as widely known is that there were also very large disparities within the white population of the pre-Civil War South and the white population of the Northern states. Although Southern whites were only about one-third of the white population of the U.S., an absolute majority of all the illiterate whites in the country were in the South.

The North had four times as many schools as the South, attended by more than four times as many students. Children in Massachusetts spent more than twice as many years in school as children in Virginia. Such disparities obviously produce other disparities. Northern newspapers had more than four times the circulation of Southern newspapers. Only 8% of the patents issued in 1851 went to Southerners. Even though agriculture was the principal economic activity of the antebellum South at the time, the vast majority of the patents for agricultural inventions went to Northerners. Even the cotton gin was invented by a Northerner.

Disparities between Southern whites and Northern whites extended across the board from rates of violence to rates of illegitimacy. American writers from both the antebellum South and the North commented on the great differences between the white people in the two regions. So did famed French visitor Alexis de Tocqueville.

None of these disparities can be attributed to either race or racism. Many contemporary observers attributed these differences to the existence of slavery in the South, as many in later times would likewise attribute both the difference between Northern and Southern whites, and between blacks and whites nationwide, to slavery. But slavery doesn’t stand up under scrutiny of historical facts any better than race or racism as explanations of North-South differences or black-white differences. The people who settled in the South came from different regions of Britain than the people who settled in the North–and they differed as radically on the other side of the Atlantic as they did here–that is, before they had ever seen a black slave.

Slavery also cannot explain the difference between American blacks and West Indian blacks living in the United States because the ancestors of both were enslaved. When race, racism, and slavery all fail the empirical test, what is left?

Culture is left.

The culture of the people who were called “rednecks” and “crackers” before they ever got on the boats to cross the Atlantic was a culture that produced far lower levels of intellectual and economic achievement, as well as far higher levels of violence and sexual promiscuity. That culture had its own way of talking, not only in the pronunciation of particular words but also in a loud, dramatic style of oratory with vivid imagery, repetitive phrases and repetitive cadences.

Although that style originated on the other side of the Atlantic in centuries past, it became for generations the style of both religious oratory and political oratory among Southern whites and among Southern blacks–not only in the South but in the Northern ghettos in which Southern blacks settled. It was a style used by Southern white politicians in the era of Jim Crow and later by black civil rights leaders fighting Jim Crow. Martin Luther King’s famous speech at the Lincoln Memorial in 1963 was a classic example of that style.

While a third of the white population of the U.S. lived within the redneck culture, more than 90% of the black population did. Although that culture eroded away over the generations, it did so at different rates in different places and among different people. It eroded away much faster in Britain than in the U.S. and somewhat faster among Southern whites than among Southern blacks, who had fewer opportunities for education or for the rewards that came with escape from that counterproductive culture.

Nevertheless the process took a long time. As late as the First World War, white soldiers from Georgia, Arkansas, Kentucky and Mississippi scored lower on mental tests than black soldiers from Ohio, Illinois, New York and Pennsylvania. Again, neither race nor racism can explain that–and neither can slavery.

The redneck culture proved to be a major handicap for both whites and blacks who absorbed it. Today, the last remnants of that culture can still be found in the worst of the black ghettos, whether in the North or the South, for the ghettos of the North were settled by blacks from the South. The counterproductive and self-destructive culture of black rednecks in today’s ghettos is regarded by many as the only “authentic” black culture–and, for that reason, something not to be tampered with. Their talk, their attitudes, and their behavior are regarded as sacrosanct.

The people who take this view may think of themselves as friends of blacks. But they are the kinds of friends who can do more harm than enemies.

If East Asians and Ashkenazic Jews could rise to the top of the IQ charts, as they have, why can’t blacks rise too? Sowell would answer that they could rise, if only they would break the bonds of the “black redneck” culture, which hinders so many of them. The law cannot break those bonds, for, as Sowell argues, the law only reinforces them by making blacks dependent on affirmative action, welfare programs, and other “white liberal” contrivances.

If culture is the enemy of black advancement, the only way blacks can advance is to abandon the culture that many of them have transported from inner cities to suburbia, where they encounter white culture in many places, including public schools. There, the cultural divide becomes obvious in the phenomenon known as “acting white,” the subject of an article by Harvard economist Roland G. Fryer Jr. and graduate student Paul Torelli, “An Empirical Analysis of ‘Acting White’.” The Washington Post‘s Richard Morin summarizes:

As commonly understood, acting white is a pejorative term used to describe black students who engage in behaviors viewed as characteristic of whites, such as making good grades, reading books or having an interest in the fine arts.

The phenomenon is one reason some social thinkers give to help explain at least a portion of the persistent black-white achievement gap in school and in later life. Popularity-conscious young blacks, afraid of being seen as acting white, steer clear of behaviors that could pay dividends in the future, including doing well in school. . . .

No one can change such attitudes but blacks themselves.

If “black redneck” culture is the cause of the inter-racial gap in IQ, and if blacks choose to perpetuate the “black redneck” culture, then the perpetuation of the IQ gap might as well be genetic. For, it will be the result of blacks’ self-imposed servitude to the forces of ignorance.

Recommended reading: Race and Intelligence (a Wikipedia article with many links to sources and opposing views)

Related posts: Affirmative Action and Race (a collection of links)

After the Bell Curve

Charles Murray, writing in Commentary, reviews what has been learned about gender, race, and IQ since the publication of his (and the late Richard Herrnstein’s) The Bell Curve eleven years ago. Why is he writing now?

The Lawrence Summers affair last January made me rethink my silence. The president of Harvard University offered a few mild, speculative, off-the-record remarks about innate differences between men and women in their aptitude for high-level science and mathematics, and was treated by Harvard’s faculty as if he were a crank. The typical news story portrayed the idea of innate sex differences as a renegade position that reputable scholars rejected.

It was depressingly familiar. In the autumn of 1994, I had watched with dismay as The Bell Curve’s scientifically unremarkable statements about black IQ were successfully labeled as racist pseudoscience. At the opening of 2005, I watched as some scientifically unremarkable statements about male-female differences were successfully labeled as sexist pseudoscience.

His target:

[S]pecific [social] policies based on premises that conflict with scientific truths about human beings tend not to work. Often they do harm.

One such premise is that the distribution of innate abilities and propensities is the same across different groups. . . . The assumption of no innate differences among groups suffuses American social policy. That assumption is wrong.

When the outcomes that these policies are supposed to produce fail to occur, with one group falling short, the fault for the discrepancy has been assigned to society. It continues to be assumed that better programs, better regulations, or the right court decisions can make the differences go away. That assumption is also wrong.

About gender:

[F]or reasons embedded in the biochemistry and neurophysiology of being female, many women with the cognitive skills for achievement at the highest level also have something else they want to do in life: have a baby. In the arts and sciences, forty is the mean age at which peak accomplishment occurs, preceded by years of intense effort mastering the discipline in question. These are precisely the years during which most women must bear children if they are to bear them at all. . . .

[W]omen with careers were four-and-a-half times more likely than men to say they preferred to work fewer than 40 hours per week. The men placed greater importance on “being successful in my line of work” and “inventing or creating something that will have an impact,” while the women found greater value in “having strong friendships,” “living close to parents and relatives,” and “having a meaningful spiritual life.” As the authors concluded, “these men and women appear to have constructed satisfying and meaningful lives that took somewhat different forms.” The different forms, which directly influence the likelihood that men will dominate at the extreme levels of achievement, are consistent with a constellation of differences between men and women that have biological roots.

I have omitted perhaps the most obvious reason why men and women differ at the highest levels of accomplishment: men take more risks, are more competitive, and are more aggressive than women. The word “testosterone” may come to mind, and appropriately. Much technical literature documents the hormonal basis of personality differences that bear on sex differences in extreme and venturesome effort, and hence in extremes of accomplishment—and that bear as well on the male propensity to produce an overwhelming proportion of the world’s crime and approximately 100 percent of its wars. But this is just one more of the ways in which science is demonstrating that men and women are really and truly different, a fact so obvious that only intellectuals could ever have thought otherwise.

As for race, Murray reviews the evidence at length and concludes

that we know two facts beyond much doubt. First, the conventional environmental explanation of the black-white difference [in IQ] is inadequate. Poverty, bad schools, and racism, which seem such obvious culprits, do not explain it. Insofar as the environment is the cause, it is not the sort of environment we know how to change, and we have tried every practical remedy that anyone has been able to think of. Second, regardless of one’s reading of the competing arguments, we are left with an IQ difference that has, at best, narrowed by only a few points over the last century. I can find nothing in the history of this difference, or in what we have learned about its causes over the last ten years, to suggest that any faster change is in our future.

The implications:

Elites throughout the West are living a lie, basing the futures of their societies on the assumption that all groups of people are equal in all respects. Lie is a strong word, but justified. It is a lie because so many elite politicians who profess to believe it in public do not believe it in private. It is a lie because so many elite scholars choose to ignore what is already known and choose not to inquire into what they suspect. We enable ourselves to continue to live the lie by establishing a taboo against discussion of group differences. . . .

The taboo arises from an admirable idealism about human equality. If it did no harm, or if the harm it did were minor, there would be no need to write about it. But taboos have consequences. . . .

How much damage has the taboo done to the education of children? Christina Hoff Sommers has argued that willed blindness to the different developmental patterns of boys and girls has led many educators to see boys as aberrational and girls as the norm, with pervasive damage to the way our elementary and secondary schools are run. . . .

How much damage has the taboo done to our understanding of America’s social problems? The part played by sexism in creating the ratio of males to females on mathematics faculties is not the ratio we observe but what remains after adjustment for male-female differences in high-end mathematical ability. The part played by racism in creating different outcomes in black and white poverty, crime, and illegitimacy is not the raw disparity we observe but what remains after controlling for group characteristics. . . .

Even to begin listing the topics that could be enriched by an inquiry into the nature of group differences is to reveal how stifled today’s conversation is. Besides liberating that conversation, an open and undefensive discussion would puncture the irrational fear of the male-female and black-white differences I have surveyed here. We would be free to talk about other sexual and racial differences as well, many of which favor women and blacks, and none of which is large enough to frighten anyone who looks at them dispassionately. . . .

The law should not prevent individuals from doing their best. Reverse discrimination — which is the law — pushes some people toward pursuits for which they are not best suited and it pushes other people away from pursuits for which they are best suited. In sum, reverse discrimination prevents individuals from doing their best. That’s bad social policy. But we mustn’t talk about it.

Related posts:

Affirmative Action and Race (a collection of links)
I Missed This One (08/12/04)
A Century of Progress? (01/30/05)
Feminist Balderdash (02/19/05)

Guilty Until Proven Innocent

Excerpt of an e-mail from the law firm of McGuireWoods (“No Good Deed Goes Unpunished? Seventh Circuit Rules That No Adverse ‘Employment’ Action is Necessary to Sustain Title VII Retaliation Claims”):

Executive Secretary Chrissy Washington worked for the Illinois Department of Revenue on a flexible schedule from 7 a.m. to 3 p.m., instead of the standard 9-5 schedule, allowing her to care for her son with Down Syndrome. When some of her duties were reassigned to others, she filed charges with state and federal agencies alleging race discrimination. Subsequently, a senior manager required that she work from 9 to 5, and when she refused, her position was abolished. Washington was assigned to another Executive Secretary post with a different supervisor and was required to apply anew for a flextime schedule, which was refused. Washington maintained that it was her prior discrimination charge that led supervisors to rescind the flextime schedule on which her son depended. . . .

. . . [The Seventh Circuit Court of Appeals] concluded (with a highly entertaining reference to the comic strip Dilbert) that where an employer retaliates for protected activity by exploiting an employee’s known vulnerability, such as Washington’s reliance on flextime to care for her disabled son, the action can be a material change sufficient to sustain a retaliation claim under Title VII [of the Civil Rights Act of 1964]. The standard for materiality, the court noted, is whether the employer’s action has the “potential” to dissuade an employee (and, by logical extension, other employees) from pursuing her rights under Title VII.

Although this opinion does not reflect a uniform view among the jurisdictions on the ultimate issue, it should serve to alert employers to some of the potential problems that can arise from the implementation of flextime schedules and other employee-friendly initiatives. The court clearly says that once these admittedly optional benefits are in place for an employee, their removal can serve as a basis for retaliation claims.

Lesson 1: A benefit, once bestowed, can become an entitlement.

Lesson 2: An employee who has filed an Equal Employment Opportunity (EEO) claim against an employer may become immune to otherwise defensible business decisions by that employer.

As my HR director used to say whenever a disgruntled employee or former employee filed an EEO claim: “We (the company) are guilty until proven innocent.” Because that’s how the EEO racket works.

Judge Roberts and Women

Oh, the hue and cry about Judge John Roberts’s writings of 20-plus years ago. In one instance,

he said that a controversial legal theory then in vogue — of directing employers to pay women the same as men for jobs of “comparable worth” — was “staggeringly pernicious” and “anti-capitalist.”

Well, he was right then, and he would be just as right today if he were to say the same thing. There is no such thing as “comparable worth,” a doctrine that would substitute someone’s subjective judgment about the “value” of work for the objective judgment of the market about the value of work.

In another instance,

Linda Chavez, then the White House’s director of public liaison . . . had proposed entering her deputy, Linda Arey, in a contest sponsored by the Clairol shampoo company to honor women who had changed their lives after age 30. Arey had been a schoolteacher who decided to change careers and went to law school.

In a July 31, 1985, memo, Roberts noted that, as an assistant dean at the University of Richmond law school before she joined the Reagan administration, Arey had “encouraged many former homemakers to enter law school and become lawyers.” Roberts said in his memo that he saw no legal objection to her taking part in the Clairol contest. Then he added a personal aside: “Some might question whether encouraging homemakers to become lawyers contributes to the common good, but I suppose that is for the judges to decide.”

That’s certainly not dogmatic opposition to the idea of married women working outside the home, though the likes of Ted Kennedy and NOW (strange bedfellows, indeed) will portray it in that light.

Politically incorrect as it may be to say that encouraging homemakers to work outside the home may not be for the common good, there is reason to think that Roberts was right when he said as much. As I wrote here:

Because estimates of GDP don’t capture the value of child-rearing and other aspects of “household production” by stay-at-home mothers, the best way to put 1900 and 2000 on the same footing is to estimate GDP for 2000 at the labor-force participation rates of 1900. The picture then looks quite different: real GDP per capita of $4,300 in 1900, real GDP per capita of $25,300 in 2000 (a reduction of 28 percent), and an annualized growth rate of 1.8 percent, rather than 2.1 percent.

The adjusted rate of growth in GDP per capita still overstates the expansion of prosperity in the twentieth century because it includes government spending, which is demonstrably counterproductive. A further adjustment for the cost of government — which grew at an annualized rate of 7.5 [percent] during the century (excluding social transfer payments) — yields these estimates: real GDP per capita of $3,900 in 1900, real GDP per capita of $19,800 in 2000, . . . an annualized growth rate of 1.6 percent. (In Part V of “Practical Libertarianism for Americans,” I will [did] estimate how much greater growth we would have enjoyed in the absence of government intervention.)

The twentieth century was a time of great material progress. And we know that there would have been significantly greater progress had the hand of government not been laid so heavily on the economy. But what we don’t know is the immeasurable price we have paid — and will pay — for the exodus of mothers from the home. We can only name that price: greater incivility, mistrust, fear, property loss, injury, and death.
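The annualized growth rates quoted in the excerpt above follow from ordinary compound-growth arithmetic. A minimal sketch (using the post’s own 1900 and 2000 estimates of real GDP per capita; the function name is my own):

```python
# Check the annualized growth rates quoted in the excerpt.
# Figures are the post's estimates of real GDP per capita in 1900 and 2000.

def annualized_growth(start, end, years):
    """Compound annual growth rate implied by a start value, end value, and span."""
    return (end / start) ** (1 / years) - 1

# Adjusted for 1900 labor-force participation rates:
rate_adjusted = annualized_growth(4_300, 25_300, 100)
# Further adjusted for the cost of government:
rate_net = annualized_growth(3_900, 19_800, 100)

print(f"{rate_adjusted:.1%}")  # ~1.8%, as quoted
print(f"{rate_net:.1%}")       # ~1.6%, as quoted
```

Both figures round to the rates given in the excerpt (1.8 percent and 1.6 percent).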

Most “liberal” programs have unintended negative consequences. The “liberal” effort to encourage mothers to work outside the home has vastly negative consequences. Unintended? Perhaps. But I doubt that many “liberals” would change their agenda, even if they were confronted with the consequences.

Should women be free to work outside the home? Absolutely. They must judge what’s best for themselves, in light of their obligations as parents — if they have such obligations.

Should government be in the business of encouraging women to work outside the home — perhaps even encouraging the breakup of families — by spending taxpayer dollars for that purpose? Absolutely not, because such encouragement is a form of paternalism that pushes people in the direction of making decisions that they wouldn’t otherwise make — in this case, decisions that undermine the kind of civil society that makes liberty possible. As Jennifer Roback Morse wrote,

[t]he libertarian approach to caring for the dependent is usually described in terse form as “let families and private charity take care of it, and get the government out of the way.” This position is sometimes ridiculed as unrealistic or attacked as harsh. But the libertarian position, once fully fleshed out, is both humane and realistic.

The libertarian preference for nongovernmental provision of care for dependents is based upon the realization that people take better care of those they know and love than of complete strangers. It is no secret that people take better care of their own stuff than of other people’s. Economists conclude that private property will produce better results than collectivization schemes. But a libertarian preference for stable married-couple families is built upon more than a simple analogy with private property. The ordinary rhythm of the family creates a cycle of dependence and independence that any sensible social order ought to harness rather than resist. . . .

But for this minimal government approach to work, there has to be a family in the first place. The family must sustain itself over the course of the life cycle of its members. If too many members spin off into complete isolation, if too many members are unwilling to cooperate with others, the family will not be able to support itself. A woman trying to raise children without their father is unlikely to contribute much to the care of her parents. In fact, unmarried parents are more likely to need help from their parents than to provide it.

In contrast to the libertarian approach, “progressives” view government provision of social services as the first resort, not the last. Describing marriage as a “privatization scheme” implies that the most desirable way to care for the dependent is for the state to provide care. An appreciation of voluntary cooperation between men and women, young and old, weak and strong, so natural to libertarians and economists, is completely absent from this statist worldview. . . .

Marriage is the socially preferred institution for sexual activity and childrearing in every known human society. The modern claim that there need not be and should not be any social or legal preference among sexual or childrearing contexts is, by definition, the abolition of marriage as an institution. This will be a disaster for the cause of limited government. Disputes that could be settled by custom will have to be settled in court. Support that could be provided by a stable family must be provided by taxpayers. Standards of good conduct that could be enforced informally must be enforced by law.

There I go again, questioning liberal (and sometimes libertarian) orthodoxy.

Related posts:

I Missed This One (08/12/04)
A Century of Progress? (01/30/05)
Feminist Balderdash (02/19/05)
Libertarianism, Marriage, and the True Meaning of Family Values (04/06/05)

PC Madness

Coyote Blog points to the NCAA’s latest venture into political correctness:

The presidents and chancellors who serve on the NCAA Executive Committee have adopted a new policy to prohibit NCAA colleges and universities from displaying hostile and abusive racial/ethnic/national origin mascots, nicknames or imagery at any of the 88 NCAA championships.

The Executive Committee, meeting Thursday in Indianapolis, also approved recommended best practices for schools who continue to use Native American mascots, nicknames and imagery in their intercollegiate athletic programs.

“Colleges and universities may adopt any mascot that they wish, as that is an institutional matter,” said Walter Harrison, chair of the Executive Committee and president at the University of Hartford. “But as a national association, we believe that mascots, nicknames or images deemed hostile or abusive in terms of race, ethnicity or national origin should not be visible at the championship events that we control.”

Obviously, no college or university in the U.S. is hostile toward Native Americans or any other group of persons that isn’t white, male, and heterosexual. So what’s the problem?

Why aren’t the Greeks and Turks upset about all those teams of Spartans and Trojans dotted around the country?

If livestock could vote, the NCAA certainly would be riding herd on Mustangs and Broncos, and all of those other rampaging animals. Though I doubt that anyone would stick up for the Mud Hens (the nickname of a minor league baseball team).

Speaking of baseball — my favorite sport — why aren’t New Englanders up in arms about the New York Yankees when most New Englanders (the original Yankees) are fans of the Boston Red Sox?

I guess it’s okay to call a team the Sox (the Red of Boston or White of Chicago) because there are few textile and hosiery manufacturers still operating in the U.S. We still have a lot of mountains, though, so I do have to wonder about the Colorado Rockies.

Why aren’t matched siblings and extra-large persons upset about the Minnesota Twins and San Francisco Giants?

Why aren’t professional groups of various sorts marching against the Houston Astros (for astronauts), Kansas City Royals, Los Angeles Angels, Los Angeles (trolley) Dodgers, Milwaukee Brewers, Oakland Athletics, Pittsburgh Pirates, San Diego Padres, Seattle Mariners, and Texas Rangers?

Birds should sue the Baltimore Orioles, St. Louis Cardinals, and Toronto Blue Jays. And there are the beasts of land and sea who must be offended by the likes of the Arizona Diamondbacks, Chicago Cubs, Detroit Tigers, Florida Marlins, and Tampa Bay Devil Rays.

Then there are those pesky Native American teams, the Atlanta Braves and Cleveland Indians. Why haven’t they wised up yet?

So we’re right back where we started. I guess what we need are more teams with innocuous names like the New York Mets, Philadelphia Phillies, and Washington Nationals.

Come to think of it, I doubt that any self-respecting PC policeperson would object to the Cincinnati Reds. I mean, aren’t socialism and communism far more enlightened systems than free-market capitalism?


A Note to Larry Summers

Larry,

It’s okay to suggest that women have different aptitudes than men, as long as those aptitudes are superior:

Chris Clarke, the America-based CEO of Boyden, a firm of headhunters, and a visiting professor at Henley Management College in England, argues that women are superior to men at multi-tasking, team-building and communicating, which have become the essential skills for running a 21st-century corporation. Maria Wisniewska, who headed a Polish bank, Bank Pekao, and is an international adviser to the Conference Board, says: “The links between the rational and emotional parts of the brain are greater in women than in men. If so, and if leadership is about making links between emotion and intelligence, then maybe women are better at it than men.”

Read the whole thing, and weep.

Regards,
LC


A Law Professor to Admire

I balance my dismissal of UT lawprof Brian Leiter by endorsing his senior colleague, Lino Graglia. In an interview with Columbia Law School Report on the occasion of the 50th anniversary of the Supreme Court’s decision in Brown v. Board of Education, Graglia had much to say, including this:

The overwhelming negative effect of Brown is that it changed the view of many people – certainly of most professors of constitutional law and, most important, of the justices themselves – as to the proper role of the Supreme Court in our system of government. If the court (as was believed) could do so great and good a thing as end racial segregation, what other great and good things could it not do? Did not Brown demonstrate that decision-making by judges on issues of basic social policy is superior to decision-making by elected representatives? The result has been a perversion of the system of government created by the Constitution, the basic principles of which are self-government through elected representatives, decentralized power (federalism), and separation of powers. At least in part as a result of Brown, we have arrived at the antithesis of this system: government by majority vote of a committee of nine unelected, life-tenured lawyers making the most basic policy decisions for the nation as a whole from Washington, D.C. The acclaim the court received as a result of Brown emboldened it to go on to such further decisions as Roe v. Wade, purporting to find that the Constitution, incredibly enough, guarantees rights of abortion, thereby converting an issue that was being peacefully settled on a state-by-state basis into an intractable national controversy.

On the race issue itself, the eventual success of Brown as a result of the 1964 Civil Rights Act emboldened the court to move from Brown’s prohibition of segregation – prohibiting the assignment of students to separate schools by race – to a vastly more ambitious and questionable requirement of integration – requiring the assignment of students to schools by race, now to increase racial mixing. Compulsory school racial integration, given residential racial concentrations, could only be attempted by ordering cross-district busing, but the justices now felt powerful enough to think that they could order even that. The result has been not to lessen but to increase school racial separation – as middle-class parents, black as well as white, fled school systems subject to busing orders – and the expenditure of billions of dollars and devastation of public school systems across the nation with no educational or other benefit….

Segregation should have been ended by Congress, as it in fact eventually was, and it is most unfortunate, as shown above, that the issue was purportedly decided instead by the court. The court’s supposed reliance on the totally discredited so-called sociological evidence in Brown illustrates only that constitutional law need have no relation to truth or reality….

It is misleading to state that “public schools in America remain segregated,” even after adding “though not officially by law,” seeming to suggest some inconsistency with Brown when in fact it is compulsory integration that violates Brown’s prohibition of all official race discrimination. For social, economic, and perhaps other reasons, areas of residential racial concentration are the norm, and neighborhood schools (which have many advantages) necessarily reflect residential patterns. This does not make the schools “segregated” any more than the neighborhoods are “segregated.” The pursuit of school “racial balance” is not only largely futile or counterproductive, but essentially pointless. The problem to be faced in regard to black education is the astounding fact that the average black 12th grader performs at about the level of the average white or Asian eighth grader in reading and math, and blacks from high income homes score below whites and Asians from low income homes. Some schools with the highest per pupil expenditures, as in Washington, D.C., and New York City, have the lowest levels of pupil performance. The pursuit of school “racial balance” – the dispersal of black students among whites and Asians (but not among Hispanics who have similar, though lesser educational difficulties) – will do nothing to change these facts. It will serve only to divert attention from possibly effective efforts and keep racial activists in business.

Graglia for Chief Justice. (If only he weren’t “too old” and “politically incorrect.”)