To Pay or Not to Pay

It’s tax time. Let’s celebrate with a bit of revisionist literary history. William Shakespeare was a tax protester. Think about the message hidden in the titles of several of his plays:

“A Midsummer Night’s Dream” is about a man who hopes soon to repay the money he borrowed to meet his tax bill. “The Winter’s Tale” follows him through months of overtime work as he struggles to save money for his old age. In “Love’s Labor’s Lost” he confronts the ugly reality that his savings will go to the IRS. “The Comedy of Errors” depicts his travails with Form 1040 and its many schedules. In “Much Ado About Nothing” he discovers, alas, that he owes the IRS even more than he had feared. Stunned by the discovery, he decides in “Twelfth Night” (April 12) not to file a tax return and tears it into tiny pieces. He reconsiders, and “The Tempest” recounts his struggle to complete a new return by April 15. “As You Like It” celebrates his triumphal march to the Post Office on April 15, armed with a return that shows him even with the IRS. “All’s Well That Ends Well” is a fantasy in which the IRS finds no fault with our hero’s return.

Then there is the real text of Hamlet’s soliloquy:

To file or not to file — that is the question;

Whether ’tis nobler in the pocketbook to suffer

The slings and arrows of outrageous taxes,

Or to take arms against a sea of instructions

And by ignoring evade them. To file — to pay —

No more; and by not paying to say we end

The headache and the thousand-dollar debts

That Uncle Sam is heir to — ’tis a consummation

Devoutly to be wish’d. To run — to hide —

To hide! Perchance in Bimini! Ay there’s the spot;

For in that sunny isle what dreams may come

When we have eluded the revenue agent

Must give us pause; there’s the reality

That makes mockery of such simple plans;

For who would bear the heat and hard bunks of Leavenworth;

The cell-block bully’s fist, the guard’s glare

The bagginess of prison garb, the sad children’s tears,

The righteousness of neighbors, and the spurns

That the gray-faced ex-con takes

When he himself might his quietus make

With a simple check? Who would these taxes bear,

To grunt and sweat under a glaring desk lamp,

But that the dread of something after April 15,

The uncelebrated penitentiary from whose walls

No inmate leaves without parole, puzzles the will

And makes us rather bear those taxes we must

Than fly to Bimini or other exotic places?

Thus conscience does make taxpayers of us all….

Drunk Driver Appointed Traffic Court Judge

If there was any lingering doubt about the corruptness of the 9/11 Commission, Attorney General John Ashcroft dispelled it yesterday in his testimony before the Commission.

Ashcroft disclosed that Commissioner Jamie Gorelick had authored the memorandum that erected what Ashcroft properly described as “[t]he single greatest structural cause for September 11…the wall that segregated criminal investigators and intelligence agents.” Ashcroft continued, “Government erected this wall. Government buttressed this wall. And before September 11, government was blinded by this wall.” Specifically,

In the days before September 11, the wall…impeded the investigation into Zacarias Moussaoui, Khalid al-Midhar and Nawaf al-Hazmi. After the FBI arrested Moussaoui, agents became suspicious of his interest in commercial aircraft and sought approval for a criminal warrant to search his computer. The warrant was rejected because FBI officials feared breaching the wall.

When the CIA finally told the FBI that al-Midhar and al-Hazmi were in the country in late August, agents in New York searched for the suspects. But because of the wall, FBI Headquarters refused to allow criminal investigators who knew the most about the most recent al Qaeda attack to join the hunt for the suspected terrorists.

At that time, a frustrated FBI investigator wrote Headquarters, quote, “Whatever has happened to this — someday someone will die — and wall or not — the public will not understand why we were not more effective and throwing every resource we had at certain ‘problems’. Let’s hope the National Security Law Unit will stand behind their decision then, especially since the biggest threat to us, UBL, is getting the most protection.”

Of course Gorelick didn’t foresee the particular, horrific terrorist acts we call 9/11, just as a drunk driver doesn’t foresee the particular, horrific accident caused by his drunkenness.

If Gorelick’s policy hadn’t become known immediately after 9/11 — and hadn’t been rectified already — Ashcroft’s testimony would have contained the only true “bombshell” to emerge thus far from the 9/11 hearings.

Economic Illiteracy Prevails

Poll: Balanced Budget Beats Tax Cuts

By WILL LESTER, Associated Press Writer

WASHINGTON – By almost a 2-1 margin, Americans prefer balancing the nation’s budget to cutting taxes, according to an Associated Press poll, even though many believe their overall tax burden has risen despite tax cuts over the past three years.

About six in 10, 61 percent, chose balancing the budget while 36 percent chose tax cuts when they were asked which was more important, according to a poll conducted for the AP by Ipsos Public Affairs.

If you want a balanced budget, hold spending steady or cut it gradually. If you want a recession and slow economic growth, raise tax rates.

Today’s Notable Birthday

Sir Robert Watson-Watt (1892-1973) was born on this date.

Watson-Watt invented radar in 1935, specifically for the purpose of detecting aircraft. His “pioneering work…resulted in the design and installation of a chain of radar stations along the East and South coast of England in time for the outbreak of war in 1939. This system…provided the vital advance information that helped the Royal Air Force to win the Battle of Britain.” (From Radar Personalities: Sir Robert Watson-Watt.)

Had the Battle of Britain gone the other way, Hitler probably would have invaded England. A successful invasion would have sundered the U.S.-British alliance and ensured Hitler’s victory in Europe.

Vietnam and Iraq as Metaphors

Vietnam: a costly, unpopular, “unwinnable” war that foments unrest at home and anti-Americanism abroad.

We “lost” Vietnam, not because we couldn’t win it but because we weren’t willing to bear the cost of winning it. The “loss” of Vietnam posed no obvious threat to America’s vital interests. In that respect, the critics of the Vietnam War were right — and I was one of those critics.

Some proponents of the Vietnam War predicted that our withdrawal from Vietnam would, in the long run, threaten America’s vital interests by showing our potential enemies that we could be made to back down by the sight (or prospect) of body bags. As we saw in Lebanon, the Gulf War, Somalia, and Clinton’s tepid response to terrorist acts, those proponents of the Vietnam War were right.

We are now engaged in a war against terror that we invited, in part, by our actions in Vietnam, Lebanon, and the rest. We are now engaged in a war to stabilize the Middle East, where America does, indeed, have vital interests.

Iraq: a winnable war in which America shows its willingness to protect its vital interests, despite much anti-Americanism at home and abroad.

Fear of the Free Market — Part II

In Part I of this series (second post under April 8, 2004), I pointed out that

[i]t is easier to list those markets in which the government doesn’t intervene (namely, “black markets”) than it is to list those markets in which the government does intervene. There simply isn’t a lawful business activity that isn’t affected by government regulation….[G]overnment intervention in the market for any product or service tends to reduce the supply of that product or service.

Health care, being something almost everyone needs (like electricity and phone service), has been regulated to the point of being nationalized (see Part I). Yet it is unclear that the regulation of health care does anything but restrict our access to doctors and drugs. Licensing exams have no meaningful effect on our ability to choose competent doctors (see Part I).

What about FDA approval of drugs? The FDA doesn’t test drugs; it prescribes testing procedures for drugs. The responsibility for testing falls to the maker of the drug. According to statistics published on the FDA’s web site, the FDA ultimately approves about 20% of applications for new drugs. The three phases of the FDA’s prescribed testing process take at least a year and sometimes six years or longer. What does the FDA hope to accomplish through its approval process? Here’s some of what the FDA’s Ken Flieger has to say:

Most of us understand that drugs intended to treat people have to be tested in people. These tests, called clinical trials, determine if a drug is safe and effective, at what doses it works best, and what side effects it causes–information that guides health professionals and, for nonprescription drugs, consumers in the proper use of medicines.

Clinical testing isn’t the only way to discover what effects drugs have on people. Unplanned but alert observation and careful scrutiny of experience can often suggest drug effects and lead to more formal study. But such observations are usually not reliable enough to serve as the basis for important, scientifically valid conclusions. Controlled clinical trials, in which results observed in patients getting the drug are compared to the results in similar patients receiving a different treatment, are the best way science has come up with to determine what a new drug really does. That’s why controlled clinical trials are the only legal basis for FDA to conclude that a new drug has shown “substantial evidence of effectiveness.”

It boils down to safety and effectiveness. But safety and effectiveness are also your doctor’s concern. Do you suppose that your doctor would prescribe a drug that its manufacturer hadn’t thoroughly tested for safety and effectiveness? Of course, your doctor might well flub his diagnosis (something that happens a lot, despite the medical licensing exam) and prescribe the wrong medication. Or your doctor might diagnose you correctly but prescribe a medication that produces an unpleasant side effect. In summary, the safety and effectiveness of the drugs your doctor prescribes depend mainly on your doctor’s competence.

Misadventure is more likely with non-prescription (over-the-counter) drugs. As the FDA acknowledges, “Most OTC drug products have been marketed for many years, prior to the laws that require proof of safety and effectiveness before marketing.” Very interesting. As with prescription drugs, OTC drugs used to be available without the FDA’s imprimatur. That is, individuals used to be trusted to buy and use OTC drugs wisely, but then the FDA got into the act. Why? According to the FDA:

Languishing in Congress for five years, the bill that would replace the 1906 [Food and Drugs Act] was ultimately enhanced and passed in the wake of a therapeutic disaster in 1937. A Tennessee drug company marketed a form of the new sulfa wonder drug that would appeal to pediatric patients, Elixir Sulfanilamide. However, the solvent in this untested product was a highly toxic chemical analogue of antifreeze; over 100 people died, many of whom were children. The public outcry not only reshaped the drug provisions of the new law to prevent such an event from happening again, it propelled the bill itself through Congress. This was neither the first nor the last time Congress presented a public health bill to a president only after a therapeutic disaster. FDR signed the Food, Drug, and Cosmetic Act on 25 June 1938.

The new law brought cosmetics and medical devices under control, and it required that drugs be labeled with adequate directions for safe use. Moreover, it mandated pre-market approval of all new drugs, such that a manufacturer would have to prove to FDA that a drug was safe before it could be sold. It irrefutably prohibited false therapeutic claims for drugs, although a separate law granted the Federal Trade Commission jurisdiction over drug advertising. The act also corrected abuses in food packaging and quality, and it mandated legally enforceable food standards. Tolerances for certain poisonous substances were addressed. The law formally authorized factory inspections, and it added injunctions to the enforcement tools at the agency’s disposal.

And on it went:

Enforcement of the new law came swiftly. Within two months of the passage of the act, the FDA began to identify drugs such as the sulfas that simply could not be labeled for safe use directly by the patient–they would require a prescription from a physician. The ensuing debate by the FDA, industry, and health practitioners over what constituted a prescription and an over-the-counter drug was resolved in the Durham-Humphrey Amendment of 1951. From the 1940s to the 1960s, the abuse of amphetamines and barbiturates required more regulatory effort by FDA than all other drug problems combined.

Notice that the focus is always on abuses and never on successes. Here’s what the Cato Institute’s Handbook for Congress has to say about the FDA:

As an agency, the FDA has a strong incentive to delay allowing products to reach the market. After all, if a product that helps millions of individuals causes adverse reactions or even death for a few, the FDA will be subject to adverse publicity with critics asking why more tests were not conducted. Certainly, it is desirable to make all pharmaceutical products as safe as possible. But every day that the FDA delays approving a product for market, many patients who might be helped suffer or die needlessly.

For example, Dr. Louis Lasagna, director of Tufts University’s Center for the Study of Drug Development, estimates that the seven-year delay in the approval of beta-blockers as heart medication cost the lives of as many as 119,000 Americans. During the three and a half years it took the FDA to approve the drug Interleukin-2, 25,000 Americans died of kidney cancer even though the drug had already been approved for use in nine other countries. Eugene Schoenfeld, a cancer survivor and president of the National Kidney Cancer Association, maintains that ‘‘IL-2 is one of the worst examples of FDA regulation known to man.’’

In the past two decades patients’ groups have become more vocal in demanding timely access to new medication. AIDS sufferers led the way. After all, if an individual is expected to live for only two more years, three more years spent testing the efficacy of a prospective treatment does that person no good. The advent of the Internet has allowed individuals suffering from specific ailments and patient groups to use websites and chat rooms to exchange information and to give them an opportunity to take more control of their own treatment. They now can track the progress of possible treatments as they are tested for safety and efficacy and are quite conscious of how FDA-imposed delays can stand in the way of their good health and even their lives….

[I]n a free society individuals should be free to take care of their physical well-being as they see fit. The advent of the Internet gives individuals even more access to information about medical products and treatments. Individuals should be allowed to choose the treatments they think best. Such liberty does not open the door for fraud or abuse any more than does a free market in other products. In fact, informed consent by patients probably will become more sophisticated as the market for information about medical treatments becomes more free and open.

Government regulation of health-care products and services makes them harder to get and more expensive than the products and services that would be delivered in the absence of regulation. Would quality suffer in a free-market health-care system? It might in some cases, but competition among producers and providers would lead to an overall increase in quality, in response to consumers’ demands for competent medical practitioners and effective drugs.

If it’s unnecessary to regulate health care, can we take the next step and de-nationalize it? What about other industries and types of economic activity? Stay tuned for Part III of this series.

Re-fighting Old Wars

Ted Kennedy thinks Iraq is Bush’s Vietnam. Why can’t it be Bush’s Spanish-American War, World War I, World War II, Korean War, invasion of Grenada, Gulf War, or even the whatchamacallit in the Balkans? None of those wars became a quagmire? Not true.

We were stuck with Cuba and the Philippines for decades after winning the Spanish-American War. We still have troops in Germany almost 60 years after winning World War II, and troops in Korea 50 years after the quagmire — oops — stalemate there. I think we still have a military presence in the Balkans, even after having brought Milosevic to the uncertain tribunal of international justice.

Ted, you should come up with a term more imaginative than “quagmire.” How about “Chappaquiddick”? “Iraq is Bush’s Chappaquiddick” would have the ring of moral authority, wouldn’t it?

The Iraqi Insurgency

The insurgents and al Qaeda are in cahoots, probably with the backing of Syria, Iran, and others. They’re trying to do what bin Laden tried and failed to do with 9/11, namely, demoralize the U.S. and force our withdrawal from the Middle East, to open the way for the ascendancy of Muslim fundamentalism. They won’t succeed as long as Bush is president, but they’re hoping, of course, that U.S. forces will fail to overcome the insurgency (or at least fail to do so quickly or decisively). That would help to ensure the election of Kerry, whom they view as being more likely to cut and run — a view to which Kerry’s guru, Ted “Quagmire” Kennedy, has lent considerable credence.

We must, therefore, put down the insurgency and put it down quickly. I think we can and will as long as the worry-warts in Washington don’t put too many constraints on the Marines, which seems unlikely. According to a Marine who’s in Iraq, the president “has given us the green light to do whatever we needed to do to win this thing so we have that going for us.” That’s a quotation from an interesting and balanced e-mail posted by Andrew Sullivan.

So, I think the enemy has, once again, underestimated our strength and resolve. The “second Iraq war” — as some are calling the insurgency — may in the end prove to be the decisive war. We can win it. I expect that we will win it.

Fear of the Free Market — Part I

In So When Are We Going to Get That Free-Market Health Care Everyone’s Complaining About?, Trent McBride guesstimates that with the addition of the prescription drug benefit to Medicare “our health care system will be paid for by explicit or implicit public funds at a rate of 65-70%.” By “explicit or implicit public funds” he means direct payments (e.g., Medicare, Medicaid, and the VA) plus the sundry regulatory activities (e.g., FDA approval of new drugs) that are funded by taxes. McBride therefore characterizes the health-care system as “marginally nationalized.” He asks, “if we have a nationalized health-care system now, and that system is [considered] broken, is more nationalization the way to go?”

Sasha Volokh objects to McBride’s characterization of the health-care system as “nationalized” because what matters is not only “who pays but also…who controls.” Apparently, in Sasha Volokh’s view, Medicare doesn’t count as a form of nationalization because beneficiaries get to choose their doctors. In this regard, it’s important to recall the old variation on the Golden Rule: “Them what has the gold makes the rules.” I might get to choose my doctor from a government-approved list, but all good doctors won’t be on that list, nor will all the treatments I might like to have. It would cost me more to go to doctors who aren’t on the list and to receive non-approved treatments, but I may not be able to afford either because my wealth has been depleted by many years of paying into Medicare. Bottom line: Medicare is most certainly a form of nationalization.

Government’s effective control of the health-care system is only a notorious example of government’s distortion of free-market mechanisms. It is easier to list those markets in which the government doesn’t intervene (namely, “black markets”) than it is to list those markets in which the government does intervene. There simply isn’t a lawful business activity that isn’t affected by government regulation.

If, for example, I wished to turn this blog into a business by selling advertising space on it, I would (or should) get a business license from the city, pay property tax on my computer (as a piece of business equipment), keep a set of business books for tax purposes, file a special income tax return (Schedule C, at a minimum), and pay additional Social Security taxes at the rate for self-employed persons. If business thrived and I hired someone to help me produce the blog (or handle the paperwork), that would compound my compliance problem and the cost of dealing with it.

Alternatively, I could ignore the law and run the risk of being caught and fined or even imprisoned. That’s a risk that I might take for the sake of a low-profile blog. It’s not a risk that I would take for the sake of making big bucks as an untrained, unlicensed M.D., though it is a risk that others (sometimes trained but unlicensed doctors) have been willing to take.

In summary, government intervention in the market for any product or service tends to reduce the supply of that product or service.

But, but, but…the proponents of regulation say…if government didn’t require doctors to pass licensing exams people wouldn’t know if they were being served by “good” or “bad” docs (not to mention lawyers, electricians, plumbers, and beauticians). Similarly, if the FDA didn’t approve drugs, people wouldn’t know if they were buying efficacious drugs or snake oil. And so on and so forth.

Are all medical school graduates equally competent? Are all medical school graduates who pass licensing exams equally competent? Is the doctor who barely passes the exam significantly better than the doctor who barely flunks it? The correct answer in every instance is “no.”

Do medical licensing exams weed out a large percentage of incompetent doctors? It’s not obvious that they do. Statistics for takers of the U.S. Medical Licensing Examination in 2002 (http://www.usmle.org/news/2002perf.htm) indicate that about 85% of first-time takers of the exam from allopathic (conventional) medical schools in the U.S. and Canada successfully complete all three steps of the exam. With re-takes, the percentage successfully completing all three steps is expected to be 97%. Osteopaths have a lower success rate — 60% for first-time takers — but they represent only 2% of the first-time takers from U.S. and Canadian medical schools.

The only real weeding-out takes place among graduates of medical schools outside the U.S. and Canada. First-time takers from those medical schools have only a 34% success rate. This weeding-out may reflect incompetence in English — even though applicants had to pass an English-language proficiency exam — as much as it does incompetence in medicine. These results suggest a simple strategy of avoiding doctors who weren’t trained in the U.S. or Canada — a strategy that many Americans follow instinctively.

As for graduates of medical schools in the U.S. and Canada, you’re on your own. When you go to a licensed doctor for the first time you will probably have no clue about that doctor’s competence. You can avoid the relatively few doctors who have been disciplined because most States now make such information available online. You can get recommendations from family, friends, and acquaintances, but those recommendations may tell you more about a doctor’s “bedside manner” than about his or her competence. And in some large cities you can find lists in local magazines for the “best” doctors, by specialty, though you will have no idea of the criteria underlying such lists. In the end, you’ll simply hope that your doctor is competent, if not warm and fuzzy.

You’ll learn from experience whether your doctor seems competent, just as you’ll learn from experience whether your auto mechanic is competent (and honest) or merely a smiling face. So much for licensing as a boon to consumer choice.

Strategic Vision

Washington’s strategic vision was to break free of British rule. He persevered and the newborn United States survived to childhood.

Lincoln’s strategic vision was to preserve the Union and the ideals of the Declaration of Independence. He persevered and the Union passed tumultuously from adolescence to vigorous adulthood.

Roosevelt’s strategic vision was to cure the adult nation of its Depression. His apparent success — which was owed in fact to a horrific war — sapped the nation’s vigor by leading it into long-term dependence on government.

Reagan’s strategic vision was to cure the nation of its dependence on government, to restore it to vigorous adulthood. He failed because the nation’s addiction was too strong to be broken by a mere president, unaided by Congress.

Clinton’s strategic vision — pursued from his adolescence — was to become president. He persevered and the nation sank deeper into senile dependence on government.

Pet Peeves

• Smug cities (e.g., Austin, New York, and San Francisco — “We’re so cool.”)

• SUVs (“We’re more important than you; get out of our way.”)

• Tailgaters (usually SUVs)

• Tailgaters with their brights on

• Canned music

• Contemporary jazz — canned or live — in a restaurant

• The demise of dress codes in most expensive restaurants

• Squealing babies and noisy children in any restaurant that doesn’t have drive-through service

• Baggy clothing, skimpy clothing, and piercings

• Conversational filler: like, you know

• Driving while talking on a cell phone

• Talking on a cell phone in the presence of a captive audience (e.g., in a waiting room, airport lounge, or airplane)

• Playing a car stereo so loudly that it can be heard in the next car, if not a block away

• Failing to say “thank you” when someone holds a door open for you

• TVs that are always on

• Most of what’s on TV

• Most of what’s on radio

• Most of what’s in movies

• Most of what’s called music

• Most of the 20th century and all of the 21st century thus far

The Good Old Days

I remember the good old days.

The United States had just won a popular war when I entered Kindergarten. The war was concluded when a Democrat president decided to use weapons of mass destruction that killed about 200,000 enemy civilians. (Historical revisionists take note: The alternative was an invasion that would have cost at least as many American lives and resulted in many more civilian casualties.)

My father bought a 1938 Ford V8 in 1940. He kept it until 1951. He didn’t buy his first new car until 1956. He and my mother never owned two cars.

My father sometimes brought home live chickens, which he dispatched at the chopping block. I was allowed to watch this spectacle because it was a part of daily life called “putting food on the table.” I wasn’t scarred for life by the experience.

Nor were my values twisted by daily exposure to sex and violence on TV. I listened to Jack Benny, the Great Gildersleeve, Our Miss Brooks, the Lone Ranger, and Superman on the radio.

I went to three different red-brick schoolhouses as I progressed from Kindergarten through the fifth grade. Each schoolhouse was by then at least 60 years old. I was nevertheless well educated in the three Rs because my teachers didn’t have to put up with rude, unruly, and inattentive students.

Every schoolroom had framed pictures of George Washington and Abraham Lincoln high on the wall. (The ceilings were high in those red-brick schoolhouses so that their triple-hung windows could be opened in hot weather. The only air conditioned buildings in our small city were the movie theaters.)

Washington’s Birthday was a legal holiday. So was Lincoln’s.

There was one black student in my school when I was in the fourth grade. He was my best friend. It was no big deal.

When I was six or seven years old I traveled by bus to the village where my grandmother lived, a trip of 90 miles. I traveled alone. My mother put me on the bus and my grandmother met me at the other end.

My grandmother raised ten children without the benefit of welfare, social workers, au pairs, nannies, and cleaning services. She didn’t have indoor plumbing or a telephone until she was 70. That was when she also got an electric stove to replace her wood stove. Her children built the bathroom and installed the stove for her.

My grandmother lived to the age of 96. In her later years I persuaded her to give me the photographs of her and my grandfather that had hung high on her living room wall for so many years. That was all she could afford to give me. It was more than enough. It was priceless.

So were the good old days.

Polls, Party Preferences, and Polarization

The Pew Research Center’s web site includes a page entitled The 2004 Political Landscape: Evenly Divided and Increasingly Polarized. The first graph on that page shows party identification in the U.S. between 1937 and 2003. The graph also (unintentionally) shows why polls are so unreliable:

1. The incumbent president’s popularity strongly affects what people tell pollsters about party affiliation. There has been a consistent swing toward the opposite party as the popularity of incumbent presidents has waned or plummeted. This phenomenon can be seen toward the end of every presidency from Truman’s through Clinton’s, and most notably toward the end of Nixon’s disgraced presidency.

2. The core of each party’s constituency has changed drastically during the past seven decades. Remember when New England was reliably Republican and the “Solid South” was a bastion of the Democrat Party? Remember when there was more than a handful of liberal Republicans and conservative Democrats in Congress? The realignment of party affiliations wasn’t sudden. It began in 1948, when many Southerners found it possible not to vote for a Democrat. It continued in 1952, when popular Ike ran as a Republican. It accelerated in 1960, when the Democrats nominated Catholic JFK, much to the consternation of many Southerners. It got another boost in 1968, when Democrats got on the wrong side of the culture war. And it continued well into the 1980s, thanks largely to Carter’s ineptness and the left’s continuing dominance within the Democrat Party. Polling results about party preferences were largely meaningless during the 40 years from 1948 to 1988 because personal as well as regional party alignments were in almost constant flux during that period.

Pollsters — and pundits — are nevertheless fond of drawing sweeping inferences from flawed statistics. An inference that has played prominently since the close presidential election of 2000 is that the nation has become “polarized.” That is, many States have become reliably “Red” (Republican) and “Blue” (Democrat), instead of vacillating from one election to the next. In this case, the pollsters and pundits are right, but they would have been just as right in the 1940s and 1950s, when Republicans reliably held New England and Democrats solidly held the South. So why is “polarization” now such a big issue?

It’s a big issue because the Democrat Party no longer enjoys the large (but illusory) plurality that it enjoyed from New Deal days until the 1980s. “Polarization” is bad only if it means that your favorite party is no longer the dominant party.

The underlying fear, of course, is that today’s “polarization” may become tomorrow’s Republican dominance. As another graph on the Pew page indicates, Democrats tend to be older than Republicans. That is, Democrats are dying at a faster rate than Republicans.

Justice Is Dead, Even in Texas

From FoxNews.com:

Texas woman who claimed God ordered her to bash in heads of her three children — two of whom died — acquitted of all charges.

Rating Books, Movies, and Presidents

I have found that I rate books, movies, and music as follows:


• I have read, seen, or heard it more than once, or would gladly do so. (***)

• Once was enough, but I enjoyed it most of the time. (**)

• I made it to the end. (*)

• I tried but gave up on it. (0)

One person’s *** book or movie won’t be another person’s *** book or movie. By the same token, I’ve given up on many a book and movie that critics and friends have raved about. Among my *** books are Edith Wharton’s Ethan Frome, John Fowles’s The Magus, and Stephen King’s The Stand. Some of my *** movies are “The Philadelphia Story,” “Gunga Din,” and “My Man Godfrey.” Books and movies that I’ve given the goose egg include James Joyce’s Ulysses and Finnegans Wake, anything I’ve tried by Martha Grimes and Elizabeth George, and such film “classics” as “Z” and “Last Year at Marienbad.”

Although I’ve read a lot of books and seen a lot of movies that rate ** and *, my preferences in music tend to be binary. Almost anything written between 1700 and 1900 gets *** (the tedious compositions of Wagner, Mahler, and Bruckner being the most notable exceptions). I give a big fat 0 to almost anything written after 1900 by a so-called serious composer: the likes of Berg, Stravinsky, Shostakovich, Poulenc, Britten, Hovhaness, Glass, and their more recent offshoots. For music written after 1900, I turn to Gershwin, Lehár, Friml, Kern, bluegrass, jazz (written before 1940), and rock of the 1960s to early 1980s.

Now that I’ve lived through, and remember, 10 complete presidencies — from Truman’s through Clinton’s — here’s how I’d rate them on my book/movie/music scale:


Truman **

Eisenhower ***

Kennedy *

Johnson 0

Nixon 0

Ford *

Carter 0

Reagan ***

Bush I *

Clinton 0

You can try this at home.

Never Relent: A Tale of Libertarian Dissent

I’m a heretic from libertarian orthodoxy on two major issues: immigration (which I’d tighten considerably) and pre-emptive war (which I favor). I’m also willing to give law-enforcement agencies the benefit of the doubt when it comes to snooping in search of terrorist conspiracies.

I’m still a staunch libertarian on most other issues, but when it comes to terrorists, I say keep them out (or as many of them as we can), kill as many as possible before they get here, and, if they get here, catch them before they kill us. I don’t want my murder to be avenged by justice or retribution; I want to fully enjoy my golden years in the sunshine. I want the same for my wife, my children, my grandchildren, and all my progeny.

When my wife and I turned on our TV set that morning of 9/11/01, the first plane had just struck the World Trade Center. A few minutes later we saw the second plane strike. In that instant a horrible accident became an obvious act of terror. Then, in the awful silence that had fallen over Arlington, Virginia, we could hear the “whump” as the third plane hit the Pentagon.

Our thoughts for the next several hours were with our daughter, who we knew was at work in the adjacent World Financial Center when the planes struck. Was her office struck by debris? Did she flee her building only to be struck by or trapped in debris? Was she smothered in the huge cloud of dust? Because telephone communications were badly disrupted, we didn’t learn for several hours that she had made it home safely.

Thousands of grandparents, parents, husbands, wives, children, grandchildren, lovers, and good friends — the survivors of the 3,000 who died that day in Manhattan, the Pentagon, and western Pennsylvania — did not share our good fortune. Never forgive, never forget, never relent.

A Colloquy on War and Terrorism

Able. Is it right to go to war against a country that has not attacked us?

Baker. No.

Able. What about Nazi Germany?

Baker. Well, Nazi Germany was in league with Imperial Japan, which had attacked us.

Able. So it’s all right to go to war with our enemy’s friend?

Baker. Well, only if the enemy has already attacked us.

Able. Hadn’t we already been attacked by al Qaeda, not once but several times, before we went to war against Saddam Hussein?

Baker. But Saddam wasn’t a friend of al Qaeda.

Able. You don’t believe that Saddam condoned support for al Qaeda by members of his regime, even if he wasn’t directly involved?

Baker. Well, suppose Saddam’s regime had nothing to do with al Qaeda; after all, there are many who question the Saddam–al Qaeda link. That leaves Saddam as a potential enemy, but he didn’t pose an imminent threat to us.

Able. Did Hitler pose an imminent threat to us in December 1941?

Baker. No, but Saddam was no Hitler; that is, he lacked the wherewithal to attack us any time soon, if ever.

Able. It doesn’t matter to you that he was an oppressive dictator and a known enemy of the U.S., and that — at a minimum — his presence emboldened other regimes in the Middle East to support terrorism?

Baker. We shouldn’t have invaded Iraq until it became clear that Saddam posed a direct and imminent threat to the U.S.

Able. In other words, we shouldn’t spray a nest of hornets if only one of them has stung us? We should wait until more have stung us?

Baker. But our pre-emptive war caused much innocent blood to be shed.

Able. How much more innocent blood will be shed if we don’t go after terrorism at its roots?

Baker. But what if our pre-emptive strategy inflames hatred of the U.S. and creates even more terrorists?

Able. What if our pre-emptive strategy also deters would-be terrorists by creating fear of, if not respect for, the U.S.? (Look at what’s happened in Libya, for instance.) What if our pre-emptive strategy makes it harder for would-be terrorists to act on their hatred? There is — and was — already an ample supply of America-haters in the Middle East (and elsewhere). Nothing we do, or don’t do, is likely to reduce their numbers significantly. They hate America not out of poverty or ignorance (though many of them are poor and ignorant), but because most humans have a need to hate something. The U.S., with all its power and wealth, is an easy target for hatred. Does hatred justify terror?

Baker. Of course not, but surely there must be a better way than pre-emptive war.

Able. Shall we all join hands at the United Nations and denounce terrorism? Well, that’s already been tried, and a lot of good it’s done. Tell me what you would do. Go on, tell me, I’m waiting…

Baker. We need to detect and prevent actual terrorist operations through improved intelligence.

Able. I agree. But I don’t see that as an alternative to pre-emptive action overseas. We need both better intelligence and pre-emptive action, especially because there are many things intelligence cannot do. It cannot keep out terrorists who are already in the country. It cannot keep out terrorists who can easily cross our mostly open borders with Canada and Mexico. It cannot keep out terrorists who come into the country on seemingly legitimate business and then vanish from sight. It cannot prevent any of these terrorists from making weapons of terror from materials that can be bought or stolen. We can reduce such risks by making it easier for law-enforcement agencies to detect terrorist plans and conspiracies, as we have through the Patriot Act.

Baker. I’m glad you mentioned the Patriot Act…

Able. Me, too. You’re aghast at some of the leeway it gives law-enforcement agencies, though we always run the risk that they will abuse their already considerable power. But you’re also aghast at the doctrine of pre-emption. I guess that your anti-terror strategy is to hunt down terrorists after they have struck.

Baker. That’s not fair.

Able. It’s a logical consequence of your position. You either fight terror or you let it happen to you.

9/11 and Pearl Harbor

It has been about two and a half years since September 11, 2001. Two and a half years after the Japanese attacked Pearl Harbor — that is, by mid-1944 — the U.S. and its allies had rallied decisively against the Axis: Allied forces had successfully landed in Normandy, and U.S. forces in the Pacific were leap-frogging toward the Japanese homeland.

But World War II was far from over in two and a half years. It took another year of bloody fighting in Europe — and more than a year in the Pacific — to defeat the Axis. Yet the Germans and Japanese waged conventional war: Their units were identifiable. They could be found, attacked, and destroyed, without ambiguity.

Why, then, would anyone expect that we should be near victory over al Qaeda and its allies after a mere two and a half years? The enemy is within our borders, and within the borders of other Western nations. The enemy is hard to identify and, therefore, hard to attack and destroy. Unlike World War II and previous wars, we cannot measure the march toward victory by the rate of advance toward an enemy’s capital.

We have done much to disrupt the enemy’s plans, communications, and financing through our successes in Afghanistan and Iraq — and through other successes that cannot be publicized without telling the enemy what we know and how we know it. Despite all the press about bloody acts of “resistance” in Iraq and bloody acts of terror elsewhere, we are winning.

Victory in the war on terror will not come in another year or two, but it will surely come if we persist — and only if we persist. Our persistence will be tested by more bloody acts, inside and outside our borders. Those acts will test our resolve to “provide for the common defence.”

Will we fight the enemy or try to appease him? I am not confident of the answer. The United States of 2004 lacks the moral fiber of the United States of 1941.

I’ll Never Understand the Insanity Defense

Headline at FoxNews.com:


Mom Describes Stoning Sons on Tape

Psychiatrist says woman delusional when she killed sons with rocks

It’s impossible to know a person’s “state of mind” at the time he or she committed a crime. It follows that “not guilty by reason of insanity” is — pun intended — an insane verdict.

And so what if a person was “insane” at the time he or she committed a crime? A crime was committed and, therefore, someone must be “guilty” of it. If not the “insane” person, then who, Harvey the Rabbit?

Why Outsourcing Is Good: A Simple Lesson for Liberal Yuppies

You work in Manhattan, at the headquarters of a company whose product is sold throughout the U.S. and overseas. You live in Connecticut and commute to Manhattan by train. You drive to and from the train station in an SUV that was assembled in Tennessee.

Shazam! Outsourcing is outlawed. You can’t buy a new SUV unless it’s assembled in Connecticut and all its parts are made in Connecticut of raw materials that are native to Connecticut.

Wait, it gets worse. You can’t work for a Manhattan-based firm if you live in Connecticut. Only Manhattanites need apply. The good news is that you won’t need an SUV if you live in Manhattan. The bad news is that you can’t afford to live in Manhattan. The good news is that you wouldn’t want to live there anyway, because the only raw materials native to Manhattan are smog and smut.