Modernism in the Arts and Politics

David Friedman has a theory about the “modern” movement:

Suppose you are the first city planner in the history of the world. If you are very clever you come up with Cartesian coordinates, making it easy to find any address without a map, let alone a GPS—useful since neither GPS devices nor maps have been invented yet.

Suppose you are the second city planner. Cartesian coordinates have already been done, so you can’t make your reputation by doing them again. With luck, you come up with some alternative, perhaps polar coordinates, that works almost as well.

Suppose you are the two hundred and ninetieth city planner in the history of the world. All the good ideas have been used, all the so-so ideas have been used, and you need something new to make your reputation. You design Canberra. That done, you design the Coombs building at ANU, the most ingeniously misdesigned building in my personal experience, where after walking around for a few minutes you not only don’t know where you are, you don’t even know what floor you are on.

I call it the theory of the rising marginal cost of originality—formed long ago when I spent a summer visiting at ANU.

It explains why, to a first approximation, modern art isn’t worth looking at, modern music isn’t worth listening to, and modern literature and verse aren’t worth reading. Writing a novel like one of Jane Austen’s, or a poem like one by Donne or Kipling, only better, is hard. Easier to deliberately adopt a form that nobody else has used, and so guarantee that nobody else has done it better.

In other words, if you can’t readily do better than your predecessors, you take the easy way out by doing something different — ugly as it may be. And you call it “progress.” As I wrote here:

In the early decades of the twentieth century, the visual, auditory, and verbal arts became an “inside game.” Painters, sculptors, composers (of “serious” music), choreographers, and writers of fiction began to create works not for the enjoyment of audiences but for the sake of exploring “new” forms. Given that the various arts had been perfected by the early 1900s, the only way to explore “new” forms was to regress toward primitive ones — toward a lack of structure…. Aside from its baneful influence on many true artists, the regression toward the primitive has enabled persons of inferior talent (and none) to call themselves “artists.” Thus modernism is banal when it is not ugly.

Painters, sculptors, etc., have been encouraged in their efforts to explore “new” forms by critics, by advocates of change and rebellion for its own sake (e.g., “liberals” and “bohemians”), and by undiscriminating patrons, anxious to be au courant. Critics have a special stake in modernism because they are needed to “explain” its incomprehensibility and ugliness to the unwashed.

The unwashed have nevertheless rebelled against modernism, and so its practitioners and defenders have responded with condescension, one form of which is the challenge to be “open minded” (i.e., to tolerate the second-rate and nonsensical). A good example of condescension is heard on Composers Datebook, a syndicated feature that runs on some NPR stations. Every Composers Datebook program closes by “reminding you that all music was once new.” As if to lump Arnold Schoenberg and John Cage with Johann Sebastian Bach and Ludwig van Beethoven.

All music, painting, sculpture, dance, and literature was once new, but not all of it is good. Much (most?) of what has been produced since 1900 is inferior, self-indulgent crap.

As it was in the arts, so it was in politics. Yes, there was sleaze before 1900, and plenty of it. But presidents, members of Congress, and justices of the Supreme Court generally remained faithful to the Constitution, especially its restraints on the power of the federal government. Then along came populism and “progressivism” — the twin pillars of political modernism in the United States — and down went liberty and prosperity.

By Their Musical Preferences Ye Shall Know Them

Marginal Revolution has become an increasingly “marginal” blog because its dominant contributor, Tyler Cowen, has become increasingly incoherent. It turns out that Cowen is a fan of Elliott Carter, who writes incoherent “music,” of which many samples can be heard here.

Neither sound economics nor good music is consistent with incoherence. Therefore, I have scratched Marginal Revolution from my reading list, just as years ago I scratched my copy of a chamber-music LP to eradicate an unlistenable piece by Elliott Carter.

Musical Memories

The six songs I remember from an early age:

“I’ve Got Spurs That Jingle Jangle Jingle” (probably sung by Gene Autry)

“You Are My Sunshine” (probably sung by Jimmie Davis, who wrote it)

“Cool Water” (sung by the Sons of the Pioneers)

“Always” (sung by Dinah Shore)

“Mairzy Doats” (probably sung by the Andrews Sisters)

“Let It Snow”

Random Thoughts

Why is “gunite” pronounced gun-ite, whereas “granite” is pronounced gran-it?

If, in 1950, Harry Truman had said “four score and seven years ago,” he would have been referring to 1863, the year in which Abraham Lincoln uttered that famous phrase.
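For anyone checking the arithmetic (a score being twenty years):

\[ 1950 - (4 \times 20 + 7) = 1950 - 87 = 1863 \]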

In the computer industry, “email” is preferred to “e-mail.” But it seems to me that “e-mail” better represents the phrase “electronic mail.” The meaning of “e-mail” is immediately obvious to me; “email,” at first glance, looks like a typo.

If the dismal northern weather of early April and late October — which delayed the start of the 2008 baseball season in some cities and then disrupted the World Series — doesn’t convince Major League Baseball to lop two weeks from each end of the regular season, nothing will.

One of the funniest movies I’ve seen is Harold Lloyd’s Dr. Jack (1922). It starts slowly, but builds to a hilariously frantic finish. Lloyd’s Safety Last! is better known — and deservedly considered a comedy classic — but it isn’t half as funny as Dr. Jack.

Between novels, I have been slogging my way through Thomas K. McCraw’s Prophet of Innovation: Joseph Schumpeter and Creative Destruction. There’s too much armchair psychology in it, but it whets my appetite for Schumpeter’s classic Capitalism, Socialism, and Democracy, which (I hate to admit) I haven’t read. Schumpeter’s famous term for capitalism, “creative destruction,” often is applied with an emphasis on “destruction”; the emphasis should be on “creative.”

I must observe, relatedly, that my grandmother’s lifetime (1880-1977) spanned the invention and adoption of far more new technology than is likely to emerge in my lifetime, even if I live as long as my grandmother did.

Forgotten Stars

Richard Barthelmess

Warner Baxter

Bebe Daniels

Richard Dix

Kay Francis

Gail Patrick

One-Line Movie Reviews

Movies I have seen this year:

Once – Buskers’ holiday.

The Savages, Married Life – Good actors wasting their time and mine.

No Country for Old Men, There Will Be Blood, Before the Devil Knows You’re Dead, The Assassination of Jesse James by the Coward Robert Ford, Straight Time, 3:10 to Yuma, Gone Baby Gone, American Gangster – Good actors wasting their time and mine with gratuitous violence.

Interview, Sunshine, The Nines, Unconscious, Death at a Funeral, I’m Not There, Cassandra’s Dream, The American Friend – Weird and mysterious doings, sometimes funny, mostly just weird and mysterious.

The Bourne Ultimatum – Action for action’s sake.

The Man on the Flying Trapeze – W.C. Fields wings it.

Cave of the Yellow Dog, Into the Wild, The Tunnel, The Kite Runner, The Counterfeiters – Gripping reality.

A Little Princess, The Jane Austen Book Club – Enjoyable froth.

Michael Clayton – New Deal propaganda in the 21st century.

Becoming Jane, La Vie en Rose, The Whole Wide World, My Boy Jack – Well done biopics and period pieces.

Lust, Caution – Spicy Chinese fare.

Shadow of a Doubt – Overrated Hitchcock.

Safety Last!, Girl Shy – Hilarious silent stuff.

The Heart Is a Lonely Hunter, Tomorrow – Southern soul.

Atonement, This Is England, The Search for John Gissing, Son of Rambow – Excellent Britflicks.

Resurrecting the Champ, The Bucket List, The Great Debaters – Feel-good American films — barely bearable.

Ballet Shoes, Before the Rains, Miss Pettigrew Lives for a Day – Better-than-bearable Britflicks (two feel-gooders, one soaper with scenery); those accents do make a difference.

Lars and the Real Girl, Charlie Bartlett – Middle-age/teen-age angst.

The Bank Job – The best caper movie since Snatch; Topkapi in London, and more realistic.

Charlie Wilson’s War – Mr. Smith goes to Kabul, with laughs.

Sidekicks, with a Twist

A sidekick, according to Wikipedia,

is a stock character, a close companion who assists a partner in a superior position. Sancho Panza in Don Quixote, Doctor Watson in Sherlock Holmes and Batman’s companion Robin are some well-known sidekicks….

Sidekicks not only provide comic relief but can occasionally be brave or resourceful at times and rescue the hero from some dire fate: such as … Festus Haggen of Gunsmoke’s Matt Dillon….

Sidekicks also frequently serve as an emotional connection, especially when the hero is depicted as detached and distant, traits which would normally generate difficulty in making the hero likable. The sidekick is often the confidant who knows the main character better than anyone else and gives a convincing reason to like the hero. Although Sherlock Holmes was admittedly a difficult man to know, the friendship of Dr. Watson convinces the reader that Holmes is a good person….

While it is usually the reverse, it is not unheard of for a sidekick to be physically more conventionally attractive, charismatic, or physically capable than the character who is intended to be the hero. This is most typically encountered when the hero’s appeal is supposed to be intellect instead of sex appeal or physical prowess. Such characters are often middle aged or older and tend towards eccentricity; fictional sleuths and scientists for example. Such sidekicks are rarely encountered in fiction because the hero runs the risk of being upstaged by them. However, examples of successful such pairings include Inspector Morse and his sidekick DS Robbie Lewis, Nero Wolfe and his sidekick Archie Goodwin….

Other famous sidekicks — whose roles vis-a-vis their partners range from comic foil to friendly nemesis to voice of reason to stalwart ally — include (in no particular order):

I’m sure I’ve omitted other notable pairings. I’ll add them as they come to mind.

Two Tenors

Compare the legendary John McCormack (1884-1945), an Irish tenor whose career spanned five decades, and Brooklyn-born Franklyn Baur (1904-1950), whose career lasted less than ten years.

Both singers recorded many popular songs of the 1920s (McCormack samples here and here; Baur samples here). McCormack’s influence on Baur (among others) is unmistakable, most notably in Irving Berlin’s “You Forgot to Remember.” Baur masked his native accent more successfully than did McCormack. But that is no criticism of McCormack, whose distinctive, lilting voice was supported by exemplary vocalism.

Baur, the original first tenor of The Revelers, was the engine of that group’s originality and success. (Aural evidence of Baur’s influence can be heard on Breezin’ Along with The Revelers, where the group’s innovative, jazzy sound turns more traditional — even “barbershoppy” — following Baur’s departure.) Had it not been for the influence of The Revelers, as they were in Baur’s time, the Comedian Harmonists — an even better ensemble — might not have been formed. (If you’ve never heard of the Comedian Harmonists, you must see Comedian Harmonists, a 1997 dramatization of the group’s history that is both toe-tapping and touching.) And without McCormack, the world might not have come to embrace Irish tenors.

We are fortunate that so many examples of McCormack’s and Baur’s art survive them.

A Rumination on Red

I like red as an accent color. I especially like red’s brighter and deeper hues (for example, carmine, cerise, cherry, cinnabar, crimson, fuchsia, magenta, maroon, ruby, sanguine, scarlet, and vermillion). Therefore:

Classics on Film: Last of the Mohicans

Guest commentary by Postmodern Conservative.

When watching film versions of great books I am reminded of the old “classic comics”—those illustrated presentations of famous literature that were put out in the ’50s and ’60s. I managed to get hold of one or two ragged copies of them as a kid in the ’70s. Of course, there is always the danger that popular presentations of classic stories, abridged in print or film, can result in the dumbing-down of great literature. The Veggie Tales series, for example, goes too far in that direction, turning stories of the Bible and famous novels into silly preschool caricatures. It reveals the tendency of adults to underestimate children. But if done right, movies can give young people a taste for good books, and they can be enjoyable in their own right.

This past week my kids and I watched the 1971 BBC miniseries of James Fenimore Cooper’s The Last of the Mohicans. Like a lot of British productions from the period, it is low-budget, and you can see the same actors recycled as both British soldiers and Indians. Yet the filmmakers made the most of what they had. There are memorable characters and good dialogue. Cooper has Indians declaiming like Shakespearean actors, but that is no more anachronistic than having ancient Romans talk like Elizabethan Englishmen. What matters is the story. That is probably why my kids also liked the 1953 version of Julius Caesar. And there’s plenty of well-choreographed action in Last of the Mohicans—realistic but not too violent for younger viewers. No doubt because it was a British, non-Hollywood production, it stayed true to the original story, more accurate (and, I would say, probably more enjoyable) than the 1992 version.

Boswell’s Book

Guest commentary by Postmodern Conservative.

On a more uplifting note, here is a piece by Henrik Bering in Policy Review about James Boswell’s Life of Johnson, a 1,200+ page book that I’ve read through twice:

Among the great encounters of literature, none ranks higher than the one that took place between James Boswell and Samuel Johnson in Tom Davis’s bookstore in Russell Street, Covent Garden on Monday, May 16, 1763.

Of particular interest are [Johnson’s] reading habits. Dropping by for a visit, Boswell found Johnson dusting his books, with a “cloud of dust flying around him,” “wearing a pair of large gloves such as hedgers use,” and living up to Boswell’s uncle’s characterization of him as “a Herculean genius, born to grapple with whole libraries.” (When visiting others, Johnson would make a beeline for their bookshelves and lose himself completely, “almost brushing the books with his eyelashes,” as the novelist Fanny Burney has noted.) One of the Life’s nicest images shows us Johnson outside “swinging upon the low gate” of the Thrale residence without his hat, totally absorbed in his book.

Johnson was a host of contradictions: by turns kind and brutal, stern and forgiving, a subtle intellect which could be incredibly rigid, an intellectual bruiser and a kind and humane man, and for Boswell it was imperative to get the emphasis right (“The Ultimate Literary Portrait”).

Policy Review always has good political and social analysis, but this is the first literary essay I’ve seen there. It was enjoyable.

Richard Scarry Gets Scary?

Guest commentary by Postmodern Conservative.

Actually, Richard Scarry books have been dumbed down for years, but I only noticed it recently because of my young children. I regret now that some of their older Scarry books have bitten the dust from overuse. As it turns out, they were irreplaceable.

What about new editions and reprints? Don’t count on them. For example, I own a copy of Richard Scarry’s Best Storybook Ever, an original from the 1960s. It’s in rather poor shape, so I was thrilled to see that it has been reissued. It’s the most visually appealing of all his books, with some great stories. But it turns out that the story of the Quebec bruin, “Pierre Bear,” is gone. I imagine it’s because he is shown hunting seals and turning their pelts into fur coats.

For a sad comparison of the 1963 and 1991 editions of Scarry’s popular Best Word Book Ever, see this. Not only is the artwork altered in the name of political correctness; in many cases it is just plain remedial compared to Scarry’s originals. Another point made by critics is that the language has been made stupider than what kids a generation or two ago were reading. Unlike Scarry, these publishers don’t know how to write for children, only for overindulged leftist adults.

Unsplit Infinitives

Eugene Volokh, a known grammatical relativist, scoffs at “to increase dramatically,” as if “to dramatically increase” would be better. But better in what way: clearer or less stuffy? The meaning of “to increase dramatically” is clear. The only reason to write “to dramatically increase” would be to avoid the appearance of stuffiness.

Seeming unstuffy (i.e., without standards) is neither a necessary nor a sufficient reason to split an infinitive. The rule against split infinitives, like most other grammatical rules, serves the valid and useful purpose of preventing English from sliding yet further down the slippery slope of incomprehensibility than it has slid already. If an unsplit infinitive makes a clause or sentence seem awkward, the clause or sentence should be rewritten to avoid the awkwardness. Better that than to make an exception that leads to further exceptions — and thence to babel.

Related posts:
“Missing the Point” (28 Mar 2008)
“More Grammatical Anarchy” (31 Mar 2008)

Root Causes

No matter how much money is spent to combat poverty, crime, sloth, slovenliness, rudeness, obesity, poor grades, and all other “social ills,” those “ills” cannot and will not be cured unless three things change:

1. The state quits discouraging innovation, entrepreneurship, and capital investment through taxation and regulation.

2. The state quits subsidizing the less capable at the expense of the more capable, which subsidization (a) helps to ensure the reproduction of the less capable at a faster rate than that of the more capable, and (b) traps the less capable in a cycle of dependency which prevents them from taking ownership of their lives and striving to escape the ills of poverty, crime, sloth, etc.

3. The state quits discouraging heterosexual marriage, family formation, and the inculcation of traditional values. The state discourages those things through its bureaucracies, public schools, and public universities, which (altogether) celebrate “diversity” and “alternative lifestyles,” belittle religion, denigrate traditional marriage, foster teen sex through contraception, elevate abortion to a secular sacrament, underwrite single motherhood, encourage mothers to work outside the household, and enable couples to divorce at the drop of a hat rather than work out their differences.

I have written so many related posts that I cannot begin to list all of them. I refer you to “The Best of Liberty Corner.”

"John Adams"

Regarding the HBO mini-series, John Adams, I have two comments:

1. I never got used to the idea of Paul Giamatti as John Adams. Giamatti simply doesn’t look the part, and he never seemed to be comfortable with the hybrid English-Yankee accent chosen for his character.

2. The mini-series was geared to viewers who are largely ignorant of American history. Why else would the scriptwriters have kept injecting trite dialogue to “establish” well-known facts? The final episode, for example, included cumbersome reminders that John Quincy Adams (John’s eldest son) also served as president (1825-29), and that both John Adams and Thomas Jefferson died on the Fourth of July 1826, the fiftieth anniversary of the Declaration of Independence.

More Grammatical Anarchy

I noted in the previous post that Mark Liberman of Language Log is a grammatical anarchist. Perhaps grammatical anarchism is a condition of blogging at Language Log. Arnold Zwicky of that blog corrects a writer who refers to the subjunctive mood as the “subjective tense.” So far, so good. But Zwicky then goes on to excuse those who insist on using

the ordinary past rather than a special counterfactual form (often called “the subjunctive” or “the past subjunctive”) for expressing conditions contrary to fact….

…There’s absolutely nothing wrong with using the special counterfactual form — I do so myself — but there’s also nothing wrong with using the ordinary past to express counterfactuality. It’s a matter of style and personal choice, and no matter which form you use, people will understand what you are trying to say.

But somehow preserving the last vestige of a special counterfactual form has become a crusade for some people. There are surely better causes.

There may be “better causes,” but Zwicky’s ceding of grammatical ground to “personal choice” leads me to doubt that he will fight for those causes.

Missing the Point

Mark Liberman of Language Log has devoted at least three posts to James J. Kilpatrick’s supposed linguistic socialism. Kilpatrick stands accused (gasp!) of trying to propound rules of English grammar. Given that Kilpatrick can’t enforce such rules, except in the case of his own writing, it seems to me that Liberman is overreacting to Kilpatrick’s dicta.

I am not surprised by Liberman’s reaction to Kilpatrick, given that Liberman seems to be a defender of grammatical anarchy. Liberman tries to justify his anarchistic approach to grammar by quoting from Friedrich Hayek’s Law, Legislation and Liberty, Volume 1: Rules and Order; for example:

Man … is successful not because he knows why he ought to observe the rules which he does observe, or is even capable of stating all these rules in words, but because his thinking and acting are governed by rules which have by a process of selection been evolved in the society in which he lives, and which are thus the product of the experience of generations.

All of which is true, but misinterpreted by Liberman.

First, given that Kilpatrick cannot dictate the rules of grammar, he is a mere participant in the “process of selection” which shapes those rules. In a world that valued effective communication, Kilpatrick’s views would be given more weight than those of, say, a twenty-something who injects “like, you know,” into every sentence. But whether or not Kilpatrick’s views are given more weight isn’t up to Kilpatrick. However much Kilpatrick might like to be a linguistic authoritarian, he is not one.

Second, Hayek’s observation has nothing to do with anarchy, although Liberman wants to read into the passage an endorsement of anarchy. Hayek’s real point is that rules which survive, or survive with incremental modifications, do so because they are more efficient (i.e., more effective, given a resource constraint) than rules that fall by the wayside.

Kilpatrick, and other “strict constructionists” like him, can’t dictate the course of the English language, but they can strive to make it more efficient. Certainly the thought that they give to making English a more efficient language (or forestalling its devolution toward utter inefficiency) should be praised, not scorned.

Language games can be fun, but language is much more than a game, contra Liberman’s approach to it. Language is for communicating ideas — the more efficiently, the better. But, in the three posts linked here, Liberman (strangely) has nothing to say about the efficiency of language. He seems more concerned about James J. Kilpatrick’s “linguistic socialism” than about the ability of writers and speakers to deploy a version of English that communicates ideas clearly.

Well, at least Liberman recognizes socialism as a form of authoritarianism.

Singing It, Proudly

This should bring a lump to your throat and a tear to your eye.

(Thanks to Mark Perry for the pointer.)

A Fighting Frenchman

Yesterday I watched La Môme (a.k.a. La Vie en Rose), a sad film about the sad life of Édith Piaf. Watching the film, I was reminded that Piaf had an affair with boxer Marcel Cerdan, a French pied-noir.

Cerdan won several boxing titles as a welterweight and middleweight, including the world middleweight championship. His career came to a tragic end in October 1949, when he was killed in a plane crash while en route to meet Piaf in New York, where she was then performing.

Having been reminded of Cerdan, it occurred to me that he may have been the last Frenchman who (a) fought and (b) won.

On Prejudice

I have just finished reading Theodore Dalrymple’s In Praise of Prejudice: The Necessity of Preconceived Ideas. Dalrymple’s thesis is simple but profound: We cannot (and do not) operate in this world without the benefit of preconceived ideas about how the world works. If we tried to do so, we would be as helpless as babes in the wood.

To state Dalrymple’s thesis so baldly is to do a grave injustice to the lucidity, incisiveness, elegance, and ruthless logic of his short book. At the outset, Dalrymple makes it clear that he holds no brief for racial and ethnic prejudice. As he points out: “No prejudice, no genocide.” But he adds that

If the existence of a widespread prejudice is necessary for the commission of genocide, it is certainly not a sufficient one. Nor does it follow from the fact that all who commit genocide are prejudiced that all who are prejudiced commit genocide.

Dalrymple spends many pages (fruitfully) eviscerating John Stuart Mill’s simplistic liberalism, which holds that one may do as one pleases as long as (in one’s own opinion) one does no harm to others. This belief (itself a prejudice) has led to what Dalrymple calls “radical individualism” — and it is just that, despite the efforts of libertarian apologists to demonstrate otherwise. Dalrymple offers a spot-on diagnosis of the wages of radical individualism:

What starts out as a search for increased if not total individualism ends up by increasing the power of government over individuals. It does not do so by the totalitarian method of rendering compulsory all that is not forbidden … but by destroying all moral authority that intervenes between individual human will and governmental power…. “There is no law against it” becomes an unanswerable justification for conduct that is selfish and egotistical.

This, of course, makes the law, and therefore those who make the law, the moral arbiters of society. It is they who, by definition, decide what is permissible and what is not….

Given the nature of human nature, it hardly needs pointing out that those who are delegated the job of moral arbiter for the whole of society enjoy their power and come to think that they deserve it, and that they have been chosen for their special insight into the way life should be lived. It is not only legislators who succumb to this temptation but judges also….

Dalrymple, an admitted non-believer, also slices through the pretensions of Peter Singer and Richard Dawkins, strident atheists both. He exposes their prejudices, which they try to conceal with the language of science and bombastic certitude.

There is much more in this delightful book. I offer a final sample:

In order to prove to ourselves that we are not prejudiced, but have thought out everything for ourselves, as fully autonomous (if not responsible) human beings should, we have to reject the common maxims of life that in many, though not in all, cases, preserve civilized relations. Enlightenment, or rather, what is so much more important for many people, a reputation for enlightenment, consists in behaving in a way contrary to those maxims. And once a common maxim of life is overthrown in this fashion, it is replaced by another — often, though of course not always, a worse one.

Social norms that have passed the test of time are more likely than not to be beneficial. And so we owe them the benefit of the doubt, instead of discarding them for the sake of change, that is, for the sake of new prejudices.

I urge you to buy In Praise of Prejudice, to read it, and to re-read it (as I will do).

Related:
“The Meaning of Liberty” (25 Mar 2006)
“Atheism, Religion, and Science Redux” (01 Jul 2007)