Back to the Drawing Board: Reflections on Architecture

Guest post:

A recent exhibit at the Library of Virginia, Never Built Virginia (January 11 – May 21, 2008), documents architectural designs that never made it off the drawing board. Ranging from prosaic 19th-century churches to ugly modern high-rises, the exhibit forms an interesting cultural and aesthetic chronicle. A few items stand out, like the magnificent Greco-Roman concept for the Library of Virginia, proposed in the 1930s. Unfortunately, it was shelved in favor of a drab art deco structure (not the best specimen of that style) when the library was rebuilt in 1940. The state library has since been relocated to a retro-modern, and not totally ungraceful, building just down the street.

Not all modernism is bad, but a little bit goes a long way. And when we are told that “Virginia’s deep-rooted traditionalism doomed many [architectural] schemes,” we can be glad. Looking at plans from a few decades ago for the James River area—angular, massive poured-concrete structures—one can only be thankful that development was postponed until the recent neo-classical revival, in which most of the buildings going up exhibit tasteful Georgian lines to match the historic downtown.

One of the architects highlighted in Never Built Virginia is Haigh Jamgochian, a 1960s disciple of hyper-modernism. That he is a misanthropic recluse who has made a career (like so many modern “creative” people) of not actually doing anything seems appropriate. Admittedly, his drawings and models are curious to look at, like the whimsical futurist predictions of old science fiction movies. Jamgochian cites the original Star Trek show as an early influence. But the minute you actually throw up such edifices on real streets, amidst venerable brick, stone, and stucco structures, the effect is monstrous. Jamgochian was not very successful in selling his designs, but plenty of disasters from the ’60s and ’70s are still scattered about Richmond, damaging the landscape. Fortunately, Richmond is an established East Coast city, and enough of its traditional buildings have survived to maintain its character.

Perhaps the most that can be said for classic modernism is its symmetry. Of course symmetry is not enough to make a good building. But it’s impossible to imagine good design without it. In that respect postmodernism, with its chaotic fragmentation, is only a further step in the direction of artistic decay in which even traditional elements are haphazardly plundered in the way that barbarians of the Dark Ages appropriated bits and pieces from handsome temples and palaces to construct their poorly made hovels. The effect is to evoke not so much admiration as pity.

The Arts: Where Regress is "Progress"

Bookworm (of Bookworm Room) shares my disdain for modern art forms, a disdain that I express and explain in these posts:

“Speaking of Modern Art” (24 Jul 2004)
“Making Sense about Classical Music” (23 Aug 2004)
“An Addendum about Classical Music” (24 Aug 2004)
“My Views on ‘Classical’ Music, Vindicated” (02 Feb 2005)
“A Quick Note about Music” (29 Jun 2005)
“All That Jazz” (03 Nov 2006)

In the early decades of the twentieth century, the various arts became an “inside game.” Painters, sculptors, composers (of “serious” music), and choreographers began to create works not for the enjoyment of audiences but for the sake of exploring “new” forms. Given that the various arts had been perfected by the early 1900s (at the latest), the only way to explore “new” forms was to regress toward primitive ones — toward a lack of structure, as Bookworm calls it. Aside from its baneful influence on many true artists, the regression toward the primitive has enabled persons of inferior talent (or none) to call themselves “artists.” Thus modernism is banal when it is not ugly.

Painters, sculptors, etc., have been encouraged in their efforts to explore “new” forms by critics, by advocates of change and rebellion for its own sake (e.g., “liberals” and “bohemians”), and by undiscriminating patrons, anxious to be au courant. Critics have a special stake in modernism because they are needed to “explain” its incomprehensibility and ugliness to the unwashed.

The unwashed have nevertheless rebelled against modernism, and so its practitioners and defenders have responded with condescension, one form of which is the challenge to be “open minded” (i.e., to tolerate the second-rate and nonsensical). A good example of condescension is heard on Composers Datebook, a syndicated feature that runs on some NPR stations. Every Composers Datebook program closes by “reminding you that all music was once new.” As if to lump Arnold Schoenberg and John Cage with Johann Sebastian Bach and Ludwig van Beethoven.

All music, painting, sculpture, and dance were once new, but new doesn’t necessarily mean good. Much (most?) of what has been produced since 1900 (if not before) is inferior, self-indulgent crap.

My Year at the Movies (2007)

Actually, 2007 was (as usual) a year of sitting in the comfort of my home watching DVDs. I find movie theaters to be messy places filled with inconsiderate, blabbering idiots.

I watched (or started) 80 theatrical films in 2007. (I also saw 15 made-for-TV movies and mini-series.) I list below the 80 theatrical films, in the order in which I watched them. I indicate for each film the year of its release (according to IMDb) and my rating (on a 10-point scale). Most of my ratings are relatively high (6 points and up), which indicates selectivity in choosing films.

Not that I am always able to avoid stinkers, as you will see. Five rentals stand out as especially egregious choices: The Da Vinci Code, Black Snake Moan, Dreamgirls, Live Free or Die Hard, and Superbad. Five other almost-as-bad choices were Casino Royale, The Draughtsman’s Contract, Seraphim Falls, Spider-Man 3, and Ratatouille.

Happily, I saw 26 films to which I have given a rating of 8 or higher: Roberta, The Others, Dear Frankie, The Browning Version (1951), The Navigator, Stranger Than Fiction, Empire of the Sun, Volver, El Aura, The History Boys, Notes on a Scandal, The Queen, Children of Men, The Painted Veil, Venus, Breach, Sweet Land, Flags of Our Fathers, The Accidental Tourist, The Chorus, The Lives of Others, Away from Her, Snow Cake, Vitus, The Cameraman, and The Ladykillers (1955).

Here is a guide to my ratings:
1 – unwatchable
2 – watched all the way through, to my regret
3, 4, 5 – varying degrees of entertainment value, but altogether a waste of time
6 – generally engaging, but noticeably flawed in some way (e.g., a weak performance in a major role, a trite story, a contrived ending, insufficient resolution of plot or sub-plot)
7 – well done in all respects, with only a few weak spots; enjoyable but not scintillating
8 – a thoroughly engaging movie; its weak spots (e.g., a corny plot), if any, are overwhelmed by scintillating performances (e.g., the spectacular dancing of Astaire and Rogers), sustained hilarity, a compelling plot, a witty script, etc.
9 – an “8” that is so good it bears re-watching (a rating I have given to only 61 of the more than 2,000 theatrical films I’ve seen)
10 – a movie that I didn’t want to end; a masterpiece of film-making (a rating I have given to only 5 of the theatrical films I’ve seen)

Here’s my movie list for 2007:

Walk the Line (2005) 7
Little Manhattan (2005) 7
Roberta (1935) 8
The Others (2001) 8
The Illusionist (2006) 7
Foreign Correspondent (1940) 6
Happy Accidents (2000) 7
The Captain’s Paradise (1953) 6
Conversations with Other Women (2005) 7
The Counterfeit Traitor (1962) 7
Kinky Boots (2005) 7
Dear Frankie (2004) 8
Hollywoodland (2006) 7
The Heiress (1949) 7
The Browning Version (1951) 9
The Departed (2006) 7
The Navigator (1924) 8
Flushed Away (2006) 7
Keeping Mum (2005) 7
The Prestige (2006) 7
Stranger Than Fiction (2006) 8
The Da Vinci Code (2006) 2
Empire of the Sun (1987) 8
Casino Royale (2006) 4
Immortal Beloved (1994) 7
The Draughtsman’s Contract (1982) 4
The Silent Partner (1978) 7
Volver (2006) 8
El Aura (2005) 8
Three Kings (1999) 6
The Good Shepherd (2006) 7
The History Boys (2006) 8
Notes on a Scandal (2006) 8
Modern Times (1936) 7
Plein Soleil (Purple Noon) (1960) 7
The Queen (2006) 8
Children of Men (2006) 8
Dreamgirls (2006) 2
The Painted Veil (2006) 8
Deja Vu (2006) 7
Venus (2006) 8
Seraphim Falls (2006) 4
Letters from Iwo Jima (2006) 7
El Laberinto del Fauno (Pan’s Labyrinth) (2006) 7
Breach (2007) 8
Miss Potter (2006) 7
Becket (1964) 7
The Freshman (1925) 6
The Big Clock (1948) 7
Sweet Land (2005) 8
Black Snake Moan (2006) 1
La Tourneuse de pages (The Page Turner) (2006) 7
Why Worry? (1923) 7
Hot Fuzz (2007) 6
The Woman in the Window (1944) 7
Cashback (2006) 7
Flags of Our Fathers (2006) 8
The Accidental Tourist (1988) 8
Fracture (2007) 7
Les Choristes (The Chorus) (2004) 8
The Lookout (2007) 7
Das Leben der Anderen (The Lives of Others) (2006) 8
The Ultimate Gift (2006) 7
Away from Her (2006) 8
The Wind That Shakes the Barley (2006) 7
Snow Cake (2006) 8
Zwartboek (Black Book) (2006) 7
Spider-Man 3 (2007) 5
Ten Canoes (2006) 7
The Tracker (2002) 6
Live Free or Die Hard (2007) 2
Waitress (2007) 7
Vitus (2006) 8
Superbad (2007) 1
Ratatouille (2007) 5
The Cameraman (1928) 8
Harry Potter and the Order of the Phoenix (2007) 6
Stardust (2007) 6
Amazing Grace (2006) 7
The Ladykillers (1955) 8

At the Movies: The Best and Worst Years

Further thoughts on the decline of the movie industry. (Earlier thoughts, replete with details, here.) Based on my ratings of films released since 1930, these are the best vintages:

1933 (56 percent rated 8 or higher on a 10-point scale)
1934 (63%)
1936 (55%)
1938 (75%)
1939 (59%)
1941 (65%)
1954 (57%)*
1974 (60%)**

And these are the worst (also see footnote ***):

1963 (16%)
1969 (15%)
1976 (12%)
1978 (6%)
1985 (15%)
1996 (16%)
2007 (8%)

Excellent films (rating of 8 or higher) as a percentage of films seen, by decade of release:

1930s – 52%
1940s – 36%
1950s – 32%
1960s – 31%
1970s – 28%
1980s – 27%
1990s – 22%
2000s – 22%

Some things have improved markedly over the years (e.g., the quality of automobiles and personal computers). Some things have not: government and entertainment, especially.

Movies are no longer as compelling and entertaining as they used to be. Why? For me, it’s film-makers’ growing reliance on profanity, obscenity, violence, unrealistic graphics, and “social realism” (i.e., depressing situations, anti-capitalist propaganda). To rent a recently released movie (even one that has garnered good reviews) is to play “Hollywood roulette.”
__________
* An aberration in what I call the “Abysmal Years”: 1943-1965.
** An aberration in what I call the “Vile Years”: 1966-present.
*** Tied at 17% are 1943, 1944, 1975, 1991, 1998, and 2005 — all among the Abysmal and Vile Years.

"The War": Final Grade

See this, this, and this for my reactions to the first six episodes of Ken Burns’s The War.

REVISED, 11/17/07

Having now seen the seventh and final episode of The War, I give the series a grade of “D”; it escapes an “F” only for its willingness to say, hesitantly, that

  • World War II was, for the United States, a necessary war because of the nature of the enemy. It was, therefore, worth its cost in lives, limbs, and money.
  • It was, in the end, necessary to drop A-bombs on Japan in order to bring the war to an end and avert an invasion of Japan — an invasion that would have cost the lives of millions of Americans and Japanese.

But we already knew those things, didn’t we?

Like episodes two through six, episode seven suffers from viewpoint confusion. The War makes the points I list above, then — time after time — retracts or undermines them. In episode seven, for example, we hear again from the egregious Paul Fussell (see this), who clearly implies that the war wasn’t worth fighting until the Holocaust came to light, late in the war.

And there is the insistence on presenting “balanced” reactions to the dropping of A-bombs on Japan. One of the “witnesses” who appears throughout the series staunchly defends the act. Another notes its strategic wisdom but still wishes it hadn’t been necessary. But it was necessary — and, really, an act of mercy toward the Japanese as well as to America’s fighting men. Why pander to the nay-sayers, who will go to their graves condemning the act, in spite of its moral necessity?

Burns and company, I fear, simply wanted to make a “blockbuster.” To that end, they chose World War II and the “greatest generation” — subjects guaranteed to elicit sympathy and lull the viewer into agreement with the film’s subtext, which has two main elements.

One element is voiced at the very end of the final episode, in the dedication. It is to those who served in World War II, “that necessary war” (emphasis added), not “a necessary war,” as the first episode has it. The implication is that no later war was or is necessary — certainly not the present one.

The second element of the subtext reinforces the first one, and it is less subtle. That second element is The War’s insistence on playing up America’s moral failings (as discussed above and in my second and third reactions to The War). The intended message is that because of our moral failings, and because war is hell, World War II was barely worth fighting, although it seemed necessary at the time (even to the Left). Therefore, given the murkiness of our present cause — as proclaimed loudly by the Leftists who have come to dominate the media and academe — the war in Iraq (and perhaps the war on terror) is unjustified because America remains morally imperfect and war remains hellish. The Left proclaims an act of war against anyone but Hitler (not a Hitler, the Hitler) to be an act of hypocrisy and brutality by a morally imperfect nation.

That is Metaethical Moral Relativism (MMR), about which I have written:

It treats different groups as if they had different moral imperatives. By and large, they do not; most groups (or, more exactly, most of their members) have the same moral imperative: The Golden Rule.

There are, of course, groups that seldom if ever observe The Golden Rule. Such groups are ruled by force and fear, and they deny voice and exit to their members. The rulers of such groups are illegitimate because they systematically try to suppress observance of The Golden Rule, which is deep-seated in human nature. Other groups may therefore justly seek to oust and punish those despotic rulers.

I go on to point out that MMR, these days, seems to take this form:

The United States is imperfect. It is, therefore, no better than its enemies.

Such is the relativism we see in those who excuse despotic, murderous regimes and movements because “we asked for it” or “we are no better than they are” or “war is never the answer” or “one man’s terrorist is another man’s freedom fighter” or “terrorists deserve the protections of the Geneva Convention.” That kind of relativism empowers the very despots and terrorists whose existence is an affront to The Golden Rule.

The War is a barely redeemed exercise in Metaethical Moral Relativism. I say that only because its subtext may escape many viewers who are not of the Left. As for the Left, it had embraced MMR long before The War appeared on PBS; The War merely affirms

the American Left’s long-standing allegiance to anti-defense, anti-war dogmas, under which lies the post-patriotic attitude that America is nothing special, just another place to live.

Related posts:
Shall We All Hang Separately?
Foxhole Rats
Foxhole Rats, Redux
The Faces of Appeasement
We Have Met the Enemy . . .
Whose Liberties Are We Fighting For?
Words for the Unwise
More Foxhole Rats
Post-Americans and Their Progeny
Anti-Bush or Pro-Treason?
Com-Patriotism and Anti-Patriotic Acts
Depressing but True
Katie Couric: Post-American

PC Madness

“Dumbledore was gay,” says J.K. Rowling.

“Pinocchio was Chucky’s father,” say I.

Both statements bear the same relation to reality, which is none. Mine, at least, isn’t tritely au courant.

UPDATE: For a hilarious parody of Rowling’s PC “revelation,” see this post at The Needle.

Punctuation

David Bernstein of The Volokh Conspiracy writes:

I frequently have disputes with law review editors over the use of dashes. Unlike co-conspirator Eugene, I’m not a grammatical expert, or even someone who has much of an interest in the subject.

But I do feel strongly that I shouldn’t use a dash between words that constitute a phrase, as in “hired gun problem”, “forensic science system”, or “toxic tort litigation.” Law review editors seem to generally want to change these to “hired-gun problem”, “forensic-science system”, and “toxic-tort litigation.” My view is that “hired” doesn’t modify “gun”; rather “hired gun” is a self-contained phrase. The same with “forensic science” and “toxic tort.”

Most of the commenters (thus far) are right in advising Bernstein that the “dashes” — he means hyphens — are necessary. Why? To avoid confusion as to what is modifying the noun “problem.”

In “hired gun,” for example, “hired” (adjective) modifies “gun” (noun, meaning “gunslinger” or the like). But in “hired-gun problem,” “hired-gun” is a compound adjective which requires both of its parts to modify “problem.” It is not a “hired problem” or a “gun problem,” it is a “hired-gun problem.” The function of the hyphen is to indicate that “hired” and “gun,” taken separately, are meaningless as modifiers of “problem,” that is, to ensure that the meaning of the adjective-noun phrase is not misread.

A hyphen isn’t always necessary in such constructions, but using it consistently avoids confusion and the possibility of misinterpretation.

The consistent use of the hyphen to form a compound adjective has a counterpart in the consistent use of the serial comma, which is the comma that precedes the last item in a list of three or more items (e.g., the red, white, and blue). Newspapers (among other sinners) eschew the serial comma for reasons too arcane to pursue here. Thoughtful counselors advise its use. Why? Because the serial comma, like the hyphen in a compound adjective, averts ambiguity. It isn’t always necessary, but if it is used consistently, ambiguity can be avoided. (Here’s a great example, from the Wikipedia article linked in the first sentence of this paragraph: “To my parents, Ayn Rand and God.” The writer means, of course, “To my parents, Ayn Rand, and God.”)

This all reminds me of the unfortunate demise of the comma in adjectival phrases. If “hired” and “gun” were meant to modify “problem” separately, the expression would (should) be written “hired, gun problem.” Not that “hired, gun problem” means anything, but if it did, the proper use of a comma between “hired” and “gun” would ensure against misreading the phrase “hired gun problem” (unpunctuated, as Bernstein prefers) as “hired-gun problem.”

A little punctuation goes a long way.

Austin: Not the Live Music Capital of the World

The City of Austin likes to claim that Austin is “The Live Music Capital of the World.” I suppose that’s a more palatable claim than the more apt description, “The People’s Republic of Austin.” That moniker is owed to the predilections of Austin’s all-Democrat city council, which dispenses taxpayers’ money like manna from city hall, itself a monstrous monument to Austin’s commissariat:


In any event, Austin is not the live music capital of the world. It isn’t even the live music capital of the U.S., according to a new study of the music industry in the nation’s 50 most populous metropolitan areas:

  • Figure 4 of that study indicates that Austin ranks fourth in the number of musicians employed in the music industry per 1,000 residents. The top three: Nashville (first by a wide margin), New York, Los Angeles.
  • Figure 6 indicates that Austin ranks fourth in the total number of persons directly employed in the music industry per 1,000 residents. The top three: Nashville (first by a wide margin), Los Angeles, New York.
  • Austin is among the laggards in absolute numbers of musicians, other industry employees, number of establishments, payrolls, and revenues (figures 2, 3, 7, 9, 10, 11).

It’s evident that Austin is not the capital of anything when it comes to music. It may be the capital of smugness, though San Francisco, Manhattan, and a few other places are probably in Austin’s class (and I don’t mean “classy”).

Austin is the capital of Texas. Well, more precisely, the Capitol of Texas is in Austin. But that’s a historical accident. Austin is to the rest of Texas (Houston excepted) as George W. Bush is to MoveOn.org.

September Songs

Anticipating September, I searched today for audio clips of my two favorite September songs: “September Song” and “September in the Rain.” I was looking especially for clips featuring Walter Huston (“September Song”) and James Melton (“September in the Rain”) because their renditions are the definitive originals.

Huston introduced “September Song” in the 1938-39 Broadway musical, Knickerbocker Holiday. That Huston was an actor and not a trained (or very skilled) singer makes his delivery all the more sincere and poignant. It is especially poignant for me because I remember first hearing the song (and Huston singing it) on an autumn day almost fifty years ago. Here’s the clip.

Melton, on the other hand, was a trained singer. The affected style of some of his recordings (especially the early ones) can be off-putting, but it’s a style that was common in those days (the 1920s and 1930s) and it doesn’t (for me) detract from the beauty of Melton’s voice. Now an obscure figure, Melton had a long career on the band platform, on radio, in movies, in opera, and on TV. (I remember watching his short-lived 1951 TV series.) Melton possessed a bright, ringing tenor voice with a melancholic edge: a perfect voice for “September in the Rain,” which Melton made famous in the 1937 film, Melody for Two. Here’s the clip.

Fore — uh — word

UPDATED AT 5:58 P.M.

The estimable Arnold Kling blogs at EconLog (with Bryan Caplan), which is a must-read for me. Today, Arnold stumbles a bit on spelling. He writes “foreward” where he means “foreword”: “a short introductory essay preceding the text of a book.”

“Foreward” might seem to be a conflation of “foreword” and “forward” (a relative direction). But, no, “foreward” is a word, if an archaic one, which is only tangentially related to “foreword” and “forward.” It means “the van; the front,” as in a battle. (It’s a new one on me.)

Given the context, however, the word Arnold wants is “foreword.”

UPDATE: Arnold has corrected his spelling error. I thank him for the acknowledgment, and for the opportunity to learn a new word (“foreward”).

The Movies: (Not) Better Than Ever

UPDATED BELOW, 06/23/07

According to the lists of movies that I keep at the Internet Movie Database (IMDb), I have thus far seen 2,034 theatrically released feature films in my lifetime. That number does not include such forgettable fare as the grade-B westerns, war movies, and Bowery Boys comedies that I saw on Saturdays, at two-for-a-nickel, during my pre-teen years.

I have given 570 (28 percent) of those 2,034 films a rating of 8, 9, or 10 (out of 10). The proportion of high ratings does not indicate low standards on my part; rather, it indicates the care with which I have (usually) selected films for viewing.

I call the 570 highly rated films my favorites. I won’t list all of them here, but I will mention some of them — and their stars — as I analyze the up-and-down history of film-making.

I must, at the outset, admit two biases that have shaped my selection of favorite movies. First, because I’m a more or less typical American movie-goer (or movie-viewer, since the advent of cable, VCR, and DVD), my list of favorites is dominated by American films starring American actors.

A second bias is my general aversion to silent features and early talkies. Most of the directors and actors of the silent era relied on “stagy” acting to compensate for the lack of sound — a style that persisted into the early 1930s. There are exceptions, of course. Consider Charlie Chaplin, whose genius as a director and comic actor made a virtue of silence; my list of favorites includes two of Chaplin’s silent features: The Gold Rush (1925) and City Lights (1931). Perhaps a greater comic actor (and certainly a more physical one) than Chaplin was Buster Keaton, whose Our Hospitality (1923), The Navigator (1924), Sherlock Jr. (1924), The General (1927), and Steamboat Bill Jr. (1928) outnumber Chaplin’s contributions to my favorites. My list of favorites includes only ten other films from the years before 1933, among them F.W. Murnau’s Nosferatu the Vampire (1922) and Fritz Lang’s Metropolis (1927) — the themes of which (supernatural and futuristic, respectively) enabled them to transcend the limitations of silence — and such early talkies as Whoopee! (1930), Dracula (1931), and Grand Hotel (1932).

On the whole, I can recall having seen only 42 feature films that were released before 1933, 17 of which (40 percent) rank among my favorites. (I plan, however, to increase that number as I sample other highly rated silent films, including several of Harold Lloyd’s.) So, I will say no more here about films released before 1933. I will focus, instead, on movies released from 1933 to the present — which I consider the “modern” era of film-making.

My inventory of modern films comprises 1,992 titles, of which I have rated 553 at 8, 9, or 10 on the IMDb scale. But the overall proportion of favorites (28 percent) masks vast differences in the quality of modern films, which were produced in three markedly different eras:

  • the Golden Age (1933-1942) — 179 films seen, 96 favorites (54 percent)
  • the Abysmal Years (1943-1965) — 317 films seen, 98 favorites (31 percent)
  • the Vile Years (1966-present) — 1,496 films seen, 359 favorites (24 percent)
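For readers who want to check the arithmetic, here is a minimal Python sketch (not part of the original post) that recomputes the percentages from the counts quoted in the list above:

```python
# Sanity-check of the era tallies quoted above; the (seen, favorites) counts
# are taken from the post itself.
eras = {
    "Golden Age (1933-1942)": (179, 96),
    "Abysmal Years (1943-1965)": (317, 98),
    "Vile Years (1966-present)": (1496, 359),
}

total_seen = sum(seen for seen, _ in eras.values())
total_favs = sum(favs for _, favs in eras.values())

for era, (seen, favs) in eras.items():
    print(f"{era}: {favs}/{seen} = {favs / seen:.0%}")

# 553 favorites out of 1,992 modern films, i.e., 28 percent overall.
print(f"All modern films: {total_favs}/{total_seen} = {total_favs / total_seen:.0%}")
```

The per-era shares come out to 54, 31, and 24 percent, and the totals (1,992 films, 553 favorites, 28 percent) match the figures stated in the text.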

What made the Golden Age golden, and why did films go from golden to abysmal to vile? Read on.

To understand what made the Golden Age golden, let’s consider what makes a great movie: a novel or engaging plot, dialogue that is fresh if not witty, and strong performances (acting, singing, and/or dancing). (A great animated feature may be somewhat weaker on plot and dialogue if the animations and sound track are first rate.) The Golden Age was golden largely because the advent of sound fostered creativity — plots could be advanced through dialogue, actors could deliver real dialogue, and singers and dancers could sing and dance with abandon. It took a few years to fully realize the potential of sound, but movies hit their stride just as the country was seeking respite from worldly cares: first, a lingering and deepening Depression, then the growing certainty of war.

Studios vied with each other to entice movie-goers with new plots (or plots that seemed new when embellished with sound), fresh and often wickedly witty dialogue, and — perhaps most important of all — captivating performers. The generation of super-stars that came of age in the 1930s consisted mainly of handsome men and beautiful women, blessed with distinctive personalities, and equipped by their experience on the stage to deliver their lines vibrantly and with impeccable diction.

What were the great movies of the Golden Age, and who starred in them? Here’s a sample of the titles: 1933 — Dinner at Eight, Flying Down to Rio, Morning Glory; 1934 — It Happened One Night, The Thin Man, Twentieth Century; 1935 — Mutiny on the Bounty, A Night at the Opera, David Copperfield; 1936 — Libeled Lady, Mr. Deeds Goes to Town, Show Boat; 1937 — The Awful Truth, Captains Courageous, Lost Horizon; 1938 — The Adventures of Robin Hood, Bringing up Baby, Pygmalion; 1939 — Destry Rides Again, Gunga Din, The Hunchback of Notre Dame, The Wizard of Oz, The Women; 1940 — The Grapes of Wrath, His Girl Friday, The Philadelphia Story; 1941 — Ball of Fire, The Maltese Falcon, Suspicion; 1942 — Casablanca, The Man Who Came to Dinner, Woman of the Year.

And who starred in the greatest movies of the Golden Age? Here’s a goodly sample of the era’s superstars, a few of whom came on the scene toward the end: Jean Arthur, Fred Astaire, John Barrymore, Lionel Barrymore, Ingrid Bergman, Humphrey Bogart, James Cagney, Claudette Colbert, Ronald Colman, Gary Cooper, Joan Crawford, Bette Davis, Irene Dunne, Nelson Eddy, Errol Flynn, Joan Fontaine, Henry Fonda, Clark Gable, Cary Grant, Jean Harlow, Olivia de Havilland, Katharine Hepburn, William Holden, Leslie Howard, Allan Jones, Charles Laughton, Carole Lombard, Myrna Loy, Jeanette MacDonald, Joel McCrea, Merle Oberon, Laurence Olivier, William Powell, Ginger Rogers, Rosalind Russell, Norma Shearer, Barbara Stanwyck, James Stewart, and Spencer Tracy. There were other major stars, and many popular supporting players, but it seems that a rather small constellation of superstars commanded most of the leading roles in the best movies of the Golden Age — most of the great movies and many others of merit.

Why did movies go into decline after 1942’s releases? World War II certainly provided an impetus for the end of the Golden Age. The war diverted resources from the production of major theatrical films; grade-A features gave way to low-budget fare. And some of the superstars of the Golden Age went off to war. (Two who remained civilians — Leslie Howard and Carole Lombard — were killed during the war.) With the resumption of full production in 1946, the surviving superstars who hadn’t retired were fading fast, though their presence still propelled many a movie. In fact, superstars of the Golden Age starred in 44 of my 98 favorites from the Abysmal Years but only two of my 359 favorites from the Vile Years.

Stars come and go, however, as they have done since Shakespeare’s day. The Abysmal and Vile Years have deeper causes than the dimming of old stars:

  • The Golden Age had deployed all of the themes that could be used without explicit sex, graphic violence, and crude profanity — none of which became an option for American movie-makers until the mid-1960s.
  • Prejudice got significantly more play after World War II, but it’s a theme that can’t be used very often without boring audiences.
  • Other attempts at realism (including film noir) resulted mainly in a lot of turgid trash laden with unrealistic dialogue and shrill emoting — keynotes of the Abysmal Years.
  • Hollywood productions sank to the level of TV, apparently in a misguided effort to compete with that medium. The garish technicolor productions of the 1950s often highlighted the unnatural neatness and cleanliness of settings that should have been rustic if not squalid.
  • The transition from abysmal to vile coincided with the cultural “liberation” of the mid-1960s, which saw the advent of the “f” word in mainstream films. Yes, the Vile Years have brought us more realistic plots and better acting (thanks mainly to the Brits). But none of that compensates for the anti-social rot that set in around 1966: drug-taking, drinking, and smoking are glamorous; profanity proliferates to the point of annoyance; sex is all about lust and little about love; violence is gratuitous and beyond the point of nausea; corporations and white, male Americans with money are evil; the U.S. government (when Republican-controlled) is in thrall to that evil; etc., etc., etc.

There have been, of course, outbreaks of greatness since the Golden Age. During the Abysmal Years, for example, aging superstars appeared in such greats as Life With Father (Dunne and Powell, 1947), Key Largo (Bogart and Lionel Barrymore, 1948), Edward, My Son (Tracy, 1949), The African Queen (Bogart and Hepburn, 1951), High Noon (Cooper, 1952), Mister Roberts (Cagney, Fonda, Powell, 1955), The Old Man and the Sea (Tracy, 1958), Anatomy of a Murder (Stewart, 1959), North by Northwest (Grant, 1959), Inherit the Wind (Tracy, 1960), Long Day’s Journey into Night (Hepburn, 1962), Advise and Consent (Fonda and Laughton, 1962), The Best Man (Fonda, 1964), and Othello (Olivier, 1965). A new generation of stars appeared in such greats as The Lavender Hill Mob (Alec Guinness, 1951), Singin’ in the Rain (Gene Kelly, 1952), The Bridge on the River Kwai (Guinness, 1957), The Hustler (Paul Newman, 1961), Lawrence of Arabia (Peter O’Toole, 1962), and Dr. Zhivago (Julie Christie, 1965). Even Lancaster (Elmer Gantry, 1960), Kerr (The Innocents, 1961), and Peck (To Kill a Mockingbird, 1962) had their moments. Nevertheless, selecting a movie at random from the output of the Abysmal Years — in the hope of finding something great or even worth watching — is like playing Russian Roulette with a loaded revolver.

The same can be said for the Vile Years, which in spite of their seaminess have yielded many excellent films and new stars. Some of the best films (and their stars) are A Man for All Seasons (Paul Scofield, 1966), Midnight Cowboy (Dustin Hoffman, 1969), MASH (Donald Sutherland, Elliott Gould, 1970), The Godfather (Marlon Brando, Al Pacino, 1972), Papillon (Hoffman, Steve McQueen, 1973), One Flew Over the Cuckoo’s Nest (Jack Nicholson, 1975), Star Wars and its sequels (Harrison Ford, 1977, 1980, 1983), The Great Santini (Robert Duvall, 1979), The Postman Always Rings Twice (Nicholson, Jessica Lange, 1981), The Year of Living Dangerously (Sigourney Weaver, Mel Gibson, 1982), Tender Mercies (Duvall, 1983), A Room with a View (Helena Bonham Carter, Daniel Day-Lewis, 1985), Mona Lisa (Bob Hoskins, 1986), Fatal Attraction (Glenn Close, 1987), 84 Charing Cross Road (Anne Bancroft, Anthony Hopkins, Judi Dench, 1987), Dangerous Liaisons (John Malkovich, Michelle Pfeiffer, 1988), Henry V (Kenneth Branagh, 1989), Reversal of Fortune (Close and Jeremy Irons, 1990), Dead Again (Branagh, Emma Thompson, 1991), The Crying Game (1992), Much Ado About Nothing (Branagh, Thompson, Keanu Reeves, Denzel Washington, 1993), Trois Couleurs: Bleu (Juliette Binoche, 1993), Richard III (Ian McKellen, Annette Bening, 1995), Beautiful Girls (Natalie Portman, 1996), Comedian Harmonists (1997), Tango (1998), Girl, Interrupted (Winona Ryder, 1999), Iris (Dench, 2001), High Fidelity (John Cusack, 2000), Chicago (Renee Zellweger, Catherine Zeta-Jones, Richard Gere, 2002), Master and Commander: The Far Side of the World (Russell Crowe, 2003), Finding Neverland (Johnny Depp, Kate Winslet, 2004), Capote (Philip Seymour Hoffman, 2005), The Chronicles of Narnia: The Lion, the Witch, and the Wardrobe (2005), The Painted Veil (Edward Norton, Naomi Watts, 2006), and Breach (Chris Cooper, 2007).

But every excellent film produced during the Abysmal and Vile Years has been surrounded by outpourings of dreck, schlock, and bile. The generally tepid effusions of the Abysmal Years were succeeded by the excesses of the Vile Years: films that feature noise, violence, sex, and drugs for the sake of noise, violence, sex, and drugs; movies whose only “virtue” is their appeal to such undiscerning groups as teeny-boppers, wannabe hoodlums, resentful minorities, and reflexive leftists; movies filled with “bathroom” and other varieties of “humor” so low as to make the Keystone Cops seem paragons of sophisticated wit.

In sum, movies have become progressively worse since the end of the Golden Age — and I have the numbers to prove it. The numbers are based on my IMDb ratings, and my conclusion about the low estate of film-making flows from those ratings. That is to say, I came to the conclusion that the quality of films has been in decline since 1942 only after having rated some 2,000 films. Before I looked at the numbers I believed that there had been a renaissance in film-making, inasmuch as the number of highly rated films (favorites) has been rising since the latter part of the Abysmal Years:


But the rising number of favorites is due to the rising number of films (mainly recent releases) that I have seen since the advent of VHS and DVD (and especially since my retirement about 10 years ago). In the chart below, all of the points to the right of 30 on the x-axis represent films released in 1982 or later; all of the points to the right of 60 represent films released in 1994 or later. (I have omitted the releases of 2007 from this analysis because I have seen only one of them: Breach.)


Those observations led me to run a regression on films released from 1933 through 2006. The result:

Number of favorite films (for a given year of release) = 147.94 – (0.075 x year) + (0.24 x number of films seen)

Regression statistics: adjusted R-squared = 0.66; standard error of estimate = 2.61; F = 72.52; t-values of the intercept and independent variables = 3.46, 3.40, 10.06

By applying the regression equation to the number of films seen in each year I could compare the actual and predicted number of favorites as a percentage of films seen:
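The arithmetic of applying the equation can be sketched in a few lines of Python. The function names and the sample inputs (year 2000, 30 films seen) are illustrative assumptions, not the author's actual viewing data:

```python
# A minimal sketch of applying the regression equation above.
# The function names and sample inputs (year 2000, 30 films seen)
# are illustrative assumptions, not the author's actual data.

def predicted_favorites(year: int, films_seen: int) -> float:
    """Predicted count of favorite films for a release year, per
    147.94 - (0.075 x year) + (0.24 x number of films seen)."""
    return 147.94 - 0.075 * year + 0.24 * films_seen

def favorites_pct(favorites: float, films_seen: int) -> float:
    """Favorites as a percentage of films seen for that release year."""
    return 100.0 * favorites / films_seen

pred = predicted_favorites(2000, 30)  # roughly 5.1 favorites
pct = favorites_pct(pred, 30)         # roughly 17 percent of films seen
```

Running the equation year by year in this fashion, and comparing the predicted percentages against the actual ones, yields the comparison charted above.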


The downward trend is unmistakable, both in the data and in the predictions:

  • Actual percentages for seven of the 10 years of the Golden Age exceed predictions for those years.
  • Actual percentages fall short of predicted percentages in 18 of the 23 Abysmal Years — evidence of the general dreariness of the films of that era.
  • The Vile Years have had their high points and low points — both mainly in the 1960s and ’70s — but, nevertheless, the downward trend since the Golden Age continues unabated.

Imagine how much steeper the downward trend would be if my observations were to include absolute trash of the sort that dominates the trailers which one encounters on TV and DVDs. My selectivity in movie-watching has led me to overstate the quality of recent and current movie offerings.

Movies are worse than ever, but there are gems yet to be found among the dross.

UPDATE: The lowest IMDb rating for a movie is a “1” — a rating that I have given to 37 films. Those 37 are the movies that I found too moronic or vile to watch to the end. The following table lists the films, shows my ratings, and shows the average ratings assigned by users of IMDb. It is telling that (with three exceptions) the average ratings range from 6.6 to 8.2 — relatively high scores in the world of IMDb.

Bad Santa (2003) 1 7.2
Better Off Dead… (1985) 1 7.1
Big Night (1996) 1 7.1
Bottle Rocket (1996) 1 7.2
The Butterfly Effect (2004) 1 7.7
Diva (1981) 1 7.1
Exotica (1994) 1 7.0
Garden State (2004) 1 8.0
The General’s Daughter (1999) 1 6.0
Hannah and Her Sisters (1986) 1 7.8
Happiness (1998) 1 7.6
Harold & Kumar Go to White Castle (2004) 1 7.1
The Holiday (2006) 1 6.9
I’m the One That I Want (2000) 1 7.3
The Joy Luck Club (1993) 1 7.2
King of the Corner (2004) 1 6.0
Lord of War (2005) 1 7.7
The Lost City (2005) 1 6.9
Lucía y el sexo (2001) 1 7.5
Lucky Number Slevin (2006) 1 7.8
Metropolitan (1990) 1 7.2
My Dinner with Andre (1981) 1 7.4
One Last Thing… (2005) 1 7.1
Quills (2000) 1 7.3
Reine Margot, La (1994) 1 7.5
Roger Dodger (2002) 1 7.2
Sideways (2004) 1 7.8
Sleepy Hollow (1999) 1 7.4
The Spook Who Sat by the Door (1973) 1 6.9
Tape (2001) 1 7.3
The Thing About My Folks (2005) 1 6.6
Under the Volcano (1984) 1 6.6
The Upside of Anger (2005) 1 7.1
Waking Life (2001) 1 7.5
Z (1969) 1 8.2
Zelig (1983) 1 7.5
Zoolander (2001) 1 6.2

The Censored Wisdom of T.S. Eliot

Eliot, unfortunately for us, censored himself on at least one occasion. Virginia Quarterly Review explains:

In May 1933, T. S. Eliot delivered three lectures at the University of Virginia, as part of the Page-Barbour Series. By Eliot’s own description, these lectures were intended as “further development of the problem which the author first discussed in his essay, ‘Tradition and the Individual Talent.’”…

[T]he lectures, gathered in Spring 1934 as the slim volume After Strange Gods, have gained most of their notorious reputation, because they contain some of the strongest evidence of Eliot’s intolerance for non-Christian religions and his blatant anti-Semitism. At one point, he declared that, “The population should be homogeneous; where two or more cultures exist in the same place they are likely either to be fiercely self-conscious or both to become adulterate. What is still more important is unity of religious background; and reasons of race and religion combine to make any large number of free-thinking Jews undesirable.”…

Barely a decade later….Eliot had grown leery of having his remarks published in post-Nazi Europe. Eliot withdrew After Strange Gods from publication, and it has remained unavailable ever since.

[O]ne of the lectures, “Personality and Demonic Possession,” appeared in VQR in January 1934…. The following essay is decidedly the least incendiary of the three Eliot delivered at Virginia; however, even here it is clear the degree to which his dogmatic artistic beliefs have blurred into social intolerance.

VQR, being a publication with academic pretensions, evidently takes the position that it amounts to “social intolerance” when someone has coherent literary and social standards — as opposed to the morally relativistic stance that all ideas and cultures are created equal. What VQR calls “social intolerance” is really a defense of the kind of civility and civilization that enables VQR and its ilk to survive, which they could not have done in the USSR and could not do in the Caliphate.

Recent events in Europe — and long-term trends in the United States — attest to the wisdom of Eliot’s statement that “where two or more cultures exist in the same place they are likely either to be fiercely self-conscious or both to become adulterate.” That really is an understatement, given the barely contained (and sometimes uncontained) state of tension (and sometimes violence) that exists where whites, blacks, and tans rub together in the U.S., and where Muslims and non-Muslims rub together in Europe.

Eliot does go too far in his emphasis on religious homogeneity. Jews certainly can be and have been staunch defenders of Western civilization — by which I mean a republican government of limited powers; respect for the rule of law; and, underpinning those things, rationality as opposed to emotionalism in political and civil discourse. But it must be said that many Jews (along with many more non-Jews) have been prominent among those who advance and fund ideas that are inimical to Western civilization. But the failings of those particular Jews cannot be laid to Judaism, else the failings of their non-Jewish brethren could be laid to Christianity.

In any event, here is what Eliot has to say about emotionalism, on page four of “Personality and Demonic Possession”:

[E]xtreme emotionalism seems to me a symptom of decadence; it is a cardinal point of faith in a romantic age to believe that there is something admirable for its own sake in violent emotion, whatever the emotion or whatever its object. But it is by no means self-evident that human beings are most real when most violently excited; violent passions do not in themselves differentiate men from each other, but rather tend to reduce them to the same state…. Furthermore, strong passion is only interesting or significant in strong men; those who abandon themselves without resistance to excitements which tend to deprive them of reason become merely instruments of feeling and lose their humanity; and unless there is moral resistance and conflict there is no meaning. But as the majority is capable neither of strong emotion nor of strong resistance, it always inclines, unless instructed to the contrary, to admire passion for its own sake; and if somewhat deficient in vitality, people imagine passion to be the surest evidence of vitality.

Thus do demagogues dupe the masses.

Pride and Prejudice on Film

I have now seen four film versions of Jane Austen’s Pride and Prejudice. Last night I had the extreme pleasure of viewing for the first time the earliest and best of the four: the 117-minute, 1940 release starring Greer Garson as Elizabeth Bennet and Laurence Olivier as Mr. Darcy. The 1940 version shows Hollywood at its finest. Great actors delivering great lines with panache and wit in a lavish, tightly orchestrated, and fast-paced production that demands — and deserves — your full attention.

Garson and Olivier, in particular (but not exclusively), outshine their counterparts in the other productions that I have seen. Garson may have been “too old” (36 at the time the film was released) but who cares? She is now my image of Elizabeth Bennet: witty, cunning, cutting, forthright — and beautiful as well. Olivier (33 at the time of release) simply exudes Darcy: stubborn, prideful, haughty — and yet vulnerable and kind behind the facade.

The other three versions that I have seen all are commendable for various reasons. They are:

1995 (300-minute mini-series), starring Jennifer Ehle and Colin Firth — excellent performances delivered at a more thoughtful pace than that afforded by a feature film, and in realistic settings (as opposed to the gaudy faux-rusticism of the 1940 version)

1980 (265-minute mini-series), starring Elizabeth Garvie and David Rintoul — somewhat stiff performances in a production clearly (and successfully) aimed at recreating the time and place of which Austen wrote

2005 (127-minute feature film), starring Keira Knightley and Matthew Macfadyen — a mixed bag of performances (e.g., Knightley is good, if too juvenile; Macfadyen is a nothing) in a feature film that achieves more “realism” than the 1940 version.

A Small Circle of Stars

Evelyn Waugh was born in 1903; Katharine Houghton Hepburn, four years later. Of Waugh’s novels that were adapted to film, Hepburn appeared in but one: Love Among the Ruins.

Hepburn’s co-star in Love Among the Ruins, Laurence Olivier, starred also in a mini-series based on Waugh’s Brideshead Revisited…

…which co-starred, among others, Jeremy Irons of The Merchant of Venice (2004).

Merchant featured Allan Corduner, a.k.a. Sir Arthur Sullivan of Topsy-Turvy, the co-star of which (Jim Broadbent as W.S. Gilbert) was in Widow’s Peak with Natasha Richardson…

…whose mother (and co-star in The White Countess), Vanessa Redgrave, appeared in the play A Madhouse in Goa with Rupert Graves.

And Graves starred in the film adaptation of Waugh’s A Handful of Dust.

All That Jazz

An otherwise sensible blogger (whom I’ll not name) adores Miles Davis. He (the blogger) says, “If you listen to nothing else by Miles Davis, buy and listen to Relaxin’. I absolutely guarantee you will not hate it, and you are very likely to love it.”

Well, I refreshed my memory of the Davis oeuvre by listening to a few cuts from Relaxin’ via Amazon.com. I absolutely hate it; it’s pablum for the ears. It reminds me of the background music for Peanuts films. Maybe it is the background music for Peanuts films.

Wherever jazz went after the late 1930s, it wasn’t a good place. Davis’s stuff is better than the dithering, discordant offerings of other post-war jazz “artists” whose names will not (dis)grace this blog. But that’s like saying a bowlful of sugar is better for you than a bowlful of arsenic. It is, but why eat either when the jazz pantry is stocked with the nutritious, flavorful pre-war offerings of Louis Armstrong, Sidney Bechet, Bix Beiderbecke, Benny Goodman, Fletcher Henderson, Jimmie Lunceford, Jelly Roll Morton, King Oliver, Kid Ory (heard here with his post-war group but in pre-war form), the Quintette of the Hot Club of France (featuring Django Reinhardt and Stéphane Grappelli), Fats Waller, and the “smooth” but always listenable Paul Whiteman. They are among the many greats to be found at The Red Hot Jazz Archive. Go there. It’s a toe-tapping, foot-stomping treat.

In Praise of Solitude

A remark by my son caused me to revisit Anthony Storr’s Solitude: A Return to the Self. Storr, in the book’s final paragraphs, summarizes his themes and conclusions:

This book began with the observation that many highly creative people were predominantly solitary, but that it was nonsense to suppose that, because of this, they were necessarily unhappy or neurotic. Although man is a social being, who certainly needs interaction with others, there is considerable variation in the depth of the relationships which individuals form with each other. All human beings need interests as well as relationships; all are geared toward the impersonal as well as toward the personal….

The capacity to be alone was adumbrated as a valuable resource, which facilitated learning, thinking, innovation, coming to terms with change, and the maintenance of contact with the inner world of the imagination. We saw that, even in those whose capacity for making intimate relationships has been damaged, the development of creative imagination could exercise a healing function…. Man’s adaptation to the world is largely governed by the development of the imagination and hence of an inner world of the psyche which is necessarily at variance with the external world…. Throughout the book, it was noted that some of the most profound and healing psychological experiences which individuals encounter take place internally, and are only distantly related, if at all, to interaction with other human beings….

The epigraph of this chapter is taken from The Prelude. It is fitting that Wordsworth should also provide its end.

When from our better selves we have too long
Been parted by the hurrying world, and droop,
Sick of its business, of its pleasures tired,
How gracious, how benign, is Solitude.

Related post: IQ and Personality

Pornography: A Definition and an Example

The proprietor of Imlac’s Journal observes that

the candid news photograph of a person grieving over a tragedy is as pornographic as a blue movie. It is because the individual has become another object of lurid interest to the voyeur, stripped naked physically or emotionally.

Also pornographic, in my view, is the non-sexual movie that appeals to the “lurid interest” of the rabid partisan. A good example of such a movie is what James Pinkerton calls “that new Bush snuff movie,” Death of a President. Pinkerton continues:

Some might say that “snuff movie” is too strong a term — but how else to describe a movie that clearly revels in the prospect of George W. Bush’s being assassinated?

How else, indeed, except to say that it is pornographic?

A Haunting Lyric

I think I first heard A. A. Milne’s “Disobedience” as a rope-skipping chant. It’s a haunting lyric, the first three lines of which you may never be able to banish from your mind. Here is the first stanza:

James James
Morrison Morrison
Weatherby George Dupree
Took great
Care of his Mother,
Though he was only three.
James James
Said to his Mother,
“Mother,” he said, said he;
“You must never go down
To the end of the town,
If you don’t go down with me.”

Victim Disarmament

Tom W. Bell of Agoraphilia proposes “victim disarmament” as an alternative to “gun control,” and he asks his readers to help spread the use of the alternative. I hereby vow to do my part.

The Purpose-Driven Life

The Purpose-Driven Life: What on Earth Am I Here For?, by Rick Warren, has been on The New York Times‘s list of best-sellers (in the Hardcover Advice category) for 184 weeks. I hadn’t heard of the book until today, when I happened to channel-surf by an interview with the author. The title of his book flashed on the screen and piqued my curiosity. I didn’t linger to watch the interview, but instead turned to the web for enlightenment. Here is Amazon.com‘s review:

The spiritual premise in The Purpose-Driven Life is that there are no accidents—God planned everything and everyone. Therefore, every human has a divine purpose, according to God’s master plan. Like a twist on John F. Kennedy’s famous inaugural address, this book could be summed up like this: “So my fellow Christians, ask not what God can do for your life plan, ask what your life can do for God’s plan.” Those who are looking for advice on finding one’s calling through career choice, creative expression, or any form of self-discovery should go elsewhere. This is not about self-exploration; it is about purposeful devotion to a Christian God. The book is set up to be a 40-day immersion plan, recognizing that the Bible favors the number 40 as a “spiritually significant time,” according to author Rick Warren, the founding pastor of Saddleback Church in Lake Forest, California, touted as one of the nation’s largest congregations. Warren’s hope is that readers will “interact” with the 40 chapters, reading them one day at a time, with extensive underlining and writing in the margins. As an inspirational manifesto for creating a more worshipful, church-driven life, this book delivers. Every page is laden with references to scripture or dogma. But it does not do much to address the challenges of modern Christian living, with its competing material, professional, and financial distractions. Nonetheless, this is probably an excellent resource for devout Christians who crave a jumpstart back to worshipfulness.

That’s all well and good if you like your self-help with a heavy dose of Warren’s brand of religiosity. For those of you who are not inclined in that direction, I recommend Viktor E. Frankl’s Man’s Search for Meaning: An Introduction to Logotherapy, which I read (and re-read) some 20 years ago. Frankl survived a Nazi concentration camp, and he uses his experiences there to introduce what he calls “logotherapy,” or “meaning-therapy.” As Frankl puts it, logotherapy

focuses on the meaning of human existence as well as on man’s search for such a meaning. According to logotherapy, the struggle to find a meaning in one’s life is the primary motivational force in man.

I will not try to summarize Frankl’s psychotherapeutic approach, which he outlines in the second half of the book, except to say that he addresses such topics as the meaning of life, the meaning of existence, the meaning of love, and the meaning of suffering.

Even if you’re not interested in logotherapy, the first half of this inexpensive book ($6.99 in paperback at Amazon.com) — which recounts Frankl’s experiences in the concentration camp — is well worth the price. The story is candid without resorting to graphic sensationalism, and it sets the stage for Frankl’s explanation of logotherapy in the second half.