Here's Yglesias, responding to the complaints from conservatives (and some Democrats) that the stimulus bill is being larded up with spending on possibly-worthy but non-stimulative programs:
... a lot of this stuff, whether or not it really "belongs in the stimulus," seems irrelevant to me. If you have a program that actually is
worthy, then funding it will make the country better, whether or not it
truly "belongs" in the stimulus. If you have a program that's worthy,
and that doesn't really belong in the stimulus, and you have a Republican who doesn't think the program is worthy, and he'd be willing to vote for the stimulus if you stripped that program from the bill, then it seems to me that you have a decent case for dropping a worthy
program. But if you're Ben Nelson and you think the program is worthy,
then why not just support the worthy program? It's true that doing so
doesn't fit a perfectly pristine notion of how the legislative process
should work, but anytime the process is working in favor of worthy
programs rather than crappy ones, that's a lot better than the normal
functioning of the legislative process.
Well, sure. This is the basic liberal calculus at the moment: The stimulus bill is thick with non-stimulative spending increases because it's a chance to, well, pass spending increases that Democrats think are worthy. Which is fair enough; they did, after all, thump the GOP two election cycles in a row. But surely even the most deficit-happy liberal ought to worry a little about how all of this is going to be paid for - and by extension, whether a spending binge on existing programs today will make it harder to pass, say, an expensive overhaul of the health care system tomorrow. At some point, barring an economic miracle, the GOP will be able to get at least some traction by playing Ross Perot and arguing against out-of-control spending. Maybe the whole liberal wish list will be passed into law before that happens: As Yglesias says in a subsequent post, it's possible that at a time like this there's no "fixed sum of political capital" for liberals to spend down, and so the thing to do is go for broke, quite literally, instead of trying to prioritize health care reform over Pell Grants, or climate change legislation over Head Start. But there's also a chance that the Democrats will look back on the stimulus bill as an instance where they gained ground in the short run, but at the expense of their longer-term ambitions.
Well, sort of. As you might expect, I agree with a lot of Ambinder's caustic remarks about the Minority Leader's recent "whither Republicanism" speech. But McConnell, like all GOP leaders, is in an awfully difficult spot at the moment: He's heading up a party that desperately needs a new direction, but whose most loyal and vocal members want nothing to do with anything that smacks of compromise or centrism. In those circumstances, the thing for Republicans in Washington to do is to talk an awful lot about how conservative principles don't need to change (and they don't, broadly speaking), while eagerly embracing new policy options whenever possible. And here McConnell deserves at least a modicum of credit for coming out in favor of the best of the alternative stimulus plans floating around on the right of center - namely, some sort of payroll tax cut, which is precisely the sort of small-government populist, Sam's Club-meets-Cato idea that the GOP ought to be embracing, instead of resisting.
The key for Republicans, as Yuval notes today, is to offer not only opposition to Obamanomics but alternatives as well - but those alternatives need to sound like something other than the Bush agenda redux, or else there's no point in offering them. And on that front, McConnell's doing a better job than some of his colleagues.
The obvious good news is that the movie franchise will continue post-Prince Caspian, with Fox stepping in after Disney backed out. The not-so-obvious good news is this:
While it looks like both the film's principal cast and director will be
clearing some time on their calendars this summer to shoot the picture,
some sacrifices had to be made on the budget front to make the project
viable. According to the Los Angeles Times, Disney spent some $215 million producing Prince Caspian,
and another $175 million on marketing it (the film ended up grossing
roughly $419 million worldwide). So, in order to lessen the risk on Dawn Treader, Walden Media and Fox have decided to go halfsies on the third film's slated budget of $140 million.
That sounds like bad news at first. But artistically speaking, at least, a smaller budget may be exactly what the Narnia movies need. I liked Caspian, in certain respects, but it felt like it was made more in self-conscious imitation of Peter Jackson's appropriately-humongous Lord of the Rings films than in the more intimate spirit of C.S. Lewis's novels. Or as I put it in my NR review:
The movie plays up ... every tension it finds in Lewis's novel, and invents several more,
creating rivalries (between Peter and Caspian), generating romances
(between Susan and Caspian), adding battles (particularly a long set
piece in the movie's middle, in which the Old Narnians launch a raid on
Miraz's castle), and doubling down on the political intrigue in the
Telmarine court. For the most part, the additions serve their purpose,
transmuting a somewhat slight children's adventure into a gripping
medieval war picture: Braveheart with more magic, or Tolkien with
talking squirrels.
But this achievement comes with a
price - namely, the evisceration of Lewis's major theme. If The Lion, the
Witch and the Wardrobe is a story about rebirth and renewal - Aslan
resurrected, and spring cracking the ice of an enchanted winter - then
Prince Caspian is fundamentally a story about re-enchantment, and the
glorious return of the supernatural forces that the Telmarines have
repressed. Little of this survives in Adamson's adaptation; it's been
pruned away to make room for battles and arguments and longing glances
and one-liners. The book's climax, in which the trees and rivers come
to life and a wild pagan rout overruns the sterile secularism of
Telmarine society, is reduced to a brief battlefield intervention that
rips off not one but two scenes in Lord of the Rings. Aslan, too, is
reduced to a walk-on role, sweeping in once the body count has climbed
and the CGI budget been exhausted to roar a halt to the proceedings. He
murmurs about faith, in the voice of Liam Neeson, but he feels less a
Christ figure than a strikingly flimsy plot device: Leo ex machina.
The
bad news for Narniaphiles is that this may be the only way that C. S.
Lewis can plausibly be adapted, given the economics (and biases) of
contemporary Hollywood - with the metaphysics downplayed and the Generic
Epic elements accentuated, the better to justify the price tag that
comes attached to any fantasy film ... But
judging from Caspian's middling box-office showing to date, it might be
worth considering something different for Voyage of the Dawn Treader
and (one hopes) its sequels: half the budget, perhaps, and a little
more fidelity to the elements of theme and plot that make Narnia
something more than an entertaining but two-dimensional imitation of
Tolkien's Middle Earth.
Spending $140 million instead of $215 million isn't quite halving the budget, but it's pretty close. With luck, the result will be richer storytelling, instead of just lousier special effects.
The current recession may turn into a small
depression, and may push global living standards down by five percent
for one or two or (we hope not) five years, but that does not erase the
gulf between those of us in the globe's middle and upper classes and
all human existence prior to the Industrial Revolution. We have reached
the frontier of mass material comfort--where we have enough food that we
are not painfully hungry, enough clothing that we are not shiveringly
cold, enough shelter that we are not distressingly wet, even enough
entertainment that we are not bored. We--at least those lucky enough to
be in the global middle and upper classes who still cluster around the
North Atlantic--have lots and lots of stuff. Our machines and factories
have given us the power to get more and more stuff by getting more and
more stuff--a self-perpetuating cycle of consumption.
Our goods
are not only plentiful but cheap. I am a book addict. Yet even I am
fighting hard to spend as great a share of my income on books as Adam
Smith did in his day. Back on March 9, 1776 Adam Smith's Inquiry into
the Nature and Causes of the Wealth of Nations went on sale for the
price of 1.8 pounds sterling at a time when the median family made
perhaps 30 pounds a year. That one book (admittedly a big book and an
expensive one) cost six percent of the median family's annual income.
In the United States today, median family income is $50,000 a year and
Smith's Wealth of Nations costs $7.95 at Amazon (in the Bantam Classics
edition). The 18th Century British family could buy 17 copies of the
Wealth of Nations out of its annual income. The American family in 2009
can buy 6,000 copies: a multiplication factor of 350.
Keynes thought that by today we would have
reached a realm of plenty where "We shall once more value ends above
means and prefer the good to the useful. We shall honour those who can
teach us how to pluck the hour and the day virtuously and well, the
delightful people who are capable of taking direct enjoyment in things,
the lilies of the field who toil not, neither do they spin."
But
no dice. I look around, and all I can say is: not yet, not for a long
time to come, and perhaps never ... There is a point
at which we say "enough!" to more oat porridge. But all evidence
suggests Keynes was wrong: We are simply not built to ever say
"enough!" to stuff in general.
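The price comparison in the quoted passage is simple arithmetic, and it checks out; here is a quick sketch, using the quote's rough figures, that reproduces it. The variable names are mine, and the quote's "factor of 350" comes from rounding to 17 and 6,000 copies before dividing:

```python
# Checking DeLong's book-price arithmetic with the figures from the quote.
smith_price_1776 = 1.8          # pounds sterling: Wealth of Nations, 1776
median_income_1776 = 30.0       # pounds per year, rough median family income
price_2009 = 7.95               # dollars: Bantam Classics edition
median_income_2009 = 50_000.0   # dollars per year, US median family income

share_1776 = smith_price_1776 / median_income_1776   # ~6% of a year's income
copies_1776 = median_income_1776 / smith_price_1776  # ~17 copies per year
copies_2009 = median_income_2009 / price_2009        # ~6,300 copies per year
factor = copies_2009 / copies_1776                   # ~350-380x

print(round(share_1776 * 100), round(copies_1776), round(copies_2009), round(factor))
```

The exact ratio comes out closer to 380 than 350, but the order of magnitude - a several-hundredfold cheapening of the same book relative to income - is the point.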
That we'll never be satisfied with what we have probably goes without saying. But the most pressing issue, it seems to me, is whether we've reached - or will reach - a point at which all our abundance cushions us against the political consequences of suddenly-diminished expectations. In 1932 or so, the West's porridge-eating past wasn't nearly as far in the rearview mirror as it is today, but a Brad DeLong of the Great Depression could still have marshaled all sorts of statistics to prove that even amid economic crisis, your average Westerner was in vastly better shape than his pre-industrial forefathers. Yet that underlying reality didn't save Europe from a decade in which democratic capitalism was thought to be discredited, and the whole edifice of modern civilization was very nearly torn apart.
Hopefully the world - not only DeLong's North Atlantic cluster, but the developing powers as well - has grown rich enough and stable enough that something like that simply couldn't happen again, no matter how hard the fall and how deep the depression. Hopefully.
There's a lot to agree with in Peter Beinart's piece
about Obama's quest to "end" the culture wars - particularly his point
that as far as style and symbolism goes, a black liberal may be
better-positioned than a white liberal to build the kind of bridges between the
secular left and the religious middle that an enduring Democratic
majority requires. (In a somewhat similar vein, I suspect the GOP's
quest to build a bridge between the religious right and the religious
middle would have been better served had George W. Bush been a Catholic
rather than an Evangelical - though that's an argument for another day.)
But Beinart's argument is shot through with the characteristic liberal
conceit that the culture wars are a one-sided affair, in which
right-wing culture warriors start fights and peace-loving liberals try to avoid them. In reality, what makes Obama promising to liberals isn't his potential to "end" culture-war battles - it's his potential ability to win them, by dressing up the policies that Planned Parenthood or the Human Rights Campaign or the ACLU or whoever would like to see in the kind of religiose language and fuzzy talk about consensus that swing voters like to hear. So waiting a day to reverse the ban on overseas funding for groups that provide abortions, for instance, isn't a compromise in the culture wars, or an act of moderation - it's a way of making a victory for the left seem like an act of moderation to people who aren't that invested in the issue. And the same will doubtless hold true when the stem-cell debate comes around, or the next Supreme Court vacancy, or any flashpoint you can think of: Liberals will praise Obama for taking steps to defuse the culture war, but what they'll mean is that he's taking steps to win it.
"Enhanced interrogation" yielded crucial intelligence that saved lives, says former Bush speechwriter Marc Thiessen. No, says the Post's Dan Froomkin, it didn't. Yes, says Thiessen, it did.
Obviously, this debate will never be completely resolved. But neither will it disappear: If it does go away temporarily, you can bet that it will come roaring back eventually, in this administration or in one to come. And I, for one, wouldn't mind getting a lot more information out on the table now - for the next round of debate, if not for this one.
If you're looking for a more nuanced and detailed take on the Vatican's decision to lift the excommunication of four Society of Saint Pius X bishops than, say, the New York Times provides, I recommend Amy Welborn's roundup and analysis. This bit, especially, distills what I'm assuming is the essence of the reasoning behind Benedict's decision:
The Pope is not stupid. He knows the ins and outs of the SSPX better
than any of us and is deeply familiar with the various currents of
belief, practice and attitude that run through it. There are virulent
anti-Semites in the SSPX. There are near-sedevacantists. There are
many who believe that the Second Vatican Council was an illegitimate,
invalid council. There are those who believe that the Mass that most of
us reading this blog go to every Sunday, if not every day, is invalid and
that the elements are not consecrated.
But he also knows, particularly in Europe, there are many SSPX
adherents who do not share these views and are simply seeking to
practice a richer Catholic faith than is available to them in their
local regular parish. I think to really understand the whole picture on
this, you have to understand the European situation, which in many ways
is quite different than it is here.
I think what the Pope knows is that there is going to be a huge
degree of self-selection going on over the next few years, as well as
some inevitably self-destructive behavior. In short, those who truly
want to be in union with Rome will do so, and the holdouts will hold out
until some fantasy moment occurs in which the Novus Ordo Mass and the
Second Vatican Council are repudiated.
The problem, of course, is that by creating this opening for those SSPX-ers who should be in full communion with the Catholic Church, the Vatican is temporarily empowering Bishop Richard Williamson, Holocaust denier and all-around charmer, who gives every evidence that he shouldn't be - and probably doesn't want to be - back in the fold, but who's instantly become the poster boy for the Pope's decision, and for the Traditionalist community more generally. This is a price worth paying, hopefully, for the sake of closing unnecessary divisions, but the price wouldn't be nearly so steep if the Vatican had a better sense of how to do public relations in a controversial case like this. The average reporter or commentator isn't going to understand the nuances of canon law, the history and background of the SSPX, the context of the excommunications, the status of these bishops post-excommunication, and so forth. What the average journalist does understand, though, is how to write this headline: "Pope Rehabilitates Holocaust-Denying Bishop." And while the potential for bad publicity shouldn't prevent the Vatican from showing mercy to excommunicants when appropriate, it should incentivize wrapping any such mercy in a forceful, detailed, "Catholicism and canon law for dummies" explanation of what such an action doesn't mean: In this case, an endorsement of poisonous anti-Semitism and conspiracy theorizing.
And this is exactly what hasn't been forthcoming. Oh, the Papal spokesman said that Williamson's Holocaust-denying remarks were "completely indefensible," and L'Osservatore Romano had an editorial (not yet translated into English, of course) stating that the decision "should not be sullied with unacceptable revisionist opinions and attitudes with regard to the Jews." But in the contemporary media environment, that's not good enough. If the Pope de-excommunicates a Holocaust denier, the Vatican press office should be working around the clock, with press releases flying, to provide context and do damage control. What's more, if the Pope de-excommunicates a Holocaust denier, the Pope himself needs to say something about it, and not just obliquely nod to the decision in his latest homily. Yes, the Church's primary business is saving souls, not public relations - but in this day and age, public relations is part of the business of saving souls. And nobody in Rome, from Benedict on down, seems to have figured that out.
In response to this post, and the suggestion that even hardened atheists should occasionally feel faint tremors of "maybe God does exist" doubt, several scoffing readers have directed me to Bertrand Russell's famous teapot analogy, which supposedly settles once and for all the question of whether nonbelievers should give any credence to the possibility that God exists:
If I were to suggest that between the Earth and Mars there is a china
teapot revolving about the sun in an elliptical orbit, nobody would be
able to disprove my assertion provided I were careful to add that the
teapot is too small to be revealed even by our most powerful
telescopes. But if I were to go on to say that, since my assertion
cannot be disproved, it is intolerable presumption on the part of human
reason to doubt it, I should rightly be thought to be talking nonsense.
If, however, the existence of such a teapot were affirmed in ancient
books, taught as the sacred truth every Sunday, and instilled into the
minds of children at school, hesitation to believe in its existence
would become a mark of eccentricity and entitle the doubter to the
attentions of the psychiatrist in an enlightened age or of the
Inquisitor in an earlier time.
This analogy - like its modern descendant, the Flying Spaghetti Monster - makes a great deal of sense if you believe that the idea of God is an absurdity dreamed up by crafty clerics in darkest antiquity and subsequently imposed on the human mind by force and fear, and that it only survives for want of brave souls willing to note how inherently absurd the whole thing is. As you might expect, I see the genesis of religion rather differently: An intuitive belief in some sort of presiding Agent seems to be an extremely common, albeit hardly universal, feature of human nature; this intuition has intersected, historically, with an enormous amount of subjective religious experience; and this intersection (along with, yes, the force of custom and tradition) has produced and sustained the religious traditions that seem to Richard Dawkins and company like so much teapot-worship. The story of our civilization, in particular, is a story in which an extremely large circle of non-insane human beings have perceived themselves to be experiencing an interaction with a being who seems recognizable as the Judeo-Christian God (here I do feel comfortable using the term), rather than merely being taught about Him in Sunday School. I am unaware of anything similar holding true for orbiting pots or flying noodle beasts. And without the persistence of this perceived interaction (and beneath
it, the intuitive belief in some kind of God), it's difficult to imagine religious belief playing anything like the role it does in human affairs, no matter how many ancient scriptures there were propping the whole thing up.
This is not to say that humanity's religious experiences and intuitions are anything like a dispositive argument for the existence of God. Certainly, there are all sorts of interesting efforts to explain them without recourse to the hypothesis that they correspond to anything real, and all kinds of reasons to choose atheism over faith. But it is one thing to disbelieve in God; it is quite another to never feel a twinge of doubt about one's own disbelief. And just as the Christian who has never entertained doubts about his faith probably hasn't thought hard enough about the matter, the atheist who perceives the Christian God and the flying spaghetti monster as equally ridiculous hypotheses really needs to get out more often.
Lincoln, Wilson and FDR-each of them was responsible for far more
deaths and far more destruction than Che Guevara or any of a number of
Arab nationalist figures ever was, but two important things separate
them in the eyes of the general public: they did not personally kill
anyone, and the causes for which their armies killed and destroyed are
widely considered to be the just and right ones. That is to say, the
exact same moralizing, or rather anti-moralizing, that the ends justify
the means that Che used in rationalizing revolutionary violence is
employed to praise and sanctify approved figures who authorized much
larger slaughters for the "right reasons." [emphasis mine - RD] Not only have sympathetic,
shoulder-shrugging, anti-moralizing stories been told about these men,
but we have built large physical monuments to them (or at least to two
of the three mentioned above), which is rather more troubling in its
way than silly people who wear T-shirts or directors who minimize the
moral failings of their main characters.
But of course in just-war theory, the ends often do legitimize the means, in some sense at least. Not all means, of course: Some forms of violence are intrinsically immoral, whatever the ends in question. But to employ criteria like "proportionality" and "right intention" in judging a war's justness is to recognize that the morality of a given military campaign depends (among other things) on the objectives it seeks to accomplish, and the context in which it takes place. The consensus surrounding the moral legitimacy of Lincoln and FDR's warmaking flows, in part at least, from precisely this issue of intentions. So does most contemporary criticism of Che Guevara and the Cuban Revolution, which tends to focus on the tyranny that Che and Castro ended up establishing in the revolution's wake, not the moral legitimacy of the revolt itself. And so, for that matter, does the debate about the
Israeli-Palestinian conflict, in which each side is judged, not unreasonably, on their ultimate intentions. Do
the Palestinians want sovereignty and self-determination, or do they
want to see Israel destroyed? Do the Israelis seek security and a recognition of their nation's right to exist as a Jewish state, or are they still invested in the dream of a Greater Israel? These are not the only questions to keep in mind when assessing the justice of each side's military operations, but they are real and important questions nonetheless.
Of course there's a slippery slope involved whenever you judge means in light of ends, and it's certainly the case that Americans, like most peoples, are too quick to absolve our leaders for wars entered unwisely and prosecuted immorally so long as they seem to work out "in the long run." But the American memory isn't just shaped by a mix of jingoism and consequentialism: The Lincoln-FDR consensus may be mistaken (as Larison obviously believes it to be), but the fact remains that it's driven, at least in part, by a real attempt to make moral distinctions about the conflicts that we've fought, rather than just a rank chauvinism in which our wars are always justified and other people's wars aren't. There's a reason that Lincoln has an enormous memorial and, say, James K. Polk does not; there's a reason that the Washington Mall has a Museum of the American Indian rather than a monument to Philip Sheridan's Plains campaigns; there's a reason that the Spanish-American War and the First World War don't enjoy the kind of "good war" reputations that accrue to the Civil War and World War II; there's a reason that the Korean War is remembered as a more heroic affair than Vietnam, and that our Filipino counterinsurgency isn't remembered at all. The American reckoning with the moral questions that surround our wars is incomplete at best, but that doesn't mean it doesn't exist - or that the attempt to distinguish good wars from bad ones on the basis of the ends that we sought isn't a legitimate way to go about making moral judgments.
From Todd VanDerWerff's meditations on the season premiere:
I suspect when all is said and done that the history of Lost will cleave it pretty neatly into two different shows.
... The great divide falls between the first half of the show's third
season and the last half of that season (which roughly matches up with
when executive producers Damon Lindelof and Carlton Cuse convinced ABC
to let them set a hard end date for the series). Before season three's
13th episode, "The Man from Tallahassee," the series was
much more meandering and much more prone to fits of stupidity. But it
was also a show with more time--time for things like visual poetry or
narrative tangents that occasionally seemed like dead ends (fans hated
season three's "Tricia Tanaka Is Dead," but it was really a fine little
piece of television--it just didn't advance the master narrative in any
way) ...
But
after the network set a firm end date for the show, it became something
ever-so-slightly different. Gone were the long, meander-y episodes
where we found out why Kate liked horses (and/or killed her dad) for
the most part (there was one where we found out why Desmond says
"brother" to everyone, but that was the last of an old era). The show
became something much more purposeful, taking great strides forward in
its narrative and starting to tie seemingly disconnected elements into
a larger framework. In addition, the characters started behaving more
like real people, no longer forced to do things they wouldn't do in
real life in a similar situation by the constraints of a plot that said
they couldn't because the show might run 10 seasons, and what would you
do then? Most of the series' fans are deeply agnostic that Cuse and
Lindelof really had a plan for how the series would run, but the
episodes since that back half of season three seem to speak well for
the two at least having SOME idea of how this was all going to play
out. Plus, while there have been a few clunkers since the back half of
season three (most notably season four's "Something Nice Back Home"),
the series by and large has reinvigorated itself as one of the best
hours of action-packed TV out there, flitting easily between genres,
depending on who's got the episode focus that week.
If you like the show, read the whole thing. The division VanDerWerff outlines is real, I think, and the decision to set an end-date has played a big role - as predicted here - in saving the show from the wheel-spinning stagnation that defined most of its third season. For me, though, the real Lost divide will always be between the first two seasons and everything else, rather than between the pre- and post-deadline versions of the show. I'm part of the minority that actually liked the second season, hatch and all, and what I liked about it was the air of dread that still clung to the Island and everything about it - to the Smoke Monster and the Others, the cryptic numbers and the strange visions, the kidnapped children and the Dark Territory, the quarantine signs and the orientation films and all the rest of it. These things are still part of the show, in one sense or another (though many of them are part of plot strands that have been dropped, at least temporarily), but the dread started to leak away with the season-three revelation that the Others were just another bunch of squabbling, pretty-ordinary people with their own set of problems ... and now two seasons later it's all but gone. The show still hasn't explained "why," in Peter Suderman's memorable formulation, but in the course of explaining "what" and "how" it's lost the aura of barely-suppressed terror that clung to the Island in the first two seasons. Its mysteries are still real, but they've been domesticated: For all
the apocalyptic overtones, I feel like the show partakes more of Michael
Crichton, at this point, than Stephen King.
I like Crichton, of course, and I still like Lost enormously: Thanks to the late-in-Season-3 righting of the ship, it's still one of the best shows on TV, and hopefully will remain one to the end. But now it's a good action-packed sci-fi show, without the element of fear and trembling that kept me riveted through the first forty episodes or so. VanDerWerff misses the first two seasons' "simple moments of visual beauty" and "plot digressions that don't have to
be tied into the master plot," and sometimes I do too. But more importantly I miss the dread.
... what's particularly clear this season is that the Academy will reward
excellence, no matter if it comes from a big studio or a small
independent.
... This year's Top 5 were studio and indie, big and little, broad and very
specific. The string that pulls them together is not where the films
came from in terms of backing, but where they come from artistically. Each of the films selected for a best-picture nomination ... represents the auteur ideal, in which a director is bankrolled and
left pretty much alone. It is no coincidence that these five films were
created by directors who also received best-director nominations.
Never before, I'm pretty sure, has the phrase "auteur ideal" been used in conjunction with the work of Ron Howard, so points to Carr for crossing that particular bridge. His broader argument - that the extent to which a given film partakes of "the independent aesthetic" is more important to its Oscar chances these days than whether it meshes with "the tastes of the mass audience" - is pretty obviously true. But what's missing from his analysis is a recognition that rewarding an art-house aesthetic isn't the same thing as rewarding excellence: Mass-market movies can be good movies, and movies made with a narrow, highbrow audience in mind can be mediocre-to-bad. The fact that films like, say, Terms of Endearment or E.T. were studio tentpoles that played to huge audiences didn't mean that they didn't deserve their Oscars; and the fact that Stephen Daldry didn't have much studio interference while making The Reader doesn't make him anything more than a high-toned hack who's good at playing by the current Oscar rules. The Academy should reward excellence wherever it comes from, absolutely. But this year - again, a bad year for movies overall - it rewarded too many of the wrong auteurs.
We disagree about the merits of Revolutionary Road, but for a similarly damning (and more comprehensive) take on this year's nominees, I recommend Chris Orr's burst of spleen.
Allow me to quote myself, from the latest issue of National Review:
... the [Christmas] rush is worse for critics than for viewers, since at least half the
movies "released" in November and December won't trickle out to
non-Manhattan multiplexes until January. (Clint Eastwood's Gran Torino,
which national publications had to review around its official December
12 release date, probably reached a theater near you some thirty-odd
days later.) But I suspect that even filmgoers in Peoria partake of the
overwhelm-ment that settles over cinephiles sometime around Christmas
-- a time when critics who've devoted dozens of column inches to The
Mummy: Tomb of the Dragon Emperor during the movie industry's fallow
months find themselves tackling what are supposed to be the year's best
films at capsule length, and when serious moviegoers wander cineplexes
in a daze, rambling about whether Mickey Rourke should win Best Actor
for The Curious Reader of Revolutionary Doubt.
It's
bad for the moviegoers, and it's bad for the movies. Studio executives
are a risk-averse lot in the best of times and, faced with the cruel
Darwinism of the holiday season, they seem to have decided that the
best way to hedge their bets is to green-light films within an ever
narrower range. How else to explain this house-of-mirrors movie season:
two Clint Eastwood movies released within 40 days of each other; a pair
of Oscar-caliber Kate Winslet performances playing against each other
in the local art house; and not one or two, but five films about the
Holocaust and Nazis playing between mid-October and the New Year.
What does all this conformity and caution get you? It gets you Revolutionary Road.
No film in this holiday season checks quite so many Oscar-season boxes:
There are A-list stars (Winslet and Leonardo DiCaprio, together again a
decade after they clutched at each other in Titanic), an Academy
Award-winning director (Winslet's husband, Sam Mendes), a sterling
supporting cast, a handsome mid-century aesthetic, and a semi-famous
literary novel as the source material. And no holiday-season film
better illustrates the way that such box-checking curdles art.
As it turns out, the Academy nominated neither Eastwood movie and just one of the Nazi films, and ignored Revolutionary Road entirely. And yet the final Best Picture list - save for Slumdog Millionaire, which slipped into the dark-horse slot previously occupied by Juno and Little Miss Sunshine - still looks like a roster of box-checking exercises, and what A.O. Scott memorably termed "hermetically sealed melodrama[s] of received thinking." There were that many of them!
This was, admittedly, a bad year for movies overall, which makes a disappointing Best Picture slate par for the course. I'm not enough of a Dark Knight partisan to get outraged at its exclusion, and while I wish The Wrestler and Rachel Getting Married were occupying the slots filled by The Reader and Frost/Nixon, neither of the former are anywhere near as good as No Country For Old Men - to pick my favorite recent winner - and neither of the latter are anywhere near as bad as, say, Crash. But it's still an uninspiring group of nominees - which is a good reason to pull for Slumdog come Oscar night, even if you think it's overpraised and overrated. I mean, which would you rather see rewarded - Stephen Daldry or Ron Howard being pretentious and high-minded, or Danny Boyle being (as usual) quirky and adventurous? I think the question answers itself.
Speaking of week-old blog posts, here's a provocative argument from Edward Glaeser - one that foreshadows, I suspect, some interesting intra-conservative debates to come.
Compromise, rather than absolutism, has been the watchword of anti-abortion efforts for some time now. But the pro-life movement can't give up on overturning Roe without giving up on its very reason for being.
Forget the predatory lenders, Wall Street sharks and their government enablers: The current economic crisis, and the housing bubble that produced it, all started with George Bailey.