Before Midnight and the Perils of Sequels

I suspect that a lot of people of my generation have become somewhat inured to the consistent fallibility of sequels. Our experience over the past few decades, whether you’re talking about long-delayed sophomore albums, movie trilogy prequels or Tiny Toons-style presidencies, has shown us that franchise extensions lead to almost certain disappointment.

That’s why Richard Linklater, Julie Delpy and Ethan Hawke’s follow-ups to 1995’s “Before Sunrise” are so remarkable. Each installment has enriched its predecessors rather than diminishing them. What began almost twenty years ago as an unexpectedly charming flight of fancy between two lovestruck twenty-somethings has blossomed into the quietest, loveliest kind of epic trilogy imaginable. It’s a series of acutely human, almost mundane conversations played out across decades that somehow manages to brilliantly illuminate the arc of adulthood at the end of the 20th and the beginning of the 21st centuries.

That there is a third installment at all — in theaters now — is brazen; 2004’s “Before Sunset” was a nearly flawless bit of storytelling that ended with a thrilling, satisfyingly ambiguous fade-to-black that seemed like an impossible act to follow.

But this year’s “Before Midnight” is richer still, if less perfect. It turns a corner in the story of these two characters, whose meet-cute lasted an unnaturally long ten years; their preoccupations are no longer the transcendent ideals of young love but the quotidian hurdles of being older, raising kids, getting through life. The exchanges between Hawke and Delpy’s characters are still riveting, but also more explosive and more forlorn now. An undercurrent of bitterness runs through it all, born of the burdens of accommodating loved ones and their ineluctable foibles for years. The first two movies were about ideas of adulthood, and how difficult it can be to aspire to them; this one is about being adults, and how paltry the upsides can be. If that sounds grim, rest assured: it’s an unremittingly talky movie, but the dialogue is still repartee, still frequently hilarious.

The best part of “Before Midnight,” though, is that it affords us the opportunity to visit again with these two amazingly imperfect characters given life by Ethan Hawke and Julie Delpy, to enjoy their company for another hundred minutes or so, and to discover new things that deepen our love for them. This movie gives us exactly what most sequels never do: the chance to burnish our original affections not just through repetition of the familiar, but through challenging our ideas of them, and of ourselves.

At this point, it would be hard to argue that this series is anything less than a cultural landmark. Rarely have we seen fictional constructs grow and evolve over so tremendous an arc of real time, and with so much verisimilitude. These movies have unexpectedly become important works of art. But they’ve also become incredibly intimate for those of us who have followed along, who have grown up alongside them. I have to admit there’s a swelling in my chest every time I see these characters on the screen. They’re like good friends who visit only every once in a while. I can’t wait for the next time.

+

Goodbye to Two Greats

The world has been mourning Roger Ebert, who passed away last week, and I join them. I learned a lot about watching movies from the man, but as I watched from afar while he struggled valiantly with disease, I learned a lot more about what it means to fully become a person. His film criticism was always commendable, but the way he used it to undergird a life of great curiosity and thoughtfulness was remarkable. He’ll be greatly missed.

I won’t try to write any more than this about Ebert, since so much has already been written about him in just the past two days. Probably not as much will be written about the passing of longtime comics great Carmine Infantino, but that doesn’t take anything away from his own remarkable life.

+

The People vs. James Bond

Last weekend I went to see “Skyfall,” the twenty-third entry in the now fifty-year-old James Bond franchise.

As an action film, it’s more than adequate, thanks largely to its overqualified crew: it was directed by Oscar winner Sam Mendes, whose name few people expected to see attached to popcorn franchises like this, given his past highbrow features like “American Beauty” and “Revolutionary Road.” I’m not a big fan of those movies, but they’re easily better entertainments than the majority of what has been issued under the 007 moniker through the decades.

Just as meaningfully, “Skyfall” was shot by one of today’s most accomplished cinematographers, Roger Deakins. The first half of the film features a fight sequence in a Shanghai skyscraper that, thanks to Deakins’ almost audacious stylization, surely qualifies as the most visually stunning Bond scene since Honey Ryder emerged from the sea in “Dr. No.” On its own, it’s almost worth the price of admission.

+

Wes Anderson’s Kingdom

On the whole, I’ve enjoyed director Wes Anderson’s oeuvre, and I count myself a fan. Enough so that I’m even partial to his oft-maligned Jacques Cousteau riff “The Life Aquatic with Steve Zissou.” It’s far from perfect, I admit, but there’s enough of a through-line to it from “Rushmore,” his 1998 breakthrough, that I find it worthwhile. “Rushmore,” in case there’s any doubt, struck me as thoroughly wonderful and full of singular promise, balancing a wholly novel worldview with indelible characters. There’s been very little like it from other directors since.

Over the weekend I went to see Anderson’s newest movie, “Moonrise Kingdom,” which, like his past work, is another Joseph Cornell-like cinematic diorama, full of diminutive but delightful details and vaguely familiar but endearingly idiosyncratic characters. It tells the tale of two pre-teens who fall in love and plot to steal away to a remote part of a fictional New England island, and of the comical search parties that pursue them.

Part of the wonder of a Wes Anderson film, for me, is getting to see the kind of film a designer would make given a budget, a crew and a sampling of today’s most notable celebrities. Anderson populates his movies with big name actors eager to burnish their indie cred, and he surrounds them with the accoutrements of his obsessions: obsolete technology, dubious uniforms, imaginary cartographies, naive architecture, and more. Every single piece counts, and is placed exquisitely in relation to every other. Most filmmakers compose their frames, but it might be more accurate to say that Anderson lays his out, much the way print designers once pasted up pages in lavishly illustrated encyclopedia volumes. It’s not film direction, it’s art direction.

In this, Anderson remains at the height of his powers. “Moonrise Kingdom” looks great. The eye can’t help but pore over each frame, visually twiddling with the seemingly endless details festooned fussily on every object. No one can art direct quite like Wes Anderson, and together with his regular cinematographer Robert D. Yeoman, no one can produce films quite this visually rich. The story is set in 1965 and is rendered with an appropriately halcyon color palette that’s a wonder to behold; it evokes an intoxicating, imaginary past with the verve of an Instagram photo adapted for the screen by a true auteur.

Nevertheless, I found myself intermittently irritated by it. To watch “Moonrise Kingdom” is to be enthralled by the totality of Anderson’s vision, and even to be warmed by the obvious fondness that he has for his characters. But the movie is also ninety-four minutes of starvation if you’re hungry for any kind of substantial character development. The protagonists (and by the end, nearly everyone is a protagonist, undermining any real dramatic tension the plot had going for it) are little more than inventories of their scripted eccentricities. The director offers precious few reasons why any of the characters do any of the things they do; they’re all just dress-up dolls at the beck and call of Anderson’s charmingly pre-adolescent fetishes.

Poor character development can seem like a petty complaint when Anderson also provides the visual riches that he does. His technical proficiency is clearly higher than ever, and if you can set aside the centrality of character development, you’d have no trouble arguing that “Moonrise Kingdom” is a remarkable jewel of a movie. (In fairness, the characters are not as horrifically ill-conceived as they were in Anderson’s 2007 travelogue “The Darjeeling Limited.”) This is perhaps how we should think of Anderson’s films from here on out: technical marvels engineered to show off endless quirk. That’s a legitimate credential; it’s just not the one I would have hoped for right after I saw “Rushmore.”

+

Aaron Sorkin’s Steve Jobs Screenplay Will Be Highly Inaccurate and That’s Okay

Aaron Sorkin’s script for “The Social Network” won him an Oscar, but it drew the ire of at least a few tech pundits who felt that it took too many liberties for dramatic effect. Now Sorkin is writing a screenplay about Steve Jobs. In an interview with The New York Times last week, here’s what he had to say about his thinking on the project.

“At the moment I’m at roughly the same place I was when I decided to write ‘The Social Network’ — which is to say I don’t know what the movie’s about yet. I know it won’t be a biography as it’s very hard to shake the cradle-to-grave structure of a biopic. I know that Jobs was a very complicated and dynamic genius who fought a number of dramatic battles. I know that like Edison, Marconi (and Philo Farnsworth), he invented something we love. I think that has a lot to do with our love affair with him. We’re told every day that America’s future is basically in service but our history is in building things — railroads and cars and cities — but Steve Jobs, in building something that’s taking us to our future, has also taken us to one of the best parts of our past. Now all I have to do is turn that into three acts with an intention, obstacle, exposition, inciting action, reversal, climax and denouement and make it funny and emotional and I’ll be in business.”

What’s interesting to me about these early thoughts is that they make no mention of historical accuracy. Instead, they’re focused on teasing out the dramatic core of Jobs’ story. Sorkin is looking to understand the idea of Steve Jobs, rather than the person himself.

+

Salvaging a Blu-ray on My Mac

One of my daughter’s favorite movies is “The Sound of Music.” We bought it for her as a Blu-ray disc, but it stopped working in our player recently, owing, I think, to one of the periodic firmware updates that the manufacturer sends down the pike to us. It used to work wonderfully, but now it gets caught on a loading screen and goes into an unending loop. Another strike against the addled monstrosity that is the Blu-ray format. (I wrote about my major Blu-ray complaints last year, so I won’t repeat them here.)

A software glitch is little consolation to a toddler who has her heart set on singing along with Julie Andrews, though, so I resolved to somehow get a digital copy of the movie off the disc and free ourselves of the trappings of the Blu-ray version we owned. Apple, of course, has decided to stay as far away from Blu-ray as possible, so this took some work.

+

Integrity and “The Artist”

Shockingly, last night at the Oscars, Hollywood decided to award the Best Picture prize to a film that celebrates Hollywood. Michel Hazanavicius’s “The Artist,” a heartfelt ode to the silent-film era, is an undeniably charming picture, even if it seems unable to resist nudging the audience to wink along constantly with its own cleverness. However, I can’t help but point out that even to someone who is little more than a casual fan of silent movies, “The Artist” seems like a pale imitation of the real thing. It is a tribute to silents in the same way that, say, “Happy Days” was a tribute to the 1950s.

Over at The New Yorker, film critic David Denby makes a great argument as to why the year’s “Best Picture” misses the mark for what it honors. Denby’s principal complaint is that the acting in “The Artist” captures very little of the quality of acting that the original silent movie stars employed to make those films come alive in the absence of sound. He writes, “Silent film is another country. They speak another language there — a language of gestures, stares, flapping mouths, halting or skittering walks, and sometimes movements and expressions of infinite intricacy and beauty.”

Denby believes these characteristics escape the two leads of Hazanavicius’ film: “both characters, and both actors, move in a straight line in each scene; they stay within a single mood. The great silent actors did so much more.” He elaborates: “In the silents, you have to do something; you can’t just be. Silent-film acting drew on the heroic and melodramatic traditions of nineteenth- and early-twentieth-century theatre… it drew as well on mime, magic shows, and vaudeville… Subtlety was not a high priority in those arts.” It was frequently not great acting, but it was always expressive.

I agree with this assessment, though in making his case Denby underrates my biggest complaint about the film: it just didn’t look like a true silent movie. Hazanavicius’ camera is surprisingly fluid in “The Artist.” It jumps back and forth, climbs high and dips low, draws in for surprisingly detailed closeups and pulls out with great agility for wide shots. To me, silents generally felt flatter, and not in a bad way. They made the most of the inflexibility of early camera equipment; their shots seem static relative to today’s unimpeded camera technology, but that stillness was very effectively contrasted with their stars’ outsized facial gestures, propeller-like limbs and ability to cut dynamic swaths across the screen. The camera could not be expressive, so the actors were. “The Artist” feels a lot more like a movie that might have come a few decades later, when camera equipment got lighter and more nimble; a movie from the 1940s or 1950s, perhaps, except with the sound removed. This, for me, was its worst mistake: in a movie about movies, it could not convince me to suspend my disbelief.

+

Drivers and Thieves

Many of the movies I fell for as a kid drew a healthy portion of their magic from freely picking over the bones of the cinema that came decades before them. Most of what George Lucas and Steven Spielberg released in the ’80s, for example, reveled in an unabashed nostalgia for the past. Many older filmgoers at the time held this approach to filmmaking in disdain, but for me and most everyone my age, it was a legitimate strategy for imagining what movies could be about. “Star Wars” and “Raiders of the Lost Ark” were more than just rehashes of old movie serials; they were more sophisticated than their progenitors, more complete in their visions, more contemporary and alive to the audiences of that particular period than the source material could ever have been.

I still feel this way, that revisiting the past — even borrowing heavily from it — is a legitimate and even necessary part of the dialog that film conducts with itself and its audience. (For that matter, it’s an essential dialog for all art forms.) Still, it’s one thing to justify this technique when yours is the generation doing the borrowing; it’s a different experience when yours is the generation being borrowed from.

This was my experience watching Nicolas Winding Refn’s “Drive,” a remarkable movie that is irresistible in its craftsmanship but mildly suspect in its originality. It stars Ryan Gosling as an archetype of cool, a Steve McQueen-like mystery man of very few words, absurdly lengthy pauses and super-human fighting and driving skills, whose zen-like mastery of his world goes awry when he begins to entangle himself with other humans.

+

Super-Heroes Are Faking It

Every super-hero movie requires a significant suspension of disbelief, but in 1978 when director Richard Donner brought “Superman” to the silver screen he infused the movie with considerable believability by imagining the Man of Steel’s Metropolis as a thinly-veiled version of late-twentieth century New York City. When the character defied gravity and soared over his adopted city, what lay below him was that uniquely beautiful, earthbound constellation of lights that is the Manhattan skyline — even including, during one sequence, the Statue of Liberty. In his secret identity of Clark Kent he clumsily made his way through the unmistakable congestion of midtown Manhattan to report to work at the real-life headquarters of The Daily News, which stood in for the fictional Daily Planet. Arch-nemesis Lex Luthor’s underground lair was an abandoned wing of the iconic Grand Central Terminal. Superman apprehended a burglar scaling the famous Solow Building at 9 West 57th Street. And so on.

Of course it’s not necessary to film absurdist fantasies — and super-hero movies are nothing if not that — in real locations, but imparting some sense of reality to these films can add so much, as it did for Donner. It’s fine to watch a super-human character negotiating an unreal world, but it’s more thrilling, more engaging, more entertaining to watch a super-human character negotiating a world that looks something like the world we know — the real world.

+

The Silver Computer Screen

If you can’t tell already, I’m a fan of the movies, pretty much all kinds of movies. From art house fare to popcorn flicks, I’m pretty confident I can find something interesting in just about every film I watch, and so I try to watch as much as I can, across as many different genres as possible. This requires a well-practiced suspension of disbelief, of course, which is not hard to muster if you are passionate about films in general.

But one thing that almost always breaks me out of any movie’s spell is the on-screen appearance of any kind of computing technology — specifically the appearance of interfaces for computing technology. The reason is obvious: they’re almost always completely phony, designed not so much to reflect what the movie’s characters are supposed to be doing with a computer as to reflect what the movie’s producers want us to understand about what those characters are doing with a computer.
