Roger Ebert’s Take on Video Games

Quick site note: This is the first of either many or zero reviews that will use “tips.” When hovering over some links, text will pop up near your cursor. We’re not yet sure whether it’s annoying or whether it enhances the writing. I especially find myself drowning in a sea of parentheses, and these “tips” solve that problem in a way that writing on paper never could. Feedback, please.

Lately the non-review sections of Roger Ebert’s website have been filled with discussion of the merits of video games versus movies, and “the internet” has been abuzz with claims that he’s an out-of-touch old man. His weekly Answer Man column has addressed the issue multiple times, namely his general lack of interest in video games. I can’t find the absolute starting point of the whole debate, but I think it has to do with a reader objecting to Ebert’s awarding of one star to the movie adaptation of DOOM (he uses a four-star system, for those of you wondering how to reconcile his reviews with ours). The reader basically took offense at his generous one-star review because one section of the otherwise unremarkable adaptation paid super-close homage to the game. Ebert sufficiently served the reader by explaining that video game websites review movies on their own terms, and he will continue to review movies on his. What started the “controversy” was the final comment in his response:

“As long as there is a great movie unseen or a great book unread, I will continue to be unable to find the time to play video games.”

This led to (what I can only assume to be) countless angry letters from video game fans defending their XBoxen and poorly translated, endlessly sequeled, Japanese-sourced games (i.e. the Metal Gear and “Final” Fantasy series, etc.). True, that’s my bias showing through, but in response to the one letter Ebert decided to run, he explained:

“I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art.”

That’s the one that really got the internet in a tizzy.

You’d think that after writing all of this, I’d be able to think of a funny caption. Well, that’s not the case.

The problem with video game fans (in general) is that they are relentlessly but selectively enthusiastic about their “art” of choice. There’s no point in my writing an e-mail to the movie “Answer Man,” since it would be lost in the mountains of “You’ve never played Halo, Resident Evil, Final Fantasies 1 through 12, etc., so you suck” letters (not that I’d assume it’d automatically be printed, of course; I’m sure every e-mail is read, but I think Jim Emerson, the site’s editor/blogger, probably handles most of the filtering). So, since I have my own internet soapbox that ends in .com, here I go.

Without backing any of it up with fact or definitive history, I can guarantee that every medium of art has had to deal with detractors. Movies weren’t widely accepted as having any worthwhile value at their inception, especially considering that mankind had gotten used to the previous status quo: two-thousand-plus years of live actors performing on stage. On top of that, even movements within each art form have faced critics (again, without facts or evidence). People still argue about the merits of Jackson Pollock; imagine hundreds of years ago, when Baroque music was developing, becoming (again) the status quo, and *gasp* didn’t base all of its harmony on the 4th. Sure, the now-“normal” root-3rd-5th harmony sounds right, but back then a lot of people didn’t like it one bit, due to the “profane” nature of the major-3rd harmony (in terms of backing that up, I’ll hold a music professor I had responsible for defending that bit of trivia). In philosophical terms, the video game “medium” is about 25 years old and only in its second major movement. Consider the first to be the 2D era, started with the first Atari system and ended with the Super Nintendo and Genesis. The second is the 3D era, started with the Sony PlayStation, Nintendo 64, and Sega Saturn. The third (sub)wave of the 3D era began in late November 2005, when Microsoft began shipping the Xbox 360. As mentioned earlier, gamers are notoriously selective in their passions, and some choose to be passionate about the hardware aspect of video gaming, so in response to those people:

  1. I know I’m glossing over lots of other systems.
  2. I know that Genesis came out before Super Nintendo.
  3. Atari probably wasn’t the first system, but realistically it started the whole console “thing.”
  4. I know that 2D games have come out for the “3D” systems (especially the Sega Dreamcast), but 2D vs. 3D is too significant a divide to ignore.

Structured music went through quite a bit of development before anything gained a historical foothold; specifically, J.S. Bach and Handel are still widely performed today, while almost the whole of surviving Renaissance and Medieval music is relegated almost exclusively to academic environments. Sure, movies “came of age” much more quickly than music (or even painting), with films from about thirty years after the proliferation of the medium still widely considered classics. Interestingly enough, film also experienced several technical and artistic waves. (The “maturation of computer-generated effects” is probably the academic-sounding, retrospective categorization of today’s wave.)

Video games have not yet experienced a true second artistic wave (the 2D/3D divide is of a technical nature). The gameplay advances of Grand Theft Auto 3 (namely: go wherever you want, do whatever you want, follow a story or play randomly) have inspired countless similar titles, the same way DOOM began a wave of first-person shooters in 1993. They offered different experiences, but neither was the quantum leap experienced by movie-goers attending the first “talkies” in the 1920s. Newer hardware generations have enabled new features in first-person shooters (mainly graphical, along with some incredible advances in AI), and eventually bigger, prettier worlds in GTA-style games.

Anyone who says that if Roger Ebert were to play Halo or any other mass-market game, he would develop a huge appreciation and change his mind is simply wrong. Halo’s story serves merely to give the player a reason to shoot things. Similarly, the Grand Theft Auto stories (any game in the series, even way back in its 2D, overhead days) simply provide a reason to take part in the shenanigans for which the games have become (in)famous. Limiting the lens to newer games, even the story of Metroid Prime is just a tool the developers used to make the shooting more compelling, not the other way around. It’s not that there aren’t story-driven titles among newer games; it’s just that in popular games the story is supplemental. I know there are ambitiously enthusiastic fans of the story in the Halo games, but ask yourself if you’d play any differently if there were no story: just the mission goals list, then shooting things until the next numbered list appears, rinse, repeat. So, no, I’m not claiming that “new” (more accurately, “popular”) games are bad, just that they serve as poor evidence for the claim that video games are narratively engaging.

So, as a bit of a disclaimer, I’d consider my interest in current video games to be passive. I’m interested enough to read game reviews or watch someone play for about 10 minutes or so, but I don’t participate. I own no consoles, and the video card in my computer is from 2001. I’ve watched people play Far Cry, Half-Life 2, DOOM 3, Grand Theft Auto: San Andreas, both Halos, and on and on. (Those games are almost all shooters, but a complete list would be excessive, and these are some of the most popular and loudly defended games of the last couple of years.) I hold no grudge against new games, but my personal “golden age” of video gaming passed sometime in the late ’90s. Most recently, the games I’ve spent any considerable amount of time playing have been the Genesis version of Madden NFL ’98 and the arcade version of Super Puzzle Fighter 2 Turbo, both running on a friend’s modded Xbox. As the understatement of the year, neither of these games is exactly what we’d consider story-heavy, but, to bring things full circle, both provide a very social experience with a group of people, which is exactly what is marketed as the number-one feature of Microsoft’s Xbox 360, not its currently man-beast-esque hardware capabilities.

Now, addressing Roger Ebert directly, we come to what became the definitive statement of the “video games as art” discussion:

“I did indeed consider videogames inherently inferior to film and literature. There is a structural reason for that: Videogames by their nature require player choices, which is the opposite of the strategy of serious film and literature, which requires authorial control.”

At the risk of this becoming a “my favorite game is more obscure than your favorite game” reverse sales-measuring contest, let’s first throw out every RPG. Ebert’s comment about “authorial control” initially sounds too heavy-handed to be anything other than hyperbole, but it should ring painfully true to RPG players. A decently modeled RPG lets the player assume the “role” of a character (or group of characters); the user can choose to make his gameplay experience as dull as desired, and his or her experience will differ from every other player’s. Sure, that sounds ideal, maybe even like the one thing that would skew video games toward “art” status. But think of Choose Your Own Adventure books: they, too, offer the reader a choice in making his or her own story. The first reaction to that is, “But they’re kids’ books; they’re not supposed to be good.” It goes without saying that there are plenty of widely appealing kids’ books, but if there were adult-oriented Choose Your Own Adventure books, would anyone read them? Would they be considered “literature”? Nope, and for good reason. It’s just a gimmick.

There are two story-heavy genres in video gaming: role-playing games and adventure games. (This is where I’m looking to avoid the obscurity-related reverse penis-size contest.) RPGs having already been justifiably thrown out, that leaves adventure games. Most anyone with a passing interest in video games has played an adventure game, but the genre’s popular peak was both dramatically short and intensely focused on one title (which really wasn’t that great a game, all things retrospectively considered). MYST (aside from being considered the “killer app” for PC CD-ROM drives) was hugely successful, and was undeniably an adventure game. There was a set story and very little room for non-linearity; provided you could figure out the “oh yeah, I guess that makes sense” puzzles, you were undoubtedly under the control of some “authorial” figure as you played. Though the graphics contributed to the overall mood, it was really the story and “art direction” that truly established the player’s sense of loneliness on the island throughout its history. The story created the puzzles (the single element of “gameplay”), not the other way around. Though this isn’t a review of MYST, it needs to be noted that it actually offered a rather passive gameplay experience; the puzzles were simplistic and the story dull, but the mind-bendingly amazing (for 1993) graphics sold everyone. Unfortunately, it became the benchmark for the adventure genre, leaving most everyone thinking adventure games dull and pretty-yet-vapid: “Gee, I don’t get it” was the common reaction after finishing the game or (more likely) giving up after getting one’s fill of pretty pre-rendered pictures.

With the genre’s prime and popular example painting such an ugly picture for average users as time went on, “mass market” PC gamers moved back toward more interactive games (such as Quake, more or less the beginning of the PC’s true 3D boom). During this time, adventure games were still being made, and George Lucas, of all people, was responsible for some of the best. Okay, George Lucas’ company actually made them, but trivia’s trivia. Sam & Max Hit the Road, Day of the Tentacle, Indiana Jones and the Fate of Atlantis, and Full Throttle were some of the best-received adventure games of their time and are widely considered the classics of the era. And just as every other genre eventually shifted to 3D, adventure games followed suit.

The first of LucasArts’ two 3D adventure games offered one of the most interactively cinematical experiences in games, ever, no matter the genre. If we limit the “video games as art” discussion to whether video games can provide the “authorial control” Ebert requires, Grim Fandango offered the “authorial control” of a movie while engrossing the player in ways that movies and books simply can’t. Loosely inspired by Aztec and Mexican folklore about the afterlife, in which souls journey through multiple underworlds toward the final, Ninth Underworld, the game casts you as Manny Calavera, a sort of travel agent in limbo between life and death. Manny “sells” travel packages for different routes to the underworld; the better the prospective travelers lived their lives, the quicker their trip to the Ninth Underworld. The “cleanest” souls get to ride the “Number Nine” train, which speeds them right to heaven, while those who face the travel agent (Manny) with more regrets are stuck with less desirable methods, notably the long, dark walk through the underworld. All that is just the setting; the actual story involves a conspiracy Manny begins to uncover as he realizes that he keeps getting the “lesser” souls and that, due to the unmentioned sins of his human life, he’ll be stuck in limbo forever. Along the way, he meets a special lady, gathers a sidekick, meets a mortician performing an autopsy (one of the most harmlessly creepy characters in any movie, game, or book) and ends up having to shoot someone to save his life. (In case you’re wondering, to kill a dead person, you apparently shoot them with a bullet that sprouts flowers, similar to the earth “taking over” a body buried in the ground.) Without spending forever talking about this game, I’ll simply say it’s a more cinematic experience than many movies: the story is the game, the voice acting is top quality, the art direction (which somehow combines Latin American influences with Art Deco) compares with any Hollywood production, and it offers an ending more emotional than most movies’.

Which brings up the final thought: How many people have played Grim Fandango? How many have even heard of it? Not many. In fact, it’s usually considered the ultimate symbol of the adventure genre’s waning popularity. It came out in 1998, one year after Sierra had abandoned the King’s Quest franchise. Critical reviews were immensely favorable, but sales were not. Escape from Monkey Island, which ultimately turned out to be the final LucasArts adventure, entered and exited with a whimper, and sequels to two of their most popular adventure franchises were later cancelled, one over “creative issues” and the other over “current marketplace realities.”

In any medium there’s a distinct divide between the commercial/popular and the artistic. There is sometimes crossover between the two, and it seems that most fans of the “artistic” baselessly resent fans of the popular merely because it is popular. Music has thrived with that divide, and the “indie” boom of the mid-’90s brought that awareness to the world of movies. Even today, Rolling Stone’s and Spin’s editors campaign for the “latest, greatest, obscurest” new music, while movie critics practically bet their credibility defending some as-yet-unheard-of “indie movie” with those same three superlative adjectives. Thankfully, we don’t live in France, where critics have been known to defend bad movies just to prove a point. Importantly, there is no true “indie” vs. otherwise divide in video gaming. There are no critics willing to champion some unheard-of game for the sake of getting more people to experience it. People’s expectations for video games are drastically different than for other media, and even with the internet, there is no true “indie” movement producing and distributing unheard-of games the way the major movie studios have arms dedicated to picking up obscure movies. There’s simply no My Big Fat Greek Wedding in the world of video games. The infrastructure isn’t set up to get the word out, and I’ll go out on a limb and say that in a society greatly affected by advertising and shiny things, video gamers are especially vulnerable to that advertising.

Looks like we covered a lot more than just “Roger Ebert and Video Games” in this one, so here we go: emptybookshelf’s first three-headed review! Let’s hear it for innovation.

**½

Roger Ebert’s Take on Video Games receives two-and-a-half stars because, as well as he defends himself, he can’t help but come off as just another old person afraid of what the kids are up to. “Oh my God! How could a bunch of moving pictures ever be better than having actual, live actors in front of the audience?! That’ll never work!!” Unfortunately for the video gaming industry, he has a decidedly correct take on the “games as art” issue. Judging just the popular games, he’s hit the nail on the head: they are diversions where interactivity is thought to remove the need for story. And although he’s said he doesn’t play games, if he were ever to pick up a controller/mouse/keyboard/bongo, he wouldn’t be playing anything remotely cinematical. At the risk of going on yet another tangent, the mere fact that you can use bongos to play a video game says something about games compared to movies.

*****

Though I claimed this wasn’t a review of Grim Fandango, I can’t help but consider this an ideal time to “star” it. It receives five stars for being the most engrossing of all adventure games, and, dare I say, of any game. That isn’t to say it’s the best game ever, just the most cinematical and, in a non-girly way, potentially the most beautiful.

*

Internet Fanboys receive one star because their existence and pedantry force a review of such a contentious topic onto so many sidetracks. There’s something to be said for being enthusiastic about something, but there’s also something to be said for having some perspective. Not-so-oddly enough, Roger Ebert himself, probably one of the wittiest people on the planet, summed up the whole “fanboy” thing quite well in his review of Hackers:

“You should never send an expert to a movie about his specialty. Boxers hate boxing movies. Space buffs said ‘Apollo 13’ showed the wrong side of the moon. The British believe Mel Gibson’s scholarship was faulty in ‘Braveheart’ merely because some of the key characters hadn’t been born at the time of the story. ‘Hackers’ is, I have no doubt, deeply dubious in the computer science department. While it is no doubt true that in real life no hacker could do what the characters in this movie do, it is no doubt equally true that what hackers can do would not make a very entertaining movie.”

Nate’s Review of Good Night, and Good Luck.

Recently, this Site’s integrity has been challenged. A member of our Junior Staff, though well-intentioned, has violated one of the precepts of reviewing. This review reviews that review, explains its shortcomings, then concludes with an establishment of goals for both The Site and its Junior Staff.

Nate takes aim at the press but hits George Clooney instead.

Having also seen Good Night, and Good Luck, I’m more than adequately qualified to weigh in on the movie’s merits (or lack thereof). But why a review of Nate’s review instead of the movie itself? Nate made the oh-so-common mistake of confusing a movie’s hype with the movie itself (a confusion that can happen with any reviewable product, not just movies). It’s not George Clooney’s fault that critics think his movie’s all that and a bag of chips. Nate didn’t separate the hype from the product, and because of that, he gave the movie an unfair review, which casts this Site in an equally unfair light.

What I assume to be Nate’s gripes about the movie, what I called its “superficiality” during our initial discussion of it (before the publishing of Nate’s review), should not be gripes. They should be supporting details, leading to an informed opinion and, therefore, an informed review. Was George Clooney doing something evil when he chose to let the historical actions speak for themselves? Is it wrong to assume that history can and will repeat itself? Even if George Clooney were to consider his movie a parable (I do not believe it is, or is meant to be, a parable, just a vaguely cautionary tale), he’s not the first. If we consider this movie to be the thread connecting McCarthyism to the “war on terror,” we must remember that the same thread extends to the Salem Witch Trials in Arthur Miller’s “The Crucible.” Though “The Crucible” explicitly called back/forward to the HUAC proceedings, it remained a rather superficial examination of a community of fear. Was Arthur Miller only giving two-and-a-half stars’ worth of effort in his famous play, simply because he had the (gasp!) audacity to think to himself, “Gosh, this has happened before, and it’s practically happening again”? Again, though I don’t consider “Good Night, and Good Luck” allegorical, I will say that any “depth” comes solely from the (re)viewers’ minds. If George Clooney were to say, “Gee, I hope that people vote Democrat after seeing my movie!” then go ahead and spend the effort bashing him (and his movie), because as an allegory, political tool, etc., it fails. It fails miserably.

Because the anti-political crowd (think of “The Daily Show,” soon to be mega-reviewed on this very site) is so large, vocal, and lacking in perspective, they’re unaware that being against politicians (or claiming that a movie is politically preachy) is just as much a political opinion as hating Hillary Clinton is. If they’re looking for “Good Night, and Good Luck” to be a political tool, it will be. It’s been said that the human mind creates horror better than human eyes can see it. Given the freedom to imagine their personal nightmare, as opposed to a finite, real horror, people imagine the worst. George Clooney gives the audience that opportunity: look in the box, and what you see is only what you want to see.

No, it’s not a perfect movie. It does lack depth; it does simply re-create existing history. The actors aren’t so much “acting” as “impersonating.” But despite all of this, it remains intriguing. Metaphorically, I knew McCarthy’s ship would sink, but that doesn’t mean I don’t want to watch him scramble for a lifeboat. It is not one of the “best films of the year”; it’s not particularly “important”; no one “needs to see this movie.” But even though other critics are on record saying these things about the movie, George Clooney is not. The movie speaks for itself. It doesn’t say much of value, but certainly more than two-and-a-half stars’ worth. It was Nate’s expectations of the critics that were not met, not his expectations of the movie, and that is not the fault of the movie or of George Clooney. Once the unwarranted, incorrect hyperbole of the critics is cast off, what’s left? A particularly solid, entertaining movie, nothing else. This is not a review of the movie, but a review of Nate’s review. The absolute star ranking of the movie is not important, as it’s now widely understood that it’s better than the two-and-a-half bitter stars Nate threw at it. Nate’s review was well written, had a particularly funny caption for its photo, and maintained coherency despite its length, so I will be fairer in my review of his work than he was in his review of Clooney’s.

*½

Due to Nate’s nature as the Site’s Junior-Reviewer-at-Large, we can’t expect perfect, objective reviews. He’s only human. We all are. Should I hold The Site to a higher standard of quality, demanding insight and unbiased objectivity in reviews written by all contributors? Naturally I should (and so should all of the Junior Staff), but until that point arrives, we will use each review as an example in time, a time capsule of sorts, of each writer’s strengths and weaknesses, so that the readership-at-large sees our Junior Reviewers accomplish all of their opinionary goals. What is insight without perspective? What is opinion without foresight? What are sweeping generalizations in the absence of nuance? What is getting on one’s soapbox without a safety net of objectivity? These are the questions for which I know the answers and for which The Site’s readership demands answers. We read on as our Junior Staff grabs the first handle in the philosophical jungle gym that begins the pursuit of their own personal answers to these inquiries. Between the lines of each review we gain a clearer understanding of their answers. Between the lines of each review we see them learning to better tell others what to think. I have the utmost confidence in The Site’s Junior Staff’s ability to not only learn from their mistakes, but to rise above them and truly establish themselves, and therefore this Site, as a premier opinion-making entity in the world.

Nate, we’re all rooting for you.

Good Night, and Good Luck


“Good Luck” reading all of this review

When Jim Carrey got all dolled up, put on the funny accent, and opened his eyes real wide to play Andy Kaufman way back when, in “Man on the Moon”, everyone was calling it a “revelation”. Nobody believed that somebody could so totally embody a role/person as Carrey did. The movie was receiving critical praise from everybody as the rebirth of the “biopic”. Imagine my surprise when all I saw was a movie that consisted of reenactments of Kaufman’s most famous stunts, with a loose story in between, mostly to bridge the gap from one “happening” to the other. Of course, the inevitable drama in this story eventually came in the form of Kaufman’s battle with cancer. Nevertheless, the movie didn’t succeed for me as a whole, because its primary goal wasn’t to give us insight into what made this person one of the most “enigmatic” performers of his era; it was to remind us of all the cool stuff he did.
That’s exactly how I feel about the new George Clooney movie, “Good Night, and Good Luck”, the story of Edward R. Murrow’s famous on-air battle with Senator Joseph McCarthy. While the movie serves as a timely story about asking tough questions in the face of government/peer pressure to relax and talk about something else, for fear of being labeled un-American, it fails to show us any sort of internal conflict, any humanizing element of Murrow or his producer Fred Friendly (if that doesn’t sound like a made-up name, I don’t know what does), played by director/co-writer “Mr.” Clooney, or any emotion at all. Maybe that was a specific choice made by Clooney to amplify Murrow’s stoic and stone-faced nature… to tell the facts like they were and let them speak for themselves, just as Murrow did with McCarthy. This, however, is supposed to be a movie showing us the “epic” battle, shrunk down into an hour and a half. How does the movie accomplish this? Well, considering that nearly all of the confrontation took place on the show, the natural way to show it would be by reenacting it. There’s that word again. I would guess that two-thirds to three-fourths of the screen time is devoted to recreating speeches, television segments, or showing actual file footage of the McCarthy hearings and the Senator’s on-air response to Murrow.
There is very little to the movie other than this. In fact, the bulk of the story outside of these reenactments deals with a husband-and-wife pair (Robert Downey Jr. and Patricia Clarkson) who work for the show but are keeping their marriage a secret for fear of being let go by the company. As far as I could tell, the only thing they were there for, as they really didn’t interact with the two main characters at all, was to offer, in a scene in bed, a question as to whether they were doing the right thing in regard to the pieces about McCarthy. I suppose it could be argued that they served to parallel the struggle of in-the-closet Communists, ready to be oppressed at their discovery, but in reality, that’s a stretch. The rest of the staff consists of faceless yes-men who don’t have any objections to doing these pieces, or who are at least cowardly enough, fearing for their jobs, to keep any objections to themselves.
The only other example of conflict/human emotion involves a “troubled” news anchor (whose “troubled” nature is shown in about three scenes total, and who, again, is only peripherally involved in the story), played by Ray Wise, whose biggest role prior to this was the “troubled” Leland Palmer on “Twin Peaks”… and let’s not forget someone named Randolph Pratt in “The Garbage Picking Field Goal Kicking Philadelphia Phenomenon” with Tony Danza. He ends up committing suicide because of a single bad newspaper review from a right-wing Hearst paper.
I suppose I’m forgetting one other story. The main story of the three is Murrow’s “battle” with the network. I tend to forget it as a conflict because, near as I could tell, most of the time it consists of the station chief agreeing with Murrow and doing all he can to help, even though the sponsors are pulling out. So again, there’s not a ton of conflict there until the end, when the head honcho expands the show to a full hour but moves it to Sunday afternoons, which seems to me a fair compromise but seemed like a defeat, and the death of TV, to Murrow and Friendly.
So if the story is basically entirely a reenactment of the TV tapes, then why (according to Rotten Tomatoes) did it receive 110 positive reviews and only four negative reviews nationwide?
The acting is brilliant for the most part. By “the most part”, I mean that one person truly dominates the movie while the rest of the cast is completely serviceable in minor background roles. David Strathairn, the blind character “Whistler” from the totally underrated masterpiece “Sneakers”, gives “the performance of a lifetime” as Murrow. Just like Jim Carrey, and Jamie Foxx in “Ray”, and supposedly Joaquin Phoenix in “Walk the Line”, Strathairn is completely enveloped by the character, getting the presence and the speech patterns down to a science.
Clooney’s direction keeps the pace brisk but the tone somber. His innovative (some may call it “gimmicky”) use of staging and camerawork in his directorial debut, “Confessions of a Dangerous Mind”, kept it from being just another story of one of Los Angeles’ minor celebrities who supposedly became a hitman. In “Good Night, and Good Luck”, he creates a smoky, jazzy mood of complete solemnity by removing all the color and shooting in black and white. A jazz singer serves as a segue between scenes, while cigarette smoke fills nearly every frame. With the black-and-white “gimmick”, the visual drama has to come from the sharp contrast between light and shadow, as well as from the different focal lengths of the lenses used to distort what the eye would see. An example of this is the way Strathairn is shot when on air: the camera uncomfortably close, dark shadows looming from under his eyebrows, his face in focus but his ears out of it. In addition, Clooney creates tension not from the action, but from what the reaction to it is going to be. Will it be positive? Only history will tell. Oh, right. Well, in any case, we’re there wondering, just like in Titanic.
The last thing that makes this movie work is that the source material is interesting to begin with. If we were treated to a story of “September 11th” with an actor playing Peter Jennings broadcasting for 24 hours straight, it would be gripping. It would probably be more interesting to see a movie about McCarthy, but that’s not the point Clooney wanted to make, which brings me to my final thought.
I saw Clooney on Oprah today, and he claimed that he wasn’t trying to preach anything with this movie, except maybe journalistic responsibility. I’m not blinded by his ruse. This movie is as much a parable about our state of affairs today as “The Crucible” was about the actual McCarthy era. There are blatant lines of dialogue referring to holding people without evidence, trying them without letting them see said evidence, and labeling people “communists” (as much a jingoistic phrase as “terrorist” is today) and traitors. I have no problem with his artistic expression, and I commend him for not discussing his politics on television, but don’t lie about it. Like any great work, it’s open to lots of different interpretations and relevancies, and can incite intelligent dialogue. I would rather have him say, just as Murrow approximately said, “I have presented the facts, and the rest is up to you to decide”. The work will stand on its own, however, and we’ll see where it ends up come awards season.

**½

I’m a harsh grader, especially when everyone else loves a movie. “Good Night, and Good Luck” receives two-and-a-half stars for being nothing more than a well-directed, well-shot, well-acted movie made up of reenactments of famous television moments and long stretches of file footage. While it presents interesting ideas, it doesn’t do much to discuss them, and there’s surprisingly little humanity presented for a risk that most people would at least have second thoughts about taking, especially among the faceless staff members.