Quick site note: This is the first review of either many or zero more that will use “tips.” When hovering over some links, text will pop up near your cursor. We’re not yet sure whether it’s annoying or whether it enhances the writing. I especially find myself drowning in a sea of parentheses, and these “tips” solve that problem in a way that writing on paper never could. Feedback, please.
Lately the non-review sections of Roger Ebert’s website have been filled with discussion of the merits of video games versus movies, and “the internet” has been abuzz with talk of him being an out-of-touch old man. His weekly Answer Man column has addressed the issue multiple times, namely his general lack of interest in video games. I can’t find the absolute starting point of the whole debate, but I think it has to do with a reader objecting to Ebert’s awarding of one star to the movie adaptation of DOOM (he uses a four-star system, for those of you wondering how to reconcile his reviews with ours). The reader basically took offense at his generous one-star review because one section of the otherwise unremarkable adaptation paid super-close homage to the game. Ebert sufficiently served the reader by explaining that video game websites review movies on their own terms, and he will continue to review movies on his. What started the “controversy” was the final comment in his response:
“As long as there is a great movie unseen or a great book unread, I will continue to be unable to find the time to play video games.”
This led to (what I can only assume to be) countless angry letters from video game fans defending their XBoxen and poorly translated, endlessly sequeled, Japanese-sourced games (i.e. the Metal Gear and “Final” Fantasy series, etc.). True, that’s my bias showing through, but in response to the letter that Ebert decided to run, he explained:
“I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art.”
That’s the one that really got the internet in a tizzy.
The problem with video game fans (in general) is that they are relentlessly but selectively enthusiastic about their “art” of choice. There’s no point in my writing an e-mail to the movie “Answer Man,” since it would be lost in the mountains of “You’ve never played Halo, Resident Evil, Final Fantasies 1 through 12, etc., so you suck” letters (not that I’d assume it would automatically be printed, of course. I’m sure that every e-mail is read, but I think Jim Emerson (the site’s editor/blogger) probably handles most of the filtering). So, since I have my own internet soapbox that ends in .com, here I go.
Without backing any of it up with fact or definitive history, I can guarantee that every medium of art has had to deal with detractors. Movies weren’t widely accepted as having any worthwhile value at their inception, especially considering that mankind had gotten used to the previous status quo from the past two-thousand-plus years of seeing live actors performing on stage. On top of that, even movements within each art form have faced critics (again, without facts or evidence). People still argue about the merits of Jackson Pollock; imagine hundreds of years ago, when Baroque music was developing, becoming (again) the status quo, and *gasp* didn’t base all of its harmony on the 4th. Sure, the now “normal” root-3rd-5th harmony sounds right, but back then a lot of people didn’t like it one bit due to the “profane” nature of the major-3rd harmony (in terms of backing that up, I’ll hold a music professor I had responsible for defending that bit of trivia). In philosophical terms, the video game “medium” is about 25 years old and in only its second major movement. Consider the first to be the 2D era, which started with the first Atari system and ended with the Super Nintendo and Genesis. The second is the 3D era, which started with the Sony PlayStation, Nintendo 64, and Sega Saturn. The third (sub)wave of the 3D era began in late November of 2005, when Microsoft began shipping the Xbox 360. As mentioned earlier, gamers are notoriously selective in their passions, and some choose to be passionate about the hardware aspect of video gaming, so in response to those people:
- I know I’m glossing over lots of other systems.
- I know that Genesis came out before Super Nintendo.
- Atari probably wasn’t the first system, but realistically it started the whole console “thing.”
- I know that 2D games have come out for the “3D” systems — especially Sega Dreamcast, but 2D vs. 3D is too significant of a divide to ignore.
Structured music went through quite a bit of development before anything gained a historical foothold; specifically, J.S. Bach and Handel are still widely performed today, while almost the whole of still-existing Renaissance and Medieval music is relegated to academic environments. Sure, movies “came of age” much more quickly than music (or even painting), with films from about thirty years after the proliferation of the medium still widely considered classics. Interestingly enough, film also experienced several technical and artistic waves. (The “maturation of computer-generated effects” is probably the academic-sounding, retrospective categorization of today’s “wave.”)
Video games have not yet experienced a true second artistic wave (the 2D/3D divide is of a technical nature). The gameplay advances of Grand Theft Auto 3 (namely: go wherever you want, do whatever you want, follow a story or play randomly) have inspired countless similar titles, the same way that DOOM began a wave of first-person shooters in 1993. They offered different experiences, but neither was the quantum leap experienced by movie-goers attending the first “talkies” in the 1920s. Newer hardware generations have enabled new features (mostly graphical, plus some incredible advances in AI) in first-person shooters (and eventually bigger, prettier worlds in GTA-style games).
Anyone who says that if Roger Ebert would just play Halo or any other mass-market game, he would develop a huge appreciation and change his mind is simply wrong. Halo’s story serves merely to give the player a reason to shoot things. Similarly, the Grand Theft Auto games’ stories (any game in the series, even way back in its 2D, overhead days) simply provide a reason to take part in the shenanigans for which the games have become (in)famous. Limiting the lens to newer games, even the story of Metroid Prime is just a tool the developers used to make the shooting more compelling, not the other way around. It’s not that there aren’t story-driven titles among newer games; it’s just that in popular games the story is supplemental. I know there are ambitiously enthusiastic fans of the story in the Halo games, but ask yourself if you’d play the game any differently if there were no story, just the mission goals list, then shooting things until the next numbered list appears, rinse, repeat. So, no, I’m not claiming that “new” (more accurately, “popular”) games are bad, just that they serve as poor evidence for the claim that video games are narratively engaging.
So, as a bit of a disclaimer, I’d consider my interest in current video games to be passive. I’m interested enough to read game reviews or watch someone play for about 10 minutes or so, but I don’t participate. I own no consoles, and the video card in my computer is from 2001. I’ve watched people play FarCry, Half-Life 2, DOOM III, Grand Theft Auto: San Andreas, both Halos, and on and on. (Those games are almost all shooters, but a complete list would be excessive, and these are some of the most popular and loudly defended of the last couple of years.) I hold no grudge against new games, but my personal “golden age” of video gaming passed sometime in the late 90s. Most recently, the games I’ve spent any considerable amount of time playing have been the Genesis version of John Madden NFL ’98 and the arcade version of Super Puzzle Fighter 2 Turbo, both running on a friend’s modded Xbox. As the understatement of the year, neither of these games is exactly what we’d consider story-heavy, but to bring things full circle, they provide a very social experience with a group of people, which is exactly what is marketed as the number-one feature of Microsoft’s Xbox 360, not its currently man-beast-esque hardware capabilities.
Directly addressing Roger Ebert, we’ll now present the definitive example of the “video game as art” discussion:
“I did indeed consider videogames inherently inferior to film and literature. There is a structural reason for that: Videogames by their nature require player choices, which is the opposite of the strategy of serious film and literature, which requires authorial control.”
At the risk of this becoming a “my favorite game is more obscure than your favorite game” reverse sales-measuring contest, let’s first throw out every RPG. Ebert’s comment about “authorial control” initially sounds too heavy-handed to be anything other than hyperbole, but it should ring painfully true to RPG players. A decently modeled RPG lets the player assume the “role” of a character (or group of characters); the player can choose for his gameplay experience to be as dull as desired. His or her experience will be different from another player’s. Sure, that sounds ideal, maybe even enough to consider it the one thing that would skew video games toward “art” status. But think of Choose Your Own Adventure books; they offer the reader a choice in making his or her own story. The first reaction to that is, “But they’re kids’ books; they’re not supposed to be good.” It goes without saying that there are plenty of widely appealing kids’ books, but if there were adult-oriented Choose Your Own Adventure books, would anyone read them? Would they be considered “literature”? Nope, and for good reason. It’s just a gimmick.
There are two story-heavy genres in video gaming: role-playing games and adventure games. (This is where I’m looking to avoid the obscurity-related reverse penis-size contest.) RPGs having already been justifiably thrown out, that leaves adventure games. Most anyone with a passing interest in video games has played an adventure game, but the genre’s popular peak was both dramatically short and intensely focused on one title (which really wasn’t that great of a game, all things retrospectively considered). MYST (aside from being considered the “killer app” for PC CD-ROM drives) was hugely successful and was undeniably an adventure game. There was a set story and very little room for non-linearity; provided you could figure out the “oh yeah, I guess that makes sense” puzzles, you were undoubtedly under the control of some “authorial” figure as you played. Though the graphics contributed to the overall mood, it was really the story and “art direction” that truly established the player’s sense of loneliness on the island throughout its history. The story created the puzzles (the single element of “gameplay”), not the other way around. Though this isn’t a review of MYST, it needs to be noted that it actually offered a rather passive gameplay experience; the puzzles were simplistic and the story dull, but the mind-bendingly amazing (for 1993) graphics sold everyone. Unfortunately, it became the benchmark for the adventure game genre, leaving most everyone thinking the games dull and pretty-yet-vapid, muttering “Gee, I don’t get it” after either finishing the game or (more likely) giving up once they’d gotten their fill of pretty pre-rendered pictures.
With the genre’s prime and popular example painting such an ugly picture for average users as time went on, “mass market” PC gamers moved back toward more interactive games (such as Quake, more or less the beginning of the PC’s true 3D boom). During this time, adventure games were still being made, and George Lucas, of all people, was responsible for some of the best. Okay, George Lucas’ company actually made them, but trivia’s trivia. Sam & Max Hit the Road, Day of the Tentacle, Indiana Jones and the Fate of Atlantis, and Full Throttle were some of the best-received adventure games of their time and are widely considered the classics of the era. Just as every other genre eventually shifted to 3D, adventure games followed suit.
LucasArts’ first of two 3D adventure games, Grim Fandango, offered one of the most interactively cinematical experiences in games, ever, no matter the genre. Limiting the focus of the “video games as art” discussion to whether or not video games can present the “authorial control” Ebert apparently requires, Grim Fandango offered the “authorial control” of a movie while engrossing the player in ways that movies and books simply can’t. Loosely inspired by some sort of Central American mythology of multiple underworlds, with souls wanting to end up in the final, 9th underworld, you play as Manny Calavera, a sort of travel agent in limbo between life and death. Manny “sells” travel packages for different routes to the underworld; the better the prospective travellers lived their lives, the quicker their trip to the 9th underworld. The “cleanest” souls get to ride the “Number 9” train, which speeds them right to heaven, while those who face the travel agent (Manny) with more regrets are stuck with less desirable methods, notably the long, dark walk through the underworld. All that is the setting; the actual story involves a conspiracy that Manny begins to uncover as he realizes that he keeps getting the “lesser” souls and that, due to the unmentioned sins of his human life, he’ll be stuck in limbo forever. Along the way, he meets a special lady, gathers a sidekick, meets a mortician performing an autopsy (one of the most harmlessly creepy characters in any movie, game, or book), and ends up having to shoot someone to save his life. (In case you’re wondering, to kill a dead person, you apparently shoot them with a bullet that sprouts flowers, similar to the earth “taking over” a body buried in the ground.)
Without spending forever talking about this game, I’ll simply say it’s a more cinematic experience than many movies: the story is the game, the voice actors are top quality, the art direction (which somehow combines Latin American influences with Art-Deco) compares with any Hollywood production, and it offers an ending more emotional than most movies.
Which brings up the final thought: How many people have played Grim Fandango? How many have even heard of it? Not many. In fact, it’s usually considered the ultimate symbol of the adventure genre’s waning popularity. It came out in 1998, one year after Sierra had abandoned the King’s Quest franchise. Critical reviews were immensely favorable, but sales were not. Escape from Monkey Island, ultimately the final LucasArts adventure, entered and exited with a whimper as sequels to two of their most popular adventure franchises were cancelled over ‘current marketplace realities’ and ‘creative issues’ (reverse respectively). In any medium there’s a distinct divide between the commercial/popular and the artistic. There is sometimes crossover between the two, and it seems that most fans of the “artistic” baselessly resent fans of the popular merely because it is the “popular.” Music has thrived with that divide, and the “indie” boom of the mid-90s brought that awareness to the world of movies. Even today, Rolling Stone’s and Spin’s editors campaign for the “latest, greatest, obscurest” new music, while movie critics practically bet their credibility defending those same three superlative adjectives applied to some not-yet-known “indie movie.” Thankfully, we don’t live in France, where critics have been known to defend bad movies just to prove a point. Importantly, there is no true “indie” vs. otherwise divide in video gaming. There are no critics willing to champion some unheard-of game for the sake of getting more people to experience it. People’s expectations for video games are drastically different than for other media, and even with the internet, there is no true “indie” movement producing and distributing unheard-of games the way the major movie studios have arms dedicated to picking up obscure movies. There’s simply no My Big Fat Greek Wedding in the world of video games.
The infrastructure isn’t set up to get the word out, and I’ll go out on a limb and say that in a society greatly affected by advertising and shiny things, video gamers are especially vulnerable to that advertising.
Looks like we covered a lot more than just “Roger Ebert and Video Games” in this one, so here we go, emptybookshelf’s first three-headed review! Let’s hear it for innovation.
Roger Ebert’s take on video games receives two-and-a-half stars because, as well as he defends himself, he can’t help but come off as just another old person afraid of what the kids are up to. “Oh my God! How could a bunch of moving pictures ever be better than having actual, live actors in front of the audience?! That’ll never work!!” Unfortunately for the video gaming industry, he has a decidedly correct take on the “games as art” issue. Judging just the popular games, he’s hit the nail on the head; they are diversions where interactivity is thought to remove the need for story. Aside from the fact that he’s said he has not played games, if he were to ever pick up a controller/mouse/keyboard/bongo, he wouldn’t be playing anything remotely cinematical. At the risk of going on yet another tangent, the mere fact that you can use bongos to play a video game says something about games compared to movies.
Though I claimed this wasn’t a review of Grim Fandango, I can’t help but consider this an ideal time to “star” it. It receives five stars for being the most engrossing of all adventure games, and dare I say, any game. That isn’t to say that it’s the best game ever, just the most cinematical, and in a non-girly way, potentially the most beautiful.
Internet fanboys receive one star because their existence and pedantry make a review of such a contentious topic go on so many sidetracks. There’s something to be said for being enthusiastic about something, but there’s also something to be said for having some perspective. Not-so-oddly enough, Roger Ebert himself, probably one of the wittiest people on the planet, summed up the whole “fanboy” thing quite well in his review of Hackers:
“You should never send an expert to a movie about his specialty. Boxers hate boxing movies. Space buffs said ‘Apollo 13’ showed the wrong side of the moon. The British believe Mel Gibson’s scholarship was faulty in ‘Braveheart’ merely because some of the key characters hadn’t been born at the time of the story. ‘Hackers’ is, I have no doubt, deeply dubious in the computer science department. While it is no doubt true that in real life no hacker could do what the characters in this movie do, it is no doubt equally true that what hackers can do would not make a very entertaining movie.”