Football (in general) pt.2

In response to my own review about football…

As quickly as games get more interesting, they can just as easily slow baaaaaacckkk doooowwn. Hopefully the end will pick up a little bit.

**

There are 52 seconds left, and it just seems like the Seahawks aren’t going to get it done. When the end seems so obvious, it gets hard to watch.

Thinking You’re Doing Something Original

Site note: to potentially get some commenting going on, you can now leave a comment without needing to fill in an e-mail address. What’s that sound? Why, it’s the sound of accountability going out the window!

Hmmph. I never thought we’d be “pioneers” in this little internet endeavor, but I did think that, at the minimum, we’d bring something new to the table in some capacity. Well, now that we’re about two-and-a-half months into its existence, I’ve found that, no, we’re not even bringing anything new to the table.

It’s true.

The story goes something like this:
Nate’s friend Pete submits this very website to a “community-driven links database” called digg.com. The way it works is anyone can submit a link and its description, then the “community” rates and sort of reviews it. No, digg.com isn’t the source of the frustration; it’s not a “reviews” site by any stretch of the imagination. It provides a framework for categorizing and ranking links that usually have to do with technology, computers, science, etc.; that’s about it. Looking at what Nate’s friend submitted to digg, we see that four people “digg-ed” it and two people thought they had something worthwhile to add to the discussion. Mr. “schwit,” playing the typical “internet”-role, informed everyone (in the form of a question, of course) that this link (our site) has nothing to do with technology, while a second commenter, “JohnH,” trotted out the straw that finally broke the camel’s back, explaining that “[we’re] ok, [I] thought Lore did it better though: http://bookofratings.com.”

Please everyone, click that link, and just like me, die a little inside.

Upon learning of the “Lore”-person’s site, I went to its archives and saw a very disturbing link to “buy the print version.” This led to the following e-mail to Nate and Adam on 1/1/06 (please note, I’m hilarious all the time, not just when writing reviews):

(Adam, this is a follow-up to a conversation about the below topic that I had with Nate; I’m sure you can follow without needing to have the conversation explained to you.)

Here’s the digg.com link.

This is the website: http://www.bookofratings.com/. It looks like he might’ve stopped updating in 2003.

It looks like this Lore person (he’s from San Francisco apparently) managed to actually publish a book of his reviews.

I’d wager that the “Editorial Review” was written by the author, but I won’t hold that against him. What I will hold against him is the fact that he’s practically completely beaten us to the punch and even (potentially/probably) made some money off of it. He even reviewed the seven deadly sins one by one (you can see it in the “look inside this book” on the amazon site).

I’m okay with the concept of someone else doing a “wacky, random, etc.” reviews website, but looking through the Amazon reader reviews I see: “Now a lot of you ‘simple minded’ folk out there might not be interested due to Lore’s advanced and half made up vocabulary.” Now that’s just plain old reverse gimmick infringement. It doesn’t look like he reviews abstract concepts (“The Hype Surrounding This Week’s Trading Spouses,” “Verbally Harassing Horses,” etc.), but that’s probably just because I haven’t looked closely enough through his archives. The “Old Trading Cards I Bought at a Shop in San Francisco [Parts 1-3]” really seals our fates as imitators. Looking at the left of his reviews page, he has a list of other sites/projects. I’m afraid to click on them as I’m sure that one of them retells the story of his production of an action movie about Ben Franklin in 1999.

Now more than ever, we suck.

At the risk of simply repeating the rather straightforward e-mail…that’s right, reverse gimmick infringement. That way, we can blame him for copying us before we even did it. It doesn’t make much sense but it helps me sleep at night.

His reviews are all much shorter than ours, and it seems he likes reviewing things in list form (such as those baseball cards or “Aspects of Santa part 2”), but he always brings the funny. The reviews aren’t the most insightful, but that’s not his goal. For example, when reviewing “Stuff in the Airline Catalog,” one of the many items evaluated is an Authentic Pachinko Machine, about which he says, “I’m just glad it’s authentic, because once I ordered a pachinko game and I forgot to check the ‘authentic’ box and they sent me one of those little Cracker Jack toys where you have to get the little bee-bees on the puppy’s eyes or something and it lacked that authentic pachinko experience that I was hoping for.” To get an impression of the general length of his reviews, that’s the whole thing for the “pachinko machine,” but it was one of the six items in the “airline catalog” review. Disturbingly, it sounds just like something Nate or I would say, except this guy said it sometime before 2003, a good three years ago.

Nate mentioned that many of the reviews are focused on “internet-popular things,” and we usually avoid that stuff, but to be fair, for each “Dungeons and Dragons”-related review, he has one like “Types of Band-Aids.” In case any of you are looking to mesh his reviews with ours: he uses traditional letter grading, making his “A+” equivalent to our…oh, never mind.

½

Thinking You’re Doing Something Original receives half of one star due to the fact that not only is the internet unfathomably huge, it’s been huge for quite some time, and is getting, uh, huger, and that combination means that the likelihood of anyone doing something original dwindles each day. I mean, that’s fine, it’s progress and all, and don’t worry, we’re not going to be like Natalie Portman’s character in Garden State and be wacky/weird/random for the sake of originality, but I can’t help but think I’ll be at least a bit self-conscious about making sure I don’t review things that have already been reviewed by our more trailblazing precursors. I’m sure that most people (meaning: our readership) probably don’t consider this to be too big a deal, but given the amount of time it takes to write a what-we-hope-to-be-good review, much less maintain the website, it’s frustrating to see it sort of fall into someone else’s “been there, done that” category. Yeah, yeah, we know that we choose to spend the time writing, maintaining, etc., and we know that we’re only “busy” for the amount of time that we choose to spend, but still, it’s the principle of it. We get the half-star because our reviews aren’t one-trick ponies and we do evaluate serious things every now and then, something Mr. Lore seems to be too good for.

Pitchfork Media’s Top Album and Top Single of 2005

Obviously, at year’s end the opinion-based media puts out best-of/worst-of collections to (again, obviously) serve as a “year-in-review” without going month-by-month, the same way that respected publications don’t review albums track-by-track, instead covering the highlights of the “grand scheme” of the album, then investigating the highlights of particular tracks. Pitchfork Media, one of the leading internet-based music sites, is known for its devotion to “indie” music, though it does review non-indie music, specifically higher-profile hip-hop/rap albums. To get a feel for their editorial slant, the best comparison is that the way Spin seems “indier” than Rolling Stone, Pitchfork Media is that much “indier” than Spin. In fact, Pitchfork Media sometimes makes even Spin seem like Tiger Beat (to keep the magazine theme going). To get an impression of Pitchfork’s expectations, Weezer’s most recent album was given a 0.4 out of 10. Weezer never really spent much time as an “indie” band, as “The Sweater Song” and “Say it Ain’t So” were rather successful singles from their debut album, but their other albums have been reviewed in a less than favorable light, and needless to say, 0.4 out of 10 is the type of review you’d give to a band that has carnal knowledge of one’s mother, especially considering how much other music came out in 2005 that they didn’t even bother to review. For those of you concerned about bias showing up in my review, don’t fret; I didn’t like the most recent Weezer album either, but it wasn’t 0.4-out-of-10 bad, more like 5.5 or 6. Keep in mind that if I were to give them a 0.4, I can’t imagine to what degree my personal reviewing index would be messed up if I had to somehow figure out what I’d give a Scott Stapp CD. With such strong opinions, they’ve developed some “haters.” Tuning Fork is a blog which (supposedly) reviews Pitchfork’s reviews, and Sub Pop made a parody site entitled Popdork News. Enough about Pitchfork Media. They gathered lists of the top 50 albums and singles of the year, and I’m going to review their top picks.

A likely pitchforkmedia.com reader.

First and foremost, this review will be of both the top picks together. That way, it’s a bit more than just reviewing the top album and single; it also leaves room for nitpicking of the Pitchfork folks, though honestly, I’ll focus on the music, as the musically pretentious are usually critic-proof, if not critics themselves.

As the top album selected, Sufjan Stevens’ Illinois is quite the challenging pick, if only because The White Stripes released an album in 2005 and music critics love The White Stripes. In full disclosure, I had previously heard two songs from the record on the radio, but I didn’t actually realize they were from this record, so I was pleasantly surprised that once I tracked down the album, I could finally place the songs I had wondered about months prior. Well, the album itself is quite a collection. Mr. Stevens apparently has designs to make an album about each of the 50 states, but he’s in his 20’s and only has Michigan and Illinois to show for it in three years. That doesn’t change the quality of the music, but unless he ramps up his output, he’s going to be one of the busiest 140-year-olds I’ll have heard of.

The album, as the title suggests, involves the state of Abraham Lincoln and covers all sorts of Illinois-centered topics, including such random ones as Casimir Pulaski Day. I’m not reading between the lines: track 10 is titled “Casimir Pulaski Day.” Does the album meet the stringent requirements of Pitchfork Media? Let’s investigate.

  • singer-songwriter vibe? — check
  • bizarre instrumentation? — check
  • simultaneously straight-forward and “deep” lyrics? — check
  • a bit off-putting on the first listen-thru? — check

Moving away from those categories, in all seriousness, it’s a particularly solid album. Most every song is memorable, and none sound alike, yet none are out of place. The highlights: Track 1, “Concerning the UFO Sighting Near Highland Illinois,” Track 3, “Come on! Feel the Illinoise!,” and Track 9, “Chicago.” Whether or not it’s fair to call it “The Best Album of 2005” is not something I’m prepared to answer. Music critics live in a weird world where they seemingly are only interested in current music. Reviews aren’t released for albums that came out months ago and were missed; that’d show that the critics weren’t on top of things, and they’d lose “indie rock cred.” Needless to say, they listen to a lot of music in a calendar year, so being that I probably haven’t heard much of it, I’ll reserve judgement. Being that I’m not a music critic and have the luxury of having “older” music be my “new” music, I’ll say that “Give Up” by The Postal Service was probably my favorite album new to me in 2005, though it came out in 2003. All of that in mind, I’d consider “Illinois” to potentially be a fair pick for 2005 once I have enough hindsight working for me.

Pitchfork’s top single was a bit less agreeable to me. They picked “Hope There’s Someone” by Antony and the Johnsons. Nothing like (what I assume to be) a big sweaty white guy plaintively singing love songs like a big sweaty black woman. I’ll mention that his schtick is of the variety that “those that get it love it, and those that don’t get it never will.” Oh, I get it, and it stinks. People that can’t stand sardines aren’t “not getting it;” they’re just more sensitive to crap. To be fair, I’m talking only of the single, “Hope There’s Someone”; I’m not sure about the album as a whole, but the album isn’t what Pitchfork picked as its #1 anyway. The song itself…well, I’ll let the opening lyrics speak for its “message”…

Hope there’s someone
Who’ll take care of me
When I die, will I go

Hope there’s someone
Who’ll set my heart free
Nice to hold when I’m tired

Now, let it be said that there’s nothing wrong with those words; in fact, if I didn’t have to keep up my tough-guy persona, I’d even call them “nice.” But are they, as a Pitchfork writer called them, part of a “quiet, unself-conscious elegy for that long-lost bohemia, which was eventually decimated by AIDS, drugs, gentrification, and, perhaps, its own success”? No thank you.

Musically, you’d like listening to Antony‘s singing if you’re intrigued by someone singing just like Aaron Neville, but in the alto range instead of the soprano. Strike One. You say you also liked to listen to that guy in high school who’d sit at the piano armed with the knowledge of three chords and the conviction that banging the keys makes everything more dramatic. Strike Two. Being that this is Canadian Baseball, the song’s out.

***

Pitchfork Media’s Top Album and Top Single of 2005 receives three stars due to their insistence on keeping up their “indie rock cred” no matter what the cost. Of course, their top 10 singles were all over the place in terms of the whole indie rock thing (though “Since You Been Gone” does deserve a place on the list), but there’s nothing wrong with expanding horizons, at least temporarily. The top album was a great choice, but their top single fits the stereotypes just too well.

Block I of CUCVM, the Human Ribcage (1984 model), and Misc.

Note: I originally rated my final subject as a 1.5-star performance due to what I felt was excessive plagiarism. However, before I could publish this, our dictator chose to steal my triple-review format (though he admits this freely). Therefore, I awarded myself another half-star. Damn Commies.

Okay, I’ll be honest…I’ve been slacking it on this website. Many (meaning the eight dedicated readers we have) may remember when Nate even semi-demanded that I start posting. Maybe it was because I was a little too proud (probably not), but mostly it was due to time constraints and the simple fact that I have very little occurring in my life that I consider worthy of rating or could make remotely entertaining. Now that I’ve graced the pages of this website, does this mean that I’ve overcome these obstacles? Absolutely not. Unfortunately for all of us, this probably won’t stop any of our readers from continuing. So, now that I’ve managed to take advantage of what little downtime is so rarely presented to me during my endless pursuit of a respectable career, what have I chosen to review? I knew my topic had to be gripping, with bouts of violence, wit, romance, all while being constantly intelligent and entertaining. Of course, I threw all of these notions out the window when Dan decided to yet again pansy-fy me publicly (again, having readers in the single digits makes it no more public than usual, but it’s the principle that the public could eventually read it). So gather ’round as Adam recounts his version of the story.

So we were playing football the day after Thanksgiving. There were about ten people there, which is a pleasantly complete number, as any larger a group might have encouraged Dan to set up real offensive and defensive lines and eventually resulted in a lot of cussing. All of the Goletz siblings were there: Greg(g), Dave, and Tim. The significance being that they all share the same genetic foundations, and therefore it can be inferred that Dave and Tim possess judgement skills about on par with Gregg’s. Dan duffed a pass (not unusual in the cold) as I was crossing over, but not looking in the general direction of the play. Following my QB’s directive, I picked up what I thought to be a fumble and was quickly pushed down by my defender, one Josh “Barney” Clark (notice that had it been a fumble, I would have been ruled down by contact at that point). Landing flush on my right side, I took a moment to make sure the ball was secured, only to see Tim Goletz run up and jump into the air before landing on my left arm and pushing it into my chest.

To say the least, the pain was exquisite, but seemed well-focused in the upper left of my ribs, much more so than just having the wind knocked out of me. The downside to having completed the anatomy-intense portion of a sorta-medical program and having not yet started the physiology-intense portion is that when an injury occurs, you automatically think of everything that may have gone wrong, but have no idea how to definitively diagnose it. Big words like atelectasis, hemomediastinum, and pneumothorax began running through my head, though I couldn’t quite remember what any of them meant. Anyway, I shook off the pain after a few plays of minimal movement and went on to have both Barney and Gregg fall on my chest, the latter during a sports-blooper-reel-worthy post-interception clobbering.


Ouch…my pride (though it looks more delicious as it is tenderized)

Skip ahead to the following Tuesday. By this time, all of my bruises had healed, yet the chest pain persisted. Worse yet, it seemed to have intensified beyond my perception of a deep muscle bruise and prevented me from accomplishing much in the department of physical activity.

All of this led to the following conversation between a physician and me:

Dr (compressing 3rd rib): Does this hurt?
Me: Yes.
Dr (4th rib): Now?
Me: Yes.
Dr (5th rib, as I watch my chest push inward): Now?
Me: Christ! Don’t do that again!

So…yeah. I broke my left fifth rib about three centimeters from the sternum; my first professionally-confirmed broken bone. It hurts like Hell and never seems to get better, given that both treatments for it only seem to worsen the condition. Local heat, meant to increase blood flow to the area and thus facilitate healing, also increases pressure on the chest. Icing the area, meant to bring down the swelling, does so, but results in the bone becoming more mobile, irritating the area and causing it to swell yet again.

***½

Block I (the Animal Body) of Cornell University’s College of Veterinary Medicine receives 3.5 stars for its ability to cram 1.5 years of anatomy into a 2.5-month curriculum and still manage to assist students in retaining the information. Unfortunately, the program loses points due to its questionable layout: it teaches students the locations and relationships of organs and body systems, but waits to explain the functions/dysfunctions of said systems until well into the students’ second year. This inevitably leads to worry over the numerous conditions that may occur, without any information or knowledge to confirm or disprove those worries.

***

The 1984 model of the human ribcage receives 3 stars, based purely on its stellar reputation for protecting vital organs, yet its apparent (and hopefully rare) failure to hold its integrity after only 21 years of use. 3 stars may seem like a generous score for a product that, in all honesty, failed to meet my expectations. It’s also a scary thought to consider that the first component of the structure to fail was located in an area that, had the break been more serious, could have compromised my trachea, esophagus, heart, and lungs. However, it gains a few points given that its only failure to this date was at the hands of the Goletz brothers…who, as history has proven, serve only to destroy all that is good and bring misery to the world.

**

Adam’s first review receives 2 stars on account of its complete lack of focus and inability to capture the true, judgemental spirit of this website, by serving to tell a story more so than constructively reviewing a topic. I mean (come on!), what’s with all the cheap shots at the Goletzs-es? Throw in the blatant plagiarism of Dan’s earlier post in an attempt to mock him, and you’ve got a pretty piss-poor review. Its only saving grace is that it was a decent inaugural effort and displayed a touch of originality in its (once) unprecedented uber-triple-header format. Let’s hear it for obnoxiousness.

Roger Ebert’s Take on Video Games

Quick site note: This is the first review of either many or zero more that will use “tips.” When hovering over some links, text will pop up near your cursor. We’re not yet sure whether it’s annoying or if it enhances the writing. I especially find myself drowning in a sea of parentheses, and these “tips” solve that problem in a way that writing on paper never could. Feedback please.

Lately the non-review sections of Roger Ebert’s website have been filled with discussion on the merits of video games versus movies, and “the internet” has been abuzz with talk of him being an out-of-touch old man. His weekly Answer Man column has addressed the issue multiple times, namely his lack of interest in video games in general. I can’t find the absolute starting point for the whole debate, but I think it has to do with a reader objecting to Ebert’s awarding of one star to the movie adaptation of DOOM (he uses a four-star system, for those of you wondering how to reconcile his reviews with ours). The reader basically took offense at his generous one-star review because one section of the otherwise unremarkable adaptation paid super-close homage to the game. Ebert sufficiently served the reader by explaining that video game websites review movies on their own terms, and he will continue to review movies on his. What started the “controversy” was the final comment in his response:

“As long as there is a great movie unseen or a great book unread, I will continue to be unable to find the time to play video games.”

This led to (what I can only assume to be) countless angry letters from video game fans defending their XBoxen and poorly translated, endlessly sequeled, Japanese-sourced games (e.g., the Metal Gear and “Final” Fantasy series). True, that’s my bias showing through, but in the response to the letter that Ebert decided to run, he explained:

“I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art.”

That’s the one that really got the internet in a tizzy.

You’d think that after writing all of this, I’d be able to think of a funny caption. Well, that’s not the case.

The problem with video game fans (in general) is that they are relentlessly but selectively enthusiastic about their “art” of choice. There’s no point in my writing an e-mail to the movie “Answer Man,” being that it would be lost in the mountains of “You’ve never played Halo, Resident Evil, Final Fantasies 1 through 12, etc., so you suck”-type letters (not that I’d assume it’d automatically be printed, of course; I’m sure that every e-mail is read, but I think Jim Emerson (the site’s editor/blogger) probably handles most of the filtering). So, being that I have my own internet soapbox that ends in .com, here I go.

Without backing any of it up with fact or definitive history, I can guarantee that every medium of art has had to deal with detractors. Movies weren’t widely accepted as having any worthwhile value at their inception, especially considering that mankind had gotten used to the previous status quo from the past two-thousand+ years of seeing live actors performing on stage. On top of that, even movements within each art form have faced critics (again, with the lack of facts or evidence). People still argue about the merits of Jackson Pollock; imagine hundreds of years ago, when Baroque music was developing and becoming (again) the status quo, and *gasp* didn’t base all of its harmony on the 4th. Sure, the now “normal” root-3rd-5th harmony sounds right, but back then a lot of people didn’t like it one bit due to the “profane” nature of the major-3rd harmony (in terms of backing that up, I’ll hold a music professor I had responsible for defending that bit of trivia). In philosophical terms, the video game “medium” is about 25 years old and in only its second major movement. Consider the first to be the 2D era, which started with the first Atari system and ended with the Super Nintendo and Genesis. The second is the 3D era, which started with the Sony Playstation, Nintendo 64, and Sega Saturn. The third (sub)wave of the 3D era began in late November of 2005, when Microsoft began shipping the XBox 360. As mentioned earlier, gamers are notoriously selective in their passions, and some choose to be passionate about the hardware aspect of video gaming, so in response to those people:

  1. I know I’m glossing over lots of other systems.
  2. I know that Genesis came out before Super Nintendo.
  3. Atari probably wasn’t the first system, but realistically it started the whole console “thing.”
  4. I know that 2D games have come out for the “3D” systems — especially the Sega Dreamcast, but 2D vs. 3D is too significant a divide to ignore.

Structured music went through quite a bit of development before anything gained an historical foothold; specifically, J.S. Bach and Handel are still being widely performed today while almost the whole of still-existing Renaissance and Medieval music is relegated to prominence in academic environments only. Sure, movies “came of age” much more quickly than music (or even painting), with films from about thirty years after the proliferation of the medium still widely considered classics. Interestingly enough, film also experienced several technical and artistic waves. (The “maturation of computer-generated effects” probably being the academic-sounding, retrospective categorization of today’s “wave.”)

Video games have not yet experienced a true second artistic wave (the 2D/3D divide is of a technical nature). The gameplay advances of Grand Theft Auto 3 (namely: go wherever you want, do whatever you want, follow a story or play randomly) have inspired countless similar titles, the same way that DOOM began a wave of first-person shooters in 1993. They offered different experiences, but neither was the quantum leap experienced by movie-goers attending the first “talkies” in the 1920s. Newer hardware generations have enabled new features (namely graphical, plus some incredible advances in AI) in first-person shooters (and eventually bigger, prettier worlds in GTA-style games).

Anyone who says that if Roger Ebert would just play Halo or any other mass-market game, he’d develop a huge appreciation and change his mind is simply wrong. Halo’s story serves merely to give the player a reason to shoot things. Similarly, the Grand Theft Auto games’ stories (any game in the series, even way back into its 2D, overhead days) simply provide a reason to take part in the shenanigans for which the games have become (in)famous. Limiting the lens to newer games, even the story of Metroid Prime is just a tool the developers used to make the shooting more compelling, not the other way around. It’s not that there aren’t story-driven titles among newer games, it’s just that in popular games the story is supplemental. I know there are ambitiously enthusiastic fans of the story in the Halo games, but ask yourself if you’d play the game any differently if there were no story, just the mission goals list, then shooting things until the next numbered list appears, rinse, repeat. So, no, I’m not claiming that “new” (more accurately, “popular”) games are bad, just that they serve as poor evidence for one’s claim that video games are narratively engaging.

So, as a bit of a disclaimer, I’d consider my interest in current video games to be passive. I’m interested enough to read game reviews or watch someone play for about 10 minutes or so, but I don’t participate. I own no consoles, and the video card in my computer is from 2001. I’ve watched people play FarCry, Half-Life 2, DOOM III, Grand Theft Auto: San Andreas, both Halos, and on and on. (Those games are almost all shooters, but a complete list would be excessive, and these are some of the most popular and loudly defended of the last couple of years.) I hold no grudge towards new games, but my personal “golden age” of video gaming passed sometime in the late ’90s. Most recently, the games I’ve spent any considerable amount of time playing have been the Genesis version of John Madden NFL ’98 and the arcade version of Super Puzzle Fighter 2 Turbo, both running on a friend’s modded Xbox. As the understatement of the year, neither of these games is exactly what we’d consider story-heavy, but, in terms of bringing things full circle, both provide a very social experience with a group of people, which is exactly what is marketed as the number one feature of Microsoft’s Xbox 360, not its currently man-beast-esque hardware capabilities.

Getting back to Roger Ebert directly, we’ll now present the definitive quote of the “video games as art” discussion:

“I did indeed consider videogames inherently inferior to film and literature. There is a structural reason for that: Videogames by their nature require player choices, which is the opposite of the strategy of serious film and literature, which requires authorial control.”

At risk of this becoming a “my favorite game is more obscure than your favorite game” reverse sales-measuring contest, let’s first throw out every RPG. Ebert’s comment about “authorial control” initially sounds too heavy-handed to be anything other than hyperbole, but it should ring painfully accurate to RPG players. A decently modeled RPG lets the player assume the “role” of a character (or group of characters); the user can choose for his gameplay experience to be as dull as desired. His or her experience will be different from another player’s. Sure, that sounds ideal, maybe even like the one thing that would skew video games toward “art” status. But think of Choose Your Own Adventure books; they offer the reader a choice in making his or her own story. The first reaction to that is, “But they’re kids’ books, they’re not supposed to be good.” It goes without saying that there are plenty of widely appealing kids’ books, but if there were adult-oriented Choose Your Own Adventure books, would anyone read them? Would they be considered “literature?” Nope; and for good reason. It’s just a gimmick.

There are two story-heavy genres in video gaming: Role-Playing Games and Adventure games. (This is where I’m looking to avoid the obscurity-related reverse penis size contest.) RPGs having already been justifiably thrown out, that leaves Adventure games. Most anyone with a passing interest in video games has played an adventure game, but the genre’s popular peak was both dramatically short and intensely focused on one title (which really wasn’t that great a game, all things retrospectively considered). MYST (aside from being considered the “killer app” for PC CD-ROM drives) was hugely successful, and was undeniably an adventure game. There was a set story and very little room for non-linearity; provided you could figure out the “oh yeah, I guess that makes sense” puzzles, you were undoubtedly under the control of some “authorial” figure as you played. Though the graphics contributed to the overall mood, it was really the story and “art direction” that truly established the player’s sense of loneliness on the island throughout its history. The story created the puzzles (the single element of “gameplay”), not the other way around. Though this isn’t a review of MYST, it needs to be noted that it actually offered a rather passive gameplay experience; the puzzles were simplistic and the story dull, but the mind-bendingly amazing (for 1995) graphics sold everyone. Unfortunately, it became the benchmark for the Adventure game genre, leaving most everyone to think adventure games dull and pretty-yet-vapid after either finishing the game or (more likely) giving up once they’d gotten their fill of pretty pre-rendered pictures, thinking “Gee, I don’t get it.”

With the genre’s prime and popular example painting such an ugly picture for average users as time went on, the “mass market” PC gamers moved back towards more interactive games (such as Quake, more-or-less the beginning of the PC’s true 3D boom). During this time, adventure games were still being made, and George Lucas of all people was responsible for some of the best. Okay, George Lucas’ company actually made them, but trivia’s trivia. Sam & Max Hit the Road, Day of the Tentacle, Indiana Jones and the Fate of Atlantis, and Full Throttle were some of the best received adventure games of their time and are widely considered the classics of the era. Just like every other genre eventually shifted to 3D, adventure games followed suit.

LucasArts’ first of two 3D adventure games, Grim Fandango, offered one of the most interactively cinematical experiences in games, ever, no matter the genre. If we limit the focus of the “video games as art” discussion to whether or not a video game can present the apparently required “authorial control,” Grim Fandango offered the “authorial control” of a movie while engrossing the player in ways that movies and books simply can’t. Loosely inspired by some sort of Central American mythology of multiple underworlds, with souls wanting to end up in the final, 9th underworld, you play as Manny Calavera, a sort of travel agent in limbo between life and death. Manny “sells” travel packages for different routes to the underworld; the better the prospective travellers lived their lives, the quicker their trip to the 9th underworld. The “cleanest” souls get to ride on the “Number 9” train, which speeds them right to heaven, while those that face the travel agent (Manny) with more regrets are stuck with the less desirable methods, notably the long, dark walk through the underworld. All that is the setting; the actual story involves a conspiracy that Manny begins to uncover as he realizes that he keeps getting the “lesser” souls and that, due to the unmentioned sins of his human life, he’ll be stuck in limbo forever. Along the way, he meets a special lady, gathers a sidekick, meets a mortician performing an autopsy (one of the more harmlessly creepy characters in any movie, game, book, etc.), and ends up having to shoot someone to save his life. (In case you’re wondering, to kill a dead person, you apparently shoot them with a bullet that sprouts flowers, similar to the earth “taking over” a body buried in the ground.) Without spending forever talking about this game, I’ll simply say it’s a more cinematic experience than many movies: the story is the game, the voice actors are top quality, the art direction (which somehow combines Latin American influences with Art Deco) compares with any Hollywood production, and it offers an ending more emotional than most movies.

Which brings up the final thought: How many people have played Grim Fandango? How many have even heard of Grim Fandango? Not many. In fact, it’s usually considered the ultimate symbol of the adventure genre’s waning popularity. It came out in 1998, one year after Sierra had abandoned the King’s Quest franchise. Critical reviews were immensely favorable, but sales were not. Escape from Monkey Island, ultimately the final LucasArts adventure, entered and exited with a whimper as sequels to two of their most popular adventure franchises were cancelled for ‘current marketplace realities’ and ‘creative issues,’ reverse respectively. In any medium there’s a distinct divide between the commercial/popular and the artistic. There is sometimes cross-over between the two, and it seems that most fans of the “artistic” baselessly resent fans of the popular merely because it is the “popular.” Music has thrived with that divide, and the “indie” boom of the mid-’90s brought that awareness to the world of movies. Even today, Rolling Stone and Spin’s editors campaign for the “latest, greatest, obscurest” new music, while movie critics practically bet their credibility in defense of those same three superlative adjectives on some not-yet-known-about “indie movie.” Thankfully, we don’t live in France, where critics have been known to defend bad movies just to prove a point. Importantly, there is no true “indie”-vs.-otherwise divide in video gaming. There are no critics willing to champion some unheard-of game for the sake of getting more people to experience it. People’s expectations for video games are drastically different than for other media, and even with the internet, there is no true “indie” movement that produces and distributes unheard-of games the way that the major movie studios have arms dedicated to picking up obscure movies. There’s simply no My Big Fat Greek Wedding in the world of video games. The infrastructure isn’t set up to “get the word out,” and I’ll go out on a limb and say that in a society greatly affected by advertising and shiny things, video gamers are especially vulnerable to that advertising.

Looks like we covered a lot more than just “Roger Ebert and Video Games” in this one, so here we go, emptybookshelf’s first three-headed review! Let’s hear it for innovation.

**½

Roger Ebert’s Take on Video Games receives two-and-a-half stars due to the fact that, as well as he defends himself, he can’t help but come off as just another old person afraid of what the kids are up to. “Oh my God! How could a bunch of moving pictures ever be better than having the actual, live actors in front of the audience?! That’ll never work!!” Unfortunately for the video gaming industry, he has a decidedly correct take on the “games as art” issue. Judging just the popular games, he’s hit the nail on the head; they are diversions where interactivity is thought to remove the need for story. Aside from the fact that he’s said he has not played games, if he were to ever pick up a controller/mouse/keyboard/bongo, he wouldn’t be playing anything remotely cinematical. At the risk of going on yet another tangent, just the fact that you can use bongos to play a video game says something about them compared to movies.

*****

Though I claimed this wasn’t a review of Grim Fandango, I can’t help but consider this an ideal time to “star” it. It receives five stars for being the most engrossing of all adventure games, and dare I say, any game. That isn’t to say that it’s the best game ever, just the most cinematical, and in a non-girly way, potentially the most beautiful.

*

Internet Fanboys receive one star due to the fact that their existence and pedantry make it so a review of such a contentious topic needs to go on so many sidetracks. There’s something to be said for being enthusiastic about something, but there’s also something to be said for having some perspective. Not-so-oddly enough, Roger Ebert himself, probably one of the wittiest people on the planet, summed up the whole “fanboy” thing quite well in his review of Hackers:

“You should never send an expert to a movie about his specialty. Boxers hate boxing movies. Space buffs said ‘Apollo 13’ showed the wrong side of the moon. The British believe Mel Gibson’s scholarship was faulty in ‘Braveheart’ merely because some of the key characters hadn’t been born at the time of the story. ‘Hackers’ is, I have no doubt, deeply dubious in the computer science department. While it is no doubt true that in real life no hacker could do what the characters in this movie do, it is no doubt equally true that what hackers can do would not make a very entertaining movie.”

Dan’s review of The Myth of Christmas Starting Earlier Every Year


Dan prefers to think that Nessie does exist, because there’s no proof that it doesn’t.

Here we go again. It seems as though, once again, my opinion is wrong and has been invalidated by our site’s speech-impairing oppressor, the same man who makes up words like “opinionary” for use in his reviews. The opinion in question is my agreeance with the masses that the Christmas season is starting a bit earlier than normal this year. I have presented four facts proving that the department stores, media outlets, and product manufacturers have started promoting Christmas-themed items well before Thanksgiving. I provided dates for numerous events that occurred this year, not some vague concept of a time long ago, yet his rambling review is supposed to have more credibility than mine, just because it came more recently? I don’t see how this can fly. Sure, my facts may be wrong, and if presented with proper evidence that shows Santa coming to the mall before November 19th in any past year, or The Grinch airing before November 13th in the past, well then I am all about offering a retraction statement. Unfortunately for my detractors, I have very high doubts about said evidence’s existence. The reality is that Walmart has gone on record stating that their campaign, which started on November 1st this year, was the earliest it’s ever been. Toys R Us sent their first catalogue out the day after Halloween. Looking at the internet, it seems that either most of the evidence agrees with me, or it’s just more popular to agree with my point of view, as I’ve found numerous articles from places like the Chicago Tribune, one of Upstate New York’s top news outlets, and Dan’s favorite, USA Today. Of course, there are stores that are still sticking to the more traditional Thanksgiving-time start to the season, but even if just two stores started earlier, I would still be justified in saying that some stores are pushing Christmas merchandise earlier.

I suppose I’m getting away from Dan’s review, so let’s look at it, paragraph by paragraph. First of all, the picture caption: it says that I hate Christmas. While I actually laughed at the caption, it’s simply not true. In fact, Christmas is probably my favorite holiday, because there’s actually something to do, unlike the boring Thanksgiving, the all-too-saccharine Easter, and the incredibly depressing Valentine’s Day. Not only that, but nothing in my review states that I have any dislike for the holiday.

Next, he states that I have offered no valid negative effects of Christmas coming earlier each year. Had I offered the negatives, I’m sure I would’ve been chastised for taking up valuable space with clichéd arguments that one can find anywhere else on the worldwide web. If my implications in the review weren’t enough, I’ll put them explicitly. The continued expansion of the Christmas season has led to a decline in the celebration of the Thanksgiving holiday, and potentially, soon, the Halloween holiday. In addition, the purveyors of said trends run the risk of creating a dissatisfaction with the holiday spirit weeks before the holiday actually arrives, making it all the less enjoyable for us, the consumers, and for the people who have to deal with Christmas songs 24/7.

Next, it is stated that I offered no comparison to years before, which is completely inaccurate. I offer that Santa used to come on Thanksgiving, the entire reason the Macy’s Parade exists in the first place. I also offer that in my childhood, I don’t remember Christmas programming starting until at least after Thanksgiving, as I used to consider the showing of Rudolph and Frosty to be quite early. I then go on to say that it is completely inappropriate for candy to be Christmas-themed before Halloween, mostly because I’m not used to it being sold that early.

After this, he misreads my attempt at satire (in this specific case, exaggerating the start of the Christmas merchandising season to begin in July) as completely serious. In reality, I was searching for a picture of Santa on the beach, and this was the best picture I could find. I in no way actually believe that the Christmas season would ever start before Halloween (there’s too much merchandising money to be made off the Halloween holiday that Thanksgiving doesn’t offer, to say nothing of the risk of completely alienating consumers), let alone in July.

I suppose that by using this thought process, Dan is literally suggesting that I transform myself into some sort of sheep and time-travel back twenty years to see that Christmas music was playing on the radio on November 1st (which is not an exaggeration), see the error of my ways, and come begging on my knees for forgiveness for being “wrong”. I don’t pretend that I’m not agreeing with all the other half-wits who haven’t thought this through, but the last time I disagreed with all the sheep who were following each other in agreeance, I was ripped apart anyway.

I’m not averse to Christmas being a season. In fact, it is a season, and always has been in the Church calendar. But that season starts four weeks before Christmas. Even this year, with Christmas falling on a Sunday and Advent therefore at its maximum length, the season doesn’t start until November 27th, again, after Thanksgiving. My point was that this is the first year that I’ve seen significant proof that the people who have been harping on this point for years might be right. My disclaimer at the end effectively showed that in order to see if this is true, we would have to wait until a few years from now. Because I did not have the forethought to write down specific dates of things in the past, does that mean that my opinion should be considered wrong and invalidated? I don’t think so.

**

Dan’s Review of The Myth of Christmas Coming Earlier Every Year gets two stars, mostly because he presented little evidence to prove his case, instead relying on meandering, obscure ideas about the grass being greener on the other side and the probability that old people are wrong simply because they complain a lot and don’t always remember things. I’m not saying that he is wrong, per se, just that it appears as though my evidence greatly outweighs his, thereby lending more credence to my opinion. In addition, for a review that was specifically not supposed to be a review of my review, he spent more time discussing the merits of my ideas than presenting his own case. I may be lashed for speaking out against the upper management, but perhaps this will serve as the last of the unwarranted reviews of other people’s reviews, namely those presented by the Junior Staff.

The Myth of the Christmas Season Coming Earlier Every Year

The Junior Staff has done it again. Instead of reviewing his review per se, I’ll simply re-assess the topic through the lens of having read his review. The issue with his review is simply that it’s plain-old wrong and short-sighted.

Nate hates Christmas.

I know that Nate is older than I am (by a whole two months) and that the onset of his old age is even less graceful than mine. Does this mean that he’s moved into the territory of old-cooted-ness? Apparently yes. He offers no truly negative issues relating to the ballooning of the “Christmas Season,” except that it might begin to eventually float into his late-September birthday. In fact, that very day is already marked by a number of historical events and feasts for a variety of martyrs. Of course, Nate’s birthday isn’t included in these lists, but I’d wager that the populace at large would be more upset that the Christmas Season is encroaching on the anniversary of the Battle of the Sexes tennis match than on Nate’s birthday. That out of the way, on to the more general aspects of “the myth.”

Yes, the whole “Christmas-thing” starts early every year. But earlier each year? I doubt it. The Junior Staff offers no comparison to his youth, his parents’ youth, his grandparents’ youth, or even the creepy old guy’s down the street. In fact, he even says that he has “no historical evidence to back it up.” Now, I’m sure that “way back when,” it was different; those were the times when people walked to school uphill both ways and Christmas shopping, planning, etc. all began at 10am sharp on December 21st. Those were the “good old days,” and that’s the way they likes it (that’s not a typo). It would seem that the Junior Staff subconsciously remembers those times even though he was born during the Reagan administration. At least ten years ago (probably 15), I remember being at what was then the new BJ’s Wholesale Club on Airport Road. It was mid-September, and guess what, there was a section of the store selling Christmas junk (literally…like those robotic Santas that probably start hundreds of fires each year). Maybe the season’s early start is more widespread now than in the past, but it’s not like we see Christmas specials in July and August (Christmas in July sales aren’t Christmas sales, thank you very much). If retailers started pushing Christmas in the summer, it probably wouldn’t get very far; even though there are people who get their Christmas shopping done extremely early in the year, increasing the amount of Christmas advertising early in the year won’t convert the sane people who take care of it nearer to the actual date.

If Nate wants to complain that it comes early each year, that’s one thing (though it would be a rather trite review, which is probably why he instead reviewed the concept of it coming earlier each year), but giving credence to the myth is just bad news. People like to complain and people like to think it was better in the past. It’s like the story of the sheep who wanted to graze in the neighbor’s grass because it looked better. They went over to the neighbor’s and started to graze, only to then wish they were back on the original side. Well, this whole Christmas Season nonsense is like those sheep, except instead of wanting to graze in the neighbor’s field, they want to use a time machine to graze 20 years ago, when they “remember” that the grass was better. Of course, the grass wasn’t any better and most of them don’t even remember it, and a fair number weren’t even born yet.

Please don’t be one of those time-travelling sheep.

*

The Myth of the Christmas Season Starting Earlier Each Year receives one star due to the fact that, while it is not completely a fabrication of the sentimental, it is greatly exaggerated. Sure, way back when (maybe the time of Constantine?), Christmas was a day, not a season, but that distinction changed almost equally long ago. In the meantime, the season has grown, but it’s safe to say that it hasn’t done so during my (or any of my contemporaries’) lifetimes.

Nate’s Review of Good Night, and Good Luck.

Recently, this Site’s integrity has been challenged. A member of our Junior Staff, though well-intentioned, has violated one of the precepts of reviewing. This review reviews that review, explains its shortcomings, then concludes with an establishment of goals for both The Site and its Junior Staff.

Nate takes aim at the press but hits George Clooney instead.

Having also seen Good Night, and Good Luck., I’m more than adequately qualified to weigh in on the movie’s merits (or lack thereof). But why a review of Nate’s review instead of the movie itself? Nate made the oh-so-common mistake of confusing a movie’s hype with the actual movie itself (this confusion can occur with any reviewable product, not just movies). It’s not George Clooney’s fault that critics think his movie’s all that and a bag of chips. Nate didn’t separate the hype from the product, and because of that, he gave the movie an unfair review, which casts this Site in an equally unfair light.

What I assume to be Nate’s gripes about the movie, what I called its “superficiality” during our initial discussion of it (before the publishing of Nate’s review), should not be gripes. They should be supporting details, leading to an informed opinion, and therefore, an informed review. Was George Clooney doing something evil when he chose to let the historical actions speak for themselves? Is it wrong to assume that history can and will repeat itself? Even if George Clooney were to consider his movie a parable (I do not believe that it is or is meant to be a parable, just a vaguely cautionary tale), he’s not the first. If we were to consider this movie to be the thread connecting McCarthyism to the “war on terror,” we must remember that this same thread extends also to the Salem Witch Trials in Arthur Miller’s “The Crucible.” Though “The Crucible” explicitly called back/forward to the HUAC proceedings, it remained a rather superficial examination of a community of fear. Was Arthur Miller only giving two-and-a-half stars’ worth of effort in his famous play, simply because he had the (gasp!) audacity to think to himself, “Gosh, this has happened before, and it’s practically happening again”? Again, though I don’t consider “Good Night, and Good Luck” allegorical, I will say that any “depth” comes solely from the (re)viewers’ minds. If George Clooney were to say, “Gee, I hope that people vote Democrat after seeing my movie!” then go ahead and spend the effort bashing him (and his movie), because as an allegory, political tool, etc., it fails. It fails miserably.

Because the anti-political crowd (think of “The Daily Show” — soon to be mega-reviewed on this very site) is so large, vocal, and lacking perspective, they’re unaware of the fact that being against politicians (or claiming that a movie is politically preachy) is just as much a political opinion as hating Hillary Clinton is. If they’re looking for “Good Night, and Good Luck” to be a political tool, it will be. It’s been said that human minds create horror better than human eyes can. Given the freedom to imagine their personal nightmare as opposed to a finite, real horror, people imagine the worst. George Clooney gives the audience that opportunity: look in the box, and what you see is only what you want to see.

No, it’s not a perfect movie. It does lack depth; it does simply re-create existing history. The actors aren’t so much “acting” as “impersonating.” But despite all of this, it remains intriguing. Metaphorically, I knew McCarthy’s ship would sink, but that didn’t mean I didn’t want to watch him scramble for a lifeboat. It is not one of the “best films of the year;” it’s not particularly “important;” no one “needs to see this movie.” But even though other critics are on record saying these things about the movie, George Clooney is not. The movie speaks for itself. It doesn’t say much of value, but certainly more than two-and-a-half stars’ worth. It was the expectations set by the critics that went unmet, not any expectations set by the movie itself. This is not the fault of the movie or George Clooney. Once the unwarranted, incorrect hyperbole of the critics is cast off, what’s left? A particularly solid, entertaining movie, nothing else. This is not a review of the movie, but a review of Nate’s review. The absolute star ranking of the movie is not important, as it’s now widely understood that it’s better than the two-and-a-half bitter stars that Nate threw at it. Nate’s review was well-written, had a particularly funny caption for its photo, and maintained coherency despite its length, so I will be more fair in my review of his work than he was in his review of Clooney’s.

*½

Due to Nate’s nature as the Site’s Junior-Reviewer-at-Large, we can’t expect perfect, objective reviews. He’s only human. We all are. Should I hold The Site to a higher standard of quality, demanding insight and unbiased objectivity in reviews written by all contributors? Naturally I should (and so should all of the Junior Staff), but until that point arrives, we will use each review as an example in time, a time-capsule of sorts, of each writer’s strengths and weaknesses, so that the readership-at-large sees our Junior Reviewers accomplish all of their opinionary goals. What is insight without perspective? What is opinion without foresight? What are sweeping generalizations in the absence of nuance? What is getting on one’s soapbox without a safety net of objectivity? These are the questions for which I know the answers and for which The Site’s readership demands answers. We read on as our Junior Staff grabs the first handle in the philosophical jungle-gym that begins the pursuit of their own personal answers to these inquiries. Between the lines of each review we gain a clearer understanding of their answers. Between the lines of each review we see them learning to better tell others what to think. I have utmost confidence in The Site’s Junior Staff’s ability to not only learn from their mistakes, but to rise above them, and truly establish themselves, and therefore this Site, as a premier opinion-making entity in the world.

Nate, we’re all rooting for you.