The discussion from my posts continues over at GameSetWatch, and the latest comment there helped me clarify something that I think is going unsaid in this whole Roger Ebert/games as art/auteurs/narrative discussion.
Arts criticism is based on the assumption that there are objective standards by which we can judge a creative work. Each medium has its own, though often overlapping, criteria; the Atlantic has a short recurring feature called “Close Read: What makes good writing good” that tries to illustrate this.
Of course, no good arts critic simply goes down a checklist of brush work, cinematography, language use, graphics, and so on. A typical college-paper music review goes through a CD track by track, merely describing the songs; too many movie reviews still consist of a pithy opening paragraph followed by plot summary; and I link to GameSpot reviews all the time, but they structure each review identically and follow the checklist model.
Good critics synthesize these objective measures with a work’s context and history, the critic’s interpretations, and whatever other subjective measures the critic comes up with. A movie can be objectively bad but still great, and a good critic will explain why.
It’s also important to note that there’s nothing wrong with objectively “bad” art, and that a “bad” TV show like Full House can still affect you and be meaningful. To give another example, J.K. Rowling is a dreadful writer, but the Harry Potter books are still exciting, engaging and entertaining (the problem is when people say Rowling is better than she is and simultaneously attack other popular entertainment, but that’s another story).
If you accept these assumptions, then it's pretty clear that, as Ebert says (admittedly with little knowledge of actual games), the vast majority of video games thus far are inferior to the dominant forms of narrative art, according to the accepted measures of assessing that art.
Few video game scripts have approached the level of a decent novel in terms of language use, character depth, or structural experimentation. Few video game performances have approached the level of a decently acted movie or the better-than-a-live-performance voice-acting of the Simpsons crew. Few games have approached the level of nuance and purpose in a decent movie’s (or again, The Simpsons’) direction, camera shots and composition.
Now, Ebert attributes this vast gulf between video games and other narrative art to the lack of authorial control and the consequent increase in player control in a video game. I don’t think that’s an insurmountable factor but more a reflection of the state of games so far and a lack of gamemakers who are creative geniuses in areas other than programming and game design. Many people, however, have taken issue with Ebert’s reasoning for different reasons: that the increase in interactivity and player choices is worth the decrease in “quality” typical of one-way narrative art, or that player-created narratives are just as good as any others.
As far as the first point goes, that just agrees with Ebert but says he misses the point because video games are fun. In that case, why argue with him in the first place? The two aren’t mutually exclusive arguments.
Anyway, the brief point I wanted to make, in my own geeky way, is about roleplaying. Sure, no one is going to pretend that a D&D adventure is the work of an auteur, but adventures pretty obviously follow a story – they have to, or else where is the interest? The dungeon master comes up with a plot, and the player characters follow it through their choices. The players hold incredible power to derail your plot and ideas, to no end of frustration. Because the players can decide what they will do and when, does that mean there is no narrative? Not at all. It just changes the nature of the narrative from something pre-planned and concrete to a more fluid creation, one the DM has to react to and guide. The task of narrative shifts from showing a story to, almost, showing your players how to hear the story. It becomes a more interactive experience because you have to shape your players to think the way the narrative demands, but you have to do it subtly and invisibly so they still feel like they are making choices.
This is an important point (I started getting at this by bringing up Settlers of Catan and other board games yesterday), because D&D and other role-playing board games have been around for decades, and they are accepted and appreciated for what they are. These games — and less open-ended ones like Warhammer that replace explicit narrative creation with figures and models that players assemble and paint, and therefore form attachments to and weave into their army’s narrative — do show the fun and importance of player-created stories, of interactivity, of social gaming. But these are far different measures than what Ebert is talking about. Now we’re talking about assessing video games in terms of just that: games.
The D&D example is instructive because in terms of player creation and control, it's even more advanced than a sandbox game like SimCity. "The dungeon master comes up with a plot, and the player characters follow it through their choices." Not until massively multiplayer games did video games approach this level of player creation and control. But few people argue that D&D is as good as Lord of the Rings.
If people viewed video games as “just games,” that’s one thing — and not any kind of insult. The Arthur M. Sackler Gallery at the Smithsonian, where my dad works, recently had an exhibition called Asian Games: The Art of Contest, which explored “the role of games as social and cultural activities in the diverse societies of pre-modern Asia.” Games are important, worthy of study, and worthy of playing.
But there is this impulse to talk about — and make — video games in the context of other forms of popular entertainment and art (which effectively means narrative art, since most popular art is of the narrative kind). Once you do that, you can’t defend games as art simply by talking about what makes them great as games. That might help lead you to an understanding of why they also work as art, but it’s not enough in itself.
If we're going to discuss video games in the context of other art, they will remain inferior until the stories, scripts, acting, composition, etc. greatly improve. If we're going to discuss them as games, the defensiveness needs to go and we need to accept that we're talking about games.
I think a major factor in this defensiveness is that while board games have flourished in Europe over the past decade or so — check out the many intelligent, creative, varied, challenging offerings on Funagain Games that originated in Europe — in the U.S. board games are still seen as simplistic childhood fare like Sorry and Life; mind-numbing, geeky war games like Risk and Axis & Allies; or mind-numbing, geeky fantasy games like D&D. If games in general had more respect, people would be more comfortable talking about video games as games.
There's also a third option: Accept that video games can do just about anything and that we've barely scratched the surface of their possibilities. Some games will try to be like movies and books, and they should be judged accordingly. Some games will try to compete with board games and puzzles, and they should be judged accordingly. Some games won't be like anything we're familiar with, and they should be judged accordingly.
But let’s not make games into something they’re not. So what if there hasn’t been a video game Citizen Kane? Super Mario Bros. 3 is still awesome, Final Fantasy VII is still awesome, and Guitar Hero is the coolest thing I’ve ever touched. Not even Roger Ebert can argue with that.
— January 12, 2006