Boy, did I pick the wrong day to go to Sea World. I get back and not only do I have very kind shout-outs from Chris Kohler at Wired, Kotaku, and GameSetWatch, but every gaming site seems to have gotten back from its New Year’s break and has a zillion posts worth linking to or writing about. Plus CES starts Thursday, and some companies have already given their big presentations. Ahhh! But hey, I got to pet dolphins and see a beluga whale. It’s a good tradeoff.
Before I get to everything else, I want to respond to Kotaku’s post on my post about the problems with and potential limitations of next-gen graphics. Jumping off from columns in Slate and Wired, I talked about how the graphics in early Xbox 360 games are approaching the point where they’re so good that we start comparing them to real life. (Some recent PS2, Xbox, and GameCube games look amazing, but even God of War isn’t trying to look real.) But like most CGI in movies, they’re far from capturing reality. In linking to my post, Kotaku says: “We’ve all seen Gears of War footage by now, those guys don’t look plastic-y and their giant suits definitely have some weight to them. Remember, the original Xbox didn’t really hit its stride visually until the last 50 percent of its lifecycle – we need to give the developers a little time to craft and create.”
First, yes, we’ve seen the Gears of War footage and the PS3 Killzone footage. Let’s wait until the games actually come out to see how good they look. Second, I don’t say in my post that game companies should always settle for Super Nintendo or PlayStation graphics. I wrote that “for all the capabilities of the next-gen systems, game developers might be facing a future of diminishing returns when it comes to graphics.” A big reason costs and prices are going up in the next generation is the expectation of unheard-of graphics. If there’s a limit to how good even Xbox 360 or PlayStation 3 graphics can be, then this upward spiral of costs and prices isn’t always worth it.
I’m also not saying that CGI in movies stinks. I’m saying it doesn’t look real. The Pixar movies are full of wonder; Sky Captain and the World of Tomorrow is an Alex Ross comic come to life; Sin City, empty and sadistic as it is, looks like no other film. But Jar Jar Binks looks like what he is: a superimposed computer creation. I know I’m in the minority on this, but Gollum isn’t that much better. However realistic CGI characters get, when they’re inserted into otherwise real scenes, they still ultimately look fake.
The problem for video games is that this grasping at reality costs a lot of money. As Edward Jay Epstein writes in Slate about too much CGI hurting movies, “Even with advances in computing power, CGI remains incredibly expensive. In Terminator 3: Rise of the Machines, for example, the budget for computer work, including visual effects, creature effects, and special effects, was $28.2 million.” If moviemakers spend $30 million on graphics and can’t come up with anything as good as the original Millennium Falcon, what chance do video games have? Sure, Gears of War might turn out to have better physics modeling than any other game. But games featuring real people — like, say, the EA Sports lineup, which (along with Will Wright’s games) props up the biggest game company in the world and is thus key to the industry’s health — are years away from competing with what we see on TV.
Again, the solution isn’t for every game developer to return to 16-bit, 2-D gaming. But the reality of graphics’ unreality should give developers pause. When appropriate, when your graphics engine is good enough and lets your game do things that haven’t been done before, go for it. For most game companies, though, a glance at the glut of monotonous, sort-of-real Hollywood CGI should give ample reason to trust realism less and try something different.
— January 4, 2006