High mehfinition.
Evidence suggests an unresolved divide in the game industry between people who believe HD improves the gaming experience and people who think it's just useless flash and dazzle. We are now at the stage where someone's suggested Xbox 360 games be given different scores depending on the TV and resolution used during the review process. Is this an issue the game industry has discussed within itself, and decided upon? I suspect not. On the one hand, Microsoft has stipulated that 360 games must support HD resolution:
In an unprecedented move, all Xbox 360 games have been given the following edict: Support widescreen formatting and 720 progressive scan. These aren't suggestions, they're requirements, and the benefit to HDTV owners is significant.
But on the other hand, game developers haven't expressed a whole lot of excitement about this requirement. In fact, the current golden boy of the industry seems pretty decisively against it:
"The graphics on any game fall away within the first 10 or 15 minutes, and you're pretty much left with the interactive experience. Are you mentally engaged in what you're doing? (...) I'm only interested if I can achieve-and if the team I'm working with can achieve-an emotional impact or a story impact via what makes a game a game: interactivity."
Whether or not we choose to acknowledge it, this is a highly divisive issue in gaming circles and it isn't going away. Games are becoming accepted into popular culture, and the industry is beginning to recognize its maturation. The question is, will the unique quality of the medium -- interactivity -- be cited as the key reason for the curious to care about games? Or will developers allow console manufacturers to dismiss their gameplay experiences as raw materials only properly enjoyed via high-def displays and 5.1 surround? The gamers who've been around a while seem uniformly unimpressed by the old better-graphics-is-better-fun saw, but it's pretty well proven in the marketplace, and graphical excellence continues to be a key talking point for magazine previews.
If interactivity is going to win out over soft-edged self-shadowing at 1920x1200, there must be clear examples of games that overcame their unremarkable looks to succeed both in the marketplace and in the best-of-2005 articles. Fortunately, there's a real forehead-slapper of an example: the most popular game of the last few years was GTA, and it was not exactly a graphical powerhouse.
Perhaps there's life in this debate yet.
7 Comments:
How about a compromise and we call it "useful flash and dazzle"? Despite how you read Dave Jaffe's comments, I don't think he's suggesting graphics don't matter. I doubt he'd have produced one of the best-looking PS2 games ever if that were the case. His point is well taken, obviously, but would God of War have been nearly as "interactive" with 8-bit graphics? I think it's disingenuous to suggest that something like the Hydra battle could have worked nearly as well in a previous generation. On the other hand, the biggest criticism that can be made of God of War is how many parts of it are not advanced beyond previous games, and here of course I'm talking about pushing around crates and dodging balls of fire and all that. It all seemed so neat because it looked so much better.
The counterexamples abound, sure. You mention GTA. How about another one? The Sims is the best-selling video game of all time, and it's definitely not for the graphics. The Xbox 360 has been out for about four months, and until Ghost Recon came out, the game people were talking about most was Geometry Wars. But here's the problem: The Sims is the best-selling game because millions and millions of people who otherwise do not play games all bought one copy. You can't really bank on your game making that breakthrough if you're a developer. Instead you can count on the relatively small number of people who buy as many games as they can get their hands on. Those are the people who are playing in HD right now, and who do count graphics as an integral part of the overall experience. You'll never find someone who thinks good graphics take the place of good gameplay -- that's a straw man if I've ever heard one -- but try playing a good 360 game in HD and tell me the experience isn't improved. It just is.
There is a very sensible rationale behind Microsoft's position, as well. Television technology is moving inexorably toward high definition. Within a decade -- maybe less -- no one will have a standard-definition television as their main set. Not taking advantage of that now would be like ignoring the emerging DVD market in 1998. (Actually, it would be like releasing a cartridge-based system in 1996 -- not that I can think of any company that would do that!)
I feel like I could say more, but I've said little enough already.
Of course what I meant to say was "maek poast."
I was pretty sure that's what you meant.
Internet
I agree with most of that. I completely agree that God of War's Hydra battle was significantly enhanced by its graphical and aural presentation, and I must admit HDTV is here to stay. What I disagree with is that the effort put into audiovisual presentation is worth the cost, and that there aren't better things to spend the effort and money on.
First of all, I feel like an emphasis on presentation values can only hurt the industry in the long run. Game development budgets have exploded, and for one reason: content. Higher resolutions force more and more art to be created to fill the screen, and it all must look better and better. Procedural art generation is a popular argument, sure, but I don't see it displacing human labor unless the games themselves become utterly sterile. The artist's touch will never go away entirely, and that means content budgets will continue to increase as long as graphics are emphasized.
This keeps the industry digging itself deeper and deeper, because each new game needs to look better than the last, which necessitates a bigger budget, which necessitates huge sales, which tends to stifle innovation. I know you're calling this a straw man argument, but I really am serious that people tend to prefer better graphics to better gameplay, assuming they can only have one or the other. My position is that, increasingly, they can only have one or the other. Games with both (God of War) sell very well indeed, but that's a rare case. The game industry creates at least as many remakes as Hollywood does, and it does this because people respond more to better presentation values than they do to better gameplay. Multiplatform titles are great anecdotal evidence of this: one set of content is amortized across three SKUs, and a generic control scheme is usually grafted onto each controller. These multiplatform titles end up being a great example of what happens when you stretch a fixed budget over both graphics and gameplay: a mediocre game results, on all platforms. It seems like a case of one or the other.
I would argue that The Sims is not actually a good example of gameplay over graphics, because it's an edge case. It's a runaway success in a genre of its own, and it's crossed over most of the traditional gaming demographic lines. I suggested GTA because it dominates a well-established genre with a number of competitors, despite not being the best at any one thing. In GTA's case, it's successful because they spent the money on gameplay and accepted highly mediocre graphics for their multiplatform title. Had they developed their own graphics engine instead of using RenderWare, I suggest the game would not have turned out to be as compelling an experience as it did.
I'm not saying that 360 games aren't better in HD than SD, or that HD isn't a valuable addition to gaming. I'm saying that HD is an unfortunate thing to emphasize, because it's like Microsoft locked up 80% of the development budget from the get-go. If the 360 didn't support HD, sure, art budgets would be smaller, but I don't think the savings would be merely proportional, because the pressure to outperform graphically eats up more people's time than just the artists'. The time spent on realistic water algorithms could've been spent elsewhere, and maybe the game design doc never would've had to deal with the side effects of view-limiting SuperFog3d(tm) in the first place.
And finally, on a different note: Requiring HD and SD scores for 360 games is a terrible, terrible idea, because then developers have to spend even more effort tweaking their presentation so that both resolutions look good. Even more time and money down the drain.
Admittedly, I'm ignorant of the development process. I guess I assumed good graphics come from having the right people more than from the time or money you spend on a graphics engine. But it makes sense that overemphasizing one area can lead to deficiencies in others.
Even so, great games tend to do everything well. I think your point about the pitfalls of developing games across platforms is an excellent one, and something I haven't really thought about. My favorite games in the past year or so have been Resident Evil 4, Shadow of the Colossus, Guitar Hero, and Nintendogs. All of them were developed for a single platform.
I also agree, though I didn't mention it before, that assigning two different scores is a bad idea. It probably warrants a mention whether you were playing in HD or not, though. But I don't know. Do PC reviewers generally provide their system specs?
Hm, good point. Generally PC game reviews include both system specs and short notes regarding how well the game plays on older systems, but performance on legacy systems tends not to affect final scores.
That seems like a good compromise, actually: mention how well the game plays in SD, but target the review at people playing "as intended." It also drags up the notion of "HD cheaters" from a while back, when there were fears that HDTVs and surround sound would unbalance the online playing field in ways console gamers hadn't had to worry about before.