Un-Enlightened

By Shamus Posted Friday Feb 22, 2008

Filed under: Game Design | 61 comments

Here is a YouTube video for Enlighten, an SDK that game developers can use to add indirect shadowing and radiosity to a scene.

Radiosity is a lighting technique that takes into account the fact that light bounces off of things. Your closet isn’t pitch black just because no lights are on in there. Light is bouncing off the walls in the room outside and being reflected into the closet, so you can still see the boxes of Christmas decorations and your dusty golf clubs. Radiosity also takes into account the color of the walls. If you shine a white light on a red wall, stuff nearby is going to look a little pink: the light coming off the wall is red, and it’s going to illuminate nearby stuff in red.
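
If you want the gist in code, here’s a toy single-bounce gather. To be clear, this is my own illustrative sketch, not Enlighten’s actual algorithm (that’s their secret sauce), and the “patches” and form factors stand in for whatever scene representation a real engine would use:

    // Toy single-bounce indirect lighting: the light a surface point
    // receives indirectly is the sum of the direct light hitting nearby
    // patches, tinted by the color of those patches. Real systems also
    // compute visibility between patches; this sketch assumes every
    // patch can see the receiver.
    #include <vector>

    struct Color { float r, g, b; };

    struct Patch {
        Color albedo;     // surface color: a red wall reflects mostly red
        Color direct;     // direct light arriving at this patch
        float formFactor; // geometric coupling to the receiver, 0..1
    };

    // Indirect light at the receiver = sum over patches of
    // (direct light * patch color * form factor).
    Color gatherIndirect(const std::vector<Patch>& patches) {
        Color sum{0.0f, 0.0f, 0.0f};
        for (const Patch& p : patches) {
            sum.r += p.direct.r * p.albedo.r * p.formFactor;
            sum.g += p.direct.g * p.albedo.g * p.formFactor;
            sum.b += p.direct.b * p.albedo.b * p.formFactor;
        }
        return sum; // white light on a red wall comes back red
    }

Now picture running that for every visible surface point, every frame, while the lights and the geometry move around. That’s why it took until now to pull off in realtime.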

I guess I should get this out of the way: Yes, this is very impressive. It’s wonderful what technology can do. Just ten years ago, effects like this were so insanely expensive in terms of processing cycles that you usually didn’t even want to use them for pre-rendered stuff. Now we can do it in realtime. The effects demonstrated are an amazing accomplishment, a blend of artistry and mathematical prowess.

But so what? Aside from saying, “gosh, look at this cool rendering stunt they can do”, why should I care? Is it nice? I guess so. Even under the idealized demo conditions the effect is so subtle I probably wouldn’t notice it if I wasn’t looking for it. Is it worth running out and buying a new graphics card? Nah. Is it worth the increased expense of development, because of the cost of the SDK and the work required from artists in order to take advantage of it? Of course not.

Yes, I’m on another one of my luddite rants. Note that I don’t have anything against pretty graphics, but I do have a problem with graphics at the expense of the game itself. As the race for better graphics goes on, developers are finding themselves making games that cost twice as much to develop, run on half as many computers, and look 8% better. We passed the point of diminishing returns with this stuff years ago, but PC developers, publishers, and reviewers can’t seem to stop the mad pursuit of the Shiny New Pixel. When a reviewer feels the need to ding STALKER because it only looks as awesome as last year’s games, the review system has graduated from mild eccentricity to full-on bat-shite crazy.

Longtime PC developer Cliff Bleszinski says that “PC Games are in disarray”. Epic Games, who began life on the PC platform, now see their PC efforts as “secondary”, and therefore (I assume) see themselves as primarily console developers. I doubt the thought has entered Bleszinski’s considerably intelligent head that maybe some of the blame for this lies at the feet of places like Epic. I had many, many gripes with the Unreal Tournament 2k series and with Unreal 2. I had long lists of things that put me off of both games. Nowhere on any of those lists was, “the graphics should be better.” Put the pixel shaders down and go make me a game I want to play, man.

I guess at some point enough PC developers will go under or get bought up and converted into console game developers. Once the herd is sufficiently thinned, the remaining ones might act on survival instinct and start looking for ways to stay in business, graphics be damned. There are more PCs in the world than all three of the major consoles combined. The PC has a few technological and interface advantages that can still set it apart. I’m convinced that the PC market doesn’t need to “die”, and it doesn’t need to be in “disarray”. It just needs developers that learn, or remember, how to make fun games.

 


 

61 thoughts on “Un-Enlightened”

  1. Deoxy says:

    Preach it, brother!

    Of course, you’re pretty much preaching to the choir, here, but at least the choir is enjoying it. I see no signs of change in the industry – “Darn the torpedoes, consumers! Full speed ahead.”

  2. Eltanin says:

    Can I get an Amen?

    Oh, damn, Deoxy beat me to that analogy. But seriously, I’m on my feet dancing, feeling the righteous indignation flow through my veins. As part of the choir, maybe we should stop just singing and move to action. How about visiting game developer sites and linking back to this blog? Spread the Word, brothas and sistahs! Or something.

  3. Deoxy says:

    “Amen!”

    snicker.

  4. Darin says:

    YES! This is GREAT!!!

    For games? Eh, not so much.

    For producing home CG movies? Oh, you bet. Imagine not having to worry about lighting when making your own home-brew CG movie, and instead letting it run in real time. Virtual actors on a virtual stage, now with real-time virtual lighting.

    One step closer to Hollywood’s demise! Muhahahaha!

    Okay, I step off the soap box.

  5. John Lopez says:

    While I agree with you that the effect is subtle, I suspect that there are two ways to look at it:

    The first is the way that you have, by seeing the costs for minimal gain. And frankly, in the short term it is exactly how it should be viewed.

    But… the tool chains will start to include support for this “out of the box” in the near future. Video cards that handle it will be common enough that they will be the “built-in video” on a later generation of machines. At that point the effect will be available for virtually free, both on the developer and the consumer side.

    We don’t get there without the short run “stupid expensive for minimal gain” phase. The good news is that I was able to avoid bump/mip/xyzzy mapping until the cards that supported it were dirt cheap, but now I get those features for virtually no cost, all subsidized on the backs of the early adopters and Moore’s Law.

    On the production side of things, I have to downsample textures far less for 3D work than I used to, which means that the production process is actually faster because I can waste cycles, memory and triangles in ways that would have had me hunting down every stray node in the past.

    Not that it has completely nullified cost increases, but I for one love overpowered graphics cards because it means that I can do fewer optimizations and rely on the card to drag a few “errors” along for the ride without noticeable performance loss.

    I have absolutely no intention to use or care about this advance *now*, but I’m glad the crazy people are out there who do so I will benefit from it 5-10 years down the road.

  6. Jaguar says:

    Wow! Radiosity in real time? That’s awesome! As a graphics programming enthusiast, that blows my mind.

    Now that I got that out of my system…

    I could not agree with you more, Shamus. As much as I love computer graphics, I’d rather have a cheaper, faster, good game that runs on my computer (3 or 4 years old now).

    I’d say until computer technology advances to the point that a run-of-the-mill computer can handle radiosity in real time, radiosity belongs in movies, not games.

    Still, I suppose that it’s the individuals and companies that push the envelope that push the technology forward in the first place.

  7. Deoxy says:

    Darin makes a good intro into a point I had rolling around in my head but couldn’t seem to word right…

    Basically, all this advancement of graphics is a good thing… it’s just unfortunate that we are sacrificing the computer gaming industry to get it.

    Computer gaming companies and magazines that blow themselves out pushing for every 0.1% more graphics are indeed succeeding in making better graphics capabilities, and this new and/or improved technology can and will be useful for many things. If you don’t care about games, well, this is an unqualified good thing.

    If you care about games, well, it’s more of a “silver lining” sort of thing… :-(

    Edit: and while I was typing, John Lopez goes and submits a comment that makes my own point MUCH better than I did. Go John! heh.

  8. wererogue says:

    The one counterpoint I’d put to your argument is that one of the biggest challenges facing designers and artists that I see in our company is *not* being able to do things that they’d expect. It’s ridiculous to have to build a level, put a light in the ceiling, and then another one in the cupboard because the ceiling light doesn’t light it up properly. Tech like this (and like Havok, shaders, Euphoria, DMM etc. etc.) makes it possible to begin to build an engine in which you put together a world and some scripts, and you have a game that looks how you expect it to. That’s pretty exciting. Designers are far happier to fill in the weight of a box for Havok than to have to set its world gravity or some other ridiculous pseudo-physics property.

  9. Rask says:

    What struck me was the beginning of the demo, which showed a typical FPS hallway under both forms of lighting. The effect is subtle, but I think it would draw me into the game a bit more, as it’s a little more realistic.

    (Your RSS teaser that brought me here had me thinking, “oh, great, someone has rediscovered ambient light values,” but now I see what the technology is about, and I think it’s a good thing.)

  10. Snook says:

    I’ve been seeing many, many articles about the “demise” of PC gaming. Guess how many list “hardware too expensive” as one of their reasons? None. All list a lack of innovation or declining sales or somesuch.

    With a computer that’s 5 years old, I can’t really play any of the new games at all. It bothers me, but I can’t afford an upgrade. However, some games (indie, mostly) I can run just fine thanks to their not relying on graphics, and they have great gameplay to boot! Coincidence? I think not. For example, see Mount & Blade. They know many of their gamers don’t have powerhouse PCs, and they also take hints from their players (the game is essentially in active beta) to improve the gameplay where their customers want it improved, and to remove or add things as customers demand, in each new version.

    I want to see more games like Mount & Blade, not necessarily the same type but with the same form of development. Less focus on “oooh shiny” and more focus on “is this fun?”

  11. krellen says:

    I remember when I stopped trusting X-Play. It was when they gave a game (I can’t recall exactly which) a 4 out of 5 because the graphics weren’t the best. The graphics were the only criticism they had of the whole game, and we’re not talking sprite-graphics here. More like Half-Life instead of Half-Life 2.

    Stuff like that is just inexcusable. Yes, good graphics are nifty, but they should be gravy, not the main course.

    (I suppose, of course, that they are, in fact, gravy. The problem is really all the dry games that are produced that require gravy to be palatable.)

  12. Yehoshua says:

    This is a good thing – it’s one more step to take us out of the Uncanny Valley.

  13. Phlux says:

    Wererogue makes a fair point. If this technology succeeds in DECREASING complexity of level design by reducing the number of lightsources and whatnot, then it’s probably a good thing. Anything that succeeds in lowering the cost of development is good for all of us.

  14. straechav says:

    Yep, I agree with those who give counterpoints to your argument, Shamus. You have a point there, but it’s not fully thought out. Physics engines did, and this radiosity lighting probably will, simplify level design and make it much easier (not to mention the extra puzzles physics gave to games). I can completely sympathize with level designers who have to place a dozen static lights everywhere to create anything like the effect of natural lighting that could be created with one radiosity light.

    Once this is universally supported, I expect to see some pretty damn awesome lighting and the “standard” lighting used now will probably take much less time to place & design.

  15. Rend says:

    I agree with John Lopez for the most part on this one. New technology always comes out ridiculously expensive, and then gets much cheaper as time goes on.

    I think the problem we really have here is that PC game companies insist on _using_ the newest, most ridiculously expensive stuff, instead of just a few who want to really push the edge while most stay in a comfortable “I can buy your games because my rig can be assembled without zero gravity” zone.

  16. Vegedus says:

    But graphics *are* what the consumers want. I obviously don’t have any raw stats here, but there is a majority that mostly bases their purchases on graphics.

    Look at the console war. There are 3 common arguments for the Wii sucking: It doesn’t have many good games/any mature games, the Wiimote is a stupid gimmick, and its games don’t look good. There are a large number who would buy a PS3 because it’s shinier!

    I was browsing around EB Games and overheard a conversation between some kid and his dad: “I think I would rather have a PS3 than an Xbox 360. Its graphics are soooo pretty.”

  17. krellen says:

    Graphics are what console gamers want. Let the consoles have the “pretty” market; PCs are far more marketable with the low-graphics fun games anyway. PopCap has proven this already.

    (Not to say all games should be “casual” games, just that PCs are far easier to sell for if bleeding-edge graphics isn’t your only selling point.)

  18. Locri says:

    I think what’s far more interesting than that is the new work coming out of real-time raytracing, which has all these amazing things built right in because, well, that’s the nature of raytracing. Computers are getting advanced enough that it would be reasonable to have a realtime raytracing engine (although it would likely start out looking worse than a traditional engine). Also, there is a researcher in Wisconsin who demonstrated that raytracing can take better advantage of multiple CPU cores than traditional rendering.

    That’s the future… no special engines to buy, just add the lights and tell the game what textures objects are and the raytracing handles the rest.
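
    (To make the “it just falls out” point concrete, here’s a toy recursive raytracer against a single sphere. This is nobody’s real engine code, just a sketch: the shadow is nothing more than an occlusion ray, and the reflection is nothing more than a recursive call.)

        // Toy raytracer: one sphere, one point light, brightness only.
        #include <cmath>

        struct Vec3 {
            float x, y, z;
            Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
            Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
            Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
        };
        static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
        static Vec3 normalize(const Vec3& v) { return v * (1.0f / std::sqrt(dot(v, v))); }

        struct Sphere { Vec3 center; float radius; };

        // Distance along a normalized ray to the sphere, or -1 on a miss.
        static float hit(const Sphere& s, const Vec3& origin, const Vec3& dir) {
            Vec3 oc = origin - s.center;
            float b = dot(oc, dir);
            float disc = b * b - (dot(oc, oc) - s.radius * s.radius);
            if (disc < 0.0f) return -1.0f;
            float t = -b - std::sqrt(disc);
            return (t > 1e-3f) ? t : -1.0f; // epsilon avoids self-hits
        }

        static float trace(const Sphere& s, const Vec3& origin, const Vec3& dir,
                           const Vec3& lightPos, int depth) {
            float t = hit(s, origin, dir);
            if (t < 0.0f || depth <= 0) return 0.0f;   // background: black

            Vec3 p = origin + dir * t;                 // hit point
            Vec3 n = normalize(p - s.center);          // surface normal

            // Shadows for free: fire a ray at the light; a hit means occlusion.
            Vec3 toLight = normalize(lightPos - p);
            float diffuse = (hit(s, p, toLight) < 0.0f)
                                ? std::fmax(0.0f, dot(n, toLight)) : 0.0f;

            // Reflections for free: recurse along the mirrored direction.
            Vec3 refl = dir - n * (2.0f * dot(dir, n));
            return 0.8f * diffuse + 0.2f * trace(s, p, refl, lightPos, depth - 1);
        }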

  19. Deoxy says:

    There are 3 common arguments for the Wii sucking: It doesn’t have many good games/any mature games, the Wiimote is a stupid gimmick, and its games don’t look good.

    Hm, and which console is kicking the other 2 in the soft parts on a daily basis? Oh, yeah… the Wii.

    Graphics are what SOME gamers want; unfortunately for the rest of us, graphics are the easiest to measure, which makes them the easiest to develop, as you can easily measure your progress and easily measure the responses of a certain segment of your customers. It’s also easiest to review, as it’s less subjective than “game play”, etc. AND it was the area that made the most progress the most easily for quite some time, which gave graphics improvement a feeling of “most bang for the buck” (because for a time, it was).

    But clearly, graphics aren’t remotely the only thing, nor even the biggest thing, or the Wii wouldn’t be clobbering the Xbox 360, much less the PS3.

  20. Cadamar says:

    Real-time Radiosity!!! The math geek in me just fainted…

    I think that there are enough similarities between the movie industry and the game industry in order to draw analogies, so…
    Consider that every time a new tech comes out for film making there is a corresponding slew of crappy movies that exist simply to highlight the tech. Eventually people calm down about it and the good film-makers start thinking about how to incorporate it subtly into their films in order to enhance the story. Same with games. Sometimes this takes a very long time. 3D movies have been around since the ’50s, but they still aren’t used well. It’s very rare for early adopters to figure out how to utilize a new tech and still tell a good story. For every Tron you also get a dozen Lawnmower Men…
    Give it time. Every new technical achievement is another tool added to the storyteller’s toolbox.

    Besides, take the statement “X? So what?” and replace “X” with the following:
    Real-time Radiosity
    Real-time shadows
    Specular Mapping
    Texture Mapping
    Polygonal Graphics
    32-bit color
    16-bit color
    Sound
    4-bit color
    And then where are we? Not much of a story to tell with Pong…

  21. Davesnot says:

    I don’t have a console.. why?? well.. I have plenty of games to play.. on machines that run them well.. and now I even have extra machines to link up some of those old games.. NASCAR Racing 3 with the kids.. great! Army Men.. Mech Commander.. all were great games.. still are..

    I never threw out my old Avalon Hill games.. or even Monopoly.. or Stratego.. Did I buy a new version of Stratego because moulding of plastic got better?? No..

    My point is.. the game is the game.. I have plenty of things to do with $60, $40 or even $20 .. I don’t need the new story-line.. I have NWN and NWN2 for new stories..

    I guess I should mention that I’ve also been known to “play” golf on a driving range.. taking a score card with a map with me.. then just pretending how the shots go..

    Sure.. I miss out on some eye candy.. but I also miss out on the frustration.. joy of playing is joy of playing..

    Having to get new hardware to feel joy in gaming is like.. well buying something sold by spammers to increase the girth of one’s Johnson… I’m comfortable having fun .. I don’t need an industry to make me insecure about my fun.. I know it’s fun.. period.

    That said… I’ll gladly buy these fancy games in a few years.. when I’ve bought a “new” computer that is still “old” on the tech curve.. and ya know what.. I’ll like the old games then too.. I live close enough to Hollywood to see through the hype.. fun is fun.. missing mortgage payments is not.. I’ll gladly game within my limits.

  22. Elethiomel says:

    I’d like to add my voice to Phlux and Wererogue. Radiosity will make level design cheaper once it’s become “standard”, not more expensive. Static-lighting a level to make it look “natural” is a major chore along the lines of putting together the prefab parts to make the level geometry itself (assuming you use prefabs like Oblivion or Morrowind).

  23. Randolph says:

    I think a large portion of the problem is that graphical realism and game design have not been implemented in a really complementary fashion. That is, the designers haven’t really sat down to think about how improvements in the rendition of an environment can be utilized specifically for purposes of gameplay. Consider games such as the Splinter Cell series in which, actually, the lighting is a significant portion of the game. The human eye frequently, unconsciously, interprets the manner in which light reflects off of objects to understand the contents of an environment. There are instances in that demo in which, for example, the presence of a bright red object affected the tint of the cast shadow via reflected light. This sort of thing happens all the time in real life, when an object entering a space changes the character of the light in that space, and we often use this information to assess our surroundings. And it’s not as if game design can’t make use of this information.

    I think the frustration stated in this post is misplaced, and in fact, the problem is already being addressed by the industry itself as it moves to correct said problem. Consider these two articles: the first, which outlines the problem, and the second, which goes to show that some of our deepest fears are being directly confronted by the industry (of course, the second article may need to be taken with a grain of salt). It’s not really that game design is being sacrificed for the sake of graphics, but that the industry is still immature when it comes to dealing with the kinds of economic and financial demands of the large-scale operation that the gaming industry has become, and is slowly beginning to come to terms with what it takes to maintain sales while not stifling creativity.

    Obviously this is just my opinion, but I believe what we’re experiencing right now is really just growing pains, and all the doom and gloom about how PC gaming is dead is not really taking into consideration the larger picture. The industry is as large as it is today because of the influx of new gamers that have been created by the console market; it’s really not that people are giving up PC gaming in favor of consoles.

    And while I completely sympathize with the frustration of having to constantly upgrade hardware to be able to play the latest games, when has this *not* been the case for PC gaming?

  24. Randolph says:

    Also, perhaps the size of the console industry with respect to the PC game industry, and their growth over the last decade or so, should be considered. Has the PC market shrunk? Or is it just that the console market is growing faster than the PC market? And hasn’t it always been the case that PC games have been considered somewhat esoteric and difficult to break into compared to console games? I don’t really want to label console gaming as “casual” in comparison to PC gaming, but I believe that PC gaming has a reputation for being obtusely complex and not particularly friendly to newcomers, particularly in comparison to console games, which don’t require setup or fiddling with options to get started, and only require that the player learn which buttons do what.

    Moreover, the lack of universality in hardware has been one of the reasons why developing for PCs is more difficult than developing for consoles. A developer knows they will get the same result every time on an Xbox, but this is not always the case with PCs, since hardware will change from box to box. So instead of the claim that PC gaming is in disarray, I would say that it’s more accurate to claim that it’s easier and more lucrative to develop for consoles *in comparison to* PCs (hence, developers are going for console gaming).

    So again, I believe the problem is that PC game publishers are still struggling to find a means of competing (for talent, resources, and sales) against the more accessible and predictably reliable (in terms of stability, piracy, etc.) nature of console gaming, instead of understanding that perhaps these are two very separate markets which should not be competing against each other. It hasn’t been as apparent so far to large production companies, whose chairs don’t necessarily have the background in gaming that would be desirable, that the solution is simply solid, creative game design (obvious to us, isn’t it?).

    Long winded statement made short: the bad times will pass.

  25. It’s worth pointing out that one of the highest-rated and best-selling games of the month, Sins of a Solar Empire, was originally designed and conceived to run on 4-year-old PCs and scale graphically, neatly, up to modern high-end graphics-prettified 8800+ monster-boxen. And it does a fine, fine job of it.

  26. Daosus says:

    Real-time radiosity is very spiffy. And it is true that in a few years it’ll be very cheap to do. That’s not the problem. The problem is that by then, there will be a new technology that eats up a huge chunk of development time. The tech in this article isn’t an example of research that didn’t need to happen. It’s an example of how the game industry’s focus is on advanced technology instead of making good games.

    Economists talk about marginal costs and benefits. This basically means that for any action, there’s a certain additional cost and additional benefit above and beyond what you’re paying now. If your marginal cost is more than your marginal benefit, you shouldn’t be doing that action. For this radiosity example, it will take many man hours of effort, maybe 10% of total development time, to build this into a game using this new engine. It will take even longer if you have to start from scratch. But the benefit is a tiny improvement in graphics.

    The point isn’t that this isn’t an awesome improvement in real time rendering. The point is that the time spent developing this could have been spent developing a better game.

    As an aside, I think consoles are a great place to build games that push the envelope on graphics because you have a few years to learn to utilize that hardware. Doing so on a computer leads to customers being unable to play your game unless they upgrade every six months.

    The computer has many advantages over the consoles. Things like a good, well developed interface. Keyboards. Relatively easy web connectivity. But, right now, computer games are not really using any of those. Trying to compete against consoles on their own home turf leads to losing.

  27. Shamus says:

    And while I completely sympathize with the frustration of having to constantly upgrade hardware to be able to play the latest games, when has this *not* been the case for PC gaming?

    For me? For most of the 90’s and a little bit into the new century. I made sure my computer never got older than four years, and I was able to run just about anything off the shelf.

    I still do that, but now I also have to drop a hundred dollars a year (either $400 every 4 years or $100 once a year, take your pick) to keep the graphics hardware up, and I STILL see a lot of titles I can’t run. Of those I can, a lot of them will run poorly. (See my posts on Oblivion. That was not a game where you wanted to be near the low end of the system specs.) And there are fewer games in the stores.

    Alexander made an excellent point above: SoaSE runs on “low-end” hardware, and the thing is making a killing. The point I’m making is that if developers would just back away from the bleeding edge, they could enjoy higher sales, and PC gamers could enjoy more games.

    The consoles will keep pushing the envelope for us. It would be much better for PC developers to ride their coattails instead of killing themselves trying to keep up.

  28. Mordaedil says:

    You know Shamus, I do agree with you here.

    I’m at the point where I can barely tell what looks better among new games. They all have that fancy HDR and normal mapping, so I won’t notice the stuff apparently everyone else sees, and I’m frickin’ educated IN 3D modeling!

    It will be impressive to see this in use in 6 years, when the load it puts on our computers will be insignificant to performance. I am all for quality, but I think games are trying too hard to stay ahead of the technology.

    I mean, I want progress and all that, but it’s not really necessary to use every little thing to just make your game look a little better at the expense of alienating 30% of the market.

    Slow down and let the scientists develop 4D already. *Then* make games that support 3D.

  29. Randolph says:

    “For me? For most of the 90's and a little bit into the new century. I made sure my computer never got older than four years, and I was able to run just about anything off the shelf.”

    Hmm… perhaps I don’t recall things that way because my hardware was always lagging (we weren’t exactly financially secure enough to spend money on “frivolities” like computers… I guess I owned maybe 2 computers throughout the stretch of the 90’s), so you’re probably right on that point. And actually, I pretty much said the same thing as what you state in your reply. It’s just that I think developers have already realized this (within the last year or so) and are either going console altogether or re-thinking what they need to do.

  30. krellen says:

    Daosus: The computer has many advantages over the consoles. Things like a good, well developed interface. Keyboards. Relatively easy web connectivity. But, right now, computer games are not really using any of those. Trying to compete against consoles on their own home turf leads to losing.

    This is all demonstrated by (and I shudder to say this) the immense success of MMO gaming on PCs. MMOs utilise the advantages of the PC – the more robust interface that is a mouse and keyboard, the ubiquitous nature of internet connectivity – and make a killing for it. These games simply wouldn’t work on consoles, and they clearly show that there are, in fact, things the PC is better for.

  31. Lanthanide says:

    I didn’t bother reading the other comments, but I just wanted to add this.

    This is *exactly* what was missing from Doom 3. They went on and on about their cool lighting system, but when you were playing the game, it always seemed a bit odd. Especially because most surfaces were supposed to be metal, and therefore realistically should be reflecting quite a bit of light, but they never did. This was particularly jarring with your flashlight (a moving light source), as you should have been able to see the halo of light reflected around you on the walls, but you never did.

  32. Blake says:

    Besides, take the statement “X? So what?” and replace “X” with the following:
    Real-time Radiosity
    Real-time shadows
    Specular Mapping
    Texture Mapping
    Polygonal Graphics
    32-bit color
    16-bit color
    Sound
    4-bit color
    And then where are we? Not much of a story to tell with Pong…

    Yeah, it’s not like there was any story to Zork, and it was using 1-bit non-graphics. :)

  33. Bryan says:

    It seems to me that you have a general misunderstanding of what is meant by the “demise of PC gaming,” which, in all honesty, you seem to be a proponent of, Shamus. The general meaning is PC gaming dying against consoles in the big realms of things like FPSes. In fact, in that very article, Cliffy B claims “what’s driving the PC right now is ‘Sims’-type games and ‘WoW’ and a lot of stuff that’s in a web-based interface. You just click on it and play it. That’s the direction PC is evolving into” – graphics are in no way what is destroying the PC gaming industry, and expensive graphics cards are in no way a new thing. The old PC games are simply better served by consoles now. I don’t know that I like that, but it’s a reasonable development direction, and PC games are heading in these other directions.

  34. Phlux says:

    Again, Deoxy makes another good point about the relative ease of measuring graphics vs gameplay.

    I think it goes a little deeper than that, though. Reviews often play a role in the actual compensation of developers. If your game gets X average score you get Y bonus dollars.

    Sales are king, but publishers are obsessed with review scores.

    So if good graphics = good reviews and good reviews = good money, then it’s no wonder why developers focus on them so much.

    It’s obviously not like this everywhere, but it’s not a model without precedent either.

  35. Nick Pitino says:

    I can see exactly where you’re coming from here, Shamus. There are at least eight games I can think of right off the top of my head that I would like to play if it weren’t for the fact they would make my built-in-2001 computer CATCH ON FIRE.

    I was extremely disappointed, to say the least, when I downloaded Half-Life 2: Episode 2 just to be told by Steam (AFTER, not BEFORE, letting it spend eight hours downloading; you’d think it would tell you ahead of time…) that I couldn’t even come close to running it due to my CPU not having the ‘SSE Instruction Set’ or some such.

    While I AM saving up for a new computer, it still frustrates me immensely to go and pay $20 for a game and have it tell me ‘No, your computer is sucky, go play Zork or something newbie, you have lost your membership card to the PC gaming world.’

    …And knowing my luck something will die in said new computer in like a year because of tin-whiskers growing off of the damn ROHS solder paste joints they put in everything these days. I can’t blame the game developers for that though, so I guess I’ll blame society like I do for everything else.

    :(

  36. Alan De Smet says:

    I obviously don’t have any raw stats here, but there is a majority that mostly bases their purchases on graphics.

    Look at the console war. There are 3 common arguments for the Wii sucking: … its games don’t look good. There are a large number who would buy a PS3 because it’s shinier!

    Which is why, of course, the Wii is outselling the PS3 2 to 1, and outselling the Xbox 360 by a more modest amount.

    Sure, there are lots of people who like more realistic graphics. Among hardcore gamers (a group that includes me), the majority may. But Nintendo is handily proving that you don’t need to pander to the hardcore gamers.

  37. chuko says:

    It seems to me this is exactly the way to do difficult-but-to-the-player-subtle improvements in games – that is, provided by a third party developer who takes on the work that wouldn’t be worth it to a regular gaming house.

  38. Daemian_Lucifer says:

    Look at the bright side: soon they will reach the point where games look like real life, and every improvement from there will only be in making it easier to process and develop.

  39. Rebecca says:

    I’m just excited that graphics have gotten to the point where the great technological advances are really subtle things, polishing really. It means we’ve come really far!

  40. Scott says:

    Daemian_Lucifer: Ahh, but we will never reach the point where video games look like real life.

    I remember playing Half-Life and marveling at how much like real life it looked! By 2018, we will be looking back at games like Crysis saying, “I used to think those graphics looked good!”

    Oh, and about the radiosity thing?
    It’s not so much about easing the burden on lighting for level design. It’s entirely possible to set up VERY realistic-looking lighting in a static space. It’s when you put that light on a moving object that it gets tough.

    I took a tour through a cave once. There weren’t many people with us so it was pretty cool. The tour guide was the only one with a flashlight. The way the cave formations cast shadows was breathtaking and somewhat terrifying and the fantastic colors reflecting off of the rocks were a sight I won’t forget.
    If I were to walk through that same cave with the flashlight from Half-Life 2, I wouldn’t even stop to take a breath. It would be just another cave.

    Having dynamic radiosity would significantly improve the mood of many games, allowing characters to explore areas that are not lit by any static lighting! Instead of a small circle of light cast by a flashlight, you can see the whole room simply from reflected light, which becomes pitch black as soon as the flashlight is turned off.

    Maybe I’m just crazy, but this is exciting to me!

  41. Ian says:

    I agree with Scott. Advancements like this can make games a lot more immersive and truly improve the atmosphere (and this is coming from someone who still plays and loves ZZT and Kroz, so I’m definitely not a complete graphics whore).

    Gears of War is one extremely fun game I can think of off the top of my head where the effects of a modern graphics processor can actually make an excellent game even better. Just the overall gritty look and the use of post-processing effects greatly add to the overall atmosphere, and the splattering of blood you see when you use the chainsaw on an opponent is just jaw-dropping.

    Of course, relying on graphics to boost a mediocre game isn’t a good idea, either, nor is alienating people with systems older than a year.

    It’s probably going to be like pixel shaders and just about every other advancement — either rarely used, or optional, at the beginning and winding up being required after about five years.

  42. ngthagg says:

    I’m not sure the Wii vs. PS3 or 360 is a really good argument. If you are saying that people want gameplay over gimmicks, the Wii may not be the best system to demonstrate that.

    I’m with Shamus on this whole issue. These are great graphics, and I’m glad that there are companies that are working to bring them to the mass market, but I really wish the people making the software would focus more on content and less on gloss. Or maybe that the people buying software would focus more on content and less on gloss.

  43. Shalkis says:

    Er.. Quake 2, released in 1997, had radiosity. Doom 3, released in 2004, had dynamic lighting. So in that sense, the video clip had nothing new. However, to address the game engine vs. gameplay debate.. there are instances where game engine improvements have resulted in more creative gameplay. For example, Quake 3 had the portal-that-you-can-see-through element, but it took Portal to turn it into a crucial game mechanic. As for dynamic lighting.. that too can be used as a gameplay mechanic. If you don’t have dynamic lighting, you can’t have monsters that dislike bright lights and flee from them, or NPCs that react realistically when someone shoots out the lamps.. (there’s a quick sketch of that below)

    Think of game engines vs gameplay like physics vs engineering. Discoveries in the former allow you to make cool stuff that you couldn’t do before with the latter.
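
    (A quick sketch of that lamp idea, to be concrete. Everything here is made up for illustration; lightLevelAt() stands in for whatever light query a real engine would expose.)

        // Dynamic lighting as a gameplay mechanic: each tick the monster
        // samples the light level at its position and flees when it's too
        // bright. With baked lightmaps this number could never change, so
        // shooting out a lamp couldn't affect the AI.
        #include <vector>

        struct Vec3 { float x, y, z; };

        struct Lamp {
            Vec3 pos;
            float intensity;
            bool intact; // false once the player shoots it out
        };

        // Inverse-square falloff from every lamp still intact.
        // (A real engine would also test line-of-sight.)
        float lightLevelAt(const Vec3& p, const std::vector<Lamp>& lamps) {
            float total = 0.0f;
            for (const Lamp& l : lamps) {
                if (!l.intact) continue;
                float dx = p.x - l.pos.x, dy = p.y - l.pos.y, dz = p.z - l.pos.z;
                total += l.intensity / (dx*dx + dy*dy + dz*dz + 1e-3f);
            }
            return total;
        }

        void updateMonster(const Vec3& monsterPos, const std::vector<Lamp>& lamps) {
            const float fleeThreshold = 0.5f; // made-up tuning value
            if (lightLevelAt(monsterPos, lamps) > fleeThreshold) {
                // head for the darkest reachable spot (pathfinding omitted)
            }
        }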

  44. Bizarre says:

    I don’t think there’s anything necessarily *wrong* with wanting a focus on graphics for a game, if that’s what you’re setting out to make.

    Crysis was built to be pretty. That’s pretty apparent. It’s a thin story over solid but standard gameplay, but it’s pretty. Very, very pretty.

    The problem comes, I think, when people start to think that great graphics make anything but a pretty game.

  45. Ian says:

    Shalkis: There is a key difference between Quake 2’s radiosity implementation and this one. All of Quake 2’s lighting, barring the simple dynamic lights that are cast from weapons fire, is calculated during the map building process. Enlighten’s radiosity, on the other hand, is dynamically generated.

    While the concept of using radiosity in games is far from a new one, using radiosity for dynamic light is.
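
    (The difference in miniature, as a sketch rather than Quake 2’s actual source: the map compiler solves the bounces once, offline, and the game just reads the answer back. solveRadiosityAt() is a hypothetical stand-in for the expensive offline solver.)

        #include <vector>

        struct Color { float r, g, b; };

        // Hypothetical stand-in for the slow offline bounce solver;
        // returns flat gray here just so the sketch runs.
        Color solveRadiosityAt(int u, int v) {
            (void)u; (void)v;
            return Color{0.5f, 0.5f, 0.5f};
        }

        struct Lightmap {
            int width = 0, height = 0;
            std::vector<Color> texels; // baked incoming light per surface point

            // Offline step, run by the map compiler: take as long as needed,
            // then freeze the result into the map file.
            void bake() {
                texels.resize(static_cast<size_t>(width) * height);
                for (int v = 0; v < height; ++v)
                    for (int u = 0; u < width; ++u)
                        texels[v * width + u] = solveRadiosityAt(u, v);
            }

            // Runtime "lighting" is a plain table lookup: nearly free, but it
            // can't respond to a light that moves or dies after the bake.
            Color sample(int u, int v) const { return texels[v * width + u]; }
        };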

  46. Mephane says:

    I am still awaiting the day when we can have real-time raytracing and can get rid of all that video card uber-crap, and instead have a certain percentage of our many CPU cores just do the raytracing. THAT is the way to go into the future. All the new fancy features for video cards are nothing but little steps trying to overcome the totally limited rendering models used there. *sigh*

  47. guy says:

    I have a game I had to buy a GeForce 7300 to play, and if I were to put it next to a 3-year-old game, I would have trouble seeing the difference. But then, I can’t tell the difference between Blu-ray and DVD when they’re in split screen and an object is split by the dividing line, so don’t pay too much attention to me.

  48. Deoxy says:

    “I'm not sure the Wii vs. PS3 or 360 is a really good argument. If you are saying that people want gameplay over gimmicks, the Wii may not be the best system to demonstrate that.”

    Um, what else would that be demonstrating? Seriously, the “old” system with “bad” graphics (relatively speaking) is kicking the other two systems’ rear-ends all around the block.

    What else could that mean?

  49. Deoxy, I think what he was suggesting is that the Wii, rather than emphasizing gameplay over gimmicks, simply emphasizes a different gimmick. So it’s a case of gimmicky controllers over gimmicky graphics. The Wii arguably is succeeding because the Wii controllers are a better, fresher gimmick than incremental improvements in an already fairly mature feature (graphics).

  50. Burning says:

    Given that the Wii was brought up as evidence that not all gamers insist on uber-leet graphics, I don’t know that the question of whether the Wii remote is a gimmick or not is particularly relevant to the discussion up to this point. I don’t know enough to address the question of whether the graphics monsters are losing sales to the Wii or if the Wii is just pulling in new people. The fact remains that there are a lot of people who seem to be willing to settle for “good enough” in their graphics. I think it’s just possible that even if they’re all new people to gaming, a lot of more hardcore-style games could be sold to these people… if they didn’t have to break the bank upgrading their PCs every six months.

  51. Here’s the thing: I think you’ve fetishized a legitimate complaint to an extreme where it’s no longer valid.

    I think it’s absolutely true that the PC gaming market has fetishized the cutting edge to such a degree that mainstream customers are being pushed out (or have already been pushed out). I think it’s absolutely true that this is madness, considering that one of the number one advantages of PC gaming is the ubiquity of the platform.

    But, OTOH, concluding that all progress should therefore come to a stop doesn’t make any sense to me.

    I think that if a game is released today, then it should be playable (and not just technically “playable”, but enjoyable) on technology at least 3-5 years old. But 5 years from now, there’s no reason the PC gaming market can’t be aiming for the technology that’s cutting edge today.

    I also think that it’s in the publisher’s best interest to make sure that game can take advantage of the cutting edge technology today.

    Why? Because PC gaming also has an advantage over the consoles in that it CAN be on the cutting edge at a pace that the consoles can’t compete with. The PC gaming market shouldn’t be abandoning that advantage any more than it should be abandoning the advantage of its ubiquity.

    To sum up: Swinging from one extreme to the other doesn’t solve anything.

    Or to put it another way: When someone says that they wish modern games would run on their 7-year-old machine (like Nick Pitino did), I’m just left scratching my head. Let’s put that in context for the old-school grognards here: If you bought a game for your 386 in 1992, would you really have been complaining that it wouldn’t run on your 1985 IBM XT because those new-fangled, technology-obsessed developers insisted on using VGA graphics and you were still stuck with your CGA monitor?

  52. Shamus says:

    Justin: Like I said, I don’t have anything against nice graphics. I’m certainly not advocating a freeze on graphics. My main point is that most developers haven’t even mastered the nuances of existing technologies, so the last thing they need is yet another graphics trick to be implemented half-assed. The other problem is that right now we’re past the point of diminishing returns with this stuff.

    What I think PC developers should do is make an engine, use it to make several games, and maybe (say) every four years or so take a hop forward. In four years some of this cutting-edge stuff will be low-hanging fruit. The existing stuff will be more polished, better implemented, better documented, more optimized, and with better tools available. It’s easier to build up if you have a solid foundation.

    Re-inventing the wheel every time they make a game is killing them.

  53. Nick Pitino says:

    Justin:

    I understand entirely that I shouldn’t be surprised that my ancient computer can’t run the latest and greatest. Mostly I was just venting that Steam wasn’t nice enough to tell me until AFTER I had wasted my time and money.

    But that’s a topic for another day.

  54. Deoxy says:

    If you bought a game for your 386 in 1992, would you really have been complaining that it wouldn’t run on your 1985 IBM XT because those new-fangled, technology-obsessed developers insisted on using VGA graphics and you were still stuck with your CGA monitor?

    The CGA setting was available in many games for many years, if you bothered to check the options – your example is basically making OUR case, not yours.

    I understand what you are saying (we do still want progress, after all), but progress is NOT the problem right now… all we’re getting is graphical progress of one form or another. We’ve got such a glut of graphical progress that stopping there for a time to spend effort on other things would be a very reasonable request.

    Your comment seems to me like this: If I were drowning in a pool, and someone came and added more water to the pool, I would complain, and your comment would be, “Hey, you need water to live, you know – imagine if you had no water!”

    Not the problem we’re having right now, you know?

  55. Ian Davis says:

    “What I think PC developers should do is make an engine, use it to make several games, and maybe (say) every four years or so take a hop forward. In four years some of this cutting-edge stuff will be low-hanging fruit. The existing stuff will be more polished, better implemented, better documented, more optimized, and with better tools available. It’s easier to build up if you have a solid foundation.”

    It’s a good argument, but I think that over four years a company is going to want more from the programmers’ salaries than “I optimised the renderer again”.

    A lot of companies now are paying designers, level designers, AI designers, concept artists, asset artists, animators and musicians, who, given the engine they already learned how to use for the last game, could quite possibly make a new and better game.

    However, they also still have all the guys who built the engine – the technical designers, the graphics programmers, the AI programmers, the animation programmers, even physics programmers nowadays, and most companies aren’t just going to cut them loose because they don’t have anything to do. So why not get them working on upgrading the engine to use the latest tech, and utilise the more powerful algorithms that it supports?

    What would be nice would be if the engine team could build the engine for the third game while the rest of the studio builds the second using the engine from the first. That way, after the second game is released, there’s always a one-game buffer to get the engine right, and each game is built with a stable, unchanging set of tools and engine. Again, a nice idea, but when designers know that a feature is available in the new engine, it can quickly become an absolute requirement for the game they’re working on, and then the game gets pushed onto the new engine – back to square one.

    However, what if the focus on tech isn’t actually what detracts from the quality of games? Some of it certainly comes from a lack of respect for quality that pervades among game producers – the game has to be done now, has to be pretty, and that’s about all that matters. We’ve seen it again and again – Prince of Persia, Assassin’s Creed, Overlord, Fable, right back to Soul Reaver… all fantastic games built on great concepts that kind of fall down toward the end because of being rushed out. I think that tendency is hurting PC and console games more than new hardware is.

    Of course, it would be great if old hardware was still supported :/

  56. guy says:

    WHAT WOULD BE NICE IS IF THE DESIGNERS ADMITTED THAT FURTHER GRAPHICS IMPROVEMENTS ARE OF NO REAL VALUE! We have hit the point where normal humans cannot tell the difference between generations of graphics, except that they take more computer. It makes level designers’ jobs easier, unless they somehow screw up the texturing, which means they have to do that all over again. Personally, I think Crysis is as good as at all possible, and it is still a massive waste of CPU and graphics cards. Frankly, I think that graphics engine designers should all get jobs making new stuff for CGI movies, as that is somewhere it matters, although unless critics have glasses of detect CGI, I think it is just people looking for stuff to critique and making it up. Shamus has already made a point about unfinished games, and suggested doing the start and end gameplay before the middle, so that you can just chop stuff out if you run out of time.

  57. Shamus wrote: What I think PC developers should do is make an engine, use it to make several games, and maybe (say) every four years or so take a hop forward.

    Sure. But isn’t that basically the way the industry is working?

    Look at Valve, for example. Several years ago they developed the Source engine. They’ll probably be retiring that engine completely within the next year or two, but by the time they’re done they’ll have released close to a dozen games under it.

    The various Unreal and Quake/Doom engines from Epic and id are widely licensed.

    Right now the Crysis engine is ridiculously uber. I’m actually in the process of upgrading my PC (from 1GB to 2GB of RAM, and from a 6-year-old graphics card to a 2-year-old graphics card) because I’ve reached the point where I can no longer play many of the newer games I want to play (Tabula Rasa in particular), and I still won’t be able to play any games using the Crysis engine for probably another 2-3 years (when I go through my next PC upgrade).

    But the reward for pushing those boundaries is that the Crysis engine will probably still be used to make games 6 years from now. I think that the Source engine probably hit a better sweet spot (I was able to play Half-Life 2 when it came out despite being a couple years behind the technology curve, and they’re still getting 4-5 years of development out of it), but it’s not so radically different that I think Crysis is completely unreasonable.

    What’s my point in all this? That when a company takes that next hop, they’re going to hop to the cutting edge. And I think there are many instances where they could (and should) do a better job of being backwards compatible (again, this is something Valve did very well with the Source engine).

    Deoxy wrote: “The CGA setting was available in many games for many years, if you bothered to check the options – your example is basically making OUR case, not yours.”

    Oh, c’mon! 1992 is the year that Wolfenstein 3D, Ultima Underworld, and Alone in the Dark were released. None of those games featured CGA support. They didn’t even offer EGA support.

    CGA was 1981. EGA was 1984. VGA was 1988.

    The idea that major new releases in PC gaming have ever supported 7-year old technology has no basis in reality.

    guy wrote: “WHAT WOULD BE NICE IS IF THE DESIGNERS ADMITTED THAT FURTHER GRAPHICS IMPROVEMENTS ARE OF NO REAL VALUE! We have hit the point where normal humans cannot tell the difference between generations of graphics, except that they take more computer.”

    I guess I’m not a normal human, then, because I find it quite trivial to tell the difference between Quake 3, Half-Life 2, and Crysis. The graphical improvements from one generation to the next are pretty clear.

    Here’s Quake 3.

    Here’s Half-Life 2.

    Here’s Crysis.

    In terms of the quality of the gameplay associated with these graphics… Well, there are good games. And there are bad games. Just like always. And I don’t see any particular relationship between the bad games and the quality of their graphics.

    Recently I’ve been replaying the old Ultima games. And I’ll freely admit that I’ve been lamenting the fact that nobody makes games the way that Origin used to make games: Their slogan (“We Create Worlds”) really meant something. And it’s amazing that the level of interactivity we have in most RPGs today barely manages to compare with the level of interactivity you had with the world in Ultima VII (and that’s only if you’re lucky).

    Of course, let’s be honest here: Nobody else was making them like Origin was back in the early ’90s, either. I’d love to see someone pick up that mantle again, but I’m not going to lie to myself and pretend that graphical excellence is the problem.

    (And, of course, Origin itself was always one of the progressive, cutting-edge, you’ll-need-an-upgrade-to-play-the-new-game companies on the block.)

  58. guy says:

    @Justin

    Wow, over several years and many graphics cards, Crysis looks slightly better than Quake. Honestly, if I hadn’t seen Metrocops in the Half-Life one, I might not have been able to tell it was Half-Life instead of Quake if they were unlabeled. Maybe I’m graphics-impaired. In any case, I don’t think eye candy is the point of a game. Graphics have gotten good enough that you can tell the difference between enemies, and have it be a subtle difference if that is what is wanted. Also, Source has been updated to allow Valve to ambush you as you leave tunnels and are momentarily blinded, and it could be updated further, but that will really not improve the experience much.

  59. There were people who said the “eye candy” of Ultima VI wasn’t the “point of a game”. But the reality, of course, is that the greater graphical quality allowed Origin to support a much richer, more dynamic, and interactive world.

    Of course, it’s also possible to use that graphical potential and do absolutely nothing with it except achieve even more graphical potential. So what? You end up with a bad game that looks prettier than the last bad game. I hate to break it to you, but bad games are going to get made. 90% of everything is crap, and freezing technology in place isn’t going to change that fact.

    You talk about the irrelevance of HDR being added to the Source engine, but this actually had an almost immediate impact in a game like Counter-Strike. (Which you acknowledge, but then immediately denigrate.) Similarly, the ability for Terrorists and Counter-Terrorists to cast shadows allowed people to spot movement from around corners.

    This type of thing not only adds realism, but it also adds depth to the gameplay.

    And if you’re talking about a game that also attempts to communicate a story or make you care about characters, one only needs to look at the difference between the square-faced scientists of Half-Life 1 and the detailed performances that can be evoked from Alyx in Half-Life 2 to see the advantages of improving graphical performance.

    My point is this: Better graphics or physics engines don’t inherently make a better game. Nor are they necessary for a good game to exist. (Look at Introversion’s successes with games like Uplink and Defcon.) But they can be tools for making a better game — a game that couldn’t have been made without those better graphics and physics. Valve, for example, has consistently found ways to take advantage of better technology to improve gameplay and make better games.

    You can have your cake and eat it, too. Origin proved that in the ’80s and early ’90s. Valve is proving it today.

  60. guy says:

    So, because 90% of games are bad, we should keep spending hundreds of dollars so we can play prettier bad games? As for the effect on Counter-Strike, the shadows let you detect movement around corners; there is a game named UT99 that let you do that by using sound. HDR has an effect on gameplay, but is it worth a new graphics card? As for graphics affecting realism, they make the fact that you can’t blow down the door with your rocket launcher stupider. On the ability to care about characters: X-COM managed that in its closing “cutscene.” Warhawk managed it without showing them at all.

  61. guy wrote: “So, because 90% of games are bad, we should keep spending hundreds of dollars so we can play prettier bad games?”

    Obviously I didn’t say that. Please don’t be a troll.

    And here’s a thought for you: Instead of playing the 90% of games that suck, why don’t you try playing the good games? It might make you a happier and more positive person.

    guy wrote: “HDR has an effect on gameplay, but is it worth a new graphics card? As for graphics affecting realism, they make the fact that you can’t blow down the door with your rocket launcher stupider.”

    “You can display text on the screen instead of printing it out? Is that really worth buying a new machine?”

    “CGA instead of text? Is it worth a new graphics card?”

    “EGA instead of CGA? VGA instead of EGA? Iso-metric graphics instead of 2D tiles? Pseudo-3D graphics instead of 2D? Actual 3D graphics instead of pseudo-3D?”

    Obviously you want to draw the line somewhere. Can you (a) tell me precisely where you’d draw that line; and (b) give a rational explanation for why the line should be drawn there?

    guy wrote: “As for graphics affecting realism, they make the fact that you can’t blow down the door with your rocket launcher stupider.”

    I was able to blow up doors using a cannon in Ultima V. Conversely, many text-based Infocom games featured plot-doors despite not having any graphics at all.

    My point? Plot-doors are sloppy design. But they have absolutely nothing to do with the graphics engine.

    Which is actually my overall point in a nutshell: Poor gameplay has nothing to do with higher graphical quality. OTOH, higher graphical quality has often been used to either improve gameplay or make possible gameplay that was previously impossible.

    But if you honestly believe that improved graphics are somehow to blame for poor game design, then I suggest you check out the vibrant and active community of casual and indie games. I’ve been enjoying and playing many titles from Introversion and Amaranthia lately. Rampant Games just released an alpha test of an RPG called Frayed Knights — although that game features 3D graphics, so it might be too visually rich for your peculiar prejudices/tastes.
