PC Gamer: The future of the past

By Shamus Posted Thursday May 11, 2006

Filed under: Nerd Culture 11 comments

I have the March 2001 issue of PC Gamer here. In shuffling around old magazines this one caught my eye. I took a peek because it had the 2000 game-of-the-year awards, but then I noticed something even more interesting: An article on the future of gaming that looks back five years to 1996 and then forward five years to 2006. Some of the predictions are amusing in retrospect.

It’s pretty unfair to pick on articles like this. Nothing looks as dated as yesterday’s future, and articles like this are easy targets for derision. But that’s what makes them fun.

So let’s get started!

Prediction: CPUs will have speeds of around 10GHz.

They give themselves some wiggle-room with this one by saying “in the next five to ten years”. That’s a LOT of wiggle room, and in the world of computers any prediction with that much variance is almost useless.

Prediction: By 2006 we will have real-time PC graphics that exceed the quality we are seeing in movies today.

Toy Story and the Final Fantasy movie are cited, as in: by 2006, PC games will look better than Final Fantasy: The Spirits Within.

You must be joking. Even in 2001, this was clearly preposterous. In fact, in the last five years, the look of those sorts of movies hasn’t seen that much improvement. The newest Final Fantasy movie doesn’t look any better than the one from five years ago.

At any rate, the person who made this prediction clearly didn’t understand the scope of the problem. Twice as much CPU power does not translate into images that are twice as realistic. Not by a long shot. And even if it did: those movies were made by huge render farms, with many, many dedicated computers working together and still producing the footage at rates far below real-time. Sometimes as slow as a frame or so an hour. I don’t care how you run the numbers or look at Moore’s law, there was no way you were getting that much power on the desktop in just five years. I’ll make a counter-prediction and say that given another five years, we still won’t have enough power on the desktop for a single computer to render one of those movies in real time, much less something even better, as the article predicts.
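
To put rough numbers on it (the figures here are my own illustrative assumptions, not from the article): suppose a single film frame takes about an hour on one render node, and real-time means 30 frames per second. A quick back-of-envelope calculation shows how many Moore’s-law doublings that gap represents:

```python
import math

# Assumed figures, for illustration only: ~1 hour per offline film frame,
# and a real-time budget of 1/30 of a second per frame.
seconds_per_film_frame = 3600.0
realtime_budget = 1.0 / 30.0

# How much faster a single desktop would need to be.
speedup_needed = seconds_per_film_frame / realtime_budget  # 108,000x

# Moore's law, loosely stated: performance doubles about every 1.5 years.
doublings = math.log2(speedup_needed)
years = doublings * 1.5
print(f"{speedup_needed:.0f}x faster = {doublings:.1f} doublings = ~{years:.0f} years")
```

Even with generous assumptions, that gap is a couple of decades of doublings, not five years.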

The problem is that each layer of realism takes far more power than the last. In the last five years we’ve only gotten one of the many needed improvements in this area. As of Doom 3, we finally have unified, real-time dynamic lighting. That means you can now have a scene with any number of freely moveable lights that can all cast shadows. This is a big step. Up until now, shadows had to be pre-computed: the level designer needed to run a program to calculate all of the shadows, which would then remain fixed in place. You could move a light, but it was pointless, since the shadows cast by that light wouldn’t move. Now that the new Doom engine can do this, I’m sure other engines will follow.
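
To illustrate the difference (a toy sketch in Python, nothing like real engine code): a baked lightmap evaluates the lighting once, at level-build time, while a dynamic engine re-evaluates it every frame as the light moves.

```python
def lambert(normal, to_light):
    # Simple diffuse lighting term: brightness falls off with the
    # angle between the surface normal and the direction to the light.
    dot = sum(n * l for n, l in zip(normal, to_light))
    return max(0.0, dot)

# Pre-Doom 3 style: the light's position is fixed at level-build time,
# the result is stored in a lightmap and never recomputed.
baked = lambert((0, 0, 1), (0, 0, 1))  # light straight overhead

# Doom 3 style: re-evaluate every frame as the light moves.
def shade_frame(light_dir):
    return lambert((0, 0, 1), light_dir)

print(baked, shade_frame((0.0, 0.6, 0.8)))  # a moved light changes the result
```

The baked version is essentially free at runtime; the dynamic version pays that cost for every light, every surface, every frame, which is why it took until Doom 3 to get it.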

But that is one step of dozens, and it was the easiest one. Some other challenges:

  • Curved reflective surfaces, like a reflective chrome ball. We can make stuff look like chrome in games, but true bending reflective surfaces that can reflect one another in realtime are still a good ways beyond our reach.
  • Widespread use of semi-reflective surfaces. Odds are you are sitting at a desk, and you probably don’t think of it as particularly shiny, but if you look at it from the right angle you’ll see it does reflect the lights in the room. It’s a very blurry and cloudy mirror. Most stuff is. This is really expensive to render, and has only a small impact on the overall look of an object, but if you’re working on realistic worlds you need this. The lack of reflection is one of the things that make PC graphics look fake, like everything is made of dull plastic. You’re not getting anywhere near fixing this in 5 years.
  • Refraction: Notice how distorted things look when you peer through a bottle. Doom 3 and Half-Life 2 both fake this pretty well, but the movies have the real thing, which is far more expensive CPU-wise.
  • Extreme detail: A problem in games that you don’t have in the movies is that the viewer can move the camera around. In a movie, if you plan a shot that is tight in on a penny and then pulls back to reveal the inside of a bank vault, then you can make one perfect, realistic penny and the rest of the scene can be lower detail. In a computer game, the entire scene has to have that same level of detail or it won’t look right. The user might not take a close look at that penny. They might look at the stack of money on the other side of the room, or they might examine the light switch. Or they might glance in the room and leave without a second look, wasting all your hard work and attention to detail.
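
The semi-reflective desk described above is what graphics people model with the Fresnel effect: reflectivity climbs toward a perfect mirror as the viewing angle gets shallow. A minimal sketch using Schlick’s approximation (the 0.04 base reflectivity is a commonly cited illustrative value for dull dielectric surfaces, my assumption, not from the article):

```python
def schlick(cos_theta, f0):
    # Schlick's approximation of Fresnel reflectance:
    # f0 is the reflectivity looking straight at the surface;
    # reflectance rises toward 1.0 as the view angle goes to grazing.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

desk_f0 = 0.04  # assumed head-on reflectivity of a dull, non-metallic desk

print(schlick(1.0, desk_f0))  # looking straight down: barely reflective
print(schlick(0.1, desk_f0))  # near-grazing: the desk acts like a blurry mirror
```

The same dull surface jumps from about 4% reflective head-on to over 60% at a grazing angle, which is why getting this right everywhere in a scene is so expensive.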

Conclusion: This problem is bigger than most people realize. We’re sort of at a point where you need double the processing power to make an image that’s 10% better.

Prediction: Sound will extend beyond the 5.1 surround sound specs to 10.2 and beyond.

Short rebuttal: Bwah ha!

Long version:

Most computers still come with a pair of speakers that have the power and fidelity of the average speakerphone. Some people put money into nice speakers, but this isn’t a technological problem, it’s a practical one: who has the space to properly arrange and connect a dozen speakers? Almost nobody. Where the heck would you put them all? Your apartment would be a deathtrap of tripwires.

Prediction: Genre-specific [input] devices will continue to emerge.

The SideWinder Strategic Commander is cited as an example. Hands up! Who has ever seen or held one of these? Anyone?

Again, this isn’t a technological problem (which could have been overcome by now), it’s a practical one. Even if it were possible to make a game input device that was better than the ol’ keyboard / mouse combo for FPS and RTS games, who wants a half-dozen input devices lying around? Even if they were all wireless, the clutter would be maddening. Lots of people have a gamepad or joystick handy, but usually just one. Who would want a controller just for real-time strategy, another just for FPS, still another just for driving games, another just for flying, and another just for platformers? Oh yeah: don’t forget you still need the original mouse and keyboard on top of all that stuff.

It’s hard enough running the wires we have already.

Prediction: Broadband will make the action experience accessible to the masses.

This one comes from Cliff Bleszinski, and I think he’s right on. For those who got it, it did.

Then someone else suggests that this might not be a good thing, because, “Online games might turn into chatrooms for adolescents.”

Two for two!

Prediction: A bunch of various facts about handhelds, cell phones, and portable games.

This stuff was pretty reasonable. They turned out wrong in a few places, but this was a really tough call to make. The proliferation of handhelds was just getting started in 2001, and it’s always tough to see where something like that might go. They make a few funny predictions, like having Quake III Arena on a cell phone, but even that wasn’t that wild of a guess in 2001. Nobody was sure what was going to happen, which is why we ended up with the taco-phone N-Gage.

In fact, we do have handhelds that can pull off Quake III Arena-level graphics. The Nintendo DS and PSP both look great and can rival the visuals of a PC. They aren’t phones, but they are quite portable. Handheld technology has come a long way – much farther than PC gaming in general – since 2001. Even now I would hesitate to predict what sort of PDA / Camera / Game System / Cell Phone / MP3 Player combos we will see in the next couple of years.

They also make some predictions about handheld wireless online gaming, sort of like EverQuest on a PSP. I imagine there is indeed a market for this, although it presents some interesting challenges. Battery life is the biggest problem I see here, since you are, in effect, playing your PSP and “talking” on the cellphone the entire time you play the game. That is a battery-killer for sure.

This article was fun to read again after all these years. Another thing I note about this issue: 2000 was a killer year for games. The Sims. Deus Ex. No One Lives Forever. Quake III Arena. Diablo II. The Longest Journey. Combat Mission. C&C: Red Alert 2.

That was an incredible year in PC gaming. I currently own or have played almost everything on that list. Some of them (like C&C) have been forgotten, but several of those games are absolute classics. Despite the better graphics of today, I don’t have any games on my radar that excite me the way the games of 2000 did. In fact, I’m currently playing Final Fantasy X for the PlayStation 2, which came out back in 2001. I might pick up Oblivion once it drops in price or I can get it used, but I’m in no hurry. Nothing on the shelves right now has really captured my interest, despite the fact that I have a new computer and a new video card. Maybe I’m just getting old, but I strongly suspect that gaming is suffering from a little stagnation.

UPDATE: Just as I’m posting this, I notice that Steven Den Beste has a must-read post about the difficulty of predicting future technologies and trends.

Also noteworthy: Mark has this post on technology trends and measuring the rate of technological change.

 


11 thoughts on “PC Gamer: The future of the past”

  1. Everyone expected that clockrates would eventually level out, but no one expected it to happen quite as soon as it did. Intel certainly didn’t; the “Netburst” architecture was designed to really hit its stride at about 6 GHz.

    I got this wrong, too. My prediction was that things would level out at about 10 GHz, in about 2010.

    Intel’s new processors, especially the dual cores, are actually a retreat to the previous version of P4 which had fewer pipeline stages. They’re faster than before because Intel has moved to 65 nanometers, but one of the reasons you’re suddenly seeing lots and lots of discussion about dual-core and hyperthreading is because serial computing speed is topping out, and sideways is the only way now for processors to become more powerful.

  2. Shamus says:

I’ve noticed the sideways growth, but I don’t know enough to really have a grasp of where they are headed. For general computing, there is only so much help parallel pipelines can give.

What would the world be like if CPU performance nearly froze for five years or so? That would be amazing. For programmers it would be a return to the old-school days of trying to find tricks to maximize performance. Over the past several years, it’s been pointless to spend six months optimizing your software, because in that time Moore’s law would have solved the problem for you, and you could have written something new.

    But if Moore’s law slows down, we will probably see a lot less software attrition. It will make more economic sense to really polish a product, because you can count on it being on the market for years.

    Games would benefit as well. Instead of throwing away all the old engines and tools every three years, developers could spend time optimizing the tools they have and getting the most out of them. Change might slow to the point where some sort of standards develop. No. That’s crazy talk.

Doesn’t look like it’s on the horizon just yet, but it’s interesting to consider.

  3. Actually, 3D rendering is almost ideal for taking advantage of multiple processors. In fact, in the extreme case you could use a processor per pixel and utilize them fully.

  4. Pixy Misa says:

    CPU performance – in terms of clock speed – pretty much topped out in 2002. The first 1GHz chips arrived in early 2000. By mid 2001 we had 2GHz Pentium 4s; by late 2002 Intel had passed the 3GHz mark – in a 130nm process. The best they can do in a 65nm process is 3.8GHz.

    Fortunately for game developers, Nvidia and ATI haven’t slowed down at all, because as Steven points out, 3D rendering is very easily parallelisable. Just keep adding more pipelines and bandwidth.

    And fortunately for me, the stuff I care about is inherently multithreaded. (Multi-user database apps.)

    The new Intel “Core” chips are actually closely related to the old Pentium Pro (via Pentium II and III and the various mobile chips). With a lot of tweaks, admittedly, but the basic architecture is Pentium Pro, not Netburst.

  5. . says:

You note that Final Fantasy: Advent Children doesn’t have better visuals than Spirits Within (debatable), but one thing that’s certainly true is that Advent Children was made for much less money and probably in less time. I remember a few years ago, when I got into learning Maya, I had a discussion with one of my Maya-savvy friends about how Square spent millions of dollars just getting Aki Ross’s hair to look natural (she was the only one in the film that I can remember who had more hair than a crew-cut)… Then a few months after the movie was released, Alias integrated a hair simulator into Maya by default — Ouch.

Spirits Within, if I remember correctly, also has characters primarily wearing very stiff / skin-tight costumes. Advent Children makes significant use of newer features in Maya to simulate clothing (see: Ballerina-Tifa).

  6. Shamus says:

You note that Final Fantasy Advent Children doesn't have better visuals than Spirits Within (debatable)

I’m not talking aesthetics. I’m saying AC doesn’t seem to have any noticeable innovations that make it look more real. It’s totally possible that realism wasn’t a priority for the makers, and they passed on chances to improve it. I don’t know. I’m just saying skin and hair (the two biggest hurdles to realistic humans) look about the same as they did 5 years ago.

I would beg to differ regarding Shamus’ opinion of Advent Children. I was involved with him in the 3D rendering of avatars, including hair, skin, and clothes, using Truspace in the old days, about 7 years ago, and we tend to keep an eye on the quality of these in games and movies.

I noticed a significant difference in the quality of the hair, skin, and clothing. All were significantly more realistic. I would agree with . that the amount of time, money, and energy would definitely have diminished, and the number of plug-ins available would have helped with this. I experimented with the free version of Maya a year or so ago and was amazed at its capabilities compared to the buggy version of Truspace we started with. 3D design has come forward in leaps and bounds, mostly because of processor speed and better graphics cards (it took forever for the screen to render just a 500-polygon avatar on my old machine, if it didn’t crash).

This is why hair, skin, and clothing have moved forward, not to mention increased understanding of how to go about the rendering and design of these items. As an example I would mention the opening sequence of “Ghost in the Shell: Stand Alone Complex”, which is ahead of Final Fantasy: The Spirits Within. From an artist’s perspective the quality is significantly better, though to the average Joe it only means it doesn’t feel quite so “wrong”. The details are better, the flow of the hair and clothing, the fact that the hair is not a big clump, and the fact that the skin didn’t “feel” like plastic as it did in Toy Story and even to some extent in Spirits Within. The designers have learned what people respond most to, including getting the movement of the avatar right, and have learned to focus on those in order to get the most out of their work.

The fact that the designers for Lord of the Rings and Narnia were able to successfully integrate CGI into live-action films is another example of how far we have come since Toy Story.

I think Shamus’ point of view here is that of a programmer. Watching him try to sort out all the issues of cramming as much graphical possibility into his engine without sacrificing speed has been interesting, and from that point of view I can see how shadows and whatnot have not come as far as was expected 5 years ago. The good news is that designers are, as he suggested, getting better at using what they have.

  8. Pingback: Kaedrin Weblog
  9. Eric says:

AC is TEN times better looking than Spirits Within.

  10. Shamus says:

Ten times better? Again, I’m not talking about style or special effects, but realism. The people are still obviously computer generated. Their hair has the overcooked-noodles look. You’re still not going to mistake any of them for photographic.

  11. D.K. says:

    Well…….. :-)

To point out something that I, as a 3D designer, visual FX artist, and animator, have seen in movies & games: I can fully say that REALISM can never be recreated in software.

    The illusion of it can however.

Take for instance “The Incredibles”: its animation (not talking about the fake impossible movements) looks much more real than the animation does in FF7AC.

The lipsync is also much more natural; even though it is OVER-emphasized and cartoonified by the animation style, it LOOKS and FEELS right even though it is not.

    FF7AC has stiff facial expressions and the lipsync is not nearly as convincing.

    It all boils down to perception.

I have seen photographs and I was convinced they were done in 3D software, yet they were captured from reality, and I have seen vice versa.

    Processing power has a LOT to do with how the final look and feel will be in comparison to reality.

    just a small 2c reply from my POV :-)
