Wolfenstein II Part 2: Broken Technology

By Shamus Posted Thursday Feb 8, 2018

Filed under: Retrospectives | 149 comments

Like I said last time, my goal here is to illustrate how this game has a lot of overlooked shortcomings and half-baked gameplay systems that should be fully-baked by the third entry in a series. But before I can argue with the critics, we need to talk about the PC launch. So let’s get that out of the way. Let’s talk about…

Technology

To get the framerate up to playable levels, I had to turn the visuals down to 2009 levels of detail. And yet the game still struggles to keep up. Where is all the power going?

The game launched as a broken mess on the PC. I’ve spent hours reading the forums and I’ve never been able to find a pattern in any of it. There doesn’t seem to be a single unifying problem that caused the crashes, headaches, slowdowns, glitches, and bugs. There were people with low-end hardware that could run the game and people with high-end hardware that couldn’t. The problems impacted both AMD and NVIDIA hardware.

I get it. Developing for the PC is hard. This is doubly true if you’re one of the first AAA games to use the new Vulkan API and you’re still working the bugs out. While I always insist that for $60 the publisher is obligated to perform the due diligence required to make the product usable for the customer, I might be more inclined to give the publisher a bit of slack if they had shown even a sliver of competence after launch.

The timeline went like this:

  1. At launch, many users complain that the game runs absurdly slowly. Others report it doesn’t run at all.
  2. Publisher Bethesda announces a beta patch and encourages people to try it out. At about the same time, NVIDIA pushes a fresh set of drivers.
  3. It’s not clear if the problem is the patch or the drivers, but many people report that the game is now worse. For some the game had been working but was now running slowly. For others (including me) the game had been running slowly and now wouldn’t launch at all.
  4. Ignoring this feedback, Bethesda pushes the update to the masses. Additionally, this new version forces everyone to use the latest drivers, which makes it even harder for the community to figure out where the slowdowns and crashes are coming from.
  5. A bunch of new people show up in the forums. The game had been fine for them and they hadn’t paid attention to the beta, but now that Steam has pushed the latest version they’re having problems.
  6. Bethesda posts a bunch of nonsense advice like “update your drivers” and makes it pretty clear they aren’t reading a single word people are posting to the forums.
  7. Another patch goes into beta. This one is focused on getting 4K rendering working for people. That’s nice for people looking to run the game in super-ultra-fancy mode, but there’s nothing here to help the poor folks who can no longer run the game after the latest patch.
  8. The patch goes live, and a few more people show up in the forums. More problems. A lot of the earlier people have moved on. Maybe they fixed their problems, maybe they returned the game, or maybe they shelved Wolfenstein II for the time being. In any case, the posts saying, “You broke the game for me” far outpace the posts saying, “This fixed my problem.”
  9. Bethesda announces another coming patch. This patch only has a single feature: “- Improved leaderboard stability from pause menu while in SAS Machine Combat Sim” That’s… that’s crazy. That’s a very minor problem, and I’ve never even SEEN someone complaining about that.

I never saw anyone complaining about leaderboards. Instead I saw dozens of people asking stuff like: Why are faces and the player’s gun always rendered in the lowest possible detail no matter how high I set the texture resolution? Why does the game crash when I Alt-Tab? Why does the game crash when I load a save? Why is the game crashing on startup when it worked fine for me at launch? How come antialiasing isn’t working the way it should? What’s causing these visual artifacts? And so on. I haven’t experienced all of these problems myself, but they seem to be pretty common and it’s clear they’re being ignored.

What's worse than using tons of graphical horsepower to make simple indoor spaces? Using EVEN MORE power to turn it all into mush with motion blur whenever the player turns their head.

For me, the most obvious failure of all is just how much horsepower this game demands. This is a linear corridor shooter. When it comes to keeping the framerate high, that’s as easy as it gets. The shadows don’t move around the environment. No day-night cycle. The space isn’t filled with moving light sources. No open-world detail streaming on objects that stretch to the horizon. No dynamic weather. No dynamically destructible environments.

And fine, a Wolfenstein game doesn’t need to have those things. But since it doesn’t, where are all of the processing cycles going?

This is why all of the early shooters were set indoors. Quake, Duke Nukem 3D, Unreal, Dark Forces, and Half-Life were games focused on proceeding through a series of rooms in a set order. Even the supposedly “outdoor” sections were just big rooms surrounded by cliffs. This makes it easy[1] for the game to figure out what needs to be drawn and what doesn’t.
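In case you’re wondering what “figuring out what to draw” actually looks like, here’s a toy sketch of room-and-portal visibility, the basic trick that generation of shooters was built on. To be clear: this is my own illustrative C++, not id Tech’s actual code, and real engines pile precomputed visibility data, frustum tests, and occlusion culling on top of this idea.

    // A deliberately bare-bones version of room-to-room visibility.
    // All names here are hypothetical, for illustration only.
    #include <vector>

    struct Room;

    struct Portal {
        Room* destination; // the room you can see through this doorway
    };

    struct Room {
        std::vector<Portal> portals;
        bool visited = false;
    };

    // Starting from the room the player is standing in, walk through
    // portals a few levels deep and collect everything reachable.
    // Any room that never lands in drawList is never drawn at all.
    void MarkVisible(Room* room, int depth, std::vector<Room*>& drawList) {
        if (room == nullptr || room->visited || depth < 0)
            return;
        room->visited = true;
        drawList.push_back(room);
        for (Portal& portal : room->portals)
            MarkVisible(portal.destination, depth - 1, drawList);
    }

The point is that in a corridor shooter the draw list stays tiny no matter how sprawling the level is, which is exactly why this genre ought to be cheap to render.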

On top of that, this game is mostly focused on fighting in industrial settings. Again, that usually makes things easier. Organic and chaotic spaces are much harder to render compared to rectangular hallways and rooms. And finally, the vast majority of your Wolfenstein II foes are wearing full-body armor so the game doesn’t need to render complex features like skin and hair and it doesn’t need to animate a ton of faces. Yet despite all these advantages the game pushes my system far harder than (say) Grand Theft Auto V or Doom 2016, both of which are facing much tougher rendering challenges.

This would be fine if the game was taking some bold new step in visual fidelity, but this is just another corridor shooter. Yes, the texture resolution is higher and you can push the resolution up to 4k mode if you want, but the game runs like a pig even if you’re not using any of those next-gen features.

A catwalk with some crates and dudes that are the same color as the background? Do we REALLY need a hardware upgrade to render THIS?

I’m sure many of you will be tempted to say, “But the game ran fine for me!” Granted. It obviously ran fine for thousands of people. But it also malfunctioned for a lot of people, and there doesn’t seem to be a good reason for that failure. Maybe Bethesda set the system requirements too low, maybe their Vulkan-based evolution of id Tech 6 still needs some work, and maybe Bethesda’s QA and support staff is asleep at the wheel. I can’t diagnose the problem from here; all I can do is point out that the problem exists and it shouldn’t.

For the record, I did get the game running properly. Many users posted various tricks and hacks in the Steam forum. They managed to isolate some settings that fixed a lot of stability problems. These settings require mucking about with config files because they aren’t exposed by the video options menu. The important thing is that I did manage to experience the game as intended and was able to hit a silky-smooth 60fps[2] with no crashes.

So now I have to click through a DLC advertisement every single time I launch the game? Sure. Why not?

If you decide to play along with this series, then I have two bits of advice for those of you playing on the PC:

  1. Be careful if you’re near the low end of the system requirements. Those minimum requirements seem to be what’s needed to boot the game up, not what you need for a playable experience.
  2. On Steam, right-click on the game and select “Properties”. Look for the button that says “Set Launch Options”. That will bring up a text box where you can enter:

    +com_skipBootSequence 1

    This will disable the appalling 30 seconds of splash screens. Look, 30 seconds of unskippable screens is bad enough in a game that’s smooth and stable. But in a game that is both prone to crashing and that needs lots of restarts to tweak settings, it’s absolutely unforgivable.
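    Incidentally, that leading plus sign is the old id Tech convention for running a console command at startup, which means you can chain more than one setting in the same box if some other community fix calls for it. The pattern is just this (the second entry is a made-up placeholder, not a real cvar):

    +com_skipBootSequence 1 +some_cvar_name value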

My machine can render GTA V at 60fps, and yet it struggles with THIS?!?

Or maybe you should get Wolfenstein II for the console, if that option is open to you. Just be warned that George Weidman says the game is significantly harder on the console. Apparently the game was balanced around the mouse? I wouldn’t know. Good luck!

One final note is that all of this means the screenshots that go with this series are not representative of the intended look of the game. I usually try to get screenshots on average or above-average settings, but that just isn’t possible for me with this hardware setup. It’s true I could get clean shots from YouTube videos, but sometimes I need to take my own shots to illustrate things and I don’t want a jarring shift between my shots and stuff from YouTube. So we’re going to have all shots taken with the graphics settings set to “looks like ass”. Sorry about that. If Bethesda marketing wants to send me a fancy new graphics card[3] then I’ll be happy to replay things and get a fresh round of pretty screenshots, but if not then we’ll just have to muddle through with what we have now.

Ok, I’m done griping about stupid technology problems. Next time we’ll get into the story and gameplay things I actually want to talk about.

 

Footnotes:

[1] Actually in graphics nothing is truly “easy”, but I’m speaking in relative terms.

[2] Actually, there were still a few spots where the framerate was bad, but they were rare. And strangely enough, they were often in spots where it didn’t seem like the game was drawing anything particularly difficult.

[3] Assuming they can afford one.




149 thoughts on “Wolfenstein II Part 2: Broken Technology”

  1. MarsLineman says:

    And yet the game is being ported to the Switch. So it can apparently run on a mobile ARM-based Tegra X1, but not a desktop-class x86 with an Nvidia GTX 780. That’s some nice optimization

    https://www.nintendo.com/games/detail/wolfenstein-ii-the-new-colossus-switch

    1. Addie says:

      I am sure the Switch port is also going to optimise away about 90% of the texture detail, two-thirds of the geometry detail, most of the fancy lighting and particle effects, and be limited to a top-end of 30 fps. And the Switch is not really that underpowered; it’s as powerful as a home console from a couple of generations back (or one Nintendo generation back), and they could all run corridor shooters just fine.

      1. MarsLineman says:

        Right, but in Shamus’s experience, “the game runs like a pig even if you’re not using any of those next-gen features”. He had to “[muck] about with config files because they aren’t exposed by the video options menu” just to get the game running properly, even with the fidelity dialed way back. If the game can run on the Switch (even with the settings dialed way back), it should run gloriously on Shamus’s hardware, which is surely many times more powerful (both cpu and gpu) than the Switch.

        1. Redrock says:

          Come now, we all know that PC optimisation isn’t nearly as straightforward as that. Optimising for consoles, which all have the exact same hardware, is one thing. Optimising for the countless hardware configurations on PC, as well as for the myriad background processes that consoles lack, is quite another. I’m sure there were multiple instances of the game running smoother on slightly worse hardware than what Shamus has. Or the other way around. That’s not to say that the situation was acceptable. Just that the mere existence of console ports isn’t really relevant here.

          1. MarsLineman says:

            This isn’t just any console port- it’s a port to a mobile ARM-based platform. The differences in processing speed don’t get much more extreme for current-gen platforms. It’s one step away from being able to play the game on your phone.

            For the sake of comparison, I’m *emulating* Breath of the Wild (Wii U version, via CEMU) at 35-45 fps on my 15-watt 6500U cpu+ GTX 965m laptop (at 1080p).

            PC optimization certainly requires extra processing overhead. But this is an absurd delta. How would you feel if you couldn’t get one of your phone games to run at a decent frame-rate on your gaming desktop?

            1. Redrock says:

              A pirate, eh? Naughty-naughty. In all seriousness, I think we should wait and see how exactly the Wolfenstein port will run and look on the Switch. And hell, once we figure out the corresponding PC settings we could even stage an experiment and run it on the same settings on hardware similar to Shamus’s. I’d actually be pretty interested in that.

              1. MarsLineman says:

                There was actually a Digital Foundry video comparison between the Switch and PC versions of Doom, much like you just suggested (linked below). And they found that the hardware specs needed for the PC version to run at Switch-like settings were pretty minimal (certainly far below Shamus’s rig), even as DF were very impressed by the Switch port’s performance. Given that Doom and Wolfenstein run on the same engine, there’s just no excuse for the game to run so poorly on Shamus’s level of hardware

                https://www.youtube.com/watch?v=cXxDKgqBWXA

                1. Redrock says:

                  Weeell, I doubt Shamus was running the game at Switch-level config. Switch resolution for Doom is lower than 720p, with severe cuts to texture detail, lighting, effects, etc, which is lower than low. And it still runs at a very unstable 30 fps, according to Digital Foundry.

                  1. MarsLineman says:

                    “OK, so the resolution is low and the image quality can be very blurry, but if you can get past that, [the Switch version] still manages to resemble the other console versions of the game. Based on the testing we did with a low-spec PC downclocked to give ballpark Switch GPU and CPU performance, it’s amazing that the game manages to reach any resolution beyond 540p. This is a case where seemingly unimpressive results are seemingly miraculous when taking the relatively meagre capabilities of the hardware into consideration.”

                    http://www.eurogamer.net/articles/digitalfoundry-2017-dooms-impossible-switch-port-analysed

                    I mentioned the Switch version because of the extreme contrast in performance with a gaming desktop- DF had to downclock a low-spec PC to approximate the Switch hardware. If you want something closer to an apples-to-apples comparison, my laptop (2-core ultra-mobile 15-watt cpu, gtx 965m) manages to run Doom at medium-high settings at a mostly-stable 45-ish fps (likely bottlenecked by the cpu, since dropping the resolution doesn’t improve the fps). Doom and Wolfenstein share the same engine, and Shamus’s rig is far more powerful than my laptop. Again, there’s no excuse for the game to run so poorly on Shamus’s setup

                    1. Redrock says:

                      It would be cool if Shamus could weigh in and talk about his settings and fps on both Wolfenstein 2 and DOOM. Otherwise our discussion here is kinda moot :)

                    2. MarsLineman says:

                      Agreed. And as I posted below, it may be that the GTX 780 is crippled by its lack of VRAM- 3 GB, when the system requirements for Wolfenstein 2 list 4 GB VRAM as the minimum

                2. Khizan says:

                  The Switch playing Doom is like a dog that can sing. The fact that it can do it at all is so amazing that you don’t care that it’s off-key.

              2. Ander says:

                Now that’s the kind of “piracy” that intrigues me. There’s people who run Dolphin simply for the sake of mouse controls in Metroid Prime. I have a friend who runs Xenoblade on his computer for graphics upgrade mods and laptop portability. He owns the game legally. What say you to that, DRM suits? (that was apostrophe, not addressing a commentor here as a suit. I’m a programmer currently implementing licensing in our software, and boy does it make me feel dirty after reading this blog and xkcd for years.)

                1. MarsLineman says:

                  I was actually gifted the Wii U version of Breath of the Wild over the summer, but since I’m away from home for an extended period doing research, I’m emulating the game on my laptop instead. I don’t know where that falls on the piracy spectrum, but I will say that I’m bummed I won’t be able to continue this save file on my Wii U when I get home. Gotta love that DRM

                  1. Matt van Riel says:

                    That’s not even remotely piracy. You own the game, end of story. And using an emulator is, never has been, and hopefully never will be illegal.

                    1. Daemian Lucifer says:

                      Ask nintendo and theyll give you a different answer.

            2. Phill says:

              When I was working in games, I had one case of a game that would run at 30fps in debug quite happily in one resolution, and at a *lower* resolution would drop to something like 1 frame every other second. That’s virtually 100 times worse performance from changing to a lower resolution that ought to give better performance.

              The list of things that can affect PC game performance is ridiculous. Everybody developing PC games has to live with the fact that for a certain percentage of your potential customer base the game will simply never run in a reasonable way, and you might not be able to figure out why. You just aim at the biggest problems, and work your way down the list for as long as it is financially sensible.

              1. Daemian Lucifer says:

                Sure,but when that percentage increases to something significant like in the case of this game or arkham knight,then the developer messed up.

          2. Echo Tango says:

            The PC-only comparison is Shamus’ entire point, though – It plays like ass compared to other contemporary games of similar fidelity.

            1. Redrock says:

              Now that I completely agree with. I have no idea where all that horsepower is going. None. Oh, and may I add that the sound mixing sucks too? Cause it does.

              1. MarsLineman says:

                Huh. I just did a bit of research to confirm the relative power of Shamus’s hardware, and now I’m wondering if the issue is the GTX 780’s lack of VRAM. The card itself is plenty powerful, but it only has 3 GB of VRAM. System requirements specifically mention 4 GB of VRAM as the minimum for Wolfenstein 2

  2. Redrock says:

    I’m one of those lucky guys who managed to run it well, even though initially I still ran into some weird issues, like when changing the resolution would screw up window size, etc. The fact is, its launch condition was pretty much inexcusable, however you cut it. Was the New Order port as problematic? I wouldn’t know, I got that one on console back then.

    EDIT: I kinda think Super Bunnyhop is laying it on a bit too thick on the subject of difficulty and controls. I played through TNC on PC with a controller on normal (yes, I’m one of those people) and was mostly fine. And I’m not that good at FPS games, mind you. The auto-aim helps.

    1. Geebs says:

      I haven’t played The New Colossus, but I found that playing TNO/TOB with a mouse was “not fun” while playing them with a controller was “lots of fun”. The levels are set up so that you’re generally pointed in the right direction to shoot things, and don’t need the quick turns a mouse gives you.

      Meanwhile, triggers and rumble really help complete the over-the-top experience Wolfenstein is going for with its ludicrously oversized weapons.

      1. Redrock says:

        Yeah, I never fully appreciated the impact of rumble before I switched from console to a living room PC and started alternating between mouse&keyboard and a Dual Shock 4 frequently. That’s when you realise that with a keyboard something just isn’t there. A trade-off, sure, but these days I’m very much a controller user through and through.

      2. Sleepy the Bear says:

        This is interesting, because I only play these games on console and I hate the way they feel.

        I’m playing New Colossus on XBone, on “Do or die” difficulty and I’m hating it. The game is very unforgiving about health and damage, and it does a terrible job of signalling damage. Sure, the damage might be “realistic”, but it really undercuts the bulletproof bravado the game suggests with duel-wielding assault rifles (paraphrasing Campster’s video on Wolfenstein). I’ll rapidly go from full health and armor to dead in a second, without have a clue I’m being shot. I feel like poor controller feedback is a big element of that.

        I have the feeling these revamped id games (DOOM, Wolfenstein) were balanced for mouse and keyboard, with precision mouse controls, fast movement, and no rumble feedback. In Wolfenstein the controller only seems to rumble for firing weapons (so no damage feedback), and the autoaim feels too weak. This is in comparison with something like Destiny, which feels sublime to play on console. Melee, taking damage, firing your weapon, and even reloading all have satisfying rumble, in addition to great auto-aiming. The difference seems to be the PC/console pedigree of these developers, and the care with which they optimize the controller experience.

        I’ve been stuck on the New Orleans level for weeks. I’m in the rare situation of absolutely hating the gameplay in a FPS, but wanting to continue for the story cutscenes, which are great. (For what it’s worth, I managed to get through the New Order just fine on a similar difficulty.)

  3. Sarfa says:

    Starting games off with a lot of splash screens baffles me. Sure, you want the player to know who made the game, but making them sit through that every time the game crashes just tells the player which company they should resent if this keeps happening.

    I know I’d probably think more highly of Bethesda today if every time an Elder Scrolls game crashed I didn’t have to then watch their fancy animated logo gloating about it.

    1. Redrock says:

      I dunno, I treat those like a TV show intro, something to set the mood. Not when I’m constantly rebooting the game to figure out a graphics configuration, though.

      1. Gethsemani says:

        I’m like you, seeing the intro splash screens as sort of mood setters (the same way I always wanted to see the THX splash when re-watching the old Star Wars movies). That said, they should be skippable for everyone who isn’t like me. If you absolutely feel that you need to force them on the player, at least make sure that they are only unskippable say once every 24 hours and can be skipped every reboot of the game after that. That way you can make sure everyone knows who made the game, but people won’t feel a murderous rage every time they hear your name if they need to reboot the game often.

        1. Echo Tango says:

          Alternately, put all the logos and brand names in the credits screen(s). That’s effectively the same group of people to the player – people who made the game. Either people directly making the game, or people whose engine / middleware / whatever got licensed.

    2. John says:

      Unskippable splash screens are the worst. No, wait. Unskippable splash screens that cause crashes to desktop are the worst. I’m looking at you, you stupid spinning Nvidia logo WMV.

      1. BlueHorus says:

        This sounds a lot like my Divinity OS 2 experience at the moment. The game was running okayish, some occasional lag…
        …and then it installed a patch – without asking me – that included the features Even More Lag and Random Crashes to Desktop.
        (Also, every character grunting and bellowing like they’re constipated and furious about it EVERY TIME they use a skill*. ‘Cos that’s never gonna get old…)

        The ‘fight music changes based on whichever character scored a kill’ feature is a neat idea, but seriously game, it’s not worth it if the game freezes for a second every time it happens…

        *Dude, it’s just a healing spell. Calm down, you’ll give yourself a hernia.

        1. Redrock says:

          Eh, the grunts aren’t that bad if you compare them to some of the barks in the original … Original Sin. Like how they would scream “SWEET RELIEF!” every time you use a healing spell on them. That’s not a phrase I associate with healing magic, let me tell you.

          1. BlueHorus says:

            Bah, misplaced post. I meant to put it in its own thread rather than add it onto John’s comment. And now it’s got a reply.

            But ‘SWEET RELIEF’ wasn’t EVERY time, was it? And that at least made sense of a sort. Whereas to me there’s something just wrong about a mage grunting like he’s a macho guy showing off in a gym while casting support magic.
            And that’s before the repetition problem. I think combat barks are just not that good an idea.
            You may disagree, but be warned: I’ll YIELD to NONE!

            1. Redrock says:

              Well, they can be entertaining at times. In Tales of Berseria each character shouts out the name of each attack or something similar, anime-style. Each and every attack, and it’s a combo heavy action JRPG. It’s funny the first few times, but when companions enter the fray, it’s mostly cacophony. But it’s also somehow endearing, maybe because each shout represents and suits the characters. I think the fact that Divinity uses generic barks for all PCs makes it a bit more annoying.

            2. Trevel says:

              Combat barks are a good idea when they’re providing information to the player, and a bad idea when they’re not. AIs announcing a grenade or a change of patrol state is generally good, because that’s information you want to know. Allies announcing that their special attack is charged up, or that a new enemy you might not have seen is joining combat, is good. Enemies announcing that they’re about to do their big attack is pretty good.

              Enemies announcing that they’re still there, on the other hand, is not so helpful.

              I’m currently playing Xenoblade Chronicles 2 and it’s dense with repetitive combat chatter, but almost everything from my side is actually alerting me to something I care about. (Out of combat, on the other hand….) It might seem unnecessary and repetitive (and it is definitely repetitive), but I play noticeably worse when playing without sound than I do with, because the combat chatter is actually an information stream.

              1. Daemian Lucifer says:

                And allies announcing that ENEMIES ARE EVERYWHERE are comedy gold.

                1. Matt van Riel says:

                  *surrounded on all sides*

                  “I don’t think we’re alone here…”

              2. Asdasd says:

                “I wonder if the Arisen is interested in knowing how wolves hunt.”

    3. Blake says:

      I can’t speak for every company, but I know most of the games I’ve worked on has spent most of that unskippable logo time preloading data, which would mean in most cases if you could skip the logos you’d just be staring at a black screen longer.
      It’s also not uncommon for certain licenses to come with some sort of prominant-logo requirement which the unskippable thing meets.

      Having said that the logo length is often optimised for console load times so if you have a fancy rig with the game loaded on an SSD you’d be waiting longer than needed.
      Not saying it’s a perfect solution, but there are a few reasons for it existing and good developers try to make use of the time as best they can.

      1. Daemian Lucifer says:

        The only time I saw the black screen for a significant amount of time while loading data is in a civilization game.And even then,I preferred just watching that black screen for 10 seconds than the same opening logos for 15.

  4. Echo Tango says:

    I had to turn the visuals down to 2009 levels of detail.

    This is like resolutions over 1080p. If it’s on a big movie theatre screen, or right in my face, I can see the difference. Your screenshot looks good enough that I literally didn’t know anything was wrong with it, until your sub-text. I suppose that’s why I play so many indie games nowadays – way cheaper, and the graphics are good enough even at sub-sub-standard levels. :)

    1. Echo Tango says:

      Your final screenshot of crates, dudes, and explosive tanks is actually one where I can notice something wrong. The explosive gas tanks are roughly the same complexity as the crates 6 feet to the left, but the crates look like low-res models that should be in the background, while the tanks look good up close.

      1. Daemian Lucifer says:

        Yeah,the crates are the only ones where textures are mushy.And the faces look a bit ugly.But these screenshots are just fine.

    2. King Marth says:

      The motion-blur example aside, I’m not sure where the complaining is coming from. They all look fine to me. Perhaps at least one screenshot from this fabled land of golden pixels polished to a gleam in the crystal-clear fountains atop Mount Olympus would be a good idea to show the unwashed masses what we’re really missing.

  5. Bethesda says:

    You should have updated your drivers to the latest version. That would have solved all of your problems.

    1. Redrock says:

      How on earth did you get your avatar icon to match the Bethesda logo? Aren’t those random?

      1. Daemian Lucifer says:

        I was just as surprised at you.I couldnt have picked a better one even if I actually tried.

        Errr,I mean I have no idea how that random person managed to pick that cool looking gravatar.Yup,thats totally what I meant to say.

        1. Redrock says:

          Yeah, yeah, I know you can use Gravatars. But that looks like the random-generated Wavatars the site assigns to emails. So the question stands.

          1. Echo Tango says:

            You could set your gravatar to one that looks like the auto-gen ones…

            1. Redrock says:

              Yeah, I kinda figured it out, but damn, you need to get the icon database, find a suitable one, set it as a Gravatar, all for the sake of a joke? Talk about dedication. If only Bethesda tested their games that way.

          2. Daemian Lucifer says:

            Honestly that one was random. While posting I was wondering myself what I’d get, and when I saw that one I was like “Wow, that one is perfect”.

        2. Philadelphus says:

          Psh, you can’t fool us DL, we know it wasn’t you. There are spaces after the punctuation marks in Bethesda’s comment!

          :P

          Edit: wait, there are spaces in your comment above! Who are you, and what have you done with the real Daemian Lucifer?

          1. Daemian Lucifer says:

            I put spaces when I type from my phone.Or more precisely,the autocomplete puts them in there.I always pick the lazier route.Except when Im doing a joke,like the bethesda thing.Then I put some effort into it.

            1. Philadelphus says:

              Thanks for clearing that up; I’ve actually wondered for years. :)

  6. Nick Pitino says:

    So that bit about how for $60 the game should be QA tested and work got me thinking.

    What are your thoughts on the recent Extra Credits videos about how they think games don’t cost enough?

    1. Echo Tango says:

      Their video doesn’t have hard data to back it up. Personally, I don’t rely on those guys for those sorts of topics. The videos don’t open up further discussion, they don’t dig for more data, or ask industry for more data, but try to convince gaming customers that their concerns are invalid. That sentiment, coming without much (if any) data, coming from a source that’s got a conflict of interest (they consult for gaming studios). If they could actually back up their claims, I’d be happy with increasing prices, but that’s a lot to ask without concrete facts.

      1. Ander says:

        One specific question, and I ask because it’s an accusation thrown at EC often.
        Why does a conflict of interests matter? It does not change the truth value of their statements. The lack of hard data is absolutely relevant to the discussion. I guess the conflict of interests might make people less likely to take their unfounded estimates at face value, but in that case, we’re still just criticising the lack of hard data. Would we prefer estimates from a third party who has never worked in the industry? From a disgruntled former member of the industry? Obviously the best is a gruntled(?) former member of the industry, but since we don’t have that, we’re stuck with EC and their conflict of interest. What makes that so terrible? (Not that you necessarily think it’s terrible; again, this is a common complaint that I don’t quite understand.)

        1. MichaelGC says:

          I don’t know much about the ins & outs here, but any conflict of interest is likely to influence their view of a remedy, rather than their opinion of the data. So, let’s say EC and Jim Sterling both agree AAA games are too expensive to make. Thus, no dispute about the hard data.

          However, those working within the gaming industry are more likely to think that the way the industry internally operates does not need fixing, and that the solution needs to come from outside, via increased revenue. So, work the same as now, but get paid more.

          Those working without the industry are more likely to think the way the industry works is what needs to change – so, get paid the same, but use the money differently.

          1. MichaelGC says:

            Missed my edit window – was just going to add that if EC does work with the industry and might see a portion of any increased revenue, that stops them from being an impartial observer. They might still have the best solution – I don’t know and I’m not advocating one myself – but it won’t have been a disinterested one.

            1. Redrock says:

              Well, Jim Sterling does make money from stirring anti-publisher outrage. And a lot of money, too. Sooo, he isn’t all that disinterested either. By now his gig isn’t covering the industry, it’s bashing the corps, whatever the reason. If he adopts a more balanced position, he will start losing money.

              1. MichaelGC says:

                Right – everyone has biases; the question wasn’t about Jim Sterling’s. His leanings wouldn’t be strictly called a conflict of interest, because there’s not a direct formal relationship involved. It’s possible he’d start losing money if he changed his stance, but it’s far from certain. (It’s possible he’d start losing money if he stopped faffing about with cornflakes & suchlike, but that’s another open question…)

                However, it’s also entirely possible for Person X’s amorphous inclination to influence X’s judgment more-strongly than Person Y’s definite & direct interest influences Y. Depends on the issue, the context, and the kinds of people X & Y are. (e.g. Y might be more careful to balance their thinking precisely because they know & acknowledge they have a conflict.)

                So, your general point certainly stands – it’s just that to call Sterling’s situation a conflict of interest would be stretching the term more than somewhat.

                1. Redrock says:

                  Oh, absolutely. Personally, I don’t know EC’s business model well enough to say whether a formal conflict of interest is present. So I’d settle for calling both sides biased to an uncertain, but tangible, degree. Now, whether the mere presence of bias invalidates one’s argument is another matter entirely.

                  1. MichaelGC says:

                    As I said, “if EC does work with the industry and might see a portion of any increased revenue, that stops them from being an impartial observer. They might still have the best solution[.]” Emphasis obviously added.

                    Total invalidation is indeed another matter, and is not something anyone has previously mentioned. Anyway, you’re suggesting that there are gray areas here, and that’s true. However, I believe you’re also suggesting that because things aren’t black & white, having a conflict and not having a conflict can be treated as equivalent, and that isn’t true.

                    1. Redrock says:

                      Nah, wasn’t suggesting that last thing in the slightest. Otherwise, I believe we’re on the same page on that topic overall.

                    2. MichaelGC says:

                      *thumbs-up emoji*

          2. Ander says:

            Thanks for the response. That helps.

      2. Redrock says:

        There are a number of articles from various sources that mention that budgets have grown exponentially since the mid-2000s. It’s a fact. As is inflation. The cost of a movie ticket has grown about 30% in the last decade, for example. And, hell, why is the price 60 bucks anyway? Is it the 11th commandment, by any chance?

        – Thou shall not charge more than 60 dollars for a video game.

        – But Moses, what are dollars? And what are video games?

        – Hell if I know. Let’s just drop that one.

        1. Bloodsquirrel says:

          The increase in budgets is a fact. The necessity isn’t.

          When is the last time that a game has flopped because the graphics weren’t good enough? I’m not talking about shitty art direction or performance issues- I mean when is the last time that a AAA game was received poorly simply because it had less mo-cap? Graphics were a big driver of hype ten years ago, but today most games look about the same, and I don’t see any hard data that suggests that games need higher fidelity today than they needed five years ago, especially when you consider that there are a lot of kinds of fidelity that are much cheaper (hi-rez textures and shader effects shouldn’t be costing you tens of millions of dollars to add).

          Meanwhile, indie gaming and mid-budget projects are offering more and more lower-cost alternatives to AAA games, sometimes with more content in the process. Raising your prices while already under threat from increased competition is usually not a winning move. In an industry where Minecraft is the biggest success story of the decade and PUBG seems to be the hottest multiplayer shooter right now, I’d be really nervous about trying to launch a game in an already crowded market at a new, higher price point when the old price point was already twice what PUBG is charging.

          They’ll always be a market for the kind of games that the AAA developers are making right now, but you risk turning that into a niche market if you don’t adjust course. Pushing higher and higher prices on a dwindling user base is a death spiral. Business models which rely on a small pool of people who will spend a lot in microtransactions have been proven fatally flawed- You need a game to become a broad-based hit to attract those people in the first place, which is why mobile gaming turned out not to be the bottomless money printing machine that EA thought it would be when they started buying up mobile developers.

          If more AAA games were like The Witcher 3, I think you’d have a stronger case for higher price points. They provide a kind and amount of content that indie developers can’t match. It simply can’t be delivered without a high budget, and the size and scope of the game provides greater value in a much less disputable way. But for graphics? That’s an assertion that needs backing up, because it’s in no way apparent when looking at gaming’s latest successes and failures.

          1. Daemian Lucifer says:

            Precisely this.What makes a game look good is GOOD ART,not MOAR PIXULZ.

            1. Redrock says:

              Who said anything about art? We are talking online looter-shooters and sandboxes here in the AAA world. Shallow things. And shallow things, like shallow people, need to look pretty.

              1. Daemian Lucifer says:

                And they look pretty not because of MOAR PIXULZ but because of nice art.Heck,one of the first things Shamus wrote about borderlands is how they changed their art style and how that made the game more appealing.

                1. Redrock says:

                  Oh, come on. People like graphics. That’s the first thing anyone talks about when discussing games. That’s the only way to catch people’s eye, so to speak, at E3 or GDC. The PC gaming crowd practically worships graphics. Sure, there are outliers. And artstyle does absolutely trump raw fidelity when you get to the nitty gritty of it, but not always. There are a lot of people out there who don’t care for Okami-like cel-shading or Limbo’s minimalism. People like and want high fidelity. They absolutely want MOAR PIXULZ. Maybe they shouldn’t. I think they shouldn’t. But they do.

                  1. Daemian Lucifer says:

                    If people cared for fidelity so much,games like ryse wouldve been received much better,while games like skyrim wouldve been received much worse.Yet despite everyone saying how ryse was beautiful,it sold like shit,and has reviews around 60%.Sure,graphics has some importance,but its way down the line of things that people care about in video games,yet its always being pushed as the most important.

                    1. Redrock says:

                      Never that that it’s all people care about. But without the graphics to catch the public’s eye an AAA game will never get to the stage where they will even bother to judge it by its other qualities. Kinda like with people. Or gadgets. Or anything else. Shallow creatures, these humans. I mean, us humans. Yeah.

                    2. Daemian Lucifer says:

                      Trailers are already using fancier graphics than in the actual game.But thats marketing,not reality.

                    3. Sleeping Dragon says:

                      I think the fact that some steps have been taken to actually limit using fancied-up graphics in (at least some) promotional materials for games is telling: yes, they do matter, to the point where tuning production values up in promotion is considered manipulative. I don’t want to go all “sheeple” in my arguments, but I think there are a lot of gamers out there who grab the newest games precisely because they look awesome, and after they play them they have this feeling of “slight meh” that many of them don’t know how to externalize, because they have never been taught to actually analyse games.

                      Now there are a lot of gamers for whom graphics don’t matter quite as much, but precisely for this reason we’re not the majority audience for day 1 sales of AAA titles. Also, as has been stated in the comments, we are probably overrepresented on this particular blog.

                  2. Bloodsquirrel says:

                    Oh, come on. People like graphics. That’s the first thing anyone talks about when discussing games.

                    You keep saying this, but I don’t see it reflected in the discussion I see or in sales. I remember back when Doom 3 and Half-Life 2 were generating huge buzz based on how good they look. I remember seeing screenshots of Doom 3 and thinking that they were just unreal. I haven’t seen that kind of thing for years now. The last game I can think of that was able to work its way into public consciousness based on graphical fidelity alone was Crysis. Even the most recent change in console generations was a bit of a ‘meh’.

                    I simply can’t accept the assertion that graphical fidelity is still the primary driver of sales without better evidence to back it up.

          2. Sleeping Dragon says:

            Your thoughts on mid-budget titles are very much what I was somewhat incoherently rambling about in my post below. Ninja’d on that.

          3. Ander says:

            Do any of y’all know of a good breakdown of just how Witcher 3 happened? Like Shamus said in his end of year wrap up when it came out, it does everything, including voice acting, writing, and clever quest design. It has AAA production values on any metric. The weakness people tack on goes something like, “The combat isn’t quite Souls deep,” and that’s it.

            What made CDPR so special? Why did they produce this game, and no one else can? I wonder what happened there the same way I wonder how MST3K movies get made.

            1. Redrock says:

              Well, two things. 1) It was very much a passion project for CDPR and 2) CDPR is a Polish company. Now, in 2017 the average wage in Poland was exactly one quarter of the US average. I’m pretty sure that in 2013-2015, when Witcher 3 was in development, it was even lower. That pretty much explains most of the budgeting differences. Not all of them, sure. The budget for the Witcher 3 was 81 million dollars. I wouldn’t say the cost of living and average wage difference applies to the whole budget, so we shouldn’t quadruple it if we imagine it being made in the USA. Let’s be generous and just double it – we instantly get around 160 million dollars, which lands in your average AAA ballpark. And I think that’s lowballing it a bit. In March 2017 the total revenue of Witcher 3 was around 250 million dollars, so the game would’ve still been profitable for CDPR even if made in the US. But not right away and not nearly by such a comfortable margin.

              1. stratigo says:

                that is still a comfortable margin

                1. Redrock says:

                  Well, they did create one of the best video games ever, so…

          4. I find myself wondering what effect the price of GPUs is going to have on the AAA games market if price hikes and sharply limited availability owing to cryptocurrency mining become any kind of long-term thing. I have had a desire to replace my GPU for some time, but either the offerings haven’t been compelling enough for me to justify spending my precious variable-income pennies on one or, recently, because prices have gone lolinsane. With hindsight, I should have picked something up in the Black Friday “sales” (nothing like watching prices rise so they can then be “discounted” back down again – ugh), but I (wrongly!) figured that things might look better in the new year. OMG, lolNO. So even if I dearly want to splosh down £50-60 on a new AAA game, I absolutely won’t because it’s not going to run on my current GPU. There is a reason that I am currently playing much older games and indies. (Well, several reasons actually, but this definitely is one.) And I would imagine that there are others out there who are in the same boat.

          5. Geebs says:

            @Bloodsquirrel

            “When was the last time a AAA game was received poorly (because of fewer graphics)?”

            I don’t blame you for forgetting them, but both MA: Andromeda and AssCreed Unity spring to mind.

            1. Daemian Lucifer says:

              While graphics are the most memeable things about those games,those arent the reason for their poor reception.Asscreed definitely had enough pixels in it,but being buggy as hell,and performing poorly on consoles is what killed it.As for andromeda,its the poor dialogue,boring characters,and it being tied to mass effect 3 that were its ruin.

            2. Bloodsquirrel says:

              MA: Andromeda became infamous because its graphics were 99% high-quality, high-production-value and 1% ridiculous jank, mostly involving facial animations. Even then, that was hardly what tanked the game. It was the easiest problem to cobble together a 5-minute youtube video for, but the game was much more deeply criticized for unimaginative storytelling and having uninteresting content.

              Ass Creed Unity was, IIRC, just plain broken, turning its characters into Lovecraftian horrors with graphical glitches that turned their faces inside out. That’s quite a bit beyond just not having the latest graphical bells and whistles.

              1. Geebs says:

                That’s kind of the crux of the exponential graphics race, though; both games have much more sophisticated graphics than were even achievable in the previous generation, and both (Andromeda more so) got hung out to dry for graphics glitches which would, a generation ago, have been much better tolerated if not completely unnoticed.

                Agree that Andromeda has plenty of other things to dislike, not least the fact that the only new race were a bunch of ugly penis monsters whose only character archetype was “histrionic”.

                1. Bloodsquirrel says:

                  I don’t think people’s faces not appearing would have been unnoticed ten years ago,

                  If anything, the increased graphical fidelity is creating the problem, since the closer we move toward photorealism the more graphical problems (like weird facial animations) stand out. It’s not an innate demand for greater levels of perfection, it’s that greater levels of perfection are needed to hold things together since we’ve got less and less visual abstraction to help smooth the edges.

        2. Sleeping Dragon says:

          People haven’t been paying 60$ for AAA games (on release) for a while. DLC, special edition stuff, soundtracks, digital artbooks… all of these are a way to get extra money from people who can afford it. It’s mostly a psychological difference at this point because when they put that extra cost on the base price there is going to be backlash as if the publisher ate a live baby on prime time television.

          I know I can’t afford games above 60$, but I also can’t really afford games at 60$*. I tend to buy them in the 10-20$ range a few years after release. Some things I bought at release? Avernum 3 (20$, though I had a 20% voucher, and there was a 10% release sale), Finding Paradise (10$), Shadowrun: Hong Kong (20$ I think?), Particle Fleet… well you get the idea. Will I whine if prices on AAA titles go up? I probably will. Will the big devs loose a lot of business because of me? Probably not. I’m just hoping nothing strangles the mid-range game development that I actually support with my money.

          *This is not entirely true, I probably could be convinced to throw 60$ at a game or two a year. If Kan Gao, or the dev behind Immortal Defense, wanted to quote me 60$ for their next game… I’m not making promises but I think I could be convinced. I just don’t think the AAA industry gives me that much value for my money.

          1. Ander says:

            Thank you, fellow supporter of Freebird Games. Hopefully we can get the third installment in 6 years this time.

            I can’t imagine what Kan Gao would do on a $60/sale game project. Like, I seriously have no idea what kinda game he would be involved in. I’d probably buy it just to support the guy for his past work regardless.

    2. shoeboxjeddy says:

      They didn’t fail at testing this because of the budget. They failed at testing it most likely because of the release schedule (too soon).

    3. Kdansky says:

      As a disclaimer, I think Extra Credits is not a good show.

      However, that video was very bad even by their standards. I think Jim Sterling put it best when he did a reply video to it. In essence: if games are too expensive, make cheaper games and spend less than a hundred million dollars on marketing. There are too many successful small and medium studios for anyone to claim that you cannot make a good game with under a hundred million in investment.

      Basically it’s the publisher’s job to keep costs under control for a game project. If they fail at that, they are bad at their job and blaming the customer for that is ludicrous.

      1. Redrock says:

        I like Extra Credits, but you have to keep in mind that their show is aimed at creators more than at gamers. And I think that they were making a compelling argument. I dug around, and game budgets have grown a lot since 2005. And a lot of that has to do with graphics. And most people pay a lot of attention to graphics. So “make smaller games” isn’t really a solution here.

        And hey, that’s how a market works. If the price becomes too high, people stop buying and the supplier has to adjust. But, so far, people seem to be pouring a lot of money into microtransaction and other stuff. So publishers keep exploring that road. Makes sense, even if a lot of us don’t like it.

        Oh, and they were talking about AAA games, not games in general. And 60$ is an AAA price. Smaller studios are beside the point.

        1. Sleeping Dragon says:

          This is a perfectly reasonable argument and in an ideal world that’s how it would work. In our world we have EA mistaking cause and effect and blaming “the controversy” for lower sales and having to delay microtransactions on Battlefront 2 without giving a second thought to what caused the controversy.

          1. Redrock says:

            Well, it’s still likely that they will learn SOMETHING. And other publishers will too. For example, a lot of the nastier DlC practices that EA pioneered – on disc DlC in ME3, DlC vendor in Dragon Age, aggressive micro transactions in Dead Space 3 – got scrapped, some forever, some only for a while. I’m sure we will see a decrease in lootboxes that directly affect PvP gameplay, for example.

            1. Sleeping Dragon says:

              I really hate to be the cynic here but I need to ask. I wasn’t planning on getting BF2 from the start so I haven’t been following the news in detail but when most everyone* was like “we beat EA, microdumb pulled from BF2” was it made clear that said pull was only temporary and they would reintroduce those in a couple months or did that twist only happen after they sold a couple million copies?

              *I remember Shamus had a column to the contrary

        2. Blake says:

          “I dug around, and game budgets have grown a lot since 2005. And a lot of that has to do with graphics. And most people pay a lot of attention to graphics. So “make smaller games” isn’t really a solution here.”

          It is a solution though. Maybe if the game doesn’t look as good as its competition it sells less, but if it’s cheaper to make you might still come out ahead.
          And if ALL of the developers were spending less money on graphics, they’d ALL be making more money, because the players wouldn’t stop buying games.

          Personally I think there’s room for a couple of massive mega budget graphical masterpieces per year, but that most games should be cheaper, lower risk projects that could try more interesting things because selling a couple of million copies would be a big win instead of selling 9 million copies being a failure.
          Those big masterpieces would be almost guaranteed to sell crazy amounts (because they wouldn’t be competing with other big masterpieces), while all the other projects bring in pretty stable profits.

          Consumers only have so much time and money, so there is a hard upper-limit on how many big projects can be financed per year. Increasing the price of games just makes every project a bigger risk/reward as less total games will be bought per year, and you really don’t want to be the company that brings out a $200M game the same time as a better $250M game if people can’t afford to buy both.

    4. Ander says:

      Also curious, but given the heat it’s generating on YouTube I don’t expect much discussion here. I like EC’s work a lot, and I hope this video – which was out of their wheelhouse – doesn’t damage their Web presence too much.

      The big issue I have with these videos specifically is EC’s assertion in the second video that people really want/need their hi-poly graphics. Maybe, maybe not, but if so, some consumers and the industry together caused the situation with the graphical arms race Shamus and others have been decrying for… ever, really. The same arms race that caused the launch issues for Wolfenstein 2 and other recent games. In any case, there was a time before the arms race; here’s hoping we shortly come to a time after it. It’s one of the famous third roads whose existence they denied in the first video.

      Not that it affects me. I play the Steam sales and haven’t bought a new game that cost more than $20 since Portal 2 came out, so I have no horse in this race. Maybe I would if prices were lower, but, eh. I get Steam review codes for cheap games and don’t mind waiting to play AAA stuff.

      1. Redrock says:

        People do need graphics. Everyone talks about graphics, everyone writes about graphics – seeing is believing and all that. People get very upset about graphical downgrades, too, don’t they? Yeah, there is a group of gameplay-first people, and I think there is a lot of overlap between those people and Shamus’s readers, but generally the industry lives and dies by its graphics. The PC gaming crowd basically worships graphics. Nintendo is kinda apart from all that, but Nintendo is Nintendo is Nintendo. And again, the EC video was about AAA games. Not indies.

    5. Daemian Lucifer says:

      As I’ve said on that video:
      the cost of movie tickets has increased by 25%*, while the average length of a movie has increased to 2 hours, gimmicks like 3d are still rare, not mandatory and don’t alter the movie significantly, and the ticket gets you a full experience. Also the cost of making a movie has increased by at least an order of magnitude.

      During that same period, the cost of a game has remained the same, BUT gimmicks like season passes are common and significantly alter the game, the average length of a game has decreased, and due to day-1 DLC and patching, you don’t get the full game just by buying it. The cost of producing a game has also increased by an order of magnitude. So expecting the price of games to increase by 50% is silly.

      Also everything that Jim Sterling said.

      *If I remember the number correctly. I checked when I initially posted this to the video, but I can’t verify now since I’m on my phone. It’s less than 50%, though.

      1. Hector says:

        There’s a much bigger issue that the “pro-price-increase” or “pro-industry” side overlooks: the market is growing, and the unit cost is non-existent. That’s a feature of software generally, but there’s not a lot of room to grow your spreadsheet business, whereas games are still growing. They jumped past Hollywood years ago in global revenue, and these days even indie titles can reasonably be ported across the world. And you can make and sell new games every year to the same market base.

        Without going into the details, it means that you generally want to push prices *down*, not up, because you can serve new customers effortlessly and at zero marginal cost.

        This also calls into serious question why budgets have been growing at the absurd rates we’ve seen. The market is growing, customers are served for nothing more than some administrative overhead, and companies no longer even need to build expensive engines, AND they have access to lots of high-quality software that offloads the more tedious or time-consuming aspects of development. There are more and better tools for less money and a far greater market – which usually means more profits even with ample competition.

        So… why are games so much more expensive to make? Sure, graphics are better now, but they’re not *exponentially* better. Quality is up compared to, say, twenty years ago, sure, but not really compared to five or ten years back. Games are shorter, with content that would have been free served up as paid extras. Are budgets really bigger… or is the money being spent on more administration, on “features” that people don’t actually want, and on building microtransactions? Because at least at the AAA development level, it’s clearly not going into the games.

        Further, I’m not sure at all that I buy the “exponential budget” idea. I just took a look at some AAA developers’ financial statements, and nowhere do I see costs skyrocketing. For example, since 2013 EA grew its revenue by about a billion dollars – and captured about 900 million of that as profit. They don’t show any big cost increases – mathematically there can’t be, unless somehow they shifted a massive amount of internal expense from everything else towards development, which is almost impossible. I checked against a couple of others like Activision and Ubisoft, and they show similar performance. They’re making more money while holding costs steady. Sure, costs have gone up over time, but not faster than revenue. EA did particularly well, and analysis by others shows that the profit growth is heavily weighted towards microtransactions, but the point holds.

        1. Redrock says:

          You are seriously saying that graphics haven’t been exponentially improved over the last decade? Also, the market-size thing is tricky – yes, you have more potential customers, but game sales are much less predictable. Getting a big share of those customers is still a challenge and a huge gamble. And keep in mind that AAA is actually less than half of what is considered the video game market. In 2017, the total revenue of console games plus boxed and downloaded PC games was around $59 billion, and not all of that was AAA, of course.

          I understand that blaming greedy rich people for just wanting to build Scrooge McDuckian money bins is appealing, but very few markets are as simple as Immortan Joe’s magic water mountain.

          1. Geebs says:

            I think it probably makes more sense to refer to “production values” or something else instead of graphics. Writing a shiny renderer is pretty trivial in terms of both time and expense but all of the mo-cap, animation etc. is far more involved. Screen-space effects such as those which create a lot of the graphical glitz in modern games are yours for the cost of a couple of framebuffers and a shader program.

            That said; the new Shadow of the Colossus is a pretty good example of the exponential curve of graphical detail over the last ten years.

            1. Hector says:

              I was specifically criticizing the word “exponential”. From a hardware standpoint, we’re definitely not improving at that kind of rate. It’s been reliably linear instead, which I consider a good thing.

          2. Nessus says:

            “You are seriously saying that graphics haven’t been exponentially improved over the last decade? ”

            I would personally say that graphics have improved by a percentage, not an exponent. Graphics do look better, but most of what I’ve seen has been basic improvements like higher texture and/or mesh resolutions, with a bit of added rendering polish on top. The biggest actual software improvements have been in lighting complexity, with the rest being mostly a difference in what common end-user hardware could do rather than what the game devs could do. From a pure immersion/realism standpoint, IMO the biggest improvements have been animations and animation transitions, but I don’t think that’s something that couldn’t technically have been done ten years ago, hardware- or software-wise; I think it’s mostly an improvement in technique.

            You can game that comparison by deliberately referencing older games that haven’t aged well, sure. Even with mods, Oblivion looks like it’s made of mud and sticks compared to modern game graphics. The first “Prototype” (I haven’t played the second) looks basically the same as a PS2 game to modern eyes. Games like those could definitely make a case for exponential improvement.

            …But on the other tentacle, I’ve been replaying the first Dead Space this past week, and TBH, while it certainly does show its age, it would take very little to fix that – most of it stuff that could have been done by the devs of the time, but not by the consoles of the time.

            If you were to ask me what the graphical difference between Arkham Asylum and Arkham Knight is, I’d say 90% post-processing, 10% texture size.

            I’m sure someone who actually codes games could come up with an argument showing how modern games are so, soooo much more complex “under the hood”, but to my layman’s eye, that flat out hasn’t translated to exponential results in the actual images rendered. And if modders can make a 10-year-old game look modern (or nearly so) by replacing assets and dropping in an ENB or ReShade preset, then exponentially higher complexity in the engine is just processor make-work.

            …But also, I’ll believe “graphics matter” when devs finally start taking LOD problems seriously. I *Can Not Care* how advanced your lighting is or what your highest texture resolution is if objects are constantly teleporting into existence in front of me as I move. The former improves immersion incrementally; the latter breaks it in a binary sense. ’Till then, “better” graphics is sauce at best, farcical at worst.

            I absolutely believe there are people out there who make their purchasing decisions on graphics alone, but they are a tiny minority. Most of the people who prioritize graphics still place it lower than “fun”, and we can see this in what actually sells. Games that sell really well are games that are viscerally fun to play, or fun to play with friends. Graphics help, but they won’t make a bad game sell well, and games with “lesser” graphics that are fun to play sell enormously (see: Borderlands, PUBG, Overwatch). Good graphics enhance the appeal of already-fun games (see: multiplayer military shooters in general), but games that have amazing graphics and nothing else – or a janky everything else – tank.

            Note that I say “fun” or “fun with friends”, not “innovative” or “good story”. I’m with the general audience here in that I like interesting new mechanics and a well-told story over samey mindless popcorn, but I’m under no illusions as to what owns the mass market. The reason “multiplayer shooter #11109862” sells way more than “intelligent story with clever new mechanics” is the same reason Kevin Feige, JJ Abrams, and Michael Bay rule the blockbuster movie market.

          3. Blake says:

            “You are seriously saying that graphics haven’t been exponentially improved over the last decade?”

            I’d say that. A game like The Last Of Us (PS3) came out 5 years ago, and it still looks as good as most recent games. Lower-quality textures due to hardware restrictions, but I’d bet a lot of the source images were higher quality anyway, so cost-wise, swapping in the higher-quality textures would be pretty trivial.

            10 years ago we had GTA4; it looks older, sure, but not so bad that nobody would touch it today.
            If hobbyists can make it look as good as a modern game in their spare time, then a few paid, experienced developers with a year and access to all the original source code and data could easily make something that looks like a 2018 game for way less than a million dollars.

            Games are looking logarithmically better, not exponentially.

      2. Redrock says:

        And the average size and scope and graphical fidelity of a AAA game hasn’t increased over the last decade? Also, 3d isn’t all that rare when we talk about blockbusters, which would be the film equivalent of AAA games. And, just like DLC and microtransactions, it’s optional but a big part of the model, because a lot of people want it.

        Also, Jim really didn’t make many compelling arguments. Like, EC says making a game requires a lot of people, and the average wage is this and that. Jim’s reply: “well, I have anecdotal evidence of a developer trying to cheat some QA testers out of health insurance”. Okay then.

        And again, it’s a free market. And games are a luxury. Yes, video games are a luxury. So there is only one correct price – the one that people are willing to pay. And if people are willing to pay for DLC and microtransactions – let them pay. Once they become unwilling to pay, the industry will adjust or die. That’s how it works. Video games aren’t food or water or medicine. Or a charity. No one is morally bound to make them accessible.

        1. Daemian Lucifer says:

          And the average size and scope and graphical fidelity of a AAA game hasn’t increased over the last decade

          At the same time, the graphical tools have improved significantly. And while earlier everyone had to develop their own engine, now licensed engines are far better and cheaper. Everyone can shell out some cash to get their hands on Unreal Engine and get it working immediately, with coders already familiar with it, instead of spending years developing new stuff in-house.

          Also, 3d isn’t all that rare when we talk about blockbusters, which would be the film versions of AAA games.

          First, blockbusters aren’t the only movies people go en masse to see in theaters; Oscar bait is popular as well. Also, it’s mostly just action movies and cartoons that get 3d, and even then 2d screenings play right beside them.

          Jim’s reply “well, I have anecdotal evidence of a developer trying to cheat some QA testers out of health insurance”.

          Jim has already covered in detail a bunch of stories where publishers and developers were being dicks to their employees. It’s a known thing in the industry that employees are constantly being screwed over in various ways. Even Shamus here covered the perma-crunch. So it’s not just a single instance of one developer doing one bad thing.

          And games are a luxury.

          Which is exactly why I was comparing them to movies, another luxury. Also, while it’s a free market, some rules should still apply. Like a rule to not screw over your employees in order to maximize profits. There is a reason why industrial-revolution-type factories are no longer the norm, even though they were more profitable. It’s not the free market that killed those practices.

          1. Redrock says:

            This article here helps to contextualize the graphics debate. https://www.forbes.com/sites/quora/2016/10/31/why-have-video-game-budgets-skyrocketed-in-recent-years/#435d710c3ea5

            Even with better tools, you still need way more people and resources to do the same job with modern graphics than you did before. So the teams became much bigger. That’s a fact.

            And the fact that there is crunch, and layoffs after a project is done, in no way disproves the argument that there is an average salary. Layoffs that come after the game is shipped have exactly nothing to do with the discussion at hand. It’s still a bad counter-argument. Now, if Jim were to present data showing that the average wage in game development is actually 10 bucks a year, that would blow EC’s argument out of the water. But the problem is, no such data exists, either to support or to disprove EC’s or Jim’s position. So in the end we just have to decide whom to take at their word. Usually someone who speaks to our biases, of course.

            I’ve said before and I’ll say it again – we desperately need more transparency regarding budgets, development costs and revenues. At least at the level that the film industry has, which isn’t all that much, but is still something and allows for actual analysis. Otherwise we’re stuck in that eternal debate on whether publishers are evil or not, which doesn’t help anyone.

            But I must say that there shouldn’t be a rule against increasing the price of non-essential goods and services. I’m sorry, but charging whatever the publisher wants for a game isn’t the same as creating terrible conditions for workers or using child labor. Because, again, why the hell should a game cost $60? Nobody tells Ford that newer Mustangs can’t cost more than they did in the 60s. I mean, I get it, nobody wants to pay for anything. But the world doesn’t work quite like that.

            EDIT: And yes, blockbusters are the most successful movies and the ones that people go to see en masse. It’s kinda the definition of the word. Oscar bait makes peanuts when compared to animation, comic book movies, or Star Wars.

            1. stratigo says:

              Someone else made Jim’s point for him.

              EA’s revenues are up drastically – as far as we can tell from a company that obfuscates as much as it can about its business, in an industry that does the same. But, on the net, EA has grown its revenue enormously. If it didn’t utilize all the questionable business practices, sure, it would have been less enormous, but still A LOT OF MONEY. Like, the difference between nearly a billion in revenue and five hundred million sort of gets academic.

              1. Redrock says:

                Oh, I’m not defending EA’s business practices. Just the notion that there absolutely is a reason for games to cost more than they do. The fact that EA has been successful has nothing to do with it. Apple is successful, but the iPhone X still came out costing 1000 bucks.

            2. Daemian Lucifer says:

              But I must say that there shouldn’t be a rule against increasing the price of non-essential goods and services.

              No one is disputing that. What Jim said is that the price has already been increased by all the shitty sales practices, and what I added is that expecting an increase of 50% at the same time that another luxury, movies, has increased by just 25% is ludicrous. And the overall growth of both industries during that period* is somewhat comparable, while the change in overall quality is not. Basically, arguing that the games industry is so poor that it simply needs to increase prices, by 50% no less, is outright false.

              *I’m talking about the growth in the last 10 or so years, not the 90s when games were still much more obscure.

              1. Redrock says:

                Well, EC’s argument isn’t that price should increase in addition to all the other extra monetization. Instead, they are saying that extra monetization is the alternative to a flat raise. Also, what’s that 50% thing? EC argues for 70 dollars. Me, I’d take a flat raise across the board if it meant no extra bullshit, but that’s me.

                1. Daemian Lucifer says:

                  Third minute: when he gives his opinion on how much a game should actually cost, he says that $90 should be the price.

                  As for not having all the other bullshit: those things aren’t going away, no matter how much the price increases. Also, he says how “we tried DLC, but people weren’t buying it”, like somehow DLC isn’t a common thing being sold these days. Same for the perpetual games. And those two (and the rest of the crap) are happening alongside each other, not one AFTER the other like he is suggesting.

                  1. Redrock says:

                    Ah, well, he is saying what the price could ideally be from the developer’s perspective. He doesn’t actually suggest that price is viable, as I understand it. Movie tickets are also underpriced, which is a well-known fact – ticket sales barely allow cinemas to break even; most of the profit comes from popcorn and stuff. It’s always a delicate balancing act, where the ultimate decision rests on what the consumer is prepared to pay. The whole lootbox thing shows that a lot of people are willing to buy a lot of that stuff, sadly.

              2. Redrock says:

                I feel like I should mention at this point that in my country there is a pretty steep regional discount on Steam and GOG – almost 50%. A 60-dollar game costs the local equivalent of 33 dollars here. But also keep in mind that the average wage in my country is 5 times lower than in the USA, and that consoles don’t have that regional discount, so console games cost a full 60 bucks. Dunno how that context affects my judgement, but I thought it was worth mentioning.

    6. default_ex says:

      Honestly, I want to see proof that we are being undercharged for games when so many of them are an utter mess at launch, which leads one to think otherwise.

      However, I’m a massive fan of “believing in something doesn’t make it true.” We can believe we are undercharged, overcharged, or charged on par all we want. Without data to prove or disprove those claims, it is nothing more than a belief.

      Really, in this case I don’t get it. Yes, Vulkan is a new API. So was every other rendering API at one point or another. The difference I’m seeing, however, is that everyone is jumping on board with the new API before actually learning it.

      This isn’t even like the jump from Direct3D N to Direct3D N+1, or from OpenGL N to OpenGL N+1. What we have here is a completely rethought API: a new approach that discards previous notions and introduces a lot of new concepts. The closest you can come to Vulkan with prior APIs is OpenGL, but the similarity is very shallow – so shallow it’s only similar in style. The important parts like functionality, inter-dependency, and failure states are very alien to previous APIs.

      It’s insane to see companies shipping games built on Vulkan without it being an optional, experimental launch option. It has the potential to be so much better than Direct3D and OpenGL performance-wise, but that requires learning how to use it first, which is no small order. Not only do you have to learn how Vulkan works, you have to engineer your rendering system from the ground up. And good luck finding many people in the industry who understand how Direct3D or OpenGL work under the hood; those of us who dig into the nuts and bolts are often ridiculed until we produce something that proves such a deep dive was worth it.

  7. MichaelGC says:

    Actually, there were still a few spots where the framerate was bad, but they were rare. And strangely enough, they were often in spots where it didn’t seem like the game was drawing anything particularly difficult.

    Is it mining Bitcoin in the background or summat?

  8. BlueHorus says:

    This sounds a lot like my Divinity OS 2 experience at the moment. The game was running okayish, some occasional lag…
    …and then it installed a patch – without asking me – that included the features Even More Lag and Random Crashes to Desktop.
    (Also, every character grunting and bellowing like they’re constipated and furious about it EVERY TIME they use a skill*. ‘Cos that’s never gonna get old…)

    The ‘fight music changes based on whichever character scored a kill’ feature is a neat idea, but seriously game, it’s not worth it if the game freezes for a second every time it happens…

    *Dude, it’s just a healing spell. Calm down, you’ll give yourself a hernia.

    1. MichaelGC says:

      Ah well, at least they’ll have a healing spell ready for when they herniate.

  9. Cybron says:

    Sounds like the usual legendary Bethesda QA at work.

  10. Ciennas says:

    Yeah. I’ve experienced every single issue you mentioned, regardless of whether I’m playing on console or PC. Even without the excuse of mods.

    Bethesda games are just buggy as hell. They’re very lucky that their audience generally has patience for them or finds them endearing, but I feel like that patience has worn thin.

    I do wish ’em luck, and hope someday they’ll actually release the next Elder Scrolls or Fallout – though they’ve admitted that they don’t wanna, and that they don’t wanna let Obsidian back in to tide the audience over, because they’re afraid of getting shown up again.

  11. Joshua says:

    Reminds me of Shamus’s post on Neverwinter Nights 2 and its graphics debacle. Sure, the game looked good at the time, but not that good. I remember getting that game as a pre-order on launch day, installing it, and then texting my wife (she was working, I was not) something to the effect of “we may not be able to play this game, as it’s less than 1 FPS”.

    1. default_ex says:

      I played the hell out of NWN2 – just played through it again recently, and that’s one thing I somehow almost forgot about. I remember doing a lot of tweaking to my system to handle it at an acceptable (not necessarily good) framerate, as well as the headache that Windows’ freshly introduced HAL caused for that game when used with a hyper-threaded processor. I was blown away this time around because it almost never dipped below 60 FPS, and when it did, it was totally understandable (tons of spells going off, lots of actors, and tricky scenarios for the AI to handle). At some point within the first couple of hours the novelty of finally playing the game at full speed wore off, despite it hugely affecting the strategies I was using (I was no longer afraid to dump lots of AoEs in a single battle and could trust the AI to make the choices my options configured it to make).

      Sadly, that game suffers from piss-poor optimization. Fire up PIX and run the game, then compare it to any other game from around that time: it’s pretty clear no one working on it had any real clue as to how much a draw call cost, or what the optimal loading strategy for dynamic vertex/index buffers was. One of those is excusable, as it was a new technology and we were still experimenting with optimal placement of dynamic assets in graphics card memory; the driver situation didn’t help much back then either, as it often required dodgy hacks to ensure a particular loading order. The draw-call thing was just stupid, as it had been well known since long before that game released that you want to pack as much as you can into a draw call to avoid bottlenecking the AGP bus. Now that PCI-E buses have blown AGP out of the water for what it takes to bottleneck them, and drivers are pretty good at determining optimal placement of dynamic and static resources, we’ve blasted right past both problems with the foundation it was built upon.
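      To make the draw-call point concrete, here’s a minimal sketch of the difference in modern OpenGL (not the Direct3D that NWN2 actually used, and the names – Prop, props, batchedVbo – are made up for illustration; a GL context, loader, and shader setup are assumed to exist already):

      #include <vector>
      // assumes a GL header/loader (e.g. glad or GLEW) is already included and initialized

      struct Prop { GLuint vbo; GLsizei vertexCount; };

      // Naive: one draw call per prop. Every call is a round trip through
      // the driver, and that per-call overhead is what bottlenecks the bus.
      void drawNaive(const std::vector<Prop>& props) {
          for (const Prop& p : props) {
              glBindBuffer(GL_ARRAY_BUFFER, p.vbo);
              glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
              glDrawArrays(GL_TRIANGLES, 0, p.vertexCount);
          }
      }

      // Batched: every prop's vertices were packed into one big VBO up
      // front, so the whole set goes to the GPU in a single call.
      void drawBatched(GLuint batchedVbo, GLsizei totalVertexCount) {
          glBindBuffer(GL_ARRAY_BUFFER, batchedVbo);
          glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
          glDrawArrays(GL_TRIANGLES, 0, totalVertexCount);
      }

      Same triangles either way; the batched path just trades a little up-front copying for far fewer trips through the driver.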

  12. Dev Null says:

    I get why Shamus does it – he’s a hip, cool game reviewer who has to keep up with the zeitgeist – but this right here is why I essentially refuse to play any game that was released less than a year ago. There’s plenty of year-old stuff that has had the bugs mostly beaten out of it for me to play (and it never hurts to prop up the tail of their sales graph, so they know that it’s still worth fixing the bugs after week 2…)

    1. Redrock says:

      I try to follow that rule too, but I just couldn’t help myself with Wolfenstein 2. Been kicking myself ever since.

  13. Justice Rains says:

    There has to be some degree of misunderstanding of the new technology here.
    Switching to Vulkan when playing Doom 2016 improved its performance for me by a huge amount – so much so that I couldn’t even play it on the default renderer anymore. I was able to run the game on Ultra settings using Vulkan and it ran better than Medium-to-High on the default. I’m talking a 50 FPS average on Medium-High, and 70-ish FPS on all-Ultra while using Vulkan. I’m sure differences in hardware are a factor as well, but that’s my experience on my PC, and I’m sure there are others who had similar.
    I couldn’t even run Wolfenstein 2. I tried multiple times with different solutions, and I had to return it when it got close to 2 weeks since purchase.

    1. Daemian Lucifer says:

      This is just a blind guess, but could the reason be that Dewm was finished and polished before it was converted to Vulkan, while Wolfenstein started without it, switched mid-production, and then had to be rushed a bit to meet the deadline?

    2. Richard says:

      The essence of – and the need for – Vulkan is multi-core/multi-threaded CPUs.

      OpenGL, OpenGL ES and DirectX 11 all present the CPU with a single pipe, down which all commands must be stuffed.

      Simple example – you have two blocks of stuff to draw. Call them X and Y.

      In OpenGL (etc), the CPU says:
      “Use data X. Draw using method A. Use data Y. Draw using method B”.

      However, a modern GPU might be capable of doing both sets of drawing at the same time.

      In Vulkan, the CPU can say both at once:
      CPU Core #1 “Draw data X using method A”
      CPU Core #2 “Draw data Y using method B”

      However, with great flexibility comes great opportunity to make a horrible mistake…
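      To put rough shape on that (a minimal sketch, not anything from the actual game: the device, queue, pipelines, and one command buffer per thread – each allocated from its own pool, since Vulkan command pools aren’t thread-safe – are assumed to exist, and render-pass setup is omitted entirely):

      #include <thread>
      #include <vulkan/vulkan.h>

      // Each CPU core records its own command buffer independently.
      static void record(VkCommandBuffer cmd, VkPipeline pipe, uint32_t verts) {
          VkCommandBufferBeginInfo begin{VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO};
          vkBeginCommandBuffer(cmd, &begin);
          vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipe);
          vkCmdDraw(cmd, verts, 1, 0, 0);   // "draw data X using method A"
          vkEndCommandBuffer(cmd);
      }

      void buildFrame(VkQueue queue, VkCommandBuffer cmdX, VkCommandBuffer cmdY,
                      VkPipeline methodA, VkPipeline methodB) {
          // Core #1 and core #2 work in parallel...
          std::thread t1(record, cmdX, methodA, 3000);
          std::thread t2(record, cmdY, methodB, 3000);
          t1.join();
          t2.join();

          // ...and both buffers go to the GPU in one submission.
          VkCommandBuffer cmds[] = { cmdX, cmdY };
          VkSubmitInfo submit{VK_STRUCTURE_TYPE_SUBMIT_INFO};
          submit.commandBufferCount = 2;
          submit.pCommandBuffers    = cmds;
          vkQueueSubmit(queue, 1, &submit, VK_NULL_HANDLE);
      }

      Note that nothing here stops one thread’s buffer from referencing state the other is tearing down; the API won’t catch it for you, which is where the “horrible mistake” part comes in.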

      1. Justice Rains says:

        This all makes sense, especially considering Dewm was not released with Vulkan support out of the box; they enabled it later. Whether they spent that time making sure it worked properly or implementing it in the first place doesn’t matter – what does is that it worked as intended and was an improvement for the game.

      2. Addie says:

        It also abandons the old-school glBegin()/glVertex()/glEnd() style of commands, in favour of issuing commands much more similar to how modern graphics cards work, which mostly means allocating buffers with precalculated values in an appropriate memory region (of which a graphics card has several, with different properties – some prioritise faster reads over slower writes, for instance) and preparing a draw list that visits all of them once they’ve been loaded in. The graphics drivers become much simpler and lower-overhead on account of the simpler-to-issue but more-complex-to-use commands, too. A toy sketch of the contrast is below.

        However, we’re not all John Carmack or Tiago Sousa. Managing lots of complicated interconnected memory buffers, and having them visited by several different threads each frame, leaves a lot of opportunity for shooting yourself in the foot. Which they seem to have achieved here.
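        For the before-and-after (a toy sketch: the first half assumes an old compatibility context, the second assumes a core context with a shader program already bound; none of it is from any real engine):

        // Old immediate mode: the driver is handed one vertex at a time.
        glBegin(GL_TRIANGLES);
        glVertex3f( 0.0f,  1.0f, 0.0f);
        glVertex3f(-1.0f, -1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
        glEnd();

        // Buffer style: precalculated values are uploaded once into a region
        // the driver chooses, then referenced by every subsequent draw.
        const float verts[] = { 0,1,0,  -1,-1,0,  1,-1,0 };
        GLuint vbo = 0;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW); // upload once
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
        glDrawArrays(GL_TRIANGLES, 0, 3); // re-issue cheaply every frame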

        1. Richard says:

          glBegin() and friends were deprecated in OpenGL 3.0, and removed from the core profile in OpenGL 3.1; OpenGL ES never had them at all.

          They’re also very broken on current hardware/drivers.
          For example, GL_SELECT doesn’t work with any nVidia cards you can currently buy. Probably not on AMD either.

  14. poiumty says:

    Bought myself a GTX 1070 for Christmas. You’d think I could run Wolfenstein II without any issues on that and an i5, right?

    With the graphics turned up to “sensibly high” (ultra textures, high most other things, medium shadows) I got sub-60 FPS all the time. Sometimes going as low as 20. I set everything down as far as I could… and I got sub-60 FPS all the time. Sometimes going as low as 20.

    My mind, as they say, is full of fuck. Eventually I just gave up and played it that way. I’m used to super-low FPS anyway – I was too poor to afford PC upgrades for most of my life, so 20 FPS is in the “still playable” range for me. But I can’t say it wasn’t disappointing.

    As for the difficulty, honestly I had way more problems when you have 50/200 health/armor than when you get the standard 100/100. The game had a reverse difficulty curve – it was much easier at the end than at the beginning. I didn’t even know I was finishing the game until the credits rolled.

    1. Addie says:

      Until I’d read this, I would have agreed with you…

      http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/

      Adrian uses some debugging tools to break down how the engine draws a scene, a few draw calls at a time, including which information is stored in which buffers, and so on. I found it very interesting.

      Essentially, the id Tech 6 engine uses motion blur to implement Temporal Anti-Aliasing, which means it gets both effects at high quality for a very low rendering cost, which is extremely clever.

      1. Addie says:

        (this should have been a reply to Decius, below)

    2. Redrock says:

      Interesting. I have the same setup as you, an i5 and a 1070, and I had pretty solid 60 fps throughout. What’s your resolution?

      1. poiumty says:

        1080p.

        Maybe it’s the fact that the i5 is a relatively old 4590. But somehow I doubt it.

  15. Decius says:

    I believe that there is never any reason to implement motion blur in software. The monitor and eyeball-visual cortex pipeline already make it harder to see moving objects clearly.

    1. Daemian Lucifer says:

      Motion blur can be implemented well. The problem is that, for some unfathomable reason, it’s applied to the entire screen, when it’s just the edges that should be affected.

    2. Mephane says:

      I always disable motion blur. At best it’s a useless and slightly annoying artifact from cinema transported into video games; at worst it’s a terrible ugly hack that ruins all screenshots with heavy movement.

  16. “30 seconds of unskippable screens”

    They never learn. I actually would be okay with the logos being forced the first time (you gotta show off who made the game, after all) if, the next time around I started the game, it showed a “Skip” symbol and made it possible to skip, and provided an option in the settings that I could toggle to disable the intro logos.

    Logos could also be shown when the game goes into attract mode (staying on the menu screen for too long), playing as a logo loop, etc.
    Or do as some games do and make the logos and stuff part of the start of the first chapter; that way the player only sees them once (or only when they start a new character).

    1. Mephane says:

      They never learn.

      Actually, they do. Just not the lessons we want them to. Some have started putting extra checks into the game’s launch procedure to ensure the stupid logos run, and they refuse to launch the game at all if, for example, some of the video files happen to be missing.

      1. Sleeping Dragon says:

        Ah yes, happen to be missing. Or happened to be somehow randomly replaced with empty files with the exact same name as the original video file. Odd how that sometimes happened.
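        Presumably something like this hypothetical guard (a sketch of the behaviour being described, not any real launcher’s code; the paths and names are invented):

        #include <filesystem>

        // Refuse to launch if the intro videos were deleted -- or swapped out
        // for empty files with the same names, per the trick above.
        bool introVideosIntact() {
            namespace fs = std::filesystem;
            const char* required[] = { "media/logo_publisher.bk2", "media/logo_engine.bk2" };
            for (const char* path : required) {
                std::error_code ec;
                if (!fs::exists(path, ec) || fs::file_size(path, ec) == 0)
                    return false; // missing, or suspiciously zero bytes
            }
            return true;
        }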

  17. James says:

    Apparently the game was balanced around the mouse.

    FIFY

  18. Neil Roy says:

    Vulkan, ugh, no thanks. I finally learned how to do OpenGL, then I had to unlearn all of that and learn modern OpenGL shaders. And now Vulkan, which is OpenGL with all the easy parts ripped out and the newer parts made even more complicated than modern OpenGL. I’m like, screw that, I’m going back to fixed-pipeline OpenGL, just in protest! ;)

    1. Richard says:

      If you’re not doing high-performance, bleeding-edge GPU mangling, then you don’t need Vulkan.

      Stick with OpenGL/OpenGL ES “Core” profile.

      For extra joy, desktop OpenGL (4.5 and up) is essentially a superset of OpenGL ES 3.1

      So learn OpenGL ES 3.1 and you can work with mobile phones and desktop, all in one glorious whole.

  19. PPX14 says:

    Imagine a future where AAA PC games release as 1-hour demos, collecting the performance and hardware information (opt-in) of thousands of users and thus identifying and fixing any issues.

    As I wrote that “idea”, I realised I was describing a beta!!

    1. poiumty says:

      Can’t have people play *CONTENT* that they didn’t *PAY FOR*! It’s actually detrimental to sales! Gotta exercise that sunk cost fallacy while we can!

      Betas exist for multiplayer games for a reason. Those are the only games that people generally play more once they’ve tried them. And a lot of it has to do with progression and grind systems.

    2. Redrock says:

      Don’t Steam refunds kinda work that way now? I mean, mass refunds are a good way to provide incentive for fixing whatever is wrong.
