Experienced Points: The HairWorks Debacle Explained

By Shamus Posted Tuesday Jun 2, 2015

Filed under: Column 100 comments

This week I talk about the whole NVIDIA vs. AMD pissing contest as it applies to the Witcher 3 “HairWorks” feature.

I actually wanted to do a rant on how both companies are staggering morons when it comes to how they name and market their products. But to illustrate how bad they are, you have to explain the naming systems of product lines, and I can never remember how those work because my brain refuses to store large quantities of unmitigated gibberish. I keep trying to educate myself about it, but my eyes glaze over and I start drooling on myself. Hours later I’ll regain my senses and find myself scrolling through cat pictures on Facebook. It’s so dumb and confusing that I can’t follow it well enough to explain it for the purposes of describing why it’s so dumb and confusing.

This is also why my graphics card is as old as it is. I kind of feel like, if you’re asking me to put down that kind of money, at the bare minimum you need to make it as simple and as painless for me as possible. People shopping for a new iPhone don’t have to worry about choosing the correct GSM frequency bands. For crying out loud.

Having said all that, I’m delighted at how much things have improved. Back at the turn of the century, I used to need to worry about graphics hardware every two years. Now graphics hardware has a sort of console-like lifespan where you can go four years or so before you need to go through the ordeal of buying a new one. Even nicer is that it’s less painful to be on the low end of the tech curve. Oblivion was a nightmare. I’m running the Witcher 3 on the lowest settings, and it’s all good. The grass ends about ten meters from me, but other than that it looks like Shadow of Mordor from last year.

 



100 thoughts on “Experienced Points: The HairWorks Debacle Explained”

  1. Daemian Lucifer says:

    Off topic:

    I wish there was a database that listed the various documented/canonical power levels of superheroes, IMDB style. Would solve many debates.

    There is:
    http://www.superherodb.com

    It even allows you to compare Rorschach to Batman to Punisher, to see who growls the most.

    1. Starker says:

      Bleh, it doesn’t even have The Tick.

      1. Daemian Lucifer says:

        Why don’t you add him then? Contact Galactus over there (the creator) and give him the info you think is correct. It is a growing database, after all.

    2. Bloodsquirrel says:

      Oh, how naive.

      Having spent time on versus forums, I can tell you that there are some completely deranged ideas floating around about what “power levels” (also known as biggatons) various factions/characters are at, some of them widely accepted by highly vocal and flame-happy fandom factions.

      You’re about as likely to get a good, objective analysis on one of these sites as you are to find the secret to turning lead into gold.

      1. Comparing heroes even toe-to-toe is rather missing the point of comic books, since a lot of what happens in comics is plot-related. I mean, Kitty Pryde/Shadowcat/whoever didn’t figure out (or try) using her phasing powers to pull someone’s heart out of their chest until recently, when the plot required it.

        Then there’s heroes like the Flash who seem to forget everything they’ve done with their powers previously when the narrative demands…

        1. Mike S. says:

          I think “who’d win?” debates are sufficiently embedded into the genre that it’s hard to call them missing the point. The fact that the answers are inherently indefinite, and that on any given Sunday Aunt May can save the world from Galactus, just ensures that they’ll continue forever unresolved and still arguable.

          Even ignoring the long history in the fandom, Marvel’s Silver Age business model was built in part on its creating regular hero-on-hero confrontations that purported to answer those questions. (Though of course they mostly didn’t, as the heroes stopped or were interrupted and went off to face the real threat while leaving the question open, or had different results in the rematch). And one of the lasting legacies of Frank Miller’s The Dark Knight Returns (remember when you could just say “the Dark Knight” without ambiguity?) was more or less permanently changing which character was the underdog in Superman vs. Batman fights.

          (At least to comics fans. My experience of people outside the fandom is that they still see that as roughly equivalent to Bambi vs. Godzilla, with their money on the Kryptonian.)

          I agree that the indefiniteness of the powers is one reason that it’s hard to do any sort of systematic comparison. (Though as you allude, narrative is a bigger one– the battle is generally to the strong and the race to the swift, but stories are built on surprises and “if this just works…!”) In a lot of cases, it’s necessarily built into the character. If you tried to build the Flash in a numbers-heavy RPG system, you’d inevitably wind up with a character who’d either be too fast for a Captain Boomerang to give him trouble, or too slow to do the cool speed feats that distinguish his character. You could build a consistent speedster, but he wouldn’t look a lot like any incarnation of the Flash.

          1. Deoxy says:

            (At least to comics fans. My experience of people outside the fandom is that they still see that as roughly equivalent to Bambi vs. Godzilla, with their money on the Kryptonian.)

            That’s not really right – it’s more like a big, nasty grizzly bear vs. Godzilla-who-refuses-to-ever-kill-anyone-ever-even-accidentally.

            It’s only even vaguely interesting because Godzilla self-limits. All the mega-super-nifty-tech in the world can’t save you from something that moves at near the speed of light and can melt you with a look from outside the range of any kryptonite you have on you. Or smash you with a small planetoid, or whatever else the situation calls for.

            Comics routinely under-value ridiculous speed. Go watch the scene from Megamind where MetroMan goes to “clear his head”, then think what he could have done with all that “time” if he was going on offense. The fights with Titan (Tighten, ha) are only interesting/possible because he’s too dumb to superspeed.

            1. Mike S. says:

              I entirely agree with you: Superman’s self-imposed limits are a critical part of his character. Anything that begins with observing that he can vaporize his opponent or drop asteroids on him or throw him into space, other than to follow it up with why that’s not on the table, is talking about someone else. (At least after pretty early in the Golden Age.)

              But when I speak to, e.g., relatives who aren’t immersed in this stuff, they don’t approach it from that direction. It’s pretty much “obviously Superman wins”, full stop. For example, my brother posted a link to a video that edits the Batman v Superman trailer so that it ends (fatally) before Batman’s finished his sentence. (“Do you–” fzap!)

              I likewise agree super-speed can only be used in a superheroic context by being deliberately inconsistent with it. Try to do it straight, and you’re either stuck with speedsters too low-powered to be satisfying, or a Flash (or even Quicksilver) who can’t be challenged except by other speedsters. They can be overpowered by or useless against sufficiently cosmic opponents (punching Darkseid in the face a million times just hurts your hand, and makes him grimace slightly and send the Omega Beams to follow you till they catch you), or they can have an I Win button against everyone else. But prolonged cinematic conflict is hard to arrange.

  2. Chauzuvoy says:

    Hey, Shamus. Is this supposed to be filed under Diecast?

  3. Paul Spooner says:

    I kind of wonder if Nvidia paid CD Projekt RED under the table to exclusively use HairWorks for Witchomanz Mk. Tres. It would explain a lot of the apparent oversights in the development.

    1. mhoff12358 says:

      It took me a couple re-reads to figure out what you meant by “Witchomanz Mk. Tres” because I kept expecting it to be a TressFX joke.

      1. Paul Spooner says:

        They really missed a golden opportunity there. Should have gone with TressFX for the title pun combo finisher.

      2. Galad says:

        I still don’t get it :<

        1. krellen says:

          “Tres” is “Three” in Spanish. Does that help?

    2. AileTheAlien says:

      My guess is: Not enough to compensate for the degradation of CD Projekt Red’s public image.

    3. Humanoid says:

      I believe it’s safe to say the answer is Yes*.

      * CDPR issued a statement to say they were not paid by nVidia. Which is likely true in a literal sense: at no point did nVidia write a cheque for CDPR to deposit in their bank account. But sponsorship can and does take many forms, and the most obvious example here is the bundling of the game with nVidia’s mid-to-high end products. It can take the form of joint marketing, having staff on loan, or an agreement to cover some third-party costs.

      It stands to reason that a similar non-cash deal would have occurred when they agreed to include additional and exclusive physical swag with XBone CE copies of the game, which PC and PS4 owners could not get.

      1. Peter H. Coffin says:

        Or even just “free support”. Having someone show up from nVidia and spend a day teaching people about the API and having a stack of sample code that works would be a tremendous incentive to include something.

  4. The Rocketeer says:

    In situations like the one you describe in Witcher 3, where you turn a feature all the way down like that, I often wish you could just turn it off altogether, but sometimes that option doesn’t exist.

    For instance, it looks really strange to turn down grass all the way and have a roving shadow of grass that follows you around. If the radius gets small enough that you consciously notice it, it just looks better to turn it off altogether. Likewise, if shadow quality gets turned down too low, you get the bizarre sight of realistic people haunted by blocky, blurry mannequin shadows, which is way creepier and more jarring than just having people with no shadows.

    Thankfully, my computer is pretty powerful, or at least powerful enough to run what I want without much compromise. (I’m not optimistic that it would run Witcher 3 very well.) But it occurs to me that whenever I do fiddle with graphics, there’s usually not an option to just turn a lot of these things completely off.

    1. Andy_Panthro says:

      It probably doesn’t happen much these days, but I’m sure I remember games where you could turn off shadows and other details like grass. Mount & Blade has some pretty good options to turn detail right down, which is handy if you want battles with over 200 soldiers without it becoming a slideshow.

      1. MadTinkerer says:

        If you get the Ultima Underworld games, you can enjoy the option of turning off ceiling textures, ceiling & floor textures, or all surface textures and just have shaded polygons and scaled sprites. Just in case you are trying to play on a 386 and want a better framerate.

    2. Humanoid says:

      Ironically, a criticism of the HairWorks implementation is that it’s all or nothing, when in truth HairWorks is effectively just a bundling of tessellation and some other miscellaneous graphical features to create the effect. As per Shamus’ article, CDPR were handed a black box and were not able to adjust the specific parameters of the tessellation level used, which was locked in at an extreme 64x – frankly an absurd amount of tessellation. There is no sensible reason for this not to be customisable, which fuels the sabotage theories. I’d call it a conspiracy theory, but the term is somewhat redundant when similar things have happened before*.

      In a further twist of irony, AMD users are actually able to override, via the driver control panel, the level of tessellation any game uses. Setting 8x tessellation, for example, provides most of the visual benefit of going the whole hog at a fraction of the performance hit. nVidia’s control panel can’t do this, so its users have to jump through more hoops to achieve a similar compromise, or otherwise cop the full performance penalty of HairWorks.
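
      To put rough numbers on why 8x gets you most of the way there, here’s a toy back-of-the-envelope sketch. The strand count is made up, and it assumes the geometry load scales roughly linearly with the tessellation factor for strand-style tessellation, which glosses over shading and anti-aliasing costs entirely:

        # Toy numbers only – not real HairWorks profiling data.
        # Assumes each rendered strand is split into tess_factor segments,
        # so vertex count per strand scales roughly linearly with the factor.
        GUIDE_STRANDS = 20_000  # hypothetical strand count, purely illustrative

        def hair_vertices(tess_factor):
            """Rough vertex count if each strand becomes tess_factor segments."""
            return GUIDE_STRANDS * (tess_factor + 1)

        for factor in (8, 16, 32, 64):
            v = hair_vertices(factor)
            print(f"{factor:>2}x -> ~{v:,} hair vertices "
                  f"({v / hair_vertices(64):.0%} of the 64x load)")

      On those (made-up) numbers, 8x pushes only about 14% of the geometry that 64x does, which lines up with “most of the benefit at a fraction of the cost”.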

      * With Batman: Arkham Asylum, nVidia helped to implement anti-aliasing in the game, amongst other things like hardware PhysX support. PhysX is of course proprietary, but that’s reasonable enough: being a hardware feature, AMD cards literally couldn’t support it (it can still run in reduced mode on the CPU). But the anti-aliasing was a huge controversy, because the solution was a fully generalised one, able to run on any hardware – except that nVidia had the developers implement a software lock on the feature, such that if the game detected an AMD card when it was launched, the feature was locked out despite being fully functional. Indeed, if you messed around with a Windows setting to trick the game into thinking an nVidia card was running, the feature worked perfectly on any hardware.

    3. Mephane says:

      And then there is lens flare. Most games look much better with it turned off. (I am looking at you, Mass Effect, oh wait you don’t let me turn it off…)

      Similarly, depth of field or motion blur, more often than not activating these options is a downgrade to visual quality, at least in my view.

      1. Incunabulum says:

        Motion blur, certainly – but I like depth of field, especially for games with large open areas and long view distances.

        Skyrim, for instance, is a game where I can jack up the view distance pretty far, but it looks really bad because you can see the perspective tricks they used in the vanilla game to hide its actual size. If you can clearly see a large distance away, you realize that that valley isn’t big and the fort on the other side is really only a hundred yards away (or of cyclopean proportions).

        1. Talby says:

          Do you think you could post a screenshot of this? I didn’t know they used tricks to make the world look larger than it is, that’s interesting.

      2. Zak McKracken says:

        I guess that’s what happens when people get a new toy: They play with it until they either get bored or figure out a way to use it that makes sense.

        When I got my first DSLR camera, I kept the aperture open for all my pictures because I just loved DoF so much. It took me a year or two until the effect wore off and I started to understand why sometimes you would want a picture where almost everything is in focus and you’re not forced to just look at some random detail in a sea of bokeh…

        I think DoF can be a great help if you want to save graphics RAM and processing power by going to very coarse LoD in the background but still look as if your game had enormous draw distance. Certainly looks better than some older games where you can see that the skybox is an actual box, and the edge of the world is just that, and is only meters away from the part of the map you can walk on.

        It can also be a good thing in cutscenes (as it is in movies), to focus the viewer’s attention on something (as well as, again, allowing your graphics card to focus its processing on the foreground), or create a certain mood (Some of my old detail-grabbing shots were actually not bad!).
        During gameplay, blurring anything that you may possibly interact with within the next few seconds … ugh. That is not an aesthetic thing, it just makes gameplay worse.

        So I think developers will have to learn to use DoF in creative and useful ways, and it will take time.

        1. Mephane says:

          My experience with the depth of field effect so far is that it’s a post-processing effect, and doesn’t improve performance at all by rendering at lower LOD or anything like that. It just introduces a focal plane for the camera, and, like lens flare, is thus an effect that comes from movies but usually makes little sense in a game where what you see is supposed to be through the eyes of your character.

          And at the very least the devs could just make it an option. Everyone has different preferences. I understand that some people love lens flares, or depth of field, or motion blur; I hate them. On the other hand, I consider antialiasing absolutely, non-negotiably mandatory, but there are people who don’t care about it and just turn it off if they want to boost their FPS.
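
          For what it’s worth, here’s a minimal sketch of what such a post-process pass boils down to (made-up parameters, not any particular engine’s implementation): read each pixel’s depth from the depth buffer, compare it to a chosen focal plane, and widen the blur the further the pixel sits from that plane. Nothing is re-rendered at a lower LOD; it’s purely a screen-space blur after the fact.

            # Illustrative sketch of a post-process depth-of-field ramp.
            # All parameters are arbitrary; real engines use their own units and curves.
            def blur_radius(pixel_depth_m, focal_plane_m=10.0,
                            focal_range_m=4.0, max_radius_px=8.0):
                """Pixels near the focal plane stay sharp; blur ramps up with
                distance from it and is clamped at max_radius_px."""
                distance = abs(pixel_depth_m - focal_plane_m)
                t = max(0.0, distance - focal_range_m / 2) / focal_plane_m
                return min(max_radius_px, max_radius_px * t)

            for depth in (2, 8, 10, 12, 30, 200):
                print(f"depth {depth:>3} m -> blur radius {blur_radius(depth):.1f} px")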

          1. PowerGrout says:

            Pretty sure you’re right. DoF is resource-intensive enough that use cases for hiding low LoD (and thus improving performance) are likely very rare, and probably so coarse as to be of very marginal merit anyway. Older-generation console games, perhaps, where an effect can be locked at a specific strength to exactly suit the visual aesthetic and hardware capabilities on hand.

            1280*1024 alias apatheia 4life bro.
            “the only winning move…” etc

        2. Peter H. Coffin says:

          Plus DoF in the real world varies a lot more than most artists in various fields use it. It exists between objects that are a meter away versus three meters away, but once you get out to 4-5 meters, people with good uncorrected vision will also have objects 10, 50, and 1000 meters away in focus as well. Blurring distant objects at that point is either atmospheric (in which case, watch out, space games) or literally myopic. (And I won’t even go into what merry hell it plays with comfortable viewing of stereo images such as through the Rift… For stereo, you’re far better off erring on the side of greater DoF than lesser.)
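
          The standard hyperfocal-distance approximation backs this up. Here’s a quick sketch with eye-ish numbers that are only ballpark guesses (treating the eye as roughly a 17 mm lens at about f/4 with a generous circle of confusion); the formula itself is the usual photographic one:

            # Hyperfocal distance: H = f^2 / (N * c) + f
            # f = focal length, N = f-number, c = acceptable circle of confusion.
            # The eye-like inputs below are rough guesses for illustration.
            def hyperfocal_m(focal_length_mm, f_number, coc_mm):
                h_mm = focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm
                return h_mm / 1000.0

            H = hyperfocal_m(focal_length_mm=17, f_number=4, coc_mm=0.02)
            print(f"hyperfocal distance: ~{H:.1f} m")
            # Focusing at H keeps everything from roughly H/2 out to infinity
            # acceptably sharp – i.e. past a few meters, it's all in focus.

          That comes out to a hyperfocal distance of only a few meters, which is why the 10, 50, and 1000 meter objects all look sharp at the same time.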

  5. Hal says:

    This almost plays out as a shadow argument for consoles, since there’s rarely anything to consider between hooking it up to the TV and dropping the disc in the drive.

    1. Incunabulum says:

      Once upon a time, in a kingdom far, far away, PC games had the edge in *gameplay* quality and being able to, you know, *save anywhere*, in addition to graphics goodies.

      Nowadays even the AAA games are made for the consoles, with console-level graphics, and – worst of all – a strong push towards simplifying the games so they have the widest appeal (willingly sacrificing depth in the process). With that becoming the new normal, ‘it just works’ has become a strong selling point for consoles.

      This is why AAA PC gaming is dying and us PC gamers now have to spend time trawling through Early Access lists to find a game that might be good in a couple of years – *if* it’s ever finished.

      1. The Rocketeer says:

        *cannot tell if satire*

    2. Canthros says:

      “It just works” is always the fundamental argument in favor of a video game appliance (i.e., a game console) versus a generalized computer – especially a PC made of assorted components from varied vendors, which is necessarily a mismatch for the platonic ideal of a personal computer that software is developed against and then ported away from (in the form of abstractions, error-handling, etc.).

      I don’t even see how that’s controversial.

      1. Humanoid says:

        Releases like AC:Unity are doing a good job of dispelling the “It just works” mantra though. :P

        1. Canthros says:

          “It just works” is grotesquely undercut in an age of day-one patches and videogame consoles that are practically indistinguishable from commodity PCs. It’s still the fundamental argument for a console over a PC, at the consumer end of the deal. It just isn’t as true now.

  6. Incunabulum says:

    “Flowing hair” isn’t exactly the next-gen killer app that gamers are looking for. If it wasn’t there, most of us wouldn’t complain or even notice.

    I turned it off – got 10 FPS back – and didn’t notice any difference whatsoever, even up close.

    And that’s with running the game with everything else maxed out.

    1. AileTheAlien says:

      Do they just have static animations for the hair if the option is turned off?

      1. Syal says:

        It removes the hair entirely.

        1. Incunabulum says:

          Not for me – I still had flowing (moving) hair.

          It’s possible that this is intended for use on lower settings? As a way to increase hair quality even if you can’t run the rest of the game on high settings?

  7. WWWebb says:

    I feel I should point out that even though AMD seems like it’s a distant second to NVIDIA, in the world of people who aren’t buying the Witcher 3 for the shininess, both of them are FAR FAR FAR behind the market share of people just using the integrated Intel graphics that came with their motherboard.

    1. John says:

      Yay! I’m in the big group! I feel so validated.

    2. Primogenitor says:

      But most “PCs” are office machines that don’t do any sort of graphics. If you look at the Steam hardware survey http://store.steampowered.com/hwsurvey/videocard/ and add ’em up, it’s about 46% Nvidia, 22% AMD/ATI, and 18% Intel.

      Which is actually a lot closer than I expected – huh.
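
      For what it’s worth, the tallying is just grouping the per-card rows by vendor and summing their shares – a sketch with placeholder numbers, not the actual survey figures:

        # Placeholder rows, NOT real Steam survey data – just showing the grouping.
        from collections import defaultdict

        survey_rows = [
            ("NVIDIA GeForce GTX 970", 3.1),
            ("NVIDIA GeForce GTX 760", 2.4),
            ("AMD Radeon R9 280",      1.2),
            ("Intel HD Graphics 4000", 2.0),
            # ...hundreds more rows on the real page
        ]

        totals = defaultdict(float)
        for name, share in survey_rows:
            vendor = name.split()[0]   # crude: first word is the vendor
            totals[vendor] += share

        for vendor, share in sorted(totals.items(), key=lambda kv: -kv[1]):
            print(f"{vendor}: {share:.1f}%")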

      1. Peter H. Coffin says:

        The top-level results (one level up from your link) are showing

        52% nVidia
        28% ATI
        20% Intel and Other

        and that proportion looks pretty steady for the past two years. Which looks pretty credible for Steam’s audience of “People who play games on computers”.

  8. Licaon_Kter says:

    TressFX did have the same issues on nVidia at Tomb Raider’s launch, and Deus Ex: Human Revolution had an actual hardware check that disabled 3D stereo on non-AMD cards at launch. Yeah, they patched those out later, hence painting the green team as evil _now_ is rather meh. Not to mention that they got flak from their own customers with non-latest-generation cards that showed lower performance; they vow they’ll get a driver update out for that, we’ll see.

    Also, in the long run, how desperate is AMD’s situation that they lament over one freaking single game, as if their whole user base can’t live without it and the sky is falling otherwise?

    1. Orillion says:

      Pretty desperate. People are willingly handing Nvidia a monopoly.

      1. Cilvre says:

        I’ve been standardized on AMD and ATI for a bit now; on the odd occasion I’ve tried to step out into Intel or Nvidia land, I’ve always ended up burned and left with something overpriced to try and resell. I understand each person has had different experiences, but I won’t bother trying to step out of those sections anymore.

        1. Orillion says:

          I’ve literally never used an Nvidia card, but I had to, for the first time since my Pentium 4 in the early(ish) 2000s, swap over to an Intel CPU after several failed attempts at updating my system to a newer AMD chip. They don’t seem to do gamer-friendly CPUs anymore.

          1. Licaon_Kter says:

            I switched to an Intel CPU last year, for the same reason, AMD can compete in only some tasks there.

            Regarding GPUs, I went full nVidia 8 years ago after waiting a long time for a proper AMD Linux driver, which still hasn’t come.

  9. Jabrwock says:

    “People shopping for a new iPhone don't have to worry about choosing the correct GSM frequency bands. For crying out loud.”

    Except… you do.

    https://www.apple.com/iphone/LTE/

    There’s an iPhone 6 that works on some US networks (and all networks in Canada for some reason – probably regulatory, in that they can’t exclude a phone from competitors’ networks). An iPhone 6 that works on nearly every network in the world. And one that only works on one network in China.

    WTF.

    1. AileTheAlien says:

      Well, phone manufacturers were legislated into building phones that just charged with USB instead of proprietary bullshit*, so maybe we can do the same with international-compatibility in phones. :)

      * Except Apple, who somehow got an exemption. At least their new laptops will have USB Type-C connectors, so maybe their new phones will too?

      1. Ranneko says:

        All the information I can find about the EU charger regulations says that they only fully come into effect in 2017. So Apple has another couple of years before it has to move.

        I would like to look a little more closely at the rules, because I am hoping that there is enough flexibility to allow the industry to move to USB Type-C, rather than leaving microUSB as the permanent standard.

        1. Incunabulum says:

          And thus the major problem with *legislating* standards is revealed.

          1. AileTheAlien says:

            In general, yes. However, USB was always designed as a standard not ruled by any one company in particular, and also to be improved upon in the future. Like, they have version numbers on USB standards. So, even if the legislation was literally written as “USB 2 with micro B connector” they could easily update the law to “USB 3 with type C connector”.

            1. Incunabulum says:

              Yes, USB is a *designed* standard – but it’s also now a *legislated* standard. IOW, in some areas it’s a legal obligation to use it.

              Depending on how the legislation is written, it can be very difficult to change to another standard (or even improve on the existing one) as situations change – even if, for some reason, USB is a sub-optimal standard for your particular product.

              And it gives a decent boost to the power of the standards agency that now has the force of law behind it. There’s that much less incentive to be the best when people are forced to use it rather than choose to.

              All that to ‘fix’ what was a non-problem to start with.

              1. Jeff says:

                You’re either very young, or you’ve repressed your old memories of having a dozen different adapters and cables for similar devices.

                1. This, oh so very much this. I have a box full of cell phone wall-warts of various sizes, shapes, and voltages that I keep just for electronics projects, but I’m sure most people throw them out.

                  Then there’s the fun of trying to charge a device you can’t find the cable for and the manufacturer (or even a knock-off company) demands an arm and a leg to replace. Then when you get another phone/device, you have to toss out all of the cables in your home, the ones you take with you on vacation, the one you hook up to your computer, and the one you use in your car. Efficient!

                  I suppose the idea of some villain with a USB costume stroking a white cat appeals to some people, but they never experienced the way things were.

                  1. Thomas says:

                    It’s even things like, if you’d forgotten to bring your phone charger when visiting a friend for a week, pre-USB standards there was barely even any point asking if they had a phone charger that you could use. Now you pretty much just say “Can I borrow your phone charger?” – and then glare at them if they have an iPhone

                    1. Cilvre says:

                      This, so much this.

    2. Daemian Lucifer says:

      Yeah, but keep in mind that the first iPhone wasn’t that functional as a phone either. So that it now works only on half of the networks isn’t that surprising. Give it half a dozen more generations, and an iPhone will be able to do almost everything other smartphones can do now.

      1. Humanoid says:

        You must be holding it wrong.

        1. AileTheAlien says:

          You have to put some electrical tape over the one area your thumb wasn’t meant to touch. Duh. :P

      2. Jabrwock says:

        It does work on all networks, IF you buy the right model.

        It’s like they have a crippled version of the iPhone 6 for certain scumbag carriers to sell.

  10. Bryan says:

    Shamus, did you ever play Silent Hill: Downpour? I want to know what you thought of it, but I can’t find any record on your site of you playing it. It’s like Silent Hill 2, in that it focuses on personal problems rather than the cult. I have been playing a horror game recently and just found myself wondering what you thought of it.

  11. Kronopath says:

    Shamus, did you misspell “CD Projekt RED” as “CD Project RED” (with a C) throughout the article, or was that a case of an Escapist editor getting too uppity?

    1. Shamus says:

      Nope. Totally my fault. I’d noticed the non-English spelling of the name before, but I forgot all about it. My eye usually scans right over the alternate spelling without noticing. In fact, I had to re-read your comment a couple of times to spot my error.

  12. Daemian Lucifer says:

    So why are people (developers, mostly) going for the flowy hair anyway? I thought they were all about “realistic” games, and long hair that flows freely, with all that sweat and dirt, is far from realistic.

    1. guy says:

      Presumably because it’s a good way for them to make the requisite offerings to the gods of graphical bling so money will fall from the sky.

    2. Phill says:

      It’s one of those things where apparent realism isn’t the same as realism (just as, if you put a realistic explosion into a film these days, everyone complains it is unrealistic). Pretty much the only time you notice hair movement on TV is in shampoo adverts, where big, bouncy, shiny hair moving in a conspicuous manner is the norm. So if you are doing hair simulation, that is pretty much what you end up doing. ‘Realistic’ equates to ‘what images people conjure up when they think of hair physics’.

      If you do actual realistic hair, then like most hair movement in most TV and films, people just aren’t going to notice it.

      For the same reason, most game visual effects (lighting, shadow, reflection etc.) are exaggerated compared to reality too.

  13. RTBones says:

    Am I the only one that turns off things like TressFX and other ‘high level’ effects by default? I rarely even look at them, as my PC won’t ever be on the bleeding edge of hardware. I just tend not to bother with them. My eyes and brain are FAR more bothered by a drop in frame rate than they are impressed by fancy-pants effects.

    1. AileTheAlien says:

      I also tend to turn down effects for a higher framerate. Although high framerate is usually only an issue when I play FPSs, and I don’t play those much anymore. What always bothers me in any type of game however, is a variable framerate. That always distracts me from my story, and sometimes forces me to shut my eyes or pause the game and look away. Kind of like the feeling you get from a too-fast elevator – not painful, but very uncomfortable.

  14. Neko says:

    I’m still grouchy after finding out that after being originally touted as a “Coming to SteamOS” game, it’s a Windows-only release. Maybe this stupid hair library wasn’t portable enough? Pfft.

    1. Abnaxis says:

      Maybe they expected SteamOS/Steam Machines to become available in 2014, but CDPR neglected to factor in Valve Time when they made the claim?

  15. Earlier this year, I assembled my latest gaming rig with a GTX 970 card after my 280 passed away. I got the 280 just as they were coming out for a real bargain back in ’08 and I was able to play games on the max settings up until it finally blasted off.

    I’m pretty sure I’ve given up on being a console guy, and I’ll prolly always complain about console port-itis, but I will always be grateful they stopped the graphics wars.

    1. AileTheAlien says:

      Yeah, now we just have to deal with the Exclusive Titles Wars, and the Input Devices Wars. Maybe some day we can have peace. :P

    2. Humanoid says:

      I had to do a double take when I read 280 there, before I realised you were talking about the nVidia 280 and not the AMD 280. Yeah….

  16. Why are we still talking about this old Witcher game when Bethesda finally released a trailer for a future season of Spoiler Warning?

    1. MichaelGC says:

      They haven’t answered any of the really important questions like:

      -Where is the Incinerator and how much does it weigh?
      -Will the protagonist at any stage be referred to as ‘Mungo’?
      -How overpowered is melee combat on a scale from ‘utterly broken’ to ‘Galactus’?

      There is still time, though. (And probably rather too much of it, ‘n’ all! – still a year away?)

      1. Yep. Though there’s a thread over on Reddit titled It Was Written. The first link goes to a post about 10 months ago from someone who claimed to have worked for Bethesda and played Fallout 4 (and was subsequently fired for leaking info). I refrained from reading it, but from what I gleaned in the ensuing comments, a lot of what she said lined up with details from the trailer.

        One of them being something I think Mumbles mentioned about Three Dog in the New Vegas season: Three Dog is the radio DJ, but you never meet him in-game, and some characters wonder if he’s just a recording. Of course, this could be to cover all bases from your playthrough of Fallout 3, where (if you were a right-thinking wastelander), you killed him for his Luck headwrap. :)

  17. RCN says:

    Huh, I went to the link to Bethesda’s website on that old Oblivion article of yours and it led me to a teaser video for Fallout 4.

    Have you seen the video? It is like Bethesda rediscovered color!

  18. MichaelGC says:

    Weird. I had this really vivid dream where after this post there was one about favourite episodes of Spoiler Warning.

    And you were there… and you were there … and you – you I’ve never seen before. Hi! :D

    Oh well, that’ll teach me not to take naps at work.

  19. Decius says:

    It’s shameful the way that your article intentionally fails to accuse NVIDIA of any malfeasance, then looks at the evidence and concludes that there was probably malfeasance.

    Suppose that someone in NVIDIA had the technical knowledge and access to make HairWorks suck on AMD cards and chose to do so. What evidence would you expect to see that you would be less likely to see if the reason HairWorks sucks on AMD cards is because graphics is *hard*?

    1. Shamus says:

      It’s not shameful at all. I don’t have the evidence to prove anything, but that doesn’t mean we shouldn’t discuss the possibility. I covered all this in the article with the insurance analogy.

      And it doesn’t matter if they’re guilty or not, because the entire thrust of the article was that we shouldn’t let CD Projekt RED off the hook. Maybe NVIDIA is guilty and maybe they aren’t, but CDPR is CLEARLY guilty of being irresponsible by using black-box vendor code.

      1. Decius says:

        You discuss the possibility, and then in your last paragraph you conclude ” The performance of their videogame was harmed so that it could be used to sell more NVIDIA hardware. ”

        That’s not just discussing the possibility that they might have – it is literally concluding that that is what actually happened. You didn’t express any ambiguity as to whether or not it happened.

        If you want to claim that maybe nobody at NVIDIA is guilty of writing code with the intention of making AMD cards suck at HairWorks, look for evidence regarding that. Don’t just say “it looks suspicious and they probably did it” and “They did it”.

        1. Shamus says:

          I’m not obligated by some “innocent until proven guilty” thing here. I’m not a jury, I’m not a judge, and this is not a courtroom. They LOOK guilty as hell. I can’t prove it, and I don’t need to. I allowed for the possibility of their innocence, and then I got on with the rest of my column.

          Yeah, I could have cluttered up the whole thing with disclaimers: “If these allegations are true” and “if they did it”, and “allegedly”. But I did that above and I trust people can think for themselves. The point of the article wasn’t to prove NVIDIA did anything wrong, it was to point out that developers have a responsibility to not get dragged into hardware battles.

          If NVIDIA has hurt feelings, they could always shut me up by releasing the source and proving I’m an unreasonable old meanie. :)

          1. Actually, the article legally could be considered to be slander so far as I understand it. I mean, I doubt NVIDIA is even aware it exists, but you are making accusations about their behavior without providing any supporting evidence.

            So yeah, I pretty much gave this article a pass for all the reasons Decius pointed out. I wouldn’t go so far as to say it’s ‘shameful’, just… disappointing. I wouldn’t say responding to these very legitimate criticisms with “It’s my party and I’ll cry if I want to” is the best way to handle things, either.

          2. Decius says:

            You also could have accused them explicitly, based only on the evidence available to you. You didn’t do either; you refused to accuse them, looked at a lot of evidence that supported the conclusion that they were guilty, used your refusal to accuse them to justify not looking for evidence that they weren’t, and then clearly concluded that they were guilty.

            I didn’t say it was libelous. I said it was shameful. Not because you accused an unnamed individual(s?) at NVIDIA of adding the code “IF AMD THEN SUCK”, but because you published that conclusion at the end of a thoughtful article that implied you had thought about whether it was true even though you hadn’t tried to look for evidence that your conclusion was false.

        2. Joseph says:

          I didn’t read that line as Shamus concluding that NVIDIA had definitely done anything intentionally. Even if NVIDIA’s software doesn’t perform poorly on AMD cards by design, the fact is that the Witcher 3’s quality was compromised for the sake of marketing hardware. So Shamus’ statement rings true even if you give NVIDIA the benefit of the doubt.

          I wouldn’t be remotely concerned about the suggestion of libel either. Apart from the fact that the article doesn’t go beyond reasonable speculation, NVIDIA isn’t going to want to confirm this theory in people’s minds by appearing to try to censor it.

          1. MichaelGC says:

            I’m not a lawyer, but a few very small tweaks to the final paragraph (if such were possible) would ensure it stays firmly within the realm of reasonable speculation. If this:

            I’m not worried about the feature itself, but the fact that the Witcher 3 developers got played. The performance of their videogame was harmed so that it could be used to sell more NVIDIA hardware. That’s a really sad outcome for such a massive, bold, uncompromising game.

            were changed to this:

            I’m not worried about the feature itself, but the fact that the Witcher 3 developers [may have] got played. [Was t]he performance of their videogame harmed so that it could be used to sell more NVIDIA hardware[?] That[‘d be a] really sad outcome for such a massive, bold, uncompromising game.

            that should do it. At least, I think it would – but as I say, I’m not a lawyer!

            1. Decius says:

              Those changes would be asking the question, rather than answering it.

              1. MichaelGC says:

                Yep – I was attempting to make it more speculative.

              2. Daemian Lucifer says:

                Those aren’t question marks, those are Cavuto marks.

            2. Shamus says:

              Ah! I see how people are interpreting this.

              “The performance of their videogame was harmed so that it could be used to sell more NVIDIA hardware.”

              But this statement is true, even if HairWorks works perfectly. (Or at least, equally badly on all systems.) NVIDIA isn’t writing this stuff as a hobby. They’re writing these libraries because they expect it will sell more hardware down the line.

              1. MichaelGC says:

                That’s true, although that particular sentence does take on a certain tenor when combined with the previous one! Personally (and non-lawyerly!) I don’t see a big issue – certainly didn’t see a problem the first time around, and only when I went back to carefully (if amateurly!) doublecheck it did that passage stand out a little.

              2. So you’re now saying your article is criticising them…for being a business?

                1. Shamus says:

                  The article is criticizing CD Projekt. Flowchart time:

                  
                      Is NVIDIA dishonest?
                  	         |
                  -----yes-----------no-------
                  |                           |
                  |                           |
                  |                           |
                  ---->Don't blindly trust<---
                        their software.
                  
                  1. …so what? Seriously, how does that invalidate the issue?

                    Yes, your article criticized CDP…and in your article criticizing CDP, you also did this other super sketchy thing Decius elaborated on better than I ever could.

                    1. Shamus says:

                      You’re arguing in circles. I addressed this two comments ago. I illustrated how that “super sketchy thing” is a matter of interpretation. If you read it differently than intended, fine. It’s ambiguous.

                      But don’t put words in my mouth.

                    2. “You're arguing in circles. I addressed this two comments ago. I illustrated how that “super sketchy thing” is a matter of interpretation. If you read it differently than intended, fine. It's ambiguous.

                      But don't put words in my mouth.”

                      You didn’t address it, you dismissed it. And actually you were correct. You AREN’T required to be unbiased, you have no obligations (that I’m aware of at any rate) to present your points objectively and that IS perfectly valid. It IS your blog, it IS your article and yeah, you CAN cry if you want to. Let’s be clear, I’m not trying to dictate what you say.

                      Buuuut…

                      You still did it. The focus of the article, and the supposed ‘misinterpretation’ of that one sentence does not address nor invalidate the criticisms that were brought up. Pointing that out is absolutely not “putting words in your mouth”. That is a seriously skeevy way to justify censorship. Ugh…

                      Look, you obviously have made your line in the sand. As regrettable as it is that this how you choose to respond, once again it is absolutely your right to. As I said, it’s just…disappointing.

              3. Decius says:

                So, you were saying that HairWorks subtracts from the performance of their videogame? And now you are saying that it would harm the performance even if it worked perfectly?

                I’m not sure how easy it is to disable; if HairWorks can be disabled easily it causes no harm to performance and improves graphics when it does work.

                So, in what sense did HairWorks harm the performance of TW3, compared to the absence of HairWorks, that a perfect HairWorks would also have done?
