Experienced Points: How Massive is Wolfenstein: The New Order?

By Shamus Posted Tuesday Jun 3, 2014

Filed under: Column | 104 comments

So here is an article based on a Tweet, where we compare the size of the new Wolfenstein game with the sizes of the games of yesteryear.

(Do we capitalize tweet? It’s a noun (or verb) derived from a proper noun. I had the same question the other day (also in a tweet) about whether we should capitalize the verb derived from Patreon. Someone pointed out that since Patreon was derived from “patron” and already had “patronize”, it wouldn’t make any sense to start capitalizing an existing word. (Although I’ll bet we can find examples of people doing exactly that.) But this particular verb form of tweet is new. My gut says we shouldn’t capitalize it, but my fingers want to. So I dunno.)

ANYWAY.

Getting back to the article. I ask the question, “How much space would you need to store every game ever made for every platform, on or before 1992?” I’m pretty uncomfortable with my appraisal of DOS games. After I turned it in, I re-read it and felt that the size I came up with was just way too small. On the other hand, the number was a really wild guess and I don’t know how to come up with a more solid number. I didn’t want to submit a re-write with one arbitrary number replacing another simply because the new number seemed “better” in some ill-defined gut-sense of the word. What I really needed was a better way to extrapolate an answer, and I didn’t have one. I’m content to leave the DOS stuff as a weak spot in the article and see if readers have any better answers. Even if I was off by a factor of ten, the main thrust of the article stands: Wolfenstein: The New Order is BIG.

I’m still really enjoying the game. It’s absurd and fun and very old-school in its approach to shootin’ Nazis. I’ll have a more detailed discussion about the game once I finish it.

 



104 thoughts on “Experienced Points: How Massive is Wolfenstein: The New Order?”

  1. Abnaxis says:

    I think the correct thing to do there is to pick the smallest number you know is greater than what you’re trying to guess at, and use that. So I would have used 700,000 KB, since there are so few games more than 1 disk long, and that’s the number for “every game is a full disk,” right?

    1. Ben says:

      That would be way too big. Wolfenstein was one of the earliest games available on CD-ROM, and didn’t come anywhere close to filling a 700 MB disc on its own. I think it was alternatively available on a scant two 3.5″ floppy disks, so 2,950 KB would be a more sensible number.

      1. ET says:

        I was thinking along similar lines. Even if we assume two full 3.5″ disks for each game (2.88 MB), that still only bumps Shamus’ estimate from 342 MB to 6.6 GB. So the new Wolfenstein is still larger than all the games before 1993, but only about 4X the size, not 10X. So…still substantially larger. ;)

        P.S.
        Are we supposed to use 1024 or 1000 when talking about mega, kilo, giga, etc? I know for a while, there was an effort to use KiB, MiB, GiB, etc when talking about 1024, and the normal prefixes when talking about 1000, but I don’t know if that died off or is still struggling to gain acceptance. Or if anyone cares besides me. ^^;

        1. Daemian Lucifer says:

          I don’t care, because I’m rooting for memristors to take over computers eventually, hopefully putting them in the decimal system. Though my guess is that first they will go through hex, and my fear is that they will stay there.

          1. +++Divide By Cucumber Error. Please Reinstall Universe And Reboot +++

        2. Wooji says:

          The KiB/KB is still in use, at least in every single CS class I have had the last two years and most of my course literature. Beyond that, however, I don’t think most people give a flying f**k.

        3. krellen says:

          Microsoft has always counted by 1024s, but storage manufacturers count by 1000s because it makes their product sound better.

          1. ET says:

            It’s not just Microsoft; Linux tools count in 1024 too. I think the dividing line is roughly:
            1000: network engineers (and thus also your ISP), hard drive manufacturers
            1024: operating systems, RAM, optical disks?

            1. A good rule of thumb is that anything RAM-related is 1024 and anything else is not. Though things do get confusing with SSDs.

              Even more confusing is that 1.5 GiB is not 1500 MiB; it’s actually 1536 MiB, if my math is right.
              The metric system of TB, GB, MB, KB is way easier, as you can just add or remove zeroes. (Don’t even get me started on the Imperial system that a few certain countries still insist on using, hint hint.)

              The fact that HD makers used the metric MB instead of the MiB way of showing numbers may have been for marketing reasons; after all, formatting a disk eats up a lot of space itself, so a 1 TB drive is not really 1 TB after all.

              What is a shame is that the memory guys did not just move to using 1000s instead of 1024s. Or just display 1000 MB (1024) in parentheses; issue solved.

              1. ET says:

                Well, that’s the whole thing with the prefixes with the extra i in them – they’re 1024, not 1000. So, 1.5 MiB is 1536 KiB, but 1.5 MB is 1500 kB! (Also note that kilo is capital in the “i” system, and lowercase in SI prefixes. ;)
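                (A quick back-of-envelope sketch in Python, just to illustrate the two prefix systems; the numbers are the ones from this thread:)

                    KB, MB, GB = 10**3, 10**6, 10**9        # SI (decimal) prefixes
                    KiB, MiB, GiB = 2**10, 2**20, 2**30     # IEC (binary) prefixes

                    print(1.5 * MiB / KiB)   # 1536.0 -> 1.5 MiB is 1536 KiB
                    print(1.5 * MB / KB)     # 1500.0 -> 1.5 MB is 1500 kB
                    print(GiB / GB)          # ~1.074 -> a decimal gigabyte is ~7% smaller than a gibibyte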

              2. Bryan says:

                “Memory guys” didn’t move to powers of 10 because … well, that’s not how memory works actually. :-)

                There are a fixed number of physical address bits on the bus between your CPU(‘s cache) and your memory. These used to all run parallel to each other on the motherboard (and so had to be very close to the same length: given the clock rate of the memory bus and the speed of light, a too-long trace would make its signal arrive after the RAM controller had latched the bus lines…), but I don’t know for sure if they still are, or whether someone has figured out how to serialize the addressing. Either way, it’s digital and binary.

                (Used to be that only half the required address lines were present, and were reused as a row address and column address; this is what the CAS latency used to be related to. But again, whether that’s still true or not, it’s still digital and binary.)

                Since each RAM board has an integer number of address lines coming into it, and the addresses address bytes, each RAM board *always* contains an integer power of two count of bytes. (Even if the addressing is serial now; it’s still an integer number of bits, and it still addresses per-RAM-byte. So the byte count is still forced to a power of 2.)

                The only way to make memory work in powers of 10 (…rather than wasting fractions of at least one address line) would be to have a base-ten digital-logic addressing system. I’m not even sure how that would work; as the logic is still two-level, you’d need log-base-2(10) two-level pairs, or ~3.3ish.

                …Unless I guess you used ten different voltage levels on the analog side of the logic setup. So a 1.7V memory board would trigger its address bus in increments of 0.17V per line. That seems rather hard to re-discretize on the other end, though, given the general response curves and required bias voltages of the semiconductor junctions that I know about; most are somewhere in the range of 0.6V, much higher than those 0.17V steps. It might work to raise the operating voltage, but of course then you waste a lot of power…
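                (To put the power-of-two point in code, a tiny sketch assuming a simple byte-addressed bus:)

                    # With n binary address lines you can address exactly 2**n bytes,
                    # so capacities naturally land on powers of two, never powers of ten.
                    for address_lines in (30, 31, 32, 33):
                        size_bytes = 2 ** address_lines
                        print(address_lines, "lines ->", size_bytes // 2**20, "MiB")
                    # 30 lines -> 1024 MiB, 31 -> 2048 MiB, 32 -> 4096 MiB, 33 -> 8192 MiB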

              3. Blackbird71 says:

                SSD (and by extension, Flash) guys use 1024. Or at least, the ones I work with do, but it seems pretty standard across the industry.

            2. Actually, for networks and ISPs bits are used, so instead it is 10 Mbit, and you need to divide that by 8 to get 1.25 MB, which works out to about 1.19 MiB. (Ugh!)

              Then again I’m one of those that wish we had metric time too. (100,000 seconds a day, 10 hours a day, 10,000 seconds an hour, 100 minutes an hour, 100 seconds per minute, which means that 1000 ms per second would match up nicely.)
              Days would kind of match up, as a day is 86,400 seconds; alternatively they could redefine the duration of a second so the current 86,400 seconds per day would become 100,000 instead, keeping the day the same length.
              A 100,000-second day (with current seconds) would be 3+ hours longer, and the Earth day/night cycle would keep sliding, relatively speaking. Though if traveling in space or living on other planets, the Earth day/night cycle is useless as a universal day and time system anyway.

              What the heck was I talking about again?…
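              (Rough sketch of the bit/byte conversion being described, in Python; the link speed is just an example:)

                  link_mbit = 10                     # "10 Mbit" connection, decimal megabits per second
                  bytes_per_sec = link_mbit * 10**6 / 8
                  print(bytes_per_sec / 10**6)       # 1.25  MB/s  (decimal)
                  print(bytes_per_sec / 2**20)       # ~1.19 MiB/s (binary)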

              1. Squash says:

                Why not work in Unix timestamps — the number of seconds since 1 January 1970?

                What am I doing? I can’t believe it’s 1402308467 already. Time for bed!
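                (For anyone curious, a one-liner sketch in Python to see the current Unix timestamp:)

                    import time
                    print(int(time.time()))   # seconds since 1 January 1970 (UTC)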

        4. Myself, I try to use KiBi and KB as defined. Sure, some people with no clue might wonder what the hell that i is doing there, but then again they don’t know enough to care anyway, and if they do care they’ll quickly find out why and thus learn something new. So in projects I work on I do use either KiBi or KB, but I try to avoid using both at the same time or in the same place, as that could be confusing.
          Sometimes I slip up and type KiB instead of KiBi though.
          And I never liked the look of it, so I try to use KB where possible due to that.

          1. Zock says:

            Your slip-up is actually the correct way to do it. The chosen prefix (k, Ki) does not affect the chosen unit (b, B). You can also combine the binary prefixes with any unit and talk about kibiseconds (1024 s = 1 Kis) if you want.

            PS. “Bi” is not a standard unit for anything. The units for bits and bytes are “b” and “B” respectively.

      2. Abnaxis says:

        700,000 KB was for all DOS games made put together, not for Wolfenstein by itself.

        1. Ben says:

          Ah, that makes sense. It’s just a confusing quirk of the numbers that your total matched a well-known disc format.

  2. Wilcroft says:

    Having been born in ’91, what percentage of games used 5.25″ versus 3.5″ floppies? (As your best guess.)
    The 3.5″ disks did offer 4x the storage compared to their larger cousins. Looking at my dad’s old collection, he seemed to have a mix of the two. (Not that the 4x will make that much of a difference…)

    1. krellen says:

      3.5″ floppies weren’t in widespread use until the time of CD-ROMs, or if you had an early Macintosh.

      1. Peter H. Coffin says:

        CD-ROM Specification (“Yellow Book”) was written in 1988, and didn’t really get finalized until 1990. I remember CD-ROMs showing up on computers bought around 1994, as opposed to being $500 add-on devices that you needed a SCSI card to connect. By then, I’d been using 3.5″ floppies for six or seven years, starting with ProDOS on a Laser 128 (like a fat Apple //c that worked more like a //e). I’d been rocking twin 3.5″ drives, a 1 meg RAMDisk, and a ZipChip accelerator since about 1988-90, depending on when I bought the gear.

        1. Brandon says:

          Thanks to Apple using SCSI as their internal and external drive bus, CD-ROM drives showed up pretty early on Mac computers. I remember having an external 2x on an old Mac. It was a cartridge-based drive, so it would use this case that you had to open and put the CD into. I was fascinated by using early audio editing software to rip an audio track from a CD and reduce the quality to like 22khz mono to try and fit a whole song onto a single 1.44 MB floppy disk.

          Additionally, the Japanese PC Engine had a CD-ROM add-on as early as late 1989/early 1990 and by the end of 1992 there were over 100 titles out. A lot of damn fine games on that system. Can’t remember when the Mega CD (Sega CD) first came out and I’m too lazy to Wikipedia it.

    2. I remember seeing the smaller floppies at least a year or so before a CD-ROM, but I have no idea how common or uncommon they were. Mom worked for IBM so we often ended up on the bleeding edge of the PC tech curve (and now she’s so far behind it that she needs the Hubble to see it).
      I know we had a computer with drives for both floppy sizes for a couple of years before we got a new computer with a CD-ROM drive, but Mom may have needed the 3.5″ drive for work.

      1. Humanoid says:

        Too young to remember buying the games myself, but what I remember is that manuals often spoke about what ‘version’ you purchased, i.e. what disk size it came with. Some games came with leaflets (or notes in the manual) saying you could mail the disks back to the publisher in exchange for the alternate size. Some generous games might have come with both sizes by default.

        But yeah, also remember having a computer with both 5.25″ and 3.5″ FDDs for a while. Prior to that, one might see them with two 5.25″ drives.

        1. Especially if a load of pirated floppies lurked nearby. :)

          1. Mike S. says:

            Or if you just wanted to be able to run a program and have a data disk without swapping.

    3. Humanoid says:

      The capacity calculations weren’t that simple – are we talking single density, double density, high density? Single sided or double sided? An HD 5.25″ could fit 1.2 MB, compared to the 1.44 MB HD 3.5″ disks people nowadays are most familiar with. (Both double sided, by the way, but by that time drives were able to read both surfaces without needing to physically flip the disk.)

      1. Heh, single and double sided. I remember cutting holes in the big floppies to force them double sided even if they were not.
        And tape was sometimes used to make the hard-cased floppies writeable even if they were not.
        I can’t recall how reliable that actually was over the longer term, though…

        1. Humanoid says:

          Makes me wonder, barely any DVDs shipped double-sided, though I’ve had a few movies presented that way (in the early days, 4:3 on one side, 16:9 on the other). Anyone recall if there were any games that bothered with double-sided DVDs?

          1. Bubble181 says:

            I have at least one…Big arguments with my brother over how to position that disc when not in the tray :p

          2. I hate double-sided DVDs. I could never be sure of which side had the widescreen version and which had the “cut off but some people think 4:3 is somehow ‘more movie’ than widescreen” version. I mean, if I could see “widescreen” on a given side, did that mean “this side up for widescreen” or that “this side has the widescreen version, so flip the disc over.”

            It’s things like this that cause unrest.

  3. Daemian Lucifer says:

    Even if we assume that all the games before 1992 were 1 MB in size, which they definitely were not, there would still have to be 40,960 of them in order to rival the new Wolfenstein. I’m pretty sure that there weren’t 40,960 games made by that time. So yeah, your guesstimate seems spot on.
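    (The arithmetic behind that, as a quick Python sketch; 40 GB is the ballpark figure for the new Wolfenstein used in the article:)

        wolfenstein_gb = 40
        avg_game_mb = 1                       # deliberately generous for pre-1992 games
        games_needed = wolfenstein_gb * 1024 / avg_game_mb
        print(games_needed)                   # 40960.0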

  4. Whether or not to capitalize Tweet (or other, similar terms) depends on how you’re using it in a sentence. If you’re using it to refer to a singular, official, proper-noun individual, then capitalize it. If you’re using it to refer to a class of things or a single member of that class, then don’t. It’s kind of like how you capitalize Mom when you’re using it in place of your mom’s name, but when you say “my mom” you don’t capitalize it because you’re referring to her as a member of a whole class.

    And if you’re using it as a verb, no capital unless you’re using the *actual trademarked name*. So if we said “I Twitter’d my friends,” that gets a capital. “I Google’d it” deserves a capital.

    1. JackTheStripper says:

      I don’t know about the last rule you mentioned. From purely empirical evidence, the verb forms of company names seem to be most prominently used without capitalization:

      “I photoshopped this image.” vs “I Photoshopped this image.”
      “I skyped with my family.” vs “I Skyped with my family.”
      “I xeroxed the document.” vs “I Xeroxed the document.”

      1. HeroOfHyla says:

        The companies hate it when you use their name as a verb though. Adobe doesn’t want you to photoshop things, they would rather have you “edit pictures with Adobe Photoshop” because using their brand name as a verb dilutes the trademark.

        1. evileeyore says:

          Which is dumb considering all the free advertising that comes of brand recognition.

          What do you blow your nose in, a disposable napkin or a kleenex?
          What do you toss for your dog to fetch, a plastic throwing disk or a frisbee?
          What do kids play with, soft foam rubber swords and foam pellet/dart firing guns or nerf?
          Do you make photostatic copies or xeroxes?

          1. Would Frigidaire rather still own the trademark on “Fridge?”

          2. Mike S. says:

            When you hear about cellophane, do you think DuPont? Is the use of the term escalator free advertising for Otis? If a term becomes generic, it loses its association with the manufacturer (and potentially its legal protection), and Adobe gets to start over with “Adobe PixTrix brand photoshopping software” or whatever, competing with GIMP’s photoshop and Paint Shop Pro’s photoshop.

            I’m not especially interested in letting companies dictate the use of language outside of trade, and I’m happy enough to google something with DuckDuckGo or use Puffs kleenex. But the companies aren’t crazy to try to push back where they can.

          3. Trix2000 says:

            The correct term is ‘facial tissue’. :)

            Of course, the ones I usually heard complaining about that were from other manufacturers, not Kleenex. But yes, defending the trademark IS a pretty good reason for them to be doing this. The fact that they have to do so means they don’t really need the marketing from it.

        2. Richard says:

          I deliberately choose to photoshop things using Gimp purely to annoy Adobe.

          Oddly, “xeroxing” seems to be a purely left-pondian concept; the act of using a photostat copier is pretty much only called photocopying over here.

          1. Andrew_C says:

            Whereas hoovering is a purely Right-Pondian concept, apparently. Is Hoover still a trademark in the US?

            Also, I use Paint.Net to photoshop stuff, because GIMP is a ridiculous name for a piece of software.

    2. Ivellius says:

      I’m not sure a consensus has emerged on such things.

      I’d suggest capitalizing Tweet in pretty much all contexts. I believe it’s trademark/”platform” specific, unless tweet can refer to non-Twitter activities. Proper names are capitalized, so I’d argue verbed forms of such should likewise be capitalized. “I Googled” or “I Photoshopped” should be correct in such contexts even if people don’t do that.

      “Patreonize” could be a verb with the capitalization, but it’s so simple to just say “patronized” that I’m not sure the “through Patreon” aspect implicit in the first term is generally necessary.

      A lot of these are determined by convention, though–grammar’s not necessarily prescriptive. If people decide to change things, things change.

      1. Whenever I see Patreonize I think Patronize heh..

  5. Chris says:

    “But coin-op arcade games don’t belong to a ‘platform’ because each machine is built specifically for the game in question.”

    Actually, a lot of the old arcade machines had compatible hardware. You could change the game mostly by swapping out EPROM chips. That was easy enough to do that arcade machines employed DRM in an attempt to make this process more difficult. I’d say it qualifies as a platform.

    1. I think Shamus meant consumer products; if you were lucky you could probably rent an arcade machine.

      Also, a lot of consumer systems came with a cartridge containing multiple games, or the system came with a built-in set of games and you had to buy a whole new system, as you could not swap cartridges.
      Where would one draw the line with bundles or cross-platform editions?

      If one were to do the numbers from scratch, one would probably take the deluxe/biggest platform version (with the audio music tracks instead of MIDI files, etc.). But now we are getting into the CD era again, when stuff just ballooned in a year or two.

      1. evileeyore says:

        My first roommate and I owned 4 arcade boxes and 10 or so arcade EPROM chips (and the associated peripherals where needed). We’d swap out chipsets every so often to change up what games we had available. In fact the chipsets were often far cheaper than the boxes, which is the second reason we had so many different games with so few boxes (the primary reason was our apartment didn’t have enough room for more than another box or two).

    2. After a while, according to a friend of mine who worked in an arcade, cabinet games basically became glorified PlayStation consoles with customized controller inputs.

      Oddly enough, the only “innovation” (and I use that term loosely) in this area seems to be with slot machines. I saw a Monopoly one that had a full-color LCD touchscreen overlaid with a transparency that could light up red (to highlight what you selected). Five windows were then cut in this LCD screen where five curved LCD displays animated as if they were spinning reels on a mechanical machine.

      I was rather impressed by the trouble they’d gone through.

  6. Tse says:

    The whole of the MAME game library is just short of 40 GB. And that includes both variations and BIOS files.

    1. ET says:

      Hmm. That’s going to make Shamus’ original tweet wrong. If we assume that at least a quarter of those ROMs are just ROM-hacks done by fans, or trivially different versions (i.e. discount them from the total size), Shamus still gets to have the new Wolfenstein be bigger than all games before 1993. :P

      1. Tse says:

        No, I was making a different comparison, actually. Last game in the archive is from 2010. EVERYTHING in there is still smaller than Wolfenstein (2014).

    2. Cybron says:

      A number of those are past Shamus’s cutoff, I’m pretty sure.

    3. boz says:

      If you remove every game after ’94 (arbitrary cut-off point at KoF ’94), you’ll have a 10 GB archive at most. And I am being generous. Most of that 40 GB is from CPS3 images (SF3 and its 2 sequels) and late NEO-GEO stuff that takes 100-150 MB (later KoFs, Samurai Shodowns, Garou, etc.)

      1. I’m also assuming a lot of those ROMs have more than just the game itself; wasn’t there some extra OS stuff needed too?

        For example, Wolfenstein (2014) does not ship with an OS and drivers, but those old cartridges did do that. I’m not sure (and it varied by game system back then too), but I think a lot of the OS stuff was replicated on each cartridge.

        To be fair, I guess you could call it a game engine then, but the game engine for Wolfenstein (2014) is certainly not more than 100 MB, is it? So the game engine of Wolfenstein (2014) is just a footnote compared to the full game size.

        1. Tse says:

          The last game in MAME is from 2010. I’m actually saying that the whole library, which almost extends to today, is STILL smaller than Wolfenstein.

  7. pearly says:

    What about “patreonize”?

  8. BlackmoreKnight says:

    Titanfall for PC was 50 GB, so there’s something that exceeds Wolfenstein’s size. Although that was mostly due to massive uncompressed audio files, terribly unoptimized for PC. I think CoD: Ghosts was similarly huge. Entering the age of massive downloads, indeed.

    1. ET says:

      Umm…are those completely uncompressed? I don’t even know why anyone would do that. I mean FLAC is lossless, and still smaller than uncompressed audio. Looking here, it looks like FLAC is averaging a file size which is 50%-70% of the original file size. Darn kids and their uncompressed files… ^^;

      1. kerin says:

        Pretty sure it’s irrelevant for most modern use cases, but uncompressed audio is much easier to decode. If you’re hurting for CPU cycles, trading processing cost for storage size can be attractive.

        1. I have no numbers to back this up with, but I’m pretty certain that the space savings due to FLAC even out the CPU time used. (FLAC loads faster than WAV, thus less load/disk time.)
          This is also why so many games use very high bitrate Ogg Vorbis: sure, CPU is used, but the files/audio load and stream very quickly, reducing load times.
          So it just allows better CPU time management/utilization.

          Now, if Titanfall did ship uncompressed WAVs then that is just stupid; sure, there is virtually no decoding needed, but that disk access is just a huge waste, eating into texture load times etc.

      2. Drakhoran says:

        35 GB of totally uncompressed audio. The reason given was:

        “A two-core machine would dedicate a huge chunk of one core to just decompressing audio,” says Richard Baker, Respawn’s Lead Engineer.

        Minimum requirements were a 2.4 GHz dual core CPU.
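        (Back-of-envelope sketch of how much raw audio that actually is, assuming plain 16-bit 44.1 kHz stereo; the real files may differ:)

            bytes_per_sec = 44_100 * 2 * 2          # sample rate * 2 bytes per sample * 2 channels
            total_bytes = 35 * 10**9
            hours = total_bytes / bytes_per_sec / 3600
            print(round(hours, 1))                   # ~55 hours of uncompressed audio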

        1. Humanoid says:

          Follow up question: is that also the download size, or is it just the space it takes up on your local drive? Because something like FLAC shows that at the very least you can ship in a compressed format then uncompress it back to raw format as part of installation.

          1. Eruanno says:

            It was (fortunately) compressed to something like 20GB when you download it and then it uncompresses to 49-something GB on installation. I really wish they would have added an option for people with faster computers to go with compressed audio/smaller installation size… :(

        2. Andrew_C says:

          They must have been using a terribly written implementation of whatever codec they were using, because when I had a Core 2 Duo, decoding multiple audio streams hardly shifted the CPU usage above 5% on one core. Also, isn’t that what sound cards are for? I’m pretty sure most built-in sound chips have an onboard MP3 codec these days.

  9. TMTVL says:

    If I for whatever reason decided to get Wolfy:TNO, my ISP would throw me on smallband. So yeah, AAA games are starting to get too big, which is funny when you consider AAA batteries are so tiny.

    40GB would be the database of a small business (or a somewhat larger business in a small country like Belgium).

    1. Bubble181 says:

      Err, I work in Belgium, with the database of one of the bigger security companies. We’re a moderate-size company, and our *client* database alone runs into the several TERAbytes; not even looking at personnel data, helpdesk services, and so on. I think you’re severely underestimating how large databases get when you’re keeping track of lots of variables.

  10. sab says:

    Hmmm, someone who isn’t me once ‘acquired’ a collection of DOS games. It was over 8 GB in total. If you exclude everything over 20 MB, it comes down to about 6 GB, over about 2700 zip files. I think that’s about all of them.

  11. SteveDJ says:

    But Wolfenstein still will fit on a single disk, if they wanted to go that route. No, not a DVD, a single Blu-Ray! :-)

  12. Radagast says:

    As an old-school Amiga fan I have to say you drastically underestimated the size of their games… I owned over 200 of them, at least half were two disks, and they had a number of epic size games of 4 or more discs, at 680 kb per disk if I recall correctly…

    Still doesn’t make a difference in your count of course. And I remember Myst coming out but never enjoyed it all that much. As I recall, the Warcraft/Starcraft series was a much bigger driver of hardware – and now I shudder at the memory of trying to plug in a VESA local bus 32 bit video card……. Here’s a picture for those who weren’t around for that bit of fun: http://www.microstar.net/museum/vlbvideo.jpg

    1. Nataline says:

      880kB/1760kB

      Beneath A Steel Sky (1994) came on 15 disks (DD),
      Rise Of The Robots (1994), 13 disks,
      The Adventures Of Willy Beamish (1992), 12 disks,
      Monkey Island 2 (1992), 11 disks
      to name a few of the big ones.
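      (A quick sketch of what those disk counts add up to, assuming 880 kB DD disks filled to capacity:)

          dd_disk_kb = 880
          titles = {"Beneath A Steel Sky": 15, "Rise Of The Robots": 13,
                    "The Adventures Of Willy Beamish": 12, "Monkey Island 2": 11}
          for name, disks in titles.items():
              print(name, "~", round(disks * dd_disk_kb / 1024, 1), "MB")
          # Even the biggest of these tops out around 13 MB.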

  13. Arven says:

    The big size of AAA games is the main reason why I’m avoiding them. I’m using a $40 for 40 GB internet plan, so the download size pretty much translates into added cost for the game. This wouldn’t be so bad, if not for the fact that the huge size is most likely for graphics and audio, the two components that I care least about. Don’t get me wrong, I do like me some good graphics and sound. But the race for realism is a huge waste of processing power and disk space for details that my eye won’t catch.

    1. If they use Ogg Vorbis (or Ogg Opus, now that it has matured some) the audio is probably not the issue.
      With clever compression (mono for dialog and sound effects, stereo for music and such) and appropriate bitrates, the size can be squeezed down rather well while remaining perceptually transparent as far as quality goes.

      The issue is sometimes cutscene video. If you see the name BINK tied to a game, if you are lucky it’s just the opening trailers that we all love to skip; at worst it is used for every single cutscene in the game.

      Myself, I hate BINK (no offense meant to the guys at RAD Video), as over the decades I’ve had BINK videos cause games to crash at startup or during cutscenes, audio sync issues, half-speed video or audio, audio levels that are all off compared to the game settings, and so on. A lot of this is probably due to mistakes by the game developers, but if those mistakes are so easy to make then the blame also falls on BINK, as it might be an API issue.
      Ever had to reboot your system because a cutscene froze your machine? I had to on a BINK cutscene once. Oh how fun, especially considering the last save checkpoint was ages ago.

      Right now I’m playing good old KotOR, and obviously it has BINK cutscenes. I can’t run the game fullscreen, as something goes weird with the video then, and when not in fullscreen the video is tiny as hell. Interestingly enough, the game engine is good enough that the cutscenes could have been done in-engine instead and still be acceptable.

      Now, BINK is not the only game in town; there is also Ogg Theora, and I can’t recall having issues with games using that (maybe because it’s mostly software based?).
      These days, however, there is no excuse not to use the game engine for cutscenes; in fact, I don’t understand why the intro stuff isn’t also done using the game engine.

      As to other graphics like textures: since PC games run on such varying hardware, shipping low, medium, and high resolution textures is needed.
      However, what they could do is ship with just the high-res textures and resize down (if needed) to medium or low upon installation (and inform the user about this, please, so they know why the install is slower).
      This should cut down on size a lot.

      Another issue with textures is that sometimes really high resolution textures are used for objects in-game that you never really see up that close, so the detail (and thus size) is wasted.
      This can be fixed by the artists, or it could be automated with advanced tools as a final packaging optimization stage.

      1. Eathanu says:

        This is what happens when you let a Gungan design video software.

      2. Humanoid says:

        The other reason to dislike Bink is that it rules out some classic games from being Spoiler Warning seasons. :P

        But yeah, for textures and the like, it’s probably not a bad thing that high-res textures are an optional free download such as in Skyrim and Sleeping Dogs, though there is a downside in that people might not become aware of the existence of these ‘extras’ and thus never know to get them for the ‘optimal’ experience. The other approach would be to specify which components to download prior to actually downloading. That said, a thing to consider is that these days “advanced options” for installation include things as basic as the install path and whether to put a shortcut on the desktop so it might result in an increase in support queries.

        1. Arven says:

          I think it’s best to keep those hi-res packs at their current “free bonus” status. That way, normal people can enjoy min, med and high settings while those who have the resources can play at max. Putting them in the installation’s advanced settings would just make them harder to find. Though I would love for them to put ‘download low res only’ in the advanced settings. Though I feel that I’m pushing my luck with this request.

          Slightly off topic: I would argue against the argument that including more options in an installation’s advanced settings would increase support queries. As long as you make sure that people can’t see them if they don’t go out of their way to, you should be fine.

          Definitely off topic: I can never understand why people put stuff on the desktop (whether it’s icons or files). Wouldn’t you have to minimize your current applications to access them? It sounds really inconvenient.

          1. Eathanu says:

            Well, applications don’t tend to ask if they can put stuff in my start menu, so that’s a mess of hundreds of programs I used exactly once that I don’t ever use unless it’s to open specific programs. Also you can immediately minimize everything with Win+D, so it’s not exactly inconvenient.

    2. Alex says:

      I have the same problem, except in Backwateria it’s $40 for 25 Gigabytes. I took Wolfenstein off my Steam wishlist when I saw Shamus’s tweet, because I knew it would take me 3-4 months just to download the damn thing.

  14. silver Harloe says:

    The point is, I’d pay good money for “download this 10GB file for every game released before 1992”

    1. Daemian Lucifer says:

      Well then, good news! You don’t have to pay a dime. Pretty much all of that stuff runs on emulators, and no one is selling those, so the only place you can obtain them is torrent sites. Admittedly, this isn’t legal (in some countries), but then again, if you cannot obtain it the legal way, who cares?

      1. The legality is questionable, but look up Abandonware (I think I typed that right) on Wikipedia for more background, then search the net.

        The gist of it is that none of those games are available any more; there is no current active publisher. Usually if a game becomes available again, those sites remove the game.

        Some of the effort of such Abandonware sites has helped preserve games from vanishing. In a few cases the actual developers have dug up game files to help.

        Good Old Games was (I assume) founded on the very idea of getting those abandoned games back out there.

        The popularity of Abandonware even caused some developers/publishers to release their really old games for free, sometimes with small tweaks to make them run.

        What I don’t understand is why, when for example GoG contacts a publisher or developer and asks if they can sell the game for them, they don’t get a huge YES at once.
        Hand the original (no DRM, no copy protection) game files to GoG and they will make that thing run on a modern system. Heck, if you also have the source they can really fix a few issues (maybe supporting a few widescreen resolutions could be just a matter of tweaking a line or two of code).
        The requested lists at GoG are huge. Myself, I’ve requested Blade Runner The Game, but nothing yet.

        1. Veylon says:

          Why not? Like the rationale for DRM itself, it likely has more to do with internal corporate politics than whether it’s an actual good decision or not.

          The default position is not selling the game and the company has been doing that for years. More importantly, everyone else at the company is on board with this (non) decision and can’t criticize you for staying the course.

          If you break ranks by wanting to do something, you implicitly lose the security of the pack. What if we could’ve made more selling it ourselves? What if customers buy it instead of our latest offering and/or attack us for it not being as good? And so on and so forth in the endless clash of egos. Why risk all the sturm and drang? It’s not like it’s your job to benefit the company or anything.

          1. Humanoid says:

            Sometimes within a studio the available titles are bafflingly arbitrary though. Want Loom, Last Crusade, Fate of Atlantis or The Dig? They’ve been on Steam forever. Want Day of the Tentacle, Sam and Max Hit the Road, Full Throttle or Grim Fandango? Nuh-uh.

          2. Daemian Lucifer says:

            What is the expiry date on copyright anyway? Personally, I think that any software that is not commercially available for at least a few years (5 maybe) and without announced sequels/expansions should become public domain.

            1. ET says:

              Only five years? Not even close! Welcome to Murrica! XD

              1. Rack says:

                  I like the sentiment, but you have to be very very careful about terms like public domain. Film companies would never have to buy the rights to a book ever again if it was anything like 5 years. If it was something like non-commercial rights it might work, but I’d still be very antsy about as short a period as 5 years. 15 maybe?

                1. Mike S. says:

                  The original US copyright term when the Constitution was adopted was 14 years renewable once (so 28 if it was worth keeping track of and paying the renewal fee). That’s still a little long for software, but probably covers the large majority of profits seen by most creators in most media.

                  It removes the possibility of some jackpots, of course. But the purpose of copyright (in the US) is to “promote the Progress of Science and useful Arts”. I’m not sure how many authors/filmmakers/game designers are primarily motivated by what they plausibly expect to earn in year 29+. (The net present value of which when the project is started is probably pretty close to zero.)

                  Of course, that’s about as likely to happen as, e.g., Disney recognizing its debt to the public domain by voluntarily releasing some of its creations back into it. But that’s the standard political problem of concentrated, motivated narrow interest vs. broad but very diffuse public benefit. At this point, I’d settle for requiring renewal for a nominal fee every, say, ten years, so that at least true abandonware that no one knows or cares that they own would enter the public domain.

              2. Mike S. says:

                To be fair, the most recent extension in “Murrica” was done in part to harmonize with Europe, which already had a longer term.

  15. Smejki says:

    BTW, a fellow of mine, a semi-professional digital archeologist, claims that the total number of finished games to date is ca. 200,000. Including unfinished ones, the number shoots up to 1,000,000.

    1. And all those unfinished ones are now on Steam as Pre-Alpha-whatever, right? *j/k*

  16. DosFreak says:

    Since Underground Gamer is no more, this will likely be the easiest way for the average person to get fairly accurate numbers for commercial releases of DOS games:
    https://archive.org/search.php?query=EXO%20DOS

    Browse each collection for a listing of the games along with their date.

    Not sure how many of the zips include both an installed copy and CD/floppy images.

    Total for all of the DOS games in that collection is 352 GB.

    P.S. https://archive.org/details/tosec

    1. Those are “images”, which means all sectors on a disk or floppy are captured, since a lot of games (floppy and CD) stored extra content outside the normal readable area as part of copy protection. Thus a lot of those images have a lot of dead space; if you’re lucky it’s all zeroes or ones, but if the dead space is unformatted it contains noise, which compresses very badly.

      Also… “There are over 1,800 disk and cassette images for the IBM PC Compatibles collection, including games, applications, and operating systems.”
      Besides, the games probably come with PDFs containing a copy of the manual etc.
      I also highly doubt this is all the commercial games.
      Also, things like DOSBox or ScummVM are bundled with the games and may in some cases be as large as the game itself.

      The dataset is the limiting factor here; it would take weeks and weeks to get any meaningful numbers out of this, sadly.

      Then again, the entire collection is less than 4 TB… which is crazy considering that 4 TB drives are actually for sale to consumers now.

  17. Nataline says:

    The size of the average Amiga game is certainly not 83 kB, if that’s the figure you are using (really not sure, I’m utterly terrible at reading or speaking Mathematese). A great many of them were also multi-disk, but if we sort of even that out by assuming all disks to be filled to capacity, those 3658 titles alone would amount to a little over 3 GB.

    Or let’s assume that all but one disk in each set is full. The Hall Of Light lists a bit over 60 five-disk titles released in 1988-1992, about 20 of which are collections. 40 titles with 4 full disks each amount to 140,800 kB. That’s just 5-disk games, so I’d say that 448,532 kB really can’t be an Amiga-inclusive figure.
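    (The same math as a quick Python sketch, assuming every disk is a full 880 kB DD disk:)

        dd_disk_kb = 880
        titles = 3658
        print(titles * dd_disk_kb / 1024**2)        # ~3.07 GB if every title fills one disk

        five_disk_titles = 40                       # non-collection 5-disk titles, per the Hall Of Light
        print(five_disk_titles * 4 * dd_disk_kb)    # 140800 kB just in the "extra" disks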

  18. Don’t worry Shamus, the numbers are the way they are due to lack of data (pun intended).

    I really hope that those who know (or have the free time to waste) would go to Wikipedia and, if references exist, add that info to the platform articles; ideally this info should be easy to find on Wikipedia these days.

    As to the numbers themselves, I did have an Atari 2600, a C64 Game System (broke), then an Atari XE (sold/gave away), then a C64 (broke eventually), then an Amiga 500+, Amiga CD32, Amiga 600, Amiga 1200.

    If one truly digs down into the numbers, I think the Twitter statement is still mostly true. Wolfenstein (2014) is larger than all games released (for purchase) before Wolfenstein 3D.

    I’m sure with more number digging it will crawl closer to that 40 GB.
    But consider this: this is just ONE game. Heck, take any single Square Enix game and it will dwarf a handful of “normal” games.
    This is largely due to their love for video cutscenes.

    If we add in the 3.5-inch hard floppies (am I the only one thinking the naming of these is rather unfortunate?), the sum still does not go past 40 GB total.

    BTW! Does the new Wolfenstein have video cutscenes or is it fully in-engine?
    With the CD revolution way back, that is what most games did. Besides avoiding multi-disk issues (that issue stuck around for a while, as they supported both CD and floppies, so the games still had a “size budget”), the almost unlimited size of the CD allowed them to make bigger, longer games (code/asset/content wise), but most of all it let them add lots of video and/or music. The first CD-ROM releases were the floppy contents of a game put on the CD plus CD audio tracks for the music (instead of MIDI or tracker music).

    Back then compression was lacking or too computationally expensive. Wolfenstein (2014) uses compression; sure, it uses 32-bit textures and whatnot, but everything is compressed losslessly or lossily, and the audio is compressed.
    So if we really were to compare numbers to numbers we would need to decompress Wolfenstein (2014). How big would it get then? A gig bigger? 10 gigs? Average compression for a modern compressor with a good performance vs memory vs size tradeoff (think zlib Deflate) is about 50%, which means the game could be up to 80 GB in size.
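    (A tiny sketch of measuring a Deflate ratio with Python’s zlib, just to illustrate the idea; real game assets will compress differently than this toy input:)

        import zlib, os

        # Toy input: one half repetitive text, one half random bytes (random data barely compresses)
        data = b"the quick brown fox " * 50_000 + os.urandom(1_000_000)
        packed = zlib.compress(data, 9)
        print(len(packed) / len(data))   # roughly 0.5 here: the text half shrinks to almost nothing,
                                         # the random half barely shrinks at all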

    PNG and Zip both use Deflate (the same algorithm zlib implements).
    I do know that old games, especially in the cassette days, used compression, but this was not to save space (it all had to be uncompressed into memory); the compression was to reduce loading time. Even when floppies became the thing, heck even with the C64 cartridges, there was compression used (there to save space, I think, as larger ROMs cost more).

    So should we decompress Wolfenstein (2014) fully, then decompress all (consumer-purchasable) games fully that were made before Wolfenstein 3D?
    That would be the only fair way to compare, but impossible really.

    Then again, they dug up those horrid old E.T. games from the desert, so who knows these days really… (Now there’s a nice setting for a demonic horror movie plot: “They dug in the desert to find old E.T. game cartridges, but found something even worse than the game…”)

  19. I think memory and size will begin to flatten out soon.
    System memory on PCs seems to ideally be about 4 times larger than the graphics memory.
    So…
    2GB System RAM and 512MB GFX RAM,
    4GB System RAM and 1GB GFX RAM,
    6GB System RAM and 1.5GB GFX RAM,
    8GB System RAM and 2GB GFX RAM,
    12GB System RAM and 3GB GFX RAM,
    16GB System RAM and 4GB GFX RAM,
    32GB System RAM and 8GB GFX RAM.
    At 32 GB, system memory is at the edge now; there are even some motherboards that can only do 16 GB.

    I predict that at 32 GB system mem and 8 GB graphics mem a plateau is reached (if not sooner). At some point you get a severe case of diminishing returns where the cost dwarfs the benefits.

    Another reason why I think that artificial limit is where things will stop is that creating all that content is going to require insane budgets, if you thought the budgets of AAA games today were insane (and they already complain profits aren’t good enough, except for a few top-ranking AAA games)…

    Imagine trying to make a game that pushes a 32GB RAM 8GB GFX system properly?
    Think about it this way, Wolfenstein (2014) could fit entirely in memory.

    I’m not saying more system or GFX ram isn’t needed, certain professional areas will require more, but they process huge databases or astronomical star charts or track all the junk orbiting the Earth.
    I’m just saying that for consumer systems there will be a limitation there, and to increase profit or stay profitable they need to expand the userbase instead, which means making those 32GB Ram 8GB GFX ram systems affordable enough so everyone can have it (and run the games for it).

    So as that plateau gets closer the prices will drop more and more relatively speaking.

    Disk space will keep increasing be it HDD or SSD or optical holo-hybrids or whatever. As storage devices get larger and faster the need for System RAM and GFX RAM will diminish as it’s usually just as fast to load from a storage device.

    Windows Vista and later do this today: things that are used often are cached in memory, other things are cached on disk, and on bootup the OS uses a prefetch cache on disk to load code that is often used and needed at startup.
    Games already use background streaming from disk to memory and this will keep improving (both in hardware and in software).
    With 4K and 8K, anti-aliasing is less needed too. I’d say that on 4K/8K displays at around 300 PPI, anti-aliasing is no longer needed, so all the anti-aliasing overhead is gone (today the scene/image is rendered larger and resized down; with 4K and 8K that step is no longer needed) and you get better performance.

    A 32GB System RAM, 8GB GFX RAM, 4320p display, and 10TB long term storage is probably what the high end systems will plateau on in the not too distant future.

    As to other things like how many cores or GHz, I’m not sure. 3-4 GHz seems to already be the plateau except for those with insane cooling solutions, and liquid nitrogen will not be a typical consumer solution.

    Number of layers/cores, though, is something else. Even mobiles have 4 cores now; myself, I have a 6-core here. And anyone that asks, I always advise to get at least a 2-core, preferably a 4-core or better, be it a desktop or mobile or tablet. But when will cores reach a plateau? I predict when they reach 8-16 cores. Once you reach 16 you are getting towards the server area, where they can never have enough cores since they sometimes run multiple virtual OSes with hundreds of threads on each OS.

    For the normal consumer though I think that about 8 cores (8 actual cores not 4 cores + 4 half cores) is about ideal, though 6 cores + 4 or 6 half cores or similar would probably work too.
    The trick is to get that out to as many as possible at an affordable price.

    The other issue is making the OS and the software running on the OS to fully utilize it and avoid multiple-thread overheads due to context switching/synchronization and so on.

    I’m not saying hardware will stop evolving; it will just plateau there for a long time until the next technological leap is reached. I think a combination of quantum processors and traditional processors, and traditional hardware and optical CPU parts or components, may be the next leap.

    … I’ll shut up now!

    1. Moridin says:

      I don’t think 8 GB of VRAM will be sufficient for high-end 4320p setups. There are already games that need more than 3 GB of VRAM to run smoothly at 2160p, and a 4320p setup would require pushing out 4 times as many pixels. Granted, that’s with anti-aliasing enabled, but you have to take into account that games are still getting more demanding, and I don’t think 2 GB will be enough for 2160p (even if it still is) for very long, even if you disable AA completely.
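      (Pixel-count sketch, since the “4 times as many pixels” bit is easy to check:)

          res_2160p = 3840 * 2160
          res_4320p = 7680 * 4320
          print(res_2160p, res_4320p, res_4320p / res_2160p)   # ~8.3M vs ~33.2M pixels, exactly 4x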

    2. ET says:

      I’m actually hoping hardware stabilizes a bit within the next decade, so I can stop upgrading my hardware. Maybe then companies will have an incentive to make computer components last longer than the standard 2-year warranty, eh? ^^;

      1. I hope the hardware progress flattens out soon so programmers can finally start to optimize the software for the hardware. There are still Amiga demos (as in scene demos) that look damn good. Then you have the 64 KB or 96 KB (or whatever) PC demos; what those hackers can squeeze out is amazing.

  20. Heh, that old Starflight cover brings back memories of game covers that looked so awesome, and never matched the small images of the actual game that you could see on the back of boxes.

    I kind of miss those artistic covers, I think a lot of modern games could benefit from those today too.

    If you go to Google and do an image search on: cover image
    or cover images, then what you see is mostly the main characters striking a “tough pose”.
    I see GTA IV is kind of not following that trend, staying a little more artistic.
    SimCity also has a more artistic cover.

    (a lot of scrolling later)

    Wow, this is depressing, it’s all just the same shitty stuff.
    One that stands out, though, is Ghostbusters, which has its logo instead.

    (more scrolling)
    Damn, I never realized how much the same all these covers really are. Not just the orange/blue look either, but the pose and things.
    The games that stand out among the modern games are very few, and it’s mostly the odd older game that has a more inventive or artistic look.

    I miss the old days.
    *makes grumpy sounds*

  21. RonC says:

    You could ask these guys about DOS sizes: http://www.abandonia.com/index.php

  22. BTW! Shamus, grats on the Patreon so far. Looking at that number, if it stays there or goes maybe a little higher, then that sum, supplemented by an Escapist article now and again and maybe the other odd project plus the direct PayPal donations, means it looks like you have achieved your goal of replacing AdSense. And over time this site will attract a larger audience, which means more patrons. If this holds up then it should become fully sustainable this way. People are awesome. (Referring to Shamus’ patrons.)

    1. Humanoid says:

      Just a quick note that the ads haven’t been removed from the forums. *poke*

  23. mewse says:

    Using Fermi estimation, we round off every figure to the nearest power of ten in order to make rough back-of-a-napkin estimates.

    According to the List of Every Video Game Ever Made, there have been a bit over 40,000 video games made. Not every game ever made is in that list, so the actual number of all games ever will be somewhat higher. But also, many of those games will have been made after 1992, and the rate of game creation has dramatically increased since 1992, so the number of games before 1992 will be much lower than the total number of games of all time.

    For the sake of Fermi estimation, let’s round off and just call it an even 10,000 games created before 1992. Seems fairly reasonable to me.

    As Shamus pointed out, Myst was released in 1993, and pretty much ushered in the era of CD-ROM games. There had been a couple before, but not a lot, and so let’s ignore them. These games were each shipped on (1-12 disks, biased toward the lower numbers. I’m going to call the average 4, and so round to) 10 disks, which each stored (360-1440 kilobytes, round to) 1000 kilobytes.

    (By the rules of Fermi estimation, we round 4 disks up to 10 disks, not down to 1 disk as you might expect. See wikipedia or XKCD’s explanations of Fermi estimations for the reasons why.)

    So we have a total of 10,000 games, at 10 disks each, at 1000kb per disk. That gives us 100,000,000 kilobytes in total, or 100,000 megabytes, or 100 gigabytes. Which isn’t terribly far off from the final estimate that Shamus came up with; Shamus’s number would, in fact, round up to 100 gigabytes, if we were going to use it in a Fermi estimation, itself.

    So I tend to concur with your numbers, Shamus! You’re probably in the right ballpark. :)
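    (The whole estimate fits in a few lines of Python, using the same rounded-to-powers-of-ten inputs:)

        games = 10_000          # rough count of games before 1992
        disks_per_game = 10     # 4-ish on average, rounded up per the Fermi convention
        kb_per_disk = 1_000     # 360-1440 kB, rounded
        total_kb = games * disks_per_game * kb_per_disk
        print(total_kb / 10**6, "GB")   # 100.0 GB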

    1. Brandon says:

      As Shamus pointed out, Myst was released in 1993, and pretty much ushered in the era of CD-ROM games. There had been a couple before, but not a lot, and so let's ignore them.

      In Japan, on the PC Engine alone, there were 50-60 CD-ROM games released by mid-1992. By the end of 1992 there were about 100 and change.

      The mainstream US computer market was actually slow on the uptake for CD-ROM technology. NEC was one of the pioneers, using the same technology and interface (stripped-down SCSI), even the same drive unit, for a CD-ROM unit for the PC-88 computer. By 1989 (according to Wikipedia), there were two main PC-88 models on the market, one with CD-ROM standard and the other with it as an option. I have no doubt game makers in Japan started producing CD titles for the PC-88 in non-trivial numbers prior to 1992, especially given the popularity of the PC-Engine and its CD unit at that point in time.

  24. Ingvar M says:

    I noticed that neither the Amiga nor the Atari ST (both primarily getting game releases in the ’80s) are listed. Similarly the Sinclair family. It doesn’t change the overall conclusion, but it grows the size from ~4 GB to ~5.5 GB (so, roughly, by 35%).

    I have only wild estimates for game sizes, but somewhat accurate counts.

    The Amiga had on the order of 2100 games released (Wikipedia lists 2149 titles), and at ~512 KB (0.5 MB) per game, that is roughly another GB.

    The Atari ST (same source) had on the order of 500 games (490 listed on Wikipedia), for another .25 GB, using the same estimate for game size.

    The Sinclair Spectrum had (according to World of Spectrum) 10,724 games released for it, and I would wildly estimate the typical game to be 20 KB in size (so, 50 games to the MB, or roughly 0.2 GB).

    I have no numbers for the Sinclair ZX81, but games would probably typically fit in “under 16 KB” and I’d guess the average size would be on the order of 2 KB.
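    (Summing those wild estimates up, as a quick sketch:)

        amiga    = 2100  * 512        # KB
        atari_st = 500   * 512
        spectrum = 10724 * 20
        print((amiga + atari_st + spectrum) / 1024**2, "GB")   # ~1.5 GB extra, roughly the ~35% bump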

  25. Blackbird71 says:

    Remember the Odyssey? I’ve played the Odyssey! No, I’m not that old (I was only in the 70s for less than a year), but there was one in my grandparents’ basement. Every now and then my cousins and I would pull it out and hook it up. It still worked as recently as the late 90’s; but I’m not really sure if anyone has done anything with it since.
