Project Good Robot 24: Portability

By Shamus Posted Monday Oct 14, 2013

Filed under: Good Robot 66 comments

Early in the project I said (hopefully here on the blog) that one of my goals was to leave the door open for porting to Linux. I might release a Linux version or I might not, but I wanted the option, and that meant I needed to keep the codebase free of Microsoft-specific code.

I’ve been working on Microsoft platforms for my entire professional life. In fact, my history with C begins at the same time as my history with Microsoft. In 1990, my uncle passed along his old IBM running Microsoft DOS, along with an old edition of Borland Turbo C. For you kids saying “C is hard to learn”, I just want to point out that I did it with no teacher and no internet*. I didn’t even have a textbook. Just the Borland reference manuals. In hardcopy. (What? Store an ENTIRE BOOK on disk? That’s crazy talk! You’d need industrial-grade hard drives to store something that big!) All the insane hours I’ve poured into this language, and I’ve never done so outside the context of a Microsoft operating system.

* It IS friggin’ hard to learn and I probably could have learned it ten times faster with the proper materials. Start with something easier.

But check this out:

Where is the start button? Why is the task bar at the top? How come the windows key isn’t working? HELP!

That’s Good Robot, running on Linux. It’s not running under Wine. This is a native build.

I can’t take credit. 95% of the work was done by Peter Olson. You might not recognize the name, but Peter and his brother Clint have done a lot for this site over the years and are basically outstanding people.

Peter loaded up the code on his Linux machine and went through the steps required to get it to compile. For a lot of the process, Peter was streaming his desktop for me to watch and then we hammered out the details in chat.

If you’re curious about what goes into porting software from one place to another, here is the list:

Actually wait. Before we get started, I need to do a disclaimer. Throughout this article I will equate programming on a Microsoft platform with using Microsoft tools. This is not necessarily the case, since you could use (say) the Code::Blocks development environment even if you’re developing on Windows. But since Microsoft Visual Studio is really good and Microsoft offers a free version, a lot of people naturally use it. I understand that “usually” is not the same as “always”, but to keep things simple I’m not going to make the distinction every time it comes up. If this really bugs you then:

#ifdef _PEDANTIC
#define USING_WINDOWS        USING_WINDOWS_AND_ALSO_USING_VISUAL_STUDIO
#endif

Okay? Fine. Let’s get on with this.

Makefiles

Turning C or C++ code into a usable program is a two-step process. Step one is the compile stage. That’s where it parses all of your code, checks that everything makes sense, makes sure your syntax is valid, and turns it all into “object code”. If you declare a variable “Hitpoints” in one place but then refer to it as “hitpoint” later on, the compiler is the thing that will say, “I’ve never heard of this ‘hitpoint’ thing and I have no idea what it is.” This is also the stage where it pulls in headers for external libraries. If I’m using OpenGL, I don’t actually have the source code to OpenGL in my project. Instead I #include "gl.h", which is just a text file that tells the compiler, “Yeah, all this OpenGL stuff exists elsewhere and here is what it will look like.”

The second step is linking. The linker takes all the object files made by the compiler, adds in all the external libraries (like OpenGL) and then ties them all together. If I miss anything (like if I said I was going to have a function called SpaceMarineDie () but never got around to writing it, or if I included a header saying I’d use OpenGL but didn’t add the OpenGL library to the linker) then the linker will tell me what’s missing. If it has what it needs, it makes an executable file for us.
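To make the two failure modes concrete, here’s a minimal sketch (the exact error wording varies by linker). This compiles without complaint, then dies at the link stage because SpaceMarineDie () is declared but never written:

void SpaceMarineDie ();   // A promise to the compiler: this exists somewhere.

int main ()
{
  SpaceMarineDie ();      // Compiles fine. The linker then reports something like:
                          // "undefined reference to 'SpaceMarineDie()'"
  return 0;
}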

This is a complex process. It’s actually the thing I hate most about this language. It can be fiddly, tedious, obtuse, and unpredictable. But as bad as it is, it started out a lot worse.

It used to be that you compiled things from the command line. I don’t know how to compile anything from the command line for the same reason I don’t know how to sift wheat or shoe a horse, but I understand that it was done in the past and is still sometimes done by rugged independent types who would rather memorize hundreds of compiler options than resort to something as decadent and ostentatious as a menu, or use something as humiliating as a mouse pointer to select options. The point is, we don’t usually compile things by hand, even when using a terminal window. There are countless little options to control what files the compiler will read, where it will look for them, how it will interpret their contents, how it should report errors, and (if you’re successful) what kind of code it spits out.
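For the record, a by-hand build of a two-file project looks something like this (hypothetical file and library names, and far fewer options than a real project needs):

g++ -c -Wall -O2 robots.cpp -o robots.o
g++ -c -Wall -O2 render.cpp -o render.o
g++ robots.o render.o -lGL -o goodrobot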

Entering all these options every time you wanted to compile would be insane. So you stuff all of those options into a makefile. The makefile will guide the compiler and linker to do their thing so you don’t have to. Then you can give your project to someone else, provide them with the makefile, and they’ll be able to compile it even if they don’t know how it’s all organized. They should be able to compile your project even if they’re on a different platform.
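A bare-minimum makefile for the hypothetical two-file project above might look like this (real ones grow much hairier):

CXX      = g++
CXXFLAGS = -Wall -O2
LIBS     = -lGL

goodrobot: robots.o render.o
	$(CXX) robots.o render.o $(LIBS) -o goodrobot

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@

Type make, and it rebuilds only the files whose sources changed. (Those indented recipe lines have to start with a tab character, because make is also from the era of sifting wheat.)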

This is all fine, EXCEPT…

If you’re using Microsoft tools, you don’t have a makefile. Microsoft uses project files. It’s the same idea except it’s, you know, different. So step one of porting from Windows to Linux is creating a makefile. Everyone can still share their code, but sharing between Windows and non-Windows is a bit harder.

Headers

Remember the headers I mentioned earlier? Well, there are a lot of them available. Hundreds of them. Maybe even thousands. I dunno. Now, some of these are standard headers. If you’re using a standards-compliant version of C or C++, then you should have a stdio header. (Standard Input/Output.) If I #include <stdio.h> in my code, you should be able to compile that code on your completely different machine with no changes.

But! Different platforms have their own ideas about where to put all those files.

On Linux, the io header is under the sys/ directory but utime isn’t:

#include <sys/io.h>
#include <utime.h>

On Windows, it’s reversed:

#include <io.h>
#include <sys/utime.h>

Why? Why is this different? Why is this not standardized? Who saw the files arranged one way and decided they just HAD to reverse them? I have no idea. Is this a case of Microsoft just doing as they please and expecting the standards to conform to their behavior? (As they did with Internet Explorer 6.) Or is this a case of anarchic Linux environments leading to fragmented systems? Or do we blame the ISO for failing to herd these cats? Beats me. The politics of the language are opaque to me. All I know is that we’re beset by stupid trivial crap that ought to work but doesn’t.
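All you can really do is paper over the difference with a conditional include, something like this (_WIN32 is the macro Microsoft compilers predefine):

#ifdef _WIN32
#include <io.h>
#include <sys/utime.h>
#else
#include <sys/io.h>
#include <utime.h>
#endif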

Differently-named functions

Some things aren’t part of the strict C++ specification, but end up as part of the language anyway. Sort of. Informally. If you want to open a file you use open (), and unlink () to delete it, unless you’re on Windows, where you use _open () and _unlink (). Again, we can argue about who to blame, but the fact remains that you have to deal with this when porting. A bunch of little stuff might have slightly different names or subtly different ways of being used.

Now, in this particular example we don’t have to use open () and unlink (). There are newer systems with better portability you could use instead. If you’re writing new code, you could use an ifstream or ofstream for file access. But there’s a ton of legacy code out there still using the old way, and “rewrite all your file access code from scratch” isn’t a super-attractive option.
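For reference, the portable stream-based version is only a couple of lines (a minimal sketch; “savefile1.sav” is a stand-in name):

#include <fstream>

std::ifstream save ("savefile1.sav", std::ios::binary);
if (save)
{
  // ... read the save data
}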

In any event, you’re likely going to have several dozen little points in the code where you’ll need to deal with name conflicts.
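The usual trick is to shim the names once, so the rest of the code doesn’t care which platform it’s on. A sketch, assuming only these two calls need mapping (the portable_* names are made up for illustration):

#ifdef _WIN32
#include <io.h>
#define portable_open    _open
#define portable_unlink  _unlink
#else
#include <fcntl.h>
#include <unistd.h>
#define portable_open    open
#define portable_unlink  unlink
#endif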

Compiler differences

We’ve got this little thing called max () that returns the larger of two numbers. There’s a version of it for comparing two floating-point values (like 4.43587 or 0.3429085) and another version of it for comparing integers.

int   a = max (10, 52);       // a will be set to 52
float b = max (10.1f, 0.52f); // b will be set to 10.1
float c = max (2, 0.001f);    // c will be set to 2

Note how on the third line I didn’t explicitly say that the number 2 was a float. One compiler will see that my code looks like: float = max (ambiguous, float); and conclude that the ambiguous value is a float. Another compiler will throw a tantrum and refuse to proceed until the ambiguity is removed. You can argue this either way. The former is more permissive, while the latter is more strict. More strictness can save you from making mistakes but can also make code more cluttered, verbose, and hard to read. It depends on what you’re doing. The coding conventions of your project might make these little differences incredibly important or a non-issue.

But what really sucks is moving from a relatively permissive compiler to a stricter one. The compiler will hound you for hours over tiny little bits of inconsequential code like this and make you do many little edits to “fix” code that worked just fine for the other compiler.
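The “fixes” are rarely more than spelling out the type. That third line from above, in its strict-compiler form:

float c = max (2.0f, 0.001f); // Both arguments are unambiguously float now.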

Filesystem differences

On Windows, these are equivalent:

open ("mygame\\games\\saves\\savefile1.sav");

open ("mygame/games/saves/savefile1.sav");

On Linux, they are NOT the same. The sad thing is, I’ve known about this distinction for years, but I can never remember which way is the portable way and I’m not inclined to Google it when I’m in the middle of working on something else. So I guess, and apparently I’ve been guessing wrong more often than I was guessing right. This was fine until we tried to port, at which point it caused all these goofy problems.
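(For the record, after being burned: the forward slash is the portable one. Windows accepts it, while Linux treats the backslash as an ordinary character that can legally appear in a filename.) A tiny helper takes the guessing out of it entirely; this one is hypothetical, but the idea is standard:

#include <string>

// Join a directory and a filename with '/', which Windows accepts
// and Linux requires.
std::string PathJoin (const std::string &dir, const std::string &file)
{
  return dir + "/" + file;
}

// PathJoin ("mygame/games/saves", "savefile1.sav")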

Wrapping up…

And after all that work, the game is still broken in many stupid little ways, even on other versions of Windows. One tester is reporting performance WAY lower than what I would expect of a machine with their specs. One person using Windows XP reports that all of the robots get more transparent depending on how bright their colors are, which means there’s some shenanigans going on with the alpha channel, but ONLY on Windows XP. One Linux machine renders fine. Another one has textures and sprites flickering in and out all over the place. Another user reports that music never plays. Another one has the music play, but only if they open and close the main menu.

Sigh.

A lot of these problems can be traced back to the use of my vertex shader. As annoying as it is getting C++ code working on more than one machine, vertex and pixel shaders are far, far worse. NVIDIA and ATI always find a way to interpret the spec differently, and those differences aren’t even consistent across different driver versions. I’m seriously considering pulling out the vertex shader for now. It’s the source of about 80% of all of my technical glitches, with multi-threading making up the other 20%.

So that’s the adventure porting the game to Linux. It “only” took a few hours, which is either amazing or horrible, depending on your expectations.

 



66 thoughts on “Project Good Robot 24: Portability”

  1. Paul Spooner says:

    Obligatory typo roundup:
    “Story an ENTIRE BOOK…”
    “On Linus, they are…”

    Love the commentary and explanations about portability. I’ve never tried writing portable code (except stuff in Python that is ostensibly portable by default) but it sounds aggravating. The Olson Bros really are neat guys. Glad to know they were able to get involved!

    Looks like you’ve got the “light show” powerup there in the title image. I’d guess it’s less useful than pretty?

    1. Jake says:

      Another typo for the herd:

      That's were it parses all of your code

  2. ShadowAgent says:

    About makefiles vs. project files, I suggest you look into CMake (for future projects at least). It’s a really great tool that can generate a project file (or makefile) for the IDE of your choosing using config files in the project’s folders. This way, you don’t have to limit yourself to one IDE if necessary.

    1. Shivoa says:

      It is also worth saying there is a layer on top of (or below, depending how you look at it) your makefile/project management layer: the accelerators that try to make (incremental) builds etc. as fast as possible, like Ninja. The range of different expertise levels of the different build systems is always staggering to me, considering the rather fixed task at hand (correctly call compiler/linker, often those now being the same program, with the settings the user wants and has saved somehow). I have found that this topic is dangerous ground, even to make general recommendations about, as coders have their workflow and expertise and want to make sure they don’t have to learn a whole new system for building their project. I’ve been lucky to be forced to adapt to several systems but in general you’ll also find me using CMake.

      I would possibly say that a coder coming in to a new project on Linux today is almost as likely to use an IDE build system as their Windows counterpart. That build system may well be a wrapper on CMake or another build system, but I’ve seen a few young coders who would not be able to tell you what is managing their Linux build beyond the name of the IDE that wrangles it.

    2. Bryan says:

      Please, for the love of all that’s holy, avoid cmake.

      No standard way to tell the build system where to install to after the build is done.

      No standard way to control which C or C++ compiler is used.

      No standard way to pass arbitrary flags to the compiler, or linker.

      Compare autoconf+automake, which — admittedly — is pretty bad, but at least follows standards in terms of all of these. The first is a --prefix flag to the configure script; the second and third are CC=, CXX=, CFLAGS=, CXXFLAGS=, and LDFLAGS=, which can be either set in the environment when running configure, or passed to the configure command, or set when running make.
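      In practice that means any autoconf-based package can be driven the same way (standard usage; the paths and compiler names here are just examples):

      ./configure --prefix=/usr/local CC=clang CXX=clang++ CFLAGS="-O2 -m64"
      make
      make install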

      With cmake, every package you try to compile, you have to go read someone’s build system rules to try to figure out how to manage to set all these correctly. Assuming it’s even *possible* to set them correctly; I’ve seen projects where it isn’t, where I’ve had to generate the makefiles out of cmake and then hand-edit them, to pass the right -m32 or -m64 or -L flag to gcc/g++.

      Alternately, skip all the complicated preprocessing crazy, and just use a raw hardcoded makefile — which, it appears, is what Shamus is doing. That has some of the same problematic hand-rolled-ness that cmake has, but at least the makefiles aren’t (usually) more than about one screen long.

      1. I had to roll my own makefile for Google’s WebP (at that time building on Windows was really messed up, especially if trying to make a DLL)
        So commandline compiling + makefile was the result.

        I did report back a few issues to some of the maintainers and they did some tweaks to the WebP project. Today it’s a lot better. But I’m still using my custom makefile (which is way over-complicated for the simple task it’s supposed to do).

        These days a lot of programs/tools are trying to be or do this:
        http://joshlinkner.com/images/2012/05/SAN.jpg

        I wish they’d try and scale back a bit. I don’t mind using two different tools if they work great; I do mind using a single tool that looks like that… (ref. image, metaphorically speaking)

        Ever tried to code without any dependencies on a standard C library? It’s crazy, and depending on the MSVC you use it may not be possible, or if it is then it may not run on XP for example.

      2. Jacob Albano says:

        Not to turn this into a “my tool is better than yours” thread…but I’ve really enjoyed using Premake. It’s miles simpler to actually use than CMake, and it can create projects for any environment you throw at it, including VS solutions and Makefiles.

      3. Zock says:

        It’s funny how you keep talking about standards where there aren’t any defined. The stuff you’re referring to as standard is only that in the Stallman Bros. world of GNU – with automake and other archaic tools built upon a language (m4) no one uses anywhere else. I’m sorry to say, but people believing that the way they’re used to doing things is somehow the standard for doing stuff are part of the problem, not the solution.

        And no, I’m not advocating CMake any more than Autotools. I’ve had to fight with both and they both have their flaws and benefits. As always you should choose one based on your needs and stick with it if it keeps you happy and productive.

        1. Bryan says:

          But here’s the thing. The “GNU coding standards” document (and in particular I’m thinking of the section on which parts of Makefiles should be overridable using which variables, not e.g. the parts on indentation style) was born out of decades of people having to port random packages’ code to random UNIX-like platforms, before POSIX was even a thing. CFLAGS and CC *are* standards, on every Makefile created to be built on any UNIX-like system in the world, because the GNU standards doc actually fixes so many portability problems that people adopted it whether they know about it or not.

          Unless, of course, the person writing the build system used CMake, in which case everyone else is apparently left on their own.

      4. Alan says:

        My own experience with a large project converted to CMake has been pretty awful. The software in question is old, complicated, and has lots of exceptions in how it’s built, so this may not generalize for everyone. But for us, it’s a mess. The resultant CMake files are a blob of opaque complexity that few people are willing to dig into. I believe we would have done better special-casing Microsoft Windows and using autoconf for Mac and Linux.

    3. CMake is just wonderful. It’s a meta build system that lets the coder choose what build system they want to use. So one person can develop in VC++, another can use Code::Blocks, while a third can use Makefiles. It works really well, on all platforms.

  3. MichaelG says:

    This would make a really nice WebGL app. Why don’t you use another compiler (Clang) under Linux and then convert it to Javascript with Emscripten, and debug WebGL shaders?

    Then try not to give up programming forever.

    1. Bryan says:

      In a moment of craziness, I looked into trying to convert Frontier to a webgl app. (Though, that was writing it natively, not using any kind of clang to-javascript backend.)

      The most complicated part was the constant use of the GL matrix stack when sending points to the video card. Geometry shaders would help with that a little (the push/pop sequence could be done by sending just one vertex to the video card, and it’d generate all the other vertices for e.g. a tree), but they’re not in OpenGL ES yet. :-/

      So I’d have had to either do a crapton of matrix math in javascript in the browser when drawing the scene, which is pretty much what the video card is good at, not either the CPU or javascript, or figure out if it’s possible to reorganize the whole thing to send all the vertices down in (basically) world coordinates. Or see if there was a way to do a varying matrix in glsl, but I don’t think there is…

      …Like you said, try not to give up programming forever. :-)

  4. The Rocketeer says:

    Yes, fine, but what do the robots eat?

    1. Paul Spooner says:

      Hah! Love it.

      By the way, the answer is they eat hours and hours of programming time.

    2. James says:

      they eat people obviously, isn’t that the end goal of all synth/artificial lifeforms, to rebel and kill/eat humans?

      or they eat bananas, bananas have it all man

      1. MrGuy says:

        Are there people in the game? Or bananas? Because otherwise it breaks verisimilitude and makes the game world seem flat and uninteresting. Oooh, my immersion is ruined forever!

        1. Syal says:

          Well, it’s 2D. It’s supposed to feel flat.

          And obviously robots eat powerups.

    3. Retsam says:

      I understood that reference.

  5. Rohit says:

    You can have a makefile that works on Linux and Windows (if you have MinGW) by checking what gcc -dumpmachine returns and behaving accordingly. Mine just changes how libraries are linked (-lopengl32 -lopenal32 vs. -lGL -lopenal) and how cleaning is done (del vs. rm). No need for Visual Studio. ;)
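    A sketch of the idea in GNU make (the “mingw” substring check covers the common MinGW target triples):

    TARGET := $(shell gcc -dumpmachine)

    ifneq (,$(findstring mingw,$(TARGET)))
      LIBS := -lopengl32 -lopenal32
      RM := del
    else
      LIBS := -lGL -lopenal
      RM := rm -f
    endif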

  6. Erik says:

    Only took hours to port from Windows to Linux? Congratulations – you did a pretty good job of keeping it portable to start with. Writing against OpenGL instead of DirectX is a huge part of that, but that’s still a good job of keeping Windows-isms out of the main code base for someone who’s always written for Windows.

    1. winter says:

      Yeah, I was going to say this also. “A couple of hours” is pretty damn good for doing a port.

    2. swenson says:

      That was indeed far, far better than I expected.

  7. That actually seems amazingly fast to me. I’d hate to do portability stuff – but I think even worse than this would be Linux -> Windows porting – Linux has so many more interesting things you can do, especially with web servers, so converting to Windows would be a nightmare.
    Then again, you don’t really need webservers here.

    At least your game can be played on a Steam Machine now – your game is future proof.

    1. Shivoa says:

      In my experience the trap of Windows to Linux is leaning on MS systems, the other way round is about the same but this time you have to remember that, even though a lot of things are talked about in a standard lib as being portable, Windows is not POSIX compatible (and forcing users to install a shim to fix that is not what people want to hear).

      Those people who are highly enthusiastic to make sure you stay on Linux are very quick to mix a bit of POSIX functionality into any advice they give for a solution, which is not helpful when you start out but quickly can be filtered out (just as you do when someone suggests a Windows API as the answer you need the other way round). But if you’ve just read K&R and you went through the chapter on the POSIX part of the C std lib then maybe it isn’t immediately obvious that Windows will not give you that functionality.

  8. FacelessJ says:

    Just a point of curiosity, is your development machine using an nVidia card? I’ve found in my limited experience of writing shaders that the nVidia shader compiler is much more permissive than the ATI one. Unfortunately, my dev machines always end up having an nVidia card in them, which means when I get around to testing on an ATI machine, it instantly comes with an hour or so of debugging the shader I wrote, since there is always at least one line where I’ve used code that nVidia lets through no problems but ATI has a huge hissy fit over. Although, interestingly, I don’t recall ever experiencing it the other way around (as in, ATI-compliant code seems to run a-ok on nVidia).

    1. MichaelG says:

      Same here. I even thought that I would have to have two separate versions of the shaders. I did finally get stuff that works on both types of card though.

      On WebGL, they made the parser much fussier and stuff that works there seems to work on either NVidia or ATI.

    2. Peter says:

      The pattern does seem to be that the shader runs better on nVidia cards. ATI seems to have issues with transparency, and nouveau has a cycling / texture flashing problem. Whereas nVidia with the official binary drivers seems to run fine.

      On my Arch Linux box (the one the screenshot above is from) I’m running the nouveau driver with issues, but switching from shader processing to internal processing fixes the issue.

    3. psivamp says:

      I did some work with OpenGL using LWJGL for a class last semester and did all of the work on my laptop (NVIDIA card, Windows). Toward the end of the semester I started trying to get it to work on Linux and Mac on the same laptop, and on my desktop (ATI, Windows/Linux). Using code derived from their example code, only the laptop would render it properly; every other configuration did something weird.

      So, portability is weird even if you’re using a language and library that are by nature portable.

      1. FacelessJ says:

        That’s the problem that happens when nVidia allows non-standard code, and when C++ compilers & IDEs don’t really give all that much useful debugging information about shaders (although, if anyone knows of decent standalone shader debuggers, I’m all ears). You end up writing things which seem to work, but are actually making use of nVidia’s allowances.

        At least, I believe that it is nVidia being too friendly, because I would imagine other drivers would allow at least the bare minimum specified in the standards for the shader languages.

  9. McNutcase says:

    A few hours, for someone who’s been very accustomed to Microsoft environments, is a heck of a job. Well done, Shamus and Peter, for getting it done so smoothly.

    If you want someone for accessibility reports, I’d be happy to oblige – and if I can get in soon enough, I can offer the incentive of having a production baby in the credits…

  10. Peter says:

    Thanks for the writeup! It was a lot of fun to work with you on porting. The code was really quite portable for having been written and tested exclusively in one OS up till now. There weren’t any fundamental changes we had to make, only edge points and strictness stuff.

    For the Makefile template I have to thank Bryan Kadzban for his work on porting Frontier to Linux.

    All told I think I probably put 16 hours into the port, most of those on Saturday.

    I’ll try to get a video (with sound) up on youtube to show it off.

    1. Bryan says:

      Wait, that Makefile? Wow, that dates all the way back to 2008 or so, when I had a couple weeks of time and tried to port Terrain. It got pulled forward into pixelcity (modified a bit), then frontier (also modified a bit). :-)

      The one thing it doesn’t do is install. The code didn’t really install though either (all the runtime file accesses were relative paths, so it had to run with the current working directory basically in the source tree — once installed, the cwd could be anywhere when the program was run), so hey whatever. Not installing does, at least, simplify the makefile a lot; only have to be sure it builds.

      1. Zukhramm says:

        “Your father’s Makefile. An elegant tool, for a more civilized age.”

    2. Peter says:

      Here is a poor recording of Good Robot on Arch Linux:
      http://www.youtube.com/watch?v=s2RdhCl8pnA

      The audio is messed up somewhere in the process of recording. It sounds fine when playing normally, but the audio here was recorded by making an alsa loopback virtual soundcard. The game played audio out to the virtual device, and ffmpeg recorded from X11grab (screen) and the loopback output. I don’t know where the delay/time dilation is coming from, and don’t have the time to track it down. Enjoy the video and blame the delay on the recording, not the game.

      1. Rick says:

        This is the first time I’ve seen gameplay, it looks like a lot of fun! Well done to both of you on the first cross-platform build of Good Robot :D

  11. Neko says:

    Awesome, well done. While it can certainly be quite painful at times, I’ve found that programming for multiple platforms has made my code more robust and exposes some bugs that might not have been caught otherwise.

  12. MikhailBorg says:

    Thank you both very much for your hard work! I do understand that this promises nothing, but it makes me hope that an OS X version is that much closer :)

  13. Nick says:

    Kinda interesting from my perspective, as I’ve only ever coded in C/C++ on Linux using standard Makefiles and gcc – I don’t have the first clue how to make it work on Windows

  14. DGM says:

    >> “And after all that work, the game is still broken in many stupid little ways, even on other versions of Windows.”

    I was disappointed that I didn’t get to see Josh breaking Good Robot in the stream. But you know what would make up for it? A quick series in which Josh tests the game on whatever setups he can get his hands on. I’m going to go out on a limb here and predict hilarious bugs.

  15. jw says:

    > It’s the source of about 80% of all of my technical glitches, with multi-threading making up the other 20%.

    Could you in a future post expand on how you use multi-threading? Do you multi-thread internally in some components (e.g. parallelize the rendering, or the AI), or do you actually run different components of your game in parallel (e.g. AI is running in a separate thread)? The latter seems like it would be littered with technical difficulties (synchronization issues, race conditions…), especially in a low-level language such as C++. Do you use any typical ‘patterns’ when parallelizing, or any abstractions (e.g. existing libraries or typical techniques)?

    1. lethal_guitar says:

      +1, I’d very much like to hear about that too!

    2. Veylon says:

      Actually, C++ has cleaned up the multithreading quite well recently. All that mutexing and atomics and syncing is bundled into the STL now and can be handled in a sane manner.

      On the other hand, given that Shamus still apparently relies on open() and unlink() from the earlier days of C, he may not have tackled any of the cutting edge C++11 stuff yet. So count me as interested, too.

  16. sab says:

    NVIDIA and ATI always find a way to interpret the spec differently, and those differences aren't even consistent across different driver versions.

    Heh. You should try web development sometime ;)

    Speaking of which, how about making a html5 game using canvas or webgl? :D

  17. Jacob Albano says:

    Not that I didn’t enjoy the rest of the article (I did), but I just wanted to comment on that game menu in the screenshot. There’s just something really nice about the design.

  18. Arvind says:

    Hey Shamus,

    Are most of your cross platform problems related to the renderer? Are you using just OpenGL, or is there a layer like SDL involved?

    Arvind

    P.S. I hate platform specific bugs.

  19. 4th Dimension says:

    Nice. Well, not nice in the way that you had to work hard simply to port already portable code, but nice job.

    Also, have you gone into how you decide where to spawn enemies? You did go into how your level specs work, and they of course have their specs about what robots are present. But I don’t think there was much about where they are spawned.

    1. postinternetsyndrome says:

      He did mention something about generating spawners in dead-ends, but it was an aside I believe. I too would like to read more about the actual game design.

  20. Volfram says:

    I built a website using Microsoft tools once. Not only did it force me to jump through all sorts of awful hoops that went away when I switched to writing the HTML directly in plain text, but I found out it was actively checking whether a browser was MSIE or not MSIE and displaying a deliberately broken page in the second case.

    For that reason, I never use Microsoft tools when a cross-platform option is available. They try to enforce operating system lock-in.

  21. Didero says:

    Don’t leave us hanging! What IS the correct multi-platform way to reference folders and files? Forward or backward slashes?

    1. Mephane says:

      Forward slash is used by Linux and accepted by Windows (though the latter delivers paths with backslashes, using forward slashes yourself works fine). I suppose Mac OS should also use forward slashes.

  22. The RIght Trousers says:

    Yay! Shamus, if you sell this, I can SEND YOU MONIES because I am one of those weird Linux people!

  23. Noah Gibbs says:

    Wow! A few hours to port is, like, borderline sane! That’s *amazing*!

    Which tells you a lot about my expectations ;-)

  24. Lalaland says:

    “..So that's the adventure porting the game to Linux. It “only” took a few hours, which is either amazing or horrible, depending on your expectations.”

    This is approaching ‘a Wizard did it’ levels of amazing. I doff my cap to you, Peter Olson and Shamus Young!

  25. Decius says:

    It seems to me that most of those issues could be resolved with an “interpret this as ported Windows code” option on the Linux compiler.

    or are

    open ("mygame\\games\\saves\\savefile1.sav");

    open ("mygame/games/saves/savefile1.sav");

    meaningful ways to open two files with different paths?

    1. Actually the issue is resolved by using concatenation instead: you use a compile-time directive of sorts to set the / or $ for that platform, and then store the path and point a variable to it.

      “/” (or “$”) would be in a path_separator
      “mygame” Would be in a path_program
      “mygame/saves” would be in a path_saves
      “mygame/saves/savefile1.sav” would be in a path_saves_current

      Normally you reference a file or files in a directory; it’s rare to reference files in multiple directories at the same time.
      The savegame/loadgame code only pokes around in “mygame/saves” and the texture loader in “mygame/textures” and so on.

      With a modern programming language it’s tempting (and I’d even suggest it) to enable compiler features like string clustering and storing a static string only once even if it’s referred to multiple times. But even then I still prefer to put paths in variables; that way I can simply use the filename without having to worry about the path (which could have / or $).

      And in case you wonder why I wrote path_saves instead of path_program_saves, it’s because at least on Windows it’s more common to put savegames either under My Documents/My Games/Program/Saves or in the AppData folder somewhere instead of the program installation folder, so the path_saves variable could easily point to anywhere.

      Note: Please mentally replace $ with a backwards slash, for some reason the blog commenting here eats up backward slash.

    2. Alan says:

      The compiler is perhaps a bit low level to do that sort of trickery. You could certainly create a wrapper (smart_open) that knew how to do the conversion. On some compilers, you could even transparently replace calls to open with calls to smart_open, although that’s more magic than I care for. If I was concerned, I’d create smart_open, then replace all calls to open with smart_open. There are a few techniques to help flag accidental calls to open.

      Since pretty much everyone except Windows went with slashes, and reportedly Windows will handle slashes correctly, I’d be tempted to just use slashes everywhere and call it a day. That assumes you’re not showing the path to end users, of course, who will be unamused when asked to understand an alien format.

      Roger describes a common solution. It’s arguably the correct solution; it sacrifices conciseness for completeness. If you take this idea further, you create a dedicated API for managing paths. This can pay off if you want to support non-file locations on Windows, Mac OS 9, or some Linux frameworks. For example, it might allow you to read a file inside of a zip file, or off of a network share that isn’t mounted.

    3. Christian Severin says:

      Allow me to mention the Boost Filesystem library for system-independent file wrangling.

      1. Veylon says:

        I’ll second this. Although it’s one of the few boost libraries that needs to be compiled and linked to, it’s very handy for all those directory-walking needs.

    4. Bryan says:

      On Linux “mygame\games\saves\savefile1.sav” is a perfectly valid filename. It’s the name of a file in the current working directory (and not the name of a file three subdirectories deeper than the current working directory, which is what “mygame/games/saves/savefile1.sav” is), which has three backslashes in its name.

      So you can’t put a wrapper around open() that does something like “convert all backslashes to forward slashes” — because backslashes are perfectly valid characters to put into a filename. (The only characters — well, no, that’s wrong; bytes — that are invalid in a filename are zero and 0x2F — zero is the end of the string, and 0x2F is the directory separator. Anything else can be part of the filename, and should not be changed by generic code.)

      I don’t know how Boost does it in C++, but in Python this is what os.path.join() is for. os.path.join(“mygame”, “games”, “saves”, “savefile1.sav”) will give the right path on any OS that Python runs on. Of course, “rewrite Good Robot in Python” is a terrible suggestion, but its variable-argument setup makes this particular problem’s solution a little less verbose. :-)

  26. Anachronist says:

    I find it ironic that Shamus has spent so much time with Microsoft development tools for the same reasons that I have avoided them. Like Shamus, I’m also self-taught in multiple languages (C, C++, Java, PHP, C#, Ruby, HTML/CSS, etc.). I have no CS degree. I did a semester of FORTRAN and assembly, as electives, in the early 1980s, although I taught myself BASIC on mainframes before that (with a 300-baud acoustic modem).

    From day 1, Microsoft has happily ignored standards and made up their own. Path names with forward slashes were well established before MS-DOS, with its CP/M heritage, established a new standard with backslashes. Standards bodies, in which Microsoft participated, established standards for header includes and such, which Microsoft ignored. Standard library functions are not named properly. And on. And on.

    Makefiles were standard before Microsoft came along and did their own thing. IDEs managed just fine with Makefiles (even some I remember using under MS-DOS, Zortech C++ comes to mind although I may remember incorrectly). Microsoft could use makefiles if they wanted to, but they admittedly don’t need to because they bind the programmer so tightly to MS-specific features that a Makefile would be useless in other operating systems due to the MS dependencies. The SAS C++ compiler for Amiga used makefiles, but you couldn’t really port those programs either if you relied on the Amiga system libraries for windows, mouse events, etc.

    Even though my computer is a Windows machine, if I want to do any development work, with the exception of C# projects, I use a Linux virtual machine. The Microsoft Visual IDE is nice, but that doesn’t mean non-MS IDEs aren’t available.

  27. Deoxy says:

    If you're using Microsoft tools, you don't have a makefile. Microsoft uses project files. It's the same idea except it's, you know, different.

    This is the entire Microsoft business model – own the standard. If something is a good idea, make your own version of it and try to force all other versions of it out of existence.

    1. Richard says:

      And that’s why I don’t use the Microsoft IDEs.

      I do use MS’s compiler for Windows targets because it produces pretty good binaries for Windows, so obviously I do collide with the fun and games of “So where did you put the standard libraries?”, which forces me to write to the “lowest common denominator” of GCC and MSVC2010.

      I use Qt Creator as my IDE, which again uses its own form of project file but transparently handles all the makefiles for multiple target systems, and oddly, is actually much, much faster at compiling & linking for Windows than using Visual Studio itself.

      (Mostly due to the magic of jom.)

      I still cannot figure out how you’re supposed to distribute the MSVC2010 libraries though!

      Everywhere I look gives a different answer, including different parts of MSDN.

  28. Rosseloh says:

    One person using Windows XP reports that all of the robots get more transparent depending on how bright their colors are, which means there's some shenanigans going on with the alpha channel, but ONLY on Windows XP.

    Wasn’t there something related to your comic-making program that was practically the same issue? I know at some point there was something funky with how XP handles alpha channels.

  29. Neil Roy says:

    Love CODE::BLOCKS. I use it on Windows, and you can download it for Linux, load up your project you started on Windows and compile away.

    You could also do what you are doing, let someone else work on the Linux port, just make sure your code is commented for them properly. Problem solved. ;)
