Crowdsourcing Advice: Result

By Shamus
on Monday Mar 17, 2014
Filed under: Personal

So last month I mentioned that my son’s laptop died, and I asked for some advice on what to get and what to avoid. We got some good feedback and got him a reasonable machine without breaking our budget. (Some people even threw a few bucks at us via PayPal to help with the purchase. Thanks!) I didn’t think about it again until last week, when someone emailed me and asked for a follow-up. It didn’t even occur to me that I’d sort of left you hanging.

So for the curious, here is what happened…

Our big concern was price. A lot of the advice centered on the idea of “you know, you can get a LOT more power for just a LITTLE more money”. Which is true, but that’s sort of always true at these lower price levels.

Aside: See, it’s tax time, and tax time can be tough on self-employed types who don’t have their taxes automatically withdrawn. We have a savings account where we set aside money for paying taxes at the end of the year, but I totally forgot to allow for the fact that we’re no longer homeowners. In the US, the interest on mortgage payments is tax-deductible but rent isn’t. So suddenly we’ve got thousands of dollars of tax liability that we didn’t plan for. We’re fine now and we managed to get everything paid, but in the meantime we just did not have the money for buying computers.

So that’s why we were shopping on the inefficient low side of the price/performance curve.

But here is what we got him, which came as a bundle:

  • AMD A10-Series AD580KWOHJBOX Quad-Core A10-5800K Black Edition APU – 4MB L2 Cache, 3.8GHz, Socket FM2, Radeon HD 7660D, Dual Graphics Ready, DirectX 11, Fan – AD580KWOHJBOX
  • Patriot Viper Xtreme 4GB Desktop Memory Module – DDR3, 1600MHz, PC3-12800, CL 11, 1.5V – PX34G1600C11
  • Kingwin ABT-650MM Maximum Series ATX Power Supply – 650W, 120mm Fan, Single +12V Rail
  • Gigabyte GA-F2A55M-DS2 AMD A55 Motherboard – MicroATX, Socket FM2+/FM2, AMD A55, 1866 MHz DDR3, SATA 3.0 Gb/s, 7.1-CH Audio, Gigabit LAN, USB 2.0
  • Cougar Solution Black Steel Gaming ATX Mid Tower Computer Case – 1x120mm Cougar Turbine Hyper-Spin Bearing Fan, USB 3.0 port, Water Cooling Ready, Supports 320mm Long VGA Card, Air Filter Included
  • We were able to salvage the 230GB SSD from the dead laptop, so we didn’t need to buy a hard drive right away.

I must say that “Patriot Viper Xtreme” is really pushing my tolerance for ridiculous hyperbolic self-aggrandizing product names. If this trend keeps up then someday when I’m shopping for graphics cards I’ll have to choose between the AMD DEITY MEGA DRAGON SUPERNOVA and the NVIDIA SUPER BOOST LIGHTNING GIANTPENIS. Where does it end? This inflation of product name intensity is not sustainable!

[Image: issac_computer.jpg]

The bundle was ideal for Issac. He would have been really disappointed if the machine came pre-built. Buying a bundle let him have the joy of assembling the thing[1] without burdening me with the worry that maybe some esoteric detail would keep the parts from working together. I think we came in just a tiny bit under budget.

I had to donate my second monitor to the cause. A year ago I didn’t even use a second monitor, but after just six months I was completely dependent on the thing. Now I feel like I’ve gone blind in one eye. This is not nearly as cool as Nick Fury makes it look.[2]

So that’s the story. It plays the games he cares about, so he’s happy. I’ll talk a little bit about the games we’re playing together on the Diecast later this week.

Thanks for all the advice, everyone.

Footnotes:

[1] When I was twelve, I would have been thrilled to put a machine together. Now I’m 42 and I’d rather just use my afternoon to Get Stuff Done.

[2] People complain his character has no depth, which is obviously false. He just can’t PERCEIVE it.



Comments (101)

  1. Abnaxis says:

    Hey, I use that case for my PC. I like it a lot.

    EDIT: Incidentally, did you consider declaring a part of your apartment as your place of business, for purposes of a deduction? I thought I remembered a post saying you had a dedicated area where you make comics/posts/do blog-related gaming, so it might be at least a little helpful…

    • Geoff says:

      It’s been a while since I looked at it for myself one year (so my facts might be a little off), but the ability to do that is really janky and opens you up for trouble if strict adherence isn’t met. Basically, working in a dedicated office is best, as it can all be written off as a business expense. A closed-off, dedicated room at your home is harder, but still okay. If your “office” is a corner of a larger room that is used by multiple people (a loft, or a den), you probably don’t even want to go through the hassle.

      Even if you have a dedicated room and want to go through the trouble of figuring out how many square feet you use and the cost of rent per foot, it still has to be *dedicated* work equipment. If you work on it but you ALSO check Facebook, handle personal emails, play games, etc., then it is NOT dedicated work equipment. You can *try* to estimate that you use the space/equipment 40% of the time for work and 60% for personal use, or something like that, but it quickly becomes a mess just to accurately calculate how much you can claim. And all of that is assuming you never, ever get audited, at which point you actually have to *prove* those abstract claims somehow.

      When I looked into it, the complexity and risks never balanced out to the tax deductions I might have gotten from it. In any event, the 2.5% (or whatever you end up working out) of your monthly rent is FAR less than the interest on a monthly mortgage payment (which makes up around 50% at the beginning of the life of the loan).
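      For a sense of scale on that interest figure, here is a back-of-the-envelope sketch using the standard annuity payment formula. The loan amount, rate, and term are invented for illustration, not anyone’s actual mortgage:

```python
# Back-of-the-envelope check on "interest is most of an early mortgage payment".
# Loan amount, rate, and term are invented illustrative numbers.
principal = 200_000    # hypothetical loan amount
annual_rate = 0.045    # hypothetical 4.5% APR
n_months = 360         # 30-year term

r = annual_rate / 12                                   # monthly interest rate
payment = principal * r / (1 - (1 + r) ** -n_months)   # standard annuity formula

first_month_interest = principal * r
share = first_month_interest / payment
print(f"payment ${payment:,.2f}, "
      f"first-month interest ${first_month_interest:,.2f} ({share:.0%})")
```

With these made-up numbers, roughly three quarters of the first payment is interest; the share falls as the principal is paid down, which is why the fraction depends on where you are in the life of the loan.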

      • Abnaxis says:

        Following the link in the yellow-box text, though, it sounds like this is already the setup Shamus is using. He says they had to get a space where he could have a dedicated office.

        To me, this is exactly what the tax break is meant for: if Shamus is paying extra for a bigger apartment so he can have an office to work out of, he should deduct the expense of that shared footage as a business expense, because that’s what it is. With or without an audit, I think that’s following both the intent and the letter of the law, so it should be golden.

        • BenD says:

          It’s completely possible Shamus IS deducting this percentage and still not coming out ahead vs. the homeowner situation previous, depending on the interest they were paying on the mortgage and how large the mortgage was.

      • Jeff says:

        “If you work on it, but you ALSO check Facebook and handle personal emails”

        I think that means my office building is not an office.

    • Paul Spooner says:

      Yeah, I had my taxes done by a professional this year, and they encouraged me to set up a home office for tax purposes. Apparently you can deduct based on floorspace, though I don’t know if rent counts toward deductions. It’s all very complicated… on purpose I’m sure.

      • Abnaxis says:

        Maybe I’m weird, but I never found tax stuff all that complicated. Excruciating and slow to slog through, sure, but not hard to understand.

        In this case, my understanding is the deduction = [area of office] / [total floor area] * [total cost of floor area] (including utilities, rent, and maintenance you paid for). As long as you have a room dedicated to the purpose of a business, that you can grab a tape measure and quantify the exact floor-space of, you’re good to go (otherwise no-one would be able to agree on how much space corresponds to the “corner of the room” you did business in).
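        As a sketch, the proportional deduction described above works out like this. Every number here is invented purely for illustration:

```python
# Sketch of the proportional home-office deduction described above.
# Every number here is invented purely for illustration.
office_area = 120      # sq ft of the dedicated office room
total_area = 1_200     # sq ft of the whole apartment
annual_cost = 15_000   # rent + utilities + maintenance paid for the year

deduction = office_area / total_area * annual_cost
print(f"deductible: ${deduction:,.2f}")   # ten percent of the year's cost here
```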

  2. TouToTheHouYo says:

    Sounds wonderful. I’m personally iffy about integrated graphics and running anything less than 8GBs DDR3, preferably high speed, but if it fits your needs and doesn’t break the bank, all the better.

    I’ve never actually used integrated graphics before. I’d be interested to know just what that chip runs and how well.

    • Unfortunately this build won’t tell you much about the performance of the integrated graphics in the 5800K. That single stick of RAM is going to be a limiting factor: the SoC is dual channel, so it can address two sticks of RAM at the same time if you have two sticks, literally doubling memory bandwidth at the same frequency. Even when starting out with two channels (note this test starts with dual-channel, so this setup will be even more memory-starved when trying to do 3D rendering), there is enough GPU there to feed on as much bandwidth as you can give it.

      Remember that the Xbox One has barely a $100 AMD GPU’s worth of GCN blocks in that SoC (which is still quite a bit more than the 5800K, which also has older VLIW4 cores, not GCN) but feeds it with quad-channel DDR3 (and an eSRAM cache). The stronger PS4 just decided to make sure memory was there and pumped as much GDDR5 into the box as they could find, along with having 50% more GCN cores feeding on it. When it comes to ‘integrated’ graphics, the PS4 is a pretty clear indication that you can fit a $150 GPU’s worth of resources onto the side of a CPU block (okay, at that point the CPU block is actually a small thing on the edge of the GPU) without totally ruining your chances of being a high-quality rendering block.

      When you’re buying a $200+ GPU, you’re buying more processing resources than you can bolt onto the side of a CPU, but that die area will always keep shrinking while still giving ‘good enough’ rendering. When it comes to passable 3D, any SoC you buy for more than $50 isn’t a bad choice (even some of the mobile SoCs below that can do an okay job with a simple scene at 720p, or a very simple scene, say GTA3 on mobile, at 1080p), and once you avoid mobile or tablet parts, a 70+ watt desktop SoC will have a healthy clock and won’t be constantly bumping off a very low TDP limit.

      But rendering requires lots of memory accesses. That’s the gap in this build (and a completely reasonable one for a cost-conscious build). It also means that if money does turn up for a second stick of RAM, a decent performance boost will unlock with that (cheap) purchase, making the machine feel new again and letting games run more smoothly at higher settings.

      • Abnaxis says:

        This. Very much this.

        I always built PCs with an eye toward later upgrading, skimping on some parts of a build to spend more on others. For my current system, that meant settling for integrated graphics, which saved enough money for me to set up a pair of RAID’d SSDs for my primary drives.

        Predictably, I got really fast load times coupled with terrible performance after any level was loaded. Even on min settings, games were playable, but I frequently got headaches from the low FPS. And that was with two sticks of memory utilizing both channels. I shudder to think what would have happened if I had tightened that bottleneck more.

        • Paul Spooner says:

          Yeah, I always build with an eye toward upgrading too… But I never have actually upgraded. It always ends up being more worthwhile to buy a new one than hassle with compatibility of upgrades. Maybe my experience is exceptional here… but there you have it.

          • Mike S. says:

            My last couple of desktops have each gotten at least one video card upgrade and RAM expansion. The previous one also got a couple of hard drive additions/upgrades; this one is only a couple of years old, and hasn’t really needed it yet.

            (The previous desktop has also taken repairability of desktops to near-Ship of Theseus levels: in addition to the aforementioned upgrades, at different times I had to replace the PSU, warranty replace the RAM, and replace the motherboard. So while it’s the same case and CPU, it’s a matter for debate whether it’s still the same computer. Whatever it is, it’s running eight years after first building it.)

            Laptops, by contrast, rapidly reach the point where I’d really like to be able to upgrade, but can’t. The market has spoken, but I could wish things had gone in a more modular direction.

            • Paul Spooner says:

              It’s an interesting study in the various pressures of standardization versus performance. Envelope issues are always a problem for mobile devices of any sort. Desktop PCs have the luxury of volume and mass, so the internals are designed to have interchangeable (read as “inefficient”) fit and function with a relatively broad variation of form. Mobile devices, however (laptops, phones, etc.), need to be lightweight and small, so people feel much more comfortable altering the fit and function to compensate. The form is much more restricted as well, to keep volume down.

              However, we’re quickly approaching the lower limit of usefulness in miniaturization. Batteries compose a significant portion of all mobile devices, and heat elimination is a significant fraction of the volume of laptop internals. This is only going to become more problematic as hardware becomes more powerful. I’m fairly sure the optimization is leveling out, at least in terms of packaging volume, and weight. All of the disparate standards are going to sort themselves out to one or two primary methods, at least until another disruptive technology comes along.

              Practically speaking, I predict we can look forward to more modular laptop hardware in the near future.

              • ET says:

                We could have slightly smaller batteries if they were atomic!
                I don’t have my work to show off anymore, but I worked out the size/mass of a…strontium (?) powered thermoelectric generator, big enough to power a laptop.
                Normal laptop power needs of around 100W worked out to be reasonable, especially if you used a smaller chunk of your decaying substance, to charge a normal battery when you’re not using the laptop. :)

              • Trix2000 says:

                One can only hope. I had to open up my laptop recently to clean out the fans, and it was… involved. Felt like I had to unscrew every fastener in the thing – even had to take the hard drives out.

                Still, I’ve never been one to upgrade much when I get plenty of use out of them anyways. Of course, I’ve also tended to splurge a little on my computer purchases so upgrades wouldn’t have been all that big anyways.

          • Asimech says:

            I have several friends who have mentioned they’ve cheaped out on something, usually RAM, in order to “update later” and then never did. I had the same with my previous computer, but that was because it used DDR2 which had a price hike up to double what DDR3 was and Intel changed MCH frequencies so none of the CPUs on sale were supported by the board.

            With the current setup I did actually update my RAM, which was the only one I really planned to update, so all went according to plan. It helped that the RAM manufacturers had a compatibility checker on their sites.

            • Abnaxis says:

              I’ve had computers with planned updates that never happened and computers where they did.

              I have a theory that, the more up front it seems I have to disassemble a computer, the less likely I am to do the upgrade. Like here–I only needed to plug the card in, so it only took me a year to do it. The fact that it was a royal pain in the ass since I needed a power supply didn’t deter me because I forgot about it until I already had the card in hand.

              Other setups where I had plans to upgrade the processor or cooling system, OTOH, were stymied because there was more up-front investment of effort.

          • Humanoid says:

            My previous box, a C2D E6300, ended up going through four video cards, though it could be said they weren’t really genuine upgrades for upgrade’s sake – a 7900GT that died, an X1950XT with a bad cooler that I gave up on, an 8800GT that survived to the end of the box as my primary machine, then a hand-me-down HD5850 in its last days of my ownership, after which I passed it along to my sister.

            A 9800Pro in my previous P4 machine that died prompted buying that system in the first place, so that was a bad bad run for me in terms of hardware reliability. But anyway, that C2D also had its RAM replaced and an additional hard drive added, so it’s safe to say it’s my most upgraded machine over the years.

            Current machine runs it close though, a 2009 ‘vintage’ Lynnfield that’s had its RAM upgraded then later replaced altogether, one video card upgrade, and all its storage replaced (one SSD and one HDD removed, three SSDs and one HDD added).

      • Humanoid says:

        At this point, any future upgrade should probably involve either two sticks of high-speed RAM (then repurposing the existing stick in another machine), or a basic gaming video card. The latter is a no-brainer in this case, when the time comes.

        It’s unfortunate, but such is the case with many bundles of this type: the seller only cares whether the combination works, not whether the combination works *well*. But yeah, at least there’s room to move a year or whatever down the track.

      • default_ex says:

        That’s not what dual channel means. It means the pipeline in the CPU, the RAM, and the bus between them is configured in such a way that it can perform two memory ops in parallel on each physical channel. The reason two sticks are required is to keep it all synchronized, with the pipeline tying everything together. There’s no reason it couldn’t be done on a single stick; it’s just more expensive and less reliable than spreading the workload.

        • Erm, I’m not sure I get you (you disagree with what I’ve said, I’m just not sure what you think I’ve said is wrong).

          Dual channel does mean there is a memory controller on the SoC that can address two channels of RAM at the same time. The motherboard traces are doubled up so there are two paths (64-bit wide, or 72-bit wide if you’ve got ECC support) to the RAM slots. That’s why the slots are banked per channel and you have to put the RAM in the right slots: half are connected via one trace, the other half by the other, so with 4 slots you can have both RAM sticks hooked up to the same trace and not get dual channel mode. Dual channel mode = double the bandwidth, by using both of the traces you paid for on the mobo and all of the memory controller on the SoC rather than just half of it. And you need two sticks of RAM to do it. Scale up accordingly for triple- and quad-channel designs.

          Current RAM is a great example of pushing every multiplier. Your normal memory subsystem (low/mid-end dual-channel DDR3, none of the exotic high-end triple/quad-channel designs) has a 64-bit-wide bus, multiplied by two because of DDR (two transfers per clock), multiplied by four because the I/O bus clock is four times the memory frequency, multiplied by two because you’ve got two independent channels in parallel. Every area they could pump, they have.
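          Those multipliers can be checked with a few lines of arithmetic. This sketch uses plain DDR3-1600, the PC3-12800 stick from the build list, as the example:

```python
# Worked version of the DDR3 bandwidth multipliers described above,
# for plain DDR3-1600 (the PC3-12800 stick in the build list).
bus_width_bytes = 64 // 8          # one 64-bit channel = 8 bytes per transfer
memory_clock_hz = 200 * 1_000_000  # base memory array clock
io_multiplier = 4                  # I/O bus runs at 4x the memory clock
ddr = 2                            # two transfers per I/O clock (double data rate)

transfers_per_sec = memory_clock_hz * io_multiplier * ddr   # 1600 MT/s
per_channel = bus_width_bytes * transfers_per_sec           # bytes per second
dual_channel = per_channel * 2                              # second stick, second channel

print(per_channel / 1e6)    # 12800.0 MB/s -- the "12800" in PC3-12800
print(dual_channel / 1e6)   # 25600.0 MB/s with two sticks in dual channel
```

The per-channel figure is exactly the “12800” (MB/s) in the module’s PC3-12800 name, and the dual-channel figure is what a second stick would unlock.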

  3. Horfan says:

    YOUR SON IS A BRONY.

    • Very much so. He was so excited when another of his friends declared his never-ending love for all things MLP. He also loved knowing that many of the readers here enjoy it. Knowing you are not alone in liking what is essentially, to the outside world, a “girl’s show” is very important to a 12-year-old boy.

      • Akri says:

        This gives me warm fuzzies.

        …And reminds me that I still haven’t watched Saturday’s episode.

      • rofltehcat says:

        Thumbs up!
        I personally don’t really like the show (although I have to say it has better production values and much less BS than other shows these days) and I think some of the related fandom is… strange.
        However, this does not mean that he shouldn’t watch it. It is a great show to watch for teenagers of all genders.

        But kids can be oh so cruel. I hope he doesn’t get made fun of at some point :(
        Having someone else his age who shares his enjoyment of the show is awesome!

        • Mike S. says:

          Watching my preteen nieces, I see some very mixed signals about how These Kids Today deal with these sorts of issues. On the one hand, they seem to have very firm ideas about what constitutes Boy Stuff and Girl Stuff. (One of them considers herself primarily into Boy Stuff, but even she’s not really willing to abandon the category.)

          On the other hand, their “boyfriend” (so-called by them, a family friend who’s held that status with two of them for a lot longer than I’d have expected)[1] attended a school costume event dressed as a girl. (Convincingly, not exaggerated. When he was pointed out, I literally didn’t recognize him till I was told what his costume was, even though he was waving to me.)

          That he felt comfortable wearing a dress and pigtails to school, even as a costume, and more, that there were no visible repercussions of the sort I’d have expected when I was eight, at least suggests some welcome changes in the world of kids.

          [1] He’s gotten two of my nieces into Minecraft and Doctor Who, so as far as I’m concerned his influence is entirely welcome.

        • BenD says:

          Where my wife teaches high school, bronyism is cooler than non-bronyism. Lack of tolerance for ponies is not tolerated. XD

      • wererogue says:

        I’m totally with him that Rainbow Dash is best pony! :3

      • The Rocketeer says:

        Speaking of esoteric interests, did you ever find a copy of “Come, Tell Me How You Live”?

      • Volfram says:

        The MLP community is one I find fascinating to observe from the outside, but which I am not comfortable attempting to integrate into. Or as I told one of my Brony friends, “Sorry, I’m not confident enough in my masculinity to watch MLP.”

        It is worth noting that while the series was initially targeted at girls between the ages of 6 and 12, the primary viewing demographic is actually men between the ages of 24 and 30. And not all of them have daughters.

        This is a fact which Hasbro hasn’t failed to notice, and as a result, later seasons have included aspects intended to appeal to the older and male audiences as well.

      • Kavonde says:

        ONE OF US! ONE OF US!

    • Shamus says:

      It’s true, he likes the show. Although I doubt he’s even aware of the term “brony”. He’s not really aware of the fandom or its controversies. He just found a show he likes and watches it.

      There’s something about that show. He’s got two older sisters. Through them, he’s been exposed to Powerpuff girls, Sailor Moon, and dozens of other “girly” shows. Never had the slightest interest in them. But MLP? He likes the show and his sisters don’t. Not sure why this show resonates and the others don’t.

      EDIT: I just checked, he has indeed heard of “brony” before.

      • Paul Spooner says:

        I’m a brony myself. Here’s an insight I’ve stumbled on in my effort to understand the phenomenon.

        I’m going to say a large factor is that all the main characters in MLP:FIM are basically male role models. Their characters are largely defined around their social roles, which are largely professional. They have their own businesses, and maintain their friendship despite the demands of their profession (classic traditional male roles). None of them have children, or take care of a household (classic female roles) in any traditional sense.

        If I told you I watched a show where the main character is a magical scholar who becomes a national ruler, and who has friends who are a farmer, a veterinarian, an athlete, a tailor, and an event organizer, would you guess that it has an all-female cast targeted at young girls?

      • Rax says:

        Well, let’s all hope he had safe search on when he googled for rainbow dash wallpapers.

      • Thomas says:

        But the Powerpuff Girls were amazing…

      • Destrustor says:

        Incidentally, now I’m wondering if you’ve ever been made aware of Friendship is Dragons?
        It’s basically DM of the Rings applied to MLP.
        Even more awesome than it sounds, even for those who are usually only interested in one half of the combination.
        And the comments section is usually filled with a multitude of gaming anecdotes and interesting tales. Some say the comments are half the fun of every page.

        • Kavonde says:

          I don’t know if it’s just me, but FiD’s server has some serious load time issues that have made it a real chore to archive binge. What’s there looks funny, but not funny enough to warrant two or three minutes of loading per page.

          But yes, Shamus, the genre you created has been ponied. So it must be with all things.

  4. adam says:

    When you say that mortgage payments are tax-deductible, I assume you mean it’s the interest that’s deductible? Want to be sure I’m not missing some crazy loophole somewhere. :)

    • Shamus says:

      No, you’re right. It’s the interest. Fixed.

      • Mark says:

        It’s also the property taxes, so there’s a double tax benefit to homeownership that’s denied to renters.

        • Abnaxis says:

          Presumably renters get the benefit by paying less rent, since the building owners pay less taxes and pass the savings on.

          I personally find the notion…somewhat dubious, but I won’t get too far into it since it’s straying into political territory.

          • Mike S. says:

            Think about it this way: if landlords lost the deduction, would you expect rents to go up?

            (Of course, in the real world, the answer is complicated, because nothing in the economy happens in isolation. If there are lots of empty units on the market, maybe the landlord keeps the rent the same and eats the cost– or, if that tips them over into losing money, just exits the rental market. If the market for rentals is tight, then maybe rents are already going up and the increased tax is lost in the noise.)

            • Abnaxis says:

              I’m not a big fan of thought experiments in the social sciences. I much prefer actual data to back up any claims–unfortunately, I find the methodology in the papers I’ve read more than a little questionable (which isn’t rare any time I read economists’ papers) so I guess there isn’t much alternative.

              Specifically, I think prices in real estate and rentals are set by the quality of the facilities and by geographic location, and that if a landlord could charge more for the location and quality of their property, they would already be doing so, with or without the tax deduction.

              Of course, it depends on the landlord. There will be some pressure to raise prices, and as a result rents will go up, but dollar for dollar I’d be a monkey’s uncle if it averaged out to more than a few cents of rent per extra dollar of tax paid by the landlord.

        • BenD says:

          Property taxes are deductible so that you aren’t taxed twice (double taxation). If you don’t own property, you don’t pay property tax. This is not a benefit unless you mean in the sense of ‘I get to support civic improvements with my money,’ which is okay with me, but not with everyone, I suspect. In any case, paying tax is not a financial benefit of ownership, and paying tax less than twice on any given dollar is also not a benefit, just a good thing.

    • Peter H. Coffin says:

      Which, until you’re into the last hunk of the mortgage, is the vast majority of what one is paying out that isn’t property taxes, which are also deductible.

      ’sfunny: real estate interest is all deductible, but a boat or RV loan isn’t, even if you’re living on said boat/RV.

      • Abnaxis says:

        I can kind of see that. First, prices for RVs are way more volatile than homes: if you buy brand new, the deduction will probably outstrip the actual value of the RV within a couple of years, given depreciation. Second, there’s a whole issue of municipality to figure out for a vehicle that doesn’t come up with houses. Third, what about all the other stuff that becomes an issue if you count a vehicle as a home: gas, insurance, repairs, maintenance, sales tax, etc.? It would require an entire section in the tax code.

      • McNutcase says:

        Easy to explain that: if you’re living in a mobile thing, you can just up and leave whenever, so they try to gouge as much tax as possible out of you before you do so.

        Yes, I know that makes no sense, but really, very little to do with taxes makes sense beyond the basic concept of “collect a little bit of money from everyone, to pay for the things that no one person can afford but which benefit everyone by their presence”.

        • Coblen says:

          A lot of taxes are about persuading people to do “good” things, or punishing them for doing “bad” things.

          Your definition of either may vary.

          Where I live we have extra taxes on cigarettes. It was argued that people who smoke damage the health of the people around them, and this damage is paid for by our public health care, so we should tax the people who smoke enough to make up the cost of taking care of these sick people.

          • Joshua says:

            I was reading a case study recently in a class which said that the smokers actually end up subsidizing the health care of the non-smokers. People who smoke tend to die earlier, and pay extra taxes on those cigarettes. People who don’t smoke tend to live longer, until they hit that age in their life when the health costs really start piling up; smokers typically die before they hit that range. The ethical question from the case study is whether we should *encourage* people to smoke, as it’s better for the populace at large.

            • Decius says:

              Wait, there’s really someone who takes the ethical position that semivoluntary, indirect, expensive, uncertain, painful euthanasia that also harms the health of bystanders is both permissible and more effective than traditional euthanasia?

      • Paul Spooner says:

        I don’t know about you, but my interest is half my payment. Half != “vast majority” (though it is a much larger fraction than I’d like).

      • Decius says:

        Rich people who own congressmen don’t live in boats or RVs, but they do pay interest on their mortgage.

  5. Daemian Lucifer says:

    That’s an easy choice. It’s AMD DEITY MEGA DRAGON SUPERNOVA for guys and gals that like gals, and NVIDIA SUPER BOOST LIGHTNING GIANTPENIS for gals and guys that like guys.

  6. Seems like an OK all-round machine. Good thing you ended up with a quad core; these days I’d say quad is the minimum (and x64). Two cores is not that future-proof, and a single core is practically stupid; even mobiles/tablets are moving away from single-core CPUs.

    That 4GB stick is perhaps on the edge. Later, when possible (wallet allowing), pop in another 4GB stick with similar specs (the AMD CPU has a built-in memory controller that allows dual-channel memory handling; the effective memory speed will be that of the slower module).

    I know people will say that even 8GB is too little. But consider: my machine is only restarted when there are Windows updates pending that require a restart, my browser regularly ends up with a hundred tabs and easily gobbles 2GB of memory, and I’ve got Foobar2000 going, plus a programming IDE, Notepad++, various folders open, and a second browser. This is on Windows 7, though.

    I also use the hibernation feature instead of just powering off (so the state of my system is resumed at the next power-on), and the 8GB I have here is plenty; I have not managed to use up the memory yet during “normal” use. (As I’m a programmer, I have obviously managed to fill it up and crash the machine on occasion.)

    • Humanoid says:

      I sat on 4GB for about a year last year when one stick of my 4x2GB setup failed. I don’t do any ‘work’ on my PC, but I note it was completely fine for even the most demanding games.

      For practical reasons, I’ve since shuffled things around and gone back to 8GB (a new 2x4GB pair), but that wouldn’t have happened if I wasn’t going to shuffle around the old sticks for lighter duty – I’ve got an HTPC and NAS build coming up, which will use two and one of the surviving 2GB sticks respectively.

  7. The long product name stuff is silly.
    It is even worse in some cases as they not only have a 3-4 word long product name but the company may have a 2-3 word name (not counting any Inc. tacked on) so you get stuff like:

    Awesome Mega Corp Inc. presents Super Duper Hyper Derpy IIs X1.

    In my eyes, if your company name plus the product name together is more than 6 words long, you should fire the entire Marketing/PR department; and if the company name can be considered part of the product name, you should be able to do nicely with just 5 words max.

  8. Akri says:

    Fun fact: being blind in one eye does not actually ruin your depth perception. I have two friends who each learned to see depth using only one eye (the first only has one working eye; the second has two working eyes but his brain refused to use them simultaneously for many years, until he fixed it with the power of video games).

    And yeah, going from two monitors to one is terrible. It’s one of those situations where until you have a second monitor you can’t imagine needing it, and then once you DO have it you can’t function once it’s gone.

    • The reason depth can still be perceived is that the eye actually makes micro-movements, sort of a supersampling or intentional jitter. The human mind is freakish in its ability to sum/merge/interpolate and interpret sensory input.

      • There are also two other major methods in play:

        The lens changes shape to bring different light into focus, so we can tell comparative depth for objects about 20ft or less away (beyond this everything is effectively focused at infinity) by changing what is in sharp focus. It’s a slightly subtle trick compared to seeing two of an object, but if you’ve only got signal from one eye then you use what you’ve got.

        Your memory is a great thing, so you’ve also got temporal depth perception going on. This is something that optical tricks can break, but your low-level image processing is doing tests on movement and working out depth based on how objects move relative to each other (motion parallax). You can ‘see’ in 3D without seeing two images; in fact, unless you use a stereoscopic 3D monitor or a Rift/VR headset, you do this every time you play a 3D game.

        Cool note: the first of these effects is not seen in stereoscopic 3D (or classic 3D via a 2D monitor). Your depth focus is on the screen surface (or at infinity for VR, where lenses push everything to parallel, as your eyes are more relaxed when focused at infinity), so you’re actually operating without all of your depth perception tools in play. It’s one of the things thought to force an adaptation period when starting to use stereoscopic 3D (and VR) for people with two eyes that work together: you need to get used to changing the angle of your eyes to converge at different ‘depths’ without changing the lens shape, since you’ll always be focused at the same static depth.

        • Abnaxis says:

          Does using other objects as calibration fit into the temporal category?

          I’ve found I have to concentrate harder on depth perception if I’m in wide open space, without walls or lines to work with. I think it’s because my brain is doing some sort of odd trigonometry in my head, using datums provided by my environment to figure out distances, but I could be pulling this out of my ass.

          • I was rather conflating temporal with static (scene) depth perception there (or just ignoring that it exists) – the latter being where you, for example, see parallel lines converging at infinity/the horizon and apply all sorts of other shape-fitting functions to the scene to get an idea of depth. I skipped it rather than giving it a 3rd category, because monocular temporal perception (at most distances) and sharpness focus (up to the ~20ft infinity-focus limit) are pretty accurate, even if they can be tricked. Static monocular depth perception from feature recognition is really easy to mess with and screw up (which gives us those great optical illusions where you look at something, then move your angle of inspection and realise it was all different from how you perceived it – and those tricks work better on film, because you can’t use your eye’s lens to change your focal depth, or stereoscopic input from a second eye, to get accurate depth hints).

            By a lucky coincidence, there was just a Crash Course episode about just such a topic (there are many more great videos about this stuff on YouTube for the curious, but far fewer of them came out just yesterday): http://youtu.be/n46umYA_4dM

    • As to two monitors, I currently have that, but I would probably be fine with a 20-23 inch super-wide screen (like a 21:9 or whatever, the bonus being no bezels, unlike with two narrower screens).

      It is the benefit of a wider desktop work area that is hard to go back from.

    • Abnaxis says:

      Yeah, I have the latter problem. I was born with lazy eye (got corrective surgery when I was 1) and my eyes never learned to work together naturally. If I concentrate, I can get them working together and get stereoscopic depth perception, but I rarely do.

      Oddly enough, I think I work by just knowing how big stuff is–I know how big a desk is, how long a city block is, etc, so if I’m driving and something is a half-block away or a pencil is three-quarters across my desk, I can do well enough. Also, I’m really good with linear lengths of measure (I can usually eyeball a meter to within ~1cm), and as long as I have straight lines to calibrate with (lane markers, hallway baseboards, etc) I can usually pick out how far something is away in empirical units with a surprising degree of accuracy.
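      The “knowing how big stuff is” trick above is essentially angular-size trigonometry. As a hypothetical sketch of that idea (not a claim about how the visual system actually computes it): if you know an object’s real size and the angle it spans in your view, the distance falls out directly.

      ```python
      import math

      def distance_from_angular_size(real_size, angular_size_deg):
          # d = s / (2 * tan(theta / 2)): distance to an object of known
          # size from the angle it appears to span in your field of view.
          theta = math.radians(angular_size_deg)
          return real_size / (2 * math.tan(theta / 2))

      # A ~4.5 m car spanning about 2.6 degrees of view is roughly 99 m away:
      print(round(distance_from_angular_size(4.5, 2.6)))  # -> 99
      ```

      The further away the object, the smaller the angle, which is why this gets less reliable in wide open spaces with nothing of known size nearby.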

  9. KremlinLaptop says:

    I used to have one monitor. I used to have two monitors. I now have three.

    I lent one out to a friend for a few weeks. It felt like suddenly I’d lost an arm! I kept trying to move stuff over to that monitor or expecting things to open there (Might sound like some serious first-world-problems here but it does mess with the workflow).

    My three monitor setup usually works thus from left to right: Entertainment, work, and reference. So I have what I’m doing right in front, references to the right, and usually some youtube stuff (let’s plays) on the left.

    Oh and congrats on the new setup! Nice to have a follow-up on it.

    • Paul Spooner says:

      The way I deal with this problem is to periodically disconnect the extra monitor. It makes the setup easier to transport, and then the additional visual channels continue to feel like a luxury instead of a necessity.

    • MichaelG says:

      Well, count me as unusual. I have two, but leave the second one turned off most of the time. It’s just a bright area in my peripheral vision that distracts me when programming.

      The only time I use it is if I have a reference document or something I want to look at while working. Otherwise, one screen is fine.

      • ET says:

        I’ve only got a single monitor, and it works fine for me.
        The reason is that screens are wide now, and I (try to) keep all my code lines under the magic 80 characters.
        Plus, I have a hard time reading websites and such if the paragraphs are too wide, since I lose my vertical place when tracking from the end of a line on the right to the beginning of the next line on the left.*
        So, browser crap is resized into paragraph-friendly half-screen on the left, code on the right. :)

        * Seems like Shamus either suffers similarly, or it’s a widely-studied phenomenon and he sized his website’s columns accordingly.

        • Humanoid says:

          I don’t get distracted as such, but for a couple of reasons I keep my browser window to only about 60% of my right-hand screen: one, because it’s just not comfortable to turn my head that far around, and two, because not all websites handle 2560 horizontal pixels all that well and end up inserting too much whitespace on the left margin. Mind you, the whole setup is ergonomically suspect in that my neutral ‘forward’ position results in me looking at approximately the three-quarters mark of the left screen (the gaming screen, though they’re identical).

          I haven’t coded anything at home since I upgraded from my 19″ CRT about a decade ago, so that’s not a concern.

      • I tend to turn off the second screen at times too (I have it on my left) and use it for Foobar2000, documentation, notes, calculator instances, and debugger output. Must be a programmer thing. :)

    • rofltehcat says:

      I’ve been planning on getting a second monitor for ages.
      I have enough money for one, enough space on my desk for like 5 more… but then again I can’t really be bothered.
      What size do I go for? The same as the first one? Or go bigger? What contrast, etc.? Or get a super cheap old 4:3 smaller screen I’ll just use as a secondary?
      I’d probably profit a lot from it. Most gaming these days I do is playing round-based or slow stuff in windowed mode while doing other stuff at the same time.

      Meh, I’m so lazy and I’ve never been good with decisions… I’ll probably never buy a second one. I probably should have gone all out when we renovated and switched to a projector for gaming, but now the one wall where it’d project the picture is orange.

      • My advice: try to hold out for 21:9 to come down in price, and get one that is as small as possible. The wider workspace, without the bezel between 2 or 3 monitors, will be worth it. This is my own plan at least (I’ll probably keep one of the current monitors around, but it will most likely be off most of the time).

        Edit: Alternatively, wait for 2K screens to drop some in price, or try to hold out for a 4K.

        For example, I currently have a 16:9 monitor that is 21.5 inches with a native resolution of 1920×1080. This gives it a PPI (pixels per inch) of around 102, which is pretty high; in fact, 21.5 inches is the most pixel-dense 1920×1080 monitor size available.
        Larger monitors with the same resolution have a lower pixel density, so you can see the individual pixels more easily.

        I created a tool to show/calculate the PPI and pixel pitch of a monitor (which may not always be listed in a monitor’s description):
        http://emsai.net/projects/widescreen/dpicalc/

        Hopefully it’ll be of some use.
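        The calculation itself is simple enough to sanity-check by hand. A minimal sketch of it (a hypothetical stand-alone version, not the linked tool’s actual code):

        ```python
        import math

        def ppi(width_px, height_px, diagonal_in):
            # Diagonal length in pixels divided by diagonal size in inches.
            return math.hypot(width_px, height_px) / diagonal_in

        def pixel_pitch_mm(width_px, height_px, diagonal_in):
            # Physical size of one pixel in millimetres (25.4 mm per inch).
            return 25.4 / ppi(width_px, height_px, diagonal_in)

        # The 21.5" 1920x1080 monitor mentioned above:
        print(round(ppi(1920, 1080, 21.5)))                # -> 102
        print(round(pixel_pitch_mm(1920, 1080, 21.5), 3))  # -> 0.248
        ```

        Plugging in a 24" 1920×1080 panel instead gives about 92 PPI, which is why the pixels are more visible on the larger screen.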

    • Humanoid says:

      Yeah, I went from 1×22″ to 2×27″ a few years ago. It’s both the best and worst thing I’ve done in terms of my setup. Best because oh my god all that extra space. Worst because I’m in a bit of a bind right now: I want a centre screen for better ergonomics, but would rather not be constrained by size. A 30″ would be perfect, and I know it technically works, but whenever I’ve worked with mismatched panel sizes in the past it’s been quite awkward.

      In hindsight I would have started with 1×30″ and gone from there instead of buying the pair of 27s simultaneously. But then I might have to disassemble my hutch to fit a trio of screens on my desk (it’s one of those L-shaped workstations), so maybe it was just as well.

      And now it seems 4K panels are coming in weird sizes like 28″ just to annoy us. (Not that I plan to buy one; most are still limited to 30Hz, which is just short of a scam in my view.)

    • wererogue says:

      I recently had to share my mac at work with someone else, so I moved it between us and turned off synergy (cross-platform screen/input sharing). Now every time I need to use that mac I still try to mouse over there…

    • Cuthalion says:

      Question on the dual-monitor stuff:

      I have two 4:3 LCDs side by side, running 1280×1024 each. This is great, except for games. Games will run in the left monitor, but tend to do weird things to the right monitor’s resolution, fail to start at all if I don’t turn the right monitor off first, or run really poorly unless I turn the right monitor off first.

      In fact, I suspect that I might get better fps in general if I intentionally disconnect the right monitor and change my Windows display settings to be single-monitor before I start playing games. Does anyone know if this is true, or if there’s some easy way to make it switch between games-friendly and work-friendly beyond just turning the monitor off?

      My video card is a lowish-end ATI Radeon HD 5450.

      • Humanoid says:

        I’ve only experienced the first thing myself, but it only happens if the game running on the primary monitor is running at a different resolution to the desktop – i.e. it basically only happens when I run a game for the first time.

  10. SlothfulCobra says:

    I’ve recently had to forego my second monitor too, and I’ve taken to overusing multiple windows at once to compensate. I used to just fullscreen everything.

    Two monitors can do things to you though. My eyes were starting to fall out of sync from watching two things at once, and I can’t have my computer around if I ever want to focus on a videogame.

  11. Dude says:

    Congratulations, Shamus and son!

  12. Noumenon says:

    Every dollar you pay in extra taxes because of no interest deduction represents about three dollars of interest you didn’t pay, so congratulations on your loss of that deduction!

    Wisconsin gives you something back for your property taxes: you disclose your rent (less heat, if heat is included in the rent), and it takes that off.

  13. ClearWater says:

    Isn’t there a law that says that the names of graphics cards become twice as ridiculous every two years?

    • Paul Spooner says:

      It used to be called “The More and More Law” but nowadays it’s “Maximal Grande Gigantor Immutable Maxim Supreme”.

      • Abnaxis says:

        You’re, like, two generations behind on your jargon now. After the MGGIMS, they released the “Maximal Grande Gigantor Immutable Maxim Supreme CXIII,” followed by the “Maximal Grande Gigantor Immutable Maxim Supreme CXIII Absurd,” aka the MGGIMS113-A, or “the A model” for short.

        I can’t wait for the 30-A model to come out. They’re supposed to weed out some of the tautological redundancies in that architecture, maximizing their PPW (posits per word) throughput.

  14. Norman Ramsey says:

    It sounds like any sacrifices you might have made in the power of the bundle were more than made up for because Isaac got to build his own PC. And it plays all the games he likes. Win!
