Saturday 1 December 2012

Opinion: Programmers getting lazy?

It's no secret that modern PCs are more powerful than their predecessors.  CPU power is up by orders of magnitude, with multiple cores each capable of processing several threads at a time.  This is in turn backed up by graphics cards that put hardware from even five years ago to shame, and more RAM than some ten-year-old PCs had in total storage.  And the best is arguably still yet to come.

So why, then, are we hitting a wall when it comes to software?


There's no doubt that modern software is superior in many ways to its older counterparts.  This, of course, is most noticeable in videogames.  Over the course of 30 years, we've gone from around 60,000 pixels in just 16 colours to multi-megapixel screens, sub-pixel accuracy and many millions of colour options.  From single-layer 2D representations to quasi-realistic 3D worlds, sometimes even rendered in stereoscopic 3D.  Over the same time, intelligence and immersion have also increased, though perhaps not quite by the same amount.

But has all this extra power started to make programmers relax their efforts a little too much?  When storage and power simply aren't considerations anymore, there's less drive to ensure that what you're doing is being done in the best possible way - or even in a pretty good way.  Once upon a time, things such as texture mapping and sometimes even solid polygons were sacrificed in order to keep performance up; even though "up" in this case still meant a framerate in the single digits.  Now, though, the expectation - not entirely unreasonably - is that something powerful enough will be coming soon, so it's not really a problem.

If I had to pin it down, I'd say this line of thinking really started in the early '90s - courtesy of a fellow named John Carmack, his company id Software, and the Doom engine.  Doom, at the time, could not be run with its settings at maximum on any commercially available system without a sub-par experience.  It wasn't until the 486DX2 was reasonably affordable and available that you could finally enjoy Doom the way Carmack had intended.  By the time Doom II came out, this was of course a non-issue as it used exactly the same engine, but Quake was just around the corner and the whole process started over.

More recently, of course, the attention has been focussed on Crytek's Crysis, made infamous by the phrase, "But can it run Crysis?"  Even today, on machines significantly more powerful than those available at the time of its release, Crysis still cannot be run with absolutely every setting turned up to its maximum while giving an optimal experience.  My own PC, which is no slouch, still has to leave FSAA disabled to average in excess of 60fps at 1080p with everything else turned up as high as it will go.

Sometimes, poor performance has a reason.  Sometimes developers are legitimately trying to do something which the PCs of the day simply cannot do.  Carmack, for example, pushed the envelope with each of his major titles - Wolfenstein 3D, though not the first publicly available 3D title, is easily the best-known and often considered the father of first-person shooters; Doom added elevation and perspective correction; while Quake was arguably the first fully-3D game, in both view and mapping, signalling the dawn of modern 3D gaming.  Many will argue that Crysis pushes similar envelopes, but personally I'm not convinced, as there are many other titles which are similarly striking but offer significantly better performance and gameplay.

But Crysis is not the worst offender I've ever seen - nor is it the most recent.  Not even the likes of The Witcher 2, with its "Ubersampling" feature, takes that crown.  No, the worst I've seen in recent times is probably one of the simplest games I've come across in a very, very long time - a top-down, Asteroids-style shooter called "Violet Storm", which I found on the Microsoft store.  Visually, it appears to be little more than a collection of transparent shapes which could be laid out on single polygons - circles and triangles, for the most part.  And yet, it seems the programmers either forgot their target audience or simply didn't care to check their code.
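To put that in perspective, here's a rough sketch of how a scene like that could be drawn: gather every flat shape into a single vertex array each frame and submit the lot with one alpha-blended draw call.  The struct and function names below are purely illustrative (I have no idea how Violet Storm is actually put together, and the context/window setup is omitted), but a few hundred triangles handled this way is a trivial workload even for integrated graphics.

// A rough sketch only: batch every flat, alpha-blended shape into one
// vertex array and draw it with a single call.  Names are made up for
// illustration; GL context and window setup are omitted.
#include <vector>
#include <GL/gl.h>

struct Vertex { float x, y; float r, g, b, a; };

static std::vector<Vertex> frameVertices;

// Append one triangle (three vertices) to this frame's batch.
void pushTriangle(const Vertex& a, const Vertex& b, const Vertex& c)
{
    frameVertices.push_back(a);
    frameVertices.push_back(b);
    frameVertices.push_back(c);
}

// Submit the whole frame in one draw call - a few hundred triangles of
// this kind barely registers, even on integrated graphics.
void flushFrame()
{
    if (frameVertices.empty()) return;

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    glVertexPointer(2, GL_FLOAT, sizeof(Vertex), &frameVertices[0].x);
    glColorPointer(4, GL_FLOAT, sizeof(Vertex), &frameVertices[0].r);
    glDrawArrays(GL_TRIANGLES, 0, (GLsizei)frameVertices.size());
    frameVertices.clear();
}

Whether it actually works anything like that I can't say, but the point stands: there is nothing on screen that obviously justifies struggling on modern hardware.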

What bothers me most about this particular title is that it runs badly on what should otherwise be a decent enough system.  I refer to a Samsung XE700T1C, with its dual-core, quad-thread Ivy Bridge Core i5 running at 1.5GHz, 4GB of RAM and onboard Intel HD Graphics 4000.  The problem is that, on battery power, as soon as a particular enemy gets onto the screen (it looks like a collection of about 8 triangles) the whole experience tanks severely.  Since the game appears to be aimed at tablet users (it features two "virtual" analogue controllers on-screen) and the average tablet is not going to be plugged in while it's in use, this would appear to be a fairly major oversight from the developers.

Now I realise that's just one example out of quite literally thousands, but it still raises the question: are programmers simply not doing as well as they could anymore because they can get away with it?  And should we, as the buying public, accept such practices?

Please remember this represents an opinion based on information assumed to be accurate at the time of writing.  You are free to agree or disagree with me, but I'm entitled to have an opinion just as much as anyone else.
