The Price of Progress
2010-01-21 — Faster hardware, slower software. What's wrong with this picture?

I recently installed the beta of Microsoft Office 2010, and the first thing that struck me is how noticeably worse it performs on my 3.0 GHz quad-core gaming PC than Office '98 performed on a now 12-year-old PowerBook G3, powered by a little 250 MHz PPC processor.
You can probably guess the next stage of this anecdote... Office '98 on that G3 performed ever-so-slightly worse than Office 4.0 on a truly ancient PowerBook 180, which sported a fantastic (for the time) 33 MHz Motorola 68030 CPU.
Now, I am not being entirely fair here - the spellchecker is much faster, the grammar checker didn't even exist back then, the user interface only had to render at 1024x768 in those days, and various other ancillary features have been added and improved. But the core issue remains: Office 2010 (or 2007, which is not a beta) running on a decent gaming PC takes longer to launch and is less responsive to keyboard input than Office 4.0 on a 33 MHz 68k.
And the problem isn't restricted to Microsoft products: many pieces of software have suffered the same sort of creep, not least the Mac and Windows operating systems themselves.
In the open-source world and among smaller developers this phenomenon is far less common: a well-configured Linux or BSD installation boots in a handful of seconds, Blender (sporting most of the features of expensive software such as 3DS Max and Maya) launches immediately and always remains responsive, and while Maxis' Spore takes minutes to start up and load a game, Eskil's Love throws you into the game in under ten seconds.
My current computer is many thousands of times faster than that PowerBook 180, so in theory at least, we should be able to do far more, and do the same old things much faster. Why then the slowdown?
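To put a rough number on that "many thousands of times" claim, here is a quick back-of-the-envelope sketch in Python. The clock speeds are the ones mentioned above; the instructions-per-cycle figures are my own ballpark assumptions rather than measurements, so treat the result as an order-of-magnitude estimate, not a benchmark.

    # Back-of-the-envelope: a 3.0 GHz quad-core versus a 33 MHz 68030.
    # Clock speeds are taken from the machines above; the instructions-per-cycle
    # (IPC) values are rough assumptions, not measurements.

    old_clock_hz = 33e6   # PowerBook 180's 68030
    old_ipc = 0.3         # assumed: the 68030 needed several cycles per instruction

    new_clock_hz = 3.0e9  # the quad-core gaming PC
    new_cores = 4
    new_ipc = 2.0         # assumed: a modern superscalar core retires ~2 instructions per cycle

    old_throughput = old_clock_hz * old_ipc
    new_throughput = new_clock_hz * new_cores * new_ipc

    print(f"Rough speedup: {new_throughput / old_throughput:,.0f}x")
    # Prints roughly "2,424x" - and that is before counting caches, SIMD and
    # memory bandwidth, which push the real gap further still.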
It can't be lack of resources - we are talking about companies such as Microsoft, Apple and Adobe, all with enormous R&D and development budgets, and teams of experienced programmers and engineers. Besides, the open-source guys manage just fine, some with just a handful of programmers, and most with no budget whatsoever.
It has been argued that programmer laziness (a.k.a. badly educated programmers) is to blame, but I am not sure this can be the entire story. Certainly the 'dumbing down' of university-taught computer science hasn't helped, nor has the widespread rise of languages that 'protect' the programmer from the hardware, nor the rise of programming paradigms that seek to abstract away low-level knowledge. But that is the topic of another rant, and is somewhat tangential to the topic at hand. Companies can afford to hire the best programmers, and could, if they wanted to, create the demand necessary to reform educational practices.
And that brings us to the real heart of the issue: software developers measure success in terms of sales and profit. As long as your software sells, there is no need to spend money on making it perform better. And if you happen to have a virtual monopoly, as Microsoft does with Office and Adobe does with Photoshop, then there is no incentive to improve the customer's experience beyond what is needed to sell them a new version each year.
However, when you lose such a monopoly, the game changes, and it generally changes for the better. When Firefox, Opera and later Safari started cutting a swathe through Microsoft's Internet Explorer monopoly, Microsoft was forced to adapt. The latest version of Internet Explorer is fast, standards-compliant, and relatively free of the virus-infection risks that plagued earlier versions.
This outcome of the browser war has led at least a few to conclude that open source is the answer: that open-source software will inevitably recreate what has been developed commercially, and either surpass the commercial product or force it to evolve. Sadly, I don't see this happening particularly quickly, or on a wide scale - OpenOffice is playing catch-up in its effort to provide an out-of-the-box replacement for Microsoft Office, GIMP lags far behind Photoshop, and Linux, despite widespread adoption in a few key fields (namely budget servers and embedded devices), still trails Windows and the Mac in many areas.
For many years this wasn't a problem - every few years you would buy a new computer, typically an order of magnitude faster than the computer it replaced. If new versions of your software consumed a few million more cycles, well, there were cycles to burn, and besides, the hardware companies needed a market for faster computers, didn't they?
Nowadays the pendulum is swinging in the opposite direction. Atom-powered netbooks, Tegra-powered tablets, ARM-powered smartphones - all of these promise a full computing experience in tiny packages with minimal power consumption. Even though the iPhone in your hand is considerably more powerful than that 33 MHz PowerBook 180, it offers only a fraction of the computing power of your shiny new laptop or desktop. And users expect a lot more than they did in the early nineties - animated, full-colour user interfaces, high-definition streaming video and Flash applications - oh, and don't drain the battery!
CPU cycles today are becoming as precious as they ever were, only now many of our programmers have no experience of squeezing every last drop of performance out of them. Has the business of software development come full circle and once again become the territory of the elite 'low-level' programmer?