Much was made recently by Microsoft about the “Apple Tax” – but a closer look at how much it costs to run a Microsoft OS shows substantial costs of its own, some not so hidden.
There is no doubt that Apple’s hardware is more expensive than comparable “PC” hardware – the PC hardware business for everyone but Apple is extremely competitive, characterized by major and minor players working with margins that are razor thin. I love competition, and overall consumers have benefited when buying PCs.
Or have they?
Buy a PC and you get a Windows OS with it – like it or not, that’s the game. And along with the Windows OS, you pay more than the raw cost of just the OS – you also pay in three areas that turn out to be fairly substantial costs:
Need I mention that, due to glaring security gaps in the way Microsoft engineered its OS, an industry grew up to protect Windows PCs against the evildoers who routinely mount attacks of varying degrees against your PC? What I did not appreciate is how large this business has become. According to the Gartner Group, the software security industry racked up $13.5 billion in 2008 – that’s not chump change.
What’s not so obvious about this number is that there is a significant expense on the server side to protect clients against the who-knows-what threat of the day. Email, for example, is a favored carrier for all manner of scams and malware, and ISPs spend considerable sums to protect their clients (and themselves) with security gear and software products.
Guess who pays?
Consider that you, dear reader, absorb that $13.5 billion, one way or another, as your cost for using a Microsoft OS. Even if you use Linux, your ISP is still paying for security gear, and you pay for it every month.
Now let’s hand it to Microsoft for owning up to this expense and the vulnerability of its OSs by developing its own “free” anti-malware security software, now called Morro. It isn’t really free – Microsoft will not absorb the cost gratis. How effective it will be, and how consumers will react, is a bit of an unknown, but I’m fairly certain that the Joe-sixpack crowd will jump on it at the expense of the likes of Symantec. And I reiterate – it’s not free; you pay for it as part of Microsoft’s operating costs. It’s just hidden.
An OS is developed under certain assumptions regarding the hardware that will run it. Microsoft developed Vista assuming PC hardware was on an ever-increasing performance curve. Ergo, consumers discovered that the privilege of running Vista required a hardware upgrade. Fast forward to the real world and we find hardware going in the other direction, contrary to Microsoft’s assumption. Need I say netbooks? Need I say net-tops?
The fact that netbooks run under XP is a testament to how much consumers changed their PC buying habits, to the detriment of Vista. Son-of-Vista Windows 7 recognizes this issue by offering an emasculated edition specifically so it can run on under-powered (maybe “sufficiently powered”?) netbooks.
I have an old, small laptop that has 64 MB of RAM and uses a Pentium 366. By any measure, it’s a hardware relic. I did manage to run XP on it, but sometimes you can see the screen slowly painting across the LCD. I don’t think you’ll see the same with the emasculated version of Windows 7, but you will not get all the eye-candy that comes with it. And if you upgrade, you will see a performance hit. I used my son’s Dell Mini with XP, and for me it was an excruciatingly slow experience.
So we have another expense: hardware upgrades to enjoy the “Windows Experience”. I have no such expense using Ubuntu – it works just fine with minimal gear. Fortunately hardware costs have come down, especially for RAM, but nonetheless you pay extra to run a Windows OS acceptably.
And what does “acceptable” really mean? Like it or not, Microsoft engineers its OS under the “bigger is better” school. Each subsequent version of its OS incorporates more eye-candy and features than the last – otherwise why upgrade? Aside from the fact that old OSs will lose support at some point, Microsoft has to obsolete its older OS to keep the juggernaut running profitably. Persuading consumers to pay for a new OS requires a feature set that is clearly to the consumer’s advantage, such that it overcomes the “purchase hurdle”.
Vista did not do that, partly, I feel, due to a feature set that was underwhelming. In addition, this feature set had performance issues that became well known – I think a direct result of the “bigger is better” development scheme. I don’t know how many lines of code were developed for Vista, but it was substantially more than for XP.
I use Ubuntu, and one thing I see right away is that it boots up far faster than Windows XP – and shuts down in under 10 seconds, compared to about a minute for XP. My son has a Dell Mini, and he recently turned it into a “Hackintosh”. The single most notable thing I experienced in using it was how much faster it is running a Mac OS compared to what I saw with XP – “night and day” is about the right phrase. It’s now fast, does not bog down, and love it or not, Mac’s OS is (IMHO) a better OS overall than Windows.
So how much time is spent among the masses waiting for a Windows OS to boot or shut down? How much time is spent waiting for things to happen when you load an app? How much time does the IT guy spend on keeping things humming for Windows users that call with performance issues?
Anecdotes I hear from IT guys who support a Linux population suggest that their support calls fall to almost zero. More complex systems naturally tend to do more unanticipated things than less complex systems, and organizations and individuals pay for this complexity in the time and effort needed to keep things running well. Think of the number of third-party apps geared to keeping Microsoft’s registry from bogging down system performance and you get a hint of the hidden costs.
These three areas are the hidden (maybe not so hidden) costs you pay when using a Microsoft OS: protection against malware, upgraded hardware, and the time lost to bloatware. I may spend some time developing a cost estimate, but for now my guess is that it’s as much as the supposed “Apple Tax”.