MS filed a patent application about 18 months ago detailing a means of pay-per-view PCing.
First, rather than paying for the time you have the computer, you pay for the time you actually use it, which means paying something like $1 an hour rather than something like $100 a month.
Second, the amount you pay depends on the intensity of your use. The computer you get would have multiple performance modes and software packages: you would pay less if you were just browsing and more if you were gaming. All this would be managed by hardware-based controllers.
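To make the scheme concrete, here's a minimal sketch of what such metered, intensity-based billing might look like. The mode names and hourly rates are invented for illustration; the patent application doesn't specify a fee schedule.

```python
# Hypothetical sketch of intensity-based metered billing.
# Modes and rates are illustrative assumptions, not from the patent filing.

HOURLY_RATES = {
    "browsing": 0.50,   # low-performance mode
    "office":   1.00,   # mid-performance mode
    "gaming":   1.25,   # full-performance mode
}

def session_charge(mode: str, hours: float) -> float:
    """Charge for one session: hours used in a given performance mode."""
    return HOURLY_RATES[mode] * hours

# A month of mixed use: mostly browsing, occasional gaming.
usage = [("browsing", 40), ("office", 20), ("gaming", 5)]
total = sum(session_charge(mode, hrs) for mode, hrs in usage)
print(f"Monthly bill: ${total:.2f}")
```

Under these made-up rates, a light-use month comes out well under a typical flat lease payment, which is exactly the appeal the scheme is selling.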
What should we think of all this?
First, this is just an idea MS is trying to get patent credit for. There’s no sign this is going to actually occur any time soon, or ever.
Second, when/if it does occur, it will become yet another computing ownership option. To be succinct, if computer leasing already holds some attractions for you (and if it doesn’t, this isn’t meant for you), this could make leasing even more appealing. A lot of money gets spent buying hardware and software that rarely gets used to its full potential, rarely gets used much, or never gets used at all. Paying by use, even if the fee seems very high compared to the overall cost of what’s being used, could well end up being cheaper if you use it rarely.
And that’s what is likely to be the major problem with this idea. MS is saying, “We’re going to give you this really hot box, but you’ll only pay a lot when you use its hotness.” That sounds good until you realize the lessee has to be given a hot box to begin with. Insofar as wasting resources goes, there’s no difference between a company buying a $3,000 computer that gets used like a $500 one 95% of the time, and a lessor providing the box. The difference is that the lessor will be the one wasting the money rather than the company. Overall, on average, if somebody gives you a $3,000 box on lease, the average lessee is going to have to end up paying $3,000 over the term of the lease. You can quibble about the exact amount, but no leasing company can stay in business if it pays $3,000 upfront for something just to get back, on average, $1,000 in lease payments. The average person is going to have to pay $3,000 in leasing fees for it, which hardly makes much sense if what you need is a $500-$1,000 computer and you can buy or lease one for much less money. If you’re a company that needs fire-breathing dragons every once in a while, just lease a few dragons for communal use and give everyone else plain vanilla for regular work.
Much the same can be said for computers-by-the-hour; six of one is still half a dozen of the other. There’s no magic way to make money renting stuff to others for much less than the stuff costs. No matter what, the lessor still has to cover the cost of the box and related services on the average box, whether it charges by the month, the hour, or by the user’s top Minesweeper or Crysis score. Yes, if some leasing firm did this, a company might initially save a lot of leasing money by cutting back on the amount of time computers were used, especially the . . . uhh . . . private use of them by the employees. However, what it would mean in the long run is that the leasing company would just jack up the rates (since the OEMs aren’t charging the lessors any less for the boxes), so you’d end up paying the same for half (or whatever) the use.
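The back-of-the-envelope arithmetic behind that argument can be sketched directly. All the figures here are illustrative assumptions (the $3,000 box cost from above, a made-up 36-month term), but they show why the per-hour rate has to climb as metered usage drops:

```python
# Sketch of the lessor's break-even arithmetic.
# Box cost and lease term are illustrative assumptions.

BOX_COST = 3000.0        # what the lessor pays the OEM for the "hot box"
LEASE_TERM_MONTHS = 36   # term over which that cost must be recovered

def breakeven_hourly_rate(hours_per_month: float) -> float:
    """Minimum hourly rate that recovers the box cost over the term."""
    total_hours = hours_per_month * LEASE_TERM_MONTHS
    return BOX_COST / total_hours

# Heavy use keeps the rate low; cut usage in half and the rate must double.
for hours in (160, 80, 40):
    rate = breakeven_hourly_rate(hours)
    print(f"{hours:>3} hrs/month -> ${rate:.2f}/hour to break even")
```

Whatever the actual numbers, the structure is fixed: metered billing redistributes the box cost across fewer hours, it doesn’t shrink it.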
This is cloud computing without the cloud. There’s no economy of scale here. This is like leasing Ferraris to little old ladies from Pasadena who’ll use them to drive to church on Sunday, charging them by MPH, and hoping that they’ll start going to the track to do quarter-miles on the way back from church. It’s not going to work too often, especially when they can lease a more appropriate vehicle for much less than the would-be Ferrari providers would end up charging.
Where this could make a good deal of sense is with software. You could have economy of scale there, charging by use, and the fee schedule could make sense for both lessee and lessor.
But thinking you can do that in hardware in the way MS is suggesting is just silly.