The Real Bottleneck

News Item: Doom III to not include multiplayer.

This is a game more than three years in the making, coming from a company that pioneered multiplayer, and being released in a gaming environment that is now mostly multiplayer.

Multiplayer is something that will have to come later.

What does this mean?

This illustrates the real crisis in computing these days.

Computer capabilities grow geometrically, programming capacity grows only arithmetically, and the demands of future computer initiatives grow exponentially.

Let’s take id. It is a small company, about twenty people, and it wants to stay that way.

The problem is you can’t develop programmers like you can CPUs. You can get CPUs and GPUs to double in capacity every 18-24 months, but not programmers.
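To see how fast that gap opens up, here’s a back-of-the-envelope sketch in Python. The 18-month doubling period and the five-percent-a-year gain in programming output are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope sketch: hardware capability compounding every 18 months
# versus programming capacity creeping up roughly linearly. Both rates are
# illustrative assumptions, not measurements.
DOUBLING_MONTHS = 18          # assumed hardware doubling period
LINEAR_GAIN_PER_YEAR = 0.05   # assumed yearly gain in programming output

for year in range(0, 11, 2):
    hardware = 2 ** (year * 12 / DOUBLING_MONTHS)
    programming = 1 + LINEAR_GAIN_PER_YEAR * year
    print(f"year {year:2d}: hardware x{hardware:6.1f}, programming x{programming:4.2f}")
```

Under those assumptions, after ten years the hardware is roughly a hundred times more capable while the programming side has barely moved.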

There are ways to help the programming along, but each answer causes its own problems.

You can get more programmers, but the more programmers you have, the more bureaucratic your organization becomes, and the less efficient it gets. People spend more and more time telling each other (with varying degrees of success) what they’re doing and less and less time actually doing it. Meanwhile, the cost escalates, and so does the revenue needed to pay it back.

You can reuse code. You can lower the programming standards and write in compiled languages rather than assembler, and put the burden of optimization on compilers.

But generic code and compilers yield generic results. You get McDonald’s code; it gets the job done, but it’s hardly great.

The Buck’s Been Banged

Whether it’s an application or a game, you do the easy stuff first. With limited resources, you do whatever gets you the best bang for the buck.

The problem now is the easy stuff has been done. Anything major that could be easily done with a business app has been done. The same is true for games.

Now comes the tough stuff, and it’s even rougher than you think, because you reach the area of diminishing returns, big-time.

Here’s a graphic illustration of the problem. Take a look at these four photos:

[Photos: Bush0, Bush30, Bush60, Bush1 — the same image saved at increasing JPEG quality]

What’s the difference? These four photos take up the following space:

Photo     Saved As           Size
First     JPEG Quality 0     6.4K
Second    JPEG Quality 30    12.6K
Third     JPEG Quality 60    25.1K
Fourth    Original JPEG      135K

The last picture is over twenty times the size of the first. While you probably can tell the difference between the first and the last photo if you look a bit, it’s not like you think Hillary Clinton is the subject of the first photo.
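If you want to reproduce that kind of comparison yourself, here’s a minimal sketch using Pillow. The filename is a placeholder, and Pillow’s quality scale won’t match whatever tool produced the numbers above, but the pattern should be the same: file size climbs far faster than visible quality.

```python
# Minimal sketch of the comparison above, using Pillow (pip install pillow).
# "photo.jpg" is a placeholder for any reasonably large photo.
import os
from PIL import Image

SOURCE = "photo.jpg"
img = Image.open(SOURCE).convert("RGB")   # JPEG needs an RGB image

for q in (0, 30, 60, 95):                 # 95 stands in for the "original" save
    out = f"photo_q{q}.jpg"
    img.save(out, "JPEG", quality=q)
    print(f"quality {q:2d}: {os.path.getsize(out) / 1024:6.1f} KB")
```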

Of course, all these pictures use compression, and they only show 72 dots per inch (that’s all your monitor can do). The same photo at 35mm film quality would be over a thousand times the size of the biggest JPEG shown here. Even with typical broadband, it would have taken about ten minutes to download.
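That “about ten minutes” figure checks out on the back of an envelope if you assume a typical early-2000s broadband line of roughly 1.5 Mbit/s (the exact speed is an assumption for illustration):

```python
# Rough check of the download-time claim. The 1.5 Mbit/s line speed is an
# assumed, typical early-2000s broadband rate.
biggest_jpeg_kb = 135                      # the original JPEG above
film_quality_kb = biggest_jpeg_kb * 1000   # "over a thousand times" bigger
link_mbit_per_s = 1.5

seconds = (film_quality_kb * 8 / 1000) / link_mbit_per_s
print(f"about {seconds / 60:.0f} minutes")  # roughly 12 minutes
```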

Would it be better? Yes. Is it a thousand times better? Not for most people.

This is the same type of problem software manufacturers face when they try to get more accurate voice recognition or AI, and game developers face when they try to get more realistic faces and bodies, character movement, or flowing water, much less game logic.

Chess is a relatively simple task compared to simulating reality. You can have a PC-based chess program play an awfully good game. It took Deep Blue, a massively parallel IBM supercomputer packed with dedicated chess chips, to beat Garry Kasparov.

A rough approximation is easy, but getting closer and closer to reality means exponential jumps in complexity while yielding less and less (relative) results for that effort.

Take the average gaming character. Does he look better than he did five years ago, or move better? Sure, but is there any chance you’d mistake him for a real person caught on film?

More importantly, how much better does he think? Isn’t he still pretty much the village idiot when it comes to fighting you, especially compared to a human opponent?

Why do you think networked games got so big? It’s a lot easier hooking you up to fight a thinking, learning human being than making the program think and learn as well as that human being.

Letting someone else do the thinking leaves much more programming time for the (relatively easy) “boom, boom, boom” stuff rather than the much more difficult “How do I fight this humanoid?” problem.

Money

For truly accurate voice recognition, there’s really no problem that a whole lot of gigabytes of RAM couldn’t handle. If you had multiple gigabytes’ worth of textures built into your video card, that would help quite a bit, too.

But these are non-starters for now because a lot of people won’t pay the hardware costs, just like no one calls IBM and orders a copy of Deep Blue to play chess with.

Yes, some day the hardware costs will go down, but the kind of frontier tasks we’re looking at will require magnitudes more computing firepower than we have now.

Even then, if you build it (and just how do you justify the product generations along the way?), will they come? If you need ten or fifty or a hundred or a thousand times more code to get one of these tasks done well, your development costs skyrocket, too, and you either need to charge a lot more or need many more people to buy your product.

It may be realistic to expect five million people to buy Doom III, but could you expect a hundred million people to buy Doom V or VI?

By that point, will a relatively small company be able to absorb the financial cost of such development, which, after all, could well flop?

Bigger If Not Better

Television and movies and music are often criticized for serving up more of the same old thing and not being diverse.

There are a number of reasons for that, but one of the good ones is that when the cost of making the product gets so high, people understandably want to make safe bets.

I think the same thing is going to happen in the software industry. As the projects get bigger and more complex and their costs skyrocket, only big companies will be able to afford them or, more importantly, be able to afford the failures.

Let’s take id again. Eventually, they’re going to have to get a lot bigger, take even longer to deliver product, or try to do relatively less with each new generation of product.

Nor will the world be wonderful once the big guys take over. They aren’t going to be all that willing to take a lot of gambles, either. See Microsoft.

The computer industry does have a problem the media industry doesn’t, though. Most people will keep watching TV, watching movies, or listening to music. They aren’t anywhere near as likely to keep buying new computers and software.

No Perpetual Revolution

There seems to be this widespread belief that rapid progress is inevitable because . . . it’s inevitable. No Christian ever believed in the Holy Trinity more devoutly than many geeks believe in progress.

Actually, even the most devout Christian knew there were pagans and heathens around; geeks often have problems believing that anyone might not want what they want.

However, if you look at computing today, it’s like someone on a motorcycle who’s been zooming away until he reaches a cliff. There’s another cliff a couple hundred feet away, but he doesn’t have the nerve to jump it, and in all honesty, he doesn’t have enough bike to do it with.

So he dawdles around, revs his engine up a lot, signs autographs, runs every which way on the cliff, makes a lot of noise, and does everything except jump off that cliff.

After a while, people notice, and stop paying attention to his latest “feats,” or paying to see them.

That is just what is happening with the computer industry.

Another factor that must be kept in mind is that computers have become so mainstream that they’ve become a consumer electronics item for most people, and they’ve reached the “good enough” point. When that happens, rapid progress is no longer inevitable.

Look at earlier technological wonders that affected the average person, and you’ll see rapid progress at the beginning, then the industry matures and settles down for a long time. Occasionally, it gets spurred by some other technological advance down the road, sometimes far down the road.

If you look at radio, TV, movies and music, you’ll see initial great expansion, followed by long periods of status quo interrupted by brief technology spasms.

With radio, you had AM. Twenty years later, you got FM. Twenty years after that, stereo, and really nothing after that until recently.

With television, you had black-and-white, color fifteen years after that, cable fifteen years after that, and HDTV now.

A computer is now considered to be a consumer electronics device by most users, and in all likelihood, it’s going to start looking more and more like one over the next decade: smaller, cheaper, easier-to-use.

This doesn’t mean computer technology won’t continue to advance rapidly someplace; it just won’t do so on the consumer/business desktop.

What Computer Libertarians Don’t Want You To Know

Government (usually but not always the military) gave birth to computing, and it has been, and continues to be, the mother that libertarian types love to verbally abuse.

Look at any major advance in computing technology, from the transistor to the integrated circuit to the Internet, and government is either responsible for it or nearby. It pays the huge costs of technological R&D that private industry later exploits.

Take the programming bottleneck, for instance. The Pentagon certainly faces the same programming bottlenecks private industry faces, but unlike private industry, it has billions of dollars to throw at a possible solution.

If you look at what the military wants out of computers, they certainly run into the same types of problems facing private industry, problems like, “Make the cruise missile hit this house, not that house” or “Pilots cost too much. We don’t want Top Gun, just the gun.”

It is likely that cutting-edge computing will head back to the mother teat for a while until a new generation of breakthroughs leads to a new era of computing: a road between those two cliffs for the biker to ride across, no doubt cursing the Federal government all the way. 🙂

Ed
