When you look at video cards, look at the rated speed of the memory on them. – Ed

We’ll soon see GeForce4 Ti4200 cards coming out. This is supposed to be the GF4 card for everybody. Is it?

Maybe, maybe not.

Pay More, Get Less

The Ti4200 will come in two flavors: a $180 64MB version with memory running at 500MHz, and a $200 128MB version with memory running at 444MHz.

So for $20 more, you get more memory, but it’s slower. In all likelihood, the initial 64MB cards will come with 4ns RAM, and the 128MB cards will come with 4.5ns RAM. (For comparative purposes, the Ti4400s so far are all coming with 3.6ns RAM, and the Ti4600s with 2.8ns RAM, all DDR, of course.)
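Those ns ratings map directly to rated clock speeds: 1000 divided by the access time in nanoseconds gives the base clock in MHz, and DDR doubles the effective rate. Here’s a quick sketch of the arithmetic (the function name is just for illustration):

```python
def max_ddr_speed_mhz(ns_rating: float) -> float:
    """Convert a DRAM access-time rating (ns) to its rated
    effective DDR clock in MHz: 1000 / ns gives the base clock,
    and DDR transfers on both clock edges, doubling it."""
    base_clock_mhz = 1000.0 / ns_rating  # e.g. 4ns -> 250MHz base
    return 2 * base_clock_mhz            # DDR effective rate

for ns in (4.0, 4.5, 3.6, 2.8):
    print(f"{ns}ns RAM -> ~{max_ddr_speed_mhz(ns):.0f}MHz effective")
# 4ns works out to 500MHz effective, 4.5ns to ~444MHz --
# exactly the two Ti4200 memory speeds above.
```

Note that cards often ship clocked below the RAM’s rated maximum, which is where overclocking headroom comes from.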

How much does it matter?

If you play a not-too-demanding game (which describes most games nowadays) at relatively low resolution (1024X768 or less), you’ll do better with 64MB than 128MB, simply because it takes the GPU a little more time to access more RAM. (If the additional RAM is also slower, tack on a few more percentage points of difference.)

For that matter, for most current games, at 1024X768 or less without quality extras like 4X AA, you aren’t going to do much better than a Ti4200 no matter what you buy.

The more either you (through increased resolution or quality settings) or the game demand from the card, the better the more expensive GF4 Ti cards look, though in most cases, you’ll live without them.

If, however, you like 1600X1200 and/or 4X AA, with all the eye candy turned on, the Ti4200 may not be for you.

Outside of those folks, though, for now and a while to come, that $180 card actually looks pretty good for many if not most people.

However, if you can’t replace video cards every year, it looks to be a bit less of a bargain, especially if you like to play cutting edge games.

For example, take a look at this. Here we have a benchmark-of-the-future that brings even a Ti4600 to its knees, and having only 64MB of RAM slices performance in half.

The problem is that the answer to that won’t be “buy the 128MB 4200 with even slower RAM” but rather “wait and buy the 4400 when it gets cheap,” which has its own problems: you’ll wait a long time, and some future-generation nVidia card will likely outshine it just when the 4400 gets into your price range (as happened with the GF3s).

Overclocking A Video Card

There are two main items on a video card that can be overclocked: the GPU and the RAM.

While it looks like nVidia does a little cherry-picking to ensure that the fastest GPUs end up in the most expensive cards, initial reports indicate that getting the GF4 GPU up to 300MHz or a bit more is no big deal.

However, memory speed is more important than GPU speed on video cards. That’s usually the bottleneck, and once you hit it, increasing the GPU speed does you no good.
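To see why memory is usually the limit, consider peak memory bandwidth: effective memory clock times bytes moved per transfer. The 128-bit bus below is the GF4 Ti’s published memory-bus width; the function itself is just a sketch of the arithmetic:

```python
def memory_bandwidth_gbs(effective_mhz: float, bus_width_bits: int = 128) -> float:
    """Peak memory bandwidth in GB/s: transfers per second
    times bytes moved per transfer."""
    bytes_per_transfer = bus_width_bits // 8      # 128-bit bus -> 16 bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(memory_bandwidth_gbs(500))  # 64MB Ti4200:  8.0 GB/s
print(memory_bandwidth_gbs(444))  # 128MB Ti4200: 7.104 GB/s
```

Once the game is moving textures and frame data faster than that, a faster GPU just spends more cycles waiting.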

Generally, you can overclock video RAM somewhere between 10 and 20%. (Please note that overclocking video RAM too far can cause permanent damage, so when testing, watch the screen carefully, and if you see warps or tears, scale back. Adding heatsinks and/or extra cooling to the RAM chips may also help.)
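That 10-20% rule of thumb works out like this (a sketch; the function name and the 128MB Ti4200’s 444MHz stock clock are just for illustration):

```python
def overclock_range(stock_mhz: float, low: float = 0.10, high: float = 0.20):
    """Typical video-RAM overclocking headroom: 10-20% over the
    stock clock. Step up gradually and watch for artifacts."""
    return stock_mhz * (1 + low), stock_mhz * (1 + high)

lo, hi = overclock_range(444)  # 128MB Ti4200 stock memory clock
print(f"~{lo:.0f}-{hi:.0f}MHz")  # ~488-533MHz
```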

RAM is not like a genie; your wish is not its command. If there’s 4.5ns RAM on the card, it’s not normally going to reach memory speeds as high as or higher than a card with 4ns RAM. 99% of the time, the 4ns RAM will do better.

If you do only one thing when comparing video cards, look for the fastest-rated RAM available in the package you’d prefer. While you usually won’t find it in the first generation of products, at least a few manufacturers will usually stick faster RAM in the second.

The 128MB 4200s will probably be the most likely candidates for a speed bump. You might even find a few vendors offering 4ns RAM in the first generation. At least it’s something to look for.

Even with the 64MB cards, especially if you decide to wait a while, you might eventually find some cards offering 3.8ns or even faster RAM.

Will it make any huge difference? No, a few percentage points, but if a little research gets you some free performance, why not?

