My GF3 Cooling Mod

Our test subject today is a Gainward CardExpert Geforce3 Ti200 64MB Golden Sample card. The project is to remove all stock cooling apparatus and replace it with something that does a better job.

Why? Two reasons. One, so I can overclock the card to match the speeds of a Geforce3 Ti500 or perhaps even higher. Two, because I needed a project to keep me busy :).

With the help of a friend’s digital camera, I was able to document this tale of bewilderment, head scratching, salvation, and completion. Before we begin, here’s the hardware I used to get the job done…

– (1) Delta 60mm x 60mm x 10mm fan purchased from Plycon Computers

– (1) 50mm x 50mm x 10mm fan I had laying around

– (2) 60mm x 60mm x 10mm heatsinks purchased from Cofan USA

– (2) #4 1 1/4 in. machine screws

– (4) #4 nuts

– (4) #4 star lock washers

– (2) #4 nylon washers

– (2) #4 1/4 in. nylon standoffs

– Arctic Alumina epoxy

– Arctic Silver II thermal compound

– (3) #6 1 in. sheet metal screws

– (3) #6 1/4 in. nylon spacers

Stripping It…


Here’s the card before anything was done to it.


While the orb-style cooler looks kind of neat, it’s built from several pieces rather than one solid block. Every joint between pieces hinders a heatsink’s ability to transfer heat, so it’s definitely going to need replacement.

The first step is to take off the orb cooler and the stock ramsinks.

The orb came off without incident. I was actually surprised to find a generous amount of generic thermal paste underneath.

It was tough to tell how the ramsinks were attached, though. Glue? Adhesive thermal pad? Either way, the removal method is the same.

I put the card in an anti-static bag and taped it up. I then placed that in a large zip lock bag and tossed it in the freezer.

After 45 minutes I pulled it out. Using a business card underneath the tip of a small flathead screwdriver, I twisted the ramsinks upward until they popped off. The whole process was painless.

It turned out that they were held on with adhesive thermal pads that left little to no residue on the memory chips. To be safe though, I used a Q-Tip soaked in carburetor cleaner to wipe them off. I also did the same with the Geforce chip.

Here she is stripped down…


Cutting and Pasting…


Next came the hard part: cutting the heatsinks to fit on the chips.

I tried this with my dremel tool first. What a wasted effort that was. The aluminum was a bit tougher than I thought, which forced me to bring out the big guns.

I grabbed my gear and headed to a friend’s place. Along the way, I stopped off at a Sears and picked up a metal cutting blade to fit his bandsaw. With that blade, it cut like warm butter.


I had two heatsinks and two things to do to them.

First, mod one of them to fit over the Geforce chip. This meant drilling holes for the mounting bolts and making sure said holes lined up perfectly with the holes in the card.

Next, chop the other one up into four 22mm x 27mm parts to fit on the memory chips (two chips per heatsink).

Using the bandsaw, the dremel, my pistol drill, and an electric grinder, things went pretty smoothly. After all of that, the bottoms of the heatsinks were in need of some serious lapping. A trip to Walmart to snag some 400 and 1000 grit sandpaper fixed all that. I lapped all five heatsinks until their bottoms looked like mirrors and ended up with these…


I took some Arctic Alumina epoxy and applied it to the memory chips. I mounted my new homemade ramsinks on them and let this dry overnight.

The next day, I used some Arctic Silver II thermal paste on the Geforce chip and mounted my other heatsink.

Here I decided to toss the little plastic mounting pins aside and go with two #4 1 1/4 in. metal machine screws instead. On the topside, I used a couple of nylon washers to keep the screw heads off the heatsink.

Here’s how it looked at this point.




Next up is the Delta fan. Here I used (3) #6 1/4 in. nylon spacers and (3) #6 1 in. sheet metal screws to hold it on.



On the bottom, I had to use two nylon spacers to keep the metal nuts from making contact with the board. If not for some small electrical thingies there, I probably wouldn’t have done that but I was taking every precaution.

On top of each nylon washer, I put a star lock washer and then a nut. Since the tightness of the nuts determines how “hard” the contact is between the heatsink and the Geforce chip, you want to make sure they are snug but not overtightened. The lock washers will keep the nuts from backing off later.

On top of those nuts I mounted the 50mm fan blowing down on the card, then on top of that, another lock washer and another nut. This fan was needed after I found out how hot the back of the card got opposite the chip. Better safe than sorry.


A Fan Sandwich…


After that, my card looked like it was in a “fan sandwich”.




It Loves It, It Loves It Not…


I have to say that I was very nervous when I put the card back in and pressed the power button. The way my luck runs, I felt like I had probably zapped the card with static electricity with all this carrying around.

Imagine my sense of doom when the monitor failed to come on when I hit the power!

Then I decided that it was probably a good idea to plug the monitor cable into the video card. 🙂

I watched in glee as I made it past POST and the desktop loaded up.

With everything in place, I began pushing up the core and memory speeds, testing each setting with 3DMark 2001SE for stability. So far I’ve been able to get to 240 core and 545 memory and she’s as solid as a rock.

My 3DMark scores jumped over 400 points to land at 6229. It’ll probably do a little better than that, but I think I’ll keep it at this speed for a while and test it out. At the very least, it looks a lot better than it did before :).
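To put those clocks in perspective, here’s a quick sketch of the gain over stock. The baseline figures (175 MHz core / 400 MHz effective memory for a stock Ti200, 240/500 for a Ti500) come from Nvidia’s published specs, not from this article:

```python
# Overclock gain over stock -- stock GeForce3 Ti200 clocks
# (175 MHz core, 400 MHz effective DDR memory) are assumed
# from Nvidia's published specs, not stated in the article.

def percent_gain(stock_mhz, new_mhz):
    """Return the percentage increase from the stock clock to the new clock."""
    return (new_mhz - stock_mhz) / stock_mhz * 100

core_gain = percent_gain(175, 240)   # core: 175 -> 240 MHz
mem_gain = percent_gain(400, 545)    # memory: 400 -> 545 MHz

print(f"core: +{core_gain:.1f}%, memory: +{mem_gain:.2f}%")
```

Both gains work out to roughly 36–37%, which also carries the card past a stock Ti500 (240 core / 500 memory, again per the published specs).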

(BTW, I initially used Nvmax to adjust the core and memory clocks while I was experimenting with settings. Once I settled on 240/545, I used information from Ray Adams’ Nvidia BIOS Collection to modify the card’s original BIOS to run at those settings and enable sideband addressing.)

After everything was said and done, I more than exceeded my expectations on how this would turn out. If I had it to do over, I wouldn’t change a thing…

