
8800GT BIOS mod...success!


jason4207

Senior Member
Joined
Nov 26, 2005
Location
Concord, NC
I received my GT yesterday afternoon. I went home early just to make sure UPS didn't leave one of those stupid notes I have to sign instead of the package. UPS actually called me (pre-recorded message) on Sunday to inform me the package wouldn't be left w/o someone present to sign for it.

I did the step-up through eVGA from my 320GTS. I received the vanilla GT w/ 600/1500/900 clocks. Many folks got SC factory clocked cards despite the fact the box claimed it was vanilla. I was not so fortunate.

My first attempts at OCing were not very promising. I was only able to get 712/1674/929 (pretty low) to be artifact free under ATI Tool w/ 100% fan on the stock cooler. I was very curious why my RAM was doing so poorly w/ people claiming over 1000 was pretty easy. I OC'd each slider in turn to see what each could do on its own. I was able to get a little higher on the core and shader when I did this, but when I put them all together it wasn't artifact free until I dropped them a click to what you see above.

I got around 14,500 in 06.

I played some Crysis demo at these settings, but I turned the fan down to 67%. No issues. I was able to play at 1280x1024 (didn't try a higher rez) w/ almost all very high settings; no AA. I turned post-processing down to high which increased a jumpy 15-25fps into very playable 25-40fps. I then turned down shadows to high, but I didn't see much improvement in fps.

I was thirsty for more, so I tried the BIOS mod I've been reading about over at XS. I followed these directions, but I skipped the 1st few steps b/c changing the 1.1v entry to 1.2v has no effect. According to the creator of Nibitor the 1.1v is just a label. Changing the label doesn't change the VID.

Originally Posted by Mavke
Yes, and it is not capped. The voltage circuitry on NVIDIA-based graphics cards only looks at the VID xx (the label 1.1V or 0.95V is just for you guys to understand what each VID gives in terms of voltage). The card just looks at VID xx, and for the GeForce 8800 GT, VID 00 is 0.95V, so having set the core at VID 00 the GPU voltage circuitry will recognize VID 00 and run at 0.95V. And VID 03 is just 1.1V... So if there were a VID 04 then maybe we could get 1.15V...

Now I thought these cards were putting out 1.1v under load stock, so what could changing the VID to 1.1v really do? Well if you look at post #31 in the BIOS mod thread over at XS you will see the default VID is 1.05v. So changing it to 1.1v might actually provide some good results. And it does!
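The label-vs-VID distinction above can be sketched roughly like this. The two known points (VID 00 = 0.95V, VID 03 = 1.1V, default 1.05V) come from the quote and post #31 at XS; the in-between entries, the table layout, and all the names are my own illustration, not the real NiBiTor internals:

```python
# Rough illustration: the BIOS stores a VID *code*, and the hardware maps
# that code to a voltage. The "1.1V" text in NiBiTor is only a label, so
# editing it changes nothing. VID 00 = 0.95V and VID 03 = 1.10V are from
# the thread; the in-between entries and all names are assumptions.
HW_VID_TO_VOLTS = {0x00: 0.95, 0x01: 1.00, 0x02: 1.05, 0x03: 1.10}

def effective_voltage(bios_entry):
    """The regulator only reads the VID code; the label is cosmetic."""
    return HW_VID_TO_VOLTS[bios_entry["vid"]]

stock     = {"vid": 0x02, "label": "1.05V"}  # default per post #31 at XS
relabeled = {"vid": 0x02, "label": "1.20V"}  # editing only the label...
modded    = {"vid": 0x03, "label": "1.10V"}  # ...vs actually raising the VID

print(effective_voltage(stock))      # 1.05
print(effective_voltage(relabeled))  # still 1.05 -- label change is a no-op
print(effective_voltage(modded))     # 1.1
```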

I did the mod and am happy to report that my max stable OC climbed to 741/1728/951! I'm not sure why the RAM started doing better, but I'm not going to complain. As before I was able to get a little higher on the shaders, and core when I OC'd them independently, but when I threw them all together I had to back down a bit. The higher shader clock allowed finer control of the core clock, so I was able to tweak the core to 741 from 739, and remain stable. Now I only tested this w/ ATI Tool for about 15mins, but no artifacts in that time.

I'm still on the stock cooler, and my max temps (ATI Tool load) at 100% fan went from about 67 on the stock BIOS w/ 712/1674/929 to 70 w/ the modded BIOS and 741/1728/951. Not bad at all!

14,860 in 06! Didn't have time to play Crysis anymore b/c it was too late last night. I will try tonight, and see if 67% fan will keep the card in check under the Crysis load.

With the rumors of RAM dying at 1000 or higher I'm happy w/ 950, but I'm still wanting more speed out of my core/shaders. My next step will be to get the HR03GT (when released), and try the volt mod w/ a variable resistor. I'm hoping for results in the 800/2000 range! Wish me luck!

BTW I'm sorry for not posting any pics, but last night I was more concerned w/ just getting everything working right. Now that I'm at work I feel the urge to share my experience, but I have no ss to back me up. I'll try to add some pics later.

:beer:

========================================
Update:

Alright I finally got 3DMark06 to submit. New high! ORB listing.

14872oc4.jpg


Some show-off shots:

pb140136wm8.jpg


pb140138ez2.jpg


pb140141ga7.jpg


pb140143oo6.jpg
 
Wait so you can up the voltage on the GT even more?

So stock is 1.1V and your modded BIOS gives it 1.15V?

Mmmm more core/shader speed I like the sound of this.
 
Well I thought stock was 1.1v, but some people measuring w/ DMM are getting closer to 1.05-1.07v. After this mod they're getting 1.12-1.15v. Plus, F@32 (at XS) was showing that the default BIOS voltage was 1.05v. This mod changes that to 1.1v.

I haven't tried to measure voltages yet w/ my DMM. I don't want to put any solder to the board until I'm ready to do the VR mod, and I don't want to just stick my probes in there. I'm afraid I'll short something. I'll solder some leads on there and do it right when I get a new cooler, and am ready to try the VR mod.

I do know that it works, though, and my temps went up a little bit too. Vgpu must have been increased. Easy mod with good results!
 
This just got posted at XS:

Just did the 1.1v mod

Voltage readings and oc
Before
idle 1.09V
load 1.13V
oc: 730(740 for 2k1)/1840/1070

after mod
idle 1.14V
load 1.18V
oc: Still testing but 780/790 was without artifacts. Shader @ 1940 now

It helps to confirm that the mod does work.
 
I know, just saw that. Will have to figure out how to flash from my flash drive once again.

I taste faster speeds :)

How much did your temps go up? I mean mine runs pretty cool, so I have no problem with a little higher load for more performance currently. At least til I can get an aftermarket cooler... Hurry up, Thermalright...
 
You lot seem to be getting massive shader OC's. Do you run in linked mode or separate?
I have yet to really bump the shader, but I'm WC'd and getting temps in the mid/high 40's for load :)
Side note: I was at a LAN over the weekend and a guy there was running a Q6800 @ 4GHz-ish on water with 8800gts, stock cooling, and getting over 20k in 3D06! Don't think he had the cards clocked as high as you guys either!
 
Shaders unlinked. Only way to get them up there. I like to run each slider up individually first to see what they'll do on their own. The same way I OC the rest of my system; I try to isolate the variables if possible to maximize potential and save time.

The guy that got 20k...did he have (2) GT's in SLI, or (2) GTS's in SLI, or (1) GTS? First you said "gts", then you said "cards", so I'm a bit confused.

If I had to guess I'd say he's running (2) 8800GT's, though. That and a quad at 4GHz will get you there. He probably paid an arm & a leg for his system, though. I'm pretty close to him, and I probably spent the same amount on my entire rig as he spent just on the CPU! Can you say bang4$?

edit: Since you're under H2O you should try this BIOS mod to help get your core/shader clocks up. You should really try the VR mod and go for super clocks, but I can understand not wanting to go there.
 
Definitely unlink the shaders; otherwise it will take a while to get a good OC. Also, supposedly with the shader at or over the 1836 clock (I never really looked into it myself) it will fine-tune the core clock in 3MHz increments.
 
Dang... 14K+ in 06.... My "old" GTS only does 12,100... :cry:


What version of ATI Tool are you using?

I score 12,750ish currently with my setup. I plan to get 13k pretty easily if I clock my CPU to 3.6GHz again, and more so when I try for 3.8GHz. As well as my BIOS mod with more voltage :) All to help my clocks :soda:
 
@ SuperDave - beta .27 Beta2...the latest, I think?

Edit: I only use ATI Tool to test for stability. I use RT to OC, and monitor temps/speeds.

Edit2: 12,100...you must have the 320GTS. That's what I just traded in to step up to this. I was able to get to 12,360 or so after some DDR2 memory tweaks and 621/1620/918 on the card.

@ deathman - temps went up a whopping 3*C!

I was getting finer core clock control at 1728. Didn't try lower, though. I had 739, 741, 745, & 756 as available options to fine tune my core clocks at 1728 shader. I was able to run 756 when I OC'd just the core, but was getting artifacts after a couple minutes. I dropped it down to the next step which was 739 w/ the shader at stock. After finding 739/1728/951 stable I inched the core up until I had problems again at 745. Settled for 741/1728/951!
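That fine-tuning can be pictured as snapping a requested core clock to the nearest available step. The step list 739/741/745/756 is taken straight from the post; the real steps come from the card's PLL dividers, which this sketch does not model:

```python
# With the shader at 1728 the post reports core steps of 739, 741, 745
# and 756 MHz. This helper just snaps a requested core clock down to the
# nearest reachable step; the list is hard-coded from the post, not
# derived from the card's actual PLL divider logic.
AVAILABLE_CORE_STEPS = [739, 741, 745, 756]

def snap_core(requested, steps=AVAILABLE_CORE_STEPS):
    below = [s for s in steps if s <= requested]
    return max(below) if below else min(steps)

print(snap_core(742))  # 741 -- asking for 742 lands on the 741 step
print(snap_core(756))  # 756 -- artifacted in the post, so back off a step
print(snap_core(741))  # 741 -- the final stable setting from the post
```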
 
3C Nice :)

I personally can't use ATITool to test stability, so that's a no-go for me. I know in Vista 64-bit it has issues due to no driver signing. Not sure if I can even run the artifact tester in ATITool. I rely on whether it's stable in 06, then bump it down to the next notch for shaders and about 20 on the core, and I'm good to go right now. But I also found that light program, which seems to heat the card up well beyond what Crysis and other games do, and it seems to work nicely for stability testing as well.
 
That's sweet... so it looks as though the stock BIOS with these cards in 3D mode is 1.05v... so you can bump it up to 1.1v for a little extra. That's pretty tight; wish my 8800GTS could do something like that.
 

I run a dual-boot; XP-32, and VISTA Ultimate-64. I do all my stability testing and benchmarks in XP...actually do about everything in XP. I use VISTA-64 just for DX10 games. I like to know I'm getting the best possible graphics even if they aren't that much better! I use XP to find maxes, and then just set those maxes in VISTA-64. Haven't had a problem yet.

I downloaded that light program as well on your recommendation. It seems cool, but I stuck w/ the tried and true ATI Tool. I just like the fact it is so easy to see if you are having issues. Yellow dots. Also, it makes an annoying sound, like you're holding down a keyboard key too long, if you get an artifact. The wife called me in for dinner yesterday, and I heard the sound while eating. I went and nudged the OC down a notch and returned to the table. I like that...I don't have to constantly stare at it. That light program is cool, but you have to sit and watch it to make sure you don't get any artifacts popping up.

In ATI Tool I press "show 3D image" or whatever it says. The spinning cube pops up, and the video card starts heating up. Press "find artifacts", and the cube stops, but somehow the GPU is still getting taxed very hard. The temps keep climbing or stay high, so I know it's doing its thing. Then you will either see yellow dots immediately or w/n a few minutes. If you get to 10-15mins w/o artifacts you are doing pretty good, and that is as far as I've gotten. The timer resets if you get an artifact, so if you walk away it is easy to tell if you're getting artifacts while gone. I guess I could run it overnight, but as long as I don't have any problems in games I'm going to keep it here.

:beer:
 
Yeah, meant 2 * GT. As for what he paid for the rig... he works for Intel, as did the top scorer there (Intel-sponsored LAN; they gave away 10 Q6850's by draw to people who put their system in for benching).
Top score was another sponsored rig (well, employee): 2 GTX's and a Q6800 + prommy got 23k-ish.
As for vmodding, my cards might do it, but probably not till I'm settled with the damn sink... yet another one fell off. Not sure what I've got it on, but it gets hot!
 

Oh, with the light program, if the OC was too high it would just drop back to the desktop for me.
 
yokomo - what are you using to hold the sink on there? Is it just some thermal tape that came w/ it? I've read a lot of people having luck w/ scraping the tape off, using some good TIM like ASC or MX-2, and putting a tiny spot of crazy glue in two corners. I haven't done it yet, but plan to when I get the HR03GT.

deathman - good to know about it just crashing to the desktop. I was staring at it looking for artifacts! I still think the way ATI-Tool works is better for my situation, though.
 
Holy crap, that '06 score is insane.

I can barely scratch 14k with my Ultra overclocked.

Once you pick up your 9650 and get your 2 extra cores scored in '06, it'll be a lot easier getting past 14k. :) My ultra stock and my temporary quad at very low clocks (close to stock, definitely below 3ghz) still scored ~14200.
 
Back