
Beginners Guides: Overclocking the Nvidia Videocard

calcal said:
hmm, looks complicated

so if I flash to the GT BIOS, will I definitely get 16 pipes and 6 vertex units?

You can unlock the pipes with RivaTuner without flashing, but there is a good chance you will get artifacts, since not enough voltage is being supplied; that's why you flash the card to give it more Vcore, just as you feed a CPU more voltage when it isn't stable.
If this is too complicated for you, or you don't understand how to do it, then I would strongly suggest you don't do it at all, since you could damage your video card.
It's better to run at stock settings than not to run at all. Do more reading on this if you are interested; everything you need to know about flashing is provided in that guide. You just need a little understanding of DOS and that's about it, everything else is smooth sailing.
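As for what the unlock is actually worth: pixel fillrate scales roughly with pipeline count times core clock, so going from 12 to 16 pipes is about a third more raw fillrate. A back-of-the-envelope sketch (the 350 MHz figure is illustrative only):

```python
# Rough fillrate math (assumption: fillrate ~ pipes * core clock; plug in
# your card's real clock instead of the illustrative 350 MHz).
def fillrate_mpix(pipes, core_mhz):
    return pipes * core_mhz

locked   = fillrate_mpix(12, 350)   # plain 6800: 12 pixel pipelines
unlocked = fillrate_mpix(16, 350)   # same card with all 16 pipes enabled
print(f"{locked} -> {unlocked} Mpix/s ({unlocked / locked - 1:.0%} more)")
```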
 
Driver list has been updated!

The latest edition is 78.10, which has a modified .inf file that makes the driver compatible with most GeForce graphics cards and laptop solutions. So if you are a laptop owner with a 6800 video card, these are the ones to get.

Also note that some of the latest modified ForceWare releases come with Coolbits built in, and you will be presented with the option to install it while installing the drivers.
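For anyone who prefers to flip Coolbits by hand rather than relying on a bundled installer, it is just a registry DWORD. Here is a minimal sketch in Python, assuming the classic NVTweak key path that ForceWare-era drivers read; value 3 is the commonly cited one that unlocks the clock controls (needs admin rights, and back up your registry first):

```python
import winreg

# Classic Coolbits tweak (assumption: this is the key path your ForceWare
# version reads; run as Administrator and back up the registry first).
KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    # 3 is the commonly cited value that exposes the clock-frequency page.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits set; reopen the driver control panel to find the clock page.")
```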
 
Moon999 said:
hey guys

which program is better for overclocking, RivaTuner or PowerStrip?

thanks
I prefer PowerStrip myself, but I think a lot of people tend to use Coolbits :)
 
Up there you say "there's not much that can be done about the video interface".
Do you mean it cannot be changed, or that it should not be? My motherboard (MSI K8N SLI Platinum) allows changing it from its default 100 MHz up to more than 150.
Furthermore, I can overclock it right from CoreCenter.
Is it wise to OC the PCI-E bus? Will I get better graphics performance? And how can I test it?

Anyway, thanks for everything, and pretty nice post.
 
I just leave the PCI-E bus at its default, like most people do, since none of today's video cards can take full advantage of the PCI-E bandwidth.
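Some rough numbers behind that: a PCI-E 1.x lane signals at 2.5 GT/s with 8b/10b encoding, and the link rate scales with the 100 MHz reference clock, so a stock x16 slot already offers about 4 GB/s per direction. A quick first-order sketch (ignores protocol overhead beyond the encoding):

```python
# First-order PCI-E 1.x bandwidth (assumptions: 8b/10b encoding only,
# link rate scaling linearly with the 100 MHz reference clock).
lanes = 16
ref_clock_mhz = 100                          # stock PCI-E reference clock
gt_per_lane = 2.5e9 * (ref_clock_mhz / 100)  # transfers per second per lane
payload_bytes = gt_per_lane * 8 / 10 / 8     # strip 8b/10b, bits -> bytes

print(f"x16 @ {ref_clock_mhz} MHz: {payload_bytes * lanes / 1e6:.0f} MB/s per direction")
# ~4000 MB/s at stock -- far more than a card of this era actually streams,
# which is why raising the PCI-E clock rarely shows up in game benchmarks.
```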
 
Hello,
first of all, I'm from France, so my spelling is not the best.

I have to ask you something because I have a strange issue with my POV 6800 Ultra.

First, my 6800 Ultra is watercooled.
Idle temperature is about 45°C.
Load temperature is about 52°C.

I've modified the BIOS to set the Vcore to 1.5 V.
Unfortunately my BIOS has only a single mode, so I run 400/1200 MHz in both 2D and 3D, with 1.5 V all the time.
I've tried several BIOSes, even the GTU15, but the one I have gives me the best overclocking headroom.
I use RivaTuner to overclock and ATITool to scan for artifacts, and that works great, because no game seems to stress the video card as hard as ATITool does.

Well, here are my RivaTuner settings:
2D: core 400 MHz / memory 1305 MHz / Vcore 1.5 V
3D: core 465 MHz / memory 1305 MHz / Vcore 1.5 V

My issue is that if I set the core to 470 MHz and run ATITool, it works with no artifacts for 4 minutes, and then (still with no artifacts) the GPU goes into throttling mode (temperature 52°C).

I don't understand why, because the temperature is not really high (my waterblock is quite cool to the touch and the thermal sensors seem to be OK).

Can someone help me understand why, please?
 
Welcome to the forums!

Most likely it's because you increased your GPU Vcore; I have seen this happen before.

My best suggestion would be to run at the stock 1.4 V and see if you can pull 470 MHz on the core, with the RivaTuner hardware monitor running in the background while you use ATITool, to check whether the video card is throttling (see the log-parsing sketch below).

Also, which tool did you use to modify the BIOS?
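If you log the monitoring data to a file, spotting throttling is just a matter of looking for core-clock dips under load. A sketch, assuming a hypothetical CSV export with "time" and "core_mhz" columns; adjust the names to whatever your log actually contains:

```python
import csv

# Flag samples where the core clock dips below the target (hypothetical
# log format: a CSV with "time" and "core_mhz" columns).
TARGET_MHZ = 470
TOLERANCE_MHZ = 5   # ignore small sensor jitter

with open("monitor_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        clock = float(row["core_mhz"])
        if clock < TARGET_MHZ - TOLERANCE_MHZ:
            print(f"{row['time']}: core fell to {clock:.0f} MHz (throttling?)")
```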
 
Thank you for your help.

Well, actually, at 1.4 V (stock voltage) I have the same issue, but at 450 MHz (in ATITool); 445 MHz was stable in ATITool at stock voltage.
But I've been running 450 MHz at stock voltage for 6 months in games and never noticed any problem. (I didn't know I could bench my card with this tool 6 months ago; I only discovered it a few days ago.)
ATITool seems to stress my video card's components a lot, so if I pass the ATITool test, I'm sure I stay under the throttle limit in games.
I only raised the voltage to get more overclocking headroom, but 20 MHz is really disappointing for a 0.1 V increase.
What do you think about that?

To modify my BIOS I use NiBiTor v2.4.
It's quite a nice tool, but it doesn't work well with every BIOS.
Sometimes there's still a checksum error with some BIOSes, so you have to test the image with vgabios before flashing (I'm sure you know the procedure; I'm just telling you how I do it). The sketch after this post shows what that checksum actually is.

EDIT: The ForceWare driver stability test passed successfully at 486 MHz max with 1.5 V.
At 1.4 V, the max limit was 460, if I remember correctly.

Thanks for your help,
FabinovX
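For reference, the checksum NiBiTor complains about is the standard PC option-ROM rule: every byte of the image has to sum to 0 mod 256. A minimal fix-up sketch, assuming the usual convention that the last byte of the ROM is the adjustment byte (still verify in vgabios before flashing, as described above):

```python
# Option-ROM checksum fix-up (assumption: the final byte is the pad byte
# that makes the whole image sum to 0 mod 256, the usual convention).
def fix_checksum(path_in, path_out):
    data = bytearray(open(path_in, "rb").read())
    data[-1] = (data[-1] - sum(data)) % 256   # re-balance the total
    assert sum(data) % 256 == 0               # sanity check before writing
    open(path_out, "wb").write(data)

fix_checksum("modified.rom", "fixed.rom")     # hypothetical file names
```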
 
Personally, I never use ATITool; I go with 3DMark05 and look for artifacts myself.

As with every other component, you may need to do more burn-in on your video card, since it has been running at 1.4 V and isn't taking the increase very well; it should work fine after about 24 to 48 hours of burn-in.

Also, be sure to use the RivaTuner hardware monitor to check whether the video card is throttling. This time go with 3DMark05: run just the 3D tests and skip the CPU tests. Afterwards, check whether your clocks stayed steady throughout the run, and check the temperature as well.
 
Well, reading your post, I felt it was not really a good idea to raise the voltage on a 6800U,
so I went back to 1.4 V and a 440 MHz stable core.
I flashed a different BIOS that uses tighter memory timings in order to keep a good performance increase.
So now I run a 440 MHz core (instead of 445 with the other BIOS) and 1280 MHz memory at 1.4 V in 3D mode, and 400/1280 in 2D mode.
This BIOS has two available modes, and I would like to know if you have ever heard about issues or card destruction from using different voltages between 2D and 3D mode on a 6800 Ultra?
Have you ever heard about core destruction from using 1.5 V, too?
For the moment, I don't have any information to reassure me about using a 1.5 V Vcore with my 6800U, so I'm going to "play it safely", as we say in France.
Before modifying the BIOS again, I did what you told me:
after a 3DMark05 bench at 465/1305 at 1.5 V, the core and memory speeds were rock steady in the RivaTuner monitor, and the temperature never exceeded 52°C.
But as I said above, I need to find more information and meet more people who have experience with a 1.5 V core and extreme overclocking, in order to set some safe limits and not burn my graphics card stupidly.
 
I'm not 100% sure, but you may be able to edit the BIOS with the latest NiBiTor revision to leave the 2D voltage at stock; that should give you a lower idle temperature and help keep the card from overheating.

As for a card being damaged at 1.5 V, I have yet to see that, but don't consider it 100% safe. The best thing you can do, if you want to stay at 1.5 V and those higher clocks, is a long burn-in of the video card: use some of the game benchmarks or 3DMark and let them run in long cycles.
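A rough way to see why that extra 0.1 V matters more than the 20 MHz suggests: first-order CMOS dynamic power scales with frequency times voltage squared, so the jump from 445 MHz at 1.4 V to 465 MHz at 1.5 V is roughly 20% more heat for under 5% more clock. A sketch of that trend (ignores leakage and board losses, so treat it as a ratio, not a wattage):

```python
# Relative dynamic power, P ~ f * V^2 (first-order CMOS model; treat the
# result as a trend, not an exact wattage).
def rel_power(f_mhz, volts):
    return f_mhz * volts ** 2

base = rel_power(445, 1.4)   # stock-voltage maximum from this thread
oc   = rel_power(465, 1.5)
print(f"465 MHz @ 1.5 V ~ {oc / base:.2f}x the heat of 445 MHz @ 1.4 V")
```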
 
What is the best way to overclock without raising temps on a stock cooler? Is it better to raise the memory clock rather than the core clock? Which uses more wattage?
 
The core puts out the most heat, as I found out myself: simply by applying Arctic Ceramique to the RAM sinks, I was able to raise the memory clock from 1100 to 1200 MHz with no other change, but the core clock is what gives you the better performance.
 
I'm having some strange problems trying to overclock my LeadTek 6600GT Extreme: when I raise the core and RAM frequencies in 3D mode, it also raises the frequencies in 2D mode. I can get up to 1300 MHz on the RAM in 3D without any problem, but it shows snow in 2D mode. Anyway, I don't need an OC in 2D mode, and I would like to keep the default frequencies there.
How can I keep the 2D frequencies while raising the 3D frequencies?
Thanks.
 
In RivaTuner, I move the sliders to the desired clocks, but when I check the hardware monitoring section, the clocks are pretty much the same as stock. What am I doing wrong?
 
Did you follow all the steps I provided for OC'ing your card using Coolbits?

You need to do them all in order for it to work, or you can just use PowerStrip, which locks the clocks automatically.
 