So today I wanted to figure out the absolute maximum amount of power I could get the computer in my signature to draw. According to the following site, each of my GTX 275s has a max TDP of 219W.
http://www.hwcompare.com/13381/geforce-gtx-275-vs-geforce-gtx-660/
However, GPUBoss.com says the cards' total electrical consumption is about 40% above the TDP, so roughly 305W each. Then we have to consider that my cards are overclocked as well. Then of course there's the CPU, which is also overclocked significantly, plus my three hard drives, six fans, PCI card, yada, yada, yada. So theoretically, I should be pulling over 600W for the GPUs alone, and with everything running I should max out my 750W PSU, right?
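Just to spell out the math behind that estimate, here's a quick sketch (Python, since it's easy to read). The GPU numbers come from the figures above; the "rest of system" value is purely my own guess, not something from GPUBoss:

```python
# Rough sketch of the worst-case estimate implied by the GPUBoss-style numbers.
# GPU figures are from the post above; the "rest of system" number is a pure guess.

GTX_275_TDP_W = 219      # official TDP per card
GPUBOSS_FACTOR = 1.4     # GPUBoss claims ~40% above TDP
NUM_GPUS = 2

gpu_draw = GTX_275_TDP_W * GPUBOSS_FACTOR * NUM_GPUS
print(f"GPUs alone: {gpu_draw:.0f} W")                      # ~613 W

rest_of_system = 175     # guess: overclocked CPU, drives, fans, PCI card
print(f"Whole system: {gpu_draw + rest_of_system:.0f} W")   # ~788 W, over a 750 W PSU
```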
Well, I wanted to find out. So I ran Prime95 to max out my CPU, and I ran Heaven Benchmark at the same time to max out my GPUs. I kept Resource Monitor and Precision open to confirm the CPU and GPUs were pegged, then used a Kill A Watt to measure the electrical consumption at the wall. The results?
With everything redlined, all the fans maxed, and everything overclocked, the highest reading I saw on the Kill A Watt during the 60-second test was 505W. On average it hovered around 450-470W, and once you take into consideration the inefficiency of the PSU (maybe 80% efficient), the components themselves were actually drawing under 430W the entire time, meaning the GPUs likely never used more than about 175W each at any given moment. So where do they come up with this 305W per card at stock speeds? I don't think the cards even hit their official TDP of 219W. If they had, the two cards would have been using 438W on their own, leaving only about 10W for the rest of the computer, which is just silly (I sketch this math out below). So where do they come up with these numbers...? The only power consumption estimator I have found that matches my real-world test is the following:
http://extreme.outervision.com/psucalculatorlite.jsp
That calculator also puts the GPUs' electrical consumption well below the official TDP rating.
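For reference, here's the same back-of-envelope math as a quick Python sketch. The efficiency values are assumptions on my part (real efficiency varies with load); the Kill A Watt reads AC power at the wall:

```python
# Back-of-envelope check of the measured numbers. The Kill A Watt reads AC power
# at the wall; the PSU efficiency values are assumptions, not measurements.

def dc_draw(wall_watts, efficiency):
    """Estimate what the components actually pull on the PSU's DC side."""
    return wall_watts * efficiency

peak_wall = 505   # highest Kill A Watt reading during the test
avg_wall = 460    # middle of the 450-470 W range I saw

for eff in (0.80, 0.85):
    print(f"{eff:.0%} efficient PSU: peak ~{dc_draw(peak_wall, eff):.0f} W, "
          f"average ~{dc_draw(avg_wall, eff):.0f} W at the components")
# Even the peak works out to roughly 404-429 W total, well short of the 438 W
# the two cards would need just to hit their official TDPs.
```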