
CPU Power Consumption


Opusbuild

I would like to understand the "real life" power consumption of the following processors:

1) AMD FX 8120 (105W)
2) Intel i5 3570k (77W)
3) Intel i7 2600k (95W)

This should help me decide which one to go for.

I can see the power consumption values, but I am trying to understand this in terms of electricity billing. Bear in mind that I will sometimes be leaving my PC running at full load 24/7 to run numerical analysis.

http://www.bit-tech.net/hardware/cpus/2012/07/27/amd-fx-8120-review/7

I look forward to your discussion.

Thanks.
 
Bit-tech shows the idle numbers as:
1) AMD FX 8120 (105W)
2) Intel i5 3570k (97W)
3) Intel i7 2600k (118W)

You list the numbers below, but I did not see them at Bit-tech. Where did they come from?
1) AMD FX 8120 (105W)
2) Intel i5 3570k (77W)
3) Intel i7 2600k (95W)

The Bit-tech link you gave was endeavoring to show real idle and load power consumption. I assume you are more interested in real-world power consumption based on their findings?

Calculating the difference in your electric bill is not complicated at all. If the AMD CPU draws twice as many watts as the Intel CPU under full load, then the portion of your electric bill attributable to that particular computer use will be twice as much.
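A minimal sketch of that proportionality in Python (the wattages, hours, and $/kWh rate here are made-up illustration values, not figures from the review):

```python
# If CPU A draws twice the watts of CPU B at full load, the cost of the
# workload doubles too -- the rate and the hours cancel out of the ratio.
def cost(watts, hours, rate_per_kwh):
    return watts / 1000 * hours * rate_per_kwh

a = cost(400, 90, 0.12)   # hypothetical 400 W system at full load
b = cost(200, 90, 0.12)   # hypothetical 200 W system at full load
print(a / b)              # 2.0, regardless of the rate or hours chosen
```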
 
P.S. It is astonishing how much power the AMD 8120 consumes when overclocked!
 
From the Bit-tech review: "For this test, we leave the PC doing nothing but displaying the Windows 7 desktop (with Aero enabled) for a few minutes and record the wattage drawn from the wall via a power meter."

The difference in the values is because Bit-tech is measuring from the wall. This is affected by all the other system components as well as the PSU efficiency.

The values that Opusbuild posted are the published TDPs of the CPUs, which is what they would draw at stock settings under 100% load according to the manufacturer.

Electricity billing is usually done in kilowatt-hours (kWh): one kWh is what your computer would use if it drew 1000W from the wall for one hour. You should be able to find your kWh rate on your electricity bill.
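As a quick worked example of the unit (the wall draw and rate below are assumptions, not measured values):

```python
watts = 350        # hypothetical steady wall draw during the analysis runs
hours = 24 * 30    # running 24/7 for a 30-day month, as the OP mentioned
rate = 0.12        # $/kWh -- substitute the rate from your own bill

kwh = watts * hours / 1000   # 1000 W for one hour = 1 kWh, so 252 kWh here
print(f"{kwh:.0f} kWh -> ${kwh * rate:.2f} per month")  # 252 kWh -> $30.24
```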

Easiest way to measure power consumption is to just get a Kill-A-Watt meter. If you want to do it theoretically, add up the TDPs of all CPUs and GPUs. Approximate overclocking by assuming that a 10% OC will result in a 10% increase in power. Add on hard drives and fans and such, which usually adds up to another 40-50W. That's how much your PC is drawing from the PSU. Then find a PSU review to figure out your PSU's efficiency at that load, and divide by that percentage; that's how much it is drawing from the wall.
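A rough sketch of that recipe (the component wattages, overclock, and efficiency figure are all assumptions for illustration; a real PSU's efficiency at your load has to come from a review, as the post above describes):

```python
def wall_draw_estimate(tdps_watts, oc_percent=0, other_watts=45,
                       psu_efficiency=0.85):
    """Sum CPU/GPU TDPs, scale for the overclock, add drives/fans/etc.,
    then divide by the PSU's efficiency to get the draw at the wall."""
    load_at_psu = sum(tdps_watts) * (1 + oc_percent / 100) + other_watts
    return load_at_psu / psu_efficiency

# Hypothetical build: 95 W CPU + 150 W GPU, 10% overclock, 85%-efficient PSU
print(f"{wall_draw_estimate([95, 150], oc_percent=10):.0f} W at the wall")  # 370 W
```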

Or use this PSU calculator to get your wattage from the PSU, and then figure out the efficiency.
 
From the Bit-tech article you linked I could see the results as outlined below. Note these are FULL LOAD numbers, and I hardly ever see a full load. But just for reference, I did the calculations below.

AMD FX-8120 (3.1GHz stock / 4.65GHz overclock): 242W load at stock 3.1GHz; load [email protected].

Intel Core i7-2600K (3.4GHz stock / 5GHz overclock): 180W load at stock 3.4GHz; load [email protected].

Intel Core i5-3570K (3.4GHz stock / 5GHz overclock): 161W load at stock 3.4GHz; load [email protected].

Using Bit-tech's watt numbers, an AMD 8120 overclocked to 4.65GHz and running fully loaded for 3 hours a day, 30 days a month, would cost $49.24 per month to run in my state.

Using Bit-tech's watt numbers, an Intel Core i7-2600K overclocked to 5.0GHz and running fully loaded for 3 hours a day, 30 days a month, would cost $25.63 per month to run in my state.

Using Bit-tech's watt numbers, an Intel Core i5-3570K overclocked to 5.0GHz and running fully loaded for 3 hours a day, 30 days a month, would cost $21.87 per month to run in my state.

I only calculated using full-load watts per CPU, and 3 hours of full load every day of a 30-day month is probably unrealistic; at least it is for me and my uses. The monthly cost differences are pretty large. If my full-load time were only 10 hours per month, the cost for each CPU would be cut by a factor of 9: $5.47 for the 8120, $2.84 for the 2600K, and $2.43 for the 3570K per month. Those dollar amounts do not seem so large, but the relative spread stays the same.
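For anyone who wants to redo the arithmetic with their own rate, here is a sketch using the stock full-load wall numbers quoted from the review (the $/kWh rate below is a placeholder, not the rate behind the dollar figures above):

```python
# Stock full-load wall draw from the Bit-tech review, per the post above.
load_watts = {"FX-8120": 242, "i7-2600K": 180, "i5-3570K": 161}

RATE = 0.12      # $/kWh -- placeholder; substitute your local rate
HOURS = 3 * 30   # 3 hours of full load per day over a 30-day month

for cpu, watts in load_watts.items():
    print(f"{cpu}: ${watts / 1000 * HOURS * RATE:.2f}/month")
# FX-8120: $2.61/month, i7-2600K: $1.94/month, i5-3570K: $1.74/month
```

The absolute dollar amounts scale with the rate and hours you plug in, but the relative spread between the CPUs stays fixed, which is the point made above.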

Dollars and cents are one of the reasons why I sometimes say that stripping all the green (power-saving) features out of the speed equation is not a good thing. Running overclocked continuously makes little sense for many people in today's energy and job climate.
 
So the argument that AMD is cheaper than Intel may not hold up once running costs are included.
 
Knufire said: "The difference in the values is because Bit-tech is measuring from the wall. This is affected by all the other system components as well as the PSU efficiency. [...]"

^----- This. Bit-tech is not listing "CPU only" wattage... that's total system wattage. Geesh... a CPU with 500+ watts consumption?? :rofl:

Knufire, and don't forget that while overclocking, the motherboard's VRM efficiency affects power consumption too. A certain amount of power is lost as heat from the motherboard as well as the PSU.
 
It's quite a common misconception... mostly with video cards, though. People see "requires 550W PSU" and think that the GPU itself needs 550W.

I'm certain you see a lot of that at Tom's (and everywhere else).
 
Yeah, my 6100 at 4.6GHz probably consumes 525-550W total system power! But I'm not sure about just the CPU itself; I don't know how to test that.

I know AIDA64 reports the CPU's power consumption, so that is one option. You can also roughly estimate it from the stock clock and stock power consumption: take your overclock, translate it into a percentage increase over stock, and increase the power draw by that percentage. Not perfect, obviously :p
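A minimal sketch of that rough estimate (linear in clock speed; real draw rises faster once you add voltage, hence the "not perfect"):

```python
def oc_power_estimate(stock_watts, stock_ghz, oc_ghz):
    """Scale stock CPU power linearly with the clock-speed increase."""
    return stock_watts * (oc_ghz / stock_ghz)

# FX-6100: 95 W TDP at a 3.3 GHz stock clock, overclocked to 4.6 GHz
print(f"{oc_power_estimate(95, 3.3, 4.6):.0f} W")  # ~132 W for the CPU alone
```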
 