
Does anyone consider PSU efficiency when looking at hash/watt?

Great thinking all around, IMOG. I think any miner should be doing the math on the difference between their current PC/PSU and a new one. Rigs like this that run 24/7 are the group that could benefit the most; that said, the math needs to be done to figure it out.

As far as readings from the clamp-on go, that isn't going to tell you your efficiency, because that measurement is taken before the PSU. It does give accurate values, at least for the CPU; not so much for the GPU, as it can draw up to 75 W from the PCIe slot.

Here are the tiers for the ratings... 89-92% for Platinum, depending on load.
 
I just did the math.

In my case, it would take about a year and a half to realize the savings from the more efficient, more expensive PSU.

Assumptions:
1. My 2x PSUs, 80+ Bronze, 1550 W combined, cost $140.
2. A comparable 80+ Gold, 1500 W, is ~$400.
3. Assume, worst case, that my setup is 80% efficient.
4. Assume the 80+ Gold is 90% efficient.
5. Assume a 4x R9 280X system that requires 1000 W of power.
6. Assume electricity costs $0.1375/kWh (including all the extra fees from the power company; this is comparable to the ACTUAL out-the-door cost of my electric, not just the base rate).

The single PSU costs $400; the dual smaller PSUs cost me $140.
So the single, more efficient setup is $260 more expensive up front.

The dual setup will draw (1000/0.80) = 1250.00 W from the wall.
The single setup will draw (1000/0.90) = 1111.11 W from the wall.
So the dual setup uses 138.89 W more electricity.

That equates to (138.89/1000) * 24 h = 3.33 kWh more per day,
and (3.33 * 30 days) = 100 kWh more per month.

And so, (100 kWh * $0.1375) = $13.75 more per month to run the dual setup.

$260 / $13.75 = 18.9 months until the higher-priced PSU starts paying off.

Not worth it, IMO.
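
Here's the same arithmetic as a quick Python sketch; the inputs are just my assumptions above, and payback_months is a name made up for this post, not anything standard:

```python
def payback_months(price_cheap, price_efficient, eff_cheap, eff_efficient,
                   dc_load_w=1000, price_per_kwh=0.1375):
    """Months of 24/7 running before the pricier, more efficient PSU pays off."""
    extra_cost = price_efficient - price_cheap
    # Wall draw = DC load / efficiency, so the cheaper unit pulls more from the wall.
    extra_watts = dc_load_w / eff_cheap - dc_load_w / eff_efficient
    extra_kwh_per_month = extra_watts / 1000 * 24 * 30
    return extra_cost / (extra_kwh_per_month * price_per_kwh)

# Dual 80+ Bronze ($140 total, 80%) vs. a single 80+ Gold ($400, 90%):
print(payback_months(140, 400, 0.80, 0.90))  # ~18.9 months
```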
 

I'm so lost now.

The clamp-on reads the amps the PSU draws. How did we end up talking about the GPU getting 75 watts from the board??

I see 1.83 A at full GPU load, which is ~35% load on my 600 watt PSU.
It's 80+ Bronze, so at ~35% load it's ~84% efficient.
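
For reference, here's that conversion as a quick Python sketch, assuming a standard US 120 V circuit (strictly speaking, PSU load percentage is rated against DC output, so the true load fraction is a bit lower than the wall figure suggests):

```python
amps = 1.83                      # clamp-on reading on the hot wire
volts = 120                      # assuming a standard US 120 V circuit
eff = 0.84                       # 80+ Bronze efficiency near this load

wall_w = amps * volts            # ~220 W pulled from the wall
dc_w = wall_w * eff              # ~184 W delivered to the components
print(wall_w, dc_w, dc_w / 600)  # load fraction on a 600 W PSU, ~0.31
```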
 
For the sake of argument, let's do this again for two single PSUs, one Bronze and one Platinum, 1200 W each. I'll use Newegg new prices.

1. Corsair Platinum 1200 - $340 (on sale), 92% efficient
2. Thermaltake Bronze 1200 - $220, 86% efficient

Same system, 1000 W load.
Same electric cost, $0.1375/kWh.

The Platinum is $120 more expensive and 6% more efficient.

The Corsair will pull (1000 W / 0.92) = 1086.96 W from the wall.
The Thermaltake will pull (1000 W / 0.86) = 1162.79 W from the wall.
So the Thermaltake pulls 75.83 W more than the Corsair.

The Thermaltake will use (75.83/1000) * 24 h = 1.82 kWh more per day,
and (1.82 * 30 days) = 54.6 kWh more per month.

That equates to 54.6 kWh * $0.1375 = $7.51 per month more.

And then, $120 / $7.51 = 15.9 months before the Platinum is worth it.
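
Plugging these into the payback_months sketch from my earlier post gives the same answer:

```python
# Corsair Platinum 1200 ($340, 92%) vs. Thermaltake Bronze 1200 ($220, 86%)
print(payback_months(220, 340, 0.86, 0.92))  # ~16 months
```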
 
This is a bit beyond me, and I'm hoping for OW to chime in and correct as needed...

I understood it as you having the clamp on the PCIe lead to the GPU, correct? The PCIe lead will show what power is being used through that plug, but it will not show how much the card is drawing through the slot, so it's not an accurate value; it misses the power draw from the slot. Did I misunderstand?

EDIT: I did. You are taking that value at the cord from the wall, before the PSU, not on the PCIe cable, LOL! For that reading, you have to account for PSU efficiency to get a more accurate value. So, for the sake of easy math, let's say Platinum level at 90% efficiency: 220 * 0.9 = ~198 W of actual use. You have recorded SYSTEM power consumption, not just the GPU.
 
I think he was using the clamp-on ammeter on the cord to the wall, so it should be reading total watts; then remove one card to find that card's power usage.
 

I am :)

I forgot to mention that, lol.

Yeah, I measure the black (hot) wire going into the PSU.
So, 220 * 0.84 = ~184 watts of actual draw.

I lose about 36 watts (220 - 184 = 36).

It's nice to know that I use under 200 watts to mine, but once you factor in the efficiency, it's 220 watts that I pay for.

Thanks everyone! That really clears things up :)
Now, if only there were an accurate way to see whether I could run my 6850 and some other card, like a 280X, at the same time ;)
 
I use a Kill-a-Watt to count the power from the wall, since that's what you get charged for anyway.

A high-power rig (e.g., 3x R9 290) will consume approximately 1000 watts.

In the efficiency example provided, you are using approximately 1 extra cent per hour (at under the average per-kWh cost in the USA).

24 hours is 24 cents. To put that in perspective, call 4 days $1.00; then 40 days is $10 and 400 days is $100. If you do plan to mine with rigs as strong and hungry as these, and the price premium isn't that high, it will be worth grabbing an 80+ Platinum over an 80+ Bronze.
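
As a rough sketch of that scaling (the figures above round 96 cents up to $1.00):

```python
extra_cents_per_hour = 1.0   # rough Bronze-vs-Platinum penalty at ~1 kW
for days in (4, 40, 400):
    dollars = extra_cents_per_hour * 24 * days / 100
    print(f"{days:>3} days: ${dollars:.2f}")
# -> 4 days: $0.96, 40 days: $9.60, 400 days: $96.00
```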
 
Now, if only there were an accurate way to see whether I could run my 6850 and some other card, like a 280X, at the same time ;)


Yes, you can; just don't enable CrossFire.
 
In the efficiency example provided, you are using approximately 1 extra cent per hour (at under the average per-kWh cost in the USA).

My price is actually closer to what you will ACTUALLY pay.

In almost every case, the power company has additional fees that are charged per kWh yet billed separately, so they're not "technically" part of the base electric rate and don't get counted in those national-average electric cost surveys. The fact that these fees are itemized separately really doesn't matter; you have to pay them anyway, so they need to be accounted for.

Look at your bill if you don't believe me.

I've pretty clearly shown that unless you mine 24/7 for 12-18 months, the additional upfront cost of the Platinum over the Bronze won't be worth it.

That was just one scenario, however, and all of this can change based on the price you pay for your PSUs. If you can get a GREAT deal and bring the time to ROI for Platinum over Bronze down to less than 6 months, then sure, go for it.
 
Yes, you can; just don't enable CrossFire.

I know; I just meant to see if there's a good way to calculate the power usage of a 270X (since I don't have one). I'd need the watts used by just the 270X, not the whole rig.

Then I can see if my 600 watt PSU will be enough to power both :)
 
If you are using 220 W at the wall, you could add a card that uses 300 W without worry (assuming the PSU can actually output what it says).

That said, its TDP number is all we have to go by. It is 180 W, with a 20% power limit, so that card cannot pull more than 216 W without throttling or modifications. So, I think you are fine.
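
As a rough sanity check in Python, assuming the ~84% Bronze efficiency figure from earlier; this ignores per-rail amperage limits, so treat it as a ballpark:

```python
psu_rated_w = 600            # rated DC output of the PSU
wall_w = 220                 # measured wall draw with the 6850 today
eff = 0.84                   # 80+ Bronze efficiency estimate from earlier
dc_now = wall_w * eff        # ~185 W of DC load right now

tdp_w = 180                  # 270X TDP
worst_case = tdp_w * 1.20    # +20% power limit -> 216 W before throttling
total_dc = dc_now + worst_case
print(total_dc, total_dc / psu_rated_w)  # ~401 W, ~67% load on the 600 W unit
```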
 
What if you use two 750 W Bronze vs. two 750 W Gold PSUs? The difference in price tends to be fairly low, which would make the Gold pay for itself much sooner.
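
For illustration only, plugging made-up example prices into the payback_months sketch from earlier (say $90 per Bronze and $120 per Gold unit, at 85% and 90% efficiency; these numbers are hypothetical, not quotes):

```python
# Hypothetical prices; two 750 W units on each side of the comparison.
print(payback_months(2 * 90, 2 * 120, eff_cheap=0.85, eff_efficient=0.90))  # ~9.3 months
```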
 
In almost every case, the power company has additional fees that are charged per kWh yet billed separately...

I've never noticed those costs. I do know they had average-rate and peak-rate plans; I chose average since, in the long run, it's cheaper if you're mining.
 
In any plan there are additional fees: "service" charges billed per kWh, separate from the base electric rate. Even so, the less you pay for electricity, the less an efficiency boost is worth, making it take even longer to make up for that upfront cost.
 
What if you use two 750 W Bronze vs. two 750 W Gold PSUs? The difference in price tends to be fairly low, which would make the Gold pay for itself much sooner.

I did a comparison on this a few posts back; look on the last page. But I'll admit I was comparing used prices to new, which is why I did the second comparison.

You should try to compare comparable setups. Comparing 2x 750 W to 1x 1200 W isn't comparable;
comparing CoolMax to Seasonic isn't comparable.
 