Disclaimer: I am just a second year electrical engineering student, so there could be safety precautions I'm not aware of. Even though the procedure should be very safe, playing with mains power (or any kind of electricity) has some inherent risk, so follow at your own risk.
Want to know how much power your computer consumes?
Kill-A-Watt is nice, but it costs quite a bit, and you can't use it for anything else.
With a clamp-on multimeter, you can achieve pretty much the same thing (I'll explain the difference later), and a multimeter is a useful tool to have around anyway.
However, unless you want to be poking around mains power with bare conductors (generally a DANGEROUS thing to do unless you have appropriate training), you need a multimeter with a clamp, commonly referred to as a clamp-on multimeter or clamp meter.
Something like this:
http://www.multimeterwarehouse.com/clampmeter.htm
(first Google result, I am NOT recommending buying from them, and I have never heard of them)
I am not sure if anyone makes clamp test leads for regular multimeters, but it's possible.
They are like regular multimeters, except they allow you to measure AC current without making contact with the actual wires, by sensing the magnetic field around them.
I happen to have one lying around, so I decided to give it a try.
The clamp meter is very easy to use. Just set it to measure AC current with the correct range, and clamp it around a wire.
Problem is, a computer power cord (from outlet to power supply) actually has 2 current-carrying wires (we can ignore the ground wire here, since it shouldn't carry any current), with current flowing in opposite directions. So, if you clamp the meter around the whole power cord, you should read 0, since it measures the sum of the currents in the 2 wires (equal magnitude, opposite signs).
Therefore we need to strip the cord first, so we can clamp around only 1 wire.
First, unplug it from BOTH ends!
Stripping can easily be done with a pair of scissors. Remove a section of the outer insulation, trying not to damage the 3 wires inside. Carefully examine the wires afterwards. Use electrical tape to fix them if you damaged the insulation of the inner wires, like I have done. If you actually cut an inner wire in half... I guess you can fix it by stripping both ends, tying them together, and wrapping everything in electrical tape. Add a drop of solder if you want; not really important. Make sure there is no exposed metal!
With the power cord still unplugged, clamp the meter around ONE of the three wires. 2 of them should give you identical readings, and 1 should give you 0 (that's ground; if it's not 0, run for cover). Usually, brown/black is live, blue/white is neutral, and green/yellow is earth ground. Either live or neutral will do (they will give you the same value, since one is the return path of the other).
Set your meter to measure AC amps (range should be ~0-10A; 10A is ~1200W at 120V, or ~2400W at 240V), and turn on the computer.
Now the reading on the meter is how much current your computer is drawing at that instant.
The "apparent power" is voltage multiplied by current. Voltage is 120V in North America, and 100-130V or 200-240V in most places elsewhere. Current is what you are seeing on the meter.
For example, if you are in the US (120V), and you are seeing 1.5A on the meter, your computer is consuming approx. 180W.
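That calculation is trivial to script. A minimal sketch (Python here; the 120V and 1.5A are just the example numbers from the paragraph above):

```python
def apparent_power(voltage_v, current_a):
    """Apparent power in volt-amperes: S = V * I."""
    return voltage_v * current_a

# Example from above: 1.5 A measured on a 120 V outlet
print(apparent_power(120.0, 1.5))  # -> 180.0
```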
However, the "real" power draw is not the same as apparent power; it's less. The ratio between them is the power factor, which accounts for the voltage and current waveforms being out of phase. But since all modern power supplies have some kind of power factor correction (PFC), the two should be pretty close. A Kill-A-Watt has better accuracy because it can measure real power directly.
Also, this is the input power to the power supply. Therefore, to calculate the power your computer is drawing, you need to divide it by the efficiency of the power supply. There is no easy way to measure that, so you'll have to rely on reviews. For my EarthWatts I'm assuming 80%. So if I measure 100W from the wall, that means my computer is drawing 80W.
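Putting the two corrections together, a rough sketch of the whole chain (Python; the power factor of 1.0 and the 80% efficiency are assumptions, matching the active-PFC and EarthWatts figures above):

```python
def dc_power_estimate(current_a, voltage_v=120.0,
                      power_factor=1.0, efficiency=0.80):
    """Estimate the DC power delivered to the components.

    apparent power (VA) -> real wall power (W) -> PSU output (W)
    power_factor ~1.0 assumes a supply with active PFC;
    efficiency=0.80 is the assumed figure for the EarthWatts.
    """
    apparent = voltage_v * current_a     # VA, from the clamp meter reading
    real_wall = apparent * power_factor  # W actually drawn from the wall
    return real_wall * efficiency        # W delivered to the components

# 1.13 A at 120 V: ~135.6 W at the wall, ~108.5 W out of the PSU
print(round(dc_power_estimate(1.13), 1))  # -> 108.5
```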
For my computer (E6300 @ 2.8GHz, 9600 GT heavily overclocked), I get
Idle - 1.13A (135.6W, 108.5W out)
Orthos - 1.59A (190.8W, 152.6W out)
Orthos + FurMark - 2.32A (278.4W, 222.7W out)
So the 380W EarthWatts I am using is overkill for my machine.
Imagine what kind of machine would actually NEED 500W. I think we really should stop buying larger and larger power supplies just because people make them. Very few people actually need 500W. I would guess 650W would do for 2 GTX 280's (236W max power) in SLI, with an overclocked i7.