
Start-up power usage


KOXC2003

It's been a long time since I last posted, but I have a quick question that I can't find the answer to.

Several years ago on Call for Help, an old TechTV show, a guest explained that the power needed to turn a computer on and off was greater than the power needed to leave the system on standby (monitor off, of course). The host then suggested turning your computer on in the morning when you first use it, and off at night after you last use it for the day. I think the guest agreed with this, but it's been a while, so I don't quite remember.

I've looked around the web a bit, and a few sites dismissed this as an urban legend, but the man on the show made some good points and actually measured the electricity used to start up one of the computers on the show.

So, getting to the actual question: I'm curious how long a computer would have to sit idle (again, monitor off) before it used more electricity than turning it off and back on when it was next needed. I'm sure it varies from computer to computer, but does anyone have a general idea, or a link on this? I tried to find the show notes for the episode, but I don't even know what year it aired.

Thanks for any replies.
 
Not sure of the formula, but I believe the online PSU calculators show the percentage increase in power draw at power-up. (It does require more.)

As for the "aging" effect, that's a different issue from simply drawing more power: you're sending a sudden, large surge of current through a dead (cold) system.

That will require a CE or EE major to work out, I'm sure.

EDIT: A quick Google turns up this. It's not an algebraic formula, but I will research some more.

If you want to look yourself, this Google search seems to reveal a lot of good results. How many are opinion vs. fact-based, I can't say; I haven't looked through them all yet...
 
Computers do pull more power at startup: caps have to charge, disks have to spin up (although they do that after waking from standby as well), etc. I'm not sure where the crossover point is, and it will vary from system to system. The best way to find out is to measure power consumption in each of the various states and at startup with an ammeter.
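For what it's worth, the break-even math is simple once you have those measurements. Here's a rough sketch; the wattage figures and boot duration below are made-up placeholders, so substitute your own readings:

```python
# Break-even idle time: how long must the machine sit idle before shutting
# down saves more energy than the restart costs?
# All numbers are hypothetical placeholders; measure your own system.

P_IDLE_W = 60.0      # measured idle draw, watts (monitor off)
P_STARTUP_W = 150.0  # measured average draw during boot, watts
T_STARTUP_S = 60.0   # boot duration, seconds

# Extra energy a restart uses beyond what idling for that same time would:
extra_joules = (P_STARTUP_W - P_IDLE_W) * T_STARTUP_S

# Idle time whose consumption equals that extra startup energy:
breakeven_s = extra_joules / P_IDLE_W

print(f"Extra startup energy: {extra_joules:.0f} J "
      f"({extra_joules / 3.6e6:.5f} kWh)")
print(f"Break-even idle time: {breakeven_s / 60:.1f} minutes")
```

With placeholder numbers in that range, break-even comes out to only a couple of minutes of idle time; the startup surge is large but very brief, which may be why some of the sites the OP found call the "leave it on to save power" advice an urban legend.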
 
I think the more important issue here is that powering up puts stress on all of the components. The more times you turn it on and off, the shorter the life of the entire system. A computer costs anywhere from $500 to $4,000 and up depending on how fancy it is and how many gadgets are installed, so shortening the life of the components could cost you a bundle over the long haul.

Having said all that, the power usage of the computer can be measured at idle with a clamp-on AC ammeter. Watts = volts × amps. This is an approximation because of power factor and other issues, but for your purposes it's close enough. You can calculate the electricity cost by converting your result to kilowatt-hours and multiplying by the power company's price per kilowatt-hour. With this information you can estimate how long it would take to use up enough electricity to equal the cost of your computer. Then it's a personal decision how long to leave it powered up each day.
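To put that arithmetic in one place, here's a quick sketch; the amp reading, line voltage, and utility rate are assumptions, so plug in your own numbers:

```python
# Electricity cost from a clamp-on ammeter reading.
# Watts = volts * amps is an approximation (ignores power factor), as noted.
# All inputs are hypothetical placeholders.

AMPS = 1.2            # measured idle current, amps
VOLTS = 120.0         # nominal US line voltage
RATE_PER_KWH = 0.12   # utility price, dollars per kWh (check your bill)
HOURS_PER_DAY = 24.0  # time powered on per day

watts = VOLTS * AMPS
kwh_per_day = watts * HOURS_PER_DAY / 1000.0
cost_per_day = kwh_per_day * RATE_PER_KWH

print(f"Idle draw: {watts:.0f} W")
print(f"Daily use: {kwh_per_day:.2f} kWh -> ${cost_per_day:.2f}/day")
print(f"Per year:  ${cost_per_day * 365:.2f}")
```

From there, dividing the computer's purchase price by the daily cost tells you how many days of around-the-clock idling it takes for the electricity to equal the cost of the machine.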
 
To use a clamp-on ammeter, you need to separate the conductors so that you're clamping only the hot or only the neutral at one time. If you don't, with the juice flowing in on one and out on the other, the fields cancel and you'll get a net reading of zero no matter what the amperage draw is.

Just so ya know. That's what I had to do when I tested mine.

My DMM can handle 10 amps, which is more than a computer draws, so I double-checked the clamp-on's reading with it.
You'll need to sacrifice a PC power cord to test that way, though: separate the wiring, cut one conductor (the black/hot), and hook the meter's leads to the two cut ends so it's in series.
Make darned sure what your meter can handle beforehand, though. A 2 amp meter will let the smoke out if the fuse isn't fast enough.

There are also plug-in meters that will show you the power usage: the meter plugs into the wall and your targeted device plugs into the meter. The LCD readouts are handy too, giving you a hard number to do the math with.

Sadly, I didn't record the power-up numbers when I tested mine, but it drew 2.75 amps while defragging the 4-drive RAID (the most I could get it to draw).
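For scale (assuming a nominal 120 V line and ignoring power factor, as noted above): 2.75 A × 120 V ≈ 330 W, so even the heaviest load I could produce was only about a third of a kilowatt.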
 
If you're really adventurous and know something about house wiring, you can measure the current draw with a clamp-on ammeter without separating the power plug wires. Go to your breaker box and locate the breaker that feeds the circuit your computer is on. Make sure that the only thing plugged into that circuit is your computer. Then you can clamp the ammeter around the line exiting that specific breaker. ********WARNING******** DON'T DO THIS UNLESS YOU KNOW WHAT YOU ARE DOING. YOU CAN REALLY HURT YOURSELF OR DIE INSIDE A BREAKER BOX.
 