
Power consumption


Ben Cole
Member · Joined Jan 28, 2003 · London
My latest electricity bill was almost double the previous one. The only thing I can think of that could be the cause is my new computer. Since I built my Athlon-based machine, I have been leaving it on a lot crunching SETI. What is the power consumption of a machine like mine? Could it really be the cause?
 
Sure can. Try running 12 folding rigs and see how much your electric bill goes up. What made it worse was that I had to air-condition the room they were in to keep them stable during the summer.
 
From what I've heard, the actual computer isn't as bad as the monitor. Monitors gulp down loads of power; try turning your monitor off when it's not needed and see what happens.

The PG&E page (I'm pretty sure it was PG&E) said to buy LCD monitors, because over the course of a year an LCD would save you $300 on power compared to a CRT. I think an LCD uses around half the power, so the yearly cost to run a CRT would be around $600, or $50 per month, and since yours is on a lot it'd be around $80 per month for your monitor alone.
 
I'm not sure if this is right, but doesn't it consume as much wattage as your power supply? I'm not sure what the timeframe would be, like 450 W per minute or per hour (a minute is probably too short and an hour might be too long), but I know the companies bill in kilowatt-hours, so if you have your old bills, maybe you can make a comparison and work out the rate at which your bill increased.

I'm not sure if what I just said makes any sense, but if you can find the rate you're charged per kilowatt-hour, maybe you can figure out how much extra juice the computer is taking up by using your bills.
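To make the arithmetic concrete: electricity is billed in kilowatt-hours (kWh), which is power in kilowatts multiplied by hours of use. Here's a rough sketch in Python; the 200 W draw and $0.10/kWh rate are illustrative assumptions, not measurements:

```python
# Rough monthly electricity-cost estimate for a machine left on 24/7.
# The draw and rate below are assumed figures for illustration only.
AVG_DRAW_WATTS = 200   # assumed average draw of the whole machine
RATE_PER_KWH = 0.10    # assumed utility rate, dollars per kWh
HOURS_PER_DAY = 24     # left on crunching SETI around the clock
DAYS_PER_MONTH = 30

kwh_per_month = AVG_DRAW_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
cost_per_month = kwh_per_month * RATE_PER_KWH
print(f"{kwh_per_month:.0f} kWh/month, about ${cost_per_month:.2f}/month")
# -> 144 kWh/month, about $14.40/month
```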
 
Just because the power supply is rated at 450 W doesn't mean it continuously consumes 450 W; that rating is simply its maximum output. There are meters you can buy that measure the real power consumption of appliances and other electrical devices. Depending on its size, the monitor may very well consume more power than the computer; you just don't know unless you test it. I run my computer nearly 18 hours a day, with the monitor on power save when not in use, and I know that isn't costing me $50 a month; my whole power bill is only about $25 a month. I don't know where the flat-panel people get their data, but it seems their power company is ripping them off. I have a 21" Sony CRT, by the way.
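As a sketch of the rated-versus-actual point above (the 450 W rating is from the thread; the metered 120 W figure and the rate are assumptions, since only a real meter gives real numbers):

```python
# Compare the cost implied by the PSU's rating with the cost implied by a
# hypothetical metered reading. Rate and metered draw are assumptions.
RATE_PER_KWH = 0.10   # assumed rate, dollars per kWh
HOURS_PER_DAY = 18    # usage pattern described in the post above
DAYS_PER_MONTH = 30

def monthly_cost(watts: float) -> float:
    """Dollars per month for a constant draw of `watts`."""
    return watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH * RATE_PER_KWH

print(f"PSU rating (450 W):   ${monthly_cost(450):.2f}")  # worst case, never actually reached
print(f"Metered draw (120 W): ${monthly_cost(120):.2f}")  # a plausible real-world figure
# -> PSU rating (450 W):   $24.30
# -> Metered draw (120 W): $6.48
```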
 