I don't know if this is the right way to do this, but if you don't get anything as a link, the address should work:
http://team.macnn.com/faq.phtml
How much energy is wasted by running all these distributed computing clients?
Assuming each CPU is left on 24 hours a day (the worst case), and that the average CPU consumes 50 watts of power (just the CPU, no monitor or peripherals), the monthly power usage is 50 W × 730 h ≈ 36.5 kWh. If the average home electric bill is in the 700 kWh range, one CPU would add only about 5% to that bill. At 10 cents per kWh, the monthly cost would be US$3.65.
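Here's a quick sketch of that arithmetic in Python, using the figures assumed above (50 W draw, 700 kWh monthly bill, 10 cents per kWh):

```python
# Rough monthly cost of one always-on CPU at an assumed 50 W draw.
cpu_watts = 50                     # assumed average CPU power draw (W)
hours_per_month = 24 * 365 / 12    # ~730 hours in an average month

kwh_per_month = cpu_watts * hours_per_month / 1000
print(f"Energy: {kwh_per_month:.1f} kWh/month")      # ~36.5 kWh

share_of_bill = kwh_per_month / 700                  # vs. a 700 kWh bill
print(f"Share of bill: {share_of_bill:.1%}")         # ~5%

cost = kwh_per_month * 0.10                          # at 10 cents/kWh
print(f"Cost: ${cost:.2f}/month")                    # ~US$3.65
```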
On a larger scale, calculations suggest that a distributed computing project running on 100,000 clients would account for only about 0.0001% of the world's annual carbon dioxide emissions.
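The FAQ doesn't show its working, but the figure is plausible under rough assumptions. Here's a sketch that assumes a grid emission factor of about 0.5 kg CO2 per kWh and world emissions of roughly 25 billion tonnes of CO2 per year; both are ballpark values I've supplied, not numbers from the source:

```python
# Ballpark check of the 0.0001% figure; the emission factor and world
# total below are rough assumptions, not from the FAQ.
clients = 100_000
kwh_per_client_year = 36.5 * 12    # from the monthly figure above
kg_co2_per_kwh = 0.5               # assumed grid emission factor
world_co2_kg = 25e9 * 1000         # ~25 billion tonnes/year, assumed

project_co2_kg = clients * kwh_per_client_year * kg_co2_per_kwh
print(f"Project: {project_co2_kg / 1000:,.0f} t CO2/year")  # ~21,900 t
print(f"Share: {project_co2_kg / world_co2_kg:.6%}")        # ~0.0001%
```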
These calculations were done for the "average" CPU - that is, a PC. Macs use less power on average, so they should have a smaller impact. Also, the computer was presumably going to be on during some of that time anyway. In the case of a server, the machine would be running 24/7 regardless, so the extra power to run a distributed client is minimal.
- data source: climateprediction.org FAQ