I'll just post my standard wall of text for these questions.
Why does OCing dmg a CPU?
This is a bit hard to understand if you don't know some basics about electron orbital theory and quantum tunneling.
I'll assume you know nothing so I'll try to keep this as simple as I can. You'll have to take my word on a few facts though.
Normally, electrons stay around their atoms and don't go wandering off. So in a CPU, they'll stay in one transistor and not move to others. However, if you've learnt about quantum mechanics, you'll know it's actually possible for electrons to escape from energy wells, as long as the barrier isn't infinitely high; it's just very uncommon. In a process known as quantum tunneling, electrons can pass through a barrier they classically shouldn't be able to cross and come out the other side.
Now, a transistor in a CPU is made from alternating regions of + and - doped and undoped silicon. Once in a while, an electron will tunnel a couple of atoms deep into an adjoining region before coming back to its orbit, and if this happens enough times, eventually one makes it all the way through to the adjoining transistor.
Keep doing this and eventually an electron doesn't come back, but stays attached to an atom in the adjoining undoped section of silicon. Over time (usually years), this tunneling damage forms a conductive path between two adjoining transistors and allows free electron flow.
This bypasses the "gates" between the transistors, and as a result the computer misreads the signal, causing an error.
This process is called silicon degradation and eventually results in a complete CPU failure.
Now, as to where overclocking comes in.
If you know about electron orbital theory, the more energy an electron has, the more likely it is to leave its orbit and tunnel. I.e. if your CPU is running hot, or has a considerably higher voltage going through it, electrons tunnel in much higher numbers. As a result, the more you OC, the faster the tunneling events that cause silicon degradation pile up.
In addition, if you increase the voltage enough, you can actually physically destroy the silicon lattice of the gates within a processor. Don't make me explain this cuz I can't without lots of math.
Now, on to OC and Heat
In a CPU, boosting the frequency (F) on its own has a very minor, almost insignificant heat increase.
It's the voltage (V) increase that dramatically increases heat.
I'll just quote myself again
Power Dissipation = PD, in watts
V = voltage, in volts
F = frequency, in Hz
C = capacitance, in farads
Total PD = C x F x V^2
As C doesn't change (ok, it technically does, but for the sake of keeping the math simple we can assume it doesn't):
If you actually plug in numbers and graph the function, the heat increase from a frequency increase is minute compared to the heat increase from a voltage increase: F contributes linearly, while V contributes quadratically (it's squared).
Indeed, the more you increase V, the less the F part of the equation matters to the total heat output.
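If you want to see the math for yourself, here's a quick sketch that plugs numbers into the PD = C x F x V^2 formula above. The capacitance and clock/voltage values are made-up placeholders, not real chip specs; only the scaling behavior matters.

```python
# Compare the power increase from a frequency bump vs a voltage bump
# using the dynamic power formula PD = C * F * V^2.

def dynamic_power(c_farads, f_hz, v_volts):
    """Dynamic power dissipation in watts: PD = C * F * V^2."""
    return c_farads * f_hz * v_volts ** 2

C = 1e-9  # placeholder effective switched capacitance (F), not a real spec

base = dynamic_power(C, 3.0e9, 1.20)    # stock clocks

# +10% frequency at the same voltage -> power rises linearly (+10%)
f_bump = dynamic_power(C, 3.3e9, 1.20)

# +10% voltage at the same frequency -> power rises quadratically (+21%)
v_bump = dynamic_power(C, 3.0e9, 1.32)

print(f"base:   {base:.2f} W")
print(f"+10% F: {f_bump / base - 1:+.0%}")
print(f"+10% V: {v_bump / base - 1:+.0%}")
```

Same 10% bump, but the voltage version costs you roughly twice the extra heat, which is exactly why voltage is the part to watch.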
For real-world data, look at the power usage increase in Tom's i5 efficiency article:
http://www.tomshardware.com/review [...] 500-7.html
Each bump was a constant 10MHz clock speed increase, but because voltage contributes quadratically to PD, the graph is not linear, and power usage barely moves until you start seeing large voltage increases.
Power usage directly translates into heat.
As for actual temps, it's more complicated than power dissipation alone:
CPU temperature = (Total PD in watts) x (HSF's thermal resistance in C/W) + (ambient temp in Celsius)
For comparison purposes, the resistance and ambient temp can be considered constant (technically not true once again: resistance changes slightly with temp, and ambient rises as the system dumps more heat into the room).
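The temperature formula above is simple enough to sketch too. The wattage, cooler resistance, and room temp here are illustrative placeholders, not measured numbers.

```python
# Estimate CPU temp from the formula:
# CPU temp = (total PD in watts) * (HSF thermal resistance in C/W) + ambient

def cpu_temp(power_watts, hsf_resistance_c_per_w, ambient_c):
    """Estimated CPU temperature in Celsius under sustained load."""
    return power_watts * hsf_resistance_c_per_w + ambient_c

# Example: a 95 W chip on a 0.30 C/W cooler in a 25 C room
print(f"{cpu_temp(95.0, 0.30, 25.0):.1f} C")   # ~53.5 C

# Same chip overvolted to pull 130 W: the temp rise scales with the watts
print(f"{cpu_temp(130.0, 0.30, 25.0):.1f} C")  # ~64 C
```

Notice the cooler's C/W number multiplies the wattage directly, which is why a better heatsink buys you OC headroom: it shrinks how much each extra watt costs you in degrees.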
In your specific case, the answer depends not so much on the 200MHz frequency increase, but on how much voltage increase you'll need to attain it. If there's no voltage increase, the life of the CPU will be minimally impacted.
There is no easy way to tell how each individual chip is affected: due to imperfections in the manufacturing process, the degradation rate vs voltage or frequency curve is unique to each chip.