Hi, I've been wondering about this for a long while now. My question is: what actually kills/shortens the lifespan of a CPU or GPU, the voltage itself or the temperature?
Here's an example (this isn't fact, I'm just making it up to illustrate the point): say one chip runs at 4.0 GHz at 1.2 V @ 80 C, and another runs at 5.0 GHz at 1.5 V @ 40 C. Which factor shortens the lifespan more: the lower voltage at a very high temperature, or the much higher voltage at a lower temperature? I hope you guys see the point I'm trying to make here. If this is confusing, I'll try to rephrase it a bit, thanks.
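To make the comparison concrete, here's a rough sketch of the tradeoff I mean, using Black's electromigration model (MTTF ∝ J^-n · exp(Ea/kT)). The activation energy, the current-density exponent, and the assumption that current density scales linearly with voltage are all placeholder values I picked for illustration, not real numbers for any actual chip:

```python
import math

# Rough sketch of the voltage-vs-temperature tradeoff using Black's
# electromigration model: MTTF ~ (1 / J^n) * exp(Ea / (k * T)).
# The constants below are placeholder assumptions for illustration only,
# not real values for any specific CPU or GPU.

K_BOLTZMANN = 8.617e-5  # Boltzmann constant in eV/K
EA = 0.7                # assumed activation energy in eV (varies by process)
N = 2.0                 # assumed current-density exponent (typically 1-2)

def relative_mttf(voltage: float, temp_c: float) -> float:
    """Unitless relative lifetime score: higher means longer expected life."""
    temp_k = temp_c + 273.15
    # Simplification: treat current density J as proportional to voltage.
    return (1.0 / voltage ** N) * math.exp(EA / (K_BOLTZMANN * temp_k))

low_v_hot = relative_mttf(1.2, 80.0)    # the 4.0 GHz example: 1.2 V @ 80 C
high_v_cool = relative_mttf(1.5, 40.0)  # the 5.0 GHz example: 1.5 V @ 40 C

print(f"1.2 V @ 80 C relative MTTF: {low_v_hot:.3e}")
print(f"1.5 V @ 40 C relative MTTF: {high_v_cool:.3e}")
print(f"ratio (cool/hot): {high_v_cool / low_v_hot:.2f}")
```

With these made-up constants the cooler-but-higher-voltage point comes out ahead, but the result flips depending on the exponents you assume, and this only models electromigration; voltage-driven wear like gate-oxide breakdown isn't captured at all, which is part of what I'm asking about.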