XWRed1 said: All things being equal, shouldn't the answer be "never"?
It's just another factor of performance. Increasing it should never hurt, but real-world issues like heat give you a short-term limit. Assuming heat isn't a problem at a given clock speed, though, raising it is always going to be a good thing, because the IPC and the bandwidth of the various components of the system won't suffer for it.
Adding more MHz can eventually run into the physical limits of a chip: the speed of the electrical signals isn't fast enough, or there isn't enough bandwidth from the CPU to the motherboard to complete operations and store results in the RAM, even if the RAM itself is fast enough, etc. Also, there may be a point where the program executes XXX in near 0 seconds, and adding more frequency doesn't help; if the other components aren't a bottleneck, then the software's code itself can be the bottleneck.
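The diminishing-returns point above can be sketched with a toy model (hypothetical numbers, not measurements from any real system): total runtime is a CPU-bound portion that shrinks as the clock rises, plus a fixed memory/disk portion that doesn't care about the clock at all.

```python
def runtime_seconds(freq_mhz, cpu_cycles=2.4e9, fixed_overhead_s=0.5):
    """Toy runtime model: CPU work scales with clock, I/O overhead doesn't.

    cpu_cycles and fixed_overhead_s are made-up illustrative values.
    """
    # CPU-bound time halves when the clock doubles...
    cpu_time = cpu_cycles / (freq_mhz * 1e6)
    # ...but the fixed memory/disk component stays constant.
    return cpu_time + fixed_overhead_s

# Doubling the clock repeatedly buys less and less total speedup,
# because the run converges on the fixed overhead.
for f in (1000, 2000, 4000, 8000):
    print(f, "MHz ->", round(runtime_seconds(f), 3), "s")
```

Each doubling of the clock halves only the CPU-bound term, so the total time approaches the 0.5 s floor set by the other components: exactly the bottleneck effect described above.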
XWRed1 said: I think you can only extrapolate this if you can divine an equation expressing the performance of the cpu. The fact that it is sitting in a system that has a hard drive and ram and things like that in it, and that you are running benchmarks that might get bottlenecked by those things, doesn't help.
As I said, it'd be possible to find how much a slow hard drive hinders, how much XXX ram hinders, etcetera. Obviously you can take them into account. This may not be accurate over a hundred-GHz jump, but over a 1.4 GHz jump my results stayed within a few percentage points. If I ran certain linear benchmarks at 1000 MHz and then at 1500 MHz, I'd be able to ascertain the performance at 2400 MHz, then actually try it.
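The two-point extrapolation described above can be sketched like this (the benchmark scores are hypothetical placeholders, not real results; the assumption is that the benchmark scales linearly with clock speed):

```python
def extrapolate_score(f1, s1, f2, s2, f_target):
    """Predict a benchmark score at f_target MHz from two measured points,
    assuming the score is a linear function of clock frequency."""
    slope = (s2 - s1) / (f2 - f1)          # points gained per MHz
    return s1 + slope * (f_target - f1)    # extend the line to the target

# Hypothetical scores at 1000 MHz and 1500 MHz predict the 2400 MHz result.
predicted = extrapolate_score(1000, 100.0, 1500, 148.0, 2400)
print(predicted)
```

The prediction only holds while the benchmark really is CPU-linear; once the hard drive or RAM becomes the bottleneck, the measured score will fall below the line, which is how you'd spot (and then correct for) that component's influence.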