Judging from the forum's name, the question I'm about to ask may get answers inclined in favour of one side, probably because everyone here does it.
I'll try to explain where I'm coming from before asking it.
I'm the kind of person who goes for the best optimal settings (overclock/performance/watt/cost ratio, be it the CPU, cooler, or graphics card) rather than the maximum possible settings, since those may not turn out to be the best. I'm strongly inclined towards gaming and occasional video editing/conversion. I have overclocked my CPU and compared synthetic benchmarks against stock clocks, but that's as far as it goes.
I haven't really seen real-world applications, especially games, benefit from overclocking as much as from simply upgrading the graphics card. When it comes to video editing/conversion, overclocking seems to shave off a few minutes: if converting a 4.7 GB DVD to .rm or .rmvb was taking 25 min, it knocks off about 7 min.
I would like real-life experiences, not speculation, in the responses to this question. To really understand my plight: this guy called billin30 here would understand where I'm coming from.
A lot of the member responses he got are not really real-life experiences but presumptions, even though his question stems from a real-life experience. There are 16 responses in that post, but only about 5 address his question directly.
I'm sure you may be wondering how, after being a member here this long, I all of a sudden don't seem to understand why one should overclock. The simple reason is that a friend of mine visited me, saw my air cooler heatsink, researched its performance online, and yesterday finally bought it from me. So what happened? I was forced to return to stock settings, having run overclocked settings the whole time. It was only after returning to stock settings that I noticed how little difference there was.
I thought overclocking was supposed to bring great benefits in the real world, not just in synthetics. I don't mind if you take your time answering this question; I'll read through your response just as you have read through my question to this point. Just where is overclocking applicable and worth the trouble?