
SOLVED Point of OCing an i7


PeterPwned (Member, joined Jan 22, 2011, Berlin, Germany)
Hi,

I've recently installed an i7 930 running at 2.8 GHz, which makes it seem like I definitely need to overclock it. I had actually been planning to do so and to reach 4 GHz, but I no longer see a reason to shorten the life span of my CPU. Why?

In games, CPU usage never exceeds 34 percent! It would actually make more sense to underclock it, since its performance reserves are evidently not needed.

So tell me, please: why do people overclock their CPUs (with the exception of people who take part in projects like Rosetta)?

Thanks in advance,
Mark
 
Really, the majority of people don't need to overclock their systems. Some people do it for the "free" performance boost. Some do it for the fun and challenge of it, and seeing how far the hardware can actually go. Personally, I'm the latter. I do it for fun just to see what I can squeeze out of a system. My daily driver is at stock though, no need for it to be OC'd 24/7.
 
I'll bet you money that 5970 will see improvements in gaming with that i7 overclocked to 4 GHz (a noticeable FPS change). Granted, if you don't play demanding games, at a demanding resolution, or with the eye candy turned up, it really won't matter, but if you do all three it'll help.
 
Failure to overclock an i7 is like sitting in rush hour traffic in a Ferrari.
Yeah I laugh whenever I see one going to/coming from work during rush hour.

If you need a legitimate reason to overclock your system, you can always join one of the distributed computing teams on the forum. :salute: Edit: oops, I didn't see the last line in the opening post! Yes, join DC!

Edit again: Do bragging rights in front of friends running Pentium 4 machines count?
 
One benefit of exploring overclocking is learning how a machine and its hardware work.

One thing the overclocking community as a whole provides is knowledge of what stays running and what does not. Since we tend to push the envelope a bit, reliable hardware rises to the top, while lesser products get booted and no recommendations.


Overclocking, even mildly, is fun and gives insight into what you can do with computers.
There is a certain thrill, even after all these years of overclocking: looking at what others are doing with their products and going that one bit higher, figuring out the puzzle of settings and what makes my setup hum, or cry.

The point of overclocking an i7? Because you're curious enough to ask, and that is a good enough reason. Why not satisfy your curiosity and see for yourself?
:D
 
@ Matt

"Some people do it for the "free" performance boost."
But if your CPU isn't fully used in games anyway, then you don't really get a performance boost. Instead you get useless bragging rights.

"Some do it for the fun and challenge of it, and seeing how far the hardware can actually go."
Hmm, how is it fun to go into your BIOS, change some numbers, and then run a stability test? I understand people who devise unique cooling solutions and then test them out, but what is the point of number crunching with off-the-shelf components?

@ SMOKEU

Failure to overclock an i7 is like not installing a supercharger in a truck that only needs to pull a small mass.

@ doz

Why would it? I am playing at 6036*1080, and recent games at that. Now, if even then only 30 percent of the CPU's capabilities are needed to complete the necessary computing tasks, how would it benefit me if only 20 percent of the CPU were used due to overclocking?

@ Ignited

I think that is probably the only legitimate reason to do so.

@ Enablingwolf

That is a benefit, but if you've worked with CPUs for 6 years and you've explored i7 overclocking (BCLK etc. is new), then there is no long-term advantage to doing so.

Hmm, I guess your key message is that overclocking an i7 is challenging and allows you to become more experienced with hardware. I suppose my original post was not clearly formulated then, because I was trying to ask about the actual benefit of using the system to run graphically intense applications. Fiddling with technology is always interesting in itself, if done passionately.
 
Well, rendering will benefit from the faster clocks. Pretty much anything that relies on clock cycles will see shorter times to finish a node. How far you take your i7 overclock determines how much benefit you get in return.

My wife has her i7, and most of the time it runs at stock speeds for her. When I need to do some things my 775 quad does OK on, I fire up her machine and overclock it, so the networked node render times for a full 3D animation are really much lower. I use her overclocked i7 to render out and speed things up: I just export the file over to her machine on the network and tell it GO! Then I'm moving on to other things faster.

I also send large encodes to her machine if I have a need for something faster.

Is this closer (in concept) to what your original question is trying to get at?
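To put a rough number on the render speedup, here is a back-of-envelope sketch. It assumes the render is fully CPU-bound and scales linearly with clock speed, which real workloads only approximate, and the 10-hour figure is just a made-up example:

```python
# Back-of-envelope only: assumes a fully CPU-bound render whose time
# scales inversely with clock speed (real jobs rarely scale perfectly).
def estimated_render_hours(base_hours, base_clock_ghz, oc_clock_ghz):
    return base_hours * base_clock_ghz / oc_clock_ghz

# A hypothetical 10-hour render at a stock 2.8 GHz i7, pushed to 4.0 GHz:
print(estimated_render_hours(10.0, 2.8, 4.0))  # roughly 7 hours
```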
 
That is exactly what I was trying to get at. So you're basically saying that CPU clock affects the rate at which operations are completed, even if only a fraction of possible operations are being carried out? If that is the case, why did Intel not increase the stock speeds to 4-5 GHz and adjust the architecture to carry out fewer operations simultaneously? That way, the increased heat from overclocking could have been compensated for by changes in architecture, which would have resulted in greater gaming performance, if I am interpreting what you have said correctly.
 
"Some people do it for the "free" performance boost."
But if your CPU isn't fully used in games anyway, then you don't really get a performance boost. Instead you get useless bragging rights.

There's a difference between using ~30% of your CPU at 2.8 GHz and using ~30% at 4 GHz. Your CPU doesn't have to be maxed out on % usage to see benefits from faster clocks.
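To put rough numbers on that point (purely illustrative, since a usage percentage is always relative to the clock it was measured at):

```python
# Illustrative only: "30% usage" means 30% of the cycles the CPU can
# deliver at its current clock, so the absolute work done differs.
def effective_gcycles_per_second(clock_ghz, usage_fraction):
    return clock_ghz * usage_fraction

stock = effective_gcycles_per_second(2.8, 0.30)  # about 0.84 Gcycles/s
oc = effective_gcycles_per_second(4.0, 0.30)     # about 1.20 Gcycles/s
print(oc / stock)  # the same 30% reading covers ~43% more work at 4 GHz
```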

"Some do it for the fun and challenge of it, and seeing how far the hardware can actually go."
Hmm, how is it fun to go into your BIOS, change some numbers, and then run a stability test? I understand people who devise unique cooling solutions and then test them out, but what is the point of number crunching with off-the-shelf components?

I was referring to benching. Testing for 24/7 OCs is boring, I agree.

...why did Intel not increase the stock speeds to 4-5 GHz and adjust the architecture to carry out fewer operations simultaneously?

At this point in time, I would say cost and avoiding RMAs.

Intel would have to pack a mean stock cooling solution with an i7 quad/hex X58 chip clocked to 4-5 GHz, which would cost them quite a bit in R&D and manufacturing. There are a few high-clocked dual cores, like the recently released X5698 at 4.4 GHz stock with HT.
 
In the pre-Core days, there were serious voltage leakage issues and the resulting heat output. The theoretical limit of the P4 design was something like 7 GHz, but by the time they got to 3.2 GHz, it was a space heater and leaking voltage really badly.

Basically, the higher you design the clock speeds, the more voltage and heat you usually end up with, so you need beefier cooling. Robust parts cost money, and in turn products cannot be as cheap as they are now. Add in failure rates and binning parts for that clock rate, and there is no one simple answer to that aspect of your question.

More cycles per second will in fact increase the output of demanding applications, but the heat from generating more clock cycles per second will overwhelm most stock coolers if you overclock enough, and in some cases even at stock with a machine in a harsh environment. One last thing is the power draw itself and the energy requirements over the long term: the older 2+ V CPU designs were a burden in long-term energy costs.
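The voltage-and-heat point can be ballparked with the usual dynamic-power rule of thumb for CMOS, where power scales with C * V^2 * f. The voltages below are made-up example numbers, not measured values:

```python
# Rule-of-thumb dynamic power for CMOS: P is proportional to C * V^2 * f.
# The ratios below use hypothetical example voltages, not real measurements.
def relative_dynamic_power(voltage_ratio, frequency_ratio):
    return (voltage_ratio ** 2) * frequency_ratio

# Overclocking 2.8 -> 4.0 GHz with a vcore bump from, say, 1.20 V to 1.35 V:
print(relative_dynamic_power(1.35 / 1.20, 4.0 / 2.8))  # about 1.8x the heat
```

Even a modest voltage bump multiplies the heat output, which is why the stock cooler stops being enough well before the silicon itself gives out.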



Most CPU manufacturers are leaning towards power efficiency, not raw speed. A very efficient CPU of any design will give good output versus clock cycles; it may be slower at something it is not designed to do, but if it has the proper instruction set: WHAM! Benefit. Hence overclocking generic hardware can be a benefit for the average user in some cases, going with raw power instead of nimble power efficiency.

Hence there is a specific market for graphical rendering cards versus general use/gaming cards.

Having more threads passing through a given CPU at a higher clock rate will be a benefit for the average power user, since if the clocks are higher, it can process threads faster.
 
I'm sorry if I don't understand, but a solid CPU cooler only costs 50-100 bucks. That is nothing compared to the price of, say, a 980X, which runs at a mere 3.33 GHz. I think there would be a consumer group willing to pay for a better cooling solution to get a stock 4-5 GHz CPU. After all, many people buy an aftermarket cooler anyway, just because they want something quieter.

There is clearly a large market for gaming hardware, and a number of people are afraid of OCing because of the consequences if it's done wrong. What they don't know is that it's very hard to actually damage a CPU nowadays. I would bet my house that a large profit could be made by selling 4-5 GHz CPUs for 500 dollars.

Anyway, thanks a lot for answering my questions. I'd better get to work then, if I want my FPS. Although, to be honest, I have to say I am disappointed in Eyefinity gaming. It doesn't really give you a feeling of immersion, but rather that there are two screens at your sides to look at when you feel you don't have the time to look in a different direction, which, for me, is never the case.
 
The huge majority of the computing world leaves things stock. A $100 cooler to many is outlandish; to us overclockers, it is just part of it. Hell, dumping LN2 into a specially made pot to get to -195°C is, for some, normal.

What level of resource expenditure is worth the gains? That is up to each and every user, be it an Atom user or a top-end CPU fully dressed. Maybe it is the user who goes mid-level and overclocks until the CPU cries, then shuts it up by watering it down or icing it up.
Computer users are very diverse and have needs only they can know, even if they do not know better, or are unwilling to go past what they've got and are fully happy with it.

Hence more machines are sold as OEM kits than built from scratch. The majority of the computing market is commercial/OEM, and I just do not see a bean counter sending out a PO for 1000+ machines with $100 CPU coolers. Most home users do not want to overclock, nor do they want the added overhead/learning curve to do it right. They are happy with machines that do not even support overclocking; even if they knew what overclocking was, it might not be what they want to do with the computer.

There is clearly a large market for gaming hardware, and a number of people are afraid of OCing because of the consequences if it's done wrong. What they don't know is that it's very hard to actually damage a CPU nowadays. I would bet my house that a large profit could be made by selling 4-5 GHz CPUs for 500 dollars.

As Matt made aware...
There are a few high-clocked dual cores, like the recently released X5698 at 4.4 GHz stock with HT.
 
I'm sorry if I don't understand, but a solid CPU cooler only costs 50-100 bucks. That is nothing compared to the price of, say, a 980X, which runs at a mere 3.33 GHz. I think there would be a consumer group willing to pay for a better cooling solution to get a stock 4-5 GHz CPU. After all, many people buy an aftermarket cooler anyway, just because they want something quieter.

There is clearly a large market for gaming hardware, and a number of people are afraid of OCing because of the consequences if it's done wrong. What they don't know is that it's very hard to actually damage a CPU nowadays. I would bet my house that a large profit could be made by selling 4-5 GHz CPUs for 500 dollars.

Anyway, thanks a lot for answering my questions. I'd better get to work then, if I want my FPS. Although, to be honest, I have to say I am disappointed in Eyefinity gaming. It doesn't really give you a feeling of immersion, but rather that there are two screens at your sides to look at when you feel you don't have the time to look in a different direction, which, for me, is never the case.

The market for HIGH-end CPUs is actually very small in comparison to the mainstream market. I mean, ask everyone you see in a day (do a little poll) what they are running. You think there's a large market for 980X CPUs? Not hardly. You think there's a huge market for the 2600K? Not nearly as big as the more budget-minded i3/i5 market.

Do most of these people need 4 GHz+? Nope. Most of those i3/i5 mainstream machines are running some mild video card, not a $300+ beast of a card that could actually gain a decent amount.

And if you don't believe us about the CPU clock affecting your gaming performance, put our beliefs to the test. Run a bunch of gaming benchies at the stock 2.8 GHz clock, then give it a nice 4 GHz overclock and run them again.
 
@ Wolf

That may be true, but I dare argue that the market for computing products is unfathomably large, and while the gaming and OC market is comparatively small, it is still large in itself. Forums like these and the tons of gamer magazines out there (12 well-known ones just here in Germany) point to the fact that there are at least many people interested in high-performing systems.

I believe more machines are sold as OEM kits because there is not enough education in school. It took me two months to understand and build a rig. Seeing as information technology plays an increasingly great role in both leisure and office tasks, I think it would be more than reasonable to make a one-year computer studies course obligatory.

Putting that aside for a moment, I do see your point, but I believe that is only because they haven't been able to explore the computer as a thing in itself and experience what you can do with one. I know many people who only surf and use Office. What a shame, considering the diversity and flexibility of PCs.

A dual core remains a dual core. Quad-core games are on the rise. That is all.

@ doz

See what I wrote above: while it may be true that the markets are relatively small, they are big when viewed in isolation rather than as a chunk of the whole IT world.

I will definitely give that a shot when I'm done flashing my GPU BIOS.
 
Peter, I sort of do not understand.

You went from asking why you should overclock and wanting more details to now pushing it as something everyone should be doing and saying there is a huge market for it. Could you clarify your point, please?
 
Anyway, thanks a lot for answering my questions. I'd better get to work then.

There you go: my core question has been answered. Maybe I should stop diverging from the actual topic and starting a general discussion; please excuse me. Feel free to close this if you like.
 
I have neither the power to close the thread (green and red star members have that powah), nor the desire for it to be closed. :D

I just wanted clarification of what you were trying to get out of the discussion. I find it an interesting topic, which is why I chose to participate. Hence my asking for clarification, so that I could discuss it and understand your point.
 
Okay then ^^

While we're at it, is socket 1366 dead? Sandy Bridge looks pretty good, and I'd like to know whether I'll need a new mobo by the time the high-end ones are released. 775 lasted quite a while...
 
I would not be as concerned about the socket as I would be about a chipset supporting newer CPU iterations on the socket.
775 might have been around a freakishly long time, but the chipset iterations are what changed and determined which new CPU technology could be used.

1366 is far from dead, since right now it is the high-end socket choice for desktops. I would assume the budget/midline sockets will be phased out before it, as newer, higher-bandwidth sockets come to market, with 1366 possibly ending up supporting the equivalent of the Celery.
So the (now) high-end 1366 may become a low-end socket in the future, as higher-density sockets come out.

As for whether you should wait out the Sandy chipset, that would be better served in the motherboard section. There is a robust thread on that particular subject, and possibly many more to come, since everyone is drooling over it right now, along with the issues it just had.
 