
FRONTPAGE NVIDIA Launches GeForce GTX 680, aka Kepler

It's pretty clear there's a winner when averaging things out. In your link, Frakk, in the stock vs. stock graph (clock-for-clock comparisons are frankly asinine to me: you're overclocking one card with a COMPLETELY different architecture to match another card's clock speeds, which makes no sense), it wins 12 of 18 benchmarks at 1920x1080, the most common resolution. Some games just work better naturally with different implementations; some are TWIMTBP games and others are sponsored by AMD. So add it up: winning most games/benchmarks, costing $50 less, using less power, and running quieter in reference form. It seems pretty clear to me (and most) who stands a bit taller between the two. Even excluding the other stuff and focusing on performance for the majority (1920x1080 or less), it's still head and shoulders above the rest in most cases.

The only way I can see that stance holding water is if you only play the two games where there's actually a difference (AvP, Civ V). Will you notice some of the differences? Like the Hexus article said, you really don't, and that's when the other factors come into play (price, power, noise).

Performance-wise it wasn't a knockout punch, I agree with that, but it sure was a solid left to the chin, especially for a 'mid-range' card if you go by the core used.
+1
I couldn't say it better myself :clap:
 
Hey Frakk:
“The best argument against democracy is a five minute conversation with the average voter.” -Winston Churchill

I LOVE it!

:D Truer words never spoken IMO

@ EarthDog just relax dude, there is little in them. You talk this card up rather a lot; it's not that special, it's just the latest high-end card with a little step over the last high-end single-GPU card. No one disagrees it's cheaper (right now) or that it's a good all-round card.

Soon there will be another one to take its place, that's how the cookie crumbles..

I think you can make better word choices than that, don't you Frakk? -hokie
 
True arguments from both sides... but business is business. AMD priced its card at $550 and people were all over it. Does this mean we should boycott a card that is $50 cheaper and completely dominates the AMD card? No, it just means that because AMD didn't push the envelope in terms of performance, nVidia gets to win the lottery and make great margins with this card.

Could nvidia have sold this card at $300? Maybe... but at $500 they are making investors happy, and it's good for business to make investors happy. Why? Because they begin to have more faith in their investment. Happy investors = happy company = stimulus in production, design, etc.

If the profit margin is larger, it only means the product is worth that much at the time. Take the 680 as it is, the current GPU flagship and king of the hill. If you look past the price, buy it, and are a happy customer, then that is a good purchase in my book. If you think the $500 price is too high, then you are as right as the guy who thought the price was right.

Personally, I bought one because it just hit all the points I wanted to cover: high performance, low power consumption, cool new features, 3D and multi-display, great drivers, and the best performance in my favorite game, etc. You can't go wrong with that much innovation, and $100 cheaper than the equivalent performer is a no-brainer.
 
@Frakk

I don't really see how EarthDog was inflating the GTX 680 in his post. He was merely saying that you can't compare clock for clock especially when one of the cards has to overclock or underclock to make such a comparison. Making stock clock comparisons is fine and fair, and we all know how that ends...
 
Stock vs stock, stock+10% vs stock+10%, max reasonable air OC vs max reasonable air OC, those are all useful comparisons.

Clock/clock? Not useful.
 
Let's compare a Pentium 4 @ 3GHz vs. a 2500K (one core) @ 3GHz :thup:
It just makes no sense :shrug:
 
It's not at all difficult to understand, actually. Here's how simple this is: hands up, who would run any of these cards at stock?

Comparing how they run at different clocks is absolutely useful.

@ diaz, you're emphasizing points that no one has argued against. Chill out people, it's just a GPU card :)
 
It's not at all difficult to understand, actually. Here's how simple this is: hands up, who would run any of these cards at stock?

Comparing how they run at different clocks is absolutely useful.

With different architectures, clocks mean nothing.

What would be a good indicator would be to benchmark at stock+x% for both cards, as Bobnova wisely highlighted earlier.

EDIT: the competitor of the 6970 is the 570, right? The 6970 runs at 860/1375 and the 570 runs at 732/950. Would you compare the 570 to the 6970 at the same clocks? The 570 would literally kill the 6970... Does that mean the 570 is better? No! Because the 6970 overclocks much higher under normal cooling conditions (air/water) than the 570, which means that at their respective highest OCs (still under conventional cooling), they are still neck and neck.
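The stock+x% idea is easy to sketch. Here's a minimal Python illustration using the reference clocks quoted above (860/1375 and 732/950 MHz) and an arbitrary 10% offset; the function name is mine, and integer math keeps the rounding predictable:

```python
# Sketch of the "stock + x%" comparison suggested above.
# Clock pairs are the reference core/memory MHz quoted in the post;
# the 10% offset is just an example value, not a recommendation.

def oc_clocks(core, mem, pct):
    """Return (core, mem) clocks raised by pct percent (integer MHz)."""
    return core * (100 + pct) // 100, mem * (100 + pct) // 100

cards = {
    "HD 6970": (860, 1375),
    "GTX 570": (732, 950),
}

for name, (core, mem) in cards.items():
    print(f"{name} at +10%: {oc_clocks(core, mem, 10)}")
# HD 6970 at +10%: (946, 1512)
# GTX 570 at +10%: (805, 1045)
```

Benching both cards at those offset clocks keeps each architecture's own baseline as the reference point, instead of forcing one card to the other's clocks.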
 
With different architectures, clocks mean nothing.

What would be a good indicator would be to benchmark at stock+x% for both cards, as Bobnova wisely highlighted earlier.


Different ideas, perhaps, on what matters. For me it's how they perform overclocked, how far that overclock goes, and how they stack up in that.
 
It's not at all difficult to understand, actually. Here's how simple this is: hands up, who would run any of these cards at stock?

Comparing how they run at different clocks is absolutely useful.

@ diaz, you're emphasizing points that no one has argued against. Chill out people, it's just a GPU card :)
I run at stock. There has been no need for me to overclock any of the last few cards I've owned. I have been blessed to be able to have nice cards. But yeah, I don't overclock 24/7; I only overclock to benchmark. :thup:

My only point up there was that I didn't like, in that review you linked, that they matched clocks and ran them as if that was worth anything. :p

I certainly understand your point of wanting to see how they both perform overclocked. The problem with that is every card overclocks differently, so you may wind up with a dud or a stud; you never know. BUT you are getting a midrange card in the first place, so these cards and how they clock are of little consequence anyway. I can tell you first hand that the architecture down the chain, at least in the cards I have, doesn't exactly scale as high as the 7900 series either. That's why I personally look at stock clocks for comparisons.
 
I run at stock. There has been no need for me to overclock any of the last few cards I've owned. I have been blessed to be able to have nice cards. But yeah, I don't overclock 24/7; I only overclock to benchmark. :thup:

You should try it, you get better performance out of it for your money... give it a go :thup:
 
Ehh, I'm well aware of the benefits. I do not remotely need to do so... 680 @ 2560x1440 = plenty good FPS for me in BF3 and everything. But thanks though!

(and I edited my post above. ;))
 
Ehh, I'm well aware of the benefits. I do not remotely need to do so... 680 @ 2560x1440 = plenty good FPS for me in BF3 and everything. But thanks though!

(and I edited my post above. ;))

I would ask you for your gamer tag, but I did that once already with another member here and we are never gaming at the same time, despite the fact that I work from home when I want and my internal clock is not rigidly set to UK time; actually it's set to New York time...
 
Can you use NVIDIA Surround with different-resolution monitors like you can with the 7xxx-series AMD cards?

I ask because I happen to use one 1920x1080 and two 1280x1024 monitors for day-to-day use (non-gaming) right now.
 
I'm not sure the surround gaming function will work (I'm pretty sure it won't but don't know for certain), but your desktop will span the multiple displays for sure. The one downside is that, same as the AMD cards, it probably won't be able to enter a low power state using that sort of configuration.
 
Damn, I did not know both manufacturers' cards would not enter low-power mode in multi-monitor configurations.
 
Can you use NVIDIA Surround with different-resolution monitors like you can with the 7xxx-series AMD cards?

I ask because I happen to use one 1920x1080 and two 1280x1024 monitors for day-to-day use (non-gaming) right now.
You can use different sizes for more desktop space, but for surround gaming you need three monitors and their resolutions have to match. I do believe that you can use them in portrait mode instead of landscape, i.e. 3240x1920 versus 5760x1080.
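For what it's worth, those two numbers are just each panel's width (or height, when rotated) multiplied by three. A quick sketch to check the arithmetic, the helper name is mine, not any NVIDIA tool:

```python
# Total surround desktop size for identical side-by-side panels.
# Hypothetical helper, just to verify the figures in the post above.

def surround(width, height, panels=3, portrait=False):
    """Combined resolution of `panels` identical displays in a row."""
    if portrait:
        width, height = height, width  # each panel rotated 90 degrees
    return width * panels, height

print(surround(1920, 1080))                 # landscape: (5760, 1080)
print(surround(1920, 1080, portrait=True))  # portrait:  (3240, 1920)
```

Portrait trades total width for vertical pixels, which is why some surround gamers prefer it despite the extra bezel crossings.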
 