
FRONTPAGE: NVIDIA Launches GeForce GTX 680, aka Kepler

Mmm... I am waiting for IB to be launched for my next rig, and am still wondering: 7970 or 680?

As it's been said already, it seems that the 7970 overclocks somewhat higher than the 680, and this is OCF, no?

If the 7970 price drops by $/€50-100, I think I will go AMD. If not, it will be the 680 for me...
 
AMD has reportedly said they aren't sweating and can beat this (with one GPU), so it was actually smart business not to tip their hand with the full GK110 beast.

Sounds like the good old days again. We all sit and /popcorn + /profit watching the two titans punch each other in the face. They make their zillions and we get cutting edge technology for relatively cheap. Capitalism the way it should be.
 
I don't think I have witnessed that sentiment (acceptable because it's Nvidia). There are a couple of games the 7970 wins, but the 680 takes the rest (see the AnandTech, Tom's, and TechPowerUp reviews).

Reading the Anand review, it beats out the GTX 580 by an average of around 30-40%. That's big. Prior to its release, the GTX 580 was selling (in the US) from $400-$530 (non-watercooled). Now it's $360-$500 (non-watercooled).

I don't understand this... I would imagine it would catch up and possibly beat it. But (like fractions) do to one side what you do to the other: when you overclock the 680, I would imagine it still beats out the 7950.

You need to look at the Overclockers UK forum then, lol, it's proper swinging handbags!
Like I said though, the GTX 680 reference card stock is 1006 core / 6000 memory, and core boost is around 1110 +/- depending on TDP and temperature.
In comparison, the 7970 reference stock in reviews is 925 core / 5500 memory. So quite a big difference. As I'm sure you're aware, it'd be easier to do our own reviews on the hardware we own, but I can't afford both a 7970 and a 680 to compare.

Please don't think I'm trying to derail this thread, by the way; I'm still working through reviews and findings on the performance difference between the two cards. The problem with different review sites is they all have different hardware (CPU, RAM, etc.) and different ways to measure. TechPowerUp, for example, sometimes doesn't use 4x MSAA in some benches, where AnandTech does.

For the 1536 MB 580s (air-cooled), the prices haven't really changed here in the UK, at around 320-340 GBP.

The only reason I brought up the 7950 is because at its average price of 330-350 GBP it's around 80-100 GBP cheaper than a 7970 or a 680.
Martini currently owns a 7950 OC, and he has just bought a 680. He is going to put them back to back. Of course the luck relies on the silicon overclocking, but hopefully he'll be able to work out the max OC of both cards. The end result will be how much of a punch the 7950 offers for the money.

This is only a snippet of info I was able to attain so far, as I'm at work, but on Guru3D it shows how much the 7970 scaled in two games from overclocking. I'm not biasing this info; it's just the only info I have found at the moment.


Summarised Crysis 2: 1920x1200
DirectX 11
High Resolution Texture Pack
Ultra Quality settings
4x AA
Level: Times Square (2-minute custom timedemo)

Standard 7970: 61 fps
Asus DirectCU II OC (1000/5600): 66 fps
Asus DirectCU II OC'd (1250/6000): 76 fps
Stock 680 (1006 base / 1058 boost / 6000 mem): 63 fps
680 OC'd (1264 base / 1264 boost / 6634 mem): 70 fps

---------------------------------------------------------------------------------
Alien vs. Predator: 1920x1200, 4x AA, 16x AF

Standard 7970: 55 fps
Asus DirectCU II OC (1000/5600): 61 fps
Asus DirectCU II OC'd (1250/6000): 71 fps
Stock 680 (1006 base / 1058 boost / 6000 mem): 52 fps
680 OC'd (1264 base / 1264 boost / 6634 mem): 58 fps
-------------------------------------------------------------------------------------

Yes, this is only two games; I'm well aware of the potential of the 680 in BF3 and other games vs. the 7970. This is just to show how much the 7970 can scale when overclocked, and how benchmarks pitting the stock 7970 against the 680 could show different results.

Here are the links for my summary:
http://www.guru3d.com/article/asus-radeon-hd-7970-directcu-ii-review/23
http://www.guru3d.com/article/geforce-gtx-680-review/25
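
If it helps to put numbers on that scaling point, here is a quick sketch (just arithmetic over the Guru3D fps figures quoted above; the values are theirs, not mine):

```python
# Rough overclock-scaling check using the Guru3D fps figures quoted above.
results = {
    "Crysis 2": {"7970 stock": 61, "7970 OC": 76, "680 stock": 63, "680 OC": 70},
    "AvP":      {"7970 stock": 55, "7970 OC": 71, "680 stock": 52, "680 OC": 58},
}

for game, fps in results.items():
    gain_7970 = fps["7970 OC"] / fps["7970 stock"] - 1
    gain_680 = fps["680 OC"] / fps["680 stock"] - 1
    print(f"{game}: 7970 gains {gain_7970:.0%} from its OC, 680 gains {gain_680:.0%}")

# Crysis 2: 7970 gains 25% from its OC, 680 gains 11%
# AvP: 7970 gains 29% from its OC, 680 gains 12%
```

In other words, at those clocks the 7970 picks up roughly two to three times as much from overclocking as the 680 does in these two games, which is why stock-vs-stock benchmarks only tell part of the story.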


Like I said, if I owned both cards I'd bench them with a wide variety of games, but it's hard to cross-reference different review sites' fps because the results differ, just like my PC would differ from someone else's.
 
As I understand it (not from NVIDIA, just from reading around) GK110 isn't finished yet, so they couldn't release that. AMD was kicking their prior gen in the rear-end and they had something that could beat it at a better price point. Thus, GK104 was released at a lower price.

AMD has reportedly said they aren't sweating and can beat this (with one GPU), so it was actually smart business not to tip their hand with the full GK110 beast.

It might not be the best for the consumer, but these guys exist to make money. Why release something that puts the competition squarely in their rear-view and charge $600 when you can beat it by a little bit and charge $50 less than the competition at $500?


I'm not sure I believe anything AMD or nVidia say: AMD re: Bulldozer, nVidia re: Kepler being 2x as fast as the GTX 580... it's nowhere near that.

But while the GTX 680 is definitely faster than the 7970, it's only by about 10% overall; that's far from a killing...

The GTX 680 is running a much higher stock clock than the 7970: 925 MHz vs. 1060 MHz.
If you take the performance deficit of the 7970 and attribute it to the lower clocks, it adds up in theory.
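
As a rough back-of-envelope version of that argument (a sketch only: it uses the clocks as quoted in this post and the roughly 10% gap mentioned above, and assumes performance scales linearly with core clock, which real games never quite do):

```python
# Hypothetical clock-normalisation sketch using the figures quoted in this thread.
clock_7970 = 925   # MHz, reference 7970 core clock
clock_680 = 1060   # MHz, the 680 clock as quoted above
perf_gap = 0.10    # the roughly 10% overall lead attributed to the 680 earlier

clock_advantage = clock_680 / clock_7970 - 1
print(f"680 clock advantage: {clock_advantage:.1%}")   # ~14.6%
print(f"680 performance lead: {perf_gap:.0%}")         # 10%

# If performance scaled one-for-one with core clock, a ~15% clock advantage
# would more than cover a ~10% lead -- which is the clock-for-clock argument.
```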

As for the load power draw, it's about the same.

The 7970 is the only AMD GPU of the 7### line with no GHz version.
Add to that they have had three months to refine their 28nm architecture. Now look at the 7870: it's just about as fast as a 7950, has lower specifications, and draws notably less power.

I would like to see these two titans fight it out clock for clock in the widest possible range of games and resolutions. I think the 7970 might just get close to the GTX 680, or even catch it, if clocked at 1060 with the memory also clocked to the same...

On this occasion I can easily believe AMD when they say they are not intimidated by the GTX 680; I would even believe them if they said they already have what it takes to at least match it.

I think both these guys have more to come; this is just a warm-up.

I'm not counting AMD out; we might even hear from them soon.
 
@davedree, I hear you; it's hard to cross-reference fps from different websites. I've been wanting to update my PC for a year now, but looking at benchmarks from different sites gets me going crazy, because I shouldn't be beating the numbers they are throwing up, or even coming close. Since I've been on this site I heard about the AnandTech website, and it seems more realistic. The links you put up I can't believe; there is no way I can beat a 580, let alone a 680.
 

Attachments
  • avp2.JPG (69.7 KB)
frakk, the 680 doesn't run at 1060; that's its minimum base clock, and it boosts on average up to 1110, plus more if there's room.

OK, stock for stock the 680 looks a winner. But it's early days; I want to see a good OC'd 680 vs. an OC'd 7970, in a wide variety of games.
 
@davedree, I hear you; it's hard to cross-reference fps from different websites. I've been wanting to update my PC for a year now, but looking at benchmarks from different sites gets me going crazy, because I shouldn't be beating the numbers they are throwing up, or even coming close. Since I've been on this site I heard about the AnandTech website, and it seems more realistic. The links you put up I can't believe; there is no way I can beat a 580, let alone a 680.

What are you comparing that to?
 
frakk, the 680 doesn't run at 1060; that's its minimum base clock, and it boosts on average up to 1110, plus more if there's room.

OK, stock for stock the 680 looks a winner. But it's early days; I want to see a good OC'd 680 vs. an OC'd 7970, in a wide variety of games.

Ah... okay, that's even more... I thought it was 1057 or something; don't ask me where it comes from, it's too late in the day... :)
 
@davedree, I hear you; it's hard to cross-reference fps from different websites. I've been wanting to update my PC for a year now, but looking at benchmarks from different sites gets me going crazy, because I shouldn't be beating the numbers they are throwing up, or even coming close. Since I've been on this site I heard about the AnandTech website, and it seems more realistic. The links you put up I can't believe; there is no way I can beat a 580, let alone a 680.

It's a game I've yet to play. Your results look great; I think I'll have a play with this game and see how my card performs.
 
@davedree, I hear you; it's hard to cross-reference fps from different websites. I've been wanting to update my PC for a year now, but looking at benchmarks from different sites gets me going crazy, because I shouldn't be beating the numbers they are throwing up, or even coming close. Since I've been on this site I heard about the AnandTech website, and it seems more realistic. The links you put up I can't believe; there is no way I can beat a 580, let alone a 680.

I just looked, and that's confusing: looking at that, it looks like you beat a stock GTX 680 with a 6950?!

That's not right, and I have noticed it myself: apparently similar setups from different sites sometimes don't match at all, and at times even similar benches run at different times by the same reviewer are way off each other...

One could argue it's down to different hardware or even drivers, but some look so far off it seems like numbers they have simply pulled out of their collective rear ends.

Strange goings on.
 
I'm not sure I believe anything AMD or nVidia say: AMD re: Bulldozer, nVidia re: Kepler being 2x as fast as the GTX 580... it's nowhere near that.

While it's true that it isn't close to 2x as fast as the 580, that's not what nVidia was shooting for. The infamous "GPU Roadmap" image from an old nVidia presentation showed a goal of 3x the DP GFLOPS per watt for Kepler compared to Fermi. It was a computational goal, not a gaming goal.

Their stated goal for gaming was 2x the performance per watt consumed. GK104 is about 44% better than the 580 in terms of fps per watt according to one review site (Tom's), but remember, Kepler isn't entirely rolled out yet. "Big Kepler", the GK110, might be even more efficient than GK104, but I don't know if they'll hit their goal of double the performance per watt.
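
For anyone unfamiliar with how that kind of figure is derived, the fps-per-watt comparison is just a ratio of ratios. A minimal sketch (the fps and wattage numbers below are made-up placeholders for illustration, not Tom's measurements):

```python
# Illustrative fps-per-watt comparison; the fps and power values are
# hypothetical placeholders, not data from any review.
fps_580, watts_580 = 60, 300   # hypothetical GTX 580 result
fps_680, watts_680 = 75, 260   # hypothetical GTX 680 result

eff_580 = fps_580 / watts_580   # 0.20 fps per watt
eff_680 = fps_680 / watts_680   # ~0.29 fps per watt

improvement = eff_680 / eff_580 - 1
print(f"GTX 680 fps-per-watt advantage: {improvement:.0%}")  # ~44% with these numbers
```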

The Kepler show isn't over quite yet, this was just the first act. Hopefully it keeps going well and the price drops a bit over time.
 
While it's true that it isn't close to 2x as fast as the 580, that's not what nVidia was shooting for. The infamous "GPU Roadmap" image from an old nVidia presentation showed a goal of 3x the DP GFLOPS per watt for Kepler compared to Fermi. It was a computational goal, not a gaming goal.

Their stated goal for gaming was 2x the performance per watt consumed. GK104 is about 44% better than the 580 in terms of fps per watt according to one review site (Tom's), but remember, Kepler isn't entirely rolled out yet. "Big Kepler", the GK110, might be even more efficient than GK104, but I don't know if they'll hit their goal of double the performance per watt.

The Kepler show isn't over quite yet, this was just the first act. Hopefully it keeps going well and the price drops a bit over time.

(With respect)


Perhaps, for it's just hot air: "we have a super duper end-all card in the form of GK110, but it's not quite ready yet; when it is, it will change the world... look at how good we are, now buy my T-shirt".

A large serving of salt with that please....:)

Companies posture and talk themselves up all the time; making bold claims is one thing, putting it on the shelves is quite another.
 
All I see on the shelves is a generally faster card that is cheaper by 10%, quieter, and less power-hungry by 10%. I don't care what marketing said... who promised how many jiggawatts per ounce... all we have IS what's on the shelves (see first sentence).
 
All I see on the shelves is a generally faster card that is cheaper by 10%, quieter, and less power-hungry by 10%. I don't care what marketing said... who promised how many jiggawatts per ounce... all we have IS what's on the shelves (see first sentence).

Yes, the last thing I want is to get into an argument over this. What I see on the shelves is no killer; what I see is something that is a little better and a little cheaper, and a little is not a lot to make up or beat.

What's on the shelves has not lived up to its hype.

It is better no matter how you look at it (we agree on that), yet far from all that... it's clocked much higher than its competition; it's just possible all they have to do is release one at similar clocks and drop the price, job done.
 
I guess I just didn't see this hype, or if I did, buy into it. Jeremy is right though on the nVidia slides (compute/GFLOPS is what I recall... but could be wrong).

Sounds like a misinterpretation of marketing may have caused inflated expectations for a lot of folks.
 
I don't recall it coming from nVidia marketing. I recall the "2x performance per watt" coming from leaks and leaked internal slides. Please correct me if I'm wrong (with source links if possible). It's possible the 2x performance per watt thing was an internal goal. Internal goals are typically aggressive in order to drive innovation and encourage engineers to consider everything they can. They're not typically public because they're not always met.

If the information Frakk is basing his criticism on was never meant to be public, I think it's more than a bit unfair to judge based on that metric. Now, that Italian PR guy with the "It'll be untouchable" or something to that effect statement before launch, HE certainly deserves a good slapping.
 