NVIDIA Launches GeForce GTX 680, aka Kepler

Good morning faithful readers! While we truly wish we had a review for you today (and indeed were promised we would have a card pre-launch at CES so we could bring you one), we are unfortunately left holding the short end of the stick today. That’s ok though, we’re working our hardest to get a partner card for you ASAP and have already been in talks about future, non-reference versions coming up this spring.

NVIDIA GTX 680 - Image Courtesy TweakTown

From an accidental posting of GTX 680s at Newegg last night, it seems the price will range from $500 to $535, depending on brand. This undercuts some models of AMD's HD 7970 by a decent margin (Newegg list), and a lot of indicators show it outperforming the 7970 and falling just shy of the dual-GPU HD 6990. This is a good thing for everybody; with any luck we'll have a price war and in the end consumers will win out. Better GPUs for cheaper? Yes please!

We’ve been told we’ll have a card very soon and rest assured we’ll bring you a review as soon as we can. Until then, have a look at some reviews published this morning.

If you do make the plunge (which you can now do at Newegg as of ~9:20AM EST), you will of course need the latest drivers. NVIDIA has come through with driver version 301.10, which is WHQL-signed.

Feel free to share more links in the comments!

Jeremy Vaughan (hokiealumnus)


200 Comments:

David's Avatar
I'm personally very frustrated with nVidia over this. Forgive the venting post, but we were told at CES to work with their partners to secure cards. We were promised a card by a partner. Then, a matter of days before launch, we're told we're not getting a card. We'll get cards post-launch, but we do our best to jump through the various hoops so we can be on top of new releases and do a proper Overclockers.com review of all the major hardware, and when we get knocked back like this it's very poor.
hokiealumnus's Avatar
Updated to 15 reviews and added Newegg link. ASUS cards are already out of stock. If you want one of these, you'd better jump now!

EDIT - David, you should check out the first page of TweakTown's review. We're not alone. Being in Taiwan, they managed to get their hands on one, but they did a pretty good job of spitting on NVIDIA's NDA.
Super Nade's Avatar
It seems there are a lot of areas other reviews have not had the time to look at, such as overclocking and the turbo-boost features of the card. I'm sure one of the benching team members will have one on hand and keep us posted. After all, we cannot control what companies do, and being a bit late to the party is not that big of a deal. People still consider us a trustworthy source and I'm sure our guys will be adding useful data to the pool.
David's Avatar
I see your point, and we will get a good review up.

However, our hardware reviewers are all (very busy) volunteers, who had set time aside to give Kepler a proper going over. So not only are we sat with no Kepler review, we don't have content that could otherwise have been prepared during that time.

There is a stark contrast between dealing with AMD and dealing with nVidia. The former are forthcoming, open, clear and honest. If nVidia had just said 'no' it wouldn't be so bad.

I'm just working myself up into a mood over this :-\
hokiealumnus's Avatar
Got it, thank you. Anandtech's already up. Ones we already have are:
Brolloks's Avatar
As I recall, Nvidia snubbed us when the GTX 580 launched as well; not sure why they don't like us.
That said, we have been very fortunate getting review samples, so we should not complain too much.
Hardin's Avatar
Now here is a card to get excited about, especially after the recent disappointing benchmarks for the new AMD cards.
Brolloks's Avatar
^ I don't see the 7970 as a disappointment; maybe I'm missing something. AMD is still pretty close to the latest Nvidia offering.
MattNo5ss's Avatar
Both HD7970 and GTX680 are beasts, definitely no disappointments from either side.
hokiealumnus's Avatar
You guys want overclocked results? How about from the king himself?


diaz's Avatar
Oh my god... 1848MHz... And people were bragging about the 7970 hitting 1800. Don't get me wrong, the 7970 is a fantastic card, but the 680 is just that notch above, and if it can OC just as much (using an EPower board), then that's INSANE.

My MSI 680 order is currently "processing" over @ NCIX.ca ... DAMN YOU, HURRY UP NCIX!!
hokiealumnus's Avatar
I doubt that was on the stock VRM. These are the two premier overclockers employed at EVGA.

Chances are it had an untouchable attached to the card.
azuza001's Avatar
Is it just me or does this card look like it's going to have severe memory bandwidth issues? I mean, just look at some of the reviews: at 1920 the card seems a lot faster than the 7970, but at 2560 and higher they're about the same. When you go below 1920 the card just gets so much faster percentage-wise.

Either way it's a great-looking card and will drive prices down, which is what I want.
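
For what it's worth, the paper specs back that hunch up. This is just napkin math from the published memory configs (256-bit @ 6008MHz effective on the 680, 384-bit @ 5500MHz on the 7970), so treat it as a sanity check rather than a benchmark:

Code:
# Peak theoretical memory bandwidth: bus width in bytes x effective transfer rate
def bandwidth_gb_s(bus_width_bits, effective_mhz):
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

for name, (bus, mhz) in {
    "GTX 680 (256-bit, 6008MHz effective)": (256, 6008),
    "HD 7970 (384-bit, 5500MHz effective)": (384, 5500),
}.items():
    print(f"{name}: {bandwidth_gb_s(bus, mhz):.0f} GB/s")

# GTX 680: ~192 GB/s; HD 7970: ~264 GB/s.
# The 7970 has ~37% more raw bandwidth, which would matter most at 2560 and up.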
hokiealumnus's Avatar
Heh, yeah; they had untouchables attached.


Brolloks's Avatar
That is just insane, thanks for sharing Hokie
EarthDog's Avatar
They just need to get that modded BIOS out so we can all reach those speeds. Without it, I believe the dynamic overclocking will step in and knock it back...
Ronbert's Avatar
If these things truly beat out dual 6990's, I may be looking to sell my 6950's as a pair and make up the difference for one of these babies. I guess time will tell.
nightelph's Avatar
1.4GHz on air achieved without add-on VRMs, freakin' awesome.
burebista's Avatar
My friend Ramiro tried something similar.
hokiealumnus's Avatar
Nice burebista; interesting they used an untouchable on air and that it survived 1.5V without issue. Thanks for sharing!

As far as our end of things, I've been told a card from EVGA was shipped yesterday. EarthDog purchased one from Newegg (a Gigabyte model) so we've got two of these beasts coming for review so far!
Fireside85281's Avatar
Correct me if I'm wrong, but wasn't this card, the GTX 680, supposed to be the GTX 670 instead, with a matching price point? But since AMD's cards were so lackluster in comparison, Nvidia thought they'd shift the classification of their new cards to more closely compete with AMD, by renaming their 670 to the 680, effectively raising the price of the same card $100.

This doesn't make Nvidia sound like they care about the customers at all...
Brolloks's Avatar
Impressive...I see the DVI stack has an odd shape so none of the existing 3-fan aftermarket coolers will fit the 680
muddocktor's Avatar
I'll respectfully disagree with you there, Fireside. This card performs better than the 7970, yet Nvidia has brought it in at $50 less on the MSRP than the 7970. The cards are direct competitors in performance, so I have no problem with it being called a GTX 680. But it might give Nvidia naming trouble down the road if they release higher-end Kepler cores by the end of the year.
Brolloks's Avatar
I suspect 685 will be the next single GPU Kepler
EarthDog's Avatar
+1.

They are a business. They are in business to make a profit. I have to agree wholeheartedly with mudd on this one. Doesn't matter what it was *supposed* to be; it is what it is, and for the price it's a pretty solid deal. I mean, it's the fastest single GPU for less than the 2nd fastest, consumes less power, is quieter, and is $50 cheaper.

@ Mudd - Why not the 7 series... wasn't the 580 a 'full' version of the 480? So if the big daddy does come out, it fits right in, no? Maybe a 685?

EDIT: Heh, Brolloks.
hokiealumnus's Avatar
As I understand it (not from NVIDIA, just from reading around) GK110 isn't finished yet, so they couldn't release that. AMD was kicking their prior gen in the rear-end and they had something that could beat it at a better price point. Thus, GK104 was released at a lower price.

AMD has reportedly said they aren't sweating and can beat this (with one GPU), so it was actually smart business not to tip their hand with the full GK110 beast.

It might not be the best for the consumer, but these guys exist to make money. Why release something that puts the competition squarely in their rear-view and charge $600 when you can beat it by a little bit and charge $50 less than the competition at $500?
xsuperbgx's Avatar
I spent my IB money... I should have an EVGA version here in a few days too.

My first time ever buying anything on launch day!
JeremyCT's Avatar
The GK110 wasn't anywhere close to done. It just taped out in January/February, if you believe the rumor mill. The GK104 wasn't released simply because it was competitive; it was released because it's what they had available. What it was "supposed" to be, or what chip it follows up on, is irrelevant. Nobody here was in the nVidia board room during planning, so nobody knows. Just because the GPU number makes it coincide with the 560Ti doesn't mean it was initially intended to be a 660Ti.

Fact is, it is what it is, and it's what nVidia brought to market. It's a compelling product with good thermals, moderate power consumption, and very good performance at a price point that's extremely competitive. Given its relatively small die size, the price should stay competitive for quite some time. Neither nVidia nor AMD have any obligation to sell their wares for any given percentage over their cost. All these companies have shareholders, and shareholders tend to demand profits, so the products are priced at a level they think the market will bear. I wish this stuff were less expensive too, but reality is reality, and if wishes were fishes we'd all cast nets, y'know?

I tend to think that the GK110 will be a 700-series card, but with the naming shenanigans of recent GPU generations, nobody can really say for sure. nVidia might go to the 7 series just to catch their numbering up to AMD. Time will tell.
nightelph's Avatar
Good for you! Exciting, eh? I just came across a pic of the last 68 series card I bought for $500 on launch day:

hokiealumnus's Avatar
Our first victim has arrived. Please excuse the cell photo.



Unfortunately we have visitors this weekend and I can't touch it until next week, but rest assured it will be tortured as much as I can manage when the time comes!
davedree's Avatar
The thing is, prices for the 680 over in the UK (dependent on model spec) are the same as or higher than a 7970. Average prices:
GTX 680 = 424 GBP; 7970 = 400 GBP

Secondly, as seen on Overclockers UK, there's quite a stir about how the GTX 680 compares to a 7970 in reviews. Comparing a standard 680 against a standard 7970, yes, the 680 will win. But as we know, the clocks on the 7970 are artificially low. Yes, there are reviews of overclocked vs overclocked, and I'm not going to debate which card is the best, because quite honestly it's sometimes one or the other depending on the game.

My issue is this: people misquote the 680 as being faster than the 7970 as if it's faster in every game. This is mostly true when the 7970 is standard, but in some games, stock for stock, the 7970 wins. Secondly, fair play to Nvidia for making a card which is their midrange offering but can equal, and sometimes beat, a 7970 depending on the game.

The GTX 680 really is impressive and I congratulate Nvidia on everything bar the price (UK price).

The 7970 is overpriced and so is the GTX 680. Everyone in the UK knocked down the 7970 as underwhelming and expensive for the performance it offers. Yet the GTX 680 is so close in performance and price, but it seems to be acceptable because it's Nvidia?
I really can't understand this. Neither can I understand people saying that the 680's release should drive down the price of the 7970s.

On Overclockers UK there's a guy called martini who is going to prepare a game bench test between an overclocked 7950 and a 680.
So when I get home from work I look forward to his findings.

Finally, I have an unlocked and overclocked 6950. The games I've noticed that have struggled a bit have been Crysis 2 with the DX11 extra pack; BF3 struggles a little bit on multiplayer, but I turn down the MSAA; and the other is Metro in DX11, not too great at all.

A 680 or 7970 does offer almost double my 6970's performance, but in Metro both cards are still way below 60fps.

It depends on how games are made in the future as to how well these cards' architectures will perform, but if games are going to be more taxing like Metro, then I can see both of these cards struggling on future titles.
EarthDog's Avatar
But that's how you compare... stock vs stock. Sure you can overclock a 7970 to the moon, but is it enough to beat an overclocked 680 (which also clocks well, though maybe not as far as the 7970)? Is it worth the extra $50 (in the US), power consumption, and noise to find out? That's up to the user.
davedree's Avatar
I agree that's how you compare, stock for stock, and this is AMD's fault for clocking them so low. It depends on the game as to where the cards battle it out vs their clock speeds.
For myself, I won't be buying either; I'm happy with what I've got. It's going to be interesting with round 2 of Nvidia/AMD's offerings.

I can't see them offering a BIOS flash upgrade to raise the clocks for the average user who doesn't know how to overclock.
So it will be interesting how AMD play their next card.
Speculation is a 7990/their usual dual-GPU malarkey, or could they make a 7980 and clock it and work it a bit?
EarthDog's Avatar
I don't think I have witnessed that sentiment (acceptable because it's Nvidia). There are a couple of games the 7970 wins, but the rest the 680 takes (read: AnandTech review, Tom's, TechPowerUp).

In reading the Anand review, it beats out the GTX 580 by an average of like 30-40%. That's big. Prior to its release, the GTX 580 was selling (in the US) from $400-$530 (non-watercooled). Now it's $360-$500 (non-watercooled).

I don't understand this... I would imagine it would catch up and possibly beat it. BUT (like fractions), do to one side what you do to the other. When you overclock the 680, I would imagine it would still beat out the 7950.
David's Avatar
I think performance and overclockability are two separate issues, personally.

Stock performance is important - it represents what any non-faulty card is capable of.

Overclockability has two parts to it: how far? and how likely? This is something we'll not properly see until the cards have been out for longer. Is every card equally overclockable? Is it a case of just getting lucky? How many of each card are people like kingpin binning? Did he bench just those four or did he chuck out two, seven, thirty-eight cards before them?

Overclocked vs overclocked is an interesting facet, but I think stock performance is an important variable. Any card will run stock, but we don't know yet necessarily how good the cards are for overclocking generally.
manu2b's Avatar
Mmm... I am waiting for IB to launch for my next rig, and am still wondering: 7970 or 680?

As has been said already, it seems that the 7970 overclocks somewhat higher than the 680, and this is OCF, no?

IF the 7970 price drops by $/€50-100, I think I will go AMD. If not, it will be the 680 for me...
LZ_Xray's Avatar
Sounds like the good old days again. We all sit and /popcorn + /profit watching the two titans punch each other in the face. They make their zillions and we get cutting edge technology for relatively cheap. Capitalism the way it should be.
davedree's Avatar
You need to look at the Overclockers UK forum then, lol, it's proper swinging handbags!
Like I said though, the GTX 680 reference card stock is 1066 core / 6000 mem, and core boost is around 1110 +/- depending on TDP and temperature.
In comparison, the 7970 reference stock in reviews is 925 core / 5500 mem. So quite a big difference. As I'm sure you're aware, it'd be easier to do our own reviews on the hardware we own, but I can't afford a 7970 and a 680 to compare.

Please don't think I'm trying to derail this thread, by the way; I'm still working through reviews and findings on the performance difference between the two cards. The problem with different review sites is they all have different hardware, i.e. CPU, RAM etc., and different ways to measure.
Like TechPowerUp sometimes don't use 4x MSAA in some benches, where AnandTech do.

For the 1536MB 580's (aircooled), the prices haven't really changed here in the UK, at around 320-340 GBP.

The only reason I brought up the 7950 is because at its average price of 330-350 GBP it's around 80-100 GBP cheaper than a 7970 or a 680.
Martini currently owns a 7950 OC, and he has just bought a 680. He is going to put them back to back. Of course the luck relies on the silicon overclocking, but hopefully he'll be able to work out the max OC of both cards. The end result will be how much of a punch the 7950 offers for the money.

This is only a snippet of info I was able to obtain so far, as I'm at work, but on Guru3D it shows how much the 7970 scaled in 2 games from overclocking.
I'm not biasing this info; it's just the only info I have found at the moment.


Summarised Crysis 2: 1920x1200
DirectX 11, High Resolution Texture Pack, Ultra Quality settings, 4x AA
Level - Times Square (2-minute custom timedemo)

Standard 7970: 61 fps
ASUS DCII OC (1000/5600): 66 fps
ASUS DCII OC'd (1250/6000): 76 fps
Standard 680 (1006 base / 1058 boost, 6000 mem): 63 fps
680 OC'd (1264 core, 6634 mem): 70 fps

---------------------------------------------------------------------------------
Alien vs Predator: 1920x1200, 4x AA, 16x AF

Standard 7970: 55 fps
ASUS DCII OC (1000/5600): 61 fps
ASUS DCII OC'd (1250/6000): 71 fps
Standard 680: 52 fps
680 OC'd: 58 fps
-------------------------------------------------------------------------------------

Yes, this is only 2 games; I'm well aware of the potential of the 680 in BF3 and other games vs the 7970. This is just to prove how much the 7970 can scale when overclocked, and how benchmarks showing the standard 7970 vs the 680 could show different results.

Here are the links for my summary:
http://www.guru3d.com/article/asus-r...u-ii-review/23
http://www.guru3d.com/article/geforce-gtx-680-review/25
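
If you want the scaling argument in plain numbers, here's the simple percentage maths on the fps figures I quoted above (nothing clever, just (oc - stock) / stock):

Code:
# OC scaling from the Guru3D figures quoted above
results = {
    "Crysis 2": {"7970": (61, 76), "GTX 680": (63, 70)},
    "AvP":      {"7970": (55, 71), "GTX 680": (52, 58)},
}
for game, cards in results.items():
    for card, (stock, oc) in cards.items():
        gain = (oc - stock) / stock * 100
        print(f"{game}, {card}: {stock} -> {oc} fps (+{gain:.0f}%)")

# The 7970 gains ~25-29% from overclocking; the 680 only ~11-12%.
# That's exactly why stock-vs-stock and OC-vs-OC paint different pictures.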


Like I said, if I owned both cards I'd bench them with a wide variety of games, but it's hard to cross-reference different review sites by their fps because the results differ, just like my PC would from someone else's.
Frakk's Avatar

I'm not sure I believe anything AMD or nVidia say. AMD re: Bulldozer / nVidia re: Kepler being 2x as fast as the GTX 580... it's nowhere near that.

But the GTX 680, while definitely faster than the 7970, is only about 10% faster overall; that's far from a killing...

The GTX 680 is running a much higher stock clock than the 7970: 925MHz vs 1060MHz.
If you take the performance deficit of the 7970 and attribute it to the lower clocks, it adds up in theory.

As for the load power draw, it's about the same.

The 7970 is the only AMD GPU of the 7### line with no GHz version.
Add to that they have 3 months to refine their 28nm architecture. Now look at the 7870: it's just about as fast as a 7950, has lower specifications, and draws notably less power.

I would like to see these two titans fight it out clock for clock in the widest possible range of games and resolutions. I think the 7970 might just get close to the GTX 680, or catch it, if clocked at 1060 with the memory also clocked to the same...

On this occasion I can easily believe AMD when they say they are not intimidated by the GTX 680; I would even believe them if they said they already have what it takes to at least match it.

I think both these guys have more to come; this is just a warm-up.

I'm not counting AMD out, we might even hear from them soon.
QuickFast's Avatar
@ davedree I hear you, it's hard to cross-reference fps from different web sites. I've been wanting to update my PC for a year now, but looking at benchmarks from different sites gets me going crazy, because I shouldn't be beating the numbers they are throwing up or even coming close. But since I've been on this site I heard about the AnandTech web site; it seems to be more realistic. The links you put up I can't believe; there is no way I should beat a 580, let alone a 680.
davedree's Avatar
Frakk, the 680 doesn't run at 1060; that's its minimum base clock. It boosts on average up to 1110+, more if there's room.

OK, stock for stock the 680 looks a winner.
But it's early days; I want to see a good OC'd 680 vs an OC'd 7970,
in a wide variety of games.
Frakk's Avatar
What are you comparing that to?
Frakk's Avatar
Ah... okay, that's even more... I thought it was 1057 or something; don't ask me where that comes from, it's too late in the day...
QuickFast's Avatar
@ Frakk I'm comparing it to davedree's links
EarthDog's Avatar
Stock clock is 1006MHz with 9 total boost steps up to 1100MHz. BUT it seems 1100MHz isn't sustainable because of temps... read on:

http://www.anandtech.com/show/5699/n...x-680-review/4
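
If it helps to picture the stepping, here's a toy sketch of the behavior. The small-bin granularity comes from AnandTech's write-up, but the bin count and the temperature rule are my simplifications, so don't take this as exactly how GPU Boost works internally:

Code:
# Toy model of GTX 680 GPU Boost stepping (illustrative, not NVIDIA's algorithm)
BASE_MHZ = 1006   # advertised base clock
BIN_MHZ = 13      # approximate boost-bin granularity per launch reviews
MAX_BINS = 8      # ~1110MHz top boost state seen in reviews

def sustained_clock(temp_c, has_power_headroom):
    if not has_power_headroom:
        return BASE_MHZ            # power-limited: knocked back to base
    bins = MAX_BINS
    if temp_c >= 70:
        bins -= 1                  # sheds a bin as temps climb, per reviews
    return BASE_MHZ + bins * BIN_MHZ

print(sustained_clock(65, True))   # 1110 - full boost
print(sustained_clock(72, True))   # 1097 - one bin dropped on temperature
print(sustained_clock(72, False))  # 1006 - power-limited back to base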
Frakk's Avatar
OK thanks, I will look them over tomorrow to catch the fullest of what you're getting at.
davedree's Avatar
It's a game I've yet to play. Your results look great; I think I'll have a play with this game and see how my card performs.
Frakk's Avatar
I just looked, and that's confusing; looking at that, it looks like you beat a stock GTX 680 with a 6950?!?!?!

That's not right, and I have noticed it myself: apparently similar setups from different sites do not match at all sometimes. At times even similar benches from different times by the same reviewer are way off each other...

One could argue it's down to different hardware or even drivers, but some look so far off it seems like numbers they have simply pulled out of their collective rear ends.

Strange goings-on.
JeremyCT's Avatar
While it's true that it isn't close to 2x as fast as the 580, that's not what nVidia was shooting for. The infamous "GPU Roadmap" image from an old nVidia presentation showed a goal of 3x the DP GFLOPS per watt for Kepler compared to Fermi. It was a computational goal, not a gaming goal.

Their stated goal for gaming was 2x the performance per watt consumed. GK104 is about 44% better than the 580 in terms of fps per watt according to one review site (Tom's), but remember, Kepler isn't entirely rolled out yet. "Big Kepler", the GK110, might be even more efficient than GK104, but I don't know if they'll hit their goal of double the performance per watt.

The Kepler show isn't over quite yet, this was just the first act. Hopefully it keeps going well and the price drops a bit over time.
Frakk's Avatar
(With respect)

Perhaps, but it's just hot air: "we have a super-dooper end-all card in the form of GK110 but it's not quite ready yet; when it is, it will change the world... look at how good we are, now buy my T-shirt."

A large serving of salt with that, please...

Companies posture and talk themselves up all the time; making bold claims is one thing, putting it on the shelves is quite another.
EarthDog's Avatar
All I see on the shelves is a generally faster card that is cheaper by 10%, quieter, and less power hungry by 10%. I don't care what marketing said... who promised how many jiggawatts per ounce... all we have IS what's on the shelves (see first sentence).
Frakk's Avatar
Yes, the last thing I want is to get into an argument over this. What I see on the shelves is no killer; what I see is something that is a little better and a little cheaper, and a little is not a lot to make up or beat.

What's on the shelves has not lived up to its hype.

It is better no matter how you look at it (we agree on that), yet far from all that... it's clocked much higher than its competition. It's just possible all they have to do is release one at similar clocks and drop the price; job done.
EarthDog's Avatar
I guess I just didn't see this hype, or if I did, buy into it. Jeremy is right though on the nVidia slides (compute/GFLOPS is what I recall... but I could be wrong).

Sounds like a misinterpretation of marketing may have caused inflated expectations for a lot of folks.
JeremyCT's Avatar
I don't recall it coming from nVidia marketing. I recall the "2x performance per watt" coming from leaks and leaked internal slides. Please correct me if I'm wrong (with source links if possible). It's possible the 2x performance per watt thing was an internal goal. Internal goals are typically aggressive in order to drive innovation and encourage engineers to consider everything they can. They're not typically public because they're not always met.

If the information Frakk is basing his criticism on was never meant to be public, I think it's more than a bit unfair to judge based on that metric. Now, that Italian PR guy with the "It'll be untouchable" or something to that effect statement before launch, HE certainly deserves a good slapping.
JeremyCT's Avatar
GeForce GTX 680, Part 2: SLI, 5760x1080, And Overclocking

http://www.tomshardware.com/reviews/...ound,3162.html

Haven't read it yet, going to now!
diaz's Avatar
Not sure why people compare clocks, 7970 vs 680... They're different cores, and the 7970 has 25% more of them. It's really a moot argument. It really comes down to any of these three:

1. Raw stock performance / price
2. Maximum overclockable performance with stock cooler / price
3. Performance / watt

Then there are further breakdowns:

1. Performance / feature (resolution / AA / tessellation / DX11)
2. Performance / game (specific games you play)
3. Performance / SLI or Crossfire / multi-screen setup scaling

It can go any way, but it really comes down to what you pay for the stock performance you get. To get a factory-OC'd 7970 you generally pay more; at that point it starts matching the 680, for $100 over the 680.
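
Point 1 is just division, for what it's worth. Here's a rough sketch using the ballpark launch prices (~$500 vs ~$550) and the stock Crysis 2 numbers quoted earlier in the thread; swap in the average fps from whichever review you trust:

Code:
# fps per dollar, stock vs stock (illustrative numbers, not a verdict)
cards = {
    "GTX 680": {"price": 500, "fps": 63},
    "HD 7970": {"price": 550, "fps": 61},
}
for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['price'] * 100:.1f} fps per $100")

# ~3% faster and ~10% cheaper compounds to ~14% more fps per dollar,
# which is why the pricing matters as much as the benchmark wins.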
manu2b's Avatar
+1

BUT: I believe that for us overclockers, the clock improvement margin matters as well.

And, marketing-wise, it is maybe the first smart AMD move in ages. Sell the 7970 at a high price for 3 months until the 680 launch, then drop the price and launch an improved 7970 that equals the 680...

That's one extra quarter of profit!
Badbonji's Avatar
Just been playing BF3 for a while this morning. Temps are reasonable; 70C (~25C room temp) seems to be the limit, because it drops down one boost clock multiplier at that point, which prevents it from going over 70C. The game runs very smoothly with everything on ultra, even on the larger maps. Around 1700MB of VRAM is in use, but it varies map to map. Definite improvement over my HD 5970 - the 1GB of memory wasn't sufficient to avoid FPS drops in firefights.
Frakk's Avatar
That's fine for your average Michael; on Overclockers.com we are not your average user.

To try another way to explain what I'm getting at, I will use a set of scenarios.

A:
I'm buying a new GPU today in this range and have never heard the word overclocking.
I will buy the GTX 680.
Why: cheaper / better performance / better drivers.

B:
I am an overclocker.
Clock for clock they perform similarly right up to their highest clocks on stock cooling.
I will buy the GTX 680.
Why: cheaper / better drivers.

C:
Clock for clock they perform similarly right up to their highest clocks on stock cooling, and the 7970 comes down to the same price.
I will buy the GTX 680.
Why: better drivers.

D:
The 7970 outperforms the 680 overclocked; the 7970 still costs more.
Could go either way; depends on how much the 7970 outperforms it.

D1:
As above, but the 7970 costs the same.
I will buy the 7970.
Why: better performance for my money.

E:
AMD release a GHz version and price it the same as the GTX 680; it performs the same.
I will buy the GTX 680.
Why: better drivers.

E1:
The 7970 GHz version outperforms the GTX 680.
I will buy the 7970 GHz.

I would not base my decision on one reviewer; I would look at all of them to get the full picture, especially if some reviews are incomplete.

My decision would be based on a wide range of games, the full power consumption tests, the full range of resolutions on all games tested, and the full range of overclocking, again on all those games.
freeagent's Avatar
^^

Is your avatar crooked or is it just me?
zitnik's Avatar
Lol, I really want one of these, but I think I'll hold on to my SLI 570s.

They outperform a single 680, so there's no point in getting rid of them and upgrading to the 680.
JeremyCT's Avatar
lol, it seems to have rotated.

That's usually the way of it from one generation to the next. In order to beat my SLI 260s, I would've had to buy a GTX580. GTX570 would just match the performance. Granted, it would do it with less heat produced, more features (DX11), and with a single card instead of two, but once you go SLI and get used to the performance it offers, it's kinda tough to go back to a single GPU type of setup.
Frakk's Avatar
I photoshopped it that way, looks amusing to me
hokiealumnus's Avatar
I had like 30min, so I installed the card and ran a couple benchmarks. This thing is a beast. It also looks to be just under an inch shorter than an HD 6970, so it's big power in a small package. No photos yet. For one of the first times ever I chose to run it before taking photos. Time is precious.

These are run bone stock with a 2600K (2133 RAM) & GTX 680.



That graphics score beats the stock 7970 by 3724 marks.



This one beat the stock 7970 by 2430 marks.

Wow.
manu2b's Avatar
Lol, my OC'ed 5830's equal a 7970 or an OC'ed GTX 580 (http://3dmark.com/3dm11/2729853), but my next rig will be either 7970- or 680-powered: an OC'ed IB and an OC'ed 7970/680 will run on a 550W PSU. I need at least 750W for my current PhII @ 4GHz and the two 5830's...
Badbonji's Avatar
Why does it say PCI-e @ 1.1 on your GPU-z?
Bobnova's Avatar
Some modern GPUs downgrade the PCIe connection at idle to save power, then ramp it back up to 2.x or 3.0 under load.
Badbonji's Avatar
The GPU does this? In my GPU-Z it says it is always running in 2.0 mode :s

Seems to overclock quite well; I've left mine at +150MHz for now and it has been running well in BF3 and on CUDA units (I cannot seem to get 2 instances to run yet):

bmwbaxter's Avatar
Does your CPU support PCIe 3.0? If not, then that is why; unless you are running SB-E, 2.x is all you are gonna get. More of a problem for GPU compute stuff than it is for gaming.
Badbonji's Avatar
That makes sense; I just got a bit confused because it sounded like the GPU was responsible. I am waiting for Ivy Bridge before finally upgrading my CPU, still on LGA 1366.
QuickFast's Avatar
Nice scores hokie, the GTX 680 is a beast. Can we see some game benchmarks, and what was your physics test score in 3DMark 11 on your i7?
diaz's Avatar
That's a nice OC, seems pretty easy..
St Alban's Avatar
Hey guys, I was thinking of getting a 680. I have an i7 2600K processor; I shouldn't have to worry about bottlenecking for this generation of processors, right? Also, anyone know any news about the 670 release date and price point? I may go that route if they release before Guild Wars 2.
Badbonji's Avatar
You won't have anything to worry about with that processor! Have you got it overclocked? That will reduce any possible bottleneck by the CPU! I don't feel bottlenecked and I am using a Core i7 965 (sig).
dejo's Avatar
@bad
How are the benches on the GTX 680 in '01, '03, '05 and '06?
diaz's Avatar
That CPU won't bottleneck anything, other than maybe some specific CPU-intensive games, but even higher-end CPUs will still have difficulty in the same games. If anything, the GTX 680 further removes CPU dependency by optimizing for DX11. The 680 is the king of the hill with DX11; it LOVES DX11.
St Alban's Avatar
Thanks for the help guys. I haven't OC'ed my CPU yet... waiting on getting a GPU. I probably won't mess with it for a few years, until I get a new case and a second 680 to SLI. Now if only I could get some money and they were in stock!
Bobnova's Avatar
It all depends on what you mean by bottleneck. There isn't a single GPU benchmark that I am aware of that doesn't benefit from OCing a 2600K from 5GHz to 5.5GHz.
Does that mean it's bottlenecked? Hard to say IMO.

In any case you won't find a faster CPU till IB comes out anyway, so don't worry about it. (Technically SB-E might be faster, but if the SB is OC'd it will probably be OC'd further than the SB-E due to heat, which makes up for the slightly lower clock/clock performance)
St Alban's Avatar
I think I would need a different motherboard to OC; I currently have an ASUSTeK CG8350 (LGA1155).

What's the noise quotient on the 680s? Do they run cooler and quieter than the 580s?
Badbonji's Avatar
I haven't got them installed currently; I also had to reset the CMOS on the motherboard just to get the PC to boot after switching cards, so I am back at stock CPU speeds for now. I won't have much chance at scoring well on those compared to people with 2600K's.
diaz's Avatar
Benches, sure, but if you are talking real world, only certain games would benefit from anything more than a stock 2600K... and if they do, it's not by that much.
thobel's Avatar
Any word on the EVGA GTX 680 Hydro Classifieds?
Super Nade's Avatar
Are any of you guys having problems with the staggered power pins? Is it easy to disconnect the power cables?
hokiealumnus's Avatar
Not at all; it's surprisingly simple. They're staggered enough to make it easy.
Ivy's Avatar
Maximum CPU is totally dependent on the system. On a system with low heat capacity, an SB-E with a slight stock-volt OC is more powerful than a volt-raised OC'd K-series (it can't handle that heat), even for gaming. Although nothing can be built that cheap and provide so much power for the bucks as a K-system. I still think that the future will be 6 cores, and as soon as a game uses them well it's very hard to catch up with 6 cores on a 4-core CPU, because every core would have to work 33% faster. Mostly it only works out because most games don't support 6 threads very well, but that behaviour may change. So finally, systems with high heat capacity ("high" meaning enough to allow for a non-heat-capped OC) can always push further using SB-E. Systems with no heat capacity will get further too. The best use for 4 cores is mainstream systems with average heat capacity, or systems which have to be cost-effective. On IB-E the heat issue will be smaller, so the smaller systems will be at an advantage, and IB-E might become volt-OC'able everywhere except laptops.

Until now, there hasn't been a single game able to push a Nehalem-E, SB-E or equivalent to its very limit as long as only a single GPU is used at 1080p or above. The GPU is the limiting factor below 60 FPS. Everything above that makes no sense to me.

I never heard of such a thing as DX11 CPU dependency. The elements implemented in DX11 are pretty much a GPU-only matter. As far as I can remember, the CPU has never been of such low importance in the whole history of gaming, because pretty much any low-cost CPU nowadays (an i3, a $150 CPU, is very sufficient) is enough for gaming. That was not the case in the past; I paid much more for a C2D which was never truly sufficient, though more sufficient than any single core (compared to price).
Seebs's Avatar
@ Hokie...

Now that you have the card in your hands, would you mind doing a bit of checking...

- What does the nVidia control panel show it as? Seems the guys at TPU got one that shows up as a 670Ti, and it'd be nice to know if it was an isolated thing, or if nVidia really decided to rebrand the lower-end tier card.

- Is there any physical evidence on the card itself? Seems an early sample card came out with GTX670Ti engraved on the cooler shroud... It's just morbid curiosity, but I'd like to know if any production units slipped through the cracks with this.

Here's the link to their article: TPU Article



Seebs
hokiealumnus's Avatar
I'll check the NVIDIA control panel as soon as I'm able. Frantically trying to finish up a motherboard review so I can properly test this thing. Unfortunately that includes re-installing Windows for SSD cache testing.

The shroud definitely doesn't have GTX670Ti etched into it, unless it's under the EVGA sticker and is so shallow it can't be felt through it.
Badbonji's Avatar
I checked and mine says GTX 680 like normal. Must've been on the early samples rather than any retail batches.
Bobnova's Avatar
You can OC a 2600K rather significantly before you hit the same heat output as a stock SB-E.
People have been saying "wait till stuff uses more threads, then you'll see!" for most of a decade now; you really have to look at how things are now, rather than how they might be "soon".
Ivy's Avatar
I just hope you don't make the same mistake as those people who were telling me (in 2007) that it was a complete waste getting a 4-core CPU. I got myself a C2D, and nowadays that has turned out to be a huge failure. 4 cores are so much more useful for gaming, and even a very old quad core can still perform well, which can't be said for the C2D anymore; the era of the C2D is completely over. Just try to catch up with reality: most consoles already have 6 threads or cores, and AMD CPUs have 6+ cores or threads, even the cheapest ones. Only Intel is trying to break the rule with "low core" stuff, and that's probably not enough to stand the test of time.
Janus67's Avatar
The xbox has 3 cores. I think the one in the PS3 (the Cell processor) is either 6 or 8, but is [as far as I know] only used in a PS3 and its computational power isn't close to what a current day quad can do.

2007 was 5 years ago, and at that time a dual core was pretty much all that was needed. I had my E8400 until 2008 when I built myself an i7 920 system (which I ran with HT off most of the time for less heat/more overclockability)
Bobnova's Avatar
Nothing you could buy in 2007 stands up very well at all in games now, regardless of the number of cores. Quads stand up better, but still don't do very well.
As Janus pointed out, modern consoles don't run very many cores either. The Cell processor is a fairly unique beast.
AMD's cheapest CPUs are a single thread and single core. Next up are doubles, then triples, then quads, then finally six cores, well above the bottom level.
EarthDog's Avatar
It's 2012. That information, 5 years ago, was correct.
Frakk's Avatar
The 360 has a 3-core IBM 3.2GHz CPU and an ATI GPU.

It looks like the Xbox 720 will use an AMD Fusion Bulldozer-variant APU with a 76## series on-die GPU. Performance-wise that will bury the current 360. This is a more understandable hardware performance barometer; it gives you some idea of how game consoles compare to gamer PCs.

Now if you take the Xbox 360 version of BF3, you will find that it's running at about 1280 res, with no AA of any kind, no Vsync, and set at the minimum graphics settings.

And I can see it on my brother-in-law's 360: there is graphical tearing with every movement, fine detailing does not exist, and the maps are all full of graphical tiling. He thinks it looks awesome... I just agree, as I haven't the heart to tell him it looks horrible.

You can't compare game consoles to PCs... simply because they just don't compare.
GoldenTiger's Avatar
Back when quads were new, I always recommended people get them since they would hold up better over time and offer more performance right away. Same thing when dual-core CPUs were new... each time, people didn't want to believe it. *shrug* Indeed, 5 years ago that info was correct.
Frakk's Avatar
CPUs with many cores (anything over 4) only come into play when more than 4 cores are being used.

Today few things use more than 4; some encoding and rendering software does.

Of course, that's not much help if the core-for-core performance is less than other CPUs with fewer cores.

AMD vs Intel is the classic example: clock for clock my Phenom II can beat an i5, but only if it's using all of its 6 cores. Highly threaded computational benchmarks show that, yet today in the real world it's almost never using that power, unless you're running a Linux OS.

In gaming the work is offloaded onto the GPU; it takes a lot of GPU for the (little by comparison) CPU work to fall behind, at least for a higher-end CPU from any brand.
Owenator's Avatar
After reading Guru3D's SLI testing I think I'll wait. I wanted to replace my two GTX 570's in SLI with a single GTX 680, but my GTX 570's in SLI are faster. So until I can afford two cards I'll pass. Also, I would have liked to have seen more VRAM; 2GB is not really enough for high-end games at 5760x1080 surround.

Maybe a "GTX 685" could be a single-card replacement for my current SLI cards.
EarthDog's Avatar
While it's not faster than two 570's, the 2GB of VRAM is seemingly not a limit at this time...

http://www.tomshardware.com/reviews/...nd,3162-5.html
Ivy's Avatar
AMD's 6+ cores are barely more expensive than an egg for lunch compared to Intel SB-E, which costs a huge fortune. A high-end user usually gets the 6+ core AMD as far as I know; anything different is foolish. Regarding the Xbox, it's 3 cores but 6 threads (resulting in almost the same behaviour as a 6-core). A 6-core SB-E is 6 cores and 6 threads without HT. The PS3 is 6 active cores, with 1 of them used as a backup.

A quad core at OC is still able to run ANY game nowadays. "Good" is a relative term, because for an OCer good means 200 FPS and for a standard user it would be 40-60 FPS. But the impact is still not as high as it seems; quads can still hold up well in most games.

I always think 4-5 years in advance because I'm usually not going to replace a PC's CPU in that time. In the case of the C2D, it got nasty in the last 1-2 years of its lifetime (outdated 2010+).

Besides, yes, there are games using 6 cores:
- EVE Online (supports an almost endless number of cores as far as I know; in some tests, an E-type had a very balanced load across all 6 cores)
- WoW (a high-core AMD CPU is a huge gain)
- Supreme Commander (close to endless core support)

Games which benefit from 6, and aren't fully tuned for it but are still at an advantage (off-loading):
- Civ 5 and many more.

Possible to support *insert amount here* cores? Yes. Why not use it? "We are lazy developers and only support Intel."

As long as you're only playing 1080p, 2GB isn't too little. Higher than that, and Eyefinity, and I would not feel too confident in some games in the near future. Future-proof (which usually means 5+ years to me) clearly means 3GB+. RAM is funny stuff: as long as it's sufficient, it's not noticed. As soon as it's overused, it will be noticed more than any other part...

Remember: the Xbox 720, PS4, and Wii U are soon to be released, and PC demand will soon rise by a good margin, as it usually reacts to that, because much of the stuff is an improved port with much better graphics and finally exorbitant RAM use.
Janus67's Avatar
SB-E has hyperthreading so effectively runs 12 threads, where did you read/hear that it didn't? SB-E http://www.newegg.com/Product/Produc...82E16819116492 $600


Of course the C2D was outdated in 2010, we were a generation and a half forward by then in CPU power including the C2Qs and the 1156+1366 sockets.

The problem is expecting a gaming computer to still be powerful enough to play games at high detail and good FPS in 5+ years. I personally upgrade pretty constantly [both for benchmarking for HWBot Boints and for having something new to play with]. I also don't see much reason to buy an SB-E system unless you are doing some encoding that will use 12 threads; you're better off with a 2500K or 2600K for the near future.
Ivy's Avatar
?!?! You can disable HT; I know of no gamer who enables HT. HT is for applications outside gaming which will actually benefit from it. 4-cores have HT too; they will have 8 threads, but gamers usually disable it.

The Core i7-3930K is not an SB-E, that's a K-type. Although, for gaming it would probably perform the same, so I would say it's the best deal when a gamer feels the urge to go 6 cores.
Frakk's Avatar

Yes, I agree with you. EVE Online, because of its VAST computational matrix, especially benefits from the 6 cores; I have played it myself...
As a game, when it reads what your system specs are and sees the 6-core Phenom, its reaction is along the lines of "wooooo yummy....... "

But like Linux, these things are raw-CPU-power techy-type stuff where the AMD shines, and they are not about to hit the mainstream market. As you say, it's too much, too difficult and too time-consuming for the main market to get into.

Having bought the AMD 6-core thinking games would become more and more complex, with bigger and bigger AI matrices, computing more and more strings simultaneously... it's not happening.

Games are becoming smaller, more reactive and less proactive, less intelligent, with more emphasis on graphical elements... thus the CPU becomes less important and the GPU takes over.
Janus67's Avatar
I don't see what your argument is in the post that I quoted. You say that SB-E is 6c/6t with HT turned off... which is true, but I don't see what the point is? The point is that it can do 12t if it is desired by the user, AMD CPUs cannot.

Lastly, these last 10 or so posts have pretty much nothing to do with the Kepler lineup.
I.M.O.G.'s Avatar
On what are you basing these statements? Most gamers likely leave CPU settings at default, which would mean HT is enabled. A well informed gamer may disable HT if it doesn't benefit their games, however it would be a safer assumption to assume that is rare.
Owenator's Avatar
I read that too, and call me a snob, but 50-60 FPS @ 5760x1080 4xAA in BF3 or Metro 2033 is just not enough to get me to run out and buy two.

I went through my Nvidia Surround phase. BFBC2, Just Cause 2 and other games looked and ran great. But I pretty much gave up when BF3 came along. I could get two or three of the 3GB GTX 580's, but that's very expensive and uses a lot of power.

These cards do look like a step in the right direction, don't get me wrong. I just wish they came with 4GB of RAM for the high eye-candy settings. I think that is what is holding them back. But drivers may need optimization and such, so I hold out hope. I'll keep saving my $$ just in case!
Bobnova's Avatar
4GB cards ought to be coming, I would be fairly surprised if they didn't, really.


A 2600K with "only four cores" and the apparently useless hyperthreading stomps Thubans in everything.
Every. Thing.

If the game only uses four cores, the 2600K stomps it.
If the game uses six cores, the 2600K still wins, if the user didn't do something silly like turn off hyperthreading.
That's at stock clocks.
If you OC both on the same cooler the Intel will win even with HT turned off.
That's a $300 CPU, not a $600 CPU.

680-wise I am tempted, but the general execution seems to jibe with the core name. The power delivery section is meh, and the overclocking appears to be difficult at best. Most importantly for me personally, there are very, very, very few submissions with them on HWBot. What submissions there are are all at fairly low clock speeds (the TiN special doesn't count, heh), while the 7970s are all doing 1200 or better on the stock cooler.
Frakk's Avatar
It seems to be holding its own at very high res so far, but I still don't think it's wise to give such a high-end card the minimum VRAM needed to compete right now.

Games are not standing still. They are moving forward very fast, and it looks like they will become more and more VRAM hungry as there are more graphical elements to deal with.

VRAM is starting to become important.

@ Bobnova, take cost into account: 2600K = 240, Phenom II X6 = 130. There are areas where the 130 X6 will beat a 170 2500K.
It's not as black and white as that.
Also, there are games where the 190 FX-8150 will match or beat the 240 2600K frame for frame; F1 2011 is one of a few where it beats it, just as an example.
It's more dependent on the programmer, and whether or not they can be bothered.
EarthDog's Avatar
SOME games (albeit rare, and I'm not sure which ones personally) have issues with HT enabled. I don't know of a gamer either that disables HT. What's the point? Why not just save $100 and get a 2500K at that point?

Also, the "K" part of the 3930K is not what separates SB from SB-E. The 3930K is on socket 2011 (which makes it SB-E), while the 2600K/2500K are SB and s1155.

This statement needs qualifying. It will be a long while before 2GB at 1920x1200 or less is too little. I have to admit I'm getting a bit rattled at the repeated assertion (and subsequent proof showing otherwise) that 2GB on cards is too little for SINGLE MONITOR operation. Let that point go already.
I.M.O.G.'s Avatar
The number of submissions on the bot, and the challenges in overclocking compared to what people are throwing out with 7970's, is also a concern for me.

I'm a benchmarker though, not a gamer... so pure overclockability is more important to me. Also, if anything, the results by TiN/Kingpin give me more concern than they do confidence: they were using a modded BIOS (non-public?) to work around the default performance scaling behavior, and TiN has hardware-modding skills that enable performance well beyond what is attainable by most.

I see the 7970 getting crazy overclocks and bench numbers from a great number of overclockers, and I'm watching to see if the same becomes true of the 680. So far it's not looking good, though it's very early; the 7970 jumped out of the gate with ferocity. The 680 came out of the gate with a thud.

I like its gaming performance, and I'm glad it's competitive. But on pure overclockability, it doesn't seem to lend itself to the masses as well as the 7970 thus far.
Frakk's Avatar
It's not an assertion, it's a concern given how much it costs... It's a valid point, which is why it keeps coming up.

Banging on about how it can keep up today holds no water for the future; it may well be fine in a year... two... three... from now, or not.

The concern is that 2GB of VRAM is not a lot; the concern is it may not be enough at some point not far from now.

People will keep expressing that concern, so get used to it
EarthDog's Avatar
Agree with this (IMOG).

I am a gamer, more so than a bencher these days, and at 2560x1440, so I need this horsepower.

Hopefully the BIOSes get out to the public so it can be a bit easier, like the 7970.
Suppressor1137's Avatar
So far, what I've read in this thread is: hold on, the big guns will be here shortly.

As such, I'm going to holster my urge to buy one of these cards (even though it would Claymore-explode my 460 *very nasty stuff, that Claymore* and basically run as fast as 3 of them in SLI).

As of right now, my gaming resolution is 1680x1050. At this res, my GTX 460 can handle BF3 on ultra with 30-45 FPS. Certainly, the occasional dip in fps caused by heavy combat is there, but not enough to warrant buying a $500 vid card. (Yet.)
nightelph's Avatar
^ Yeah, I don't see the point of buying this card if you're gaming at 1680x1050 on a 460. If you really want a new card, maybe wait until a '660Ti' comes out. Or just SLI your 460 on the cheap.
Suppressor1137's Avatar
Also, once the 690 (or 695) comes out (dual GPU), I may have the ability to buy a GTX 590 from a friend for killer cheap, something like $200. I will jump on that all day, any day.
JeremyCT's Avatar
Depends on your definition of "shortly". GK110 rumors have it launching in August if all goes well.

Valid points. I'm looking forward to seeing what happens when companies like EVGA release their non-reference cards with things like 5-phase power and 8+6 connectors. Hopefully that will improve things on the overclocking front.
hokiealumnus's Avatar
Copying this over from the other thread. It's pretty important.


tom10167's Avatar
This is easy. Buy a bigger monitor.
hokiealumnus's Avatar
Someone had asked about '06 on this card. It seems about the same as the 7970, tbh. This was a very quick overclock: 2600K @ 5300, GTX 680 @ +130 core, +100 (actual) mem. I have no idea if that's anywhere near the max; it was set-and-run in about ten minutes.

diaz's Avatar
Good news hokie

I canceled my MSI 680 (backordered) and ordered an in-stock EVGA; it should be here Friday or early next week!
Ivy's Avatar
1. It's still not a true SB-E. But the only true differences are a somewhat smaller cache and a bit lower stock clock. However, in practically any condition that won't be noticed, and the clock can easily be pumped up to equal levels. So for now I say... indeed, the K-type is basically an alternate X-type... Intel, you are funny! Not that I worry, because I use a Nehalem and I will not get any SB; I will upgrade next at IB (because I need a backup PC). I guess Intel does it that way because they CAN... milk.

With the Nehalem X-type there was still a good reason to get the X-type, because the i7 970 didn't have an open multiplier (harder to OC). So the 990X was further ahead of the second-tier chip, and it even got a special (better than standard) Intel cooler included (not the case for SB-E).

2. It's not too little for a single monitor at the current time, but I do not feel secure for 5+ years. Although I usually get a new GPU every 2-3 years, it's still nice to be more future-proof. The thing I worry about most is that when I sell the GPU after 2-3 years, the buyer may say "nah... it's 2GB only... I'd rather get a 3-4GB because many games are already very close to the limit"... which means the selling price is not very satisfying. And I'm sure 4GB will be standard soon, so we simply get a "bad in-between deal". And I mean, it's a card in a high price range. You could get a 6950 for less than half the price (such as I got), with the same amount of RAM. When I buy high end I kind of expect to have more of everything compared to the much cheaper midrange, no matter its current use.

Anyway, 4GB is out soon... just have to wait, and I am able to wait because I'm waiting for a good IB... and may probably get an aftermarket 680+ type by the end of the year. But at the current time, the choice would be hard.
Ivy's Avatar
Look, Nvidia currently has the better non-Eyefinity gamer card, but they like to milk. That's something I do not appreciate, which is why I always say to wait for the aftermarket; they will surely make many upgraded versions.
MattNo5ss's Avatar
From benches it looks like the GTX 680 is equal to or better than the HD 7970 at 5760x1080 or less. The GTX 680 also uses less power, creates less heat, makes less noise, and even costs less. What is being milked?

By the time more than 2GB of VRAM is needed at a resolution less than triple-screen, the GTX 680 and HD 7970 will be old as dirt (in PC years) and there will be much better cards available performance-wise that will also have more VRAM...

What am I missing?

The cost of the 4GB version over the 2GB will probably make it a waste of cash to me, but if prices were very close then I guess I'd consider it.
JeremyCT's Avatar
Nothing. FWIW, I think you have a pretty good grasp of the reality of things in this instance.
bmwbaxter's Avatar
YAY! I just received tracking for mine! Looks like this is gonna be a very good Friday!
Ivy's Avatar
Those who think they are not missing something will soon notice a huge miss. Nvidia can do better than that, same for Intel, but as long as it can give the feeling and result of a better deal, there is no need to be even better. Anyway, congrats everyone on their stuff, and hopefully the happiness lasts longer than I expect.
Badbonji's Avatar
Apparently users of X79 are stuck at PCIe 2.0 for now:

http://www.techpowerup.com/162942/Ge...E-Systems.html
bmwbaxter's Avatar
Lulz, how come in that article they (nvidia) say that SB-E isn't native PCIe 3.0?

I was almost 100% sure it was...
deed's Avatar
It's capable, but I believe it's not certified.
Check the Intel X79 website:
http://www.intel.com/content/www/us/...s-chipset.html

Kind of funny. Seems Intel has been having issues with chipsets lately. They fix it, but I wish they would solve issues like this beforehand.
EarthDog's Avatar
Good thing it doesn't matter!
hokiealumnus's Avatar
Well...this GPU is (very painfully) heading to MattNo5ss for the rest of its review. I have, um, something else coming that's going to take up all my time and just couldn't do this nice piece of hardware justice. Alas, parting is such sweet sorrow.
bmwbaxter's Avatar
I am sure he will not confirm
or deny.
hokiealumnus's Avatar
Yep, what bmwbaxter said.
EarthDog's Avatar
bwaaaaaaaaaaaaaaaaaahahahaha! Nice Janus!
cw823's Avatar
G-rated forum, folks. Do we want the removed image to be someone's first experience on our forums?
EarthDog's Avatar
You should watch some PG movies..
Janus67's Avatar
Sorry about that, thought it fit the description pretty well. I honestly thought it was just a picture/screenshot from the show; I didn't realize it was a gif until I went back to it after seeing it removed.
cw823's Avatar
lol I mean G-rated, like Disney's Tinkerbell.

It was funny, I'll give you that!
Bobnova's Avatar
I'm not entirely certain tinkerbell is more appropriate than the image was, really.
cw823's Avatar
I now own a copy of Tinkerbell, the movie.
Bobnova's Avatar
lol
Wait, there was a Tinkerbell-specific movie? Crazy, that's news to me.
MattNo5ss's Avatar
For this thread Dawn > Tinkerbell...

Kepler "New Dawn" Tech Demo
Seebs's Avatar
Awww... I missed the funny pic...
nightelph's Avatar
Ditto. And when are we gonna be able to download the new tech demos for ourselves? I keep checking nvidia's site..
Badbonji's Avatar
HD7970 versus the GTX680 at the same clockspeed:
http://hexus.net/tech/reviews/graphi...d-7970-clocks/

The GTX680 still seems faster, and the power draw obviously increases further with the overclocked HD7970, although the difference between the two is only around 5% on average.
manu2b's Avatar
Yep, and this article is biased: in every bench, the 680 at stock is faster/better than the 7970 overclocked, and still it says that, visually, they "could not make a difference"... Of course, between 55 and 65 FPS the human eye (unless it's bionic!) doesn't notice any difference.

But in coming games that take advantage of this 10-15% extra FPS, it will be visible: 40 vs 45-48 DOES make a visual difference. And let's not even talk about multi-display setups.
Frakk's Avatar
Depends on where you go and what games are benchmarked... http://www.xbitlabs.com/articles/gra..._14.html#sect0

There, half the time the 7970 is beaten and the other half the 680 is beaten... it's the same right across the Internet; there is so little between them that, for me, there is no 'clear' performance winner.
EarthDog's Avatar
It's pretty clear there is a winner when averaging things out (IMHO). In your link, Frakk, in the stock vs stock graph (clock-for-clock comparisons are frankly asinine to me - you're overclocking one card with a COMPLETELY different architecture to match another card's clockspeeds, which makes no sense), it wins 12 of 18 benchmarks at 1920x1080 with AA (the most common resolution). Some games just naturally run better on one implementation; some are TWIMTBP titles and others are sponsored by AMD. So add up winning most games/benchmarks, costing $50 less, using less power, and being quieter in reference form, and it seems pretty clear to me (and most) who stands a bit taller between the two. Even excluding the other stuff and focusing on performance for the majority (1920x1080 or less), it still comes out ahead in most cases. The only way I can see the opposite stance holding water is if you only play the two games where there actually is a difference (AvP, Civ V). Will you notice some of the differences? Like the Hexus article said, you really don't, but that is where the other factors come into play (price, power, noise).

Performance-wise it wasn't a knockout punch (wins across the board), I agree with that, but it sure was a solid left to the chin (winning most, a lot of them handily at 10%+), especially for a 'mid-range' card if you go by the core used. Then it landed a few more body blows with the pricing/power side of the house... so now you have a softened fighter.

Boxing analogy FTL!
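For anyone who wants to "average things out" from a review's own numbers, the usual trick is a geometric mean of per-game ratios rather than a straight FPS average, since it doesn't let one high-FPS game dominate. A minimal sketch - the FPS figures here are made up for illustration, not from any review:

    # Geometric mean of per-game performance ratios (680 fps / 7970 fps).
    # All FPS numbers below are invented for illustration.
    from math import prod

    fps = {            # game: (gtx680_fps, hd7970_fps)
        "BF3":    (78, 70),
        "Skyrim": (95, 83),
        "Metro":  (41, 43),
        "AvP":    (48, 55),
        "Civ V":  (66, 72),
        "Dirt 3": (99, 88),
    }

    ratios = [a / b for a, b in fps.values()]
    geomean = prod(ratios) ** (1 / len(ratios))
    print(f"680 vs 7970 geomean: {geomean:.3f} ({(geomean - 1) * 100:+.1f}%)")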
manu2b's Avatar
Hey Frakk:
“The best argument against democracy is a five minute conversation with the average voter.” -Winston Churchill

I LOVE it!
bmwbaxter's Avatar
+1
Couldn't have said it better myself.
manu2b's Avatar
Joe, you're a Star!
Frakk's Avatar
Truer words were never spoken, IMO.

@EarthDog, just relax dude, there is little between them. You talk this card up rather a lot; it's not that special, it's just the latest high-end card with a small step over the last high-end single-GPU card. No one disagrees that it's cheaper (right now) or that it's a good all-round card.

Soon there will be another one to take its place; that's how the cookie crumbles...

I think you can make better word choices than that, don't you Frakk? -hokie
diaz's Avatar
True arguments from both sides... but business is business. AMD priced its card at $550 and people were all over it. Does this mean we should boycott a card that is $50 cheaper and completely dominates the AMD card? No, it just means that because AMD didn't push the envelope in terms of performance, nVidia gets to win the lottery and make great margins with this card.

Could nVidia have sold this card at $300? Maybe... but at $500 they are making investors happy, and it's good for business to make investors happy. Why? Because they begin to have more faith in their investment. Happy investors = happy company = stimulus for production, design, etc.

If the profit margin is larger, that only means the product is worth that much at the time. Take the 680 as it is: the current GPU flagship and king of the hill. If you look past the price, buy it, and end up a happy customer, then it was a good purchase in my book. If you think the $500 price is too high, then you are as right as the guy who thought the price was right.

Personally, I bought one because it hit all the points I wanted to cover: high performance, low power consumption, cool new features, 3D and multi-display, great drivers, best performance in my favorite game, etc. You can't go wrong with that much innovation, and $100 cheaper than the equivalent performer is a no-brainer.
bmwbaxter's Avatar
@Frakk

I don't really see how EarthDog was inflating the GTX 680 in his post. He was merely saying that you can't compare clock for clock, especially when one of the cards has to overclock or underclock to make such a comparison. Making stock-clock comparisons is fine and fair, and we all know how that ends...
Bobnova's Avatar
Stock vs stock, stock+10% vs stock+10%, max reasonable air OC vs max reasonable air OC: those are all useful comparisons.

Clock-for-clock? Not useful.
dtrunk's Avatar
any1 seen any of these around US e-tailers??

EVGA 02G-P4-2684-KR
manu2b's Avatar
Let's compare a Pentium 4 @ 3GHz vs a 2500K (one core) @ 3GHz.
It just makes no sense.
EarthDog's Avatar
Nope.. but looks like EVGA has you covered.

You can also try typing that number into Froogle and see what turns up.
Frakk's Avatar
It's not at all difficult to understand, actually. Here is how simple it is: hands up, who would run any of these cards at stock?

Comparing how they run at different clocks is absolutely useful.

@diaz, you're emphasizing points that no one has argued against. Chill out people, it's just a GPU.
manu2b's Avatar
With different architectures, clocks mean nothing.

A good indicator would be to benchmark at stock+x% for both cards, as Bobnova wisely highlighted earlier.

EDIT: The competitor of the 6970 is the 570, right? The 6970 runs at 860/1375 and the 570 runs at 732/950. Would you compare the 570 to the 6970 at the same clocks? The 570 would literally kill the 6970... Does that mean the 570 is better? No! Because the 6970 overclocks much higher under normal cooling (air/water) than the 570, which means that at their respective highest OCs (still under conventional cooling) they are still neck and neck.
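To put that argument in numbers, a minimal sketch; the perf-per-MHz and max OC figures below are hypothetical round numbers, and linear scaling with core clock is an assumption, not a measured fact:

    # Why equal-clock comparisons mislead across architectures: perf-per-MHz
    # and overclocking headroom both differ. All figures are hypothetical.
    cards = {
        # name: (stock_mhz, assumed_max_oc_mhz, assumed_perf_per_mhz)
        "HD 6970": (860, 1000, 1.00),
        "GTX 570": (732, 820, 1.17),   # stronger per clock, less headroom
    }

    print("At a forced common 860 MHz:")
    for name, (stock, max_oc, ppm) in cards.items():
        print(f"  {name}: {860 * ppm:.0f} perf units")

    print("At each card's own max OC:")
    for name, (stock, max_oc, ppm) in cards.items():
        print(f"  {name}: {max_oc * ppm:.0f} perf units "
              f"(+{(max_oc / stock - 1) * 100:.0f}% over stock)")

With these made-up figures the 570 "kills" the 6970 at a forced equal clock (1006 vs 860), yet at each card's own ceiling they land neck and neck (959 vs 1000), which is exactly why clock-for-clock numbers don't tell you which card to buy.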
Frakk's Avatar

Different ideas, perhaps, on what matters. For me it's how they perform overclocked, how far that overclock goes, and how they stack up in that.
EarthDog's Avatar
I run at stock. There has been no need for me to overclock any of the last few cards I've owned; I've been blessed to have nice cards. But yeah, I don't overclock 24/7, I only overclock to benchmark.

My only point up there was that I didn't like that the review you linked matched clocks and ran them as if that were a worthwhile comparison.

I certainly understand your point about wanting to see how they both perform overclocked. The problem with that is every card overclocks differently, so you may wind up with a dud or a stud; you never know. BUT you are getting a midrange card in the first place, so... these cards and how they clock are of little consequence anyway. I can tell you first hand that the architecture down the chain, at least in the cards I have, doesn't exactly scale as high as the 7900 series either. That's why I personally look at stock clocks for comparisons.
Frakk's Avatar
You should try it, you get better performance out of it for your money... give it a go.
EarthDog's Avatar
Ehh, I'm well aware of the benefits. I just don't remotely need to do so... the 680 @ 2560x1440 = plenty good FPS for me in BF3 and everything else. But thanks though!

(and I edited my post above. )
Frakk's Avatar
I would ask you for your gamer tag, but I did that once already with another member here and we never ended up gaming at the same time, despite the fact that I work from home whenever I want and my internal clock is not rigidly set to UK time - actually it's set to New York time...
pinky33's Avatar
Can you use nvidia surround with different resolution monitors like you can with the 7xxx series AMD cards?

I ask because I happen to use one 1920x1080 and two 1280x1024 monitors for day-to-day (non-gaming) use right now.
JeremyCT's Avatar
I'm not sure the Surround gaming function will work (I'm pretty sure it won't, but I don't know for certain), but your desktop will span the multiple displays for sure. The one downside is that, same as with the AMD cards, it probably won't be able to enter a low power state in that sort of configuration.
pinky33's Avatar
Damn, I did not know both manufacturers' cards won't enter low power mode in multiple monitor configurations.
Owenator's Avatar
You can use different sizes for more desktop space, but for Surround gaming you need three monitors and their resolutions have to match. I do believe you can use them in portrait mode instead of landscape, i.e. three 1920x1080 panels give 3240x1920 in portrait versus 5760x1080 in landscape.
EarthDog's Avatar
Just in case the concern about not enough RAM on the 680 is still kicking around, here is another review on 3 screens.

http://hexus.net/tech/reviews/graphi...adeon-hd-7970/

My tag is in the thread started by Janus.
nightelph's Avatar
Got that free EVGA bracket in the mail.

Also, I run my card at stock. There's no game I play at 1920x1200 that needs it. When this thing ages and newer games start to slow down, then I'll OC it.
MattNo5ss's Avatar
By that time, there will be another BA card out from AMD or nVidia that you won't be able to resist
Owenator's Avatar
Sorry friend, but the frame rates at 5760x1080 are awful. Obviously the GTX680 is not their top card. I'll just wait for the GTX685 or whatnot.
EarthDog's Avatar
I can't see the link, but I know from the conclusion that they both, apparently, "suck" at that resolution, though the 680 beat out the 7970. My point was that it's not a RAM issue, like many have been concerned about in the past.
JeremyCT's Avatar
They both seem shader limited, and the 680 has more shader power, so it wins a contest where both are unusable. I do agree that it shows nVidia knew what it was doing when it engineered the bus width and frame buffer size for the card though.

Has anyone done triple monitor SLI/Crossfire testing yet? With the additional shader power available, memory bandwidth might become more dear.
dtrunk's Avatar
Bah, drivers drivers drivers. When nvidia reaches the level of driver maturity that AMD has, I "hope" the 680s will scale better in 2x/3x SLI.
Bobnova's Avatar
I wouldn't say all of them; the 7970 won AvP decently.
Everything else it lost convincingly.

None of the games had what modern folks would call a playable frame rate, though back when I was younger 20+ FPS was great and 30+ (the 7970 manages it in AvP) was fantastic. Hell, one of my games capped at 31 FPS!

Anyway, neither of them has the horsepower for that resolution as a single card.
The real test will be dual card and/or triple card: no more RAM, lots more shader power.

EDIT:
To those who say drivers:
Every time one brand beats the other, the winning brand "has more mature drivers".
Any time a fan of one brand wants to justify their position, they blame the other brand's drivers.
The hilarious part is that this works both ways and always happens.
And yet, only rarely do more "mature" drivers do much for a card on the whole. The last case I can remember where they made a meaningful difference was the 5830.
This speaks to performance, not crashes/etc., of course.

AMD has had better multi-card scaling for something over a year now (more?). If that isn't long enough for nvidia's drivers to mature, I am frightened.
JeremyCT's Avatar
To be fair, you get better raw scaling with AMD, but worse micro-stutter problems as a general rule. Driver updates often result in measurable improvements over time with multi-GPU setups for both camps.
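For anyone who wants to quantify it: micro-stutter shows up in frame-to-frame time variance rather than in average FPS. A minimal sketch, using hypothetical per-frame durations (from a FRAPS-style cumulative frametime log you'd first take the deltas between consecutive timestamps):

    # Micro-stutter check: compare average frame time to the worst percentiles.
    def stutter_stats(frame_ms):
        s = sorted(frame_ms)
        avg = sum(s) / len(s)
        p99 = s[int(len(s) * 0.99)]        # 99th-percentile frame time
        return 1000 / avg, avg, p99

    # Hypothetical trace: steady ~16.7 ms, but every 30th frame takes 40 ms.
    frames = [40.0 if i % 30 == 0 else 16.7 for i in range(300)]

    fps, avg, p99 = stutter_stats(frames)
    print(f"avg {fps:.0f} FPS ({avg:.1f} ms/frame), 99th pct {p99:.1f} ms")

The average still reads a healthy ~57 FPS, but those 40 ms spikes are what you'd feel as stutter; a big gap between the average and the 99th-percentile frame time is the tell.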
Bobnova's Avatar
I've yet to see microstutter at all from either camp personally, so I don't have much input on that front.
EarthDog's Avatar
No microstutter in benchmarks BobN?
Bobnova's Avatar
I played a game once! I am shocked and offended by your accusations.
Janus67's Avatar
This is one of the few times I've read someone praise AMD/ATI's drivers over nVidia's.

Overall I would say I've had fewer issues (single and dual card) with nVidia [fewer crashes and less microstuttering] than I have with ATI. With that said, ATI has had better multi-card scaling for some time.
vixro's Avatar
I'm considering going green for the first time in 7 generations. Speaking from experience, I have rarely had issues with ATI/AMD drivers as a whole, though I did notice issues with the 2000 series, but that card wasn't that great anyway. I also had a few issues with the 5x series, but those were fixed within about 4 driver releases as well. This is all based on single-card/single-GPU setups. I always see these posts about AMD having immature drivers, but I haven't seen it myself.

Now back to my question: where the HELL DO I BUY A 680 when it's out of stock everywhere???
Suppressor1137's Avatar

Just wait until production catches up, and check once every 5 hours. It'll be in stock at some point.
EarthDog's Avatar
Or have it auto notify.
dtrunk's Avatar
In stock right NOW...

http://www.evga.com/products/moreInf...s%20Family&sw=

edit: and they're gone... lmfao.
vixro's Avatar
Had to pay an extra $25, but I noticed the Gigabyte was in stock today and I believe I snagged the last one.

http://www.newegg.com/Product/Produc...82E16814125421

nightelph's Avatar
Bracket comparison:

And OMG! No temp change. :P Perhaps if you had really poor case airflow and lived in a very hot environment you'd notice the claimed 3C difference.
Bobnova's Avatar
With a static fan speed, maybe. With a dynamic fan speed it'll probably be identical anyway :P