
FRONTPAGE: NVIDIA Launches GeForce GTX 680, aka Kepler

The 360 has a three-core 3.2GHz IBM CPU and an ATI GPU.

It looks like the Xbox 720 will use an AMD Fusion APU (a Bulldozer variant) with a 76## series on-die GPU. Performance-wise that will bury the current 360. That's a more understandable hardware performance barometer; it gives you some idea of how game consoles compare to gamer PCs.

Now if you take the Xbox 360 version of BF3, you will find that it's running at roughly 1280-wide resolution, with no AA of any kind, no Vsync, and the minimum graphics settings.

And I can see it on my brother-in-law's 360: there is tearing with every movement, fine detail does not exist, and the maps are full of graphical tiling. He thinks it looks awesome.... I just agree, as I haven't the heart to tell him it looks horrible.

You can't compare game consoles to PCs... simply because they don't compare.
 
Back when quads were new, I always recommended people get them, since they would hold up better over time and offer more performance right away. Same thing when dual-core CPUs were new... each time, people didn't want to believe it. *shrug* Indeed, 5 years ago that advice was correct.
 
CPUs with many cores (anything over four) only come into play when more than four cores are actually being used.

Today few things use more than four; some encoding and rendering software does.

Of course, that's not much help if core-for-core performance is lower than that of CPUs with fewer cores.

AMD vs. Intel is the classic example. Clock for clock my Phenom II can beat an i5, but only if it's using all six of its cores; highly threaded computational benchmarks show that. Yet in the real world today it's almost never using that power, unless you're running a Linux OS.

In gaming the work is offloaded onto the GPU; it takes a lot of GPU for the (comparatively small) CPU workload to fall behind, at least for a higher-end CPU from any brand.
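If anyone wants to see that scaling effect for themselves, here's a rough sketch of the kind of test I mean (my own Python, standard library only; the prime counting is just an arbitrary CPU-bound stand-in for real work). Past your physical core count the times basically stop improving:

# Rough sketch: time a CPU-bound job with different worker counts.
# The prime-counting function is an arbitrary stand-in for real work.
import time
from multiprocessing import Pool

def count_primes(limit):
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [30000] * 12            # twelve equal chunks of work to split up
    for workers in (1, 2, 4, 6):
        start = time.time()
        with Pool(processes=workers) as pool:
            pool.map(count_primes, chunks)
        print(f"{workers} worker(s): {time.time() - start:.2f}s")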
 
After reading Guru3D's SLI testing, I think I'll wait. I wanted to replace my two GTX 570s in SLI with a single GTX 680, but my GTX 570s in SLI are faster. So until I can afford two cards I'll pass. Also, I would have liked to have seen more VRAM; 2GB is not really enough for high-end games at 5760x1080 surround.

Maybe a "GTX 685" could be a single card replacement for my current SLI cards.
 
Nothing you could buy in 2007 stands up very well at all in games now, regardless of the number of cores. Quads stand up better, but still don't do very well.
As Janus pointed out, modern consoles don't run very many cores either. The Cell processor is a fairly unique beast.
AMD's cheapest CPUs are single-core, single-thread. Next up are duals, then triples, then quads, then finally six cores, well above the bottom level.
AMD's six-plus cores cost barely more than an egg for lunch compared to Intel's SB-E, which costs a huge fortune. A high-end user usually gets the six-plus-core AMD as far as I know; anything else is foolish. Regarding the Xbox, it's 3 cores but 6 threads (resulting in almost the same behaviour as a 6-core). A 6-core SB-E is 6 cores and 6 threads without HT. The PS3 is 6 active cores, with 1 of them used as a backup.

An overclocked quad-core is still able to run ANY game nowadays. "Good" is a relative term, because for an OCer good means 200 FPS and for a standard user it would be 40-60 FPS. ;) But the impact is still not as big as it seems; quads can still hold up well in most games.

CPUs with many cores (anything over four) only come into play when more than four cores are actually being used.
Today few things use more than four; some encoding and rendering software does.
I always think 4-5 years in advance because I'm usually not going to replace a PC CPU in that time. In the case of the C2D, it got nasty in the last 1-2 years of its lifetime (outdated from 2010 on).

Besides, yes, there are games using 6 cores:
-EVE Online (supports an almost endless number of cores as far as I know; in some tests, an E-type had a very balanced load across all 6 cores)
-WoW (a high-core-count AMD CPU is a huge gain)
-Supreme Commander (close to unlimited core support)

Games which benefit from 6 cores, not fully tuned for them but still at an advantage (offloading):
-Civ 5, and many more.

Possible to support *insert number here* cores? Yes. Why isn't it being used? We are lazy developers and only support Intel.
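If anyone wants to check how a particular game actually spreads across cores, a quick monitor like this is enough (my own sketch; it assumes the third-party psutil package is installed). Run it in a second window while the game is going and watch whether more than four cores ever show real load:

# Minimal per-core load monitor (assumes: pip install psutil).
import psutil

try:
    while True:
        # Blocks for one second, then reports each core's utilisation.
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" | ".join(f"core {i}: {load:5.1f}%" for i, load in enumerate(loads)))
except KeyboardInterrupt:
    pass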

While not faster than two 570s, the 2GB of VRAM is seemingly not a limit at this time...

http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-5.html
As long as you're only playing at 1080p, it isn't. Higher than that, plus Eyefinity, and I would not feel too confident in some games in the near future. Futureproof (which usually means 5+ years to me) clearly means 3GB+. RAM is funny stuff: as long as it's sufficient, it isn't noticed. As soon as it's overused, it will be noticed more than any other part...

Remember: the Xbox 720, PS4 and Wii U are soon to be released, and PC demands will soon rise by a good margin, as they usually react to that, because a lot of titles are improved ports with much better graphics and, finally, exorbitant RAM use.
 
AMD's six-plus cores cost barely more than an egg for lunch compared to Intel's SB-E, which costs a huge fortune. A high-end user usually gets the six-plus-core AMD as far as I know; anything else is foolish. Regarding the Xbox, it's 3 cores but 6 threads (resulting in almost the same behaviour as a 6-core). A 6-core SB-E is 6 cores and 6 threads without HT. The PS3 is 6 active cores, with 1 of them used as a backup.

An overclocked quad-core is still able to run ANY game nowadays. "Good" is a relative term, because for an OCer good means 200 FPS and for a standard user it would be 40-60 FPS. ;) But the impact is still not as big as it seems; quads can still hold up well in most games.


I always think 4-5 years in advance because I'm usually not going to replace a PC CPU in that time. In the case of the C2D, it got nasty in the last 1-2 years of its lifetime (outdated from 2010 on).

Besides, yes, there are games using 6 cores:
-EVE Online (supports an almost endless number of cores as far as I know; in some tests, an E-type had a very balanced load across all 6 cores)
-WoW (a high-core-count AMD CPU is a huge gain)
-Supreme Commander (close to unlimited core support)

Games which benefit from 6 cores, not fully tuned for them but still at an advantage (offloading):
-Civ 5, and many more.

Possible to support *insert number here* cores? Yes. Why isn't it being used? We are lazy developers and only support Intel.

SB-E has Hyper-Threading, so it effectively runs 12 threads; where did you read/hear that it didn't? SB-E: http://www.newegg.com/Product/Product.aspx?Item=N82E16819116492 $600


Of course the C2D was outdated in 2010; we were a generation and a half ahead by then in CPU power, including the C2Qs and the 1156 and 1366 sockets.

Futureproof (which usually means 5+ years to me) clearly means 3GB+

The problem is expecting a gaming computer to still be powerful enough to play games at high detail and good FPS in 5+ years. I personally upgrade pretty constantly [both for benchmarking for HWBot Boints and for having something new to play with]. I also don't see much reason to buy an SB-E system unless you are doing some encoding that will use 12 threads; you are better off with a 2500K or 2600K for the near future.
 
?!?! You can disable HT; I know of no gamer who enables HT. HT is for applications outside gaming that will actually benefit from it. The 4-cores have HT too, so they have 8 threads, but gamers usually disable it.

The Core i7-3930K is not an SB-E, that's a K-type. Although for gaming it would probably perform the same, so I would say it's the best deal when a gamer feels the urge to go 6 cores.
 
AMD's six-plus cores cost barely more than an egg for lunch compared to Intel's SB-E, which costs a huge fortune. A high-end user usually gets the six-plus-core AMD as far as I know; anything else is foolish. Regarding the Xbox, it's 3 cores but 6 threads (resulting in almost the same behaviour as a 6-core). A 6-core SB-E is 6 cores and 6 threads without HT. The PS3 is 6 active cores, with 1 of them used as a backup.

An overclocked quad-core is still able to run ANY game nowadays. "Good" is a relative term, because for an OCer good means 200 FPS and for a standard user it would be 40-60 FPS. ;) But the impact is still not as big as it seems; quads can still hold up well in most games.


I always think 4-5 years in advance because I'm usually not going to replace a PC CPU in that time. In the case of the C2D, it got nasty in the last 1-2 years of its lifetime.

Besides, yes, there are games using 6 cores:
-EVE Online (supports an almost endless number of cores as far as I know; in some tests, an E-type had a very balanced load across all 6 cores)
-WoW (a high-core-count AMD CPU is a huge gain)
-Supreme Commander (close to unlimited core support)

Games which benefit from 6 cores, not fully tuned for them but still at an advantage (offloading):
-Civ 5, and many more.

Possible to support *insert number here* cores? Yes. Why isn't it being used? We are lazy developers and only support Intel.


Yes, I agree with you. EVE Online, because of its VAST computational matrix, especially benefits from the 6-core; I have played it myself...
When the game reads your system specs and sees the 6-core Phenom, its reaction is along the lines of "wooooo yummy....... :ty::ty::ty:"

But, like Linux, these RAW CPU power, techy-type areas where AMD shines are not about to hit the mainstream market; as you say, it's too much, too difficult and too time-consuming for the mainstream market to get into.

I bought the AMD 6-core thinking games would become more and more complex, with bigger and bigger AI matrices, computing more and more threads simultaneously.... it's not happening.

Games are becoming smaller, more reactive and less proactive, less intelligent, with more emphasis on graphical elements... thus the CPU becomes less important and the GPU takes over.
 
?!?! You can disable HT; I know of no gamer who enables HT. HT is for applications outside gaming that will actually benefit from it. The 4-cores have HT too, so they have 8 threads, but gamers usually disable it.

I don't see what your argument is in the post I quoted. You say that SB-E is 6c/6t with HT turned off... which is true, but what is the point? The point is that it can do 12 threads if the user wants it to; AMD CPUs cannot.

Lastly, these last 10 or so posts have pretty much nothing to do with the Kepler lineup.
 
?!?! You can disable HT; I know of no gamer who enables HT. HT is for applications outside gaming that will actually benefit from it. The 4-cores have HT too, so they have 8 threads, but gamers usually disable it.

On what are you basing these statements? Most gamers likely leave CPU settings at default, which would mean HT is enabled. A well-informed gamer may disable HT if it doesn't benefit their games; however, it is safer to assume that is rare.
 
While not faster than two 570s, the 2GB of VRAM is seemingly not a limit at this time...

http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-5.html
I read that too, and call me a snob, but 50-60 FPS at 5760x1080 with 4xAA in BF3 or Metro 2033 is just not enough to get me to run out and buy two. :shrug:

I went through my NVIDIA Surround phase. BFBC2, Just Cause 2 and other games looked and ran great, but I pretty much gave up when BF3 came along. I could get two or three of the 3GB GTX 580s, but that's very expensive and uses a lot of power.

These cards do look like a step in the right direction, don't get me wrong. I just wish they came with 4GB of RAM for the high eye-candy settings; I think that is what is holding them back. But drivers may still need optimization and such, so I hold out hope. I'll keep saving my $$ just in case! :thup:
 
4GB cards ought to be coming, I would be fairly surprised if they didn't, really.


A 2600K with "only four cores" and the apparently useless hyperthreading stomps Thubans in everything.
Every. Thing.

If the game only uses four cores, the 2600K stomps it.
If the game uses six cores, the 2600K still wins, if the user didn't do something silly like turn off hyperthreading.
That's at stock clocks.
If you OC both on the same cooler the Intel will win even with HT turned off.
That's a $300 CPU, not a $600 CPU.

680-wise I am tempted, but the general execution seems to jibe with the core name. The power delivery section is meh, and the overclocking appears to be difficult at best. Most importantly for me personally, there are very, very, very few submissions with them on HWBot. What submissions there are are all at fairly low clock speeds (the TiN special doesn't count, heh), while the 7970s are all doing 1200 or better on the stock cooler.
 
It seems to be holding its own at very high res so far, but I still don't think it's wise to give such a high-end card the minimum VRAM needed to compete right now.

Games are not standing still. They are moving forward very fast, and it looks like they will become more and more VRAM-hungry as there are more graphical elements for them to deal with.

VRAM is starting to become important.

@Bobnova, take cost into account: 2600K = £240, Phenom II X6 = £130. There are areas where the £130 X6 will beat a £170 2500K.
It's not as black and white as that.
Also, there are games where the £190 FX-8150 will match or beat the £240 2600K frame for frame; F1 2011 is one of a few where it beats it, just as an example.
It's more dependent on the programmer, and whether or not they can be bothered.
 
SOME games (albeit rare, and I'm not sure which ones personally) have issues with HT enabled. I don't know of a gamer either that disables HT. What's the point? Why not just save $100 and get a 2500K at that point?

Also, the "K" part of the 3930K is not what separates it from SB to SB-E. The 3930k is on socket 2011 (which makes it SB-E), while 2600k/2500k are SB and s1155.

VRAM is starting to become important.
This statement needs qualifying. It will be a long while before 2GB at 1920x1200 or less is too little. I have to admit I'm getting a bit rattled at the repeated assertion (and subsequent proof showing otherwise) that 2GB on cards is too little for SINGLE-MONITOR operation. Let that point go already. ;)
 
680-wise I am tempted, but the general execution seems to jibe with the core name. The power delivery section is meh, and the overclocking appears to be difficult at best. Most importantly for me personally, there are very, very, very few submissions with them on HWBot. What submissions there are are all at fairly low clock speeds (the TiN special doesn't count, heh), while the 7970s are all doing 1200 or better on the stock cooler.

The number of submissions on the bot, and the challenge of overclocking compared to what people are throwing out with 7970s, is also a concern for me.

I'm a benchmarker though, not a gamer, so pure overclockability is more important to me. Also, if anything, the results by TiN/Kingpin give me more concern than confidence: they were using a modded BIOS (non-public?) to work around the default performance-scaling behavior, and TiN has hardware-modding skills that enable performance well beyond what is attainable by most.

I see the 7970 getting crazy overclocks and bench numbers from a great number of overclockers, and I'm watching to see if the same becomes true of the 680. So far it's not looking good, though it's very early; the 7970 jumped out of the gate with ferocity, while the 680 came out of the gate with a thud.

I like its gaming performance, and I'm glad it's competitive. But on pure overclockability, it doesn't seem to lend itself to the masses as well as the 7970 thus far.
 
SOME games (albeit rare, and I'm not sure which ones personally) have issues with HT enabled. I don't know of a gamer either that disables HT. What's the point? Why not just save $100 and get a 2500K at that point?

Also, the "K" part of the 3930K is not what separates it from SB to SB-E. The 3930k is on socket 2011 (which makes it SB-E), while 2600k/2500k are SB and s1155.

This statement needs qualifying. It will be a long while before 2GB at 1920x1200 or less is too little. I have to admit I'm getting a bit rattled at the repeated assertion (and subsequent proof showing otherwise) that 2GB on cards is too little for SINGLE-MONITOR operation. Let that point go already. ;)

It's not an assertion, it's a concern given how much the card costs... It's a valid point, which is why it keeps coming up.

Banging on about how it can keep up today holds no water for the future; it may well be fine a year.... two... three... from now, or not.

The concern is that 2GB of VRAM is not a lot, and that it may not be enough at some point not far from now.

People will keep expressing that concern, so get used to it ;)
 
Agree with this (IMOG).

I am a gamer, more so than a bencher these days :), and at 2560x1440, so I need this horsepower.

Hopefully the BIOSes get out to the public so it can be a bit easier to do like the 7970.
 
So far, what I've read in this thread amounts to: hold on, the big guns will be here shortly.

As such, I'm going to holster my urge to buy one of these cards (even though it would Claymore-explode my 460 *very nasty stuff, that Claymore* and basically run as fast as three of them in SLI).

As of right now, my gaming resolution is 1680x1050. At this res, my GTX 460 can handle BF3 on Ultra at 30-45 FPS. Certainly, the occasional dip in FPS caused by heavy combat is there, but not enough to warrant buying a $500 vid card. (Yet.)
 