
nVidia Kepler GTX700 (600?) series info here ->

Not if it was in your NDA/contract saying how you are allowed to review the product...

nVidia had the right to say how you can review the card, and they are smart enough to know what tests to run to beat the competition. So it wouldn't be lying exactly... and if you don't play by said rules, they are equally within their rights to no longer give you cards to review.

But I do agree that good reviews should be unbiased and 100% truthful. Which is why, if they don't send you one, you can buy one and review it on your own terms.

EDIT: also, part of the reason they have guidelines is to keep reviews consistent at launch, so some idiot doesn't run a benchmark with different settings and handicap or inflate the card's performance.

Being required to only run a certain/specific benchmark is bull for any piece of hardware. Many of the review sites have their own suite of benchmarks that they run on the hardware, with little to no variation. I highly doubt Anandtech [for example] accepts requirements to only run Dirt 3 at XX resolution and XX settings to show how much better it is than Y-card. It is most important for sites to be as consistent as possible with the review process. Granted, there are always going to be benchmarks that favor certain hardware, it's just that way, but being choosy toward sites/reviewers because they possibly pointed out a fault with your piece of hardware is pretty wrong in my opinion.

With that said, I still like nVidia's cards [although I don't have any issue with AMD/ATI, I've bought about 50/50 between the two of them]
 
All this noise to me is more about marketing from nVidia right at launch. Within days there are usually reviews from every other review site that had to buy the card. It's really more about picking and choosing who you trust as a review site, whether nVidia picked it or not.

Marketing is all about flashy anyways, if you're that gullible, maybe it doesn't matter what you get anyways :D

BTW; who from OCF is reviewing?
 
BTW; who from OCF is reviewing?

Oh, me me me! Pick me!! My OCZ Synapse review was FANTASTIC (IMHO of course), it's about time I spread my wings and did a video card review.

I'll happily run that puppy through whatever paces nVidia politely suggests and then some for a free one. Course, it's possible that it might be slightly hampered by my C2Q and DDR2 memory. :chair:
 
BTW; who from OCF is reviewing?

Well, whether hokie denies it or not... I would guess him, since he usually does most of the high-end unreleased stuff. But I guess we will see. We won't even know for sure if we got one to review, because hokie, or anyone on the news team for that matter, can keep a secret. This is gonna be like Bulldozer for anticipation, I think.

Hopefully not as big a letdown, though.
 
Now I'm just curious to see availability and what the price will be when they actually start selling. If availability is low, then they might jack the prices up $50 or so.

I have a feeling the 680 will be abundant, since it was originally meant to be a midrange card, so it was probably manufactured in much larger volumes than a high-end chip would be. At the $550 price point, it should be unattractive to the mainstream, thus leaving more for us OCD-plagued enthusiasts.. :D
 
First Kepler benchmark!

While technically true, I'll spoil my own tease by telling you that it's a mobile, not desktop, part.

http://www.anandtech.com/show/5672/acer-aspire-timelineu-m3-life-on-the-kepler-verge

Some of our editors recently had the opportunity to take part in NVIDIA's Editor's Day in California's "sunny" San Francisco to be briefed on new products. While we can't go into any great detail on NVIDIA's new Kepler architecture (as that information is still under embargo), what we can provide you with is a review of Acer's new Aspire TimelineU M3 notebook
 
The shortage of the last few launches didn't have anything to do with how many they were trying to make, it was all about how many they could make. If the process has a 10% yield rate it doesn't matter overly much what level chip you're trying to make.
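
To put some rough, completely made-up numbers on it (just to illustrate the point, not actual TSMC figures):

```python
# Rough illustration with made-up numbers: at a poor yield rate, the number
# of sellable chips per wafer is tiny no matter which SKU you're targeting.
candidate_dies_per_wafer = 200   # hypothetical count for a mid-size die on a 300mm wafer
yield_rate = 0.10                # the 10% figure above, purely for illustration

good_dies = candidate_dies_per_wafer * yield_rate
print(f"good dies per wafer: {good_dies:.0f}")  # ~20, midrange or high-end chip alike
```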
 
GTX 680 / GK104 / nVidia Kepler transistor count around 3.5B... new leaked specs :D

http://www.techpowerup.com/162341/GK104-Transistor-Count-and-Exact-Die-Size-Revealed.html

Picture from a German site..
nvidia_gk104_specs_3dcenter.jpg
(Spielverbr. = German abbreviation for in-game power draw)

If this is true, it means GK104 has only about 17% more transistors than Fermi. It also means that if the CUDA core count is now 3 times that of Fermi, the transistor count does NOT scale with the core count. I think the new CUDA cores (CUDA 2.0?) will be similar to AMD's stream processors, since the core/transistor ratio would come out close to Tahiti's. Ex: 4300/3500 = 1.23 .. 1.23 x 1536 = ~1900. Tahiti has ~2000 cores, so the transistor count is relatively close; however, nVidia seems to use slightly more transistors per core, since AMD has more cores (~30% more) but fewer than 30% more transistors (~23%).

If the benches are correct, this would mean that nVidia might have kept some of the original CUDA cores (two operations per clock) clustered with their own new CUDA cores (single operation per clock?). That has been my suspicion for a while, and the transistor count would make sense, but then again I have access to the same info you all do. I'm just trying to imagine how these numbers would all add up, and of course I could be completely wrong and nVidia just has a whole new approach to efficient framerates. :D
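
Just to lay out the back-of-the-envelope math I'm doing (assuming the rumored 3.5B / 1536-core figures for GK104, and Tahiti's published ~4.31B transistors / 2048 cores):

```python
# Back-of-the-envelope core/transistor comparison. The GK104 numbers are
# the rumored figures, not confirmed; the Tahiti numbers are published specs.
tahiti_transistors = 4.31e9
tahiti_cores = 2048
gk104_transistors = 3.5e9    # rumored
gk104_cores = 1536           # rumored

# Scale GK104's core count up by the transistor ratio and see how close
# it lands to Tahiti's actual core count.
ratio = tahiti_transistors / gk104_transistors   # ~1.23
scaled_cores = ratio * gk104_cores               # ~1890, vs Tahiti's 2048

print(f"transistor ratio (Tahiti/GK104): {ratio:.2f}")
print(f"GK104 cores scaled by that ratio: {scaled_cores:.0f}")
print(f"Tahiti has {tahiti_cores / gk104_cores - 1:.0%} more cores "
      f"but only {ratio - 1:.0%} more transistors")
```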
 
The shortage of the last few launches didn't have anything to do with how many they were trying to make, it was all about how many they could make. If the process has a 10% yield rate it doesn't matter overly much what level chip you're trying to make.

True.. I called one of my "providers", and the rep said that they were also waiting on the 600 series. He said they know nothing more than we do, but there are signs that something is coming, so they suspect the 600 series will launch any time now. No details were given, and if he did have details he wouldn't have given them :rain:

On the other hand, he said he would send me an email immediately when he has a 600 series card in stock for me. SWEET. :comp:
 
http://www.guru3d.com/news/nvidia-geforce-gtx-680-up-to-4gb-gddr5/

GK104 comes with several dozen power planes, and will operate on varying clocks depending on the computational load, card temperature and the power consumption

low power mode is 300 MHz, standard is 705 MHz extendable to 950 MHz

Idle state on Fermi is 100 MHz, isn't it? I don't see why they'd need higher idle clocks. Perhaps "low power" in this case is actually the mildly ramped up state used for things like video decode. If this one's true, it makes more sense than some of the other rumors in terms of base clock and "hot clock" or "turbo clock", whatever you want to call it. 705 MHz was an early rumor, but we didn't know about the "hot clock", and that might be where the 900 MHz range rumors are coming from. The GTX680 might sport a 705 MHz "normal" clock which can ramp up to 950 MHz given thermal and wattage headroom. This might be nVidia one-upping AMD's PowerTune feature.

Also, if it has a lot more power modes than the currently used 3 (idle, video, and game), that would help explain the 5-phase power regulation seen on the PCB spy shots.
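
If it works the way I'm picturing it, the clock selection would be something roughly like the sketch below. To be clear, this is pure speculation on my part; the only "real" inputs are the rumored 300/705/950 MHz states, and the ~195W power limit and thermal limit are numbers I made up for illustration:

```python
# Speculative sketch of a PowerTune-style boost scheme using the rumored clocks.
# None of this is confirmed by nVidia; the limits are assumptions for illustration.

IDLE_MHZ = 300       # rumored low-power state
BASE_MHZ = 705       # rumored "normal" 3D clock
BOOST_MHZ = 950      # rumored maximum "hot clock"

POWER_LIMIT_W = 195  # assumed board power limit
TEMP_LIMIT_C = 95    # assumed thermal limit

def pick_clock(gpu_load: float, board_power_w: float, temp_c: float) -> int:
    """Pick a target clock (MHz) from load, measured board power, and temperature."""
    if gpu_load < 0.10:
        return IDLE_MHZ
    # Only ramp past the base clock while there is both wattage and thermal headroom.
    if board_power_w < POWER_LIMIT_W and temp_c < TEMP_LIMIT_C:
        headroom = 1.0 - board_power_w / POWER_LIMIT_W
        return int(BASE_MHZ + (BOOST_MHZ - BASE_MHZ) * min(1.0, headroom * 4))
    return BASE_MHZ

print(pick_clock(0.05, 40, 35))    # desktop idle          -> 300
print(pick_clock(0.95, 150, 70))   # gaming with headroom  -> ~930
print(pick_clock(1.00, 195, 94))   # at the power limit    -> 705
```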
 
So just to be sure, the 680 should be Nvidia's single-GPU flagship model, correct? If this thing can really do the job it took 3 580's to do, then it's packing some serious horsepower. I don't even want to think about the dual-GPU version! Come on NVIDIA!! I'm ready to spend some green on big green!! lol
 
http://www.guru3d.com/news/nvidia-geforce-gtx-680-up-to-4gb-gddr5/

Idle state on Fermi is 100 MHz, isn't it? I don't see why they'd need higher idle clocks. Perhaps "low power" in this case is actually the mildly ramped up state used for things like video decode. If this one's true, it makes more sense than some of the other rumors in terms of base clock and "hot clock" or "turbo clock", whatever you want to call it. 705 MHz was an early rumor, but we didn't know about the "hot clock", and that might be where the 900 MHz range rumors are coming from. The GTX680 might sport a 705 MHz "normal" clock which can ramp up to 950 MHz given thermal and wattage headroom. This might be nVidia one-upping AMD's PowerTune feature.

Also, if it has a lot more power modes than the currently used 3 (idle, video, and game), that would help explain the 5-phase power regulation seen on the PCB spy shots.
Other rumors above stated a 1006 MHz core and a 1058 MHz 'hot clock'. :shrug:
 
If this thing can really do the job it took 3 580's to do, then it's packing some serious horsepower.

Short answer, no.

Long answer: I would be VERY surprised if the 680 could match a pair of SLI 580's, forget about Tri-SLI 580's. That article was blown out of proportion and sensationalized, and it's very annoying to see it in my searches every day. The original article clearly explains that the game was optimised in several ways, such as a new AA mode and other DX11 optimisations. On top of that, they only say "acceptable framerate"; the Tri-SLI 580's could have been running in the 50fps range for all we know, and the optimised version of the game could have run at 30fps on Kepler. Making it look like 3x580 = 1xGK104 is PURE marketing. No performance details can be taken from that, other than yes, the GK104 is at least capable of running a game with excellent visuals at an acceptable framerate.
 
Great... just great... another semi-hard launch for nVidia's GTX 680 / GK104 according to Fud..

http://www.fudzilla.com/home/item/26359-gtx-680-will-be-hard-to-get


Expect it to sell out like the new iPad


Many of our industry sources are telling us that Kepler based GTX 680 desktop cards will be hard to get.

Despite quite high prices, these cards will sell out very quickly simply as they will be a better choice than similarly priced AMD cards. However, at the same time, AMD can mess up Nvidia’s launch by dropping prices and making its cards more attractive, but this is something that might happen close to GTX 600 series Kepler 28nm launch.

Partners would not be partners if they would not complain about the quantity of Kepler cards that they will get at launch and weeks to come. They all think they deserve and should get more, but they are still convinced that the new cards will sell well.

If you are after GTX 680, our advice is to look for one at launch day, scheduled for March 23 and immediately buy one. Otherwise you will probably end up waiting. The 28nm process is not at great yield levels but it is still reportedly better than the transition to 40nm a few years ago. Nvidia is also having to use a lot of capacity to service OEM deals in both notebook and desktop segment. That will swallow many of the 28nm Kepler based cards.

Nvidia lovers have not much to worry about this as notebooks get lower TDP chips, but since the 680 is close to 200W TDP this remains desktop-only card that will only end up in some high-end desktop machines.
 
Good grief... so much posturing marketing hype, yet absolutely nothing to back it up......
It's beginning to look like Kepler is not going to be the super GPU after all, and they're trying to rev people up and bamboozle them into thinking it's something it's not.

AMD Bulldozer Déjà vu....
 
It's nothing new... I still stand by my earlier comments... as fast as a 7970 (trades punches depending on game/res/settings/etc), priced the same, less power. That's a win to me. Sure, I'd love an AMD crusher, but they don't remotely need to do it. That's just smart business (assuming my guess is correct).
 
It's nothing new... I still stand by my earlier comments... as fast as a 7970 (trades punches depending on game/res/settings/etc), priced the same, less power. That's a win to me. Sure, I'd love an AMD crusher, but they don't remotely need to do it. That's just smart business (assuming my guess is correct).

Completely agreed, unfortunately.

There is no reason for a company to offer a product that decimates the other person's at the same price point. They likely have something better up their sleeves (680 Ti or some other dumb naming convention) that would be able to do it. The guess was already that the 660 Ti became the 680 because of its performance, so they could charge $550 for it instead of $250.
 
It's nothing new... I still stand by my earlier comments... as fast as a 7970 (trades punches depending on game/res/settings/etc), priced the same, less power. That's a win to me.

Yes, as that would be an improvement and a win in my eyes too.

So lets see it.
 
Good grief... so much posturing marketing hype, yet absolutely nothing to back it up......
It's beginning to look like Kepler is not going to be the super GPU after all, and they're trying to rev people up and bamboozle them into thinking it's something it's not.

AMD Bulldozer Déjà vu....

Well...

It's nothing new... I still stand by my earlier comments... as fast as a 7970 (trades punches depending on game/res/settings/etc), priced the same, less power. That's a win to me. Sure, I'd love an AMD crusher, but they don't remotely need to do it. That's just smart business (assuming my guess is correct).

Agreed.

Don't get me wrong, GK104 is not the performance haymaker the GK110 was meant to be, but the architecture itself is a step forward, and the power consumption is fantastic. Definitely a win TDP-wise. The only issue to me is the price, since their margin is huge at $550, and of course availability.

I have a feeling the architecture will be similar to AMD's, but with better optimizations, or more efficient use of each core. This would make it easier for game developers to write games that run on both architectures without having to optimize (as much) for each platform. Of course, there is still the "nVidia helped develop/optimize the game" factor, so their architecture might end up better optimized in the end; then again, AMD/ATi has been doing the same since the beginning.
 