NVIDIA GEFORCE GTX 780 Graphics Card Review

In the lead-up to Computex, NVIDIA is leading the pack, releasing its new GeForce GTX 780, the flagship of the GTX 700 series. While TITAN will remain at the top of NVIDIA’s single-GPU lineup, the GTX 780 – based on the same GK110 GPU – is no slouch. Read on and we’ll show you just how close the GTX 780 can get!

Specifications & Features

Right off the bat, NVIDIA was out for blood in its presentation. The GTX 780 has 50% more cores (2,304, up from 1,536) and 50% more memory (3 GB, up from 2 GB) than the reference GTX 680 that came before it. For perspective, TITAN has 2,688 cores, so NVIDIA didn’t cut the GK110 down too far for the GTX 780.

GTX 780 – 50% More

NVIDIA is claiming big strides in performance too: a 70% improvement over the GTX 580 and a more than 40% improvement over the GTX 680.

GTX 780 vs. Previous Generations

The GTX 780 comes with GPU Boost 2.0, which you first learned about in our GTX TITAN introduction. If you missed that or need a refresher, go back and check it out.

All the Features of TITAN

Using the same cooler as the GTX TITAN, the GTX 780 can also claim very solid gains (or should I say losses?) in the sound department. While this cooler is very quiet, it’s also tuned to let the GPU climb to a set temperature target (80 °C) and then hold it there, which is how it achieves such quiet operation. That said, as we’ll see, you may want to make some adjustments to keep the GPU well away from that 80 °C mark because of how GPU Boost 2.0 works, but we’ll get into that later.

Quiet Cooling

Speaking of cooling, NVIDIA has taken an additional step in the right direction. During our conference call, they pointed out that our ears pick up on changes in pitch and volume far more readily than on a slightly higher, but constant, volume. NVIDIA has addressed the fan’s continuous ramping up and down by introducing its new Adaptive Temperature Controller, which maintains the same temperatures as before while keeping the fan RPM ‘curve’ flat.

Adaptive Temperature Controller
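To make the idea concrete, here is a toy Python sketch (my own illustration, not NVIDIA’s actual controller): low-pass filtering the fan’s requested RPM flattens the curve, so the fan’s pitch stays nearly constant even while the temperature wavers.

```python
def smooth_fan_rpm(temps_c, target_c=80, base_rpm=1200, gain=120, alpha=0.1):
    """Return (raw, smoothed) fan-RPM traces for a GPU temperature trace.
    The smoothed trace low-pass filters the raw request so the fan's
    pitch stays nearly constant instead of audibly revving up and down."""
    raw, smoothed = [], []
    rpm = float(base_rpm)
    for t in temps_c:
        # Naive controller: request more RPM the closer we get to target.
        want = base_rpm + gain * max(0, t - (target_c - 10))
        rpm = (1 - alpha) * rpm + alpha * want  # exponential moving average
        raw.append(want)
        smoothed.append(round(rpm))
    return raw, smoothed

# A card wavering between 79 and 80 degC, as in our Heaven runs:
raw, smoothed = smooth_fan_rpm([79, 80] * 50)
# The raw request jumps a full 120 RPM every sample; once settled, the
# smoothed output moves only a few RPM per sample.
```

The constants (base RPM, gain, filter strength) are made up for illustration; the point is only that a filtered fan response trades instant reaction for a steady, less noticeable sound.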

Here is where the rubber meets the road: NVIDIA’s claims of improvement over the competition’s HD 7970 GHz Edition. Though long in the tooth, the HD 7970 is still doing well for itself; but as you can see, in every metric NVIDIA displays here, the GTX 780 posts a large gain over the HD 7970 GHz Edition.

GTX 780 vs HD 7970 GHz Edition

NVIDIA is also claiming solid SLI scaling; as measured with FCAT, greater than 175% appears pretty common. Speaking of SLI, like the TITAN before it, official support extends only to tri-SLI. That’s OK, since quad-GPU setups from either camp are known to scale very poorly, but be aware there will be no official quad-SLI support.

GTX 780 SLI Scaling
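To put that scaling figure in plain FPS terms, here’s a quick sketch (my own arithmetic; I’m reading “scaling” as total two-card performance relative to one card, which is how NVIDIA’s slide presents it):

```python
def sli_fps(single_card_fps, scaling_percent):
    """Two-card FPS given scaling expressed relative to one card
    (100% = no gain from the second card, 200% = perfect doubling)."""
    return single_card_fps * scaling_percent / 100.0

# A claimed >175% scaling means the second GPU adds at least 75% more
# frames: a game running at 60 FPS on one card lands at 105+ in SLI.
print(sli_fps(60, 175))  # 105.0
```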

Finally, we have the specifications. These are very conservative base and boost clocks, as you’ll see in our overclocking results, but you have to start somewhere: NVIDIA starts the GTX 780 at an 863 MHz base and 900 MHz boost, actually slightly higher than TITAN’s 837 MHz and 876 MHz, respectively.

As mentioned, the GTX 780 comes with a 3 GB frame buffer and benefits from the GK110’s 384-bit memory bus. Clocking in at 6,000 MHz effective GDDR5 (1,500 MHz quad-pumped), you get the same wide bus and fast memory speed as TITAN, just with half the buffer size.
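Those two numbers set the card’s peak memory bandwidth. A quick back-of-the-envelope check (my own arithmetic, not an NVIDIA figure):

```python
def gddr5_bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

# GTX 780 / TITAN: 384-bit bus at 6,000 MHz effective GDDR5
print(gddr5_bandwidth_gbs(384, 6000))  # 288.0 GB/s
# The GTX 680's 256-bit bus at a similar data rate manages only ~192 GB/s.
```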

GTX 780 Specifications

GTX 780 Stock GPUz

What isn’t mentioned in their slides is price, so let’s get that out of the way. The reference GTX 780 will come with an MSRP of $649.


GeForce Experience

Launching at the same time as the GTX 780, NVIDIA’s GeForce Experience is coming out of beta today.

Large Success in Beta

NVIDIA actually framed GeForce Experience in terms many people can understand. They always attempt to get “game ready” drivers out the day a new game releases, and they asked an important question: “What’s the first thing you do when you get a new game?” Do you go check your GPU manufacturer’s site for a new driver, or do you install it and play? You install it and play, of course!

GeForce Experience takes control of your drivers: it automatically lets you know when a new driver is available and downloads it for you on the day your shiny new game arrives. NVIDIA is trying to make PC gaming more accessible to the layperson. While many people reading this review would indeed check for the latest driver themselves, this software takes that worry off your shoulders.

NVIDIA touts its game ready drivers in this slide, which compares launch-day drivers with the ones available previously. With gains this solid, it’s worth having the newest driver when your next big game arrives.

Day-One Game Ready Drivers

Drivers aren’t the only thing NVIDIA is striving to have ready on day one. They also will make every effort to have optimization profiles pushed out on the day a game launches. These profiles will check your system for which GPU & CPU you have and automatically set your graphics quality to match your system. A lot of you might want to do that yourselves (I know I would), but if you want one click set it and forget it optimization, NVIDIA has the ability to do that. That they plan on doing so on release date is icing on the cake.

Optimized Gaming Profiles

Last, but not least, NVIDIA isn’t done with GeForce Experience yet. Later this year (late-summer ETA), they plan on introducing ShadowPlay, a new gameplay recording feature. This isn’t your typical FRAPS recording, where you start capturing before you play and hope to get something cool worth saving. ShadowPlay can be set to auto-record everything you do in real time, with a buffer of up to 20 minutes.

For instance, say you pull off the kill of your life. I’ve seen a video like this: a guy in the middle of a dogfight in BF3 jumps out of his plane, snipes the pilot he’s dogfighting, gets back IN his plane, and goes on about his business. He happened to be recording at the time, which is great; however, he had to sift through every bit of video he had to find the right spot.

ShadowPlay takes out the guesswork. It automatically and continuously records your gameplay, deleting the oldest footage as it goes. If you pull off that insane kill, you hit a hotkey and it instantly saves the last X minutes of gameplay to your drive for you to go back and watch later.
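That “delete as it goes” behavior is just a ring buffer. Here’s a minimal Python sketch of the concept (my own illustration, not NVIDIA’s implementation):

```python
from collections import deque

class RollingRecorder:
    """Toy sketch of a ShadowPlay-style rolling buffer: keep only the
    last `window_seconds` of footage, silently evicting the oldest."""

    def __init__(self, window_seconds=20 * 60, fps=60):
        self.frames = deque(maxlen=window_seconds * fps)

    def capture(self, frame):
        # When the deque is full, appending evicts the oldest frame.
        self.frames.append(frame)

    def save_clip(self):
        # Hotkey pressed: snapshot whatever is currently buffered.
        return list(self.frames)

# Record 30 minutes of 60 FPS "frames"; only the last 20 minutes survive.
rec = RollingRecorder(window_seconds=20 * 60, fps=60)
for i in range(30 * 60 * 60):
    rec.capture(i)
clip = rec.save_clip()
print(len(clip) // (60 * 60))  # 20 (minutes buffered)
```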

In addition, ShadowPlay doesn’t have anywhere near the overhead of FRAPS, which costs a good chunk of FPS while recording. ShadowPlay will work on any Kepler card, using the GPU’s built-in H.264 encoder to handle the heavy lifting. NVIDIA expects an overhead of 5% or less, so it should barely be noticeable.

Coming Later This Year – ShadowPlay

NVIDIA has done good things with GeForce Experience. If you have a GeForce card, the program is worth looking into; if you have (or get) a Kepler card, it’ll be worth the download just to check out ShadowPlay once it’s added.

Meet the NVIDIA GTX 780

With that out of the way, let’s meet the GTX 780. In what is perhaps the most poorly kept secret of the year, the GTX 780 does indeed look just like its big brother TITAN.

NVIDIA GeForce GTX 780


Thanks to our handy-dandy gallery, you don’t have to slog through a bunch of thumbnails anymore either.


Like the TITAN before it, the GTX 780 will require two (one 8-pin and one 6-pin) PCIe power connectors. The GTX 780 comes with a TDP of 250 W.

GTX 780 Power Connections

The standard reference NVIDIA video output configuration should look familiar to everyone by now. The GTX 780 is no exception and comes with four video outputs – two dual-link DVI (one DVI-D, one DVI-I), one HDMI and one DisplayPort.

Video Outputs

Sporting the same cooler, the GTX 780’s “GEFORCE GTX” text also glows a pleasant green when powered up. No word yet on whether EVGA’s software can control this one, but it should be able to.

NVIDIA GeForce GTX 780


So far so good. We loved the look of the TITAN, so you shouldn’t be surprised we’re fans of the GTX 780 as well. There will be plenty of partner-designed cards on the market soon, complete with custom coolers; but if you want a blower-style cooler, the reference model is a solid offering.

Under the Hood

The GTX 780 cooler is the same as that on the TITAN. It’s a very premium-looking (and premium-performing) cooler.

GTX 780 / TITAN Cooler

The cooler uses a vapor chamber connected to a nice long aluminum fin stack.

Vapor Chamber Cooler

The PCB should look familiar; it’s the same as the TITAN’s. The only significant differences are the vRAM (there is none on the back of the PCB this time) and the GPU itself, which drops from the TITAN’s 2,688 CUDA cores to 2,304 on the GTX 780.

GTX 780 PCB Front

GTX 780 Rear PCB

All 3 GB of vRAM reside on the GPU side of the PCB.

GK110 GPU and vRAM

The power section appears to have minor changes but keeps the same six-phase design. It’s plenty for everyday operation and moderate overclocking, but don’t go volt-modding one of these on the stock power plane. They’re not designed for it and you will blow something up (probably a MOSFET).

GTX 780 Power Section

Finally, we have the GK110 GPU itself.



One more gratuitous PCB photo and we’re done here (thanks to NVIDIA for supplying the card photos).

GTX 780 Glamour Shot

Test Setup

Like all our GPU reviews, our test bed is an Ivy Bridge-based system with an i7 3770K and RAM running at a reasonable DDR3-1866.

CPU: i7 3770K @ 4.0 GHz
Motherboard: ASUS Maximus V Extreme
RAM: Kingston HyperX Predator DDR3-2666 @ 1866 MHz 9-9-9-24
Comparison GPUs: MSI GTX 680 Lightning, HIS HD 7970 X Turbo*
OS: Windows 7 Professional x64

*Note: the HD 7970 X Turbo was the only GPU in our test setup running AMD’s newest driver when I graphed these results. It is an HD 7970 with strong stock clocks, operating at a 1180 MHz boost and 1500 MHz on the memory. Thus, you’re seeing some of the best AMD can offer vs. the GTX 780. The results may surprise you.

GTX 780 Installed

Installed, the GTX 780 looks just as nice as it did on its own.

NVIDIA GPU Boost 2.0

To reiterate, you should go back and check out our original description of GPU Boost 2.0 from our TITAN introduction. Once you’ve done that, come on back.

Back? OK, good. Now, this graph bears a bit of explanation. As you just saw, GPU Boost 2.0 bases its boost not only on TDP constraints (which you mostly see when overclocking), but also on temperature constraints. Once the GTX 780’s GPU reaches 80 °C – its default temperature target – it ramps up the fan and reduces the GPU frequency to hold the card at that target.

This is where NVIDIA’s silence starts to work against them. As you can see below, the max stock boost clock is a strong 1006 MHz. Looking at the stock Heaven Xtreme run though, you can see precisely where the GPU hits 80 °C and the frequency tanks. It then goes through continuous fluctuations while the card wavers between 79 °C and 80 °C.

However, if you use PrecisionX (or the GPU software of your choice, depending on whose GTX 780 you buy) to set a more aggressive fan profile, sacrificing a little of that silence to keep the GPU away from 80 °C, the frequency climbs to 1006 MHz and sticks there throughout the entire benchmark run.
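The throttling behavior is easy to reproduce in a toy simulation. This Python sketch is my own illustration (NOT NVIDIA’s actual controller, and the thermal constants are invented): run at max boost until the temperature target is hit, then shave roughly 13 MHz bins to hold the card there.

```python
def boost_sim(steps, temp_target=80, max_boost=1006, base_clock=863,
              ambient=60, bin_mhz=13):
    """Toy model of a temperature-target boost controller."""
    clock, temp = max_boost, float(ambient)
    trace = []
    for _ in range(steps):
        # Temperature relaxes toward an equilibrium set by the current clock.
        equilibrium = ambient + 0.25 * (clock - base_clock)
        temp += 0.3 * (equilibrium - temp)
        if temp >= temp_target:
            clock = max(base_clock, clock - bin_mhz)   # too hot: drop a bin
        elif temp < temp_target - 1:
            clock = min(max_boost, clock + bin_mhz)    # headroom: boost back
        trace.append((round(temp, 1), clock))
    return trace

trace = boost_sim(120)
assert trace[0][1] == 1006              # starts pinned at max boost...
assert any(c < 1006 for _, c in trace)  # ...then throttles at the target
```

In this model, turning up the fan amounts to lowering the equilibrium temperature for a given clock, which is why the real card can hold 1006 MHz for an entire run once you set a more aggressive fan profile.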

Stock GTX 780 Boost Clock

Thus, you have a choice to make. You can let NVIDIA maintain its near-complete silence (and it really is quiet!), or you can take control of your card’s fan profile and let it boost to the level it’s meant to. It doesn’t take a rocket scientist to figure out we prefer performance to silence at this site, but the choice is yours.


Overclocking

This card really surprised me with its overclocking capability. With a 24/7 overclock of +180 MHz on the GPU and +350 MHz on the memory (in Precision X; the actual GPU offset was +170 MHz per GPUz), the GTX 780 was very impressive, especially given the meager +38 mV NVIDIA allows us to toy with.

24/7 Overclock – Precision X – +180 MHz / +350 MHz

24/7 Overclock – GPUz – +180 MHz / +350 MHz

You haven’t seen any performance yet, but this solid overclock managed to score 13353 in 3DMark11 Performance, a very strong showing.

24/7 Overclock – 3DMark 11 – +180 MHz / +350 MHz

So, what does that overclock get us in actuality? Why, a very nice 1201.9 MHz. With the aggressive fan profile set, the GPU is allowed to stay up there for most of the run too.

24/7 Overclock – Boost Frequencies – +180 MHz / +350 MHz

Overclocking the GTX 780 was very impressive. If you could give it more voltage it would be even more so, but with a power section that isn’t strong enough for more, +38 mV is as much as you’re going to get.

What is very disappointing is that this +38 mV limit is here to stay, from this reference card all the way up to whatever cards partners produce (think MSI Lightning, ASUS TOP or EVGA Classified models). Even if those cards have a more robust power section, NVIDIA will only allow the same +38 mV.

They say this is for safety reasons: they want to give a little flexibility (+38 mV) and will let users potentially impact the long-term well-being of their cards, but they refuse to allow larger voltage increases that might jeopardize the short-term life of their GPUs.

While this is understandable, I’d love it if they would let the partners make that decision. With a stronger power section (such as EVGA’s ePower), we’ve seen great things come out of TITAN when you throw a bunch of voltage at it; without an external power section, TITAN dies faster than you can say burnt MOSFET. So it doesn’t seem like the GPUs are the problem, but rather NVIDIA’s weak reference power designs.

We are hoping against hope that board partners figure out a workaround to this limitation. If nothing else, design your cards with very robust power sections and give us solder points to easily and quickly attach volt mods. That wouldn’t violate NVIDIA’s restriction and would give us the choice of doing what we please.

All of that said, the GTX 780 has very strong overclocking results even without a large amount of voltage control.

Temperature and Power Consumption

When viewing the temperature graph, keep in mind this is not a referendum on the cooler’s capability. The GTX 780, like the TITAN before it, is made specifically to warm up to 80 °C and stick, using a combination of increased fan speed and frequency throttling. Seeing these go up to 80 °C is normal and intentional.

The cooler is capable of much better temperatures, but you have to compromise on silence. It’s still very quiet – quieter than any other blower-style heatsink I’ve heard – but it does make noise. If you’re willing to make that compromise, then (as in the Heaven Xtreme run above, where the card holds a consistent boost frequency throughout) temperatures stay in the low-to-mid 60 °C range, far from the 80 °C you see here.

GTX 780 Temperatures

Power consumption is right where it should be for a slightly cut-down TITAN, coming in 40 to 50 W lower and topping out at 353 W in 3DMark11.

GTX 780 Power Consumption

Cooling is right where it’s designed to be and power consumption isn’t bad for a powerful card with a 250 W TDP. Let’s start testing this thing!

Performance Results

All video cards we test are tested per our Video Card Testing Procedure. Long story short – benchmarks are run at their default settings and games are tested at 1080p with all settings turned to max.

Synthetic Benchmarks

Good old 3DMark03 has never been Kepler’s bag, and we don’t see that changing here. We continue to include it because it’s valuable to benchmarkers, but there is really no correlation to modern gaming, so don’t read too much into this result.



3DMark Vantage, on the other hand, is a good place to start for meaningful results. The GTX 780 starts off strong, with an overclocked result that beats the TITAN and takes out the HD 7970 X Turbo.

3DMark Vantage

3DMark11 shows much the same result. If the GTX 780 keeps this up, we might not only have an HD 7970 GHz-beater, but a strong alternative to TITAN.

3DMark 11

Wow, I’m not sure they cut TITAN down enough, what do you think?

3DMark: Fire Strike

Another bench, another win. Things are looking good for this card!

HWBot Heaven

Game Testing

Starting off with Aliens vs. Predator, the GTX 780 continues its trend of beating the HD 7970, and its overclocked result takes out the stock TITAN. This is turning out to be a very strong card.

Aliens vs. Predator DirectX 11 Benchmark

More of the same with Batman: Arkham City.

Batman: Arkham City

Here the HD 7970 X Turbo actually squeezes out a win over the stock GTX 780, barely. When it comes to Battlefield 3, the GTX 780 has some competition.

Battlefield 3

The stock GTX 780 actually beats its big brother TITAN in Civilization V.

Civilization V

Dirt 3 shows the TITAN coming out on top of even the overclocked GTX 780, though the two are very close. The HD 7970 lags behind by a little at stock, then by a lot once you overclock the GTX 780.

Dirt 3

Metro 2033 continues to be one of the toughest games in our repertoire; only Crysis 3 is harder on GPUs, and I’ve started recording those results as well. Here the TITAN is the clear winner, with the GTX 780 an easy second and the HD 7970 trailing behind.

Metro 2033

Well, it’s pretty clear – at 1080p, the GTX 780 is top of the heap. Only the TITAN comes out ahead – and even then the stock TITAN often loses to the overclocked GTX 780. I’m very glad they didn’t cut it down more than they did, because that’s great news for consumers who want a more affordable option.

NVIDIA Surround Testing

Surround testing takes all the games above (except Civilization V, which isn’t cooperating) and runs them across a triple-monitor setup in an NVIDIA Surround / AMD Eyefinity configuration. This is where the rubber really meets the road: if your GPU isn’t strong enough, it will crash and burn here.


The Surround results do a great job of mirroring all of the games above – the GTX 780 spends most of its time out-performing the HD 7970 and, when overclocked, beating out the stock TITAN. Nothing to complain about here.

NVIDIA Surround / AMD Eyefinity Testing

With such high resolutions in play, this is as good a time as any to go over the GTX 780’s frame buffer. As mentioned, the GTX 780 comes with a 3 GB frame buffer on a 384-bit bus, operating at 1,502 MHz. People have complained since the beginning of time that 2 GB this or 3 GB that just isn’t enough. In the case of the GTX 680’s 2 GB, they may well have been right; however, NVIDIA thinks it has hit the sweet spot at 3 GB.

To those who say 3 GB isn’t enough, NVIDIA answers that, in its testing, it is enough for every game on the market today and every upcoming title it knows about. If you are adamant that 3 GB just isn’t enough for you, we’ve got some bad news: there will be no GTX 780s on the market – aftermarket or reference – with a frame buffer larger than 3 GB, period. If that is extremely important to you (and based on the results above, it really shouldn’t be), NVIDIA would point you to TITAN.

Pushing the Limits

There isn’t much to this section – because of a certain other review, the water loop was in use elsewhere. The cooler used was an Intel cooler, and you know how great those are for overclocking. Thus, I picked the most GPU dependent benchmark (3DMark Fire Strike) and went to see how far the GTX 780 would go…and go it did, up to +190 MHz CPU offset and a whopping +650 MHz (in PrecisionX, +325 MHz actual) on the vMEM, giving us a score of very close to 10,000 for 3DMark Fire Strike Performance.

3DMark: Fire Strike – +190 / +650 – 9962

I bet it would have topped 10,000 if I could have overclocked the CPU a bit. This is a very strong showing, and we can expect good things from those willing to hard-mod their cards (either with external power or by modding partner cards that have beefed-up power sections).

Final Thoughts & Conclusion

Based on the venerable TITAN’s GK110 GPU, the GTX 780 is a very solid offering to lead the way into the new GTX 700 series. The performance is just great, absolutely nothing to complain about and plenty to celebrate. The GTX 780 is an absolute beast.

Frankly, I also like the reference cooler. They had a winning design and stuck with it, no complaints there either. For better or worse though, we probably won’t see a great many of those on the market. Partners are rushing to get their custom designs out the door and the majority of them will have different cooling. At least one partner I’ve spoken with said they will make reference cards, but they will be a very limited run; the majority of the production effort will be put into their custom designs, which is understandable. They need to be able to separate themselves from the competition and everyone is going to be trying to do that. You can expect the first of these very shortly and more will roll out as the weeks go on.

On the software side, GeForce Experience is a neat tool and should do what it professes: make PC gaming more accessible. Keeping people who may not be keen on watching for driver updates current with the latest drivers is no small thing. For NVIDIA, it will at least keep those users happy with the performance of their cards rather than complaining about poor performance the day a new game releases. The profiles will be perfect for those same people too. What has me (and probably a good number of our audience) excited is the upcoming ShadowPlay feature. That will be a …wait for it… game changer. Har har.

Back to the GTX 780, let’s talk a little bit about the price – $649 MSRP. It’s a little steep for me, but that’s what you can do when you’ve got the best. AMD did it with the HD 7970 when it launched and it’s tough to blame NVIDIA for doing it now. Until AMD has an answer for GK110, I fear we’re stuck with a high premium for both this card and TITAN.

In the TITAN review, I said that was an $800 card with a $200 price premium for being the best GPU on the planet. The GTX 780 is similar, but on a lesser scale. This is a $600 card with a $50 premium for being very near the best. It’s regrettable but not unexpected; that’s the way the market works.

Taking TITAN’s current price into consideration, they’re putting the GTX 780 where it should be, offering significant savings ($350 is nothing to sneeze at) for a card that can overclock far enough to get to TITAN’s performance level. So, is $649 a fair price? It’s very close, if not quite where I’d like to see it. $599 would look better when looking at purchasing one of these.

With that, I think we’re done for today. What has the GTX 780 got to offer? Stellar performance? Check. Quiet, effective cooling? Yep, it has that too. It’s even power-efficient for its performance level. Yes folks, NVIDIA has taken the top single-GPU spot and now offers a more affordable option with very nearly the same performance. AMD has their work cut out for them.


Jeremy Vaughan (hokiealumnus)



Janus67's Avatar

reading through it now.

First of all, excellent review (as always), Jeremy.

Secondly, very impressive for a single card. The only issue I can think of is that it is approximately the price of two 7970s and will be outperformed by those in most circumstances.
DarrenP's Avatar
Well, I was wrong... about the shopped comment... xD

After reading: **** son, that's one hell of a card...
briansun1's Avatar
great write up... well now it seems like I have to go over to the other side...
Nebulous's Avatar
Dayum that's wicked noice! Sweet writeup as usual Hokie!
Nicely done Hokie!

Very intrigued by ShadowPlay.

Also, that thing is a BEAST! Can't wait to see factory OC versions coming out.
hokiealumnus's Avatar
Thanks guys!

Oh, I forgot to mention in the article - NVIDIA expects plenty of availability to meet demand. To wit, there are nine cards listed on Newegg right now. They are determined to keep this from being another GTX 680 (which nobody could get).
EarthDog's Avatar
Well done per usual, and what a BEAST!
Soulcatcher668's Avatar
Very nice write up.

So you did have one burning up a bench........ you sneaky man you.
Xeon_KI's Avatar
excellent review.
I think the 780 will be very popular
JR23's Avatar
Great review, right now, I am excited!

I guess you'll find me by the window waiting for the post man.

KurtBlanken's Avatar
Shadowplay is looking like a really nice feature. The reduced overhead will make Nvidia the go-to card for anyone who records themselves gaming, Let's Players and the like.
Bobnova's Avatar
Why is Nvidia comparing it to a GTX580? That's like, ancient.
70% higher performance (in theory...) and 225% higher price

I was hoping to see some full retail board photos.
Leegit's Avatar
They should have added in SLI 670s/680s to those comparisons :/
Noshei's Avatar
Have you guys gotten any word from EVGA on a time frame for the Hydro Copper?
Leegit's Avatar
What's so great about Hydro Copper? I'm n00b.

Side Note...

EVGA ACX Cooler 03G-P4-2784-KR GeForce GTX 780 3GB 384-bit GDDR5 PCI Express 3.0 SLI Support Video Card

Source: http://www.newegg.ca/Product/Product...C-C8junctionCA

Sold out already... that was fast :P
hokiealumnus's Avatar
Negative Noshei, no word on Hydro Copper.

That said, TITAN blocks should still work on the GTX 780, it's the same PCB with a few minor changes that don't affect cooler mounting holes/GPU/vRAM positioning.
Brolloks's Avatar
Stellar review Jeremy What a beast of a card
Looking at the benchmarks, I notice the 7970 still gives you the best performance for the money; you can get one new for 60% of the price of the GTX 780 (a used one for even less). Hard to pass that up if you don't mind not having the latest and greatest on the shelf.

Any idea when the GTX 770 will make its appearance?
EarthDog's Avatar

That is all.
Noshei's Avatar
Yeah, I have plenty of time to wait though. Planning my new build to be a 4xxx series CPU so gotta wait on that to come out anyways.
Janus67's Avatar
So because of the voltage limitations, I wonder what the point of the 'higher-end' units will be, aside from the aftermarket coolers.
hokiealumnus's Avatar
Heh, so you can volt-mod them and run them how you want?
Janus67's Avatar
Hopefully they can make a 'voltage unlocked' switch that comes off by default. I don't feel quite confident enough to do it on a $600+ piece of hardware.
Brolloks's Avatar
Looking forward to your review Joe
raiku's Avatar
Welp time to throw away my 7950 for this :3
Darknecron's Avatar
I had been planning on buying one of these since the fall of last year. I ended up breaking down and getting an HD 7970 for $400 (with $120 of free games) instead. No regrets whatsoever...spent the extra on a set of Cambridge S30s (great speakers)!

Looks like my card @ 1200/1550 would compete with the 780 @ stock (or get close to).

Great review!
Culbrelai's Avatar
LOL... you guys make me laugh
KurtBlanken's Avatar
I think having the HIS 7970 X Turbo in there is a little misleading; I would have liked to see a reference 7970 GHz Edition used instead.

A Hydro Copper is a card that comes with a full-coverage water block installed instead of an air cooler. That way you can add the card to your water loop right away and don't have to worry about voiding the warranty. It reduces risk on your part.
hokiealumnus's Avatar
You and me both. We can thank AMD for that; we never got a 7970 GHz Edition (and, frankly, we didn't push for one because it wasn't exactly groundbreaking...). I certainly didn't try to hide anything or mislead anyone, and I put an italicized note in the test setup section.

Boulard83's Avatar
Thanks for another good review!
neilpeart's Avatar
Great review. I wonder which card is better: the GTX 690 or the GTX 780.
hokiealumnus's Avatar
For FPS, the GTX 690 mostly. I would prefer the stability, lower power consumption and headache-free operation of a single GPU card though. With the performance of this card and $350 savings, I'd go with the GTX 780.
Janus67's Avatar
690 will be faster I would imagine in most circumstances, but for the price (albeit $650) the 780 is a better buy.
Boulard83's Avatar
Here are some thermal shots from hardware.fr.

EarthDog's Avatar
Agree. I have a 690 and will be moving to a 780. I think this is the perfect single card for a 1440p screen. It has the memory capacity (3GB) needed for that res, as well as the bandwidth (384-bit), and you can save ~$350 over a Titan or 690.
Xeon_KI's Avatar
+2, but I'm keeping my 7970s
EarthDog's Avatar
I'd like to add that, at least in BF3 at 1440p on a single 7970 or 680 (latest drivers), I wasn't able to run full tilt without frequent dips below 30 FPS in firefights unless I changed settings (HBAO to SSAO and HIGH, leaving 4x MSAA enabled).
PolePosition's Avatar
Looks like the ASUS ARES II smokes all the competition.
EarthDog's Avatar
For $1,500 it better...! Oh wait, the $900 Sapphire 7990 matches it, and so do two regular 7970s in CFX for $800 total. That ARES, LOL... it's lucky there were only, what, 1,000 of them made, or I would be laughing even harder at the price point.
txus.palacios's Avatar
As usual, Hokie: great review!

These new-generation cards do look really nice. If the 770 / 760 Ti isn't super expensive, I might consider dropping my SLI for one of those.
doz's Avatar
When are the cards going to be released at all retailers? Amazon is showing OOS for everything. I was hoping to grab one from Fry's locally but they aren't even listing the card. Newegg has everything in stock though??
hokiealumnus's Avatar
Amazon has had them off and on I think. Microcenter has them according to someone's photo of a purchased one on OCN. EVGA has (or had?) them in their store too. Tiger Direct also had several of them.
Culbrelai's Avatar

Didn't they say they'd be $800?

Still too much money for only 3GB of VRAM. I can't wait for GTX 6xx-series prices to go down; since 2x GTX 670 > Titan, it must also be true that 2x GTX 670 > 780.
EarthDog's Avatar
$649 is MSRP (in the review...).

2 670's also cost MORE than a 780...
Culbrelai's Avatar
Yes, obviously, if you're building a new system from scratch then the 780 is the better deal; I'm just speaking of people who already have 6xx-series cards.
Theocnoob's Avatar
Although I feel $649 is fair from the standpoint of the card's performance, I wish they'd put it somewhere closer to $499. That would make it more affordable to a wider range of people. I have concerns for the rest of the 7XX lineup in terms of pricing. It looks like Nvidia is going to jack up the going rate on the whole stack, based on what they're charging for the 780...
mokrunka's Avatar
Great review buddy. Really enjoyed reading through it
Xeon_KI's Avatar
You could also sell your 6xx card. Money is money, whether it's invested now or later.
I doubt the 6xx series goes down much either; more likely it just goes away. IMO
ivanlabrie's Avatar
Well, it seems like most Titan owners will be pretty pissed off (at least the ones who aren't using the 6GB VRAM buffer extensively, that is)

MSRP is relatively decent considering how greedy Nvidia has been lately...
I like it, but I can't justify something like this for benching, especially since I would have to hard mod the card extensively to make it go high, and a 7970 pays for itself over time.

Nice review Hokie!
wingman99's Avatar
I don't think it's a fair price; I think they're just gouging because they can.
Xeon_KI's Avatar
That, and:
-they don't want to risk alienating the Titan crowd
-they don't want to lower 6xx card prices while they EOL them

Until AMD brings something new and releases their frame pacing drivers, nVidia will continue to milk.

My fear for those that plunge into a Titan or 780 is whether they will remain competitive against Volcanic Islands and Maxwell when the time comes.
manu2b's Avatar
Pfff... don't know if it's worth the double $ tag for a 20% increase in perf. I paid €300 for my 7970, and it plays anything maxed out @ 1080p if you give it a 20-30% OC.
EarthDog's Avatar
This card, imo, isn't made for 1080p. It just feels like a perfect fit at 1440 resolutions where a 680/7970 tends to struggle a bit in some titles.
manu2b's Avatar
Yep, agreed on that.
trueblack's Avatar
I would like to see AMD's answer to that. Rumors say not until 2014 Q1; I hope not...

Would the GTX 670/680 be made cheaper?
If that is the case, that would be quite smart of Nvidia, to take more market share..
ivanlabrie's Avatar
I'm anxious to see the new Radeons, but not a refresh... What about the Volcanic Islands cards that had ARM cores inside??
SF101's Avatar
Having an impossible time trying to justify this upgrade over my 24/7 1285 core / 1950 mem OC'ed 7970.

I wouldn't be surprised if, at stock, it was slower than my card runs now.

If I did buy it, it would just be so I can upgrade my guest PC from a GTX 480, which also runs OK.

Thought the 780 would be better than this; might just wait for the 8xxx cards.
trueblack's Avatar

but by then, GTX 880 be out.
SF101's Avatar
maybe it will be worth the $ then black.
And I doubt it.. maybe 6 months to a year after the 8xxx release, yes, but not in 6 months from now.
M33Cat's Avatar
So it almost beats 7950 CF, but it's no doubt way smoother. Come on AMD, get that driver fix for CF out to help me
doz's Avatar
I'd almost bet money on the 8xxx appearing by the end of August, if not sooner.
SF101's Avatar
Should be an Oct 31st release called the 8999 "hell razer"
Leegit's Avatar
+1. They will have to counter this offering from Nvidia or they will be out on the streets. New customers coming into the market most likely will not buy 2x 7970s. A lot more fun to come

I have a hard decision to make in July. To 700 series or to not
SF101's Avatar
I dunno if "out on the streets" is a realistic thing. Nvidia launched like 6+ months after the 7970, etc.
wingman99's Avatar
I think you are overreacting. I have not seen anything go wrong with AMD or Nvidia from not having their new cards out, in the past or present.
trueblack's Avatar
i wager you otherwise.

meet AMD GPU roadmap

AMD is under the illusion that they are still on top for most of this year:

Which is funny, because except for benchmarks (which AMD relies on), my GTX 670 with mod BIOS gives 7970 FPS, if not better, in some games.
Not to mention the GTX 680 or the Titan.
Leegit's Avatar
Notice there is no Q4 on that roadmap. If they come out in Q4 then no problem. But any later than Xmas 2013 and I think there are gonna be some probs. They're pretty low on cash compared to Nvidia... which IMO is taking market share.
ivanlabrie's Avatar
AMD GPU sales are fine; lots of coin miners buy 'em, same as guys with smaller budgets.
Leegit's Avatar
I didn't literally mean out on the streets. Lol. I just meant they're going to be losing market share, which is never good when you are #2 and strapped for cash.

I'm a greenie fanboy
Culbrelai's Avatar
I really hope so, I've been itchin for a second 670. And with that I should be good on graphics for a while...
trueblack's Avatar
Probably because those that didn't buy the 680 are still on the 580..
So for 580 holders, comparing to the 580 makes more sense.

And I think the price is just around 125% higher? (225% of the original price means 125% higher; 225% higher would imply 325% of the original price.)
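That multiplier-vs-increase arithmetic is easy to mix up, so here's a quick sanity check (plain Python, purely illustrative):

```python
def pct_higher(multiplier):
    # "X% of the original price", expressed as "Y% higher than original".
    return (multiplier - 1.0) * 100.0

def price_multiplier(pct):
    # "Y% higher than original", expressed as a multiple of the original.
    return 1.0 + pct / 100.0

print(pct_higher(2.25))       # 225% of original -> 125.0 (% higher)
print(price_multiplier(225))  # 225% higher -> 3.25x the original price
```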
Soulcatcher668's Avatar
Huh? Let's not portray this as something it is not.

The only reason that Nvidia compared it to the 580 is because it makes the numbers look better.

I really don't understand this corporate loyalty thing. You owe these companies nothing. You will do better for yourself and others if you look at everything these companies say critically.

This goes for all companies.


This launch is very similar to the last series, just reversed. Last time AMD had no competition for months. I am interested to see what the next gen cards are like.

Historically, Nvidia's top cards have not been their best value cards. I want to see how the 'new' 770 stacks up.
DaaG's Avatar
I had read somewhere that AMD's Volcanic Islands cards were coming along better than expected and would probably be out in Q4.
Ah, nice.

It really did; makes me wonder if they've already got them done and are waiting to spring a release lol.

Hokie, you don't have an 8970 sitting beside your 780 do you?
DaaG's Avatar
Found THIS

With THIS as their source

I remember the interview with the corporate guy stating they were ahead of schedule on them.
That Rage3D link was a good read!
Xeon_KI's Avatar
Some? like 1?
DaaG's Avatar
I want to pull the trigger on one of these so bad, but I'm only at 1920x1200 res, single monitor.
EarthDog's Avatar
... How quickly we tend to forget that the 7970 overclocks like a banshee itself... did I mention it does so without taking the big risk of flashing your GPU's BIOS?

770 has your name on it.
bluezero5's Avatar
I think the 770 will be based on GK104, not GK110, though.

I wonder how it will look.

I'm on a 680, so the 780 is not for me yet..
Will wait for the next gen.
DarrenP's Avatar
The GeForce GTX 770 will be based on the GK104-425 architecture.
The GTX 760 will be based on the GK104-225 architecture.
hokiealumnus's Avatar
No 8970 here. AMD has said, consistently, that the 7900 series will remain "stable" in the top end throughout 2013. To me, that means we won't see any new architecture until at least Q4, and if then it will probably be a paper launch while they spool up production at TSMC enough to get stock out on the market, which will be mid-Q1'ish. It will probably be just like the HD 7970, but two years later....late December NDA lift on reviews, showing off at CES and mid-January availability.

I had this slide but can't find it, so this one is courtesy Anandtech.

Hopefully Sea Islands' 8970, or whatever they call it, will be another HD 7970. When AMD came out with that, it trounced NVIDIA's offering. Only with competition will both companies continue to improve. Competition also drives prices down, which is great for consumers.

I don't care which company is the best in performance at any given time, I'm just here to report on which one that is. As long as they're competing (and competitive!) with each other, that's what's important.
SF101's Avatar
From the rumors I hear, it should be 50% faster than a Titan, but those are just rumored specs; could be total BS.

That said, if they already have the card in the works to the point of leaked specs, I hope we see one at least by the last quarter of the year. That would be a nice surprise.
wingman99's Avatar
I don't think Nvidia got the news that they have competition and they're supposed to lower their prices. GTX 780: $650
Culbrelai's Avatar
Still waiting for lower prices on the 6xx...

Comon nVidia.
SF101's Avatar
Hardly ever happens this early, Culbrelai
Culbrelai's Avatar
More like hardly ever happens at all, period, with nVidia; they're pulling an Intel.

990x's are still $1000 =P
bluezero5's Avatar
I actually think they will drop the GTX 680 price soon to sub-7970 levels.

Like 400-425. It would be a brilliant business strategy.
wingman99's Avatar
Why would the stores do that when they can sell them at full price till they sell out?
bluezero5's Avatar
For now, yes. A few months down the road, the story might be different.
ivanlabrie's Avatar
Look at GTX 580 pricing... retail.
wingman99's Avatar
Where are they in stock?
ivanlabrie's Avatar
EVGA had some, Newegg and Amazon too. Not long ago at least, and they were pretty expensive.
Bobnova's Avatar
They won't lower the prices until AMD forces them to. The GTX780 price tag is a pretty clear indication of that.
Darknecron's Avatar
Well, the HD 7970 outperforms the GTX 680...and the GTX's prices don't seem to be budging that much. :/
bluezero5's Avatar
Depends how you define "outperforms."
On benchmarks and stock BIOS, maybe.

For gaming, it really depends on the game title.. very mixed.
Darknecron's Avatar
There was a set of benchmarks that was done earlier this year, using up-to-date drivers for both cards. It placed the HD 7970 at or above the performance of the GTX 680 in 80% of games.

If I find it again, I will post a link here. Until then, disregard this, because I may be completely mistaken. :P
EarthDog's Avatar
I think they are well aware of it and priced it accordingly. With the impending(?) release of the 770, I would imagine its price to be in the ballpark of the current 680's pricing (which hasn't changed since launch); perhaps then the 680's pricing will drop.
trueblack's Avatar


7970 GHz and GTX 680 at stock: basically the same result.
Like someone else mentioned, it depends which game you are playing.

GTX 680 with BIOS mod vs. 7970 GHz OC: also about the same.
But that's on reported FPS.

General choice:
If you're a gamer, Nvidia is better: faster driver updates, better with recent games, much less microstutter, and smoother gameplay.

If you're a bencher, AMD is better; the scores are higher.

Btw, the 7970 GHz is painful to the ears at 61.5 decibels; the same noise test at full load puts the GTX 680 around 51 dB. Maybe that's a worthy comparison too.

Back to the 780.. quieter, faster, but the price is high...
I think the 770 will be priced near $550, with performance above the 7970 and 680 for sure.
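Side note on those noise numbers: decibels are logarithmic, so a 61.5 dB vs 51 dB gap is bigger than it looks. A rough sketch of the standard 10·log10 power rule (plain Python; the "twice as loud per 10 dB" part is only a common psychoacoustic rule of thumb):

```python
def power_ratio(db_diff):
    # Sound power ratio implied by a decibel difference (10 * log10 scale).
    return 10 ** (db_diff / 10)

# 61.5 dB vs 51 dB: roughly 11x the sound power,
# and (rule of thumb) roughly twice as loud to the ear.
print(round(power_ratio(61.5 - 51.0), 1))
```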
wingman99's Avatar
I think they priced the Titan and GTX 780 according to how much they can milk Nvidia buyers.
Xeon_KI's Avatar
The Anandtech Bench is useful due to its sheer scope and the range of items you can compare, but this is also its weakness, as they can't possibly keep something so vast up to date.

A good rule of thumb is always use the most recent reviews/benchmarks to make comparisons... Which makes the latest GTX 780 reviews ideal and quite appropriate given the thread topic.

That being said, Tom's, Anand, Guru, and Canucks all have the 7970 GHz on top (I left off Overclockers due to the OC'ed 7970 used).
And that's not even mentioning the free games bundle, a cheaper price tag, and 3GB of VRAM.

I mean, even if you don't want the free games and you only have a single 1080p monitor, it's a toss-up at best.
I'm not sure how you can give the 680 such a clear nod as the gamer's choice.
Darknecron's Avatar

Those benchmarks were done WAAAAAAAAAAAY back in 2012, and no longer adequately depict the performance of AMD cards (since their drivers take so long to mature).

The 7970 is the quietest card I have ever owned (my old cards: 9800 GT, GTX 470). I can't hear it unless I turn the fan up past 75%, and it never gets there with the clocks in my sig. Temps never pass 70 °C.

trueblack's Avatar
cool, you guys are AMD fanboys, and that's cool.
Go ahead and select the reviews that put 7970 ahead, I won't stop you.

The only part of your statement that is 100% true is that the 7970 is cheaper than the 680; the GHz version is more expensive, though.

I am here to tell you, we have played with both cards in a variety of setups, and with the Kepler Gold BIOS, the GTX 680 easily gets 7970 GHz performance at a lower cost.

You are free to believe what you want.
But my recommendation stands if you're asking me about the $400-$500 budget range.

I don't even need to bring in game driver update speed or anything here for this argument, which, if I do, makes the Nvidia stuff look even better.
Darknecron's Avatar
Just saying....my past GPUs:

1. ATI Rage Pro 16mb
2. Nvidia FX 5500
3. Nvidia 9800 GT
4. Nvidia GTX 470
5. AMD HD 7970

I don't think I'm a fanboy; just stating what I perceive.
trueblack's Avatar
of course, all in good spirit.
Our different views are what make this forum valuable to readers.

Since you listed yours, let me do mine: (and thanks for sharing)

1, GT 220
2, ATI HD 6570
3, GTX 460
4, ATI HD 6950
5, GTX 670

So I basically jump every generation myself too..
I will admit, on my first 3 GPUs, I had no idea what I was doing.

My experience with the 6950 was very bad, and the struggle with Catalyst 11 was just a terrible time for me. Not to mention that though the 6950 was faster, the color/smoothness felt worse than my previous card.. since then I have been Nvidia biased. Switching to the GTX 670, and now to SLI, made me the happiest person alive..

anyway, I won't be going GTX 780, out of my price range.. my hopes.. will be like.. GTX 860 in 2 years time.
EarthDog's Avatar
lol, so quick to call people names...for shame!

The 680 and 7970 seem to trade blows. Pricing, on the other hand, lands squarely in AMD's corner. You can EASILY find GHz edition 7970s for $50+ less than a reference 680.

Drivers for single card are fine with regular updates and CAPS helping things along.

Being a reviewer here gives me the chance to play with a lot of cards, and a lot of drivers from both camps. I can say unequivocally that I haven't had issues with either in my experience. That doesn't mean either camp is perfect, especially AMD with the CFX issues that plague some, but IMO both camps are incredibly solid.

Perhaps, black, go check out Newegg and verify pricing.

Edit: I see prices have JUST recently come down a bit on the 680, and some of the cheapest are now $450 (reference model), while you can get a GHz edition, non-reference, plus 4 games for the same price. Honestly, at that point I'd still go for a 7970 for the VRAM and bandwidth, not to mention the games and a better cooler... Perhaps that is just me though.
Xeon_KI's Avatar
Black has admitted he favors nVidia in the past.
He also thinks a 670 can match a 7970.
The 680 is smoother! But I'm the fanboy, right
bluezero5's Avatar
yay~~~~ i was right.

My local store's GTX 680 dropped by $30-$40 too,
though I feel this is probably just the store premium dropping.

And the 7970 and 680 trade blows; there's no clear winner while each company claims theirs is better, as they should! Let's just agree they are roughly the same?
as seen in this example: http://hexus.net/tech/reviews/graphi...d-acx/?page=11

On using KGB... yes, both the 680 and 670 get a boost that makes them a fair bit better than stock.. the 670 can perform near a 680, and the 680 gets the edge over the 7970 GHz because of it (talking DirectX 11 game titles only in this statement).
But in 3D benches the 7970 still reigns, so even then there's no clear winner. Also, in OpenCL and Bitcoin mining the 7970 GHz is slightly better.. so.. you really can't call which is better in a generalized sense.


Now, the GTX 780 is a FREAKING monster.
And I have read that there will be a Titan on steroids coming out later this year; stay tuned!
Xeon_KI's Avatar
I've been looking, but I can't find any benches with the 680 using KGB against an overclocked 7970... can you link?
bluezero5's Avatar
If you're speaking of benching items, there are none; the 7970 GHz wins hands down. We on the benching team can tell you that much. Go to hwbot.org and see for yourself; this is what I meant by "7970 GHz wins in benching"

In games, though, it is different, as the driver probably has a lot to do with it, and I find NVDA drivers more optimal. I cannot find an example with the KGB mod, but here's a stock comparison of the GTX 680 with the 7970 GHz. In this example the GTX 680 beats the 7970 GHz -at stock-.. so when OC'ed and KGB applied.. you do the math. And I have seen people in many forums claiming a GTX 670 on KGB approaches 7970 GHz performance too. (I haven't tested it.. but I do not think it is impossible either.. driver optimization for games.. can make a very large difference.)


This is what I mean by "In games, I find NVDA GPUs better"; this might be related to GPU-CPU sync or just flat-out driver optimization, I do not know. But it is quite clear that the GTX 680 can easily go head to head with the 7970 GHz in most recent games.

Like I said, both cards have their perks; saying either is strictly better than the other is not an accurate way of describing the results at all.

EDIT: I found a good comparison page for everyone: http://www.techpowerup.com/reviews/ASUS/ARES_II/27.html
This was done in January and includes everything we want to compare in the talks above.

On OC: the 7970 GHz can probably get up to 1360 MHz without crashing, and the GTX 680 can get to 1306 MHz (on KGB), both of which are around the same ~10% boost over a stock OC.
So if we use this comparison and extrapolate the results, I will say the GTX 680 and 7970 GHz are really about the same.
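If anyone wants to check the "same boost" claim themselves, the relative gain is simple to compute. A small sketch (plain Python; the baseline clocks below are hypothetical placeholders for an everyday OC, not the cards' actual stock clocks):

```python
def oc_gain_pct(baseline_mhz, oc_mhz):
    # Percentage gain of an overclock over a chosen baseline clock.
    return (oc_mhz / baseline_mhz - 1.0) * 100.0

# Hypothetical everyday-OC baselines, for illustration only:
print(round(oc_gain_pct(1250, 1360), 1))  # 7970 GHz pushed to 1360 MHz
print(round(oc_gain_pct(1200, 1306), 1))  # GTX 680 (KGB) pushed to 1306 MHz
```

With those baselines, both land around the same single-digit percentage, which is the shape of the argument above.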
Xeon_KI's Avatar
This is one bench of one game at one resolution and a difference of one frame... really? This is the proof of the 680's superiority in games? One frame is well within the standard margin of error.
Sleeping Dogs, Dirt, Hitman, and Tomb Raider all favor the 7970 by a large gap.
Battlefield 3, Crysis 3, Far Cry 3, Metro, and BioShock are basically a toss-up, as some benches have the 7970 ahead and others have the 680 ahead, and both are within a frame or two.
Blizzard games, like WoW and StarCraft 2, favor nVidia hardware, so the 680 is king in those games.
Here are the benches; go through them

When I say Bench, I don't mean synthetics, I mean all which includes games.
bluezero5's Avatar
yup yup, read all those, only selected one.

the performance link I posted was pretty good though.


as it takes everything into account for a general purpose comparison.
Xeon_KI's Avatar
OK... then technically you should be using this version, as it is the most updated,
and @ 1200 they are both 83% of the 780, and the 7970 is slightly higher @ 1600.
So again, how is the 680 better for gaming?

I'm not the one saying the 7970 is the clear choice for gaming; I'm only asking why you guys believe the 680 is.
Darknecron's Avatar
I had originally intended on waiting for a GTX 780, but I got impatient. Also, when I saw that the HD 7970 came with $120 of games I was going to buy anyways, it was a no-brainer. Turns out I spent $370 less than I would have on a 780 (less $250 from GPU, less $120 from games)....and I spent the extra cash on some really nice bookshelf speakers.

EDIT: Someone needs to make a stickied thread that contains a list of all large graphics card shootouts and comparisons (along with the dates they were done and the driver versions used, if possible).
bluezero5's Avatar
Hmmm.. I read the link you showed differently

Overall, in comparison to the GTX 780:
GTX 680: 84%; 7970 GE: 83%
At all resolutions except 2560x1600, GTX 680 > 7970 GE
(and I don't know anyone that uses 2560x1600 res for gaming either.)

So your link is quite supportive of the fact that the GTX 680 is about the same as the 7970 GE and in fact outperforms it at most gaming resolutions. But that's not even my original point...

I have found AMD to be much slower in updating their drivers for new releases. I had this talk with someone else a while back, and in the end we went through the update frequencies on the official sites to prove that Nvidia puts out a lot more beta drivers for interim game releases (please don't make me count again). And NVDA is now releasing GeForce Experience, which auto-updates to the latest game driver to make it even easier for the user.. these are the things that make NVDA better 'for gaming' for me.

If we want to bring up SLI and CFX.. then we have another issue there too.. a sensitive word on this forum.. "microstutter", which is also an issue in gaming for some folks... so with all this combined.. don't you think it is common sense that the NVDA GTX 680 is better for gaming than the 7970 GE? Not to mention, the GTX 680 is a tad bit cheaper than the 7970 GE now...

If you think otherwise, I would like to hear your logic as well, do share.
Darknecron's Avatar
Less than 1080p: GTX 680 > HD 7970 GHZ
1080p/1200p: GTX 680 = HD 7970 GHZ
Over 1200p: GTX 680 < HD 7970 GHZ
Drivers: Nvidia >>> AMD
Multi-card: SLI >>> Crossfire
Price: GTX 680 => HD 7970 GHZ
Xeon_KI's Avatar
Who games @ less than 1080p, and of those, who owns a 680 or 7970?
Even still, a 1% advantage from 1 site is not a win...

What does it matter if nVidia pushes out 10x the beta drivers? Can you show me where these beta drivers provide a huge boost in new releases over AMD not putting out beta drivers?

The only thing nVidia has over AMD at the moment is multicard support.
This conversation was about single cards, nowhere has anyone mentioned SLI or crossfire.

You are splitting hairs now if the crux of your argument is driver support and GeForce Experience.
The 7970 is every bit as great for a gamer as the 680 (at any resolution), even before factoring in tangible things like 3GB of VRAM (future proofing, and better at higher resolutions) and an awesome games bundle.

I'm done, sorry for taking this WAY off topic... we should be drooling over the 780!
bluezero5's Avatar
..... I am not here to argue; I hope someone else will take my spot here... but since you asked, your link just showed the 7970 only better at 2560 resolutions in gaming. Not sure why you are discounting everything else where it isn't as good. However, I will just call them equal in performance.

A 1% advantage is not a clear win for the GTX 680; I keep saying they are about the same in performance. No one is challenging you there.. so be cool. We are establishing the fact GTX 680 <=> 7970 GE here only. Both cards are GREAT for gaming, and personal preference can easily sway the decision either way.

As for the drivers, just search their sites; I think you know how. More important is 'how fast' each company optimizes their drivers for new releases. You can Google that too; remember to include AMD CAPs and GeForce beta drivers to ensure it is a fair assessment.

For gaming support, Nvidia has its own quite developed 3D Vision, where AMD is also underdeveloped at the moment, if you want to count more things. Performance per watt is also an NVDA win, as is noise level, on top of SLI support and cost.. If, with all these factors listed out, you still prefer the 7970 GE for the gaming experience, then there's still no problem; I own a 7970 myself and I love it. I just prefer the GTX 680, as I find everything smoother and better to the eye.

For me, the 7970 has one clear solid use that makes it > GTX 680, and that is getting high scores in benching tests, which I do a lot. But if I didn't have to do that, I would pick the GTX 680 any day.

and if I can pick again, I pick TITAN !!
Darknecron's Avatar
Also, I don't think that AMD cards support Ambient Occlusion...at least I don't see an option for it in my Catalyst Control Panel...
Janus67's Avatar
AMD supports SSAO, although I think they force it on unless it is selectable in the game?

Not positive.
trueblack's Avatar
Back on track with the GTX 780... this one seems to have everything right..

Despite the cost, I feel it will be very popular.
Janus67's Avatar
I agree. I still think $650 is a bit too premium, but we will see how the rest of the line performs
Bobnova's Avatar
The cost is wrong, the performance is nice though.
trueblack's Avatar
I think it is priced that high because there's no competition for it at the moment...

this is when we all have fingers crossed for AMD to give them a beating.
Bobnova's Avatar
AMD gives the 680 a beating and costs less. Nvidia has enough Loyal Followers that are willing to pay too much for the performance as long as the label is green.
trueblack's Avatar

I would like to know based on what you say that:


then onto pricing:
Newegg price GTX 680: $470
Newegg price 7970 GE: $460

Based on the above info, I hardly consider the $10 difference anything.
Nor can the difference in performance be considered "a beating".. if anything, the 7970 GE is a desperate attempt from AMD to get the 7970 "on par" with the GTX 680. And this is just considering stock; for the 6xx series, there's the KGB BIOS flash you can use that brings it up another notch.

then I just read another post above for more points:
On the Nvidia side there's:
  • 3Dvision with active 3D
  • PhysX supports games
  • Less Power Consumption
  • Less Noise
  • Less microstutter in SLI

If anything, it is the GTX 680 giving the 7970 GE a huge beating at the same price. Not sure why so many people are pro-AMD despite the flat facts on the floor.
I am an Nvidia fan, but to be objective here, I will just say GTX 680 = 7970 GE. Facts are above; read them in your free time.

Some say the AMD 7970 benches better. Bench schmench. I didn't buy a "G"PU to do benching exercises; I bought it for better graphics. Citing AMD's higher benchmark scores as a 'buying point' is like saying you bought a calculator to draw rectangles with..

I might be Nvidia biased, but ONLY because the facts show so.
If AMD is better, I will jump camps anyday.

THAT being said, the Titan is a bit too expensive.. but like I said, that's because AMD has no answer to it 'at all', not to mention the microstuttering problem is still unresolved.
wingman99's Avatar
The Titan and GTX 780 are way overpriced for what you get in new-game performance: compared to the GTX 680 you get 8 more FPS in Crysis 3, and with the Titan only 12 more FPS.
12 more FPS in Far Cry 3; that sucks.
trueblack's Avatar
Yeah, in agreement; those with 680s are better off just going SLI.

I think that's why Nvidia uses the GTX 580 as the 100% baseline for the 170% performance figure of the GTX 780.. their target audience is those that didn't go for the 680.

So you are a Far Cry and Crysis player then?
For you, AMD might be better.
If you are into BioShock and Assassin's Creed, then Nvidia blows AMD out.

But if you play almost every game like me, then Nvidia gives a better overall score (though the difference is hard to notice).
I will gladly call the GTX 680 the same performance as the 7970 GE, unlike some AMD fans that just refuse to admit it. Weird as hell.

What I don't like about AMD, and I have said this in multiple threads in the last few months: AMD has a terrible habit of being quite slow with drivers for new games. From personal experience, for new games you will need to wait 1-2 months before AMD will give you a driver with decent performance, while Nvidia usually releases drivers ON THE DAY of the game's release. That I also take as a very 'visible' difference. For old games, though, the difference is very small.
Xeon_KI's Avatar
I really question the TechPowerUp benches.
Guru, Anand, and Tom's all have either a tie or an AMD win in BioShock,
but TechPowerUp has a 7 or 8 FPS advantage across all resolutions for the 680.
Hell, even the 670 beats a 7970 GHz in their graphs.

Given how many numbers they push into those reviews, I wonder how often they rebench for new reviews
Bobnova's Avatar
Not often enough, I'd wager.
EarthDog's Avatar
Wizzard is in the middle of updating on all new drivers. I haven't been to that site in a couple weeks so he may be done, I don't know. It's actually more often than one would think.
bluezero5's Avatar
well. nothing on the internet is worth trusting 80%+,
this goes to ALL sites, not just specific ones.

I run a lab in Japan, so I know that all too well. hahahaha.
Very often, and I mean VERY often, sites have what we call 'sponsoring bias'.
I guess it is self-explanatory. And you will never know 'for sure'.

I am a firm believer of 'test it yourself, and draw your own conclusions'.

and I only trust sites where I 'met' the people, and I know their results are justified enough.

For the rest: don't pick some things on the internet as correct and some as incorrect,
because you do not know until YOU have done the test, and picking one over the other is, at times, a 'hypocritical' thing to do,
because you are accusing others of doing exactly what you are doing. Heh.

Things on the internet?

Just a reference point.
At $800! Impressive clocks though: 117 MHz core over stock.

EVGA GeForce GTX 780 Hydro Copper
Part Number: 03G-P4-2789-KR

• 980 MHz Base Clock
• 1033 MHz Boost Clock
• 188.16 GT/s Texture Fill Rate
• 3072 MB GDDR5 Memory
• 6008 MHz Memory Clock
• 288.38 GB/s Memory Bandwidth
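For what it's worth, those two throughput figures fall straight out of the clocks if you assume the GTX 780's reference configuration (192 texture units, 384-bit GDDR5 bus). A quick sanity check (plain Python):

```python
# Derive the listed throughput figures from the clocks, assuming the
# GTX 780's reference 192 texture units and 384-bit GDDR5 bus.
texture_units = 192
base_clock_mhz = 980
fill_rate_gts = texture_units * base_clock_mhz / 1000.0  # gigatexels/s
print(fill_rate_gts)  # 188.16, matching the listed texture fill rate

bus_width_bits = 384
effective_mem_clock_mhz = 6008  # GDDR5 effective data rate
bandwidth_gbs = effective_mem_clock_mhz * bus_width_bits / 8 / 1000.0
print(bandwidth_gbs)  # 288.384, i.e. the listed 288.38 GB/s
```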
bluezero5's Avatar
EVGA stuff is really getting quite good..
I had one of their 550Ti models before this Asus card, and I could OC the crap out of it!

But I had a buyer for my card and got a deal on the one I have now.
Darknecron's Avatar
A little redundant, but...

Also take note of the Game Benchmarks... :P
Xeon_KI's Avatar
I pointed out a single review that is inconsistent with a list of other popular review sites.
I'm not cherry picking 1 review or one game bench.
ED already cleared up the frequency at which they updated their numbers.
I don't see any harm in asking the question.

As far as testing it yourself:
I owned a 670 (1280 boost on core), FYI; I was very happy with it, and I'm very happy with my 7970s.

Agree to disagree
Darknecron's Avatar
Albeit the HD 7970 used here is an uberified GHz edition, the benchmarks display the performance gap between the AMD card and a GTX 680. It is likely that a superclocked GTX 680 would perform on par with or better than the HD 7970, as displayed in this review. However, it looks like the AMD card has a definitive win in Battlefield 3.
wingman99's Avatar
I trust the TechPowerUp benches; they are using the newest drivers.
Darknecron's Avatar
I trust our benches. Look at my post above.
MattNo5ss's Avatar
For the GTX780 SC ACX review, I re-benched my GTX 680 using the newest drivers available at the time, 314.22, and I used the data from EarthDog's most recent review for the HD7970. The following are the clocks the GPUs were running at during testing and the drivers used:

EVGA GTX 680 (Reference)
Driver: 314.22
Core: 1084 MHz
VRAM: 1502 MHz
Reference GTX680 cost: ~$450

HD 7970 (from EarthDog's review)
Driver: 13.4
Core: 1050 MHz
VRAM: 1375 MHz
Price: $400
KurtBlanken's Avatar
Anyone have any guesses as to what's up with the Hydro Copper Classified?
EarthDog's Avatar
What do you mean, Kurt??? What's up with...........what?
It'll be a higher clocked version of the Hydro Copper
EarthDog's Avatar
With the Classified, and I guess this is what he was getting at(??)... it's more than just increased clock speeds. With PAST Nvidia implementations, they have voltage read points, MUCH better PCB/power bits, etc. It's usually a lot more than clock speeds.
Well, I learned something today!
KurtBlanken's Avatar
That's what I meant. I didn't know about the previous classifieds. I'm also wondering if there is any speculation as to whether EVGA will try to circumvent Nvidia's locking down of the voltages.
EarthDog's Avatar
A vendor or two tried that, and got yelled at and threatened. So, I doubt it.
hokiealumnus's Avatar
They didn't just get yelled at. Unsubstantiated rumor had it that some partners didn't get to put out first-run Titans because of it.
MattNo5ss's Avatar
What if they made easily accessible solder pads where you would just solder a 0-ohm resistor (or wire) to link two pads, and that would enable more volts? Like what ASUS did on their GTX 580 Matrix.
Janus67's Avatar
What if they just had a bios switch?
hokiealumnus's Avatar
That would probably still be 'soft' control in NVIDIA's eyes. It seems to me that they basically want you to have to use a soldering iron, making warranty claims easy to differentiate.
trueblack's Avatar
You pointed out the site that did the multiple-resolution test.
If anything, that was probably the most comprehensive of them all,
but since they didn't list 7970 GE > 680 like the rest, you cherry-picked it out as wrong.

yea, that's exactly what you did.

you see, this is what I dislike about people on the forum in general.
even at this point you claim you are not cherry picking results.
You and I, the difference is I accept other sites saying 7970 > 680; their test, their result. Good for them. I personally find them pretty equal. So when TechSpot says 680 > 7970, I told myself, "hm, consistent, averages out fine."

but not only you, even some more seasoned posters are quick to call them inaccurate, or 'not updating enough', based on absolutely nothing.. if that is not AMD fanboyism, I don't know what is.

I am saying this with good intention, not trying to be condescending or anything. I am trying to point out that you have a flaw in logic, and you might already have been subjected to marketing bias that you should be aware of. I hope you can read my post that way.

Nvidia has been doing GREAT things, look at the GTX 780!! Amazing piece of work!!!
I think people should begin to recognize that, for now, the crown for the coming generation of GPUs goes to Nvidia.

Who knows, let's hope AMD catches up and stops playing denial.
Darknecron's Avatar
I am surprised that the 680 didn't score higher on BF3 and Arkham City. O_o
trueblack's Avatar
me too. O_o

but results are results. let's avg it to the pool.
or maybe the latest catalyst drivers improved the other side too.
Xeon_KI's Avatar
What I actually did was read your post where you state nVidia kills AMD cards in Bioshock and I thought, hmmmmm... that is not what I remember seeing the day before.
So I decided to actually fact check it across multiple reviews and was faced with an anomaly.
I thought to myself, "I wonder why these results are so different from the others."
After a moment or so, remembering that the Anand Bench is not often updated due to the number of datapoints, I came to a possible answer.

Also, both you and bluezero5 started out stating the 680 was the clear gamer's choice under the premise that it is better at game benches, while the 7970 is better at synthetic benches.
Using multiple benches from multiple sites, I proved that this is false.
Only then did you both change your story and talk about SLI, better 3D support, etc...

Maybe if you would just stick to facts, or try not to pass opinion off as fact, people wouldn't call you out and you wouldn't have to change your story.

This conversation bores me at this point.
bluezero5's Avatar
OK, I just read the whole darn thread cause you decided to call me out.

I will leave you two to your quarrels, but since you called me out on one thing, I will explain.

My points are simply:

1, "i think" gtx 680 > 7970 in games, that's my opinion, and mine alone. I happen to have both in 1 system. really.
2, I believe nothing on the internet should be trusted, so I made a comment about 'testing it yourself'. I hope you came to your conclusion yourself.
3, The rest was just causal talk. (I brought up SLI/3D cause you asked me what else makes GTX680 a win in gaming! you asked !!! I didn't change any topic!!!! )

hmmmmmm. I strongly disagree here.. that's called confirmation bias. You see a result that you didn't like, then look for cross references to find something that justifies you.
In the field of science, making such a statement will cost you your job, cause that's what we call 'data manipulation'.
Finding something to FIT your theory.

for example, you can also easily find another site that supports gtx680 > 7970 GHz for your example of Bioshock too, here:
Both of these studies seem to be in line with the other site you showed me in the beginning,
and all I googled was "gtx680 7970 bioshock" and used the first 2 results.
So at this particular point, I am slightly uncertain why you question the first site's credibility, calling it 'wrong'....
as clearly some people tested otherwise, and it is more than 1 for sure...

So which site should we trust? NONE. trust only yourself. (that's ALL I AM SAYING!! )
I am not here to make any decisions at all; I don't even play games that much, and most of these titles I've never even heard of.

I am also not agreeing with the other parties.. I am just saying that what you said right there (which I believe is referring to my statement about the internet) is a taboo in science.
I find it weird that Bobnova and Earthdog decided to 'thank' you for saying that... not very... eloquent of them.
They of all people should know better about trusting some sites and not trusting others.. weird.

I find this not constructive to the forum, and will make this my last comment on this thread.
If you have further inquiries for me, just PM me; let's not bother others.
Janus67's Avatar
I understand the confirmation bias part, but I think it is just as important to have multiple data points that can show a counter-argument to a statistic instead of trusting one source again going the opposite way. I don't believe Xeon did anything in a negative manner, I would have done that same to point that I found conflicting numbers on a different site. The problem is the sheer number of variables that can cause slight differences in results (one boost here versus not there, ambient temperature, different motherboard, processor, speeds, drivers, better silicon on one chip than the other, etc). IMO if two are within 1-2% of each then it is within margin of error.

Granted, the majority of this thread could be its own thread (maybe a 'benchmarking/review bias' notes thread or something)
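Janus67's +/- 2% rule of thumb above is easy to turn into a quick check. Here is a minimal sketch; the helper names and FPS figures are hypothetical, not part of any review toolchain:

```python
# Minimal sketch of the "+/- 1-2% is margin of error" rule discussed above.
# Helper names are hypothetical, not part of any site's review tooling.

def percent_difference(fps_a: float, fps_b: float) -> float:
    """Percent difference of result b relative to result a."""
    return (fps_b - fps_a) / fps_a * 100.0

def within_margin(fps_a: float, fps_b: float, margin_pct: float = 2.0) -> bool:
    """True if two benchmark results are a statistical tie by the margin rule."""
    return abs(percent_difference(fps_a, fps_b)) <= margin_pct

# 60.0 vs 61.0 FPS is a ~1.7% gap: a tie under the +/- 2% rule.
print(within_margin(60.0, 61.0))  # True
# 60.0 vs 63.5 FPS is a ~5.8% gap: a real difference.
print(within_margin(60.0, 63.5))  # False
```

Anything inside the margin says more about drivers, boost behavior, ambient temperature, and the silicon lottery than about which card is "winning".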
Culbrelai's Avatar
Indeed, no bias here.

Just can never let nVidia have a win =P
wingman99's Avatar
I think the time has passed where you can really tell a winner from benchmarks; they are too close to call and much too difficult to test consistently. What all this arguing shows me is that the GTX 780 is not worth the cost.

With all this variable performance crap it's impossible to tell who is ahead and by how much, and since reviews have become more difficult, there's a growing problem where people don't trust reviews anymore. And 'test it yourself' is a bunch of crap; who has the money or setup to waste that much money?
Xeon_KI's Avatar
Bit-tech review, a month old.
Guru review, 2 months old.
This isn't rocket science here (no pun intended); a good researcher uses the best/most up-to-date info, from as many places as possible, to form an opinion.
I'm not going to buy every card and bench them across multiple games on multiple platforms, running multiple OSes.
That is why I love to view and browse these types of sites... I have the curiosity of knowing, but I don't have the time or the means.

I use the best information I have access to, to form my opinions. I'm still waiting to see better data.
hokiealumnus's Avatar
In the sentence you just quoted, I said the GTX 780 out-performs the HD 7970 GHz in every metric. If you actually think that's bias, you truly need your head examined.

EDIT - From my conclusion:

Seriously dude, you need to check your fanboy attitude, it's really tiresome.
EarthDog's Avatar
I don't think it's so much who has the money or setup; if you are interested in buying a card, you have the money and the setup already, no?

The problem there is most just don't know how to properly benchmark, test, and compare. It's not rocket science by any stretch, but knowing how to properly test and compare isn't something people just 'know'.
wingman99's Avatar
How do you properly test without all the video cards to compare?

There are too many variables to compare your data with someone else's, like where in the game you ran your benchmark, the number of players, and whether you take the same path while benchmarking.
Most (if not all) games that this site benchmarks with have a canned benchmark built in.

So, yes, they are the same.
EarthDog's Avatar
Perhaps take a read about our testing methods, as that is covered in that thread...

To summarize, I was on the same page as you as far as 'run throughs' and the inevitable variability they can cause. The only benchmark we have that is not canned is BF3. We tested BF3 and its variability in the section we use (the very beginning of SP, the train scene; again, SINGLE player, not multiplayer, as you are correct there) and the % difference was REMARKABLY similar. In that scene, there isn't a lot of 'world' to do different things in; you are inside a train and the same guys shoot the same way at you for the most part. IIRC, the difference between runs was within a percentage point when we tested this. It blew my mind to be honest, but the facts are the facts. Unless you deviate completely from the brief run through, like sitting looking at a chair for faster FPS or at flames/smoke for lower FPS for an inordinate amount of time, it's surprisingly accurate.

As far as comparing between the reviewers: we all specifically buy setups that match. 3770K, Z77, RAM speed, etc. Yes, there are likely some minor differences there; however, we do not have the resources to set up a lab and test on a single benching station. It is what it is; however, the differences are A LOT less than you are imagining from both of your points.
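The run-to-run consistency EarthDog describes can be quantified in a few lines. This is a hypothetical sketch (the function name and the FPS figures are made up for illustration), not the site's actual tooling:

```python
# Hypothetical sketch: quantify run-to-run variability of a manual
# benchmark pass as the max-to-min spread across repeated runs,
# expressed as a percent of the mean. Numbers below are illustrative.

from statistics import mean

def run_spread_pct(avg_fps_per_run):
    """Max-to-min spread of repeated runs, as a percent of the mean."""
    m = mean(avg_fps_per_run)
    return (max(avg_fps_per_run) - min(avg_fps_per_run)) / m * 100.0

# Three repeats of the same scripted run, average FPS each time:
runs = [71.8, 72.3, 72.1]
print(round(run_spread_pct(runs), 2))  # 0.69 -- within a percentage point
```

If the spread across repeats stays under a point or so, as described above for the BF3 train scene, the run is consistent enough to compare cards with.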
MattNo5ss's Avatar
Manual, actual gameplay runs are useful since they present real gameplay numbers, but, because of possible inconsistencies mentioned above by wingman, they are not the best for comparing cards, especially cards that are close in performance to one another.

Canned benchmarks are best for straight card-to-card comparisons since everything is consistent, and actual gameplay numbers are good at giving a general idea of what performance one can expect to get while playing a game with a similar setup. In my opinion, they complement each other.

As EarthDog said, our BF3 manual run is the first scene in single player in a train. The train is a straight line, there's only one way to go and you have to kill the enemies in your path. Pretty much as consistent as a manual run can be.
trueblack's Avatar

some people just never admit when they are incorrect, here's a fine fine example.

Xeon_KI disagrees with GTX680 being gamer's choice. When I posted this:

he wrote:
so another forumer posted two links to answer him:

Xeon then suggests these links are too old, and he posted 4 links that he 'thinks' are more accurate.

which, in fact, 2 out of 4 there also show the GTX 680 doing better in the most recent game titles. One of my favorite games, Bioshock, gets isolated as an example:

then Xeon_KI still thinks those data are inconclusive, post another link

This is my favorite link, as he posted a link that shows the performance of EVERYTHING averaging in, clearly showing GTX680 >= 7970 GHz under 2560x1600 resolutions. This must be a mistake on Xeon_KI's part though, cause he later decided his link is not accurate. He kept asking:

So Bluezero5 decided to bring up that while the performance is 'very similar', Nvidia has many features that make it better for the gamer, and these include:
PhysX, 3DVision, SLI > CFX, etc.

Xeon_KI, knowing he lost the argument, started to say talking about other features is a 'diversion of topic'. Which, in fact, he asked about to start with, after the multiple, very conclusive sites that HE posted showed the performance of GTX680 <=> 7970 in most areas (except synthetic benches).

Somewhere along the line, Bobnova decided to chip in:
also making claims that the TechPowerUp site is not well managed and their benching is wrong, when I had just called him out and said his price was wrong, and his concept of gaming performance was wrong. (Just flat wrong; you would think a benching team Senior Member would know better.) The GTX680 costs the same as the 7970GE on Newegg, and multiple sites clearly show the avg performance of GTX680 > 7970GE.

Then Xeon_KI returns with a vengeance, claiming TechPowerUp is the ONLY site that says so. As other sites say otherwise, TechPowerUp MUST be wrong. (AMD fanboyism at best here, I believe.)

Then I of course was upset, so I told him nicely that he shouldn't cherry-pick results; I even gave a disclaimer with my best intentions:
then Xeon_KI comes back with:
well, Bobnova and Earthdog loved that comment, I tell you. cause he claims the site that says GTX680 > 7970 is FALSE, and keeps saying any talk about SLI > CFX or any issue with microstutter is a non-issue. cause AMD cannot lose to Nvidia, right? no way. Gamer support like better drivers and 3DVision is now deemed 'changing the story' and not focusing on the topic. and it was Xeon_KI here who said other sites ALL claim 7970 > GTX680 on Bioshock.

This upset bluezero5, so he googled two more links showing that many sites support the fact that GTX680 >= 7970GE in most gaming situations, with the newest favorite game, Bioshock, as the example:

both Bit-tech and Guru ALSO agree with TechPowerUp that GTX 680 > 7970 in most resolutions.

Then, rather than accepting defeat, Xeon_KI pulls the "I am bored with you" card and calls those '1 month, 2 months ago, too old', when in fact the TechPowerUp report that is 1 week old ALSO agrees GTX680 > 7970GE conclusively. So rather than realizing GTX680 > 7970GE has been true for many months, he decides the rest of the world is not scientific.. claiming:
when the world gave him data, he spat it out, old or new. The newest being TechPowerUp, which agrees with the old, but apparently that's not accurate cause it doesn't say 7970 > GTX680.

AMD can't even fix microstutter yet and is making everyone with CFX suffer (my roommate is a perfect example). If that doesn't say why the GTX680 is the better card for gamers, what will? Better synthetic benching? More microstuttering?

Now there's the GTX780, which is BEYOND awesome.
The price tag is too steep for me, but if you look for performance, it beats EVERYTHING, hands down.

Hope this answers why Nvidia > AMD as of today, and may all those that claim otherwise find good data to support them before they make a rebuttal.

I am in NO WAY saying the 7970 doesn't have its perks, which are in pure calculations like bitcoin mining and Folding@home. The 7970 is also better in selected game titles, and excels at the resolution of 2560x1600. But on average performance across all games, I am saying the GTX680 is a clear gamer's choice, cause the performance is better, the cost comparable, and the company support is BETTER; 3DVision is VERY good, and so is PhysX for selected games. I am here to say that because of these supports, and GTX680 <=> 7970GE for about the same price, this makes Nvidia better. And I bought a GPU for gaming, not folding, nor do I use a 2560x1600 res. If you do have such a high res, the 7970 will be better. But if you are like the rest, on a 1920-ish resolution, the GTX680 has better avg gaming performance, proven by multiple sites that are relatively new: http://www.techpowerup.com/reviews/N...TX_780/26.html

This is getting good.
Darknecron's Avatar

I give the thread 3 days before it is locked.
Bobnova's Avatar
Seriously, to the popcorn.

Can you find/link/quote where I said anything along these lines?
Point me towards where I outline my concept of gaming performance. Or where I said TPU isn't well managed. Or where I said their benching is wrong.

680 price drops are brand new, and happened since the last time I looked at prices.

Don't forget the most recent Nvidia drivers borking BF3, or the yellow roads in BF3. Gamers don't like that, I don't think.

Not everybody likes 3dvision. It makes many seasick, and may or may not impair depth perception with extended use.
PhysX is lol.
AMD brings eyefinity.
As to SLI vs CFX, I give that one a meh. I've yet to have issues with either one.

As for the 780 being beyond awesome, I'm disappointed for the price.
The performance is nice, but the pricetag ruins it for me. Much like the 680 was ruined for me prior to the recent price drops.

Less than three days.
If it lasts the night I'll be mildly surprised. The personal accusations have started flying, that's certain doom.
This +1.

On a side note, dude messed up when he called out Bob; it's on like white on rice now.
trueblack's Avatar
yeah sure. when talking about how often TechPowerUp updates their benches:

If you mean something else there, I would apologize and hear what you mean by that. But to a normal person reading, you are saying TechPowerUp's most recent bench is poorly managed and can be ignored. You got misled by Xeon there, cause in fact Guru and other sites show GTX680 > 7970 in his example, Bioshock. Look at bluezero5's links if you do not believe me.

Yup, please keep up with the world before making a strong statement; as a senior member you should.

Wrong is wrong, dude. If correcting a senior member is now a 'no no' on this forum, then we might as well be communists.

funny isn't it?
one thing Nvidia does wrong, and you completely soak it up. Good things from Nvidia, completely ignored. All my teammates I know use 3D glasses on BF3; it makes it soooooo much better. I will accept your personal dislike though, it's not for everyone. But yes, the recent driver wasn't the best for BF3; that's why on TechPowerUp Nvidia no longer scores as high. But EVEN THEN, the average performance is still better. I wonder what you have to say there. http://www.techpowerup.com/reviews/N...TX_780/26.html

AMD Eyefinity used to be something, until the Kepler series of GTX cards could do 4 screens WITHOUT the requirement of a passive->active adapter. So unless you want more than 4 screens, they are the same.

again, SLI > CFX is ignored, cause AMD loses. Totally normal. Just people quitting gaming cause of it.

yup, bring the popcorn, what else you got to say?
Xeon_KI's Avatar
wow, I'm honoured you are this upset!
I also want to congratulate you on one of the largest fail posts I have ever read!

Bluezero brought up TechPowerUp; all I did was point him to the most recent review from them... so you know, to get the most up-to-date numbers.
We talked about the overall numbers and I also agreed that the 680 over all resolutions was better according to TechPowerUp, but it was the same @ 1200 and worse @ 1600.
Thus proving my point that the 680 was not better in game benches. Pretty simple; I'm sure even you can follow that, ya?

The rest of your mess of a post is filled with exaggerations and half truths.

Also your true colours come out when you make this about AMD vs nVidia...
The 780 is faster than everything (except for Titan, lol derp) and that is why nVidia is better than AMD!!!!
Darknecron's Avatar

*hides in bunker*

Yeah, I actually meant to type 3 hours. lol
trueblack's Avatar
that goes to you; I am in a jolly good mood today. Are you mad? Cause you're reading it wrong. I was just joking with my roommate when correcting you, but if you are mad, you might read it so.

yup, and your link says you are wrong, dude. Big fail. Check again. http://www.techpowerup.com/reviews/N...TX_780/26.html
man, still in denial. So you want to say 7970 > 680 on average? Despite almost no gamer gaming at 2560x1600? You know what? Fine, live in denial, but the data says otherwise. The rest of the world is saying 7970 <=> 680 (that reads: they are the same), but Nvidia support is better, and in-game FPS on average is better on the GTX 680, hence the GTX680 is the better gamer's choice. This was also 'your' question; people are just answering. But I guess the truth hurts? You need 7970 > 680, don't you? -chuckles- For you, everything else that is a tiebreaker for a 'gamer's choice' is a change of topic.

another failed post by you; the last paragraph doesn't even make sense. Guess you're upset.

relax and enjoy man, life is too short to be upset.
bluezero5's Avatar
can we just group hug and agree the 680 is basically the same performance/price as the 7970 GHz?

I suppose to each their own.

we should be rejoicing over how good the 780 is instead!!
Xeon_KI's Avatar
I guess you can't follow that.. lol
My link to the TechPowerUp bench wasn't about being right or wrong, Turbo. It was about being precise and using the most accurate numbers TechPowerUp had to offer.
It's like talking to a rock.
trueblack's Avatar

contradicting yourself now, dude..... you see, that's your problem.
you later even said you think:

so you think TechPowerUp is correct now? How about Bobnova, who supported you on the claim that it is not updated enough?
Or did the other sites posting similar results finally convince you? I am curious.
Culbrelai's Avatar
Yeah, I noticed that around here too. Funny really, but then there's an AMD blitz and everyone's just dandy bashing Nvidia, like when the ARES II came out.

Lol, I got microlags without CFX on an AMD card, in the very same game, and none on an Nvidia card. And no, no topics ever get locked here; just the Nvidia people get kicked out of the topic, and the AMD people continue to talk crap even though they're told not to =P
Xeon_KI's Avatar
How is this a contradiction? I have said this whole time you should use the most up-to-date data a site has to offer.. I also said you should use more than one source.

My hesitation about the TechPowerUp data wasn't about it not being the most up to date; it was, of the recent 780 reviews from popular bench sites, the only one that showed a large gap in performance between the 2 cards in Bioshock.

Are you even reading these posts?
I understand I don't normally spell it out this thoroughly, but I expect people who frequent tech forums to discuss these types of topics to have some level of reading comprehension.
My bad, I guess.
trueblack's Avatar
I find it funny too.

in my little talk here, the scary part is they make you feel like you are insane when you simply say the GTX680 is about the same as the 7970GE (which the data shows). Some insist that the 7970GE is giving the GTX680 a beating; anything else, they deem crazy..

Also scary is that even AMD admitted that CFX is currently broken, but AMD fans here keep saying "nah, that's a meh, not the case"... if that is not denial, I really don't know what is... My roommate is about to trade his pair of AMD cards for 670s this coming weekend cause of microstutter in Far Cry 3 (which is so bad he cannot play at all; RadeonPro basically slows the graphics to make it better, defeating the purpose of 2x GPU). But apparently, according to them, these are NOT FACTORS in deciding the best choice for gamers though..
trueblack's Avatar
You then said it is wrong, cause other sites say so. You forgot, eh?
You are now in so deep, you are ignoring the previous points that you made.
Also, not only Bioshock: you don't get "best avg" by winning on one game, dude. Look at other popular games like Assassin's Creed, CoD, etc. These are also in favor of Nvidia. Overall, 7970GE = 680. This is shown in the data, and you asked why that makes the GTX680 a gamer's choice, and we told you about other supported features, and some problems with AMD CFX. I admit here I copied bluezero5's points, cause I thought his points were decent.

if that doesn't become a tiebreaker, what will? What do you want, dude????

anyway, I feel you should private message me unless you want to keep calling people out. I will respond when you do, but otherwise I will try not to agitate you, as you look a little pissed off. To that, I apologize.
SF101's Avatar
Trueblack, most of us are just rolling our eyes at you; that's why you see so few supporting arguments.

Clock for clock, the 680gtx can't compete with a 7970, and the 7970s on average overclock better than the 680gtx.

They"7970" also released 6 months earlier and id have to guess on average most can do 1200mhz core - 1650mhz mem 24/7 from the first day of release some actually oced better in the earlier versions.

So its quite simple the 680s just released with a more aggressive overclock to compensate for the lack of overall competitiveness otherwise.

Until titan released the 7970s were still beating down on the 680s in almost every 3d bench out there in the overclocking leagues air or otherwise.

Your trying to compare a stock for stock which to me and alot of others is just a feeble attempt at winning a argument some might call it fanboyism at the very least.

Also, Bobnova has benched and tested more than his fair share of hardware, so taking his views or opinions into your consideration would just be good advice.

Then you have someone like myself with a 24/7 1250+ core / 1850+ mem card, which should actually compete with any stock 780gtx or be close in performance; it makes the 780gtx seem like a meh type of upgrade at most for a gamer.
Culbrelai's Avatar

My 670 beats 680s, so where does that put 7970s?

Maybe they'll say its shopped!

Or I cheated!
SF101's Avatar
maybe but not clock for clock. ie performance efficiency.
Darknecron's Avatar
Nope, I'm pretty sure he just cheated.
trueblack's Avatar
You sound respectful, so I shall be the same. Cause I bet people that bench and compete would roll their eyes. But you people need to know: you are NOT the majority. People like ME are closer to the majority; we have a nice computer, we earned some money to buy a decent GPU, and we play games without trying to break the card with overclocking. We might use Precision X to overclock it a little, use KGB to flash a BIOS, and to that end, the GTX680 and 7970 give roughly the same overclocking edge. Not everyone is on water; most of the market is on STOCK cooling, btw.

First off, MOST gamers do not overclock their GPU 'much'; this is why sites like Guru/TechPowerUp compare at stock, which is the most relevant data to us. However, you are wrong when you say the 7970GE is better; maybe it was, but not anymore. Now they are the same across an avg of many games and resolutions. In fact, if you count out resolutions people do not use, 680 > 7970GE a little, but the margin is so small it is pointless to argue. 680 = 7970 at stock for the average gamer (like me). Your recommendation, or Bobnova's, will be good advice for people trying to compete with benches; however, are you vain enough to say you two know better than all those other sites, calling them wrong? If the majority don't bench on crazy clocks, but your team does... now who's using niche data to justify one card being better than the other? Unfortunately, that's you. 90% of buyers don't overclock. Do you see?

For your read: http://www.techpowerup.com/reviews/N...TX_780/26.html
Then others said this site is wrong, claiming it is the only site that says Nvidia is better than AMD on Bioshock.
This is also wrong, as many sites say 680 > 7970 on Bioshock.

So I am sorry, you are wrong. So is Bobnova.
For the average person that doesn't want to break their GPU with overclocking, the GTX 680 IS about the same as the 7970GE.
I hope you are not in so deep that you want to call all these review sites wrong, and yourself and Bobnova right, cause that would be quite vain indeed.
You might be right to the 1% of benchers; to the rest of the majority, you sound like a moron saying stuff like "Not when it is on LN2 or water with volt mods", cause we don't do much of that, and we just don't care if you can do it; we don't. This is why, when Bobnova decided to say TechPowerUp doesn't update enough, he was in fact out of touch with the market on stock cost and performance, and he really should check before making a firm statement.

This is IMPORTANT, get it in your head:
you can overclock it a bit and compare; cool with that.
But when you talk about anything outside safety zones, well, that's interesting to know only; it hardly applies.

This is why the sites that compare at stock are the more popular review sites out there, not the extreme reviews.

I have NO DOUBT the 7970GE can bench like a madman on synthetic 3D benches like 3DMark 11 or whatnot. You are right, the 7970GE will KILL the GTX680 in those areas, hands down; I do not even try to argue there. I am saying I didn't buy a GPU to bench, I bought one to play games, like the rest of the majority. Like the remaining 99% of the market.

The 7970GE used to lead on performance/cost,

but this has changed since the 780 came out, and you need to know the GTX680's price dropped a little.
Now the performance/cost of the GTX680 and 7970GE is basically at the same margin. That has been my 'entire point'.
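The performance/cost point is easy to make concrete. The FPS and price figures below are hypothetical placeholders, not quotes from any review or retailer; plug in current numbers before drawing conclusions:

```python
# Hypothetical sketch of a performance-per-dollar comparison.
# FPS averages and street prices below are made-up placeholders.

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average FPS delivered per dollar of purchase price."""
    return avg_fps / price_usd

gtx_680 = fps_per_dollar(60.0, 400.0)      # 0.150 FPS per dollar
hd_7970_ghz = fps_per_dollar(61.0, 410.0)  # ~0.149 FPS per dollar

# A sub-1% gap in FPS/$ is noise, so neither card "wins" on value alone.
print(abs(gtx_680 - hd_7970_ghz) / gtx_680 < 0.01)  # True
```

When the value gap is that small, the decision comes down to the extras each side offers, which is exactly the argument above.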

Now, when they are the same, what do gamers value? Other add-on value for gaming,
like 3DVision, like quick driver delivery.

And I was asked what makes the GTX680 a gamer's choice, so I brought up other features, which I suppose won't be very useful to people benching either, so I am not surprised if you do not find those points relevant. I am here to say that you folks who bench see things differently than the avg Joe who doesn't. If the 7970GE is so good, why is it not giving more FPS than the GTX680 in ALL titles? Do you have a reason there? Cause I have one: GeForce drivers improve faster. The 7970GE might be a stronger card in the core, but it LACKS DRIVER support for new games, and this has been the case since Catalyst 11.0. This alone keeps the GTX680 better in gaming FPS, despite the 7970 overclocking better in 3D benches.

And this is where you are wrong: to the majority, those sites are closer to reality. You and your team, who can only speak to the 1% of benching elites, are in this case WRONG. I am sorry. We don't bench; we compare at stock and at the low end of overclocking. And as of today, the 7970GE doesn't even have the performance/cost edge anymore. Is it still 'marginally' cheaper than the GTX680 on performance/cost? Maybe, depending on where you get the card and what for, but that margin, to the majority, doesn't REMOTELY compare to the other services an Nvidia card holder gets from the company.

Again, I am NOT saying GTX 680 > 7970GE on performance. DO NOT READ IT AS SUCH.
They are about the same, is what I am saying, justified by many sites. I listed them out already.
And for 'similar performance, or too small a gap of difference', Nvidia has a LOT more to offer.

I am not even going to start another microstutter talk.
Hopefully now you learn how to recommend this to the public without sounding like a nerd in a tin can.
I am on overclock.net on the same topic, and they did a self-comparison at stock looking like this; for details, go over and find it yourself, I will just post the results:

again, this is just stock. The talk stemmed from what the majority will see if they just plug in and see scores on 3DMark.
They are in agreement that the 7970GE can overclock better on water as well, and that overclocking edge CAN top the GTX680, so there's an argument there.
But this comparison is relevant, cause the majority doesn't really OC that much (within 5-10% of stock only).

ALSO, I'm NOT SAYING what's below is the absolute truth. Far from it, but it just shows people are finding the GTX680 on par with the 7970 on average.

SF101's Avatar
ok, but even my nub non-geek friends have figured out how to OC their 7970s to 1200 MHz+ core / 1600+ mem, and most of them paid $350 or so for theirs. I still can't find a 680 for $350.

The GHz edition cards are just for people who can't figure out how to download Afterburner or Precision or the other 5 or so OC tools out there. Even the AMD CCC tool can do a decent OC.

I should add 1 thing: this is the Overclockers Forums.

Just saying, you can't come on here and preach stock.
Janus67's Avatar
@trueblack, I agree with most of your statements, and I believe that it has become a circular argument of people making different cases to prove that they are correct. In this case both sides have valid arguments and sources (both objective and subjective) to back up said points.

One quick note: 2560x1600 is becoming more popular, though more so in its 16:9 cousin, 2560x1440. I agree, though, that most enthusiast gamers are likely on 1920x1080 at this point.

The argument about drivers I've seen both ways. It seems that it just depends on the game for which company releases updated drivers for it. From both parties I have seen some pretty poor drivers at a game launch that if they are huge flaws get fixed quite quickly with a beta driver or hot fix.

I think the major issue with looking at all of the review sites is that I would bet most compared results are from previous reviews that would have used (by current standards) an outdated driver or differently patched Windows or who knows what else. Again, I tend to look at +/-2% as margin of error, which doesn't give bias one way or the other.

Overall I think this generation has had some excellent performers and I'm excited to see how the tech explodes in the next 12 months after the new consoles launch. I think we will be back to shorter upgrade cycles again - which are fun to follow, but hard to live through if you want to always be on the cutting edge (especially on your wallet)!

As for the 780, it is one fantastic piece of hardware. Its price/performance is excellent... when compared to the Titan, but not much else. I would have LOVED to see it drop in at $500 and everything else drop in price accordingly, but when you have the crown you can charge an arm for hardware and get away with it (Intel is also guilty of this nowadays with the 3930K/3960(70)X).
Culbrelai's Avatar
And especially hard on your wallet, says this teenager with no job =P
trueblack's Avatar

OK, I should respect that, you are right on that aspect 100%.
I was only comparing because someone asked about a 'gamer's choice' only.

Your view and Bobnova's view on the OC-ed end of it will of course be taken as the elite advice for those who can get on better cooling solutions, etc.

Hope you see where I am coming from.

In full agreement on performance. I see little to no bias; some win some, it depends on personal preference more, really.
I agree with that fully as well, and I must confess I share the excitement about AMD's Volcanic Islands chips and what they will bring.
Nvidia NEEDS AMD as competition, and when they compete, we the public win.
Jpaul's Avatar
I prefer a very slight overclock only, too.
I find the 680 and 7970 equal,
but the recent talk and trends in games make me prefer the 680.
Any chance of more price drops on the 6XX series?
Supertrucker's Avatar
I wouldn't expect any dramatic price drops; the price of the 780 was set above even the launch price of the 680, placing it in its own bracket. I expect the 770 to fall in around $500-550 where the 680 used to be, leaving the 680 and 670 in their current brackets.
Bobnova's Avatar
Not updated recently != "not well managed, and their benching is wrong". Don't put words in my mouth.

Not only did I mean something else, I said something else. So let's hear it.

Read the senior member charter, have you? You have no clue, no clue at all.

Not wrong at all, but correcting can be done in a variety of ways, you've chosen the second worst way.
Of course, when correcting someone it helps if you are, you know, right.
When it comes to "right", it helps to have a bit of a perspective. Game another decade and then look back on this.

And everybody (everybody) I know personally who plays BF3 does it in 2D. Guess the "people I know" card can go both ways, who'd have thought? Try again.

The recent driver was (is, unless it's been replaced/fixed) garbage. It wasn't just not the best, it breaks things.

How bout six?

Depends on the game, the drivers, the cards, and the people.
Like I said, I've never had any issues, microstuttering or otherwise, on either brand. The only issue I have ever had is in regards to GTX580s not being willing to SLI with modded drivers.

Plenty, this is only part one

As was pointed out, this is Overclockers.com, we aren't talking to, or for, people who don't overclock.
If you're looking to throw out overclocking results because "everybody else wasn't OCing", you're welcome to, but you might want to find a new forum.

Hey now you say it's about the same! Just recently you were saying the 680 is the clear gamers choice. Perhaps you should retrieve your "contradicting yourself" stone and contemplate your own sins before you cast it again, eh?
For reference, now that they're "about the same", the lowest price 7970 1GHz I can find is $400 flat; the lowest price GTX680 (excluding open boxes, those don't count, nor do used prices) is $450. A 12.5% higher price for "about the same" performance. Clearly this is the right choice?
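That premium can be sanity-checked from the two street prices quoted above (a quick illustrative calculation; these are the forum-quoted prices from mid-2013, not official MSRPs):

```python
# Street prices quoted in the post above (forum snapshot, mid-2013).
hd7970_ghz_price = 400.0  # lowest-priced 7970 GHz Edition found
gtx680_price = 450.0      # lowest-priced GTX 680 (new, not open-box)

# Relative premium of the GTX 680 over the 7970 GHz Edition.
premium = (gtx680_price - hd7970_ghz_price) / hd7970_ghz_price
print(f"GTX 680 premium: {premium:.1%}")  # GTX 680 premium: 12.5%
```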

Who said LN2/water (lol, what a combination) and voltmods? I suspect you forget that unlike Nvidia, AMD actually allows voltage control. Not that all AIB partners do, but it's there on the reference cards.
I suspect you meant "shouldn't make a firm statement". Perhaps a break is in order, keyboard rage may be setting in over there.
On cost, I covered that last quote. The jump in price to get to the GTX680 isn't as bad as it used to be, but it's still a lot for "about the same performance".
On TPU, you clearly failed to note the lack of solidity. One doesn't wager on a solid, known, thing.
Beyond that, I still doubt they re-bench every single card every driver release, largely for time reasons.

Says the guy with "so.. you have an Intel "k" chip, but you think it is best not to OC it... seriously man, what are you?" in his signature. lol

Aha, popularity makes a person correct now?

Actually 3d11 is Nvidia's game if you don't turn off tessellation. You should have done some research on that one.

Uses what? Completing sentences helps a person be understood.

Has it? It started off that 7970s were a terrible choice, not that they were basically at the same level. (Margin means, essentially, border. Sometimes a dividing space.)

Quick driver delivery, that you have to search the partially hidden "beta drivers" section to find. Great.

Now 7970s are stronger?
But GTX680s are better?
The driver support thing continues to be hilarious to me, btw.
The 7970 can overclock better on everything, not just benches.

You contradict yourself here again, first it doesn't have an edge, then it does, but it doesn't matter because Nvidia services the card holders well.
You seem to be of the (highly mistaken) opinion that the benching team doesn't play games.
News flash: We almost all play games. Some of us play a LOT of games.
Bonus news flash: Most of the benching team doesn't overclock much, or at all, for 24/7 use.
Those two news flashes plop us squarely in what you define as "normal gamers". We just happen to be normal gamers that know a touch more about the hardware.

Go re-read your post here: http://www.overclockers.com/forums/s...&postcount=181
Then tell me you aren't saying that the GTX680 is better on performance. Let me give you a few lines from that post, for completeness sake:
"This is my favorite link, as he posted a link that shows the performance of EVERYHTING averaging in, clearly showing GTX680 >= 7970 Ghz"
"multiple sites clear shows the avg performance of GTX680 > 7970GE."
"So rather than 'realizing GTX680 > 7970GE is true since many months ago"
"I am saying that GTX680 is a clear Gamer's choice, cause the performance is better"
"But if you are like the rest, on 1920+/- resolution, GTX680 has better avg gaming performance"

I think you can see how we got the idea you were claiming the GTX680 has higher performance.

Me neither, never seen it, can't talk about it.

I'm going to make an off-topic suggestion real quick here: People take posts that contain proper grammar, capitalization and punctuation rather more seriously than posts that lack such things.

On the subject of performance:
It varies, hugely, by game.
It varies, hugely, by individual GPU as well. What is "stock"?
Is a 7970 GHz "stock" at 1.0GHz? What about Gigabyte's base Windforce model? It's not a GHz Edition, but the clocks are 1GHz stock. Their GHz Edition is 1.1GHz "stock".
I think we covered the overclocking thing enough. I have to say that I find it absolutely hilarious coming from you, with your sig that bashes non-OCers and your CPU OC'd to the bitter edge of MOSFET flame-out.

I say that now that the prices are within 12%, look at the games you want to play and pick based on that.
With the previous pricing, the 680 was lol. With the new pricing, which it really ought to have had for rather a while (and would have, if Nvidia didn't enjoy servicing their card holders quite so much), it's a much more reasonable choice.

The GTX780 is an epic card with a lol pricetag. The titan is even more epic, with an even more lol pricetag.
None of them have decently beefy power sections. One of the main reasons the Titan is limited so much in voltage is that the MOSFETs go blammo somewhere between 1.25v and 1.35v when overclocking. That's regardless of cooling, too.
The 680 power bits are stronger, but still fall woefully short of the 7970 power bits.
The 780 uses the same stuff the Titan does, it can overvolt (with hardmods, being Nvidia) a bit further before blammo.
I don't think I've seen even a reference to a 7970 that chunked the MOSFETs, excluding improperly done OCP hardmods. Those are lethal.

So, to extrapolate a bit, I think we all know that heat kills silicon (not to be confused with silicone). That isn't really up for argument.
MOSFETs are silicon devices.
The more stressed the MOSFETs are, the hotter they run.
The hotter they run the faster they degrade.
The more they degrade, the hotter they run.
It's a positive feedback loop.
It leads to another positive feedback loop, which in turn promptly leads to blammo.
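The feedback loop above can be sketched as a toy simulation (every constant below is invented purely for illustration; real MOSFET thermal behavior is far messier than this):

```python
# Toy model of the MOSFET degradation feedback loop described above.
# All constants are made up for illustration, not real component data.
def simulate_vrm(power_w, ambient_c=40.0, r_theta=1.0,
                 degrade_above_c=100.0, blammo_c=150.0, steps=50):
    """Return the temperature history; degradation above the threshold
    raises thermal resistance, which raises temperature further."""
    temps = []
    for _ in range(steps):
        temp_c = ambient_c + power_w * r_theta
        temps.append(temp_c)
        if temp_c >= blammo_c:        # MOSFET gives up ("blammo")
            break
        if temp_c > degrade_above_c:  # stress degrades the part...
            r_theta *= 1.05           # ...so it sheds heat worse next step
    return temps

mild = simulate_vrm(power_w=50.0)  # sits at 90 C every step: stable
hard = simulate_vrm(power_w=70.0)  # starts at 110 C: runs away
```

With the mild load the temperature just sits there; once the degradation threshold is crossed, each step makes the next one hotter, which is exactly the runaway described.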

Have I seen stock cooled 680s dying yet? No. Do I expect to see them next week? No. Do I expect to see them eventually? Yes. Think GTX570 and GTX590 for examples.

Now that I've spent some time writing all this up, I'm going to get on with my evening. Enjoy!
wingman99's Avatar
Are people blowing up Titans consistently just by overclocking with the multiplier and voltage?
trueblack's Avatar
I refuse to comment much on your post that is aimed to insult people on grammar etc, least I don't respond to those with a vocab of a 12 year old. You can do that all day, you will get not much response from me. But you sure show your personality very well as a senior member, interesting. Glad to meet you, bobnova.

Now, you read the above out of context completely, but I will help you understand it, since you need the help.

in gaming. and it is true, it shows that. if you can't read graphs, I can teach.
In gaming. and it is true. numbers Do not lie.
In gaming, and it is true.
and again, it is true, according to many sites. I know you might want to disagree, you are welcome to.
yes, you should cause it is true. IN GAMING.
The whole context is "Why is GTX680 better than 7970 in gaming."

hope that point is now across, and you have no more questions.
albeit, the margin is SMALL, GTX680 is better by fractions which makes it -pointless- to discuss.

Now, I really don't know what friends you have, or why they do not try 3D, it's either they don't have the money to buy the 3D monitor, or maybe it is a generation gap thing. Afterall, it is a younger generation thing, I don't expect the older generation to see too much value there. But that's OK, the old generation will soon be replaced by newer, and just like CRT TVs will be replaced by LEDs, it WILL happen, just see. For now, I agree, I think less than 20% of people use the 3D feature, but those that do, LOVE it. I believe it has a future.

Your points on the 7970GE being a good card are not bad; it is a good card with what you do to it. Voltage control, etc. That card can take lots of abuse, so I understand your loyalty to it. And I respect that. I just find it weird that you refuse to respect 'the majority' that doesn't do OC-ing above what CCC might offer. For us, 7970GE <=> GTX680 in most cases, depending what you are doing. But lets not go into circular references again, I think we got that point across.

Your stance on CFX being fine is the only thing I feel you are being Ultra Stubborn about. But there's nothing I can do about it, your point is your point. There are people complaining everyday about it, I live next to one, but guess you won't believe it, despite AMD admitting CFX requires a fix.

Now Bobnova, you are a senior member. I am NOT SURE what makes one, nor am I interested. But people 'look up' to you here as your name is Blue. You among anyone SHOULDN'T make statements that will jeopardize your credibility. You know what I mean, when you say things like:

This about TechPowerUp Update frequency:
you mean you are not calling them out for posting a poor comparison with not oftenly updating their drivers? and that is not a reflection of poor management? Do you simply mean they don't know what they are doing, and hence don't update enough? If you mean something else there, you sure lost me there. Yup. Don't know what else you can mean other than utter disrespect for a site that does review.


That is not only false, it sparks fanboyism wars. Because for the rest of us, what you just did there was 'ignore' all review sites that said otherwise, replace reality with your own, and call everyone out for buying things that are not worth it. (That's exactly what's implied in your speech.) If you think you can say that and not expect a rebuttal, well, maybe you should post less; it is not objective, nor representative, and it makes this forum look like an AMD fan site.

Not everyone wants to void their warranty by posting the EULA line in the AMD BIOS. Most people I know will just use PrecisionX and CCC, and get that extra 10% until the air cooling begins to throttle from the heat. And on that front, both cards OC on air about the same. (You are free to comment.) And those of us with gaming sessions longer than 2 hours want to keep our GPU cool; the last thing we want is a sudden FPS drop or crash, so I hardly OC it near any possible limits.

I used KGB for my GTX670 however; I researched it, it is legal to use, doesn't void the warranty, and the card operates quite well with it. A GTX670 with it can operate like a GTX680 at stock, and that makes a LOT of Nvidia users happy. A GTX680 on it, with just simple PrecisionX, will match a 7970GE in most categories, except out at the boundaries where both cards might crash. If you do not agree, please make a comparison of them. I believe many people would like to read your OC review of these cards done objectively... like you said, this is an OC forum.. best see something there.

Another point I would like to make is that I find the 7970 burning up to much hotter temperatures than the GTX 680 over time. So in longer gaming sessions, AMD cards actually 'cannot' be OC-ed too much, else they reach temperatures that the average person might not be too comfortable with. This might have something to do with the 7970 using more power, not sure. This is the precise reason, however, why I would overclock my GTX 670 at 1.212V, but my roommate who is still on a 7970 will just slightly overclock the voltage in CCC, and not all the way. This area I won't pretend I know much about, but this is what we observed.

btw, I got my Mobo and CPU for free, from a sponsor, so unlike my GPU, I will clock the shxt outta it. I normally won't even buy an Enthusiast-class chip, as it is complete overkill for what I do. So you know that. (As you questioned.) I have my old 2600k to fall back on when this dies, and in a way, I am waiting for it too.. surprised it hasn't.

I am in a pretty good mood today, but I do have a paper to write.
if you have more to say, you can post, my next reply can take a while though.

ps: I don't think the GTX780 price tag is too bad. For something with no competition, they can price it anywhere they want. That's what corporations are SUPPOSED to do: SELL new technology, and compete on price when matched. I don't think that is hard to understand.
Culbrelai's Avatar
Overall, why are GFX cards so limited with overclocking?

I mean CPUs are pushed crazy beyond limits, running 24/7 at clocks 2.0GHz above spec just fine, etc. I pushed my Xeons 50% higher than they're rated, and their temps only rose 5C at idle, lmfao. And really GPUs would benefit (it seems) WAY more than CPUs...

So what limits GPUs? Their power sections are all woefully inadequate? What are CPUs' "power sections" like? The whole 8+2 thing? How do you find out what your mobo has?

Perhaps they bielieve that the quality is worth it? I certainly do. 690s and Titans look like you could throw them at the wall and they'd be fine, AMD cards not so much.
Pierre3400's Avatar
Your cooling is a lot better than stock Intel cooling, for one, and at idle the temp should remain almost unchanged, as it should throttle back down, unless you have set it to be at constant peak performance.

The main reason we are getting limited on the GPU front is RMA. We want cheaper cards, and that means cutting cost; but cutting cost, you cut quality, and by cutting quality, you're cutting away at the headroom for extra performance and running stock performance closer to the cliff. By doing that and allowing OC, a card will die within the warranty period, and that means the manufacturer has to replace it out of their own pocket, which is a loss.

By limiting OC, they won't see nearly as many returns.

Now, I know you're going to be thinking, and wanting to say: but if quality has dropped and production costs have gone down, why are the cards still ever so expensive? Simple: the people that have to work to make a new GPU every year need to get paid too.

Well, this is what my logical sense is telling me, and what I have gathered from reading around.

Bielieve? You must mean believe, and not Justin Bieber crap!

Apart from that, I must disagree. Looks have nothing to do with it. I know as well as you know that you have never tried to take apart an AMD card with the reference cooler. The cooling structure is just fine, and just as solid as Nvidia's. The difference between AMD and nVidia is not quality. It's price. nVidia just charges more. The higher price then means they can pump more money into making even better GPUs. But it has nothing to do with quality, please understand this!
Culbrelai's Avatar
They are always at peak (3.5-whatever GHz) because TurboBoost and SpeedStep are disabled automatically so as not to overclock the overclock, at least on this board.

But I get what you mean; they don't want people to bust their cards and be RMAing them. But if you simply make overclocking void your warranty... wouldn't that solve it? People that push them would merely be taking a risk with their own money.

Well, if you merely compare our two lovely forums here...

AMD - http://www.overclockers.com/forums/f...splay.php?f=85

and Nvidia - http://www.overclockers.com/forums/f...splay.php?f=86

I see two complaints about dead/dying AMD cards (both 7950s), and none to note about nVidia...

And the irony here kills me - http://www.overclockers.com/forums/s...d.php?t=728850

Clearly, the quality is lacking.

You buy a cheap, crappy $100 particle-board Walmart desk (like the one my parents bought me that I'm using right now), and when you set 75lbs of computer on it, it creaks and moans, and will likely only last a couple of years.

However, buy a much more expensive fully Mahogany desk...

It could become an antique, and be worth much more than the original price...
Pierre3400's Avatar
They could, but they will have a hard time proving that it died under OC.

If you took the time to read about the dead/dying 7950s, you would know that those cards are used for mining Bitcoin, which btw nvidia cards are USELESS at. Mining Bitcoin puts the cards at 100% stress 24/7. If you did the same with nvidia cards, you would see a couple fail too.

The irony is that my 7970s turned out to be 100% fine; it's a motherboard issue. So pipe down.
Culbrelai's Avatar

That one was Bitcoin mining? How do you know? Do you have a crystal ball? Lol

I doubt Bitcoin mining would stress quality cards enough for them to emit white nonsense =P Bah, internet get-rich-quick schemes.
Pierre3400's Avatar
I know you are by far the biggest nvidia fanboy OC.com has ever seen.. Now I can't finish this sentence because I am on thin ice for arguing with you in the first place, but you hold a certain place in my heart.

You may be right that AMD cards fail, but nVidia cards fail too.
Simply because I'm sick of getting warned for arguing with you, I'm going to drop it from here on. But just because you're a teen with no job, it doesn't make you all-knowing.

Oh, and I'm happy I got that quoted before you changed it.
trueblack's Avatar
AMD cards do things like Bitcoin mining better than GTX cards, because GTX is designed to excel in other things (or maybe even purposefully nerfed). However, if you use the same chip technology and put it in a Tesla card, then the AMD card cannot even COMPETE at that level.

The largest supercomputer in the world, Titan, employs Tesla cards, thousands upon thousands of them, to do calculation at incredible speed.

Imagine using THAT computer for Bitcoin mining. Stupid and a waste of resources, yes, but imagine the success it gives. lol.

AMD GPUs are more generalized in that fashion, but being generalized is also another way of saying it is not specific enough; these are the exact words both companies call each other. Nvidia says AMD is not specific enough, and AMD says otherwise. AMD does have calculation-only cards too, but as far as I know, nvidia is a mile ahead in that tech.

I remember the 8800 was at some point able to get a firmware flash and work like a Tesla; I am not sure whether modern GTX cards still have this capacity locked within.
Supertrucker's Avatar
If nvidia cards are worse at bitcoin mining, then no matter how many nvidia cores you throw at it, a similar number of amd cores would do more.

Saying the world's largest supercomputer uses nvidia so nvidia must be the best is like saying McDonald's is the biggest food chain in the world so McDonald's has the best food.

Edit: also, it's laughable to say nvidia quality is greater than amd quality; there is a reason nvidia hard-locked voltage on the 6-series cards, after how many 570s and 590s blew their VRMs.
Pierre3400's Avatar
As far as a supercomputer goes, when it comes to huge orders like that, it's not like some guy goes out and buys 500 Tesla cards. They contact producers of chips and ask them for an offer on chips, and ask if they can make those chips work with the system they are running. You don't know the specifics of how those deals go down, only a select few people do, and when it comes to that sort of stuff, it's a hush-hush business.

And as far as mining goes, AMD's FirePro cards suck at mining too, GeForce cards suck at it, and Tesla even more. This is a pointless and endless discussion.

This all boils down to a quality question, or should we say price range vs. quality vs performance.

We all know AMD is beating the life outta nVidia on Price vs. Performance. As far as quality goes, only a select few people with the know-how are able to tell us how the quality differs.

With that said, I would consider owning nVidia cards if the price wasn't so high, and I have owned nvidia cards before. My last one was a GTX590, which had to be RMA'ed because it was broken, but we'll gloss over that to keep Culbrelai's pride intact.
trueblack's Avatar
Terrible analogy you're using.
And I do not think you understand Tesla structure or computing power.

You got some reading to catch up on. Til that, there's little we can discuss about.

If you don't mind, I can select some for you. And Pierre is just blindly agreeing with you there; he doesn't understand the difference in the Tesla card structure either, from the few lines he's saying.
Pierre3400's Avatar
True, I may not understand the full structure, yes. But at the end of the day, that still has zero to do with the overall build quality of the finished product, and I have never claimed to know about the structure. If you read what I wrote, I talked about an order, where the buyer sets up the requests for the chip. nVidia were able to deliver, and that's why Teslas sit in a supercomputer today, but it still serves no purpose in the debate whatsoever.

If you want to go down this road, then why do all the new gen consoles have AMD/ATI and not nvidia?
trueblack's Avatar
Firstly, I want to say one thing. ON THE TOPIC of Bitcoin mining, the 7970 is probably one of the best cards to do so; I won't go into specifics, but even the GTX Titan cannot compare. When I was saying to use Titan to mine, I was referring to the 'supercomputer Titan', which has 18,688 Tesla K20X cards, and the combined wrath of that can beat 'anything' today in calculations.

OK, now that being said.

back to topic:
Quality: I don't have data.. but I read about both nvidia and amd cards dying. I will read some and come back; I don't want to just make stuff up.

Now, why do the new-gen consoles have AMD/ATI? From the talks of companies like Sony etc, they say it is cost related, so obviously AMD was able to deliver the chip they need at the lowest bid. Which AMD SHOULD be able to; after all, that company makes CPUs too, so the synergy in getting console chips SHOULD be higher. My understanding was that nvidia was actively pursuing that business too, but lost out in the end. I guess when it comes to mass production, AMD has a serious edge, and AMD knows it; that's why they prefer to do mass-production business, and nvidia seems to focus on crown jewels. At least, it seems like that to me.

AMD likes to go Bottom-Up: work the business, mass produce, and if we can jack it up to top nvidia, we do it.
Nvidia appears to go Top-Down: work the expensive chip, the crazy expensive card, and if we can make it cheaper for the mass market, then we do it.

(The above I base on my impression only. I base my impression on product lines and stuff I read, not facts.)

However, why does the supercomputer use the Kepler Tesla K20X? They claim they spared no expense, and for 'what they do', Tesla cards give better performance; this can be read in the long article on the 'Construction of the Titan'. I read it around 3 months ago... was a good read.
EarthDog's Avatar
forget it, not worth it...
Pierre3400's Avatar
You actually picked up on something very interesting here. If AMD produces a lot more than nvidia, and AMD tends to have more failures than nvidia, then with a bigger production we should be looking at failed cards vs. produced amount, and the numbers might not be as bad as people would like to think they are?

Just to finish off my side.

It seems we agree at the end of the day, it just dragged out. Yet all that had nothing to do with the quality talk that was going on, and it seems to be an even score on that account from your point of view.
bluezero5's Avatar


this will tell you discrete GPU market share. AMD has roughly 1.2 times the share of NVDA.
Quality wise, I got both.. I feel AMD is tougher.
AMD is like steel, meant to be abused.
NVDA feels like a gemstone: shiny, but more fragile.

However, because of this, AMD users tend to be rougher with their cards, which can also be a reason for more failures.
Maybe because NVDA costs more in general, users tend to be more loving to their card.
bluezero5's Avatar
hey.. let's try not to startle him.. he is a warrior..
you are just asking him to challenge you there.... ....

EarthDog's Avatar
Not challenging anything, people have already responded in kind to this.... display of knowledge. I just bought the popcorn and am watching the show now. Curious to see just how much dirt one can shovel on their own head at this point.

@Pierre, if you are not looking at broken card rates by % in the first place, you are not doing it right. If AMD sold 1M cards with a 2% return rate vs. nvidia that sold 750k with a 2% return rate, it's going to look like AMD cards suck using a raw quantity like that.
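EarthDog's hypothetical can be spelled out numerically (the sales and return figures are his made-up example, not real data):

```python
# EarthDog's hypothetical: identical 2% return rates, different volumes.
amd_sold, nvidia_sold = 1_000_000, 750_000
return_rate = 0.02

amd_returns = int(amd_sold * return_rate)        # 20,000 dead cards
nvidia_returns = int(nvidia_sold * return_rate)  # 15,000 dead cards

# Raw counts: AMD "fails" 5,000 more cards, yet the failure *rate*
# is identical -- which is why rates, not counts, are the right metric.
print(amd_returns, nvidia_returns)  # 20000 15000
```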
bluezero5's Avatar
OK, good luck. Though I disagree with you doing this. Scares me a little.

Bobnova's Avatar
Hey look at that, proper grammar! Well, almost. As long as we're going to trade a few insults here (I think you were trying to say I have the vocabulary of a 12 year old, rather than saying you have such a thing. Maybe I'm wrong), know that "oftenly" is not a word.
Proper grammar does help people understand things.

I mean that I doubt they have rebenched everything with the current drivers. Sort of like I said in my last post.
This has absolutely nothing to do with their management, it means nothing about my respect for their site, and I'd like my apology now.
NOBODY re-benches everything on the most current drivers every time a new driver comes out. We here at OCF try to keep up halfway decently, but only on a very limited number of cards.
With a large card database it would take more time than you have till the next driver update to rebench everything.

lol, like your posts?

You keep saying "like me", and now you try to make yourself different. Make up your mind.
Either you think anybody who doesn't OC a K bin CPU is worthy of being mocked or you don't. Given your sig, I'd say you do.
Given that, your stance of "People like me don't OC their GPU much" looks awfully silly.

Bonus: Defeating the swear filter is a Bad Thing.
Glad you're in a good mood, me too. Sunburnt, though.

The ability to price it wherever they want does not make the price good.
You're mistaking "good price" for "proper business practice".
Proper business practice is to ream out the customers' wallets to the largest extent possible without getting (successfully) sued.
That doesn't make price-rape a good price from the perspective of "the normal gamer". If you're going to launch into all this nonsense from the perspective of the "normal gamer" then perhaps you should keep that perspective when it comes to price.

With regards to Titan, or "Supercomputer Titan" (lol), neither can mine bitcoins for beans.
Nothing Nvidia can.
Because it's the highest performance single core out there, bar none. Why would they use anything else?

I have to be a bit honest here and admit that I enjoy arguing, I don't actually care what misguided opinions you may or may not have, I just find it amusing for some reason to argue with people that have such passionate beliefs that they make up "logical" reasons for them. Watching grammar, punctuation and spelling break down is kind of fun too :P
Now that said, I should probably cut it out before I get yelled at, so for now know that I'm laughing at every post, and try to work on grammar/spelling/punctuation for that paper
DarrenP's Avatar
Guys, honestly, get the thread back on topic instead of rambling on about each other's grammar and whatnot. This is a thread for talking about the GTX 780, so let's talk about the 780, shall we? Like, for instance, what would you guys be looking for in a non-reference design of the GTX 780? I think a DirectCU II would look good, honestly!
Noshei's Avatar
Frankly, I'm rather disappointed overall. I knew going in I would be, mainly because I would rather see more GK110 cards than the older GK104s. I get why they are doing it this way, but it still feels rather "slap-in-the-face"-ish to me.
Bobnova's Avatar
Given the extremely limited overclocking and overvolting, there's no real need for a non-reference. Slap a better cooler on it (to prevent throttling) and call it a day.

GTX770 wise, it's better than I expected.
If this is a slap in the face from Nvidia, what was the GTS250?
DarrenP's Avatar
I didn't intend to write GTX 770; I mistook this for another thread. I think they should make a voltage-unlocked 780, then slap a great cooler on it. Just for us overclockers!
MattNo5ss's Avatar
That would be nice if NVIDIA didn't prohibit partners from doing it...

My GTX 780 and GTX 770 can both get well over 1200MHz even with their limited voltage control.
Darknecron's Avatar
Damn. O_o

A roundhouse to the nether regions.

No matter, this has become a friendly NVIDIA vs AMD discussion thread anyways. :P
DarrenP's Avatar
Someone should slap Nvidia upside the head with a big wet fish. They could be making so much more money. All they'd really have to do is put in their warranties: we do not cover damage due to overclocking.
bluezero5's Avatar
I feel their strategy is using the GTX 780 as the flagship,

and they just hope the GTX 770 will be a 7970 killer (explains that price range),
but it really should give a hint more power.... which brings me to a potential voltage mod in the BIOS. I will ask about that in the 770 thread!

EDIT: looks like it already defaults to 1.2125V. Ah well.. was hoping for a surprise.
Noshei's Avatar
My guess is that it is already in there (in far different terminology, though). The problem isn't writing it down; the problem is proving that you overclocked the card and that the overclocking resulted in a hardware failure.

This is why they take a harder stance here than all of us would like. If they gave us unlocked voltage and allowed manufactures to do more custom stuff they are opening the door for a lot more hardware failures. This can easily result in horrible press and bad looking stats, which is generally bad for business.

It is just one messy area to get into.
DarrenP's Avatar
Intel doesn't have many problems with stats on their unlocked CPUs. A lot of us, myself included, have overclocked the snot out of our processors. I've surpassed 5.5 GHz on an H100i and mine runs like a gem; granted, I didn't stress test. I digress, however: I would be much less wary of buying an NVIDIA card if they weren't locked.
Noshei's Avatar
There is a huge difference that you are not taking into consideration. Intel doesn't have to worry about everything that is on the motherboard if it breaks as a result of an overclock. NVIDIA has everything on the card to worry about, which means a lot more potential for things to break.
DarrenP's Avatar
What I mean is, you could easily fry a chip, just the same as frying an NVIDIA card. I just think, from a consumer perspective, one card that you can OC the snot out of, with a phenomenal cooler and a good amount of VRAM, wouldn't be too much to ask for. I should also say the press would regard this as the overclocking card, and if you destroy it, at the end of the day: too bad, so sad.
bluezero5's Avatar
True on that front. I do wonder how much warranty service costs each company as a loss in revenue, or whether that cost gets passed on to the partners like Gigabyte, MSI, etc. Logic tells me the latter case is more likely.
Noshei's Avatar
From my understanding, it's kinda split between both. The service cost (technical support, shipping, etc.) is on the distributors; the hardware cost is on NVIDIA.
wingman99's Avatar
NVIDIA only warranties the GPU; the partners take care of the rest.

NVIDIA won't allow more voltage from the partners, so it won't fry the GPU.
Culbrelai's Avatar
Because AMD takes the budget-over-quality route, and it will always be that way; it's even true with their CPUs.

Hell, perhaps Micro$haft and $0nya want their consoles to fail early, so people have to buy multiple ones =P
Bobnova's Avatar
From what I've heard and read, the Kepler dies degrade pretty quickly starting at 1.3v.
That, and being able to save money on the power bits, is likely why they put such a hard lock on voltage.
EarthDog's Avatar
I don't see a quality issue with AMD. Does anyone else?
Darknecron's Avatar
My only complaint was the initial instability of their drivers . . . which has now been dealt with. Also, Bad Company 2 crashes on AMD cards when MSI Afterburner is running, but I made my own fix for that.

I'd say that efficiency (performance per watt) is what AMD skimps on to reduce their prices.
wingman99's Avatar
I have not seen a quality issue with AMD either.
Brolloks's Avatar
AMD cards - Build quality issues: no... Driver issues with a single card: maybe initially... Driver issues with CFX: always.
Pierre3400's Avatar
I have no issues with any of my Rigs with CFX

I've got my 7970s and have been testing the 5870 with 5859 today. No driver issues.
bluezero5's Avatar
I happen to think AMD stuff can take a fair bit of abuse.

The problem is, because of that, users tend to be a bit rougher when it comes to handling AMD GPUs. I was guilty of that too; I often go in with the mindset, "bleh, it can take it," and just overvolt until it crashes. Hahaha. That being said, my 7970 is still alive, so I haven't experienced any quality issues yet. I've never tried AMD CPUs; all the complaints about AMD I've read on forums at this point, so I prefer to stay neutral on this issue for now.
trueblack's Avatar
Roll eyes. Four words for you: floating point vs. integer. That's my point, but your attitude there shows me no reason to explain; you are just trying to insult, not debate. And to think I accepted your earlier request by PM in another thread to help you keep some 'credit' by editing my post... I guess I shouldn't have done that and should have just exposed you the way I did. Well, not next time.

The difference between you two and me is that, despite being in a heated argument where we clearly disagree, I was trying to explain my point of view. I will confess I don't have the best attitude in the world; I tend to be blunt, and some find it insulting, but I am delivering a point backed with data: data from objective sites, multiple of them, views from the majority of forums, etc. You two are just backed by your own feeble minds.

You two are just picking a fight on the forum with those comments. Xeon_Ki thanked you for it too; grats. Bobnova has no more points, so he went down the grammar-police road, which is very common for someone losing a debate, I understand. Denial is the most predictable response.

Just to stay on topic:
My previous point was: GTX 680 <=> 7970 GE in the gaming arena, but the added features make it better, and the OC edge doesn't matter to the majority as much; that is what makes sites like TechPowerUp good review sites. They have already conclusively shown my point; I need not say anything further. Things like microstutter make AMD currently a TERRIBLE choice for gaming; you can't even PLAY on CFX. CFX they say they will fix, and when they do, I will change my point there. Until then, the 7970 GE has no future-proof dual-GPU capability like the GTX 680 has. The BF3 issue causes a drop in FPS, but NVIDIA already explained on their forums why they chose it that way for this driver; I won't explain any more, since you will just repay my generosity with insults. I was trying to explain to you three 'why' that is the case, but I suppose there's a clear mental barrier for the three of you; you folks aren't even debating anymore. But of course, ED and BN have this 'I am holier than thou' attitude, so why would they listen? Disgusting.

I have no intention of 'continuing a fight' at your own low level, but I am obviously very disappointed in you so-called 'Senior Posters'; all I see there is a display of immaturity, an unwillingness to discuss, and an unwillingness to accept a view that's widely accepted on other forums and by the majority. Though in each forum there are always people like you two, it tends to be 'a wild noob appears', not 'a wild senior poster appears'. If you two had stuck to the discussion and used valid points and data, and if the data were undeniable, guess what: I would accept it (if you had such data). But because you have none and cannot prove me wrong either, you just went off the deep end instead. I back up almost ALL my posts with site data, views, and reviews. I even include points that contradict my view, to stay objective, saying things along the lines of: "Some sites say AMD is better; I don't dispute that. It is the full average of the database that's meaningful, and the average of all data says they are about the same."

If you cannot see the points from my previous posts and want to go down the insulting route, you can do that all day and show everyone how immature you are. I have the maturity to stay ON TOPIC despite poor attitudes on the internet. Now, seeing that our 'quarrel' doesn't really interest this forum at all, nor is it good for other posters, I will stop right here.

oh. Live Long, and Prosper.
Bobnova's Avatar
He who stones the first throw shouldn't glass in a house live, or something.

I can't quite help myself, I just have to quote what makes TPU a good review site in your eyes:
TPU is a good site because they agree with you

I think you missed the post(s) where I explained my point of view. Either that or you read 'em and ignored them completely because they don't align with your point of view.
If you want your points taken seriously, take other people's points seriously.

If you think we don't know the difference between floating point and integer... lol.
trueblack's Avatar
Not only them; many more.
And you keep missing my point: some sites agree with me, some disagree, which is important.
You need to look at the FULL database to see how true it is. Then there is the OC edge, which we disagree on, and you and I both have a point.
BUT you explained 'your view' with no data to back it up; therefore, while I accept 'your view' (with respect),
I fail to see how 'your view' is more important than the database of plenty of gaming reviews.

I am not like you; I won't waste time looking through paragraphs, insulting people on grammar, etc.
That's just... well... immature, dude. How else can I describe it? Senior poster, eh?

And yes, as of now, if you can't even understand this concept, I find it hard to believe you understand how integer vs. floating-point calculations can differ.
I would love to debate it... but first you'd have to grow up a little, I guess.
Bobnova's Avatar
Can you explain this sentence?
Can be what?
Different? Performed? Purple? Can exist at all? ("How can this be"?) Done at differing speeds on differing GPU architectures? Are you talking single precision or double precision? What size integer?

You clearly are not like me, lol. I think that's fairly obvious.
My grammar suggestion is helpful: The more mature your writing looks, the more people will take you seriously.
Culbrelai's Avatar
Yes, and therefore more heat, and hence a shorter lifespan.

I would call a company that tries to extend lifetime by making its chips more efficient a better-quality brand than one that skimps on efficiency to grab more of the market at a lower price.

But that's just me. It's not my fault if people have low standards for products =P
Bobnova's Avatar
More heat != runs hotter.
Running hot shortens lifespans (we are talking genuinely hot here, though, not 80 °C); producing heat does not.
trueblack's Avatar

Not entirely correct again.

You're trying to distinguish between running at a hot temperature
and producing a lot of heat that dissipates into the cooling solution.

Both running hot and producing heat can kill a chip, one more than the other.

What is producing heat?
Heat is produced by electrical resistance when it meets voltage and current.
Too high a voltage can generate electron saturation that kills a chip outright, even if you run it at 10 K (that's 10 kelvin).
You are saying that producing heat doesn't kill a chip. Really?
Because last I checked, superconductivity isn't available for modern chips yet. Where did you read that?

Similarly, producing heat, even when you are able to cool it, will cause degradation in time.

It is a function of voltage, time, and temperature.
Producing too much heat (despite being able to cool it) and having a chip exposed to high temperatures will both cause degradation in chips.
Bobnova's Avatar
Lol, you're reaching on that one. Producing heat can be a byproduct of running too high a voltage, but it does not in and of itself mean the voltage is high.

That's like saying that speeding will wear out your engine without mentioning how much throttle you're feeding it or what RPM it's running.

Your theory is interesting, but you are leaving out the current half of the equation.

As a bonus, chips still die at 10 K. Benchers run them there on LHe from time to time.

So, in short, producing heat that is dissipated correctly does not in and of itself cause harm.
Adding voltage to a given equation can cause it, and a side effect can be damage, but it's from the village, not the heat production.
trueblack's Avatar

I think you mean voltage, but I don't pick on spelling; I understand you well.

Voltage through resistance will, by definition, cause erosion. Especially with shrinking dies, parts get smaller and more vulnerable to this.
This becomes more and more obvious, and therefore when the die shrinks, the voltage must go down as well. You should know that.

So, while I will continue to read up:

even running cold, high voltage (which produces heat; heat is a function of voltage and resistance) does kill a chip. I feel this is common sense, no?

So I would like to say:
producing heat, 'passing voltage', WILL kill a chip by definition. Saying it won't is crazy talk. The question is 'when will it?'

The correct statement should be:

As long as a voltage is being passed, the lifespan of a chip will be shortened, whether in a cold or a hot environment.
Heat on top will accelerate that process as a catalyst, as heat increases electrical resistance, causing the same voltage to kill the die faster.
Therefore, producing heat is in fact the process that kills the die (when voltage meets resistance); we just try to keep voltage low and temps low to prolong the process.

The only way that producing heat won't kill a die is with superconductivity.
Additional details or fleshing out of the content are welcome.

Comebacks? Or do you refuse to admit it?
Bobnova's Avatar
Yeah, voltage; my phone really wants it to be village. It's a punk.
I absolutely agree that voltage kills. Adding high temperatures to that kills even faster. What I don't agree with is that heat production kills.
As an example, does an FX-8150 die faster at a given voltage than an FX-4100? It produces more heat but runs the same voltage.

Along those same lines, we're back to where we started: 7970 lifespan vs. 680 lifespan. Feed them the same voltage and the 7970 eats more power and hence makes more heat. The current side of Ohm's law is sort of important.

Voltage and heat production seem to be the same thing in your mind; they are not in reality.
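To put rough numbers on that Ohm's-law point, here's a quick sketch; the voltage and current figures are made up for illustration, not measurements of either card:

```python
# All the electrical power a chip draws ends up as heat: P = V * I.
# Two hypothetical GPUs at the SAME core voltage but different current draw.

def heat_watts(volts, amps):
    """Power dissipated by the chip; in steady state, all of it is heat."""
    return volts * amps

V_CORE = 1.2                       # same core voltage for both hypothetical cards
card_a = heat_watts(V_CORE, 150)   # lower current draw  -> 180 W
card_b = heat_watts(V_CORE, 200)   # higher current draw -> 240 W

# Same voltage, more current, more heat: the current side of the equation.
print(card_a, card_b)
```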
trueblack's Avatar
Okay, so we just have a discrepancy on 'what is heat production'. You meant it in the mass-scale sense; I meant it on a small scale. Both are correct. I can live with that.

I meant: voltage meets resistance = heat production and power.
You meant: total thermal profile.

I happen to think AMD dies are a bit more heavy-duty, so at the 'same' voltage on 28nm tech they might actually last longer, which justifies their higher TDP target without shortening life, though I have no info to back myself up here. 'Heavy-duty' is the word I am using to describe the design at the silicon level, which makes it more able to withstand voltage.

BUT the heat production (which in my world = voltage meets resistance = putting it to work) will still kill the die, given time. So in general it is always good practice to keep the lowest operating voltage for your system. So to answer the question: keep it cool, -and- keep voltage at the minimum required, and the chip will live longer. AMD also tries to lower the TDP of their chips; they just don't think it is necessary at this moment. It is true their chips are power-hungry (there's no dispute there), yet their chips are made to withstand that.

I too think it is more elegant to have a chip with a low TDP, more energy-efficient, but that might not be good business...

Bad analogy time:

Think of power plants:
of course nuclear is better, more efficient,
but coal is just a lot cheaper; who cares about green energy if it gets the job done?
wingman99's Avatar
Degradation or killing the die is all about electromigration: the transport of material caused by the gradual movement of the ions in a conductor due to momentum transfer between conducting electrons and diffusing metal atoms.

Thermal effects http://en.wikipedia.org/wiki/Electromigration

In an ideal conductor, where atoms are arranged in a perfect lattice structure, the electrons moving through it would experience no collisions and electromigration would not occur. In real conductors, defects in the lattice structure and the random thermal vibration of the atoms about their positions causes electrons to collide with the atoms and scatter, which is the source of electrical resistance (at least in metals; see electrical conduction). Normally, the amount of momentum imparted by the relatively low-mass electrons is not enough to permanently displace the atoms. However, in high-power situations (such as with the increasing current draw and decreasing wire sizes in modern VLSI microprocessors), if many electrons bombard the atoms with enough force to become significant, this will accelerate the process of electromigration by causing the atoms of the conductor to vibrate further from their ideal lattice positions, increasing the amount of electron scattering. High current density increases the number of electrons scattering against the atoms of the conductor, and hence the speed at which those atoms are displaced.

In integrated circuits, electromigration does not occur in semiconductors directly, but in the metal interconnects deposited onto them (see semiconductor device fabrication).

Electromigration is exacerbated by high current densities and the Joule heating of the conductor (see electrical resistance), and can lead to eventual failure of electrical components. Localized increase of current density is known as current crowding.
Culbrelai's Avatar
AMD GPUs still run hotter than nVidia ones, even more so overclocked, I presume.

So then the 7970 will die earlier. What's so hard about that?

AMD skimped on efficiency for price, and its customers pay the price.

Bobnova's Avatar
When I say "increased heat production", I mean "creating more heat". I have no idea what you mean by large scale and small scale. Making more heat is an exceedingly simple concept. Power consumption = heat production. It doesn't matter what scale it is on. Venture into the nano-world and it's still true.

Without a voltage change or a temperature change, creating more heat (reduce the resistance, voila!) will not degrade a chip faster.

Both the FX-8150 and the FX-4100 are on the same process (same die, even, I think); the FX-4100 has two modules (or whatever AMD is calling them; four cores) disabled. I think you missed this part.
The FX-4100 is a 95W chip; the FX-8150 is a 125W chip.
In "your world", does the 8150 die faster?

That's what heat production is, almost, anyway. Heat production is a factor of voltage and current; the resistance is (generally, but not always) what sets the amount of current flow, and hence the power consumed and the heat output for a given voltage.
Higher voltage degrades faster. Higher heat production at the same voltage (easy to get: just raise the clock speeds and speed the heatsink fan up to compensate for the extra heat) will not.

How hot a core runs depends on two primary things: heat produced, and the cooling solution.
A 35W laptop CPU can easily run along at 100 °C, while a 200W GPU can easily run along at 60 °C.
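That last point can be sketched with a basic steady-state thermal model; the thermal-resistance figures below are invented for illustration and aren't specs of any real cooler:

```python
# Steady-state die temperature: T_die = T_ambient + P * R_th,
# where R_th is the cooling solution's thermal resistance in degrees C per watt.
# Illustrative numbers only.

def die_temp_c(ambient_c, power_w, r_th_c_per_w):
    """Rough steady-state die temperature for a given heat load and cooler."""
    return ambient_c + power_w * r_th_c_per_w

AMBIENT = 25.0
laptop_cpu  = die_temp_c(AMBIENT, 35, 2.0)    # cramped laptop heatsink: 2.0 C/W
desktop_gpu = die_temp_c(AMBIENT, 200, 0.18)  # big desktop cooler: 0.18 C/W

# The 35 W chip ends up near 95 C while the 200 W chip sits near 61 C:
# heat produced AND the cooler together set the running temperature.
print(laptop_cpu, desktop_gpu)
```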
Darknecron's Avatar

My 7970 stays under 80 °C during a several-hour-long benchmark at the settings in my sig. At stock settings, it NEVER gets over 65 °C.
Culbrelai's Avatar
My 670 with a reference blower, overclocked, never gets above 55 °C, lol.

You can't argue with numbers; look at the link I posted in my last post...
wingman99's Avatar
That is not true; it depends on the conductive material in the die and the running temp.
Bobnova's Avatar
And some non-reference 7870s don't even get that hot, what's your point?
Have you, perhaps, noticed that coolers have a bit to do with the running temperatures of the cores?
trueblack's Avatar
"Will not degrade faster": now you are right.

Your previous assessment, however, said otherwise,
and that is wrong. Producing heat WILL shorten lifespan, though given the correct parameters it might not shorten it faster. That is what I was pointing at as misleading previously.

It depends on context. If we use your multi-core chip example, you are right, because the excess heat production is from 'other cores doing work'; if the heat can be dissipated and doesn't add to the chip's overall temperature, it will not kill the chip faster. (This is what I meant when I said you mean producing heat on a mass scale.)

However, in another example: two chips, both single-core, both at the same voltage, but one generates more heat because it has higher electrical resistance. Even though both chips can dissipate the heat, the one with more resistance will likely see more electromigration and degrade faster. (This is what I mean by small scale.)

Reducing resistance to zero can, in theory, produce something with near-infinite silicon life; this is one of the reasons people are interested in the area.
wingman99's Avatar
Well, I have done some more reading on electromigration, and you don't have to have heat to move the atoms; it can also be done via diffusion mechanisms.


In a homogeneous crystalline structure, because of the uniform lattice structure of the metal ions, there is hardly any momentum transfer between the conduction electrons and the metal ions. However, this symmetry does not exist at the grain boundaries and material interfaces, and so here momentum is transferred much more vigorously. Since the metal ions in these regions are bonded more weakly than in a regular crystal lattice, once the electron wind has reached a certain strength, atoms become separated from the grain boundaries and are transported in the direction of the current. This direction is also influenced by the grain boundary itself, because atoms tend to move along grain boundaries.

Diffusion processes caused by electromigration can be divided into grain boundary diffusion, bulk diffusion and surface diffusion. In general, grain boundary diffusion is the major electromigration process in aluminum wires, whereas surface diffusion is dominant in copper interconnects.
Bobnova's Avatar
rofl, I award you with the Pedantic Award for Gratuitous Pedantry.
Way to find a context, no matter how spectacularly unreasonable, in which you can "prove" me wrong, I salute you.
I also award the award for "best intentional misinterpretation in an attempt to regain lost respect" for, oh, this year.
Holy hell, man, I realize you have a deep-seated need to prove me wrong now, but that's just silly.

Fortunately (for me, anyway), I excel at pedantry.
Here's a study of electromigration in superconductors: http://www.physics.ncsu.edu/optics/htsc/EM_YBCO_APL.pdf
They produce no heat, but electromigration still happens. Maybe it is just voltage.

Beyond that, I can easily argue that in a processor core producing heat is key to the processor functioning at all.
Keep in mind that CPUs and GPUs are silicon semi-conductors. Turn that "semi" into "super" and you have a dead short and a very short GPU lifespan indeed.
Producing heat means that it's still a semi-conductor, which means that it'll last a while.
Presto! Using the powers of Extended Pedantry, I have proven that producing heat lengthens the lifespan of a GPU!
trueblack's Avatar
Sure. And let me award you the "most unwilling to admit he made a false statement and keeps trying to get around it to look less wrong" award, then.

You could have simply said, "Oh yes, I should have worded it better; the way I worded it can be misleading," but you just had to go down the road of insulting others to victory.
Amazing human being you are. Congrats.

But your link is good reading; I am enjoying that piece. For that, I thank you.
Just BTW, I speak five languages, English being my fourth, so if my English is not good enough, I admit to that too.
If you spoke as many tongues as I do, I would like to see how well you'd do. Cheers.
wingman99's Avatar
From the Quote http://en.wikipedia.org/wiki/Electromigration
Electromigration is exacerbated by high current densities and the Joule heating of the conductor (see electrical resistance), and can lead to eventual failure of electrical components. Localized increase of current density is known as current crowding.

Joule heating, also known as ohmic heating and resistive heating, is the process by which the passage of an electric current through a conductor releases heat. The amount of heat released is proportional to the square of the current, such that Q ∝ I²·R·t.
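Just to make the "square of the current" part concrete, with invented, purely illustrative numbers:

```python
# Joule heating: Q = I^2 * R * t. Because the current term is squared,
# doubling the current quadruples the heat for the same resistance and time.

def joule_heat_j(current_a, resistance_ohm, seconds):
    """Energy released as heat, in joules."""
    return current_a ** 2 * resistance_ohm * seconds

base    = joule_heat_j(10, 0.05, 1.0)   # 10 A through 0.05 ohm for 1 s -> 5 J
doubled = joule_heat_j(20, 0.05, 1.0)   # 2x the current -> 4x the heat -> 20 J
print(base, doubled)
```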
Bobnova's Avatar
"exacerbated", not "caused".
High heat conditions and high voltage conditions make it worse, is what that says.
Culbrelai's Avatar
Damn, this is getting technical. But insulting someone's grammar, etc., merely shows your lack of respect for that person, and likely for everyone else with opposing views around you.

We can argue 'til the end of time, but throwing your dollars and support behind nVidia is the strongest move you can make. Sig your rig proudly; be part of the green team. The whole AMD vs. Intel/nVidia thing is a lot like politics: the losing team won't change until something affects them directly, in their camp, and that is when they will change their tune.

Just wait a year; a couple of dead 7970s will teach 'em a thing er two =P

And if they continue after that, then it can be said that they are indeed fanboys.
wingman99's Avatar
That was my point; I did not know if you were kidding when you said "Maybe it is just voltage." Because electromigration exacerbated by heat is also a fact.

I agree, after all this reading, that the cause of electromigration is electrons colliding with the atoms from an electron wind or ion wind.
GTXJackBauer's Avatar
Great review, Hokie!!!

I am proud to say I will, in the VERY near future, be a proud owner of a GTX 780 Classified Hydro Copper or Hydro Copper. I'm just waiting for the release, and for the questions I've asked EVGA to be answered in regards to the EVport/EVBot. I have seen rendered images, and it seems the dual-fan version of the Classified has an EVBot port while the Classified Hydro Copper and Hydro Copper do not. They aren't giving us definite answers as of yet; we've been told to "wait and see". I worry that if I went with a Classified Hydro Copper with the EVBot port while not owning an actual EVBot, there'd be no point in purchasing it without using that feature and just OCing it via Precision, assuming I would get the same OCs as an EVBot user, or whether those cards are made specifically for EVBot users only.
wingman99's Avatar
I would get a GTX 780, but the cost is too high for me, and I'm an NVIDIA fan.

I don't think people need to worry about AMD cards; if you get one with a 3-year warranty you will be safe. They make them to last.
Bobnova's Avatar
Holy hell, a 780 with a functional EVBot port would be glorious!
I'd be rather surprised if it went over 1.21v, but if it does... zomg!
Culbrelai's Avatar
Me too, but I'm getting another identical model of my 670 instead, because two 670s > Titan anyway. Hopefully the prices will go down soon...
Pierre3400's Avatar
What on earth is that kind of BS?

How can you keep turning a blind eye to the information in front of you, and keep sticking to your fanboy points of view?

AMD vs. Intel/nVidia? Wth? I understand YOU ARE an Intel/nVidia fanboy to the core.

Personally, I prefer Intel CPUs over AMD. Back in the day I had both Intel and AMD, but now I'm all Intel on the CPU side, and I don't even bother learning about current AMD CPUs, in the same way that, I think, you don't even bother learning about AMD/ATI GPUs. You keep throwing around all these, at the end of the day, pointless facts about cards.

You cannot sit and compare heat on a card when you have nothing to compare it to.

I have 2x 7970 6GB cards; I currently Bitcoin mine with them, which means they are being stressed quite a lot. My main GPU sticks around 76 °C under full load, and my second one sticks at 65 °C. Under full load, that's great; but if you stuck my cards into a rig that hasn't got the same airflow, then the temps would be different. Until you test cards under identical environments, you have no personal experience to base anything on.

Like I said earlier in the thread, I have owned a GTX 590 and it had to be RMA'ed, so stop claiming nVidia cards don't break. It's been said very clearly in earlier posts that AMD mass-produce and nVidia don't. If one out of every 500 AMD cards fails, and one out of every 100 nVidia cards does, then it's nVidia with the short straw in your argument; but you don't know the production numbers, you don't know the RMA numbers, and not every single person on earth posts online when they have a hardware issue. In fact, most people just drag the PC to a store to get it fixed.

You are basing all your facts on the innermost belief that nVidia is god; you look at an nVidia GPU and you see a damn holy light around it.

It's OKAY to be a fanboy, but sometimes you are going to have to agree to disagree, which BTW is something you seem unable to do.

We have always known AMD run hotter; this is nothing new. But what if AMD are aware of it and design their products to run with that amount of heat?
bluezero5's Avatar
that has SOOOOO much win in it.

but I think more dream list item than reality.
Culbrelai's Avatar
Yes, I know it is okay.

Some people around the forums, however, choose to hide it and act like they're not biased when they really are.

As for the second thing, about the heat: yeah, they know. It is a smart business plan, I admit, but it doesn't put the consumer first. More heat = faster death; less heat = slower death.
Bobnova's Avatar
If the "faster" death is in 15 years, does it really matter?
EarthDog's Avatar
Agreed. This discussion, while fruitful in some respects, is really an exercise in theorycrafting, and one side is really coming to a conclusion that doesn't fit the facts at hand. So let's concede it does run warmer; like Bobnova said, to what end? Does it matter if the AMD card dies in 10 years versus 15?
bluezero5's Avatar
I concur; there's no way to know.
Based on historical records, AMD cards have been rock solid in builds.
Yes, there are some complainers, but that's true for both camps.

Only time will tell. Theorycrafting is OK, and fun to discuss, but until proven it's really more hypothesis than anything. Time will do justice; we need not judge now.
Knufire's Avatar
This is wrong.

Higher temperature = faster death.
Higher voltage = faster death.
More heat production (if temperature and voltage and all other variables are the same) does NOT equal faster death.

I don't get why there were like 10 posts just on trying to understand the above.

Also, anyone trying to argue that one company inherently has better products than another (regardless of the companies and what they make; it doesn't even have to be about computers) is wrong. Also, I'd argue that fanboyism is inherently a bad thing, but that's just my opinion.

Also, random theorycrafting typically tends towards biased, terrible arguments more than anything productive. Theorycrafting in itself isn't bad, but the results of trying it usually are.
Culbrelai's Avatar
And AMD cards use more voltage than nVidia?
Knufire's Avatar
Honestly, I'm not sure; I don't personally own, or have overclocked, anything remotely modern.

However, the statements I made above were all meant to be ceteris paribus, which means all other things considered equal. Since AMD and NVIDIA (obviously) have different core designs, among other things, they're not directly comparable in some aspects. Apples and oranges, if you will.

In other words, if you take two completely identical cards from either company, and run one at a higher voltage than the other, it'll probably die sooner (by some amount that might be significant or not, who knows). That's all my post was meant to say. You can't use voltage or power consumption or any measure like that to claim one GPU will last longer than a GPU of a completely different design, it's just illogical. If they were the same GPU, then yes, it makes sense.
Supertrucker's Avatar
They give you the option to. But higher voltage is relative to stock voltage within a design. I have a 7-ish-year-old Athlon XP that runs 1.75V stock on a small air cooler. Try doing that to an Ivy Bridge and see if it lasts 7 years, or even 7 months.
Soulcatcher668's Avatar
Give up guys.... you're playing pigeon chess.
bluezero5's Avatar
I find the comparison there about which card dies first inconclusive at best.

Yes, higher voltage is not as good.
but look at the following example I will use to help illustrate why this is pointless.

In the US, they use 110v lightbulbs
in Japan, we also have 200v lightbulbs to use.
the estimated life of these light bulbs are usally the same hours.
why's that? cause the 200v lightbulbs comes with 'thicker' tungsten filaments, build for that voltage.

(now there's a trick we use, is we buy 200V lightbulbs in 110V countries and those lightbulbs almost NEVER burn out. )

So think of AMD as the 200V bulb and NVDA as the 110V

The voltage usage there is not meaningful,
because AMD would have the higher voltage use in mind
and make those chips with 'thicker' silicon transistors, which can 'potentially' last longer.

*** I am trying to explain the theory using something not exactly comparable to the semiconductor scale of things; there are differences,
*** but I hope this example shows why I say the voltage use is not really meaningful.

It will ONLY be meaningful IF you find the following info:
- AMD GPUs and NVDA GPUs (current gen): what electrical tolerance are they built for? Is it in fact the same?

Only if so should we be comparing voltage highs/lows and their potential impact on the lifespan of the chip.
DarrenP's Avatar
Any word on the GTX 790?
Knufire's Avatar
Dual GK110?

I dunno if that'll happen; I feel like it'll just cannibalize their own sales of the Titan (unless it is significantly more powerful and more expensive than the Titan).
DarrenP's Avatar
I would assume it to be a little more expensive, and I would pay that. It may cannibalize their Titan, but consumers will want it. If they don't come out with one by the time I've amassed my money, I guess dual GTX 780s couldn't be that bad?
bluezero5's Avatar
my god, the thought of that...

dual-titan... that power... but at what cost man, at what cost... lol.
DarrenP's Avatar
Well, I'm either getting a 7990, or a 790 if the latter ever comes out... How long after the release of the 680 did the 690 come out? I wasn't in the loop of computers back then.
Janus67's Avatar
March 22nd, 2012 (680); May 3rd, 2012 (690)

IIRC the 580 and 590 were further apart.

-- edit: they were - November 9th, 2010 and March 24th, 2011.
DarrenP's Avatar
Okay, so by then I'll know which I'm getting. Thanks again, Janus.
Knufire's Avatar
If I had to guess... it'd be in the $1400-1500 range: dual cut-down GK110s (basically dual 780s), 8GB of vRAM, and probably a triple-slot card to manage the thermals.

Complete speculation though.
DarrenP's Avatar
I don't care if it's a 5-slot card (can you imagine xD), I just want a 790 more than a 7990 xD
DarrenP's Avatar
Oh, and how do I know if the OC wasn't stable?
Janus67's Avatar
Artifacts, crashing, driver crashes, blue screens, etc.
Bobnova's Avatar
I pretty seriously doubt we'll see a 790, personally.
bluezero5's Avatar

I wager we will!!! But much later (like Q4 2013 at the soonest).
It'd be odd if NVDA left that slot in the series vacant.

Any physical reason you doubt so, Bob? You might know something we don't.
GTXJackBauer's Avatar
I don't think they will make a 790 either. The Titan took the price range for a future 790, unless they slash the prices. That's prolly after I spend $800+ on a Hydro Copper. I can only imagine.

And maybe because the next series is going to be so immaculate that it will make up for it. MAYBE lol
Janus67's Avatar
I agree with you on that; the only way they would release one would be at probably $1500 or something crazy like that.
trueblack's Avatar
NVIDIA has been pushing the price ceiling higher and higher.

I doubt that will be a limiting factor.

The only question is when, I believe.
Boulard83's Avatar
Just ordered a Zotac GTX 780 !
Bobnova's Avatar
I take it back, actually; I think they will make a GTX 790. It'll have two GK104 cores (a la the GTX 770); think factory over-volted/overclocked GTX 690.

I don't see it happening for the GK110, because to get the same markup they get on the Titan/780 they would have to sell it for something like $1600, and they would have to severely underclock it to get it under the 300W cap to call it a PCIe GPU.
trueblack's Avatar
... possible,
given the 770 and 670 are basically the same GK104.

Only the 780 is an odd one, as GK110.

If this is the case, though, I'd be much less excited about it,
but I can totally see this as a possible outcome.
EarthDog's Avatar
The 770 and 680 are essentially the same GK104. The 670 has fewer CUDA cores than the 770.

Wasn't it AMD that has a $1500 GPU, and wasn't it AMD whose 7970 originally came out at $550 (just speaking in terms of the current gen)?
Bobnova's Avatar
Not really; it's ASUS that has a $1500 7990. AMD's are much more reasonably priced.
EarthDog's Avatar
Yeah, ASUS... on AMD GPUs, correct. Point still remains. $1500 AMD based GPU(s) from ASUS.
EarthDog's Avatar
Whatever, not going to continue to split hairs, guys. It's an ASUS; AMD does not OFFER it, but it's using AMD GPUs. Their closest is $900 or w/e. Is that better? Would that not make people come in and split hairs, correctly I should add, LOL!? Ugh... ZERO patience today, kids...
txus.palacios's Avatar
Come on, ED. It is not splitting hairs. Actually, they're quite the same thing.

That $1500 AMD GPU you criticize is a limited edition GPU made for e-peen enlargement purposes. The MARS II is a $1500 limited edition GPU made for e-peen enlargement purposes. Thus, ASUS asks for $1500 for an e-peen enlargement device, whether it is powered by AMD or NVIDIA.
EarthDog's Avatar
I get it... trust me. And you guys are correct. It's NOT an AMD card, except for under the hood.
trueblack's Avatar
I want an ASUS-made dual-Titan e-peen enlargement device.
txus.palacios's Avatar
Wait for it, and they shall deliver a MATRIX II with two 780 cores.
Bobnova's Avatar
Betcha it's a pair of overclocked GK104s.
bluezero5's Avatar
I think Bobnova is likely to be right.

I think they're more likely to do a double 770 as the 790 than a double 780.
A double 770 would be like a GTX 690 on steroids, around $1000 I think.

There's no reason for NVDA to skip this variation before launching an 890 next year with double GK110s.