EVGA GTX 680 Graphics Card Review

EVGA has been selling NVIDIA GPUs for quite some time now and is regarded as one of the best manufacturers offering them. They earned that status through superb warranty and customer support, combined with word-of-mouth among enthusiasts. Today, we have the most recent addition to EVGA’s GPU line-up, the GTX 680.

Specifications & Features

The only thing out of the ordinary from the GPU-Z screenshot below is PCI-E 2.0. The GTX 680 is actually PCI-E 3.0, but my EVGA P67 FTW doesn’t have PCI-E 3.0 slots.

Specifications

Features

  • NVIDIA SMX Engine: Next generation streaming multiprocessor built from the ground up for incredible performance and power efficiency.
  • NVIDIA GPU Boost: Dynamically maximizes clock speeds based on the workload of the game to push performance to new levels and bring out the best in every game.
  • NVIDIA Adaptive Vertical Sync: Dynamically enables vertical sync based on your current frame rates for the smoothest gaming experience.
  • Supports Four Concurrent Displays: Two dual-link DVI connectors, HDMI, and DisplayPort 1.2.
  • PCI Express 3.0 Support: Designed for the new PCI Express 3.0 bus architecture offering the highest data transfer speeds for the most bandwidth-hungry games and 3D applications, while maintaining backwards compatibility with existing PCI Express motherboards for the broadest support.
  • NVIDIA FXAA Technology: Shader-based anti-aliasing technology available from the NVIDIA Control Panel that enables ultra-fast anti-aliasing in hundreds of PC games.
  • NVIDIA TXAA Technology: Support for a new temporal anti-aliasing technique that delivers the ultimate combination of image quality and performance.

Packaging & Accessories

EVGA’s box isn’t flashy and doesn’t have some random cartoon drawn all over it. They stick to the contents and information about the product, which I personally like. On the front, the contents and features are listed around the edges with EVGA’s logo in the center. Then, on the back, they go into more detail about the features of the GTX 680.

Box Front

Box Back

The GPU is wrapped in an anti-static bag and embedded in foam, so it’s well protected from possible shipping damage.

Foam Packaging

Cut-Out Removed

The accessories are pretty standard for graphics cards, except for the poster and “Enthusiast Built” stickers. There is a User Guide, Quick Start Guide, driver CD, case badge, DVI-to-VGA adapter, and a couple Molex-to-6pin power adapters.

Accessories

The EVGA GTX 680

Here are a few glamor shots of the EVGA GTX 680, which I dressed up with the optional backplate. The card looks typical of most reference design cards from both NVIDIA and AMD, with the blower-style fan and rear exhaust. If you take a close look, you can see an indentation around the fan which should help air get to the GPU in SLI setups.

EVGA GTX 680

The video out connectors on the GTX 680 include a dual-link DVI-D (digital only), dual-link DVI-I (digital and analog), full-size DisplayPort, and full-size HDMI. With all of these options, the GPU should be able to connect to almost any display without using adapters. As you probably noticed, the power connectors have a unique layout; they are stacked on top of one another. With both stacked DVI and stacked PCIe power connectors, the reference boards won’t be able to become single-slot cards with waterblocks without some heavy modding, like removing the DVI-D connector and moving one of the 6-pin connectors to another location on the card.

Video Out Connectors

Power Connectors

Below is an overall shot of the back of the PCB. The original is a much higher resolution of 4000×3000 in case anyone is interested in the details of a specific section. As for the length of the reference GTX 680, it’s 10 inches or 254 mm.

Back of PCB

The rubber cover for the SLI connectors not only protects the connectors from accidental damage, but it looks good as well. Definitely a nice addition in my opinion.

SLI Connector Cover

SLI Connectors

A couple close-ups of the back of the PCB. The first shot shows six empty spots for capacitors which were originally planned for the GTX 680, but they didn’t make it into the final design. The next shot is the GPU PWM controller module.

Empty Capacitor Banks

GPU PWM Controller Module

The GTX 680 uses T6 Torx screws to attach the VRM/vRAM heatsink and fan to the PCB. I’m a big fan of Torx screws because they are much harder to strip than Phillips head screws. However, Torx screws are less common, and not everyone has Torx screwdrivers, especially in the very small T6 size.

T6 Torx Screws

Now it’s time to strip this thing down and see what’s under the hood. The shroud comes off easily enough by removing some Phillips head screws. The actual heatsink is attached with four spring-loaded Phillips screws on the back of the PCB. I was surprised to see a fairly good application of TIM from the factory, definitely a plus. The vRAM/VRM heatsink and fan are held in place by all those T6 Torx screws mentioned earlier. The thermal pads were obviously making good contact, based on the impressions left in them after removing the heatsink.

Shroud Removed

Core Heatsink Removed

RAM/VRM Heatsink Removed

Since we have a bare PCB now, we can take a closer look at a few of the important sections. The first picture shows the VRM section of the card; there’s room for five phases, but only four are active on the reference boards. Also, at the top right of the first picture, you can see a spot for another 6-pin PCIe power connector beside the existing power connectors. The GDDR5 vRAM chips are made by Hynix, model H5GQ2H24MFR-R0C, and are rated for 3 GHz at a rather low 1.5 V.

GPU VRM & Unused Spot for 6-pin PCIe Power

Hynix H5GQ2H24MFR-R0C GDDR5 Chips

Finally, to the heart of the GTX 680, the Kepler GK104 core.

GK104 Core

Test Setup & Methodology

Test Setup

CPU: Intel i7 2700K @ 3.4 GHz (mimicking an i7 2600K)
Motherboard: EVGA P67 FTW
RAM: 2×2 GB Corsair Dominator GT DDR3-1600 6-6-6-20
Graphics Cards: EVGA GTX 680, EVGA GTX 580 Classified
Hard Drive: 1 TB Samsung Spinpoint F3
Power Supply: SeaSonic SS-1000XP
Operating System: Windows 7 x64 SP1 (fresh install)
Graphics Drivers: NVIDIA 301.10 WHQL (GTX 680), NVIDIA 285.62 WHQL (GTX 580 Classified)
Equipment: Tenma Sound Level Meter, Fluke 52 II Dual Input Thermometer, Kill-a-Watt Meter

Test Settings

  • Synthetic Tests – Performance for 3DMark, Xtreme for Heaven, PhysX off when applicable
  • Hawx2 DX10 – 1920×1080, settings maxed, tessellation off
  • Hawx2 DX11 – 1920×1080, settings maxed, tessellation on
  • Alien vs Predator Default – 1920×1080, settings at default
  • Alien vs Predator High – 1920×1080, settings maxed
  • STALKER: Call of Pripyat – 1920×1080, 4x MSAA, maxed settings, tessellation on, Sunshafts test
  • Dirt2 & Dirt3 – 1920×1080, 8x MSAA, settings maxed
  • Metro 2033 – 1920×1080, settings maxed, PhysX off, DoF On, Frontline
  • Battlefield 3 – 1920×1080, settings “Ultra”, manual runs of first mission in single player

Cooling Performance

Cooling performance is measured by running 3DMark11 and recording temperatures shown in GPU-Z. Ambient temperature is measured with a Fluke 52 II thermometer by placing a K-type probe 1″ from the intake fan, and turned out to be 25 °C. The fan profiles used are Auto, 30% (min), 57% (middle), and 85% (max).

Sound Level (Noise)

Sound level is measured by placing a Tenma meter 10 cm away from the intake fan and recording the dBA reading with the fan set to 30%, 40%, 50%, 60%, 70%, and 85%. The GPU is on an open bench table with all possible external sources of sound turned off (doors closed, A/C off, ceiling fan off, TV off, and the CPU fan turned as low as possible).

System Power Consumption

Peak system power consumption is measured and recorded during each of the tests using a Kill-a-Watt meter.

Performance Results

Synthetic Tests

The synthetic tests, unlike games, are the ones that really show minute differences between GPUs because their results are so consistent between runs. The GTX 680 beats both its predecessor and the HD 7970 in 3DMark Vantage and 3DMark11; the GTX 680 and HD 7970 are pretty much neck and neck in 3DMark06 and Unigine Heaven, with the GTX 680 barely coming out on top; and in 3DMark03 both the GTX 580 and HD 7970 come out ahead of the GTX 680. Going strictly by the numbers, the GTX 680 wins 4 out of the 5 synthetic tests.

Game Tests

The GTX 680 is ahead in all but three of the game tests available to me. In Alien vs Predator High, STALKER, and Metro 2033, the GTX 680 loses to the HD 7970 by 6-10%, which may seem like a lot, but it’s only 3-4 FPS when looking at the raw scores. The remaining tests show a varying degree of relative performance between the GTX 680 and HD 7970, with the GTX 680 winning by anywhere from less than 2% all the way to 27%. What I take from this is that the GTX 680 is roughly equal to or better than the HD 7970 in the majority of situations.

Synthetic & Game Test Raw Scores

For those that are interested in the hard numbers recorded during testing…

NVIDIA Adaptive Vertical Sync

Adaptive Vertical Sync does exactly what it sounds like: it turns VSync on or off depending on the current frames per second. If the FPS is 60+, VSync is turned on; otherwise, VSync is turned off. This helps minimize the downsides of having VSync always on or always off. When VSync is on, microstutter can be introduced when the FPS needs to drop below 60, since the FPS will jump straight from 60 to 30 and back without a smooth transition. When VSync is off, screen tearing can happen when the FPS exceeds the monitor’s refresh rate. Adaptive VSync turns VSync on when the FPS is high to eliminate tearing, and turns it off when frames are below 60 to keep the FPS as high as possible without jumping between 60 and 30. So, in essence, Adaptive VSync narrows the gap between max and min FPS, resulting in a smoother gaming experience.
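
For readers who think in code, here is a minimal sketch (in Python) of the decision rule described above. The frame rates in the example are hypothetical, and the real driver logic is more involved; this just shows the per-frame toggle around the refresh rate.

    # Toy model of the Adaptive VSync rule described above
    # (hypothetical frame rates, not driver behavior).
    REFRESH_HZ = 60

    def adaptive_vsync_enabled(current_fps: float) -> bool:
        # Sync to the display only when the GPU can sustain the refresh rate;
        # otherwise leave VSync off so the FPS isn't quantized down to 30.
        return current_fps >= REFRESH_HZ

    for fps in (75.0, 62.0, 58.0, 41.0, 66.0):
        state = "on" if adaptive_vsync_enabled(fps) else "off"
        print(f"{fps:5.1f} FPS -> VSync {state}")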

From looking at the data below, it looks like Adaptive Vertical Sync is doing exactly what it was designed to do. It prevents tearing by limiting max FPS to ~60, while also allowing FPS to be as high as possible when below the 60 mark to eliminate microstutter.

Vertical Sync Comparison

Metro 2033 benchmark settings in this test were reduced to make sure I had a good amount of frames both above and below 60 FPS.

You may also notice that, with Adaptive VSync enabled, the FPS exceeds 60 by quite a bit in some areas. My theory is that these spikes happen after Adaptive VSync disables VSync and before it has time to re-enable it. First, notice how with VSync on (not Adaptive), there is a small amount of fluctuation around 60 FPS with no huge spikes. Adaptive VSync has the same small fluctuation around 60 FPS, but there are big spikes at times. Second, notice that the spikes in Adaptive mode only happen during the parts of the bench where FPS is high in the VSync-off results. Since these spikes do not happen with VSync enabled, they must happen while VSync is off (whether turned off completely or by Adaptive VSync). I think the small fluctuation may be to blame for the spikes in Adaptive mode, because a small dip downward is enough to trigger Adaptive mode to disable VSync. This means that if the bench is in a section where FPS would normally be high with no VSync, the triggering caused by the fluctuation results in the next few frames being really high, matching the VSync-off results, until Adaptive mode turns VSync back on. Looking through the raw data points, it looks like this might be the case: the high spikes always seem to happen for a few hundredths of a second after a measured FPS below 60, and then the FPS is back to around 60.

MSAA vs FXAA vs TXAA

MSAA is Multi-Sample Anti-Aliasing. This method takes multiple samples per pixel, mostly along polygon edges, and averages (“resolves”) them down to the final pixel color to reduce the appearance of aliasing (jaggies, pixelation). The extra GPU work required for those additional samples is the downside of this method, and it can cause a hefty performance hit in games.

FXAA, or Fast Approximate Anti-Aliasing, recognizes edges in-game by comparing contrast, then “smooths” the surrounding pixels by forming a gradient between the contrasting colors. FXAA does its thing quickly and without consuming many resources because it’s a post-process pixel shader. Something that can be considered both a pro and a con of FXAA is that it affects all pixels on the screen and smooths things that are not affected by MSAA; the con being that it can affect small text on the screen, causing it to be blurry.
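
To make the contrast-comparison idea concrete, here is a heavily simplified, hypothetical sketch in Python that operates on a grayscale image. Real FXAA runs as a pixel shader on luma with directional edge searches, so treat this purely as an illustration of “find contrast, then blend across it.”

    import numpy as np

    def toy_fxaa(gray: np.ndarray, threshold: float = 0.1) -> np.ndarray:
        # gray: 2D array of luminance values in [0, 1]. Not real FXAA,
        # just the contrast-detection + blending idea described above.
        out = gray.copy()
        h, w = gray.shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                neighbors = gray[y - 1:y + 2, x - 1:x + 2]
                # Local contrast: brightest minus darkest neighbor.
                if neighbors.max() - neighbors.min() > threshold:
                    # On a detected edge, blend the pixel toward the local
                    # average, forming a gradient across the contrast.
                    out[y, x] = 0.5 * gray[y, x] + 0.5 * neighbors.mean()
        return out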

Temporal Approximate Anti-Aliasing, or TXAA, combines a lower level of MSAA with additional filtering to deliver image quality comparable to higher MSAA levels without as much of a performance hit. TXAA is meant to be integrated into game engines, so there’s no On/Off switch in the NVIDIA Control Panel like there is for FXAA (yet). Since TXAA is so new and needs engine support, I don’t have a game on hand to show comparisons between MSAA, FXAA, and TXAA.

The following comparison shows the visual differences between anti-aliasing settings. The Dirt3 “Options” menu was chosen because of its easily distinguishable edges. Based on this comparison, FXAA looks to fall somewhere between 2x MSAA and 4x MSAA. In a couple of areas, FXAA even looks a lot like No AA, such as the N’s edge in CONTROLS and the inside edge of the U in AUDIO. I believe this is because there are two noticeable edges very close to one another in those areas, so FXAA smooths both of them and it turns out badly (this is just a guess). In all the other areas, FXAA looks better than 2x MSAA and close to 4x MSAA.

Dirt3 Options Screen

Now to check the performance hit of each anti-aliasing setting. In the following results, FXAA falls between No AA and 2x MSAA. However, there’s barely a difference between 2x MSAA, 4x MSAA, and FXAA. So, with 4x MSAA looking better than FXAA and only performing worse by ~4 FPS, I’d use the 4x MSAA setting over FXAA in Dirt3. I also take away from this graph that the GTX 680 handles up to 4x MSAA really well at 1080p, and doesn’t start to show a significant performance hit until reaching 8x MSAA.

Overclocking

NVIDIA GPU Boost

GPU Boost is the GPU-side parallel to Turbo Boost on the CPU side. The goal is to get the most performance possible while staying around the card’s Power Target. GPU Boost does this by increasing the core frequency when the GPU isn’t being fully utilized, which brings it closer to the Power Target. GPU Boost decides where to set frequencies based on the power consumption and temperature of the GPU, so overclocking will be limited by how much power the card can pull and how cool the GPU can be kept. The same limitations as usual, but the clocks are dynamic and determined by offsets.

The “Boost Clock” is the average clock frequency the GPU will run under load in many typical non-TDP apps that require less GPU power consumption. On average, the typical Boost Clock provided by GPU Boost in GeForce GTX 680 is 1058MHz, an improvement of just over 5%. The Boost Clock is a typical clock level achieved running a typical game in a typical environment.

– NVIDIA

It seems the Boost Clock will vary between GPUs, since the 1058 MHz listed is the average Boost Clock among average GTX 680 samples. So, depending on how good or bad the core of your specific GPU is, its average Boost Clock could be higher or lower than the listed 1058 MHz. This card actually boosts up to 1084.4 MHz during 3DMark11, so 1058 MHz definitely isn’t a static maximum that all GTX 680 GPUs hit while using GPU Boost. Some of the steps of this specific card are below; it looks like the frequency steps are in 13 MHz increments and the voltage steps are in 0.0125 V increments.

Core MHz | vRAM MHz | VDDC
324      | 405      | 0.987
549.6    | 1502.3   | 0.987
1006.0   | 1502.3   | 1.112
1032.8   | 1502.3   | 1.137
1045.2   | 1502.3   | 1.150
1058.2   | 1502.3   | 1.150
1071.3   | 1502.3   | 1.162
1084.4   | 1502.3   | 1.175
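
As a quick way to eyeball that granularity, the short Python sketch below simply re-reads the loaded bins from the table above and prints the step between consecutive entries. Nothing here comes from GPU-Z or the driver; the ~13 MHz and 0.0125 V step sizes are inferred from the recorded data.

    # Frequency/voltage pairs from the table above (loaded bins only).
    bins = [
        (1006.0, 1.112),
        (1032.8, 1.137),
        (1045.2, 1.150),
        (1058.2, 1.150),
        (1071.3, 1.162),
        (1084.4, 1.175),
    ]

    # Print the delta between consecutive bins to show the ~13 MHz frequency
    # steps and 0.0125 V voltage steps mentioned in the text.
    for (f0, v0), (f1, v1) in zip(bins, bins[1:]):
        print(f"{f1:7.1f} MHz: +{f1 - f0:4.1f} MHz, +{v1 - v0:.4f} V")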

Software

With the release of the GTX 680, EVGA also released new versions of their Precision and OC Scanner software: Precision X and OC Scanner X. Precision X is where all the GPU settings can be changed (power target, clock offsets, voltages, fan speed, etc.), and it also monitors the GPU. OC Scanner X is a FurMark-like GPU stressing program that helps determine stable overclocks by scanning for artifacts during testing, and it also gives you a quick idea of loaded temperatures.

EVGA Precision X

EVGA OC Scanner X

Stock Air

The first thing to do when overclocking the GTX 680 is to increase the Power Target to its max of 132%. Once that is set, the GTX 680 is allowed to pull more power, which allows GPU Boost to reach higher frequencies and voltages. As far as core voltage is concerned, it’s basically set at 1.175 V. Yes, it actually varies with GPU Boost frequencies, but when GPU Boost is maxed out at stock, or the card is being overclocked, the voltage remains at 1.175 V. I even tried manually turning it down to see if the GPU could run at higher frequencies with less than 1.175 V, but GPU Boost automatically increases it to 1.175 V regardless of what is manually set in Precision X. So, there’s no need to worry about setting voltage. I also like setting my fan manually to a speed that doesn’t annoy me while I’m playing games, which turned out to be 65% when considering the music and sound effects of the game and any other ambient sounds like a ceiling fan or TV.

When I’m overclocking, I tend to start with big frequency jumps so I reach instability faster, then back down on the clocks. I also start with core frequency since it has more of an effect on scores and FPS than vRAM frequency. I tested each overclock increment by running 10 minutes of EVGA’s OC Scanner X, so it took me around an hour and a half to find my highest stable OC. The core was able to reach 1215 MHz, or 20.7% over the stock 1006 MHz, and the vRAM got to 3305 MHz, a 10.2% increase in frequency. Being able to OC by 20% and 10% on the core and vRAM without voltage control is a pretty good accomplishment in my book; all the GTX 680 needs is higher voltage options. With the fan at 65%, the core only reached 61-62 °C during the OC Scanner X test, which really heats up GPUs, so I believe there’s plenty of thermal headroom for higher clocks if we could only get past the voltage limit.

GTX 680 Overclocking
Core | vRAM | Power | Stress
+100 | +0   | 132%  | Pass
+200 | +0   | 132%  | Fail
+150 | +0   | 132%  | Fail
+125 | +0   | 132%  | Pass
+137 | +0   | 132%  | Fail
+131 | +0   | 132%  | Pass
+131 | +100 | 132%  | Pass
+131 | +200 | 132%  | Pass
+131 | +300 | 132%  | Pass
+131 | +400 | 132%  | Fail
+131 | +350 | 132%  | Fail
+131 | +325 | 132%  | Fail
+131 | +313 | 132%  | Fail
+131 | +307 | 132%  | Fail
EVGA GTX 680 @ 1215/3305 MHz
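
The search pattern in that table is essentially a coarse-to-fine bracket around the highest stable offset. Below is a small Python sketch of that approach; is_stable() is a hypothetical stand-in for an actual stress run (e.g., 10 minutes of OC Scanner X), and the step sizes are arbitrary, so it only roughly mirrors the sequence in the table.

    def find_max_offset(is_stable, start_step=100, min_step=6, limit=500):
        """Coarse-to-fine search for the highest stable clock offset (MHz).

        is_stable(offset) is a placeholder for a real stress test and must
        return True (pass) or False (artifacts/crash).
        """
        best, step, offset = 0, start_step, start_step
        while step >= min_step and offset <= limit:
            if is_stable(offset):
                best = offset
                offset += step      # keep pushing up at the current step size
            else:
                step //= 2          # back off and refine with smaller steps
                offset = best + step
        return best

    # Example with a hypothetical stability boundary at +131 MHz:
    print(find_max_offset(lambda off: off <= 131))   # prints 131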

Cooling Performance & Noise

Stock Cooling

With the addition of GPU Boost, cooling not only affects GPU temperatures, it could also directly affect GPU performance. Theoretically, the cooler the GPU can be kept, the higher it can boost its frequency, which could cause a cooler GPU to score better in tests. Based on that, I would think NVIDIA would use really good stock reference coolers on the GTX 680. The results show the stock cooler performs like reference coolers on other GPUs. Definitely not top-end cooling performance by any means, but good enough to keep the GPU well under 100 °C or so, where the throttling temperature is usually set. Even with the fan set to its lowest speed of 30%, the GPU only just got into the low 90 °C range. If you like temperatures normalized to ambient, just subtract 25 °C from the results below.

Sound Level

The dBA numbers graphed below come from the raw measurement at 10 cm and two estimated dBA levels calculated for different distances (1 m and 2 ft). Why estimate instead of measuring at farther distances? Because the meter I’m using gets more accurate as the dB increases, so I wanted to measure really close to the source to get the most accurate readings. The following equation was used to estimate the sound level at different distances.

L2 = L1 – 20 * log10(r2/r1)

  • L1 = Sound level at reference distance
  • L2 = Sound level at desired distance
  • r1 = Reference distance
  • r2 = Desired distance

Sound level at 1 m is easy to calculate when measured at 10 cm since log10(1/0.1) = log10(10) = 1, so all that needs to be done is subtract 20 dBA from the measured numbers. That’s why I chose the 10 cm measuring distance 🙂
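
As a worked example, here is a small Python sketch applying that equation to a hypothetical 55 dBA reading taken at 10 cm (not one of the review’s measurements). The 1 m estimate ends up exactly 20 dBA lower, and the 2 ft (~0.61 m) estimate about 15.7 dBA lower.

    import math

    def sound_level_at(l1_dba: float, r1_m: float, r2_m: float) -> float:
        # Inverse-distance estimate: L2 = L1 - 20 * log10(r2 / r1)
        return l1_dba - 20 * math.log10(r2_m / r1_m)

    reading = 55.0  # hypothetical dBA measured at 10 cm
    print(sound_level_at(reading, 0.10, 1.0))      # 35.0 dBA at 1 m
    print(sound_level_at(reading, 0.10, 0.6096))   # ~39.3 dBA at 2 ft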

Subjectively, when sitting at my desk with the PC on an open bench on the floor beside my desk, the noise was tolerable for me when the fan speed was set around 60% or lower. So, ~35 dBA or lower is my preference when sitting at my desk.

System Power Consumption

The peak power consumption numbers below compare my EVGA GTX 580 Classified and EVGA GTX 680. There’s quite a difference here, with the GTX 680 using anywhere between 90-100 W less than the GTX 580 Classified and ~94 W less on average. However, as seen in the performance results, the additional power consumption does not translate into the GTX 580 Classified performing better; the GTX 680 is just much more efficient than the previous generation of NVIDIA cards.

Performance per Dollar

In the chart below, I graphed the performance of the cards divided by their current prices to show performance per dollar. I also left in the price of the HD 7970 at the GTX 680’s release for comparison. Before the recent price drop on the HD 7970, the GTX 680 easily took the performance per dollar crown among high-end GPUs. However, once the HD 7970’s price dropped to $480, the two powerhouses were neck-and-neck in performance per dollar, less than a percentage point apart on average.
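
For clarity, the metric is simply each card’s performance score divided by its price. The sketch below shows the calculation; the scores and the GTX 680 price are made-up placeholders, with only the $480 HD 7970 price taken from the text above.

    def perf_per_dollar(score: float, price_usd: float) -> float:
        # Higher is better: performance points delivered per dollar spent.
        return score / price_usd

    # Placeholder scores and GTX 680 price; only $480 comes from the text.
    cards = {
        "GTX 680": (100.0, 500.0),
        "HD 7970": (97.0, 480.0),
    }
    for name, (score, price) in cards.items():
        print(f"{name}: {perf_per_dollar(score, price):.3f} points per dollar")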

Conclusion

Performance-wise, the GTX 680 wins out over its predecessor, the GTX 580, and the HD 7970 in the majority of tests. However, it doesn’t consistently “trump” the HD 7970. The performance difference between the two flagships varies widely (between 1-27%) depending on the test. Overall, the GTX 680 is on par with or better than the HD 7970 in performance. When first released, the GTX 680 also had ~23% better performance per dollar than the HD 7970, but a price drop by AMD has since brought the HD 7970 on par with the GTX 680 in that category.

The reference cooler does an adequate job at keeping the core temperature in check at 75 °C using the Auto fan profile or mid-to-upper 50s if manually setting the fan to 50%. Even when the fan is set to minimum, the temperature doesn’t reach the thermal throttling threshold. The noise of the reference cooler is barely noticeable up to ~50% fan speed, but jet engine loud when cranked to the max of 85%. So be prepared to wear headphones if you want to set the fan speed toward the top end of the spectrum.

The GTX 680 greatly improves system power consumption over the previous generation by pulling 25% less power than the GTX 580 Classified at the wall.

It also has quite a few new features that really add to its value. It finally brings support for up to four monitors on a single card to the NVIDIA side, in the form of a three-monitor Surround setup plus a fourth auxiliary display for web, music, chat, etc. Adaptive Vertical Sync is a great way to help prevent both screen tearing and microstutter by allowing the GPU to dynamically enable and disable vertical sync. FXAA and TXAA are a couple of new anti-aliasing techniques that smooth edges without a huge FPS hit in games. Last, but not least, the GPU Boost feature can increase performance in less GPU-intensive tasks by raising core clock frequencies as long as the card stays within its overall power target.

Overclocking the GTX 680 is different from what we’re used to, with clocks and voltage being dynamic and having to make use of a Power Target and frequency offsets. However, after a little messing around with this new way of OCing, it becomes natural fairly quickly. The card doesn’t have voltage control, so we’re stuck with 1.175 V on the core and whatever vRAM voltage is set. That didn’t stop the card from running at 1215 MHz on the core and 3305 MHz (GDDR5-6610) on the vRAM; a great overclock considering there were no voltage increases.

Overall, I can’t think of a single negative about the GTX 680, really. It’s a physically small card with tons of features that packs a punch, performance-wise, while using less power and generating less heat than the competition. Overclockers Approved without a doubt!

Click the Approved stamp for an explanation of what it means.

– Matt T. Green (MattNo5ss)


bmwbaxter

Very nice review!

The first thing I thought of when I saw the picture of the spot for an additional VRM was "looks like a job for TJ" :p


EarthDog

Thats one of Tin's mods IIRC is adding those.

I wonder if by itself it would help. I think the evga card with the waterblock adds that 5th vrm too?

PS - Great review Mattno! :)


bmwbaxter

Thats one of Tin's mods IIRC is adding those.

I wonder if by itself it would help. I think the evga card with the waterblock adds that 5th vrm too?

PS - Great review Mattno! :)

I thought Tin's mods just completely bypassed those all together?

As for the EVGA hydro copper cards having a fifth VRM I would gladly accept one being sent my way to see :p


Janus67

/awaits the msi lightning version that hopefully allows voltage modification w/o TiNing the card.


EarthDog

I thought Tin's mods just completely bypassed those all together?

As for the EVGA hydro copper cards having a fifth VRM I would gladly accept one being sent my way to see :p

Nope. There is a picture of them added on the card in the article Hokie put up I believe.


dtrunk

very nice. wish i could simply mail you my copper for testing, but it is sort of.... installed. :-/
