NVIDIA Introduces The GTX TITAN

In Greek mythology, the Titans were a race of “…immortal, huge beings of incredible strength and stamina…” It seems graphics card manufacturers are all about naming their high-end offerings after gods nowadays. Titan is also the name of one of the fastest supercomputers in the world, at Oak Ridge National Laboratory.

TITAN

The NVIDIA GeForce GTX TITAN’s name certainly comes from a powerful pedigree.

Specifications & Features

NVIDIA starts off their press deck with a strong claim – the world’s most powerful GPU. It’s certainly built to qualify, with 2,688 CUDA cores, a massive 7.1 billion transistors and 4.5 teraflops of computing power.

World’s Most Powerful GPU

Obviously a GPU isn’t nearly as versatile as an x86 CPU, but this comparison is eye-popping nonetheless. If you’re running calculations that can execute on CUDA cores, it will outpace an i7-3960X by orders of magnitude.

Comparison vs. Intel’s Strongest CPU

This is what a lot of people have been waiting quite a while for. The rumor mill has been churning for some time; the rumor thread on our forum is over eight pages long now. Well folks, the wait is over. As widely rumored, the GTX TITAN’s GPU is the GK110 core. You’ve already seen the number of CUDA cores (2,688). What’s not listed is that it has 224 texture units and 48 ROPs. By comparison, the GTX 680’s GK104 GPU has 1,536 CUDA cores, 128 texture units and 32 ROPs. Yes, TITAN is a beast.
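For a quick sense of scale, here’s a minimal sketch comparing the GK110 and GK104 resource counts quoted above (all figures are from the text, nothing else is assumed):

```python
# Resource comparison between TITAN (GK110) and GTX 680 (GK104),
# using the counts quoted in the article.
titan = {"cuda_cores": 2688, "texture_units": 224, "rops": 48}
gtx680 = {"cuda_cores": 1536, "texture_units": 128, "rops": 32}

ratios = {k: titan[k] / gtx680[k] for k in titan}
print(ratios)  # 1.75x the CUDA cores and texture units, 1.5x the ROPs
```

In other words, TITAN carries 75% more shader and texture resources than the GTX 680 and 50% more ROPs.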

TITAN clocks in with an 837 MHz base clock and an 876 MHz boost clock. Fear not, it normally runs faster than that. Exactly how much faster will have to wait a couple more days, but I’ll say that 876 MHz is a tad conservative for stock boost clock.

At that speed with so many cores, as mentioned, you can get 4.5 teraflops of computing power. Note the number underneath that – 1.3 teraflops of double precision computing power. Yes folks, for the programmers among you who enjoy tinkering with CUDA coding in your spare time, there is now a powerful consumer GPU that doesn’t cost multiple thousands of dollars (i.e. Tesla) and allows you to compute with double precision. It’s important to note that double precision processing is an all-or-nothing setting. If you want to dedicate your TITAN GPU to double precision calculations, you’ll need to enable it manually and get to work. When you’re done, if you want to go back to gaming, you’ll need to turn it off manually, or it will hurt your performance.
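The 4.5 teraflops figure checks out with simple arithmetic; a back-of-the-envelope sketch (the FMA-per-clock rate and the one-DP-unit-per-three-SP-cores ratio are my assumptions about GK110, not from the press deck):

```python
# Back-of-the-envelope FLOPS check using the published figures.
# Assumption: each CUDA core retires one fused multiply-add (2 FLOPs) per clock.
cuda_cores = 2688
base_clock_ghz = 0.837  # 837 MHz base clock

sp_tflops = cuda_cores * 2 * base_clock_ghz / 1000  # single precision
print(round(sp_tflops, 1))  # ~4.5 TFLOPS, matching NVIDIA's number

# Assumption: GK110 carries roughly one DP unit per three SP cores, so 1/3
# rate is an upper bound; the published 1.3 TFLOPS DP figure is a bit lower,
# presumably because clocks drop when the DP mode described above is enabled.
dp_tflops_upper_bound = sp_tflops / 3
print(round(dp_tflops_upper_bound, 1))  # ~1.5 TFLOPS at full base clock
```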

There’s more, but I’ll let you actually look at the specs first…

GTX TITAN Specifications

…and we’re back. With such a power packed graphics processor, NVIDIA saw fit to give people plenty of frame buffer, a massive six gigabytes to play with. People complained that the GTX 680 didn’t have enough memory, coming with a mere 2 GB to AMD’s 3 GB on the HD 7970. NVIDIA listened and tripled the amount. Not only that, it’s operating on a 384-bit bus at 6,000 MHz speeds (that’s quad-pumped GDDR5, running at 1,500 MHz). This is the largest frame buffer on any NVIDIA GPU, ever.
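Those memory specs imply a healthy bandwidth figure, too; a minimal sketch of the arithmetic:

```python
# Memory bandwidth implied by the frame buffer specs above:
# a 384-bit bus at a 6,000 MHz effective rate (quad-pumped 1,500 MHz GDDR5).
bus_width_bits = 384
effective_rate_gt_s = 6.0  # billion transfers per second

bandwidth_gb_s = bus_width_bits / 8 * effective_rate_gt_s
print(bandwidth_gb_s)  # 288.0 GB/s
```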

One thing that lots of people will be surprised about is the power consumption. The card has two power connectors, a single 8-pin and a single 6-pin. It just plain doesn’t need any more, with a meager 250 W TDP. If it lives up to its billing as the fastest GPU in the world, that’s a coup considering how little power it draws. AMD’s HD 7970 GHz Edition is rated at a 250 W TDP as well; if TITAN outperforms it by a tremendous amount while keeping the same TDP, that would be impressive.
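The connector choice makes sense against the PCI Express power limits (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin); a quick budget check:

```python
# Power budget check against the PCI Express spec's per-source limits.
slot_w, six_pin_w, eight_pin_w = 75, 75, 150
tdp_w = 250  # TITAN's rated TDP

available_w = slot_w + six_pin_w + eight_pin_w
print(available_w)           # 300 W total available
print(available_w - tdp_w)   # 50 W of headroom over the 250 W TDP
```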

First Glimpse of TITAN

The slide above is a rendering, but it’s the first glimpse at TITAN. Below, you’ll see one of NVIDIA’s focuses for TITAN. They wanted a very strong GPU, but they also wanted one that can fit the needs of small form factor builders (and boutiques). I mean, really, who wouldn’t want the most powerful GPU in the world inside a teeny tiny mITX system?

Small Form Factor Focus

So you’ve heard the specs, let’s talk about using TITAN.

NVIDIA GPU Boost 2.0

NVIDIA has had some time to consider and refine GPU Boost since it debuted with the GTX 680. Introducing GPU Boost 2.0 (also referred to as GPUB 2.0 since it’s easier to type).

GPU Boost 2.0 – “Built for Enthusiasts”

GPU Boost 2.0 has everything GPU Boost 1.0 had but adds more voltage and higher clocks. Sounds good to an overclocker so far.

Tech Details

There were a couple of graphs leading up to this one in the press deck, but you can see how they’ve compared 1.0 to 2.0 relative to temperature in the grouped graph. Temperature is the biggest takeaway here.

Faster Clocks

With GPU Boost 1.0, much to the chagrin of overclockers everywhere, clocks were capped by available voltage, which itself was locked down. It was good for preventing silicon damage, but bad for overclocking. Some people (water cooling enthusiasts, for instance) could cool their cards to quite low temperatures, but have zero additional headroom.

GPU Boost 1.0 Voltage Caps

GPU Boost 2.0 pulls the rug out from under hard voltage control. Now the GPU controls its boost clock by temperature. In the simplest terms, the GPU will use the highest voltage available (now higher than the GTX 680) in conjunction with the highest stable clocks until the GPU reaches the default set temperature (80 °C). Only then will it reduce the boost clock.

GPU Boost 2.0 Voltage Stock

In addition to that, NVIDIA is also allowing some overvoltage of the TITAN GPU. It’s not a lot (you’ll get the actual number in a couple days), but it’s more than zero. You have to accept that you may impact the long term reliability of your GPU, but once you’re through that mandatory dialog, you get access to some overvoltage control. This control is optional for board partners to include. However, given that EVGA and ASUS are the two partners bringing TITAN to us in North America, I’d say it’s a safe bet they’ll give us that control.

GPU Boost 2.0 Overvoltage

With temperature in control instead of an arbitrarily selected ‘safe’ voltage, there is a not-insignificant increase in frequency capability. What damages silicon is voltage combined with temperature. Previously, NVIDIA only allowed voltage that was safe for long term reliability, period, no matter what temperature conditions existed. Now they’ve lifted that hindrance and let temperature control the boost.

Performance Increase With GPU Boost 2.0

Of course, the curve shifts a hair when a bit more voltage is introduced.

GPU Boost 2.0 Performance Increase With Overvoltage

Only under these conditions can you adjust the max GPU voltage.

Max GPU Voltage Optional

Overvoltage isn’t the only thing that you need to adjust with TITAN. Since temperature is the new metric by which your clocks are controlled, you need to use temperature targets to help decide where your GPU will end up.

Using Temperature Targets

Not only can you add a bit more voltage, you can also help your clocks along if you’re willing to tolerate higher temperatures. Overclockers have been doing this manually for years. They know the max temperature they can accept on their particular hardware and they adjust the cooling, voltage, and overclocks to reach that happy medium. It’s been this way for a long time, but this is the first time the temperature directly affects your ability to get higher clocks.

Pick Your Target Temperature

If you move your temperature target higher, it may move your clock curve even higher.

The Difference Temperature Can Make

There are a few items in advanced controls that we haven’t gone over. You know about raising max voltage and adjusting temp target. There is also a Prioritization control. You can tell your GPU to prioritize either frequency or temperature. If you tell it to prioritize frequency, it will act like you expect based on everything above, sticking to the base clock no matter what and clocking up with boost as headroom is available.

However, say you want your GPU to run at a cooler temperature, frequency be damned. You can set your temperature target to, say, 65 °C and then set the GPU to prioritize temperature. With GPU Boost 1.0, once the GPU banged up against its limits, it would throttle down, potentially to the base clock. With GPU Boost 2.0, if your GPU hits its limit (temperature now, which you’ve selected to prioritize), the GPU will throttle itself until it cools down to the temperature you’ve set, potentially even below the base frequency. So now you’ve got more control over your GPU, up and down.
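The two priority modes described above can be sketched as a toy model. To be clear, this is purely illustrative: the real boost algorithm is NVIDIA’s and far more involved, and the throttle step size and clock floor here are made up.

```python
# Toy model of GPU Boost 2.0's priority behavior. Clock figures come from
# the article (837 MHz base, 876 MHz boost); everything else is invented
# for illustration only.
def boost_clock(temp_c, temp_target_c=80, base_mhz=837, boost_mhz=876,
                prioritize="frequency"):
    if temp_c < temp_target_c:
        return boost_mhz  # headroom available: run at (or above) boost clock
    if prioritize == "frequency":
        return base_mhz   # hold the base clock regardless of temperature
    # Temperature priority: throttle in proportion to the overage,
    # potentially below the base clock (25 MHz/°C and the 400 MHz floor
    # are made-up values).
    overage = temp_c - temp_target_c
    return max(base_mhz - 25 * overage, 400)

print(boost_clock(70))   # cool GPU: boost clock (876)
print(boost_clock(85))   # hot GPU, frequency priority: base clock (837)
print(boost_clock(85, temp_target_c=65, prioritize="temperature"))  # below base
```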

Advanced Controls

The last item to talk about is display overclocking. This is a hit-or-miss operation; some monitors will do it and others won’t. But with TITAN, NVIDIA has introduced the ability to overclock your display a bit. Most monitors operate at a 60 Hz refresh rate (new 120 Hz monitors notwithstanding). Even if you’ve got a very powerful GPU pushing much higher framerates, the video output of the GPU is still only putting out 60 Hz (which is basically 60 frames per second).

Like any overclocking, display overclocking may or may not work and I have no idea whether it will have any effect on your monitor, positive or negative.
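The frame-time arithmetic behind this is simple; a quick sketch (the 80 Hz figure is a hypothetical overclock, not a number from NVIDIA):

```python
# At 60 Hz the panel refreshes every ~16.7 ms, so any extra frames the
# GPU renders between refreshes are simply never displayed.
def frame_time_ms(refresh_hz):
    return 1000 / refresh_hz

print(round(frame_time_ms(60), 1))  # 16.7 ms per refresh at stock
print(round(frame_time_ms(80), 1))  # 12.5 ms at a hypothetical 80 Hz overclock
```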

Meet the NVIDIA GTX TITAN

Now we get to the main event, TITAN itself, in the flesh. 10.5″ of GPU goodness. Its heat sink is similar to that on the GTX 690. It is clad in aluminum and there’s even a nice looking window into the fins on the heatsink.

NVIDIA GeForce GTX TITAN

It’s definitely a good looking GPU. As an added bonus, the GEFORCE GTX logo glows a pleasant green when the GPU is in use. Depending on the board partner’s software, you can even control it; turn it on, off or have its glow correspond with the load on the GPU.

TITAN

Here are the aforementioned 6-pin & 8-pin power connectors.

Power Connections

The video outputs are standard fare for current generation NVIDIA GPUs: two dual-link DVI, an HDMI, and a DisplayPort 1.2 output.

Video Connections

The exterior certainly looks like a winner. Let’s look at the PCB.

Under the Hood

Here’s the back of the PCB, on which you can see half of the video RAM.

Rear of PCB

This photo is here for one very important reason: To dispel any rumors that this is the GTX 780, or in any way part of the GTX 700 series. The actual name of this card is the GTX TITAN and its SKU number begins with 699. Put those rumors to rest folks. No GTX 7xx here.

Nope, Not GTX 780

Here’s a closer view of the back of the socket area.

Rear-Mounted GDDR5

The chosen RAM provider for the TITAN’s frame buffer is Samsung.

Samsung GDDR5 Memory

Regrettably, I did not have a Torx driver in the requisite size to disassemble this card, and it hasn’t been in my hands long enough to acquire one (only since Saturday morning, to be precise). So the only disassembled PCB photos come courtesy of NVIDIA.

TITAN PCB (Image Courtesy NVIDIA)

The power section isn’t massive on TITAN. The GPU is powered by a six-phase power section, with two additional phases dedicated to the 6 GB frame buffer, for a 6+2 design in total. While the card can be over-volted, it might not be wise to go volt modding with the stock power section.

The cooler does a good job of zeroing in on the temperature target (in conjunction with the GPU itself of course). When you adjust the fan profile to be a bit more aggressive, it does a solid job of keeping temps down.

Vapor Chamber Cooler

Since the photos from the press documentation started leaking, I’ve seen some mention the fins on the rear of the card behind the fan, wondering what they cool. The answer? Nothing at all. I’m not quite sure why they didn’t close that off. It does allow air to leak from the fan back into the case.

High Performance Thermals

NVIDIA is quite proud of this card’s acoustics and for good reason. Its default fan profile is very quiet. Impressively so, even when temperatures cause the fan to spin up. Even after adjusting for a more aggressive fan curve (I prefer lower temperatures; they lead to higher clocks with this card, see GPU Boost 2.0 above), it’s quite quiet.

Acoustics Comparison

One last parting shot of the GPU itself, just for the heck of it.

Parting Shot

There are two more photos and we’ll be done for the day: TITAN installed. This is what it looks like on a Maximus V Extreme (with a water-cooled 3770K) and G.Skill DDR3-2600 RAM. While we can’t show you any performance numbers, you can guess that the system is going to perform pretty well. It even looks the part.

GTX TITAN Installed

The last photo shows the nice green glow from the GEFORCE GTX logo. It’s really a good looking card and that’s icing on the cake.

GTX TITAN Installed

Finally, here’s a redux of the specs and a couple of side notes regarding some things I’ve seen talked about.

  • TITAN’s MSRP is $999.
  • Yes, that’s the same price as the GTX 690.
  • This is not replacing the GTX 690. The GTX 690 will continue to be produced.
  • The TITAN is an alternative, not a replacement.
  • This is not a limited run part. In the conference call with reporters, senior product manager Justin Walker said there will be “More than enough to satisfy demand for these high end gaming PCs.” The rumor mill somehow got a limited production run of 10,000 units in its head.
  • EVGA and ASUS are the two board partners bringing TITAN out in North America. There may be other partners in other countries, but not in this market.
  • Availability is expected to start next week (the week of February 25th).
GTX TITAN Specifications Redux

Last, I leave you with a video that should be live by the time we hit the big blue Publish button this morning, courtesy NVIDIA.

That’s all for today folks, thanks for reading. We’ll be back in a couple of days with performance numbers for you. If you have any questions at all, don’t hesitate to ask in the comments!

– Jeremy Vaughan (hokiealumnus)
