NVIDIA Introduces The GTX TITAN

In Greek mythology, the Titans were a race of “…immortal, huge beings of incredible strength and stamina…” It seems graphics card manufacturers are all about naming their high-end offerings after gods nowadays. Titan is also the name of one of the fastest supercomputers in the world, at Oak Ridge National Laboratory.

TITAN

The NVIDIA GeForce GTX TITAN’s name is certainly derived from some powerful pedigree.

Specifications & Features

NVIDIA starts off their press deck with a strong claim – the world’s most powerful GPU. It’s certainly built to qualify, with 2,688 CUDA cores, a massive 7.1 billion transistors and 4.5 teraflops of computing power.

World’s Most Powerful GPU

Obviously a GPU isn’t nearly as versatile as an x86 CPU, but this comparison is eye-popping nonetheless. If you’re running calculations that map well onto CUDA cores, TITAN will outpace a Core i7-3960X by orders of magnitude.

Comparison vs. Intel’s Strongest CPU
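
That comparison invites a sanity check. The quoted 4.5 teraflops falls straight out of the specs: each CUDA core can retire one fused multiply-add (two floating-point operations) per clock. Here’s the back-of-the-envelope arithmetic (mine, not NVIDIA’s):

    #include <cstdio>

    int main() {
        // GTX TITAN single-precision peak from the published specs:
        // 2,688 CUDA cores * 2 FLOPs per clock (one FMA) * clock speed.
        const double cores      = 2688;
        const double fma_flops  = 2;       // a fused multiply-add counts as 2 FLOPs
        const double base_clock = 837e6;   // 837 MHz base clock, in Hz

        double peak = cores * fma_flops * base_clock;
        printf("Peak SP throughput: %.2f TFLOPS\n", peak / 1e12);  // ~4.50
        return 0;
    }

At the 876 MHz boost clock the same math gives roughly 4.7 TFLOPS, so NVIDIA’s 4.5 figure is the conservative base-clock number.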

This is what a lot of people have been waiting quite a while for. The rumor mill has been churning for some time; the rumor thread on our forum is over eight pages long. Well folks, the wait is over. As widely rumored, the GTX TITAN’s GPU is the GK110 core. You’ve already seen the number of CUDA cores (2,688). What’s not listed is that it has 224 texture units and 48 ROPs. By comparison, the GTX 680’s GK104 GPU has 1,536 CUDA cores, 128 texture units and 32 ROPs. Yes, TITAN is a beast.

TITAN clocks in with an 837 MHz base clock and an 876 MHz boost clock. Fear not, it normally runs faster than that. Exactly how much faster will have to wait a couple more days, but I’ll say that 876 MHz is a tad conservative for a stock boost clock.

At that speed, with so many cores, you get the aforementioned 4.5 TFLOPS of computing power. Note the number underneath that – 1.3 TFLOPS of double-precision computing power. Yes folks, for the programmers among you who enjoy tinkering with CUDA coding in your spare time, there is now a powerful consumer GPU that doesn’t cost multiple thousands of dollars (i.e. Tesla) and still lets you compute with double precision. It’s important to note that double-precision processing is an all-or-nothing selection. If you want to dedicate your TITAN GPU to double-precision calculations, you’ll need to manually enable that mode and get to work. When you’re done, if you want to go back to gaming, you’ll need to manually turn it off, or it will hurt your performance.
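
For those itching to put the FP64 units to work, here is a minimal double-precision sketch (a plain DAXPY, nothing TITAN-specific); remember to flip the double-precision switch described above first, or the card will run FP64 at its slower gaming-mode rate:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Minimal double-precision kernel: y[i] = a * x[i] + y[i] (DAXPY).
    __global__ void daxpy(int n, double a, const double *x, double *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        double *x, *y;
        cudaMalloc(&x, n * sizeof(double));
        cudaMalloc(&y, n * sizeof(double));
        cudaMemset(x, 0, n * sizeof(double));  // zeros; real code would copy data in
        cudaMemset(y, 0, n * sizeof(double));

        daxpy<<<(n + 255) / 256, 256>>>(n, 2.0, x, y);
        cudaDeviceSynchronize();
        printf("kernel status: %s\n", cudaGetErrorString(cudaGetLastError()));

        cudaFree(x);
        cudaFree(y);
        return 0;
    }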

There’s more, but I’ll let you actually look at the specs first…

GTX TITAN Specifications

…and we’re back. With such a power-packed graphics processor, NVIDIA saw fit to give people plenty of frame buffer – a massive six gigabytes to play with. People complained that the GTX 680 didn’t have enough memory, coming with a mere 2 GB to AMD’s 3 GB on the HD 7970. NVIDIA listened and tripled the amount. Not only that, it’s operating on a 384-bit bus at 6,000 MHz effective (that’s quad-pumped GDDR5 running at 1,500 MHz). This is the largest frame buffer on any NVIDIA GPU, ever.
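
Those memory numbers also multiply out to a healthy theoretical bandwidth. Quick arithmetic (mine, not from the press deck): a 384-bit bus moves 48 bytes per transfer, and at 6,000 MT/s effective that works out to 288 GB/s.

    #include <cstdio>

    int main() {
        // Theoretical bandwidth = bus width (bytes) * effective transfer rate.
        const double bus_bytes     = 384.0 / 8.0;  // 384-bit bus = 48 bytes per transfer
        const double transfers_sec = 6.0e9;        // 6,000 MT/s effective GDDR5

        printf("Theoretical bandwidth: %.0f GB/s\n",
               bus_bytes * transfers_sec / 1e9);   // 288 GB/s
        return 0;
    }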

One thing that will surprise a lot of people is the power consumption. The card has two power connectors, a single 8-pin and a single 6-pin. It just plain doesn’t need any more, with a relatively modest 250 W TDP. AMD’s HD 7970 GHz Edition is rated at the same 250 W; if TITAN lives up to its billing as the fastest GPU in the world while drawing the same power, that would be quite a coup.

First Glimpse of TITAN

The slide above is a rendering, but it’s the first glimpse of TITAN. Below, you’ll see one of NVIDIA’s focuses for TITAN. They wanted a very strong GPU, but they also wanted one that could fit the needs of small form factor builders (and boutiques). I mean, really, who wouldn’t want the most powerful GPU in the world inside a teeny tiny mITX system?

Small Form Factor Focus

So you’ve seen the specs; let’s talk about using TITAN.

NVIDIA GPU Boost 2.0

NVIDIA has had some time to consider and refine GPU Boost since it debuted with the GTX 680. Introducing GPU Boost 2.0 (also referred to as GPUB 2.0 since it’s easier to type).

GPU Boost 2.0 – “Built for Enthusiasts”

GPU Boost 2.0 has everything GPU Boost 1.0 had but adds more voltage and higher clocks. Sounds good to an overclocker so far.

Tech Details

There were a couple of graphs leading up to this one in the press deck, but you can see how they’ve compared 1.0 to 2.0 relative to temperature in the grouped graph. Temperature is the biggest takeaway here.

Faster Clocks

With GPU Boost 1.0, much to the chagrin of overclockers everywhere, clocks were capped by available voltage, which itself was locked down. That was good for preventing silicon damage, but bad for overclocking. Some people (water cooling enthusiasts, for instance) could cool their cards to quite low temperatures, yet still had zero additional headroom.

GPU Boost 1.0 Voltage Caps

GPU Boost 2.0 pulls the rug out from under hard voltage control. Now the GPU governs its boost clock by temperature. In the simplest terms, the GPU will use the highest voltage available (now higher than on the GTX 680) in conjunction with the highest stable clocks until the GPU reaches the default temperature target (80 °C). Only then will it reduce the boost clock.

GPU Boost 2.0 Voltage Stock

In addition to that, NVIDIA is also allowing some overvoltage of the TITAN GPU. It’s not a lot (you’ll get the actual number in a couple of days), but it’s more than zero. You have to accept that you may impact the long-term reliability of your GPU, but once you’re through that mandatory dialog, you get access to some overvoltage control. This is optional for board partners to include. However, given that EVGA and ASUS are the two partners bringing TITAN to us in North America, I’d say it’s a safe bet they’ll give us that control.

GPU Boost 2.0 Overvoltage

With temperature in control instead of an arbitrarily selected ‘safe’ voltage, there is a not-insignificant increase in frequency capability. What damages silicon is voltage combined with temperature. Previously, NVIDIA only allowed voltage that was safe for long-term reliability no matter what the temperature conditions were. Now they’ve lifted that restriction and let temperature control the boost.

Performance Increase With GPU Boost 2.0

Of course, the curve shifts a hair when a bit more voltage is introduced.

GPU Boost 2.0 Performance Increase With Overvoltage

Only after accepting those conditions can you adjust the max GPU voltage.

Max GPU Voltage Optional

Overvoltage isn’t the only thing that you need to adjust with TITAN. Since temperature is the new metric by which your clocks are controlled, you need to use temperature targets to help decide where your GPU will end up.

Using Temperature Targets

Not only can you add a bit more voltage, you can also help your clocks along if you’re willing to tolerate higher temperatures. Overclockers have been doing this manually for years: they know the maximum temperature their particular hardware can accept, and they adjust cooling, voltage and overclocks to reach that happy medium. That part is nothing new, but this is the first time temperature directly affects your ability to get higher clocks.

Pick Your Target Temperature

If you move your temperature target higher, it may move your clock curve even higher.

The Difference Temperature Can Make
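
If you want to watch how closely the card hugs whatever target you pick while you tune, the temperature sensor is easy to poll programmatically. Here’s a minimal sketch using NVML, the monitoring library that ships with the driver (the same one nvidia-smi is built on); it assumes the TITAN is device 0 and omits error handling for brevity:

    #include <stdio.h>
    #include <unistd.h>
    #include <nvml.h>   // link with -lnvidia-ml

    int main(void) {
        nvmlDevice_t dev;
        unsigned int temp;

        nvmlInit();
        nvmlDeviceGetHandleByIndex(0, &dev);  // assumes the TITAN is GPU 0

        // Poll the core temperature once a second and watch how close it
        // sits to the target you set in your tuning utility.
        for (int i = 0; i < 10; i++) {
            nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
            printf("GPU temperature: %u C\n", temp);
            sleep(1);
        }

        nvmlShutdown();
        return 0;
    }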

There are a few items in the advanced controls that we haven’t gone over. You know about raising the max voltage and adjusting the temperature target. There is also a prioritization control: you can tell your GPU to prioritize either frequency or temperature. If you tell it to prioritize frequency, it will act as you’d expect based on everything above, sticking to the base clock no matter what and clocking up with boost as headroom is available.

However, say you want your GPU to run at a cooler temperature, frequency be damned. You can set your temperature target to, say, 65 °C and then set the GPU to prioritize temperature. With GPU Boost 1.0, once the GPU banged up against its limits, it would throttle down, but only as far as the base clock. With GPU Boost 2.0, if your GPU hits its limit (temperature now, which you’ve chosen to prioritize), it will throttle itself until it cools down to the temperature you’ve set, potentially even below the base frequency. So now you’ve got more control over your GPU, up and down.
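
To make the two priority modes concrete, here is a conceptual sketch of the decision logic. To be clear, this is my own model of NVIDIA’s description, not their actual control algorithm; the 13 MHz step mirrors Kepler’s boost bins, and the 993 MHz max boost is a made-up number for illustration:

    #include <cstdio>

    // Conceptual model of GPU Boost 2.0's two priority modes -- my reading
    // of NVIDIA's description, NOT their actual control algorithm.
    enum Priority { PrioritizeFrequency, PrioritizeTemperature };

    int next_clock(int clock_mhz, int temp_c, int target_c,
                   int base_mhz, int max_boost_mhz, Priority mode) {
        // Cool and below max boost: step up one bin (Kepler bins are 13 MHz).
        if (temp_c < target_c && clock_mhz < max_boost_mhz)
            return clock_mhz + 13;

        if (temp_c > target_c) {
            // Frequency priority never drops below the base clock;
            // temperature priority keeps throttling until the target is met.
            int floor_mhz = (mode == PrioritizeFrequency) ? base_mhz : 0;
            if (clock_mhz > floor_mhz)
                return clock_mhz - 13;
        }
        return clock_mhz;  // at equilibrium
    }

    int main() {
        // A hot card (85 C against a 65 C target) under each priority mode:
        printf("freq priority: %d MHz\n",
               next_clock(837, 85, 65, 837, 993, PrioritizeFrequency));   // holds 837
        printf("temp priority: %d MHz\n",
               next_clock(837, 85, 65, 837, 993, PrioritizeTemperature)); // drops to 824
        return 0;
    }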

Advanced Controls

The last item to talk about is display overclocking. This is a hit-or-miss operation; some monitors will do it and others won’t. But with TITAN, NVIDIA has introduced the ability to overclock your display a bit. Most monitors operate at a 60 Hz refresh rate (new 120 Hz monitors notwithstanding). Even if you’ve got a very powerful GPU pushing much higher framerates, the video output is still only putting out 60 Hz (which is basically 60 frames per second).

Like any overclocking, display overclocking may or may not work and I have no idea whether it will have any effect on your monitor, positive or negative.
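
The motivation is simple arithmetic: at 60 Hz the display shows a new frame every ~16.7 ms no matter how fast the GPU renders. A successful display overclock shortens that interval (the 75 Hz below is purely a hypothetical example; how far any given panel will go varies):

    #include <cstdio>

    int main() {
        // Frame interval in milliseconds = 1000 / refresh rate.
        const double rates_hz[] = {60.0, 75.0};  // 75 Hz is a hypothetical overclock
        for (double hz : rates_hz)
            printf("%.0f Hz -> new frame every %.1f ms\n", hz, 1000.0 / hz);
        return 0;
    }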

Meet the NVIDIA GTX TITAN

Now we get to the main event: TITAN itself, in the flesh. 10.5″ of GPU goodness. Its heatsink is similar to that on the GTX 690, it is clad in aluminum, and there’s even a nice-looking window into the fins on the heatsink.

NVIDIA GeForce GTX TITAN

It’s definitely a good-looking GPU. As an added bonus, the GEFORCE GTX logo glows a pleasant green when the GPU is in use. Depending on the board partner’s software, you can even control it: turn it on or off, or have its glow correspond to the load on the GPU.

TITAN

Here are the aforementioned 6-pin & 8-pin power connectors.

Power Connections

The video outputs are standard fare for current-generation NVIDIA GPUs: two dual-link DVI, an HDMI and a DisplayPort 1.2 output.

Video Connections

The exterior certainly looks like a winner. Let’s look at the PCB.

Under the Hood

Here’s the back of the PCB, on which you can see half of the video RAM.

Rear of PCB

This photo is here for one very important reason: to dispel any rumors that this is the GTX 780, or in any way part of the GTX 700 series. The actual name of this card is the GTX TITAN and its SKU number begins with 699. Put those rumors to rest, folks. No GTX 7xx here.

Nope, Not GTX 780

Here’s a closer view of the back of the board behind the GPU.

Rear-Mounted GDDR5

The chosen RAM provider for the TITAN’s frame buffer is Samsung.

Samsung GDDR5 Memory

Regrettably, I did not have a Torx driver in the requisite size to disassemble this card, and it hasn’t been in my hands long enough (since Saturday morning, to be precise) to acquire one. So the only disassembled PCB photos come courtesy of NVIDIA.

TITAN PCB (Image Courtesy NVIDIA)

The power section isn’t massive on TITAN. The GPU is powered by six phases, with two additional phases dedicated to the 6 GB frame buffer, for a 6+2 power section in total. While the card can be over-volted, it might not be too wise to go volt-modding with the stock power section.

The cooler does a good job of zeroing in on the temperature target (in conjunction with the GPU itself of course). When you adjust the fan profile to be a bit more aggressive, it does a solid job of keeping temps down.

Vapor Chamber Cooler

Since the photos from the press documentation started leaking, I’ve seen some people mention the fins on the rear of the card behind the fan, wondering what they cool. The answer? Nothing at all. I’m not quite sure why they didn’t close that off; it does allow air to leak from the fan back into the case.

High Performance Thermals

NVIDIA is quite proud of this card’s acoustics, and for good reason. Its default fan profile is very quiet – impressively so, even when temperatures cause the fan to spin up. Even after adjusting for a more aggressive fan curve (I prefer lower temperatures; they lead to higher clocks with this card, see GPU Boost 2.0 above), it’s quite quiet.

Acoustics Comparison

One last parting shot of the GPU itself, just for the heck of it.

Parting Shot

There are two more photos and we’ll be done for the day: TITAN installed. This is what it looks like on a Maximus V Extreme (with a water-cooled i7-3770K) and G.Skill DDR3-2600 RAM. While we can’t show you any performance numbers yet, you can guess the system is going to perform pretty well. It even looks the part.

GTX TITAN Installed

The last photo shows the nice green glow from the GEFORCE GTX logo. It’s really a good-looking card, and that’s icing on the cake.

GTX TITAN Installed

Finally, here is a redux of the specs and a couple of side notes on things I’ve seen discussed.

  • TITAN’s MSRP is $999.
  • Yes, that’s the same price as the GTX 690.
  • This is not replacing the GTX 690. The GTX 690 will continue to be produced.
  • The TITAN is an alternative, not a replacement.
  • This is not a limited run part. In the conference call with reporters, senior product manager Justin Walker said there will be “More than enough to satisfy demand for these high end gaming PCs.” The rumor mill somehow got the idea of a limited production run of 10,000 units into its head.
  • EVGA and ASUS are the two board partners bringing TITAN out in North America. There may be other partners in other countries, but not in this market.
  • Availability is expected to start next week (the week of February 25th).

GTX TITAN Specifications Redux

Last, I leave you with a video that should be live by the time we hit the big blue Publish button this morning, courtesy of NVIDIA.

That’s all for today folks, thanks for reading. We’ll be back in a couple of days with performance numbers for you. If you have any questions at all, don’t hesitate to ask in the comments!

Jeremy Vaughan (hokiealumnus)


Discussion
  1. Bobnova
    How many individuals actually have Tesla cards though? Seems like the vast, vast majority are in workstations in work environments. Gaming isn't exactly a priority there.

True. My point is folks who have the cash and have multi-display setups will buy this card, as well as folks who do computational work and can't afford a $3k+ Tesla. I'm sure video editing, rendering and folding will be great with this card.


    EarthDog
Let me ask again... what was said? You haven't mentioned that... :)

Sorry for the mix-up; I had edited my post before and had posted all the info.

Here's the link to the video; 1:29:00 would be a good starting point. :popcorn: -Click Here for Video-

If you notice, after Ryan Shrout from PC Perspective says "From GTXJackBauer" he almost looks like he's going to laugh his ass off or something. I guess my name might have gotten to him; maybe he's also a 24 fanatic. :rofl:

As for the question being answered, Tom Petersen from NVIDIA said it should work very well, though he doesn't have any data on it yet. Ryan Shrout responded as well, saying to Tom that the folding client isn't usually ready for newly released GPUs.

    SUCCESS!!! :bump:
    How many individuals actually have Tesla cards though? Seems like the vast, vast majority are in workstations in work environments. Gaming isn't exactly a priority there.
It's going to be a lot to most people, but what they don't get is that it's pretty much a hybrid card. It's ideal for a gamer who does computational work, or a Tesla user who also likes to game. Instead of blowing $3k you're better off getting this card; you could say it's a bargain for folks who do a lot of computational work. A pure gamer can spend $500 instead of $1,000, since they'll never use the extra features – unless this thing turns out to be the folding powerhouse I believe it will be. Nonetheless, if you're just gaming, this card really isn't for you. Buy something else if you're just a gamer, or wait for the 700 series.

From my understanding, ATI isn't coming out with anything until the end of the year or next year, so unfortunately we might be seeing higher prices than usual thanks to the invisible ATI.

I can see why only folding was talked about at the end and no computational questions were responded to. Why would NVIDIA say this card can also be used like a Tesla? They'd be shooting themselves in the foot.
Nice, but $1,000 is a lot in a GPU market where an even stronger TITAN could be just around the corner. No one truly knows what kind of stuff AMD may have inside their bag of surprises. Such prices are usually only justified if the hardware stays cutting edge for at least 1-2 years, and that's far from certain. AMD may have comparable stuff by the end of the year, so it may not even last a full year.

So, let's say I had a true need for it (which I currently don't; my current cards are fine for 1080p) – I still wouldn't be very keen to spend $1,000. Sure, a good CPU may cost the same, but it usually becomes outdated more slowly, and a CPU is still more versatile for general use.

What's certain is that NVIDIA didn't fail to impress this time. At the launch of the rather average 600 series I wasn't too impressed; NVIDIA was mostly benefiting from AMD's inferior drivers. That's why NVIDIA had an edge over AMD in the first months after launch. The hardware was not superior at all; AMD simply had to fix their incomplete drivers first.

Now, with the soon-to-be-released TITAN, I have a much better feeling that it will be a real match for future AMD products. However, we still don't know what AMD is going to release soon; there's still far too much guessing. So it will be an interesting year, and as usual the GPU hardware race is still the most interesting and most uncertain hardware contest by far.
    Bobnova
    Vince may know something that nobody else does, or may have cracked open drivers. I asked about it on the KPC forum, we'll see if I get a reply.

Either that, or NVIDIA made special drivers just for him to use for these events. No idea why they are only doing 3-way SLI for consumers; my only guess is quantity control.
    Vince may know something that nobody else does, or may have cracked open drivers. I asked about it on the KPC forum, we'll see if I get a reply.
    Lawl........:clap:

Q: How does the Titan perform compared to the other cards in Folding? (For a Good Cause)

    My question luckily got picked in the Nvidia Live Stream and was the last question.

Unfortunately, too much gaming was talked about (aside from the Crysis 3 crashes) and not a lot about the computation side of things. I'd love to see the ray tracing Design Garage demo with this thing.

    EarthDog
Press releases said three, I thought.


I think some review sites are wrong, or NVIDIA is? ;)

    NVIDIA SLI Technology

    Used by the most demanding gamers worldwide, SLI technology lets you link up to three GeForce GTX Titans together for astounding performance. And with NVIDIA’s track record for fast and frequent software updates, you’ll not only get the best performance in existing games, but future games too.


    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan/features
    bmwbaxter
I think this is marketed towards gamers with multi-monitor setups who don't want to deal with microstutter or lack of scaling in some games, but still want to run ultra/maxed settings.

Exactly, I believe so as well from what I've heard. NO POINT in buying this just for a single monitor unless you're using DP.


    bmwbaxter
    Regardless of the fact that the vram isn't the limiting factor in most cases.

Me too, but maybe with the bar set higher we will see more games using the available vram? :shrug:

I believe the limiting factor in the gaming industry is the existence of consoles. Someone needs to drop an A-BOMB on them all so they cease to exist.

    Sorry to all the console gamers for my "profanity". :rofl: