NVIDIA Introduces The GTX TITAN

In Greek mythology, the Titans were a race of “…immortal, huge beings of incredible strength and stamina…” It seems graphics card manufacturers are all about naming their high-end offerings after gods nowadays. Titan is also the name of one of the fastest supercomputers in the world, at Oak Ridge National Laboratory.

TITAN

The NVIDIA GeForce GTX TITAN’s name is certainly derived from some powerful pedigree.

Specifications & Features

NVIDIA starts off their press deck with a strong claim – the world’s most powerful GPU. It’s certainly built to qualify, with 2,688 CUDA cores, a massive 7.1 billion transistors and 4.5 teraflops of computing power.

World’s Most Powerful GPU

Obviously a GPU isn’t nearly as versatile as an x86 CPU, but this comparison is eye-popping nonetheless. If your calculations can run on CUDA cores, the TITAN will outpace an i7-3960X by orders of magnitude.

Comparison vs. Intel’s Strongest CPU

This is what a lot of people have been waiting quite a while for. The rumor mill has been churning for some time now; the rumor thread on our forum is over eight pages long. Well folks, the wait is over. As widely rumored, the GTX TITAN’s GPU is the GK110 core. You’ve already seen the number of CUDA cores (2,688). What’s not listed is that it has 224 texture units and 48 ROPs. By comparison, the GTX 680’s GK104 GPU has 1,536 CUDA cores, 128 texture units and 32 ROPs. Yes, TITAN is a beast.

TITAN clocks in with an 837 MHz base clock and an 876 MHz boost clock. Fear not, it normally runs faster than that. Exactly how much faster will have to wait a couple more days, but I’ll say that 876 MHz is a tad conservative for stock boost clock.

At that speed with so many cores, as mentioned, you get 4.5 teraflops of computing power. Note the number underneath that – 1.3 teraflops of double precision computing power. Yes folks, for the programmers among you who enjoy tinkering with CUDA coding in your spare time, there is now a powerful consumer GPU that lets you compute with double precision without costing multiple thousands of dollars (i.e. Tesla). It’s important to note that double precision processing is an all-or-nothing switch. If you want to dedicate your TITAN GPU to double precision calculations, you’ll need to enable that manually and get to work. When you’re done, if you want to go back to gaming, you’ll need to manually turn it off, or it will hurt your performance.
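As a sanity check on those throughput figures, here is a back-of-envelope sketch. This is my own arithmetic, not NVIDIA's; the assumptions are one fused multiply-add (2 FLOPs) per core per clock, and GK110's roughly 3:1 single-to-double precision ratio.

```python
# Back-of-envelope peak throughput for the GTX TITAN, from spec-sheet
# numbers. Assumes one fused multiply-add (2 FLOPs) per core per clock.
cuda_cores = 2688
base_clock_hz = 837e6  # 837 MHz base clock

sp_peak = cuda_cores * base_clock_hz * 2
print(f"Single precision: {sp_peak / 1e12:.1f} TFLOPS")  # ~4.5 TFLOPS

# GK110 pairs one FP64 unit with every three FP32 cores, so peak double
# precision is roughly a third of single precision. NVIDIA's quoted
# 1.3 TFLOPS is a bit lower, since DP mode runs at reduced clocks.
dp_peak = sp_peak / 3
print(f"Double precision: {dp_peak / 1e12:.1f} TFLOPS")  # ~1.5 TFLOPS
```

The single precision number lands right on NVIDIA's 4.5 teraflop claim when computed at the base clock.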

There’s more, but I’ll let you actually look at the specs first…

GTX TITAN Specifications

…and we’re back. With such a power-packed graphics processor, NVIDIA saw fit to give people plenty of frame buffer: a massive six gigabytes to play with. People complained that the GTX 680 didn’t have enough memory, coming with a mere 2 GB to AMD’s 3 GB on the HD 7970. NVIDIA listened and tripled the amount. Not only that, it’s operating on a 384-bit bus at 6,000 MHz effective (that’s quad-pumped GDDR5 running at 1,500 MHz). This is the largest frame buffer on any NVIDIA GPU, ever.
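The bus width and effective data rate pin down the card's peak memory bandwidth; a minimal sketch of that arithmetic (my own, derived from the numbers above):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
bus_width_bytes = 384 // 8      # 384-bit bus
effective_rate = 6_000_000_000  # 6,000 MHz effective (GDDR5 at 1,500 MHz)

bandwidth_gbps = bus_width_bytes * effective_rate / 1e9
print(f"{bandwidth_gbps:.0f} GB/s")  # 288 GB/s
```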

One thing that will surprise a lot of people is the power consumption. The card has two power connectors, a single 8-pin and a single 6-pin. It just plain doesn’t need any more, with a meager 250 W TDP. AMD’s HD 7970 GHz Edition is rated at a 250 W TDP as well; if TITAN lives up to its billing as the fastest GPU in the world and outperforms it by a tremendous amount at the same TDP, that would be an impressive coup considering how little power it draws.
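The connector choice checks out against the PCI Express power limits (75 W from the slot, 75 W from a 6-pin connector, 150 W from an 8-pin); a quick sketch of the budget:

```python
# PCIe power budget vs. the card's TDP. Per the PCIe spec, the slot
# supplies up to 75 W, a 6-pin connector 75 W, and an 8-pin 150 W.
slot_w, six_pin_w, eight_pin_w = 75, 75, 150
tdp_w = 250

budget_w = slot_w + six_pin_w + eight_pin_w
print(f"Available: {budget_w} W, TDP: {tdp_w} W")  # 300 W >= 250 W
```

So one 6-pin plus one 8-pin really is all the card needs, with 50 W of margin to spare.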

First Glimpse of TITAN

The slide above is a rendering, but it’s the first glimpse at TITAN. Below, you’ll see one of NVIDIA’s design goals for TITAN: they wanted a very strong GPU, but they also wanted one that could fit the needs of small form factor builders (and boutiques). I mean, really, who wouldn’t want the most powerful GPU in the world inside a teeny tiny mITX system?

Small Form Factor Focus

So you’ve heard the specs, let’s talk about using TITAN.

NVIDIA GPU Boost 2.0

NVIDIA has had some time to consider and refine GPU Boost since it debuted with the GTX 680. Introducing GPU Boost 2.0 (also referred to as GPUB 2.0 since it’s easier to type).

GPU Boost 2.0 – “Built for Enthusiasts”

GPU Boost 2.0 has everything GPU Boost 1.0 had but adds more voltage and higher clocks. Sounds good to an overclocker so far.

Tech Details

There were a couple of graphs leading up to this one in the press deck, but you can see how they’ve compared 1.0 to 2.0 relative to temperature in the grouped graph. Temperature is the biggest takeaway on this graph.

Faster Clocks

With GPU Boost 1.0, much to the chagrin of overclockers everywhere, clocks were capped by available voltage, which itself was locked down. That was good for preventing silicon damage, but bad for overclocking. Some people (water cooling enthusiasts, for instance) could cool their cards to quite low temperatures, but had zero additional headroom.

GPU Boost 1.0 Voltage Caps

GPU Boost 2.0 pulls the rug out from under hard voltage control. Now the GPU controls its boost clock by temperature. In the simplest terms, the GPU will use the highest voltage available (now higher than the GTX 680) in conjunction with the highest stable clocks until the GPU reaches the default set temperature (80 °C). Only then will it reduce the boost clock.

GPU Boost 2.0 Voltage Stock

In addition to that, NVIDIA is also allowing some overvoltage of the TITAN GPU. It’s not a lot (you’ll get the actual number in a couple days), but it’s more than zero. You have to accept that you may impact the long-term reliability of your GPU, but once you’re through that mandatory dialog, you get access to some overvoltage control. This control is optional for board partners to include. However, given that EVGA and ASUS are the two partners bringing TITAN to us in North America, I’d say it’s a safe bet they’ll give us that control.

GPU Boost 2.0 Overvoltage

With temperature in control instead of an arbitrarily selected ‘safe’ voltage, there is a not-insignificant increase in frequency capability. What damages silicon is voltage combined with temperature. Previously, NVIDIA only allowed a voltage that was safe for long-term reliability no matter what temperature conditions existed. Now they’ve lifted that hindrance and let temperature control the boost.

Performance Increase With GPU Boost 2.0

Of course, the curve shifts a hair when a bit more voltage is introduced.

GPU Boost 2.0 Performance Increase With Overvoltage

Only under these conditions can you adjust the max GPU voltage.

Max GPU Voltage Optional

Overvoltage isn’t the only thing that you need to adjust with TITAN. Since temperature is the new metric by which your clocks are controlled, you need to use temperature targets to help decide where your GPU will end up.

Using Temperature Targets

Not only can you add a bit more voltage, you can also help your clocks along if you’re willing to tolerate higher temperatures. Overclockers have been doing this manually for years. They know the max temperature they can accept on their particular hardware and they adjust the cooling, voltage, and overclocks to reach that happy medium. It’s been this way for a long time, but this is the first time the temperature directly affects your ability to get higher clocks.

Pick Your Target Temperature

If you move your temperature target higher, it may move your clock curve even higher.

The Difference Temperature Can Make

There are a few items in advanced controls that we haven’t gone over. You know about raising max voltage and adjusting temp target. There is also a Prioritization control. You can tell your GPU to prioritize either frequency or temperature. If you tell it to prioritize frequency, it will act like you expect based on everything above, sticking to the base clock no matter what and clocking up with boost as headroom is available.

However, say you want your GPU to run at a cooler temperature, frequency be damned. You can set your temperature target to, say, 65 °C and then set the GPU to prioritize temperature. With GPU Boost 1.0, once the GPU banged up against its limits, it would throttle down, potentially to the base clock. With GPU Boost 2.0, if your GPU hits its limit (temperature now, which you’ve selected to prioritize), the GPU will throttle itself until it cools down to the temperature you’ve set, potentially even below the base frequency. So now you’ve got more control over your GPU, up and down.
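To make that up-and-down behavior concrete, here is a toy control loop. This is my own illustrative model, not NVIDIA's actual algorithm; the 13 MHz step mirrors Kepler's boost bin size, and the function and variable names are invented.

```python
def next_clock(clock_mhz, temp_c, target_c, step_mhz=13):
    """Toy model of GPU Boost 2.0 with temperature prioritized:
    boost while under the target, throttle (even below the base
    clock) while at or over it."""
    if temp_c < target_c:
        return clock_mhz + step_mhz  # headroom: take another boost bin
    return clock_mhz - step_mhz      # too hot: keep backing off

clock = 876  # start at the stock boost clock
for temp_c in (70, 70, 70, 70):  # GPU stuck above a 65 C target
    clock = next_clock(clock, temp_c, target_c=65)

print(clock)  # 824 MHz - already below the 837 MHz base clock
```

Under GPU Boost 1.0 the floor would have been the base clock; with temperature prioritized, the loop keeps stepping down until the reading drops below the target.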

Advanced Controls

The last item to talk about is display overclocking. This is a hit-or-miss operation; some monitors will do it and others won’t. But with TITAN, NVIDIA has introduced the ability to overclock your display a bit. Most monitors operate at a 60 Hz refresh rate (new 120 Hz monitors notwithstanding). Even if you’ve got a very powerful GPU pushing much higher framerates, the video output of the GPU is still only putting out 60 Hz (which is basically 60 frames per second).

Like any overclocking, display overclocking may or may not work and I have no idea whether it will have any effect on your monitor, positive or negative.
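For what it's worth, the frame arithmetic behind that point: at a fixed refresh rate, anything rendered beyond it never reaches the screen, which is what display overclocking tries to claw back. A sketch with made-up numbers:

```python
refresh_hz = 60    # what the display actually shows per second
render_fps = 120   # what a powerful GPU can produce

displayed = min(render_fps, refresh_hz)
discarded = render_fps - displayed  # rendered but never shown
print(f"Displayed: {displayed} fps, discarded: {discarded} fps")
```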

Meet the NVIDIA GTX TITAN

Now we get to the main event, TITAN itself, in the flesh. 10.5″ of GPU goodness. Its heat sink is similar to that on the GTX 690. It is clad in aluminum and there’s even a nice looking window into the fins on the heatsink.

NVIDIA GeForce GTX TITAN

NVIDIA GeForce GTX TITAN

It’s definitely a good looking GPU. As an added bonus, the GEFORCE GTX logo glows a pleasant green when the GPU is in use. Depending on the board partner’s software, you can even control it: turn it on or off, or have its glow correspond with the load on the GPU.

TITAN

Here are the aforementioned 6-pin & 8-pin power connectors.

Power Connections

The video outputs are standard fare for current-generation NVIDIA GPUs: two dual-link DVI ports, an HDMI port, and a DisplayPort 1.2 output.

Video Connections

The exterior certainly looks like a winner. Let’s look at the PCB.

Under the Hood

Here’s the back of the PCB, on which you can see half of the video RAM.

Rear of PCB

This photo is here for one very important reason: to dispel any rumors that this is the GTX 780, or in any way part of the GTX 700 series. The actual name of this card is the GTX TITAN and its SKU number begins with 699. Put those rumors to rest, folks. No GTX 7xx here.

Nope, Not GTX 780

Here’s a closer view of the back of the socket area.

Rear-Mounted GDDR5

The chosen RAM provider for the TITAN’s frame buffer is Samsung.

Samsung GDDR5 Memory

Regrettably, I did not have a Torx driver in the requisite size to disassemble this card, and it hasn’t been in my hands long enough (only since Saturday morning, to be precise) to acquire one. So the only disassembled PCB photos come courtesy of NVIDIA.

TITAN PCB (Image Courtesy NVIDIA)

TITAN PCB (Image Courtesy NVIDIA)

The power section isn’t massive on TITAN. The GPU is powered by six phases, with two additional phases dedicated to the 6 GB frame buffer, for a 6+2 power section in total. While the card can be over-volted, it might not be too wise to go volt modding these with the stock power section.

The cooler does a good job of zeroing in on the temperature target (in conjunction with the GPU itself of course). When you adjust the fan profile to be a bit more aggressive, it does a solid job of keeping temps down.

Vapor Chamber Cooler

Vapor Chamber Cooler

Since the photos from the press documentation started leaking, I’ve seen some people ask about the fins on the rear of the card behind the fan, wondering what they cool. The answer? Nothing at all. I’m not quite sure why they didn’t close that off. It does allow air to leak from the fan back into the case.

High Performance Thermals

NVIDIA is quite proud of this card’s acoustics and for good reason. Its default fan profile is very quiet. Impressively so, even when temperatures cause the fan to spin up. Even after adjusting for a more aggressive fan curve (I prefer lower temperatures; they lead to higher clocks with this card, see GPU Boost 2.0 above), it’s quite quiet.

Acoustics Comparison

One last parting shot of the GPU itself, just for the heck of it.

Parting Shot

There are two more photos and we’ll be done for the day: TITAN installed. This is what it looks like on a Maximus V Extreme (with a water-cooled 3770K) and G.Skill DDR3-2600 RAM. While we can’t show you any performance numbers, you can guess that the system is going to perform pretty well. It even looks the part.

GTX TITAN Installed

The last photo shows the nice green glow from the GEFORCE GTX logo. It’s really a good looking card and that’s icing on the cake.

GTX TITAN Installed

Finally, here’s a redux of the specs and a couple of side notes on things I’ve seen discussed.

  • TITAN’s MSRP is $999.
  • Yes, that’s the same price as the GTX 690.
  • This is not replacing the GTX 690. The GTX 690 will continue to be produced.
  • The TITAN is an alternative, not a replacement.
  • This is not a limited run part. In the conference call with reporters, senior product manager Justin Walker said there will be “More than enough to satisfy demand for these high end gaming PCs.” The rumor mill somehow got a limited production run of 10,000 units in its head.
  • EVGA and ASUS are the two board partners bringing TITAN out in North America. There may be other partners in other countries, but not in this market.
  • Availability is expected to start next week (the week of February 25th).

GTX TITAN Specifications Redux

Last, I leave you with a video that should be live by the time we hit the big blue Publish button this morning, courtesy of NVIDIA.

That’s all for today folks, thanks for reading. We’ll be back in a couple of days with performance numbers for you. If you have any questions at all, don’t hesitate to ask in the comments!

Jeremy Vaughan (hokiealumnus)


136 Comments:

wagex's Avatar
hokie no benches??!??!?

cant wait to see some

p.s. good write up
briansun1's Avatar
ooooooooooooh shiny!!!... must have
txus.palacios's Avatar
Too expensive for me to justify getting one, but delicious nevertheless.

I knew you had one Jeremy, you always seem to have sweet nVidia-related stuff.

Thanks for the preview, sir.
mxthunder's Avatar
Looking forward to seeing how this beauty performs. Awesome stuff, glad to see awesome build quality and compute power back at the table.
Pierre3400's Avatar
Built for Enthusiasts - I was hoping it was made for mother so she could play solitaire!

Dammit, if you're even looking at this card, you know more or less what you're looking at! Price is insane, and limited to 3-way; that's gonna take the fun away from at least 1 or 2 people with a bigger bank account than sense.

I have seen a guy on another forum claim he wants 4 of them!
bluezero5's Avatar
O.M.G.
I want to SLI the hell outta that !!!


AMD, your move now.
Janus67's Avatar


That is all.


I can only imagine what the single-GPU records are going to look like a couple weeks from now on the bot. It's almost like cheating!

Now the question is - does it scale with cold!?
bluezero5's Avatar
and I need water blocks made for this, NOW. LOL.
ivanlabrie's Avatar
I'm guessing it will clock like a hardmodded 680...reference one. Unless EVGA comes out with a classified model
Insane msrp, I can't justify it. 8800 Ultra all over again, I can't believe it
madhatter256's Avatar
FOLD it!

I'm surprised there is no heat spreader on the back of the card.
hokiealumnus's Avatar
I can't find them for sale yet, but EK apparently already has them, because OriginPC announced their new liquid-cooled tri-SLI monster.

You and me both. EVGA & ASUS may make the decision to include one. It does get quite warm. Nothing that some normal case airflow won't cure, but on a test bench that didn't have a fan moving air in that vicinity (which was quickly remedied), it got nice and toasty.

I don't even know how to fold with GPUs any more. Should look into that.
Bobnova's Avatar
I'll be interested to see what the new improved hard voltage cap is.
Janus67's Avatar
that is one crazy computer.


For folding it's never been easier - install FAH v7 and be done with it.
bluezero5's Avatar
or better yet, I hope it is fully unlocked.
Bobnova's Avatar
From reading the article, it sounds like it's still locked, just a bit higher.
hokiealumnus's Avatar
Bingo, and it ain't much. I've just been told there's a new version of the software on the site. Maybe there will be more. We'll see.
EarthDog's Avatar
Im not sure if the new client supports it yet for F@H.. doubtful.
bmwbaxter's Avatar
Hopefully they are faster with this than they were for the 680.
EarthDog's Avatar
Not sure... they haven't yet optimized the client for Kepler, let alone this thing... 680 barely beats a 580... bleh.
bmwbaxter's Avatar
I am glad to hear this isn't a limited production item. Although with such a high price I don't think many people would buy one.
EarthDog's Avatar
$1000 for this thing is a bit much to me. It had better match a 690 or spank two 7970's at that price! If not, I wouldn't touch it and would just go the multi-card route... oh, and you'd better have a 2560x1440+ monitor or multiple monitors to snag one of these presumably bad boys. I wonder how much, if any, CPU limiting there is at 1080p?
dtrunk's Avatar
kingpin hasn't put up some bench numbers one of these yet?
Bobnova's Avatar
He can't, there's an NDA.
That's the downside to getting free hardware, you have to abide by the NDAs.
Janus67's Avatar
Just more time for him to tweak his 3d01 scores!
Bobnova's Avatar
Betcha a decent stack of cash Titan sucks at 3d01
Culbrelai's Avatar
AMD has no shot at beating this, although how badly would this beat a 690? Certainly not by much if they're the same price...
Janus67's Avatar
They probably don't, but again this is out of their price barrier by a significant margin. I look at AMD not putting out a competitor to the 3960X/3970X processors from Intel as a corollary to this. The total amount sold is so relatively small that they are better off spending the R&D to sell 100:1 cards at $300 rather than $1000. (Granted that ratio is made up, but I wouldn't be surprised if it were close to that.)
Culbrelai's Avatar
What about that new $1,500 limited edition ASUS monstrosity?
Be interesting to see that vs. this.
Pierre3400's Avatar
Ah all is right with the world now that oc.com's biggest nvidia fanboy has spoken.

It won't happen this year, it might not happen next year, but guess what: one day this card will be history, beaten and slow.

Now, that aside, AMD are not putting any new cards out this year, so it's pretty obvious to even a girl that only cares about her looks that AMD won't be beating it (anytime soon). By the looks of things, nvidia could also be wanting to cut down on development costs for a few years by making one final over-the-top product.
Culbrelai's Avatar
Lol im not the biggest NVidia fanboy I have one GTX 670 go find someone with two 690s =P

Thanks for the compliment though, one day I hope to work for nVidia, perhaps programming drivers, help them keep beating AMD in that way too lol.

Of course, tech changes so fast anything will be outdated soon.

The 8xxx series isnt coming out this year? I thought it was. Guess this will be the Year of nVidia
madhatter256's Avatar
From what I read, AMD doesn't have anything in this market for this fiscal year...

Aren't they accusing nVidia of stealing company secrets?
trekky's Avatar
ya soo annoying free $1000 GPU and cant even bench
hey for 1k gpu i wouldnt mind not benching for a few weeks

i still think AMD not releasing 8xxx is a wrong rumor i hoping they will release something cause i love AMD cards
(go BTC!)

no the people with two 690s have too much money...... oh btw AMD FTW
Bobnova's Avatar
AMD could do it exactly the same way Nvidia made the Titan. Picture a 280w TDP 3900 core AMD Zeus. Bye bye Titans.

This will likely lose to a 690, just count the cores. 2688 vs 3072.
What it won't have is microstutter or SLI compatibility issues.
EarthDog's Avatar
Hey hey!!! SOMEONE is using their thinking cap!! ^^
Pierre3400's Avatar
Why do I always get the sense that you're 14? You have proved before that you have little if any knowledge of how world economics work; not that this would stop you from getting a job at nvidia, and I honestly, from the bottom of my heart, hope they hire you.

Oh and btw, just cos you only have 670, and not 2x690, has nothing to do with being a fanboy.
hokiealumnus's Avatar
Pierre, you're not only driving this off topic, you're getting quite close to personal attacks. I'd highly recommend you tone it down.
thebski's Avatar
I'm really interested to see the benches, and mainly the overclocked benches on this card. It sounds like it could be a pretty good overclocker with the voltage tweaking. I don't remember which site, but one got theirs to 1176 MHz on air and Tom's has hinted at running 3-way benches with GPU clock rates in excess of 1100 MHz. That's a 35% OC over the stock 876 MHz boost. That might actually put this thing ahead of or close to the 690.

I also wonder how well it will OC on water, or if it will be like the 600's where temps were far from the limiting factor on an overclock in most cases, rendering an expensive water setup relatively useless.

I would love to SLI a couple of these bad boys and attempt to run a 5760x1080 120hz 2d Lightboost setup, but I play BF3 a lot and as we know that game is so CPU bound. If I were to pull something like that off it would probably take a 3770K running at 5.2+ Ghz, and that would be ridiculous and probably impossible to find. It would still probably be CPU limited.

Someone tell me that Haswell's will be able to hit like 5.5 GHz to help drive this insane GPU power.....
hokiealumnus's Avatar
I hinted at it in the article, but 876 MHz is a conservative number for stock boost. It runs well north of that. 1100 MHz isn't as large of a jump as it may seem at first.
bmwbaxter's Avatar
The thing is, I don't think AMD has anything close to this size. This is based off of existing HW (tesla) I don't think AMD has anything to draw from and in a year where they might not be releasing a standard line up it is unlikely they will make one.

But I could be wrong.
thebski's Avatar
Yes you did sir. I am fidgeting in my chair thinking about putting this boss under water and the real performance that could be squeezed out of it, lol.

I know you probably won't/can't say much in this regard if you even have any information on it, but how likely would say it is that we will see variants of the Titan (aka FTW, Classified from EVGA, etc.), or can we pretty much expect a Hydro Copper version at most akin to the 690?
hokiealumnus's Avatar
Truth is, I don't know the answer to that question. For right now, I'm relatively certain we can expect reference only models. There's no telling whether EVGA or ASUS will do anything extra with it down the line. With the (very) small market for a card at this price point, I'd expect the willingness to focus much extra engineering on it (as opposed to the GTX 680 / HD 7970, whose price can take a little increase due to customization) to be rather low.

NVIDIA can confidently say they'll have plenty to meet demand. Probably because there aren't all that many people that'll drop a grand on a GPU.

Also, temps will make a difference, but not a massive amount. You're still voltage limited. The voltage increase available isn't large by any stretch of the imagination. With a custom fan profile, temps stayed well under the temp target I set. The card crashes from too-high overclocks long before it reaches temps worth worrying about.
Pierre3400's Avatar
Im sorry, will be done.

Nevertheless, it was said that this yearly deadline of having to push out new and better GPUs was putting a strain on both nvidia and AMD.

This card is nonetheless insane for a single GPU; I have even thought about selling my 2x 7970 and buying the Titan.

But the price of this card is just so high... So high that one could assume they have left it room to drop in price over the next 2 years?
Bobnova's Avatar
From what I've heard from various (non-NDA) sources Nvidia has said "No non-reference!".
It makes sense when you consider that the silicon alone is probably $200.
thebski's Avatar
That's kind of what I figured. In my experience with the 600's, water was good for another 20-30 MHz over air. I could make it through benches with my 3 670 FTW's at 1345/1345/1320 on water where I could only make it at 1320/1320/1306 on air. Just a drop from 65C (air) to 40C (water) increased the max "stable" clocks.

It looks like it will be similar on this Titan, except they do give you the option to increase the voltage slightly on your own. Basically, they're giving people the option to do the 1.2125V bios flash that was available on the 600's without voiding their warranty.

The only reason I asked about variants of the card is because it sounded like nVidia was leaving it up to the manufacturers on what the voltage limitations are. The Anand's article made it sound like EVGA, Asus, etc. could set the volt max wherever they wanted. I didn't know if a non reference version may come with a higher vmax than the reference card, but I suppose time is the only one with the answer.

If I can expect to put one under water and get at least near 690 performance with the beefed up memory sub-system then I can see spending $1K on it. We'll see what the OC'ed benches look like
Bobnova's Avatar
I seriously doubt that. I expect that Nvidia has left them the option of turning off voltage control, or leaving it on.
Note the presence of the same daughterboard for the VRM as the 680 has. Betcha it has VID6/7 tied to ground just like the 680, locking it at the same 1.2175 max VID.
hokiealumnus's Avatar
Anandtech let the cat out of the bag on available voltage. I erred on the side of caution about giving an actual number. Their max available voltage was the same as my max available voltage. The software "update" I received today changed the slider you see on Anandtech from an xxxx mV to 1200 mV slider to one that reads +0 mV to +xx mV (still erring on the side of caution ).
thebski's Avatar
Just curious hokie, what do they do to you if you violate NDA? I mean, someone from nVidia themselves violated their own NDA by posting the graph with performance numbers on their website.

I'm sure it varies on degree of violation, but I've always just wondered because it seems the NDA's do a very good job of keeping a whole lot of people very quiet so the penalties must be somewhat severe.
Boulard83's Avatar
If you have a product before everyone else, let's say for reviews, and you publish your review before the NDA lifts, the company that sent you the product might just stop sending you any new products in the future... so you'll have to wait for the real product release date to do your review, days after those who still get product before the release date to prepare theirs.
Seebs's Avatar
Titan... Since these companies seem to have a "liking" for mythological names...

Didn't the Titans get obliterated by the Olympians? And who are these Olympians you speak of? Well, since you asked: Zeus (Greek mythology) or Jupiter (Roman mythology) and their friends took down the Titans, and then their offspring, among them Ares (Greek) aka Mars (Roman), went on to rule the world.

So this Titan thing won't beat the Ares II or the MARS III in anything. Perhaps only in being the "most expensive" card on a per-core basis.
hokiealumnus's Avatar
Well, I suppose the worst thing that can happen really is that they stop sending you hardware, so you lose the ability to have zero-day reviews.

However, the main reason I'll never break an NDA (nor will anyone on this site) is that we gave our word. That still means something to some people and our writers and editors are proud to be in that category. We have never broken an NDA and don't plan on ever doing so.
ivanlabrie's Avatar
No more free hardware?
Bobnova's Avatar
That really depends on the NDA.
You sign an NDA, physically. It's not just a checkbox. It's a legal contract.
Most of them if you violate them you expose yourself to nice juicy lawsuits by the company you contracted to NDA with.

If I recall the last one I signed, they can go after me for many thousands, hundreds of thousands, of dollars. Plus of course I'd never get a sample from them again obviously, and likely wouldn't get any NDA samples from any other company either, as I'd have proven myself to not be trustworthy.
thebski's Avatar
Well I can certainly appreciate honest people and commend you guys on being good for your word, something that is all too lost in today's society.

Bob, I figured there had to be some legal binding to it. Even though you guys are trustworthy, there is a mass of people that sign these NDAs and not all of them can be that trustworthy. I was just curious.

Can't wait for it to lift so we can get the true story and end all the speculation!
txus.palacios's Avatar
Alas, I doubt AMD will go olympian and blow NV out of the water. I'd love to, though. I no longer have anything against AMD, as they're improving driver-wise on Linux.
hokiealumnus's Avatar
Truth be told, I've never physically signed an NDA for any hardware. I've agreed to adhere to an NDA date & time via email and most times not even that.

There are a limited number of samples to go around. There are a LOT of places that would like to get their hands on pre-release, NDA hardware. If a site is big enough to qualify to get on the 'list', as it were, for NDA hardware and they finally do get on it, they aren't likely to violate the NDA for a few thousand extra page hits you get for publishing early, when the penalty is getting dropped from the 'list', likely permanently. It's just not worth it.

These places you see breaking NDA for the most part aren't ever under an NDA (think WCCF Tech and their ilk). They get hardware from people they know that work in / around / know billy jo bob that works at the factory in Taiwan.
Seebs's Avatar
Thing is AMD already has its "Olympian" champion... The Ares II. Sure; they're apples and oranges since nVidia's is a single core and "supposedly" mass market product whereas the Ares II is a dual core, limited edition card made by a vendor and not AMD itself. Still; when spending the equivalent of a good computer on a GPU alone; it had better be that much better than the next one down... And I just don't see the Titan being three times as fast as a HD7970, twice as fast as a GTX680 or even 1.5 times as fast as a Mars II.

Basically; this thing is just like the Ares and Mars lines of cards, but without the "panache" of having a # of 1000 marking on it. So you're paying "Limited Edition" prices, but not getting "LE" status or even "LE" performance.

To me... It's a "bleh" type of release.
Boulard83's Avatar
Tweaktown had a hard time with Nvidia in the past, and if you visit Tweaktown today, they don't have a Titan preview... they lost a lot in this "little war".
Robert17's Avatar
Pretty nice card. Not having $1k to throw down on it leaves me just sayin' "OOOO, AHHH". But from another perspective, maybe it will shave a couple of bucks off the prices of the so-called lower-tier cards, silver linings and all.

Just the same, if you don't have to ship it back, maybe it could behave like the Stanley Cup, everyone gets to play with it for a month????
hokiealumnus's Avatar
They complained about it on Facebook last week actually. For the past three plus NVIDIA releases, they have obtained just such cards in just such a manner from their friends in Taiwan and then published the results before every other site's NDA time and date, spitting in NVIDIA's face for not including them in the sampling. It seems hardly a winning strategy, but that's what they did. NVIDIA just did a good job keeping them locked down in TW this time, so TT couldn't get their hands on one. I don't know what the initial beef between NV & TT was, but TT made it immeasurably worse by taking the path they did.
Boulard83's Avatar
^^ totally agree.
Culbrelai's Avatar
So wait, this is only 3 SLI? Why not four? Why an odd number? Weird.

EDIT: and what Tesla is this based off of?

The K20?
Janus67's Avatar
The "x" is for Titan!
ivanlabrie's Avatar
lol
All these leaks messed up my mind...
Bobnova's Avatar
It's a K20X Tesla card with (horrendously) gimped double precision floating point performance and a higher clock speed.
EarthDog's Avatar
I believe you can also enable/disable that DP feature in the NVCP. Something like 1/24th-rate DP without that option enabled...

EDIT: link -> http://www.anandtech.com/show/6760/n...titan-part-1/4
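For anyone wondering where the headline numbers come from, they fall straight out of the published specs. Here's a back-of-envelope sketch (the 1/3-rate FP64 unit count is GK110's published layout; the gap between the computed and quoted DP figures suggests the card clocks lower with full-rate DP switched on):

```python
# Back-of-envelope peak-FLOPS math for the GTX TITAN (GK110), from published specs.
CUDA_CORES = 2688              # single-precision CUDA cores
FP64_UNITS = CUDA_CORES // 3   # GK110 executes FP64 at 1/3 the SP rate (896 units)
BASE_CLOCK_GHZ = 0.837         # 837 MHz base clock
FLOPS_PER_CYCLE = 2            # one fused multiply-add counts as 2 FLOPs

sp_tflops = CUDA_CORES * FLOPS_PER_CYCLE * BASE_CLOCK_GHZ / 1000
dp_tflops = FP64_UNITS * FLOPS_PER_CYCLE * BASE_CLOCK_GHZ / 1000

print(f"SP peak: {sp_tflops:.2f} TFLOPS")  # ~4.50, matching NVIDIA's headline figure
print(f"DP peak: {dp_tflops:.2f} TFLOPS")  # ~1.50 at base clock; NVIDIA quotes 1.3
```

The quoted 1.3 DP TFLOPS landing below the computed 1.5 is consistent with the card running at reduced clocks when the full-rate DP mode is enabled in the control panel.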
Bobnova's Avatar
Very interesting! Nice to see, really.
Also interesting on the ECC part, as GDDR5 has some error detection (CRC on transfers) by design. I assume they're talking about extra ECC, maybe covering the caches inside the core and maybe the data while it sits in RAM (which isn't covered by GDDR5's built-in protection; that's just transfers).
ivanlabrie's Avatar
Max voltage without hardmods would be... 1.2v?
Or will it depend on the vendor?
So many questions for a card I can't even dream of buying till next year lol
Bobnova's Avatar
Based on what I can see on the PCB, there is almost certainly an absolute hard cap (without a custom i2c device, anyway) at 1.2175v.
It could be gotten around via hardmods, and a manufacturer could build a software controlled hardmod into the board, but that'd probably be the last Nvidia card they made.
Boulard83's Avatar
^^

I doubt Nvidia will allow EVGA and Asus to build custom cards with this "jewel", but I would really like to see what Asus and EVGA could build with such a GPU as a base.

Imagine an Asus Titan Matrix with a 16- or 20-phase power design and a higher Vgpu allowed!!!!! Or even a dual-GPU card based on Titan...
ivanlabrie's Avatar
Yeah, I thought so too, Ed... Methinks I'll grab some of the used 680 Lightnings people will sell for peanuts later on xD
wingman99's Avatar
Good introduction. Looks like it's a little pricey like the rumors said; can't wait to see the benchmarks.
trueblack's Avatar
this is very very pleasing to the eye.

so... is this the rumored GTX 780?

or are they calling it a different line entirely?
hokiealumnus's Avatar
It's just GTX TITAN. No number. It is not GTX 7xx anything.
Bobnova's Avatar
From what I've heard the 780 will be a much smaller refresh of the 680. Think 480 to 580, if we're lucky.
GTXJackBauer's Avatar
That's exactly what's happening.
SupaMonkey's Avatar
Again, they insist on putting a vapour chamber on it.

If 80°C is an important number, I would like to see how one of these goes with an Arctic Accelero Extreme on it. They are almost silent at full speed and will keep my card below 65°C even overvolted and overclocked.

This is a 5-year GPU, surely?
Bobnova's Avatar
Those cost a lot more than a vapor chamber; then you'd be looking at a $1050 card.
Probably not dual-slot, either.
SupaMonkey's Avatar
Haha, that $50 would really hurt.
nightelph's Avatar
Its SKU begins w/ 699... But yeah, the core is from their Tesla line, so I don't really get why it says GTX on it at all. Did I mention I want one?
GTXJackBauer's Avatar
I've been reading around, with people complaining about the price tag. This card is pretty much a hybrid GTX and Tesla all-in-one. It's freakin' amazing for CUDA programmers and the like who need Tesla power but also want to disable DP and start gaming. You can't take a typical GTX card and do computation work as fast as a Tesla does, and you can't game on a Tesla, since the GTX will outperform it in games. You've basically got a Titan "supercomputer" single GPU with the best of both worlds. I read everywhere about people freaking out over the price. This is a special card, not a typical top-tier card from Nvidia in the $500/$600 range.

I think Nvidia released this to show that they still have it. meh
trueblack's Avatar
TBH $999 for a single card which tops everything...

not that expensive at all.
I am thinking this will be on par with GTX 690
bmwbaxter's Avatar
I really like how the GTX Titan Logo lights up on the card. I always thought that was missing on the 6xx series. It would be nice to have it change color with load like the Matrix cards do.
Pierre3400's Avatar
I'm pretty sure the 690 does? My old 590 did..
EarthDog's Avatar
Tops everything but a card that costs the same, and another that costs $200 less? I mean, we don't know anything outside of the rumors, but assuming they're true, it has been said that it can't beat a 690 or 7990, or two 7970's or two 680's.
bmwbaxter's Avatar
but I would bet it can easily beat a 680 or 7970 and come within 10% of a 690 or 7990, but with WAY more vram per core, which is a complaint some people have with the current dual-GPU cards.

Also, a TDP of 250W is lower than any other GPU combo with this much punch.

The performance per watt is going to be far better than anything else out there ATM. So for SFF people who want big performance, this is still going to be a great card. (Still a costly card )
EarthDog's Avatar
It better spank a 680/7970 for that price! From the looks (rumors) of it, that card is a great single-GPU solution, especially for those with a 2560x1440+ or multimonitor setup, due to the vram and its horsepower. It was practically made for the Korean monitors with its refresh rate increase too (so says anandtech too).

The only people that should be complaining about vram on today's cards are those that have 2560x1440 or MMonitor setups; otherwise, 2GB/3GB is fine for 1080p.

I personally do not care about performance/watt, but know some people do. Think about it though... You are paying $1000+ for this card, when you can have two 7970's or a 7990 for $800. It will take a while to make up that $200+ with your electric bill savings. Even if you play games 4 hours/night for 365 days, you won't make that cost up for a couple of years (depending on your power rates, of course; at my rate, 10 cents/kWh, it will take over 2 years to make up the cost if you game that much, which is a TON, to me).

While I think this will be a monster, its price is just so off-putting to me. Note this is coming from someone that really dislikes CFx/SLI setups outside of 2560x1600+ or MMonitor setups.
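The payback arithmetic above can be sketched like this (the rate and hours are the ones quoted; the wattage gap between one TITAN and two 7970s is my own rough assumption):

```python
# Hedged sketch of the electricity-payback math (assumed numbers, not measurements).
price_premium = 200.0   # $ extra for a TITAN over two 7970s
rate_per_kwh = 0.10     # $0.10/kWh, the rate quoted above
hours_per_day = 4       # gaming hours per night
watts_saved = 250       # assumed gap: ~500 W for two 7970s vs the TITAN's 250 W TDP

kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
dollars_saved_per_year = kwh_saved_per_year * rate_per_kwh
years_to_break_even = price_premium / dollars_saved_per_year

print(f"Savings: ${dollars_saved_per_year:.2f}/year")  # $36.50/year
print(f"Break-even: {years_to_break_even:.1f} years")  # ~5.5 years on these assumptions
```

The break-even point is very sensitive to the assumed wattage gap and local rates, which is why estimates from a couple of years upward are all defensible.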
bmwbaxter's Avatar
I also agree that the performance per watt isn't that big a deal, and that the VRAM isn't an issue at 1080p. But I don't think this is marketed for that. Multi-monitor/high-res all the way.

Just figured some of those things needed to be brought up, especially since some forum members go nutty about performance/watt.
Bobnova's Avatar
The comparison to two 7970s is going to be crucial, as those have plenty of RAM for even multiple displays and cost $200 less.
I'd be interested to see it vs two 7950s for that matter.
bluezero5's Avatar
I agree with this.

the comparison to 2 x 7970 GHz will be very crucial here.
If it can 'come close' being a single card, then that will be pretty big, as you can imagine what a Titan SLI setup could do.
Boulard83's Avatar
I want to see this kind of comparison in surround, especially at this price point.

Titan vs 7970 CF vs 7950 CF vs 680 SLI (4GB?)

IMO, 7950 CF is still the best bang you can get for your buck if you run surround/Eyefinity. You get the same vram as the 7970 with a slightly slower core. At ~$600 for two 7950's... it's damn hard to beat. You could even build a tri-fire 7950 setup for the same price as a Titan, but you sometimes start to fall into scaling issues with more than 2 GPUs in some titles.

The GTX 680/670 don't really have what it takes to run a really high-resolution 3-monitor setup. The 7970/50 have more vram and the price is very competitive, but CF/Eyefinity does seem to have more trouble vs. SLI. I have friends that run CF 7950 and Eyefinity, and they sometimes have issues with the setup and some games. I never had any with my past SLI and surround, and my single 670 never gives me any sort of trouble... I'll swap my GPU soon... I want to see excellent results!
ivanlabrie's Avatar
You're forgetting the 7870 LE cards... Tahiti core, albeit castrated, but it performs similarly to a 7970 when oced and can be had for $230 atm. Two of those would cost $460.
Boulard83's Avatar
But it doesn't have the same memory interface.
EarthDog's Avatar
How quickly one forgets that you can overclock... the 7950, the 7970, and the Titan. It really irks me, I have to admit, to hear something like that (you can overclock X, yet forget to mention that you can overclock Y and Z). So essentially the song remains the same; it just depends on how high you can overclock those specific cards.
Vsalvis's Avatar
Def considering one of these for my next build later this year, but I will wait for reviews, benchmarks, etc.
Seebs's Avatar
The more I look at this thing, the less I can understand who the target market for it is.

Is this a Tesla core with GTX capabilities? Or a GTX core with DP computation capabilities?

Either way; I don't see much overlap in the two markets for these... Can't really imagine a bunch of biochemists disabling DP on a few cards at the research lab for a few hours so they can play some Crysis 3.

And I really can't figure out what on earth would Johnny "I have way too much money" Smith ever do to put the DP capabilities of the card to use.

So we have a $1000 card that does two things, but it probably won't do either of them so much better than the "dedicated" counterparts as to warrant the price hike.

If I were a gamer (I am) and had $1000 to spend (I don't); there would be lots of options for me to choose that would spank this card in its behind. Some would even cost me less than $1K.

If I were someone interested in its DP computational prowess (F@H doesn't count, because as far as I know they haven't even optimized clients for Kepler, so the wait would be long), I'd probably be working at a facility that has no need for the GTX part of the card, so they'd go with full Tesla cards instead of gimped Tesla/GTX hybrids.

Again... This is one of those products that is more of a "status symbol" than a practical upgrade, but that too has problems, because apparently it is not a "Limited Edition"... So out the door goes the whole status aspect of it. Again: paying "LE" prices, but getting a mass-market product, and one that won't beat the "LE" products out there today... Argue all you want, but I find it difficult to see this thing beating the Mars II or Ares II. It will beat the "normal" GTX680s and probably even the normal HD7970s, but those are not the competition for this thing.
Janus67's Avatar
Great post, seebs. Completely agree with you. To me it (at rumored power specs) seems to be more of a bragging right for nvidia, to say they have the most powerful card, but I can't see them moving very many units.

As an nvidia stockholder, I honestly don't think this will do anything to increase my stock's value.
bmwbaxter's Avatar
I can see this being an asset to anyone wanting the max vram per core available. There are some 6GB 7970's out there, but they retail for $600+ and will get wrecked by a Titan (my guess), and a CFX of them will cost more and only be maybe 10% faster (my guess). I think this is marketed towards gamers with multi-monitor setups who don't want to deal with microstutter or lack of scaling in some games, but still want to run ultra/maxed settings.
EarthDog's Avatar
Serious question: who would that be, and why would they want that? Is this thing strong enough to push a single GPU core at 5760x1080?

I mean, we have seen funsoul's results for vram at that res, and not many games now break the 3GB mark. 6GB is obnoxious! I'd rather see 4GB on a 512-bit bus or something than 6GB on 384-bit.
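For context on that bus-width point: capacity and bandwidth are independent, and bandwidth falls out of bus width times effective memory clock (a sketch using the TITAN's published 384-bit bus and 6 GHz effective GDDR5 data rate):

```python
# Memory bandwidth from bus width and effective data rate (published TITAN specs).
bus_width_bits = 384
effective_clock_ghz = 6.008   # 6 GHz effective GDDR5 (1502 MHz command clock x 4)

bandwidth_gbs = bus_width_bits / 8 * effective_clock_ghz
print(f"{bandwidth_gbs:.1f} GB/s")  # ~288.4 GB/s

# A hypothetical 512-bit bus at the same data rate would give ~384 GB/s;
# extra capacity on the same 384-bit bus adds nothing to bandwidth.
```

Which is the point being made: a wider bus buys bandwidth, while more GB on the same bus does not.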
Seebs's Avatar
But will it really spank an HD7970 GHz with 6GB of VRAM ($579 at newegg right now)? I just don't see it doing that. It may beat said card, but will it be $400 better than that card? I see this selling to people that just want to say "I have a Titan." I know a few guys that bought the Mars II and the Ares I when they came out, and the only thing they used them for was "regular" desktop use... no heavy gaming or anything that even pushed those cards... They got them because they wanted to post pictures of their systems "sporting the LE cards" on the boards... So it was an e-peen sort of purchase. At least those cards were true LE and had a "10 of 1000" marking on them. With the Titan, it's LE pricing without LE badges, and performance that we don't know about.

If I'm spending 40% more money on something, I had better be getting 40% more out of what I bought, and I truly think that said 40% cost hike on the Titan goes to the "Tesla" part of it, so a "gamer" would be spending the extra $400 on something he/she may never use... Dumb move, in my opinion. And someone interested in using the card for its DP side would be much better served by going full Tesla instead of getting a hybrid card like this one.

Even if I had more money than sense, I'd have enough sense to see this one for the "gimmick" that it is and move on to better options.
bmwbaxter's Avatar
Good question, I am not sure who that would be. I remember when the GTX 680 came out, there were a good number of people talking about the superiority of the 7970 just due to the extra vram, regardless of the fact that the vram isn't the limiting factor in most cases.
Me too, but maybe with the bar set higher we will see more games using the available vram?
Bobnova's Avatar
Seriously doubt it. It'd be a stupid game manufacturer that limited itself to kilobuck GPUs.
trueblack's Avatar
oh this card is clearly a bragging card from nvidia no doubt.

market value is one thing.
they are just showing they are at the top of the food chain.
Pierre3400's Avatar
I happen to have 2x HD7970 GHz 6GB. I have hit 16,000 points in 3DMark11 with them overclocked on air. They are outscored by the Titan, and yes, for the price of two of them I could have one Titan and money left over. But meh, I'm happy. My cards will still be great in combo in 3-4 years.

I can't really speak for the Titan, as I have yet to see 3DMark11 scores at the Performance preset. But I would hope that it beats the 7970 6GB, otherwise it's even more overpriced.

I can max out any game with two 7970s, and anyone with two of them, even with 3GB, has no trouble maxing out games.

Personally, I DO get above the 3GB mark in a certain game.

The one true difference I really see that is going to handicap me when it comes to OC is locked voltage, and on top of that, I'm also locked into universal waterblocks, as all block makers point-blank refuse to make full blocks for this card, even though there are many looking for them, mainly to have the 7970 and the 6GB vram under water. As far as I can find, the only non-reference cards getting full blocks are the MSI Lightnings. I have even bitched at EK, 'cos I want 4 of these badboys.

Edit:

I see that the performance figures are up, and my 2x 7970's score better than one Titan, both at stock clocks. I presume regular 7970's would score around the same.
madhatter256's Avatar
In this economy, a $1,000 GPU for a <30% overall improvement (aggregated across all performance ratings) seems a bit far-fetched, but then I wouldn't be surprised if the card drops to $500 by January 2014... and it will still be the top-of-the-line card.
EarthDog's Avatar
IMO, no way will this card get close to $500 in 11 months. Think about it... AMD isn't releasing anything else this year, according to them, and it really doesn't matter what Nvidia releases, to be honest... which leaves this card to sit on top. I would imagine it could come down to $800 or so by then to match the 7990...?

No clue... all speculation.
madhatter256's Avatar
True. And considering TSMC, which makes the chips for nVidia and AMD, reported price hikes on fabrication, prices will hardly drop. That $500 was speculation for if the card fails to sell well over the year.
EarthDog's Avatar
Caveats or not, (IMO) there is no chance it will be close to $500 in 11 months.
EarthDog's Avatar
And I desperately hope that, miracle of miracles, it comes true.
GTXJackBauer's Avatar
This video just came in not too long ago, with Kingpin smashing records, yes, with the TITAN!!! Beating 4 GTX 680s on LN2 with 4 Titans ON AIR ONLY!!!!!! lawlzzz




Enjoy!

Edit: I WANT ONE! LOL!!!!


One can only wonder how much this one will go for. I'll start by saying $1200+

EarthDog's Avatar
I thought you could only run 3 in SLI... I saw 4 there, though, LOL!
Janus67's Avatar
So apparently you aren't limited to 3 Titans...
Boulard83's Avatar
^^ Seems so. TechPowerUp said quite a few times throughout the review that you can pair 4 Titans together.
EarthDog's Avatar
I thought the press releases said three, though.
Boulard83's Avatar
I think some review sites are wrong, or Nvidia is?

http://www.geforce.com/hardware/desk...titan/features
GTXJackBauer's Avatar
Lawl........

"<GTXJackBauer>: Q: How does the Titan perform compared to the other cards in Folding? (For a Good Cause)"

My question luckily got picked in the Nvidia Live Stream and was the last question.

Unfortunately, too much gaming was talked about, aside from the Crysis 3 crashes, and not a lot about the computation side of things. I'd love to see a raytracing Design Garage demo with this thing.



Edit: Here's the Link to the Video @ 1:29:00 would be a good starting point. - Click Here for Video -

If you notice, after Ryan Shrout from PC Perspective says "From GTXJackBauer", he almost looks like he's going to laugh his ass off or something. I guess my name might have gotten to him; maybe he's also a 24 fanatic.

As for the question being answered: Tom Petersen from Nvidia said it should work very well, but that he doesn't have any data on it yet. Ryan Shrout responded as well, saying to Tom that the client usually isn't ready for newly released GPUs.

SUCCESS!!!
EarthDog's Avatar
?? ^^

So it was picked and what did they say?
Pierre3400's Avatar
Watching the video, I was like, wtf? 4 Titans? They said its limit is 3, but now we know this was a lie!
Bobnova's Avatar
Vince may know something that nobody else does, or may have cracked open drivers. I asked about it on the KPC forum, we'll see if I get a reply.
hokiealumnus's Avatar
Heh, I PM'ed him about a BIOS and/or PrecisionX version that lets it get to the 117% power target. Even that will increase my scores.
Ivy's Avatar
Nice, but $1,000 is a lot in a GPU market where an even stronger Titan could be just around the corner. No one truly knows what kind of stuff AMD may have inside their bag of surprises. Such prices are usually only valid if the stuff stays cutting edge for at least 1-2 years, but that's certainly uncertain. AMD may have comparable stuff by the end of the year, so it may not even last a full year.

So, let's say I had a true need for it (which I currently don't; my current cards are fine for 1080p), I still would not be very certain about spending $1,000. Sure, a good CPU may cost the same, but it usually becomes outdated more slowly than that, and a CPU is still more versatile for general use.

What's certain is that Nvidia didn't fail to impress this time. At the launch of the rather average 600 series, I wasn't too impressed, and Nvidia was only benefiting from AMD's inferior drivers. That's why Nvidia had an edge over AMD in the first months after launch. However, the hardware was not superior at all; AMD simply had to fix their incomplete drivers first.

Now with the soon-to-be-released Titan, I have a much better feeling that it will be a real match for future AMD products. However, we still don't know what AMD is going to release soon; there's still far too much guessing. So it will be an interesting year, and as usual, the GPU hardware race is still the most interesting and most uncertain hardware race by far.
GTXJackBauer's Avatar
It's going to be a lot to most people, but what they don't get is that it's pretty much a hybrid card. It's ideal for a gamer who does computational work, or for a Tesla user who also likes to game. Instead of blowing $3K, you're better off getting this card. I guess you could say it's a bargain for folks who do a lot of computational work; for a pure gamer who will never use the other features, though, spending $1,000 instead of $500 makes less sense, unless this thing turns out to be the powerhouse for folding I believe it will be. Nonetheless, if you're just gaming, this card really isn't for you. You can buy something else if you're just a gamer, or wait for the 700 series.

From my understanding, ATI isn't coming out with anything until the end of the year or next year, so unfortunately we might be seeing higher prices than usual thanks to the invisible ATI.

I can see why only folding was talked about at the end and no computational questions were responded to. Why would Nvidia say this card can also be used like a Tesla? They'd be shooting themselves in the foot.
Bobnova's Avatar
How many individuals actually have Tesla cards though? Seems like the vast, vast majority are in workstations in work environments. Gaming isn't exactly a priority there.
EarthDog's Avatar
Let me ask again... what was said? You haven't mentioned that...