
AMD 390X reference card will ship with AIO cooler

Agree.

In a typical gaming rig this card may prove problematic.

I don't think the intent of the thread is aimed at sub-ambient users as you state. Who cares? If the results are there, I may have one :)
 
I am not sure I agree that anyone cares about efficiency that much.
What it is all about is what you are spending your watts on.
No matter what, until retail hits the streets we will have nothing definitive.
The speculation is that the performance for the watts spent is not going to be there.
This card will be an epic fail if it can only deliver 10% over a GTX 980.
How much higher that percentage actually ends up will be telltale as to whether this card succeeds or not.

Just like with the GTX 480: was it worth the heat and power consumption for the performance, or wasn't it?
The alternative was the more efficient, cooler-running 6970, at about a 15% deficit in performance.

They sold a lot of 6970s despite the gap in performance.
Point being, this is not a new argument, simply the same one with the roles reversed.
I am sure many will jump on the 390X if it is 10% faster or better.
Many will opt to use the GTX 980, though.

Back then, if you remember, Nvidia was scrambling for a solution, and the GTX 580 was released shortly after the GTX 480 in an effort to regain market share.

The GTX 480 was a great-performing card, slightly better than a 570, and it overclocked better than a GTX 580 because the 580s and 570s had terrible power delivery issues (they would often just die).
Yes, it was hotter; yes, it used more power; but it had just as good or better potential, IMHO, than a GTX 580.

That said, the 6950 sold tons and the 6970 sold well... Why?
Well, for one, most 6950s could unlock to a 6970; I unlocked three in my time with them.
They were also under $200. I paid $150 for the last two I bought, and that was prior to the 7xxx series cards, so they weren't last gen yet.
That, and all the 6950s I unlocked to 6970s overclocked a good +25%.

The GTX 570 at that time was nearly 2.5-3x that much, and the GTX 580 was $500+ only 8 months after the GTX 480 released.

I owned both at the time, and I still have the GTX 480 in a guest PC. They all worked fine with similar options, but the AMD cards were much, much cheaper.

If a friend had not wanted to buy my 6950 (converted to a 6970), I'd probably be using that in my guest PC now.


Either way, yes, price vs. performance has much to do with purchasing.
I care little about actual wattage use, TBH, and I never will, so long as it performs well and is manageable heat wise. I never leave my gaming rig on past the time I use it anyhow, so power use is not a big deal to me.

I doubt it is to most other people either.

If AMD made a 500-watt GPU that did 6x the performance of a GTX 980 and ran off an all-in-one liquid cooling unit that took a 2x120mm rad, people would buy that thing as long as it was affordable. Heck, even if it was double the price of a 980, they would still buy it for the performance.

Problem is, air coolers have hit their limits, which means one thing: either increase efficiency, or they can't continue to use air coolers.
 
"manageable heat wise"

That is truly hitting the nail on the head.
The other side of this is that it is not going to be 6x the performance of a 980.
Only time will tell what type of heat penalty and performance this card will bring.

This will be the determining factor that will either make or break this card.
 
I'm just waiting for someone to come out with a single-slot card with a full-cover block that has a rad/pump attached.
 
The cooler that is on the 290X Lightning is said to handle 500 W. There is also an air-cooled 295X2, which is 500 W.

So, not sure a 250/300 W card has hit air-cooled limits. ;)
 
I think even if it was hot, etc., had it been released half a year ago as it was supposed to be, going by the first leaks, then all would be fine. If they release it now at the recently leaked specs and performance, then it will be kind of a fail. Looking at the leaked results, we can expect 10-15% higher performance than the GTX 980 but 50-100% higher wattage. If that's true, then there is no way Nvidia will reduce the price of the GTX 980. They will live for 2-4 months as 2nd in performance but with much higher efficiency, and then release a new Titan and a 980 Ti.

This is what I'm thinking as well. I doubt they'll lower pricing at all, even with the 390s outperforming it, as they'll be doing it with a hotter and more power-hungry card. With that said, too many people are excited about lower-power-consumption cards at the enthusiast level. A year ago no one cared at all how efficient their graphics card was. I couldn't care less as well, as long as the damn things don't overheat; I have plenty of power to run several of them, and most people can run 1-2 easily as well.
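
Quick back-of-the-envelope on those leaked numbers, just to put the efficiency gap in perspective (a rough Python sketch; the ~165 W figure for the GTX 980 is the commonly quoted TDP and is an assumption here, and the 10-15% / 50-100% deltas are the leak's claims taken at face value):

# Rough perf-per-watt sanity check on the leaked numbers quoted above.
# Assumptions: GTX 980 board power ~165 W (commonly quoted TDP); the leak's
# "10-15% faster, 50-100% more wattage" is taken at face value.
GTX980_POWER_W = 165.0

def relative_efficiency(rel_perf, rel_power):
    # Perf/watt relative to the GTX 980 (1.0 = equal efficiency).
    return rel_perf / rel_power

for rel_perf, rel_power in [(1.10, 1.50), (1.15, 2.00)]:
    eff = relative_efficiency(rel_perf, rel_power)
    watts = GTX980_POWER_W * rel_power
    print(f"+{(rel_perf - 1) * 100:.0f}% perf at ~{watts:.0f} W -> "
          f"{eff:.2f}x the 980's perf/watt ({(1 - eff) * 100:.0f}% worse)")

So even taking the optimistic end of the leak, perf/watt would trail the 980 by a wide margin, which is exactly the trade-off being argued here.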
 
The cooler that is on the 290X Lightning is said to handle 500 W. There is also an air-cooled 295X2, which is 500 W.

So, not sure a 250/300 W card has hit air-cooled limits. ;)
Some may even handle 600 W (I guess the strongest from Gigabyte or Inno3D can cool 600 W), but the noise is almost as high as a jet engine... and/or the cooler is simply so huge that it only fits inside cases that support giant cards. There is still untapped potential in genuine high-grade copper* coolers, because something like 99% of all coolers are made with aluminum for the sake of lower cost, so the current "limit" is only a practical and economical one; in theory the true limit is still well beyond current capability. *Silver is even better, but I wouldn't tell anyone, or the PC may get stolen all of a sudden.

It always depends on the size of the cooler too. Most manufacturers no longer stick to the official 2-slot spec. Most high-end coolers exceed the limit at one or even several spots (even my MSI 970 cooler is bigger than a usual 2-slot). As long as no SLI/CF is used, it could even work to build 4-slot coolers. For the sake of SLI/CF, most manufacturers simply expand the height and length by a large margin (even 30 cm length is not that uncommon anymore) and still call it "2-slot"; somewhat dishonest, but it seems to pass.

Finally, it's simply terribly inaccurate to make hard claims about what a cooler is and is not capable of... it always depends on endless factors, and on how much coin someone is actually able or willing to spend.
 
Doing some rather hilarious speculation now, simply because it's fun; take it with a grain of salt:

I think the current "leaks" regarding single-precision performance could be pretty realistic; I guess there is more truth involved than just a grain of salt:
On that basis, the dual GPU could be +45% compared to the 295X2, the strongest single GPU may get a special cooler and land up to +95% vs. the 290X, while the second-strongest single GPU could be in the +45% vs. 290 range. I guess the strongest single type and the dual type could use a special cooler, with TDP handled according to the cooler's limitations; in that case the single type should have the highest margin. The second-strongest single type is not using a special cooler, so the margin will probably only be +45%, and it will be a "budget GPU"*, kind of comparable to the 970.

*Budget is a relative term; it's budget in terms of a "true gaming GPU". Other GPUs are frequently available at an even better budget, with many sacrifices depending on the game.

If the specs are even roughly valid, then the performance compared to Maxwell would end up at:
Second-strongest single type: +35% vs. its 970 counterpart at a budget price (maybe comparable to the 970). TDP unknown, but my guess is around 200 W; more than the 970, but also more performance. Efficiency could be more or less the same.
Strongest single type: +85% vs. its 980 counterpart at a matching or slightly higher price (at time of release) and with a special cooler attached. TDP could be around 300 W. Efficiency could be more or less the same. An SLI 980 setup could match its performance in many games, but multi-GPU will always be inferior to a single GPU when it offers barely better performance at an even higher price.
Dual type: Can't be compared to Maxwell because no dual-GPU Maxwell has been released so far.

What will Nvidia do? Well... they may release a 980 Ti with almost twice the ALUs in order to attack the new 300 series card with the strong cooler, but it may not entirely match the performance, though with less heat and maybe improved efficiency. A new Titan will attack the dual GPU, and maybe it will end up with a bit less performance but also less heat and improved efficiency. AMD in that case may stay "volcano spec", but probably with matching or even better performance.
 
There was a lot of newer news posted saying otherwise, like the link in the first post. ;)

Only time will tell...
 
I don't think there will be a shrink involved, because apparently it's still not mature yet at large scale and an affordable price (low wastage). Intel is a special case: they can charge so much for their CPUs because there is simply no good competition, so they can afford a challenging nm progression and even high wastage, thus totally beating anyone. However, I think it is realistic to see a 20 nm shrink probably next year; I guess Nvidia could even take the lead (maybe already by the end of the year) and AMD will follow a short time after. The competition is already very high; too-big steps could be bone-breaking for both Nvidia and AMD, although Nvidia still has the bigger elbows.

Anyway, as already stated, only the future will tell.
 
I read that link, but I didn't see anything about node or TDP, just that it would have a new type of cooler. A 197 W TDP card with a water cooler uses maybe 25-30 watts for water pumping? That leaves 167-172 watts for the GPU itself, which seems possible for an x90/xx90 series like the 7950, which used around 150 watts stock. Sigh... lol
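
Just to lay that power-budget guess out explicitly (a small Python sketch; the 197 W card TDP and the 25-30 W pump overhead are the guesses from the paragraph above, not measured numbers, and whether the pump actually counts against the card's TDP is itself an assumption):

# The arithmetic above, spelled out. The 197 W card TDP and the 25-30 W
# pump/fan overhead are guesses from the post, not measured values.
CARD_TDP_W = 197

for pump_w in (25, 30):
    gpu_budget_w = CARD_TDP_W - pump_w
    print(f"pump/fan {pump_w} W -> {gpu_budget_w} W left for the GPU itself")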

And reading between the lines, just because it's not "mature" doesn't mean a limited run can't be used to manufacture a number of cards in the R9 390/390X line. With the 7000 and R9/R7 series, they used 28 nm since it was ready, but since the R9 390 and 390X have been confirmed to be the only new GPUs with a new chip, that frees AMD from having to supply an entire line on a smaller shrink. Tens of thousands of GPUs might not be ready by H2 2015, but demand might trickle up by the time supply is ready. Timing a release so the limited quantities are marked up in price keeps demand low until the supply is ready. I think they'd release it as soon as supply is ready, to pick up steam.
 
There is no need for water cooling at a 200 W TDP; it's economic nonsense. 300 W would make sense, but in that case only for the flagship (single and dual).
 
Well, reading between the lines (not really), there is only one reason you need/put an AIO cooler on a card: because it runs hot / the TDP is high. You don't need it on a 200/250 W card. ;)

There is no need for water cooling at a 200 W TDP; it's economic nonsense. 300 W would make sense, but in that case only for the flagship (single and dual).
I.............. I agree....
 
GPUs must be a lot different from CPUs then. CPUs are water-cooled when they are overclocked, at what I'm assuming is 125 watts to upwards of 200 watts, but a GPU that uses 200 watts at stock doesn't need it? I think it was designed for all the board manufacturers that like to overclock it.
 
Not really, no. It has nothing to do with overclocking it; well, that doesn't seem to be its primary use. If it was a midrange 200 W card "for overclocking", that would leave no room in the market for the high-end card. If this is true, I expect it to be 250-300 W stock. There are plenty of 250 W cards out there (7970, R9 290/290X, 780 Ti, etc.) that don't require water cooling.

These would be reference cards. This is, from the latest rumors (first post of this thread, not a 3-month-old rumor from the same website), a high-end card that would seem to require water cooling.

... but it's all rumors for now. Some just make more sense, with a little logical thinking behind them, than others.
 
GPUs must be a lot different from CPUs then. CPUs are water-cooled when they are overclocked, at what I'm assuming is 125 watts to upwards of 200 watts, but a GPU that uses 200 watts at stock doesn't need it? I think it was designed for all the board manufacturers that like to overclock it.

Cooling CPUs and cooling GPUs are two different animals. They don't compare in the slightest.
 
I think my point is easier to explain by analogy, comparing the 40 nm GPUs from the HD 5000 series to the HD 7900 series. The HD 5870 was considered "top of the line" for single GPUs. It was the first GPU series by AMD on the 40 nm shrink from 55 nm, and it had a TDP of 228 watts. It did not exploit the full, mature 40 nm process: it used more transistors in a smaller space with some headroom left over. That means performance can improve at a lower TDP, but each generation tends to increase in TDP to allow more transistors; otherwise we would be left with the same number of shader cores at a lower TDP.

The next generation, the 6970, was also a 40 nm process, but it had a TDP of 250 watts. It basically maxed out the TDP on that shrink before the TDP became "too much" for a solution without water cooling. Only then did it seem more economical to release the 28 nm 7970, which used, again, 250 watts. But the difference between the 7970 on the new node and the 5870 was that the 7970 started out at a higher TDP, with less headroom left over than there had been on 40 nm. Then the R9 290X was released with a TDP of 290 watts, and 28 nm is starting to be maxed out in performance per watt, with less optimization available.

So, what is very much possible, if this pattern holds true for two generations, is that we'll see a "top of the line" R9 390X around 200-225 watts or possibly 250 watts (possibly on a new node), and then the year after an R9 490X with a TDP of 275 or 290 watts again. It's not that the GPU can't handle more TDP; it's just historically imitating Intel's tick-tock strategy, where a smaller die is used first and then "matured" the generation after with more shaders, etc., even if not much else changes on the same node. All the while, GPUs have increased in TDP over the much longer term of 10 years: in 2006, the X1950 XTX had a TDP of 125 W, but in 2007, the HD 2900 XT had a TDP of 225 watts at 80 nm. Here is the link where I got this info: http://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Comparison_table:_Desktop_GPUs
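
To make that node/TDP pattern easier to see at a glance, here are the figures cited above laid out in a tiny Python snippet (the numbers are exactly the ones quoted in this post from the linked Wikipedia table; the node for the X1950 XTX isn't given in the post, so it's marked as not available):

# TDP progression as quoted in the post above (from the linked Wikipedia
# comparison table). Node for the X1950 XTX is not given in the post.
tdp_history = [
    ("X1950 XTX",  None,    125),
    ("HD 2900 XT", "80 nm", 225),
    ("HD 5870",    "40 nm", 228),
    ("HD 6970",    "40 nm", 250),
    ("HD 7970",    "28 nm", 250),
    ("R9 290X",    "28 nm", 290),
]
for card, node, tdp_w in tdp_history:
    print(f"{card:<11} {node or 'n/a':>6}  {tdp_w} W")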
 