
False Specs on GTX 970?

If an upgrade is offered, it needs to be at cost level rather than retail level. I would rather have a refund than pay the retail difference to a 980. And although the card does have 4GB of RAM, it is still in a cloud of shenanigans over that and the memory bus.
 
What does the type of plug matter? It CAN pull that much, but doesn't. It's a 165W card, Ivy... that doesn't change.

EDIT: so... what's wrong with the memory bus now? Are we going on nameplate values, where 256-bit is less than 512 so the bigger number wins? We know better... :)

In the end the R9 series does have greater bandwidth, but what does it really matter until 4K? Again, performance numbers are what they are with its specs!!
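For reference, the nameplate bandwidth both sides keep citing is just bus width times effective memory data rate. A quick illustrative sketch of the arithmetic, using the commonly listed reference specs for both cards:

```python
def bandwidth_gbs(bus_bits: int, effective_gtps: float) -> float:
    """Theoretical bandwidth: bytes per transfer across the bus, times transfers/sec."""
    return bus_bits / 8 * effective_gtps

# Commonly listed reference specs (effective GDDR5 data rates):
gtx_970 = bandwidth_gbs(256, 7.0)   # 256-bit @ 7.0 GT/s -> 224.0 GB/s
r9_290x = bandwidth_gbs(512, 5.0)   # 512-bit @ 5.0 GT/s -> 320.0 GB/s
print(gtx_970, r9_290x)
```

So the 290X does have more raw bandwidth on paper; as noted above, whether that shows up outside of 4K is a separate question.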
 
CAN? How are you going to pull 300W on the 980? LN2 or something? Probably impossible; the power target will screw it, I guess, so is it just fake? How come there are such overpowered plugs if there's no need at all? I guess 2x6 for the 970 and 1x6/1x8 for the 980 would be perfectly suitable. Heck, the 290X is a true 250W card and is using a 1x6/1x8 plug.

Well it is probably an accident... becoming very common lately.

Btw: The performance is not all that great; the truly impressive part is the efficiency. If AMD had the same efficiency it would be a serious threat. But it surely wouldn't be very promising if AMD were telling us "here is our very efficient card, and here are the 2x8-pin power connectors, because it consumes so little power... barely worth mentioning".
 
Well, the 980 is overkill anyway; even if I had the cash it would be insane going for it. A 2x8-pin card... I wonder how people can sustain their claim of much higher efficiency when the card has connectors that are basically good for 300W, no less than a 290X. But I could go on waiting forever... I need a new card today, not in countless months. I compare the 290X with the 970 because they're almost the same price... it would be unfair to compare it with a much pricier 980.

On efficiency matters: the Nvidia cards use connectors that are able to pump twice the power they actually use (980: 300W on a 175W card*, 970: 225W on a 150W card); it's just weird. Although the temperature threshold won't allow for a GPU twice that strong, maybe +30%, and Nvidia is surely holding it back in order to attack AMD. I guess it's too much fakery really... Nvidia probably didn't release the true high end and is just waiting to release it in some months. The overpowered connectors are there so consumers get the idea that they actually own a flagship... oh well, I guess I know the game.

*The 290X is a 250W or so card, roughly ~40% less efficient, probably ~30% vs. the 970. It's much less efficient, but nothing that can't be beaten by a new generation. The GTX 960 is clearly a junk piece with the lowest efficiency of the entire 900 series so far (probably just ~10% more efficient than the 290X); dunno why such a bad design for lower-midrange users who are in need of efficiency.

GTX 970/980 power draw is limited in BIOS and driver, so it's never more than ~225-250W (depends on version). The GTX 980 has a 6+8 pin config, not 8+8; only some brands use 8+8 in their top OC series. The TDP is also unrealistic: in tests on some websites like Tom's Hardware, results were about 30-40W higher than declared by Nvidia. That's peak draw.
A reference GTX 970, or the ASUS, has a 225W max, but in some reviews max wattage was reaching ~190W. I don't know if that is way too much. Actually, most GTX 970s without additional rail mods are throttling after OC because of power limits. The GTX 980 has a 300W config, but what do you expect from a card which draws 220W after OC? 2x6 pin would be stupid.
The 290X is 250W on paper. In reality it's a 300W card; that's why the reference series OCs really badly. Also, the most popular connectors for the 290X are 8+8 pin = 375W max.
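The connector wattages quoted throughout this thread follow from the PCIe spec power budgets: 75W from the slot, 75W per 6-pin, and 150W per 8-pin. A quick sketch of where the 225W/300W/375W figures come from:

```python
# PCIe spec power budgets: slot 75W, 6-pin aux 75W, 8-pin aux 150W
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def board_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum in-spec board power for a card with the given aux connectors."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(board_limit(2, 0))  # 2x6-pin -> 225W (reference GTX 970)
print(board_limit(1, 1))  # 6+8-pin -> 300W
print(board_limit(0, 2))  # 8+8-pin -> 375W (common custom 290X)
```

Note this is the in-spec ceiling of the connectors, not what the card actually draws; headroom above the TDP is normal.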

On the GTX 970/980 you can unlock the power target up to 1kW and unlock the PCIe rail so a good single-rail PSU supports an additional 75W. That's why you can see the ASUS GTX 970 reaching 1600MHz even though without mods it's throttling at 1400MHz+.

From other news:
http://www.techpowerup.com/209369/nvidia-to-tune-gtx-970-resource-allocation-with-driver-update.html

"NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point, and if current owners are not satisfied with their purchase, they should return it for a refund or exchange."
 
Again, performance numbers are what they are with its specs!!

The numbers are what they are until we have a game which uses 3.5GB+ of VRAM; then they're very different.

According to everything we as gamers and hardware nerds have always understood about graphics cards, I think it's reasonable to expect that if a video card has 4GB of VRAM, then performance will be the same in a given game regardless of the amount of VRAM the game requires, up to 4GB... not up to 3.5.
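For context on why the last 0.5GB behaves differently: per Nvidia's disclosure, the 970's memory is split into a fast ~196GB/s 3.5GB segment and a slow ~28GB/s 0.5GB segment. A rough, illustrative sketch of why average bandwidth drops once an allocation spills past 3.5GB (the segment figures are the published ones; the uniform-access weighting is a simplification, not how real workloads behave):

```python
# GTX 970 segmented memory, per Nvidia's disclosure:
FAST_GB, FAST_BW = 3.5, 196.0   # GB, GB/s (7/8 of the 224 GB/s nameplate)
SLOW_GB, SLOW_BW = 0.5, 28.0    # GB, GB/s (the remaining 1/8 partition)

def avg_bandwidth(used_gb: float) -> float:
    """Naive usage-weighted average bandwidth if 'used_gb' is touched uniformly."""
    if used_gb <= FAST_GB:
        return FAST_BW
    spill = min(used_gb, FAST_GB + SLOW_GB) - FAST_GB
    return (FAST_GB * FAST_BW + spill * SLOW_BW) / (FAST_GB + spill)

print(avg_bandwidth(3.5))  # 196.0 GB/s, everything fits in the fast segment
print(avg_bandwidth(4.0))  # 175.0 GB/s under this naive model
```

In practice the driver tries to keep hot data in the fast segment, so the real-world hit is usually smaller than this worst case, but larger than zero.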

Just because I don't currently play a game which makes use of 3.5+, doesn't make my disappointment any less justified.

I understand what Earthdog and Woomack are saying; it is silly for anyone to overreact about this as it's a very small percentage of affected users and the performance and characteristics (power draw, especially) up to 3.5GB are still incredible. However, I don't think we should say things which only serve to belittle and alienate those marginalized few who have concerns.

P.S. Ivy, please stop derailing this with nonsensical talk about which manufacturer lies more and whether power draw is insufficient on one card or another. You aren't helping the cause ;)
 
LOL, it's not fake. You can modify the BIOS and raise the voltage. With water I can imagine it.

Also, reference specs are two 6-pins, so it's the AIBs that are adding the 8-pins. ;)

However, I don't think we should say things which only serve to belittle and alienate those marginalized few who have concerns.
Who is? We just don't feel as strongly as you do about this problem is all. It's real. It affects the 1% of people who rock multiple monitors and 4K, or do heavy, heavy mods.
 
From other news:
http://www.techpowerup.com/209369/nvidia-to-tune-gtx-970-resource-allocation-with-driver-update.html

"NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point, and if current owners are not satisfied with their purchase, they should return it for a refund or exchange."

I'd be happy with this. Everything they said about price-performance is on point.
 
LOL, it's not fake. You can modify the BIOS and raise the voltage. With water I can imagine it.

Also, reference specs are two 6-pins, so it's the AIBs that are adding the 8-pins. ;)

Who is? We just don't feel as strongly as you do about this problem.
So the TDP is fake? The card can actually make use of the full rail wattage, and the rails are required for sufficient power? It plays a huge role in efficiency matters, because that's the spec most users throw at me when it comes to Nvidia, but it turns out to be a lot of unrealistic TDP numbers; yet you were telling me several times it is barely using more than 150W or whatever.
 
"NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point, and if current owners are not satisfied with their purchase, they should return it for a refund or exchange."
The question is whether it will come true. Technically I think a 970 is equal to a 980, and the driver may be able to do the trick. I still don't know why Nvidia made such a cut without properly informing users from the very beginning. Apparently an accident... but I couldn't care less as long as the 970 ends up running at the originally advertised 4GB RAM speed with full allocation. In that case the matter would be solved... I would say if they truly fix it, they've done their part of the deal and the apology is accepted.

But it's not done yet... gonna watch it. I will certainly need the full 4GB at some point, that's clear, because I mod games a lot. So I really could suffer from the possible issue.

Btw: The refund thing won't be free; that's the issue. When I have to send the card to the shop I have to pay a fee... even if I take it by car it will cost gasoline and valuable time. So it is not the best solution to play around with RMA matters just because someone else made a mistake.
 
So the TDP is fake? The card can actually make use of the full rail wattage, and the rails are required for sufficient power? It plays a huge role in efficiency matters, because that's the spec most users throw at me when it comes to Nvidia, but it turns out to be a lot of unrealistic TDP numbers; yet you were telling me several times it is barely using more than 150W or whatever.
Never said it was fake... there is just a lack of understanding on your part. Read Woomack's explanation... though again, reference was 6+6 pin, not 6+8 like he mentions.

I'll get back to you on what I meant when I'm not on my phone
Don't worry about it. I really don't want to debate people's stance on the subject. The facts were laid out in the Anand article, so for me, debating its importance is a waste of time. I run 1440p on a 970 as well and I couldn't care less because I do not mod anything, and outside of cranking settings beyond what the in-game options provide, I can't trip 3.5GB. Others will... just not many people. Does that make it right? Naaa, but it is what it is. We have said what we have to say, and the rubber on my wheels is just about done from spinning in circles already. :)
 
I'm still planning to buy a 970 in the near future. :) I've been with AMD (ATI) all my PC years; I bought a 750 Ti last year for my ITX machine and have had way fewer issues than I've had with my 280X. On a more subjective note, it feels a lot smoother even at slightly lower frames than the 280X. On top of that there's the obvious price/performance ratio; the damn thing trades blows with the Titan, and if memory serves me correctly that was what, like $600+? Yeah, sounds like you guys got screwed.
 
Don't worry about it. I really don't want to debate people's stance on the subject. We have said what we have to say, and the rubber on my wheels is just about done from spinning in circles already. :)

Fair enough. :)
 
Never said it was fake... there is just a lack of understanding on your part. Read Woomack's explanation... though again, reference was 6+6 pin, not 6+8 like he mentions.

I had to check that again and you are right. On the Nvidia site the GTX 970 and GTX 980 have 6+6 pin connectors. I was sure the GTX 980 had more, maybe because only the reference card has 6+6 and all others have 6+8 or 8+8.
The wattage of the 970 and 980 is almost the same, which is quite weird considering the better specs of the 980.
 
How will a driver fix the 224-bit memory interface?
I am just tired of Nvidia not being truthful in the specs for their devices. If Nvidia is offering a full refund, I think I will send mine in. I don't think I would take it back to Microcenter for a refund though, as it wasn't them that lied to me.
The one thing that many aren't considering is that I bought the card to have for some time to come, and there will be more and more things that will want to use more than 3.5GB of RAM, and the card is overly crippled when that attempt is made. Just because most stuff won't use more than said 3.5GB for now doesn't mean I didn't pay for a card with that capability for the near future.
I think that Nvidia should do just as Intel did with the P67 motherboards and offer an unconditional return for a full refund, eating every bit of the cost to do so. That is the only way Nvidia will learn the real lesson about hiding the little facts they want to cover up.
 
Hard to say if drivers can fix it. Drivers lock many things and limit OC, max power, etc. I wouldn't be surprised if they could unlock something they are hiding just to show a bigger difference between the 970 and 980.
Earlier cards also had locked ROPs, bus, etc., but at least then they were declared as slower cards and everyone was happy they could unlock additional performance.
 
How will a driver fix the 224-bit memory interface?
256-bit. But that isn't the/a problem unless you get to 4K res. You can see even at 1440 the gap between the 290/290X and the 970 is the same. It's not until one gets to 4K/3x1080 that the gap starts to shrink, which then shows the 256-bit bandwidth difference.

Don't get hung up on nameplate values, but look at the actual performance. ;)
 
Hey, if you guys don't like it that much, shoot me a PM and we can hash out details on a trade for that 384-bit memory bus ;) lolol
 
256-bit. But that isn't the/a problem unless you get to 4K res. You can see even at 1440 the gap between the 290/290X and the 970 is the same. It's not until one gets to 4K/3x1080 that the gap starts to shrink, which then shows the 256-bit bandwidth difference.

Don't get hung up on nameplate values, but look at the actual performance. ;)

I can confirm this. I have both of these cards at the moment and performance is almost identical at 1440P.
 