
False Specs on GTX 970?

Yeppir, the omega drivers are great AND the Sapphire runs at 1000 core, 1300 memory. The 970 IS faster but the 290 is no slouch.
 
Why are they wasting resources on those? They aren't humorous or entertaining. Just keep knocking it out of the park in price-to-performance and delivering on specs ;) and leave the comedy to non-engineers hehe
 
Why are they wasting resources on those? They aren't humorous or entertaining. Just keep knocking it out of the park in price-to-performance and delivering on specs ;) and leave the comedy to non-engineers hehe

I thought it was both humorous AND entertaining xD
 
So the R9 3xx has no chance to compete with the 9xx series from Nvidia?

They directly compete. It's just a different way of doing things. Right now you can buy a card in the 970 that beats the 290X in many cases while using significantly less energy and generating a ton less heat. But at most you are talking handfuls of FPS in either direction. So you either want less heat and power usage or you don't, for whatever reason.

That used to be the 'space race' between these two companies. Things were getting smaller and they were both trying to squeeze the most they could while keeping power and heat down. AMD kind of took its eye off the ball when they shifted to consoles and simply went brute force with the last generation of cards they released.

There is nothing subtle about the 290/X. Big RAM with a big 512-bit memory bus. Nvidia's 256-bit answer was far more refined and actually advances things with new compression techniques and other goodies. Some are even speculating that the 970's offloading of less intensive tasks to that 512MB of slower VRAM is the future. It isn't the first time Nvidia has done it, and while they can't fix the 970 hardware-side, they can conceivably do some things with drivers to better manage what gets offloaded to the slower VRAM.

That was in their first official statement, followed by a retraction to cover their butts in case they can't deliver.


The GPU market is as much a two-party system as the US political system. We need good reasons to buy AMD just as an old-school conservative like myself needs a good reason to vote Republican again. I haven't seen any in either case for a while now.

The 970 thing is still not a great reason. If money were no object you'd simply toss more money Nvidia's way and grab a 980. That's not a great position for AMD, or for the consumer.
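A rough sketch of why that slower 512MB segment matters. The bandwidth figures below are the widely reported ones for the 970, not official specs, and the model naively assumes the whole allocation gets streamed uniformly:

```python
# Back-of-envelope model of the 970's partitioned VRAM. Figures are the widely
# reported ones (not official specs): ~196 GB/s to the 3.5 GB segment and
# ~28 GB/s to the last 0.5 GB, which can't be read at the same time.
FAST_GB, FAST_BW = 3.5, 196.0   # segment size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(alloc_gb: float) -> float:
    """Naive average bandwidth if alloc_gb of VRAM is streamed uniformly."""
    if alloc_gb <= FAST_GB:
        return FAST_BW
    slow_part = min(alloc_gb, FAST_GB + SLOW_GB) - FAST_GB
    # total time to stream each segment once, then size / time
    t = FAST_GB / FAST_BW + slow_part / SLOW_BW
    return alloc_gb / t

print(effective_bandwidth(3.5))  # 196.0, full speed
print(effective_bandwidth(4.0))  # ~112, roughly halved once the slow segment is touched
```

Average bandwidth roughly halves once the slow segment is in play, which is consistent with the driver trying to keep low-priority data there.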
 
I thought it was both humorous AND entertaining xD

I disagree. Honestly, it was a cheap and crappy attempt at whatever it was. I could have come up with a better plot if given the time.

My main reason for steering away from the 290X: heat. No one burned their hands in the video...
 
Have been testing a lot now, but it's hard to reproduce the issue; it may have an issue at 3.5 GB+, but if so it is acting unstable and I guess it could be fixed driver-side. The drivers are probably still on the beta side and not fully mature. Especially with some memory OC and improved drivers, I guess it may work properly in most games even above 3.5 GB, but there can be exceptions, who knows.

That doesn't change the fact that Nvidia simply didn't spell out the truth for many months in a row. Honestly, if they had been fully open and provided the correct specs from day one, telling everyone how it works, I guess only a few people might have had issues. It is the dirty, dishonest approach that made many users upset, and I fully understand it.

Performance-wise, both parties have good stuff. AMD went the "volcano way," and they are perfectly honest about it because it is even in the name, "Volcanic Islands." Nvidia went the cold way... unfortunately cold "at heart" too, and that isn't going to work well in marketing terms. They have to take customers seriously, tell them the full story, and simply not make fools of them, and all may be fine.

Considering the crazy power usage of many AMD cards: I can't make a blanket statement, because to my mind today's high-end, super-performance hardware is generally far thirstier and in need of far more cooling than the hardware of 10+ years ago. So that has "improved" everywhere, at every company. It's true that AMD is currently the worst case, but anyone who is truly serious about a "green approach" while running a lot of high-end hardware has to accept that the power consumption can't be avoided: high performance means high power, that's simply the rule.

Anyone who still takes "green specs" seriously may be able to find certain solutions, and I did: all my high-end hardware sits right below the roof of the house. That means in winter, and generally whenever it's cold, I can use the heat very effectively as a room heater, and in summer it helps dry the room out, so I can dry my clothes far more efficiently without energy hogs such as a tumble dryer or a dehumidifier; I have no need for such devices thanks to my "hardware room." So much of that energy isn't truly wasted at all and gets used for more than just TFLOPS of performance.

The energy shapes the room the way I want it, so I gain in many ways with less waste. It is critical to me that the room stays warm and, above all, low in humidity, because I have tons of tea stored there, and if mold gets on the tea it can destroy a huge amount of very expensive tea. The hardware is a very effective device for providing the right climate to keep those sensitive goods safe. So, no, Ninjacore, I don't have anything in the basement except my bathroom... and I hope that explains accurately why that's the case.
 
It may have an issue at 3.5 GB+, but if so it is acting unstable and I guess it could be fixed driver-side.
The issue isn't instability over 3.5GB. It's hitching because it's writing out to the slower part of the RAM.

That said, how many times are we going to repeat the same things (the rest of your LONG post)? Can we see test results?

I thought it was both humorous AND entertaining xD
I did giggle. That said, I wish they would spend more time on their GPUs and less time on marketing. Nobody likes a person that talks trash and can't back it up... Trueaudio... Mantle... We've been waiting for adoption and it hasn't been fast. :(
 
Sitting here waiting on all those returned GTX 970's in Amazon Warehouse and newegg openbox.
Looking for a pair of EVGA GTX 970 SSC's....

*twiddles thumbs*

:D
 
Have been testing a lot now, but it's hard to reproduce the issue; it may have an issue at 3.5 GB+, but if so it is acting unstable and I guess it could be fixed driver-side.

Quoted the only part of your response which matters.

What "issue" are you seeing?

Why is the "issue" hard to reproduce?

How are you trying to reproduce it?

Can you post screenshots of your results? (GPU-z output or afterburner graphs, fraps, ...)


Sitting here waiting on all those returned GTX 970's in Amazon Warehouse and newegg openbox.
Looking for a pair of EVGA GTX 970 SSC's....

Saw a G1 sell on [H] yesterday for $275 shipped. :shock:
 
They directly compete. It's just a different way of doing things. Right now you can buy a card in the 970 that beats the 290X in many cases while using significantly less energy and generating a ton less heat.

Nvidia's 256-bit answer was far more refined and actually advances things with new compression techniques and other goodies.

I would still venture to guess that the R9 3xx will compete on at least equal footing with the GTX 980.
And if frametimes sag from here till tomorrow over the "advanced memory refinements," then I'm not sure that is something I need. I have a GTX 970 and don't have an R9 290/X, but I'm thinking I may swap. There is no real "killing performance" by Nvidia. Sure, it beats the R9 at some things, but it also gets beaten at some things. Many won't consider that Nvidia has paid many developers to optimize for their cards only, and that is really sad and once again underhanded on Nvidia's part.
I am truly wary of what the gaming world will be if Nvidia gets its way on how it thinks things should be. I wouldn't be able to afford any of their goods at that point.
 
The R9 3xx had better BEAT a 980... AMD has had several months to tweak clocks and get yields up to a reasonable level. Then the power thing... I am hoping they will be 200W or less. If it can beat a 980, use less than 200W, and cost the same or less than a 980, AMD has a winner and people again have viable choices from both companies. I have a feeling it will come in priced under the 980, and NVIDIA will then drop prices... just like it always works.

AMD pays devs too, Dejo... so make sure you share those underhanded feelings with AMD as well.

I wouldn't worry too much about the gaming world and expensive cards, honestly. I really believe that on that front, you are making more out of it than it really will be.
 
In the available leaks we can see about 10-15% higher performance on the 390X than the GTX 980, but much higher wattage (~100W more is expected), so even if these cards are faster I don't think anyone will overclock them as high on average air/water cooling. The R9 290X has similar power/thermal specs (if the leaks are correct) and it barely overclocks on more standard cooling.
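Taking those leaked numbers at face value, the efficiency gap is easy to put in figures. Everything below except the 980's 165W TDP is rumor, so treat it as a sketch:

```python
# Perf-per-watt sketch using the leaked numbers at face value. The 390X
# figures are hypothetical rumor; only the 980's 165 W TDP is an official spec.
gtx980_perf, gtx980_watts = 1.00, 165   # normalized performance, board power
r390x_perf, r390x_watts = 1.12, 265     # ~12% faster, ~100 W more (leaked)

perf_per_watt_980 = gtx980_perf / gtx980_watts
perf_per_watt_390x = r390x_perf / r390x_watts

# Even ~12% more speed doesn't offset ~100 W more draw in efficiency terms.
print(round(perf_per_watt_980 / perf_per_watt_390x, 2))  # ~1.43x in the 980's favor
```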
 
Playing Far Cry 4 last night, I saw max vram usage around 3.5GB. I set everything to the max settings, while still keeping it "playable". 1440P, Ultra (except: shadows->High, water->High, Anti-Aliasing->TXAAx2, Godrays->Volumetric Fog).

4690K @ stock, 8GB @ 2133Mhz, 970 G1 overclocked (as seen in the data)

Playing at these settings, I had relatively smooth gameplay ("smooth" compared to the same game with everything maxed, less smooth compared to how I usually play BF4). I also didn't see the consistent, on-the-minute "jump" that dejo mentioned earlier.

This is a great guide for Far Cry 4 Video Settings, by the way :)

Relevant snippet of my GPU-z output (all of the rest of the data points were near 100% GPU Load and less vram usage):

fc4.png

Though I don't remember noticing it while playing, there is an instance or two in the data where GPU load drops off considerably. Perhaps coincidentally, perhaps not, Memory Controller Load % also seems to "reset" at those points...

While the first of these "drops" occurs at 3418MB vram usage, the second occurs when the vram usage was at its maximum for the session (3514 and 3516 a few seconds before, then 3512 at the drop).

I find it a bit curious that there is almost a "wall" at 3500MB. I would think there would have been at least some time where more than that was required; when I had everything maxed out graphically, I saw the same. If I had to guess, I would say the game itself is polling the GPU for the amount of VRAM it can access and then chooses not to exceed that amount. That would explain why there's no stuttering here, unlike in other games which possibly don't set a limit like that. Again, just a guess.
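One way to check that guess against a sensor log would be something like the sketch below. The column names and rows are made up to echo the numbers in this post; a real GPU-z log has different headers and many more fields:

```python
import csv, io

# Sketch of checking for a VRAM "wall" in a sensor log. The header names and
# sample rows are hypothetical, echoing the values described in this post.
log = """vram_mb,gpu_load
3418,99
3514,97
3516,98
3512,31
"""

rows = [(int(r["vram_mb"]), int(r["gpu_load"])) for r in csv.DictReader(io.StringIO(log))]

peak_vram = max(v for v, _ in rows)          # does the session ever clear ~3500 MB?
drops = [(v, l) for v, l in rows if l < 50]  # load drop-offs like the ones in the graph

print(peak_vram)  # 3516
print(drops)      # [(3512, 31)]
```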

I'm going to try AC: Unity next.
 
When I tested with FC4 I think we had AA on MSAA 8x. I have a couple of screenshots of Afterburner: FarCry4GTX970Ultra.jpg FarCry4GTX970UltraMSAA2X.jpg

I don't have screens of the Titan on the same game, but FPS was similar, and the frametimes were nowhere near as spiked and didn't have the hitches.

I should also note that this was on a Z97 board with a 4790K running at 4.4GHz with 8GB of RAM at 2400 @ 10-12-12-31.
 
Thanks for the testing, Dejo... looks like you didn't hit 3.5GB, but certainly close enough. You may want to raise the max value on the RAM in the MSI AB graph so you can actually see whether the frametime spikes correlate with memory bumps close to the limit; with the limit set to 3072MB as it is, you can't see that.

Also, you mention 8xMSAA... is that an in-game setting? If it is, is that value on top of the default 'ultra' setting? It also appears you are running a triple monitor setup, which, coupled with the high MSAA, makes it no surprise you are seeing 3.5GB+ use.

I don't think anyone contends that at that res this card could be a problem. I think we are trying to pin down 1080p and perhaps 1440p, resolutions that in most cases will not hit the 3.5GB mark. At 4K and 5760x1080 we expect to see issues, especially with copious amounts of AA. What was your goal with testing at that res?

EDIT: In the other screenshot, you are not hitting 3.5GB (3.1GB), so you shouldn't be experiencing any issues. I have no idea why you are seeing these spikes. I wonder if it is system specific?
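For a rough sense of why surround resolution plus heavy MSAA eats VRAM, here is a back-of-envelope render-target estimate. It assumes 4 bytes of color plus 4 bytes of depth/stencil per sample and ignores textures, geometry, and driver overhead, so real usage is far higher:

```python
# Back-of-envelope render-target cost: width x height x MSAA samples x bytes
# per sample (4 B color + 4 B depth/stencil assumed). Textures, geometry and
# driver overhead are ignored, so real VRAM usage is far higher; this only
# shows why MSAA at surround resolutions bites.
def target_mb(width: int, height: int, msaa_samples: int, bytes_per_sample: int = 8) -> float:
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

print(round(target_mb(1920, 1080, 1)))  # ~16 MB at 1080p, no MSAA
print(round(target_mb(5760, 1080, 8)))  # ~380 MB triple-wide with 8x MSAA
```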
 
also should note that this was on a z97 board with 4790k running at 4.4ghz with 8gb of ram at 2400@10-12-12-31

I'll add other relevant system specs to my post above.

I definitely didn't see the spikes you guys saw. Maybe it's the non-Nvidia-proprietary anti-aliasing?

I'll make sure I grab the Afterburner graphs when I do the Unity testing.
 
Rather good explanation:
I really think that because 4 GB is an important marketing factor, they surely tried to hide the 3.5 GB issue. They sold over 2 million units, so the 970 was a big success, and it would clearly have been less of a success with the full spec revealed.
 
I really think that because 4 GB is an important marketing factor, they surely tried to hide the 3.5 GB issue.

Do you have results from your testing to contribute, Ivy? I think we're well beyond the speculation portion of this discussion...
 
My whole point is that there still has to be something going on to cause frametime spikes like that. In those screens you can see where the FPS went up; that is where there was no movement at all from the character in the game, and we still got the huge frametime spike. The Titan didn't exhibit this behavior with the same settings. The notepad to the right shows the actual times when the hitches occurred, so you can see the timestamp for each of them.
I would wager that there is something going on at the driver level that is causing this.
I do basically no gaming at all. My friend who does a lot of gaming said he would find the GTX 970 a nuisance compared to the Titan at the same settings, even though they were basically at the same framerate.
 