
False Specs on GTX 970?

Thanks again.

Why are you testing at a res we know will have problems, particularly with those settings? The thing here, at least in my mind, is 1080p/1440p testing and what will hit the 3.5GB limit...

It's a real issue; there isn't a doubt in my mind at 4K and 5760x1080. But that 1% of people (actually fewer than that, according to Steam's survey data) aren't the issue. It's the 1080p/1440p users who 'want to know' how this affects them, right?

Also, is 8xMSAA a setting in game? Is that on top of the default Ultra settings?
 

I can't remember what AA defaulted to when I picked Ultra, but 8xMSAA was in-game.

I think it's still worthwhile to have the results at the higher resolutions as well, if only to verify the occurrence. I think the difficult part at those resolutions is that the game is probably close to (or already) unplayable because of the demands on the card horsepower-wise.

This "3.5GB wall" is odd, and I'm curious to see if it shows up in other games as well (Ivy, got those Skyrim results?)
 
Thanks for the info.

Verification of what we were told... sounds good. But let's see some testing at the borderline res of 1440p (as you are doing) and at the 'chances are it's not a problem unless you mod the hell out of games or use supplemental AA from the NVCP' resolution of 1080p. :)
 
Mark's Titan at the same settings was playable and had similar FPS. We kept going up in resolution to see if we could indeed get more than 3.5GB of VRAM in use. His card at the same resolution was going up to 5.9GB.
 

!!!! That's an excellent piece of info :) Any chance you guys could test another demanding game and see if something similar occurs?

I'm wondering if Nvidia's driver or the games themselves aren't limiting how much vram they make use of based on what's "available" (only the fast portion).

I'll try out AC: Unity tonight. BF4 seems to show the "imposed" 3.5GB limit (though I don't remember seeing more than that used with my 290X or 290 either).
 
Will have to check; he has a full college load and sometimes it's hard to make time.
 

I will try and make time to do something similar with my 290X, comparing it to the 970. It will be a smaller margin of difference, though; just seeing if the 290X makes use of the full 4GB.
 
I can tell you that with my 295X2 I have seen at most around 7.2GB worth (it's how MSI AB shows it; divide by two, of course). And that is at 1440p, default Ultra (4xMSAA), AND with the resolution scale at 140%. If the res scale is left alone, I hit around 3GB.
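For anyone else reading dual-GPU numbers: if, as the post describes, the monitoring tool reports the combined total across both GPUs of a 295X2, the per-GPU figure is just the total divided by two. A quick sketch of that arithmetic (the function name and halving convention come from the post above, not from any MSI Afterburner documentation):

```python
def per_gpu_vram(reported_gb: float, num_gpus: int = 2) -> float:
    """Per-GPU VRAM usage when the monitoring tool reports the
    combined total across all GPUs of a multi-GPU card."""
    return reported_gb / num_gpus

# 7.2GB reported on the 295X2 at 1440p with 140% res scale:
print(per_gpu_vram(7.2))  # 3.6 -- per-GPU, right around the 970's 3.5GB boundary
```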
 
LOL, I saw that a couple of days ago... God bless the US and their overly litigious people... :screwy:
 
Good.

Teach them a nice lesson.

Luckily hobbyists have a lot of money to throw around lol



0.5GB of the RAM is much, much slower than the rest, and that was not advertised anywhere. People want what they pay for.
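To put rough numbers on "much, much slower": the figures widely reported after Nvidia's disclosure were ~196 GB/s for the 3.5GB segment (7 of 8 memory controllers) and ~28 GB/s for the 0.5GB segment. Here is a back-of-the-envelope sketch of average bandwidth once an allocation spills into the slow segment, assuming (unrealistically) uniform streaming over the whole working set:

```python
FAST_GB, SLOW_GB = 3.5, 0.5       # GTX 970 memory segments
FAST_BW, SLOW_BW = 196.0, 28.0    # GB/s, figures widely reported after the disclosure

def effective_bandwidth(working_set_gb: float) -> float:
    """Average streaming bandwidth over a working set that may spill
    past the 3.5GB fast segment into the 0.5GB slow segment."""
    fast = min(working_set_gb, FAST_GB)
    slow = min(max(working_set_gb - FAST_GB, 0.0), SLOW_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW  # time to stream the set once
    return (fast + slow) / seconds

print(round(effective_bandwidth(3.5), 1))  # 196.0 GB/s while inside the fast segment
print(round(effective_bandwidth(4.0), 1))  # 112.0 GB/s once the full 4GB is touched
```

Real workloads aren't uniform, and the driver tries to keep hot data in the fast segment, so this overstates the penalty; it's only meant to show why spilling past 3.5GB can hurt far more than the missing 12.5% of capacity would suggest.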

I am confused?

Good, teach them a lesson?

What happened? You are/were the biggest nvidia fanboy ever to grace these forums... Your dream is to work for nvidia, and now it's "good, teach them a lesson"?
 
It's amazing: shower him with facts and he doesn't budge, but then there's speculation about how this happened and he's immediately jumping ship. You just can't reason with that. :p

#tinfoilhat
 

It's called empathy. I have the same tier of card (generations older, but still of the GTX x70 mid-high end), and I would be absolutely pissed off if this happened to me. I am part of the so-called "1%" it would have affected, and I am currently in the market for new video card(s) as well... I dislike any shady practices, and nVidia is no angel. Hardware is where I draw the line: sure, they have (or had) a better software stack than AMD (CCC vs. Nvidia Control Panel), but what is the use of better software if your card is basically a lemon?

Also, www.bursor.com <- the law firm that may be suing Nvidia. Look at their cases: toothpaste, olive oil, freezer manufacturers...


I should probably change my sig.
 

As seen in the benchmarks from dejo and EarthDog, 4GB isn't even enough at 5760x1080 and 4K; they're using 5GB+ when it's available. The 970 is still a great card, whether it has 3.5GB or 4GB of VRAM.
 

Yeah, for 1080p it is, for most games, but it would certainly grind my gears if I wanted to play a AAA title in the future and didn't have enough VRAM, or if it was slow and degraded performance. Honestly, 4K/multi-monitor gamers should have gotten 980s; they are more suited to the task. A 980 is what I was looking at to replace my 670s with. It cuts both ways: Nvidia should have disclosed the slower 0.5GB of VRAM, but people who actually need to actively use every bit of RAM should have gotten a 980, or a Titan X or Titan Ti or whatever they're gonna call the new one/AMD equivalents.
 

As noted above, I'm at 1440p and, so far, haven't experienced any "stuttering" or any other form of degraded performance that can be attributed to a lack of VRAM.

They definitely should have made mention of it, but I don't know that it really makes a difference in any scenario except SLI @ 4K and in that scenario, as you mentioned, 970s shouldn't have been the choice anyway.
 