
[Rumor] Nvidia GTX 980 Ti or Titan X (GM200)

How quickly I forget (I linked that to Mags several days ago)! :rofl:

After re-reading it, I do not see where it implied anything larger than 256-bit was unnecessary, just that its compression features allow for greater apparent bandwidth.
 
Well, E_D, thing is that with 2K posts/day, you're allowed to forget sometimes!

I guess you're overheating due to your forum-browsing overclock!
 
How quickly I forget (I linked that to Mags several days ago)! :rofl:

LOL, yea I thought that was you ;)

That still doesn't say a wider bus than 256-bit is "unnecessary", it just explains how the new compression works ;)

After re-reading it, I do not see where it implied anything larger than 256-bit was unnecessary, just that its compression features allow for greater apparent bandwidth.

I agree it isn't explicitly stated (and why would it be when they know they're gonna try and sell us 384-bit 4 months later? :D).

white paper said:
This means that from the perspective of the GPU core, a Kepler-style memory system running at 9.3Gbps would provide effective bandwidth similar to the bandwidth that Maxwell’s enhanced memory system provides.

It's pretty much implied in there, though: Kepler at 384-bit provides the same effective bandwidth as Maxwell's "enhanced" 256-bit.
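For what it's worth, here's the back-of-envelope math behind that quote. The 9.3 vs 7.0 Gbps figures are from the white paper; reading their ratio as the average compression gain (and throwing in the 780 Ti for comparison) is just my own take, not anything NVIDIA states:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Raw memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# GTX 980: 256-bit bus, 7.0 Gbps GDDR5
raw_980 = bandwidth_gbs(256, 7.0)

# White paper: a Kepler-style system would need 9.3 Gbps to match, which
# implies roughly a 9.3 / 7.0 ~ 1.33x average gain from the new compression.
compression_gain = 9.3 / 7.0
effective_980 = raw_980 * compression_gain

# GTX 780 Ti for comparison: 384-bit at 7.0 Gbps
raw_780ti = bandwidth_gbs(384, 7.0)

print(f"GTX 980 raw: {raw_980:.0f} GB/s, effective: {effective_980:.0f} GB/s")
print(f"GTX 780 Ti raw: {raw_780ti:.0f} GB/s")
```

So on those numbers, the compressed 256-bit bus lands within spitting distance of a plain 384-bit Kepler bus, which is exactly the comparison the white paper is making.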
 
I agree it isn't explicitly stated (and why would it be when they know they're gonna try and sell us 384-bit 4 months later? :D).

I said this somewhere else, but the 970/980 are geared toward 1440p, while the flagship is aimed at 4K :)
It's pretty darn obvious by the rumored specs.
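Putting rough numbers on the 1440p-vs-4K split: 4K pushes 2.25x the pixels of 1440p and 4x those of 1080p, so any per-pixel cost (fill, shading, bandwidth) scales the same way. A quick sanity check:

```python
# Pixel counts for the resolutions being argued about.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

ratio_4k_1440 = pixels["4K"] / pixels["1440p"]   # 2.25x the work per frame
ratio_4k_1080 = pixels["4K"] / pixels["1080p"]   # 4.0x

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")
print(f"4K vs 1440p: {ratio_4k_1440}x, 4K vs 1080p: {ratio_4k_1080}x")
```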
 
I just took it as a comparison... I didn't go down the rabbit hole any further than that. But if you want me to take that trip, it makes sense to me BECAUSE of those words that a 384-bit bus would compete with AMD's 512-bit bus (which is made for 4K).

I agree that, at least it FEELS like, the 970/980 handle 2560x1440/1600, while their inevitable flagship will be marketed towards 4K.
 
I wonder why there were no benchmarks done on the GTX 970/980 series at high MSAA levels? Maybe 4xMSAA or 8xMSAA? How about some 4xSSAA or 8xSSAA transparency AA? Instead I saw benchmarks using low levels of FXAA or no AA at all.
I've seen my GTX 780 choke on 16xQ CSAA on Doom3 and Quake 4 (due entirely to the memory controller load), I wonder how the GTX 970/980 series would do at those levels of AA?
As for Nvidia's GTX 980 marketing blurbs, they mention "4K" throughout their press release.
 
As for Nvidia's GTX 980 marketing blurbs, they mention "4K" throughout their press release.

Nobody said the 980 WOULDN'T run 4K, just that the flagship is TARGETING 4K while the 980 targets 1440p.
 
I wonder why there were no benchmarks done on the GTX 970/980 series at high MSAA levels? Maybe 4xMSAA or 8xMSAA? How about some 4xSSAA or 8xSSAA transparency AA? Instead I saw benchmarks using low levels of FXAA or no AA at all.
I've seen my GTX 780 choke on 16xQ CSAA on Doom3 and Quake 4 (due entirely to the memory controller load), I wonder how the GTX 970/980 series would do at those levels of AA?
As for Nvidia's GTX 980 marketing blurbs, they mention "4K" throughout their press release.
My guess is they're aware that the differences from such settings are trivial at 1080p or above. Also, a lot of games don't give you the option to go that high in the first place. For example, BF4: 4xMSAA is it. You would have to force more AA from the NVCP, which can mess things up in some titles.

If you can notice the different AA's (above 4x) while you are moving about, color me impressed, and skeptical. :)

We are not playing on 800x600 anymore so copious amounts of AA aren't really needed.
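Rough sketch of why high MSAA leans on the memory controller harder as resolution climbs. The byte counts assume a simple 4-byte color + 4-byte depth/stencil per sample with no compression, which real hardware improves on, so treat these as ballpark upper bounds rather than actual VRAM usage:

```python
def msaa_framebuffer_mb(width, height, samples, bytes_per_sample=8):
    """Approximate color + depth/stencil framebuffer size in MiB at a given
    MSAA level, assuming 8 bytes per sample and no compression."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for label, (w, h) in [("1080p", (1920, 1080)),
                      ("1440p", (2560, 1440)),
                      ("4K",    (3840, 2160))]:
    for samples in (1, 4, 8):
        print(f"{label} {samples}x MSAA: {msaa_framebuffer_mb(w, h, samples):6.1f} MiB")
```

By this naive estimate, 8xMSAA at 4K is a ~500 MiB framebuffer that has to be read and written every frame, which is why bandwidth-heavy AA modes choke narrower buses.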
 
Who knows what is in the minds of the AIB's... where is our Galaxy rep??? :)

For the most part, though, a single 780 Ti or 980 won't do terribly well at 4K resolutions. You will likely need AA disabled, which is OK thanks to the high pixel density (fewer jaggies).

More memory and more bandwidth, regardless of the improvements NVIDIA made to its compression, will still help at 4K resolutions.
 
My guess is they're aware that the differences from such settings are trivial at 1080p or above. Also, a lot of games don't give you the option to go that high in the first place. For example, BF4: 4xMSAA is it. You would have to force more AA from the NVCP, which can mess things up in some titles.

If you can notice the different AA's (above 4x) while you are moving about, color me impressed, and skeptical. :)

We are not playing on 800x600 anymore so copious amounts of AA aren't really needed.

Who said anything about 800x600? "Copious amounts of AA aren't really needed" is nothing more than an opinion and a convenient excuse. Why do they bother allowing 16xQ CSAA if it's not really needed? Or 8xSSAA transparency AA? After all, video games aren't really needed either, are they?
 
Who said anything about 800x600? "Copious amounts of AA aren't really needed" is nothing more than an opinion and a convenient excuse. Why do they bother allowing 16xQ CSAA if it's not really needed? Or 8xSSAA transparency AA? After all, video games aren't really needed either, are they?
RE: 800x600... I was being a bit dramatic in trying to get across that the lower the resolution, the more AA is needed to cover the jaggies. Sorry I didn't make that 'read between the lines' statement more obvious. :)

Of course it's subjective, Magellan. Hang your hat on the fact that most people only use the in-game AA settings, which in a lot of titles don't go up very high (there are plenty of exceptions). And if you override the game's settings via the NVCP, that can cause visual problems or significant performance hits above and beyond what those options would cost in game. That is why it is always said to use the in-game settings and not 'force' anything on the GPU unless you need to.

SSAA has been out for quite some time now... since at least 2001 (when 800x600/1280x1024 were the popular resolutions, note).

Personally, with the games I play (quick-action FPS - BF4), I cannot notice a thing as far as AA goes when running around. I leave it at 4xMSAA (max in game). I can notice it lower. When I force higher through the control panel, I don't notice a thing and I get lower frames. I can see a greater need for, say, RPGs or RTS-type games, where it is slower and more static, allowing one to see the jaggies easier. Those games typically have higher in-game AA settings, though.

There are a lot of things in life that are not needed...but I am not sure what the point is there except to be snarky...
 
Is 1 980 not enough for 4k? Damn, I really want to upgrade my 670 4GBs, but don't have $1000 to spend on them.
 