Then why aren't the 8 GB AMD cards selling better? Well, besides the drivers.
With Crysis 3 on my 1080p monitor I really need a GTX 1070; my FPS dips into the 40s.
You're on a 970... get a 980 or 980Ti and you won't have those dips...
Now why did I not think of that? I think I will wait till Pascal and see what it brings.
Depends on the resolution really. At 4K now? 6GB+ is where I would be at. 1440p? At least 4GB (plenty of titles can eclipse that on Ultra settings). 1080p? 4GB is fine.
Agree. Sold my 970 and stuck with my R9 290 when I first went to 1440 resolution.
Thing is, VRAM use, for all intents and purposes, seems to go up year by year, so some forward thinking needs to be involved.
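For anyone who wants to sanity-check those per-resolution numbers, here's a rough back-of-the-envelope sketch in Python. The buffer formats and counts are my own illustrative assumptions, and keep in mind the raw framebuffers are tiny; it's textures and intermediate render targets at higher quality settings that actually push cards toward the 4GB/6GB marks.

[CODE]
# Rough back-of-the-envelope: raw framebuffer cost per resolution.
# Assumptions (mine, not from any spec): 8-bit RGBA color target,
# 32-bit depth/stencil, triple buffering. Textures and render
# targets are what really eat VRAM; this just shows how the base
# cost scales with pixel count.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BYTES_PER_PIXEL = 4   # 8-bit RGBA color
DEPTH_BYTES     = 4   # 24-bit depth + 8-bit stencil
BUFFER_COUNT    = 3   # triple buffering

for name, (w, h) in RESOLUTIONS.items():
    color = w * h * BYTES_PER_PIXEL * BUFFER_COUNT
    depth = w * h * DEPTH_BYTES
    total_mb = (color + depth) / (1024 ** 2)
    print(f"{name}: ~{total_mb:.0f} MB just for swap chain + depth buffer")
[/CODE]

4K works out to roughly 4x the 1080p figure, which lines up with why VRAM recommendations scale with resolution even before texture quality enters the picture.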
You're on a 970... get a 980 or 980Ti and you won't have those dips...
And those numbers actually make some sense. Think about it: as with everything, the average gamer isn't going to be thinking about 4K or 1440p, maybe not even about what panel a screen uses or its refresh rate. They're going to be thinking, "Can this monitor play the latest MOBA without looking too bad?" Most of my gamer friends, especially in college, would do one of the following:
A) Hook their PC up to their TV, so 1080p or 720p there,
or
B) Pick up a cheaper monitor, but not complete bargain basement. If you look around, the average 20-22" low to lower-mid range monitor is still your basic 1080p 60Hz TN-based screen.
Anyway, since all games are D3D now, I'm sure NV has made changes to get more performance from those types of engines. Driver tweaking that lowers quality in certain areas of the game we won't notice offers more performance that wouldn't be there if they just focused on quality all the way.
All games are D3D now? What world are you living in?
Other than ID, I don't know of any other gaming company that makes engines using OGL, do you? Everything else fits into MS's DX world, so it would be DirectX, or D3D. D3D in some games I played a long time ago is what they labeled as MS's DirectX; don't know why. I still use it from time to time...
Well, I asked a rather simple question, but the latter part of your post makes it sound like I said something other than what I said. I never said they sold HW just for DX; just from watching devs, I noticed that all AAA titles were DX, not OGL, unless they were from ID. As I said before, NV had better HW for OGL games than AMD, and I'm hoping that is still the case, since when I looked at benches for the newer NV/AMD cards, it was never noted which game was which, i.e. DX or OGL. Even when I looked into some games using the same engine as some benchmarks, it just seemed clear that all were DX based.

If the engines you mentioned above are OGL, then when did they switch UE to OGL? The last UE games I played were Unreal and then UT, and both of those, as far as I recall, were DX, not OGL. I'm not saying they didn't switch, but the lack of transparency about which engine is which is something that has been lost in reviews. Back in the day they would point out which games were OGL and which were D3D, showing a clear advantage for NV HW in OGL, whereas in DX-based games NV and AMD were rather close, and sometimes AMD was ahead.
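Part of why reviews stopped labeling the API is that modern engines hide it behind an abstraction layer, so the same game can ship with a D3D backend on Windows and an OGL one elsewhere. Here's a toy Python sketch of the idea; the names (RenderBackend, make_backend) are made up for illustration and don't come from any real engine.

[CODE]
# Toy sketch (not any real engine's code) of how an engine abstracts
# the graphics API, so "D3D vs OGL" becomes a backend choice rather
# than something baked into the game code itself.

from abc import ABC, abstractmethod

class RenderBackend(ABC):
    """What the game code sees; it never touches D3D or OGL directly."""
    @abstractmethod
    def draw_frame(self, scene: str) -> None: ...

class D3DBackend(RenderBackend):
    def draw_frame(self, scene: str) -> None:
        print(f"[Direct3D] drawing {scene}")

class OpenGLBackend(RenderBackend):
    def draw_frame(self, scene: str) -> None:
        print(f"[OpenGL] drawing {scene}")

def make_backend(api: str) -> RenderBackend:
    # Picked per platform or config file, not per game.
    backends = {"d3d": D3DBackend, "ogl": OpenGLBackend}
    return backends[api]()

if __name__ == "__main__":
    make_backend("d3d").draw_frame("level_1")
    make_backend("ogl").draw_frame("level_1")
[/CODE]

In a real engine the backend is usually a platform or config decision, which is why the game itself rarely advertises which one you're actually running.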
If there are any of you hoping that AMD dies, you either like paying a fortune for a GPU, or you REALLY don't understand how they help to keep NVIDIA's prices competitive... Personally, I hope AMD hits a home run, not only with their new GPUs, but also with Zen, when it finally releases. And if you were smart... so would you.
AMD driver quality and performance have been improving for a good while. They're definitely the GPU of choice for open source purists.
/boggle
What remains to be seen is whether or not they can deliver a good driver on the first day of release like Nvidia does. Historically, they weren't very good at it.