
FEATURED NVIDIA Readies Three GP104 "Pascal" Based SKUs for June 2016

Then why aren't the 8 GB AMD cards selling better? Well, besides the drivers. :)

Because VR isn't really here yet, at least not all over the place. There are plenty of people playing flight/racing/mech sims, and those are the ones that might use, say, 6 GB or more; some go all out with triple-monitor setups. There may be more, but I haven't read about any FPS games needing that much. The higher the resolution, the more VRAM you use, so if current games and cards are good enough, more isn't really needed. As fast as games and video cards come out, going for the most VRAM isn't going to help for gaming. Higher-VRAM cards are more likely to be bought by people using them for 3D rendering or other non-gaming apps that actually use that much, or close to that much, VRAM.
 
Depends on the resolution, really. At 4K now? 6GB+ is where I would be at. 1440p? At least 4GB (plenty of titles can eclipse that on Ultra settings). 1080p? 4GB is fine.

Thing is, VRAM use, for all intents and purposes, seems to go up year by year, so some forward thinking needs to be involved.
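
Rough back-of-envelope sketch (Python) of why resolution alone pushes VRAM up. The per-pixel byte counts and number of render targets below are made-up assumptions, not measurements from any game, and real engines add textures, shadow maps, etc. on top of this:

```python
# Toy estimate of per-frame render-target memory at common resolutions.
# All constants are illustrative assumptions, not figures from any real game.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

BYTES_PER_PIXEL = 4   # assumed RGBA8 render targets
RENDER_TARGETS = 5    # assumed G-buffer + HDR + post-process targets
DEPTH_BYTES = 4       # assumed 32-bit depth/stencil buffer

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    per_frame = pixels * (BYTES_PER_PIXEL * RENDER_TARGETS + DEPTH_BYTES)
    print(f"{name}: ~{per_frame / 2**20:.0f} MiB of render targets")
```

Even this toy math more than doubles from 1440p (~84 MiB) to 4K (~190 MiB) before a single texture is counted, which is why the "higher resolution, more VRAM" rule of thumb holds.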
Agree. Sold my 970 and stuck with my R9 290 when I first went to 1440 resolution.
 
And those numbers actually make some sense. Think about it: like everything, the average gamer isn't going to be thinking about 4K or 1440p. Maybe not even about what panel a screen uses or its refresh rate. They're going to be thinking, "Can this monitor play the latest MOAB game without looking too bad?" Most of my gamer friends, especially in college, would do one of the following -

A) Hook their PC to their TV, so 1080P or 720P there

or

B) Pick up a cheaper monitor, but not complete bargain basement. If you look around, the average 20-22" low to lower-mid-range monitor is still your basic 1080P 60Hz TN-based screen.

That is true. Most people on these forums are in the 1% that cares about gaming at 1440p, me included.
 
I won't be upgrading screens until somebody starts making 10-bit 4K >=90Hz Adaptive-Sync screens (on 2x 2560x1440 right now, but I only game on one; the other is for a web browser). My Vive will be here the day after tomorrow, though, so I may be in for yet another GPU upgrade far too soon after getting this Fury X, depending on how it does.

Anyway, since all games are D3D now, I'm sure NV has made changes to get more performance from those types of engines. Driver tweaks that lower quality in areas of the game we won't notice offer more performance than there would be if they just focused on quality all the way.

All games are D3D now? What world are you living in?
 
All games are D3D now? What world are you living in?
Other than id, I don't know any other gaming company that makes engines using OGL, do you? Everything else fits into MS's DX world, so it would be DirectX, or D3D. D3D is what some games I played a long time ago labeled MS's DirectX, don't know why. I still use it from time to time.
 
Other than id, I don't know any other gaming company that makes engines using OGL, do you? Everything else fits into MS's DX world, so it would be DirectX, or D3D. D3D is what some games I played a long time ago labeled MS's DirectX, don't know why. I still use it from time to time.

Unity and Unreal Engine 4, everything cross-platform on Steam, and everything on Android. The current nightly builds of Starbound are OpenGL-only, even on Windows. nVidia doesn't just sell hardware for DirectX: they sell Tegra for Android and Tesla for HPC.
 
Well, I asked a rather simple question, but the latter part of your post makes it sound like I said something other than what I said. I never said they sold HW just for DX. Just from watching devs, I noticed that all AAA titles were DX, not OGL, unless they were from id. As I said before, NV had better hardware for OGL games than AMD, and I'm hoping that is still the case, since when I looked at benches for the newer NV/AMD cards it was never noted which game was which, i.e. DX or OGL. Even when I looked into some games using the same engine as some benchmarks, it just seemed clear that all were DX-based. If the engines you mentioned above are OGL, then when did they switch UE to OGL? Because the last UE games I played were Unreal and then UT, and both of those, as far as I recall, were DX, not OGL. I'm not saying they didn't switch, but the lack of transparency about which engine is which is something that has been lost in reviews. Back in the day they would point out which games were OGL and which were D3D, showing a clear advantage for NV hardware in OGL, whereas in DX-based games NV and AMD were rather close and sometimes AMD was ahead.
 
Well, I asked a rather simple question, but the latter part of your post makes it sound like I said something other than what I said. I never said they sold HW just for DX. Just from watching devs, I noticed that all AAA titles were DX, not OGL, unless they were from id. As I said before, NV had better hardware for OGL games than AMD, and I'm hoping that is still the case, since when I looked at benches for the newer NV/AMD cards it was never noted which game was which, i.e. DX or OGL. Even when I looked into some games using the same engine as some benchmarks, it just seemed clear that all were DX-based. If the engines you mentioned above are OGL, then when did they switch UE to OGL? Because the last UE games I played were Unreal and then UT, and both of those, as far as I recall, were DX, not OGL. I'm not saying they didn't switch, but the lack of transparency about which engine is which is something that has been lost in reviews. Back in the day they would point out which games were OGL and which were D3D, showing a clear advantage for NV hardware in OGL, whereas in DX-based games NV and AMD were rather close and sometimes AMD was ahead.

They didn't switch. They support both, because they're cross-platform engines.

Original Unreal (and Tournament) defaulted to DirectX (v6 at the time, I think) because the OpenGL driver at release time was buggy. Some third party released updated drivers for OpenGL and DX9.

I'd bet that given the number of "casual gamers" with Android-based devices, there are probably orders of magnitude more OpenGL systems in use than DirectX systems, and as long as nVidia is selling Shields, I don't think they're going to focus exclusively on DX performance, which is what I thought you implied earlier.
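
To illustrate the "they support both" point: a cross-platform engine doesn't switch APIs, it picks a backend per platform at startup and usually lets a launch flag override it. This is purely a sketch; the table, flag, and function names below are invented for the example, not taken from Unity or UE4:

```python
# Illustrative sketch only: how a cross-platform engine might choose a
# rendering backend. Names, defaults, and the flag are invented here.
SUPPORTED_BACKENDS = {
    "windows": ["d3d11", "opengl", "vulkan"],
    "linux":   ["opengl", "vulkan"],
    "android": ["opengl_es", "vulkan"],
}

def pick_backend(platform, forced=None):
    options = SUPPORTED_BACKENDS[platform]
    if forced in options:   # e.g. a hypothetical "-force-opengl" launch flag
        return forced
    return options[0]       # platform default: D3D on Windows, GL elsewhere

print(pick_backend("windows"))            # d3d11
print(pick_backend("windows", "opengl"))  # opengl
print(pick_backend("android"))            # opengl_es
```

Same engine, same game code, different backend per target, which is part of why reviews stopped labeling games as "DX" or "OGL" the way they used to.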
 
If any of you are hoping that AMD dies, you either like paying a fortune for a GPU, or you REALLY don't understand how they help keep NVIDIA's prices competitive. Personally, I hope AMD hits a home run, not only with their new GPUs but also with Zen when it finally releases. And if you were smart... so would you.

/boggle
AMD driver quality and performance have been improving for a good while. They're definitely the GPU of choice for open source purists.
http://www.phoronix.com/scan.php?page=article&item=end-april-amdnv&num=2
What remains to be seen is whether or not they can deliver a good driver on the first day of release like Nvidia does. Historically, they weren't very good at it. There's also the fact that they talked a lot about, and contributed a lot to, Vulkan, yet Nvidia, Intel, Qualcomm, and even IMG beat them to delivering working drivers.
 
What remains to be seen is whether or not they can deliver a good driver on the first day of release like Nvidia does. Historically, they weren't very good at it.

I'm still waiting for a good driver for an R7 260X. :bang head
 