
Kinda BS performance of a modern system in DX9 — anybody else experience this?


NewbieOneKenobi · Member · Joined Nov 14, 2006 · Location: Warsaw, Poland
It's not the strongest modern system you could build, but it's stronger than pretty much anything that existed in 2009 on the mainstream market: i5-6600, DDR4-3000 CL14, R9 280X. If you believe the benchmarks, the strongest duo/quad/Extreme that existed in 2008/2009 with a 4890X2 couldn't offer even the same power as an i5 with a 280X… or could it? Modern cards often end up scoring poorly in DX9 game benchmarks, but graphics isn't all there is to it. In short, I'm often getting 30-ish fps and sometimes <20 fps (when more's going on) in Dragon Age 1(!) when I try forcing SSAA 16x, aniso 16x etc. @ 1920x1080, enhancing application settings through Crimson. With VSR @ 1440p it's 5 to 10 fps less.

I could probably give up on supersampling, drop down to application settings and get better fps that way, I guess, even @ 1440p, but this still has me thinking. The settings are pretty basic. DX9 is old. 1080p is old. And besides, I remember playing DA1 at max application settings (just without much enhancing via driver) @ 1440p on a fricken Core 2 Duo with the same graphics card, several weeks ago, and it's not like it was much slower with an HD4850 before I replaced the graphics card.

I've gotta say this is quite underwhelming. I realize DA1/DX9 isn't going to put all physical cores to use, possibly just one, but even in single-core performance a Skylake i5 is supposed to be much stronger than a Core 2 Duo from 2008. I try to take solace in the hope that DX10–12 will probably show improvement and get me some decent fps, but come on, here I was thinking that even an i3 with a 750 Ti would dominate older games @ 1080p. Argh.

So what's your experience comparing modern systems to those from a decade ago in games, especially older games but not only?
 
Were you "forcing SSAA 16x, aniso 16x etc. @ 1920x1080, enhancing application settings through Crimson" back in 08/09? Dragon Age: Origins, right?
 
Forcing/using 16x SSAA is killing you. The 16x AF won't make much of a difference at all performance-wise (EVERYONE should have this cranked). You may even be breaking the 3GB barrier on your card, depending on your other settings and whether you have any mods on the game...
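Some napkin math on what forcing 16x SSAA actually asks of the card (a rough sketch; the exact cost depends on how the driver implements it, and the buffer estimate assumes plain 32-bit color + 32-bit depth per sample):

```python
# Napkin math for ordered-grid 16x SSAA at 1080p.
width, height = 1920, 1080
pixels = width * height                     # 2,073,600 pixels on screen

samples = pixels * 16                       # 16 shaded samples per pixel
print(f"Shaded samples per frame: {samples:,}")                  # 33,177,600

# 16x ordered-grid SSAA = rendering internally at 4x width and 4x height:
print(f"Equivalent internal resolution: {width*4}x{height*4}")   # 7680x4320

# Rough buffer cost: 32-bit color + 32-bit depth per sample.
print(f"Color+depth buffers: ~{samples * 8 / 1024**2:.0f} MB")   # ~253 MB
```

That's 8K's worth of shading per frame before textures and mods are even counted, so 20-30 fps on a 280X isn't shocking.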

EDIT: He said Dragon Age 1, which I am assuming is not Origins??


I don't play older games like that really. About the oldest game I have installed is BF4.
 
EDIT: He said Dragon Age 1, which I am assuming is not Origins??

No idea, that was the first one I ever heard of.

I still play a heavily modded (4096 textures instead of the stock 512/1024) Mass Effect 3 w/ SGSSAA 4x for zero jaggies, and I get a rock-steady 60 fps thanks to my 6700K; my old 8370 @ 4.7 GHz failed to keep up.
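For scale, here's roughly what those texture bumps cost in VRAM (a sketch assuming DXT5/BC3 compression at 1 byte per texel and a full mip chain; uncompressed formats would be about 4x this):

```python
# Approximate VRAM cost of a texture at different resolutions, assuming
# DXT5/BC3 compression (1 byte per texel) and a full mipmap chain (~4/3 overhead).
def texture_mib(side, bytes_per_texel=1.0, mip_overhead=4/3):
    return side * side * bytes_per_texel * mip_overhead / 1024**2

for side in (512, 1024, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.2f} MiB")
# 512x512:   ~0.33 MiB
# 1024x1024: ~1.33 MiB
# 4096x4096: ~21.33 MiB  (16x the footprint of a 1024 texture)
```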
 
Were you "forcing SSAA 16x, aniso 16x etc. @ 1920x1080, enhancing application settings through Crimson" back in 08/09? Dragon Age: Origins, right?

No, I wasn't doing that back in 08/09. Couldn't remotely afford the kind of PC for that sort of thing back then. However, I eventually ended up with a lapped E8600 OC'd to 4 GHz and beyond on that mobo and went from an HD4850 (already manageable) to an R9 280X. The game ran fine. I wasn't exactly able to get fluid framerates with SSAA enabled, of course, but adaptive AA at 8x and VSR @ 1440p was doable.

EDIT: He said Dragon Age 1, which I am assuming is not Origins??

Sorry, I forgot DA2 is the only one in the entire trilogy with a number in the title. :)

Forcing/using 16x SSAA is killing you. The 16x AF won't make much of a difference at all performance-wise (EVERYONE should have this cranked). You may even be breaking the 3GB barrier on your card, depending on your other settings and whether you have any mods on the game...

I'll probably try FSAA 16x & VSR 1440p to see the fps. Would be a fairer comparison to the retired C2D rig, perhaps. Still, as I recall, SSAA @ 1080p was more or less as doable on it as it is on the current rig (same GPU, different CPU), and SSAA @ 1440p equally not doable. I expected a noticeable gain from Wolfdale to Skylake, even in single-/dual-core performance.
 
Curious, do you have your GPU running at max Core/Mem Freq at all times or is it varying each second?

You can view your Frequency over time using GPUz or MSI Afterburner.
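If you'd rather log it than watch a graph, the idea is just to poll the core clock once a second while the game runs and check the spread. A sketch; read_core_clock_mhz is a hypothetical stand-in for whatever sensor source you use (Afterburner's monitoring log, GPU-Z's shared memory, a vendor API like ADL on AMD):

```python
import random
import time

def read_core_clock_mhz():
    # Hypothetical stand-in: in practice, pull this from MSI Afterburner's
    # monitoring log, GPU-Z's shared memory, or a vendor API (ADL on AMD).
    # Simulated here so the sketch runs.
    return random.choice([501, 870])

# Poll once per second for a minute and report the spread,
# to spot the card idling at 2D clocks mid-game.
samples = []
for _ in range(60):
    samples.append(read_core_clock_mhz())
    time.sleep(1)

print(f"min {min(samples)} MHz, max {max(samples)} MHz, "
      f"avg {sum(samples)/len(samples):.0f} MHz")
```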
 
Those functions are handled by the GPU, not the CPU.

There isn't a need to run so much AA at 2560x1440. I can't notice much improvement past 4x MSAA...
 
No, I wasn't doing that back in 08/09. Couldn't remotely afford the kind of PC for that sort of thing back then. However, I eventually ended up with a lapped E8600 OC'd to 4 GHz and beyond on that mobo and went from an HD4850 (already manageable) to an R9 280X. The game ran fine. I wasn't exactly able to get fluid framerates with SSAA enabled, of course, but adaptive AA at 8x and VSR @ 1440p was doable.

As far as I know (please correct me if I'm wrong) all the anti-aliasing is done by the GPU, so if it's weak you've got no chance regardless of CPU. With a 280X I don't think I would use anything above MSAA 4x or maybe SGSSAA 2x @ 1080p to get a steady 30/60 fps (using this link to compare it against mine), but I have never had one, so YMMV :( http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-280X/3439vs2192

I downloaded Origins and will try it out to see what can be tweaked. Have you tried an FXAA injector? I used one to great effect in Kingdoms of Amalur: Reckoning; it allowed me to tweak lighting, colors and a few other options for DX9.
 
As far as I know (please correct me if I'm wrong) all the anti-aliasing is done by the GPU, so if it's weak you've got no chance regardless of CPU. With a 280X I don't think I would use anything above MSAA 4x or maybe SGSSAA 2x @ 1080p to get a steady 30/60 fps (using this link to compare it against mine), but I have never had one, so YMMV :( http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-280X/3439vs2192

I downloaded Origins and will try it out to see what can be tweaked. Have you tried an FXAA injector? I used one to great effect in Kingdoms of Amalur: Reckoning; it allowed me to tweak lighting, colors and a few other options for DX9.

Would that really be a good baseline, given your GPU is like 3 gens newer and like 3x as powerful?
 
Would that really be a good baseline, given your GPU is like 3 gens newer and like 3x as powerful?

Moot atm. Just tested the game and... well... 250+ fps in game, 350+ fps in cutscenes, 4000+ fps on videos according to Fraps @ 1080p, max settings, MSAA 8x?

[Screenshots: Clipboard02.jpg, Clipboard01.jpg]

140+ fps in game, 250+ fps in cutscenes, 1000+ fps on videos according to Fraps @ 4K, max settings, MSAA 8x.

[Screenshots: Clipboard05.jpg, Clipboard04.jpg]

Doesn't look like the type of game that would be hampered even by a 280X?
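Those numbers also say something about where the bottleneck sits; a quick sanity check on the figures above:

```python
# If the game were purely GPU (fill-rate) bound, fps should drop roughly
# in proportion to pixel count when going from 1080p to 4K.
pixel_ratio = (3840 * 2160) / (1920 * 1080)   # 4.0x the pixels
fps_ratio = 250 / 140                         # only ~1.8x slower

print(f"{pixel_ratio:.1f}x pixels, {fps_ratio:.1f}x fps drop")
# 4x the pixels but under 2x the fps drop: at 1080p this rig is
# mostly CPU/engine limited, not GPU limited.
```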
 
Curious, do you have your GPU running at max Core/Mem Freq at all times or is it varying each second?

You can view your Frequency over time using GPUz or MSI Afterburner.

I'd need a continuous on-screen display to be sure, but HWMonitor shows the full GPU clock of 870 MHz right after alt-tabbing out of the game, before it drops to 501 MHz. Memory always stays at 1500 MHz.

As far as I know (please correct me if I'm wrong) all the anti-aliasing is done by the GPU, so if it's weak you've got no chance regardless of CPU. With a 280X I don't think I would use anything above MSAA 4x or maybe SGSSAA 2x @ 1080p to get a steady 30/60 fps (using this link to compare it against mine), but I have never had one, so YMMV :( http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-280X/3439vs2192

Looks like I overestimated the power of the 280X, even vs games that were already 2 years old when it was released at the top of the mid-range. Crap. Still underwhelmed by the CPU apparently not making much difference. Might as well have skipped Skylake.

I downloaded Origins and will try it out to see what can be tweaked

Thanks.

Have you tried an FXAA injector? I used one to great effect in Kingdoms of Amalur: Reckoning; it allowed me to tweak lighting, colors and a few other options for DX9.

Nope. Sounds interesting.
 
You do not need an OSD... use the graph that MSI Afterburner has... start it up before the game, play the game for several minutes and look at the graph for clock speeds.

But regardless, it's good news that when you alt-tab out it's running at its rated speeds.
 
Is that the right game? :rofl:

I guess it works in modern systems! Wonder if it's W10?

Seems to be, otherwise OP would've said something by now (I Wiki'ed just in case). Works perfectly well in Win8.1, but seemingly needs the LAA (Large Address Aware) patch to run in Win10 for some reason, or adding -dx9 to your launcher?
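For reference, the LAA patch just flips the "Large Address Aware" flag in the exe's PE header so the 32-bit game can address more than 2 GB. A minimal sketch of the same thing with Python's pefile library (the path is illustrative; back up the exe first):

```python
import pefile

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020   # PE FILE_HEADER Characteristics flag

exe_path = "DAOrigins.exe"   # illustrative path; point at your actual exe

pe = pefile.PE(exe_path)
pe.FILE_HEADER.Characteristics |= IMAGE_FILE_LARGE_ADDRESS_AWARE
pe.write(exe_path)           # rewrite the exe with the LAA flag set
pe.close()
```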

Looks like I overestimated the power of the 280X, even vs games that were already 2 years old when it was released at the top of the mid-range. Crap. Still underwhelmed by the CPU apparently not making much difference. Might as well have skipped Skylake.

It's not all gloom and doom, all the settings I used are from the game's control panel. You could try to stop using the AMD CP for testing and use only your normal resolution + MSAA 8x from the game to see how it looks for you (I didn't see any jaggies @ 1080p). As Wagex said, I have near enough 3x your horsepower, BUT even then you should be pulling ~80 fps with the settings I used, so yay, I suppose. There also seems to be an HD texture pack at NexusMods, but it would be up your alley to find out, if you can get a stable frame rate 1st.

http://www.nexusmods.com/dragonage/...age/ajax/moddescription/?id=15&preview=&pUp=1 - HD texture mod
http://www.nexusmods.com/dragonage/...e/ajax/moddescription/?id=3869&preview=&pUp=1 - Texture mod

http://www.nexusmods.com/dragonage/mods/3653/? - lighting mod

This game was apparently designed to use 2 cores, but seems to use 4 or more to some extent if they're available :thup:
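If anyone wants to check that core-usage claim themselves, sampling per-core load while the game runs is enough; a quick sketch using the psutil library:

```python
import psutil

# Track the peak load seen on each logical core over ~30 seconds
# while the game is running in the foreground.
peaks = [0.0] * psutil.cpu_count()
for _ in range(30):
    loads = psutil.cpu_percent(interval=1, percpu=True)
    peaks = [max(p, l) for p, l in zip(peaks, loads)]

for core, peak in enumerate(peaks):
    print(f"core {core}: peak {peak:.0f}%")
# Only 1-2 cores pegged near 100% => effectively a dual-threaded game.
```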
 
Yeah, if one has AA enabled in-game, then pouring 16x of another AA on top of it... makes so much sense.
 
I assumed OP went through the game's CP/menu and turned it off before tinkering? Regardless, the game's own MSAA looks good enough if you look at the screenies; the only thing that should be forced in the AMD/nVidia CP is 16x Anisotropic Filtering, because the game itself doesn't seem to have an option for it.
 
Seems to be, otherwise OP would've said something by now (I Wiki'ed just in case). Works perfectly well in Win8.1, but seemingly needs the LAA (Large Address Aware) patch to run in Win10 for some reason, or adding -dx9 to your launcher?

Ugh, sorry about that. Yes, Win 10 it is.

It's not all gloom and doom, all the settings I used are from the game's control panel. You could try to stop using the AMD CP for testing and use only your normal resolution + MSAA 8x from the game to see how it looks for you (I didn't see any jaggies @ 1080p). As Wagex said, I have near enough 3x your horsepower, BUT even then you should be pulling ~80 fps with the settings I used, so yay, I suppose.

I can get the game down to 20 fps and occasionally even less if I max out the driver settings, such as 8xEQ SSAA with edge-detect and everything else I can think of. Leaving AA under application control gets me 60–80 fps, with aniso 16x, driver-side high-quality texture filtering and morphological filtering enabled. Replacing the application settings with adaptive multisampling bumps the framerate above 100 without particularly noticeable degradation. Also, regardless of which settings I use, it seems 'enhance application settings' is slower than 'replace'.

I'm still playing with SSAA @ 1080p and often 30-ish fps, pretty much because I don't really mind 30; I'm more into max settings at 30 than lower settings at 60, personal preference. It just gives me some comfort that it's the supersampling that's so taxing on the system, and not that the PC is slow in DX9 overall.
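The trade-off in frame-time terms, for what it's worth:

```python
# Per-frame time budget at various framerates: accepting 30 fps instead of
# 60 literally doubles the time the GPU gets to spend supersampling a frame.
for fps in (20, 30, 60, 100):
    print(f"{fps:3d} fps -> {1000 / fps:.1f} ms per frame")
#  20 fps -> 50.0 ms
#  30 fps -> 33.3 ms
#  60 fps -> 16.7 ms
# 100 fps -> 10.0 ms
```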

This game was apparently designed to use 2 cores, but seems to use 4 or more to some extent if they're available :thup:

Probably explains why the C2D was quite capable. :)
 
The 16x AF won't make much of a difference at all performance-wise (EVERYONE should have this cranked).

QFT, even a 2002/2003 mid-range Radeon does well with 16x AF, LOL... AF hardly has a performance hit, if any! I've known that for over 10 years!
 
I can get the game down to 20 fps and occasionally even less if I max out the driver settings, such as 8xEQ SSAA with edge-detect and everything else I can think of. Leaving AA under application control gets me 60–80 fps, with aniso 16x, driver-side high-quality texture filtering and morphological filtering enabled. Replacing the application settings with adaptive multisampling bumps the framerate above 100 without particularly noticeable degradation. Also, regardless of which settings I use, it seems 'enhance application settings' is slower than 'replace'.

I'm still playing with SSAA @ 1080p and often 30-ish fps, pretty much because I don't really mind 30; I'm more into max settings at 30 than lower settings at 60, personal preference. It just gives me some comfort that it's the supersampling that's so taxing on the system, and not that the PC is slow in DX9 overall.

OK, I'm very curious now: if you get zero or near-zero jaggies with only 8x MSAA, like I showed you (possibly even less, haven't tried, and it will play well on yours), why are you trying out other settings like SGSSAA? As you've seen for yourself, the performance hit is massive for near enough the same quality.
 