
Ashes of the Singularity - benchmark


Kenrou

Member
Joined
Aug 14, 2014
Running the GoG.com version 2.0.1.5 (still BETA) on W8.1 with DX11. Interesting to see the CPU not even warming up while the GPU runs full tilt to keep up. Installing W10 TH2 tomorrow to see the differences with DX12.

Clipboard01.jpg

Clipboard06.jpg

Clipboard07.jpg

Clipboard08.jpg

GPU mem.jpg
 
Ok so, W10 Pro x64 1511 (build 10586), DX12. More eye candy, but even though BETA is BETA, I was expecting a tad more than this. Roughly the same CPU and GPU levels, a tad more heavy-batch fps, and the one thing that stood out was "GPU D3D Memory Dynamic", which shot up.

Clipboard01.jpg

Clipboard03.jpg

Clipboard04.jpg

Clipboard05.jpg
 
Interesting results. I should test it on mine and post here if you don't mind. Hope to do it tonight.

I know DX12 is supposed to really shine when there are LOTS of units (entities) on the screen at once. Maybe the benchmark doesn't really show that as much.
Maybe in your case the CPU (AMD) is limiting the GPU from shining more.
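To put some rough numbers behind the "lots of units" point: the usual explanation is that each unit costs draw calls, and DX12's main win is cutting the per-draw-call CPU overhead. Here is a toy model of that; every number in it is invented for illustration, not measured from this benchmark.

```python
# Toy model: CPU-side frame cost = fixed game logic + per-draw-call submission
# overhead. DX12's claimed advantage is a much smaller per-call cost, so the
# CPU-imposed fps ceiling falls off far more slowly as unit counts grow.
# All costs below are hypothetical illustration values.

def cpu_frame_ms(units, per_draw_us, fixed_ms=2.0):
    """CPU milliseconds per frame, assuming one draw call per unit."""
    return fixed_ms + units * per_draw_us / 1000.0

def max_cpu_fps(units, per_draw_us):
    """Highest fps the CPU side alone would allow."""
    return 1000.0 / cpu_frame_ms(units, per_draw_us)

# Hypothetical per-draw-call CPU overhead (microseconds).
DX11_US, DX12_US = 40.0, 8.0

for units in (200, 2000, 10000):
    print(units,
          round(max_cpu_fps(units, DX11_US), 1),
          round(max_cpu_fps(units, DX12_US), 1))
```

With a few hundred units the two ceilings are close; at thousands of units the DX11-style overhead dominates, which would match a benchmark only showing a big DX12 gap in the heavy-batch runs.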
 
Alaric, the link you mention above explains my experience to a T with DX11 vs DX12 on an AMD card.


Alright, got some results...

First, the settings... same for all the tests.
Ashes-Game Settings.jpg

Second, GPU speeds and temp results (3°C higher with DX12... but the runs were pretty close back to back).
Ashes-GPUTemps.jpg

Next up, the GPU sensor results... as mentioned, it uses dynamic D3D memory with DX12... quite a bit of it.
Otherwise GPU utilization was identical... 100% across the board for both.
Ashes-DX11-GPU.jpg
Ashes-DX12-GPU.jpg

Now the CPU results. I have just a quad-core i5 @ 4.5GHz. Here is what I found interesting, and in line with what DX12 is supposed to do: it utilizes the CPU cores more efficiently. Just shy of 6% more CPU utilization using DX12.

DX11
Ashes-DX11-CPU.jpg

DX12
Ashes-DX12-CPU.jpg
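A small sketch of why "uses the cores more efficiently" shows up as a higher average utilization number: the same total render work, serialized on one core versus spread across four, reads very differently in a monitoring tool. The per-core loads below are invented for illustration, not taken from the screenshots above.

```python
# Toy illustration: average CPU utilization as a monitor would report it.
# DX11-style drivers tend to funnel submission through one render thread;
# DX12 lets command-list recording spread across cores. Loads are made up.

def avg_utilization(per_core_loads):
    """Whole-package utilization: mean of the per-core percentages."""
    return sum(per_core_loads) / len(per_core_loads)

# DX11-style: one saturated render thread, lighter game work elsewhere.
dx11 = [100, 35, 30, 30]
# DX12-style: similar total work, but spread across all four cores.
dx12 = [60, 55, 55, 50]

print(avg_utilization(dx11))  # 48.75
print(avg_utilization(dx12))  # 55.0
```

Note the saturated core in the first case caps the frame rate even though the package average looks lower; the few-percent average bump under DX12 is a side effect of un-bottlenecking that one thread.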

And now the real results... Let's just say DX12 whoops DX11 and then some...
Ashes-DX11-Bench.jpg
Ashes-DX12-Bench.jpg
 
A little extra fun... benched 4K at the same settings with just DX12.

My GPU is the limit

CPU: utilization went down to 93.4% overall, so right in the middle of the DX11 and DX12 results at 1080p.
Ashes-DX12-CPU-4k.jpg

GPU: well, it got hotter, going up to 76°C this time... memory utilization went up slightly, but dynamic memory was down a little.
Ashes-DX12-GPU-4k.jpg

Now the bench: well, it took a hit vs 1080p, about half the FPS... but it looked A LOT nicer on the screen.
Ashes-DX12-Bench-4k.jpg
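The 4K numbers fit the classic bottleneck picture: each frame waits on the slower of the CPU and GPU stages, so once the GPU becomes the limit the CPU idles for part of every frame and its utilization drops. A minimal sketch of that model, with hypothetical stage times (not measured from these runs):

```python
# Rough bottleneck model: frame time is set by the slower of the two stages,
# so fps and CPU busy-time both fall out of two numbers. At 4K the GPU has
# roughly 4x the pixels of 1080p, so its stage time balloons. Times invented.

def fps(cpu_ms, gpu_ms):
    """Frames per second when each frame waits for the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def cpu_busy_fraction(cpu_ms, gpu_ms):
    """Fraction of each frame the CPU is actually working."""
    return cpu_ms / max(cpu_ms, gpu_ms)

# 1080p: CPU and GPU roughly balanced.
print(fps(12.0, 14.0), cpu_busy_fraction(12.0, 14.0))
# 4K: same CPU work per frame, but double the GPU time -> half the fps,
# and the CPU now sits idle for most of each frame.
print(fps(12.0, 28.0), cpu_busy_fraction(12.0, 28.0))
```

Doubling only the GPU stage time halves the fps while the CPU busy fraction drops, which is the same shape as the 4K results above: lower CPU utilization, roughly half the frame rate.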
 
Nice! And this, added to various tests and reviews, and the article I linked, is why my next rig is 99% likely to be Team Blue. And I may jump ship to Green, too. Zen may close the gap by a lot (dumping the failed BD architecture), but it isn't likely to be in time to counter the above results.
And AMD's bragging about their "advantage" with DX12 just blew up in the CPU division's face. Way to shoot yourselves in the foot, guys. Geez. Looks like my hexacore, 4400MHz FX has a fine future in an HTPC. LOL
 
And my poor CPU is 5 years old this month.... I should build a new rig :)
 
I saw deathman20's results and immediately thought of my poor results. If your sig is up to date, you have effectively the same GPU as I do.
My system runs that benchmark at the same game settings at around 28fps. So I went into the BIOS to reset everything and ran the benchmark again, with the same result.
Then I OCed the GPU to 1100/1300 and ended up with 30fps.
Back into the BIOS, OCed the CPU to 4.8GHz, and still 30fps.

There is something funny with that test and the FX processor.

DX11 1080p
1080p DX11 new patch.jpg


DX12 1080p
1080p DX12 new patch.jpg


DX12 4.8GHz - 1100/1300MHz 1080p
1080p DX12 GPU+CPU OC.jpg


DX12 1440p
1440p DX12 new patch.jpg
 
The FX is really that far behind. If you follow the link in the fourth post, there is a good read that covers the situation pretty well. DX12 is no friend to the FX. Zen should be an improvement. And I should be rich and good lookin', and that hasn't worked out very well either. :)
 
Is it DX12, or is it the CPU just being used more with DX12, so it is showing its inherent weaknesses more?

Wait, that is like the, 'if a tree falls in the forest and nobody is there, does it make a sound' conundrum... :rofl:
 
To be fair, my CPU sits around 30-40% utilization during the benchmark. There must be something else wrong.
 
Nevermind, figured it out.


here's my results =O
TYFmifF.jpg
1080p

my cpu was also only 30-40% the whole benchmark though
all cores being used


this was the cpu version:
CLOTh22.jpg
cpu usage generally stayed around 50% though, idk what's up with that
 
Is it DX12, or is it the CPU just being used more with DX12, so it is showing its inherent weaknesses more?

Wait, that is like the, 'if a tree falls in the forest and nobody is there, does it make a sound' conundrum... :rofl:

What he said. :) Even if it's caused by an evil leprechaun that ships with every FX, the end result is the same: DX12 is better with Intel, and by a pretty noticeable margin. So far. One beta doesn't make the entire case, but the testing seemed pretty comprehensive, and the disparity is wide enough to make me very pessimistic about Team Red's ability to take full advantage of the new tech. Isn't DX12 supposed to be close to Mantle in application? And AMD developed Mantle. Red stepped on their willy hard with this one. If Zen takes another six months, will they be able to overcome the perception that Intel = DX12? Even if the new chip can compete, they face a very uphill climb. IMO
 
The FX might be sluggish, but it isn't that slow. With better multicore support it should fare quite well.
Looking at the linked TechSpot article, there isn't that much of a difference in performance at high quality settings for AMD GPUs or CPUs compared to NVIDIA or Intel.
Intel CPUs really don't pull ahead unless you lower the game quality settings to medium.
 
I would venture a guess that medium settings will be the most common among casual gamers, people who won't spend $500+ on a discrete graphics solution. We get a lot of people at OC who are looking for a mid-priced card ($200-$250) that will "play current titles" at medium settings. DX12 was supposed to make that a lot more accessible. For AMD fans that may turn out to be a big oops!

Again though, we're talking one title and one review/test. Another DX12 game may run better on FX chips. Not really enough info to make a declaration, but if it's all we have when I upgrade, I'm lookin' at you, Blue. :)

ED must be loving this. Lately I've been defending W10 and Intel. But facts is facts.
 
I would venture a guess that medium settings will be the most common among casual gamers, people who won't spend $500+ on a discrete graphics solution. We get a lot of people at OC who are looking for a mid-priced card ($200-$250) that will "play current titles" at medium settings. DX12 was supposed to make that a lot more accessible. For AMD fans that may turn out to be a big oops!

You will still reach playable framerates at medium settings. If you take a look at the article and those results, the FX CPU manages 55-60fps, which is enough for a strategy title.
If your GPU is the limiting factor, your CPU choice will matter less. I think you haven't really thought through what you write.
 
I didn't say the FX octacores were unplayable. They are, in every case shown, behind the i3 in that one article. By 40fps in one instance. That's a LOT. And at 1080p, medium quality, which is probably a pretty fair representation of how a lot of folks will play the game. I wasn't trying to say DX12 is here, sell your FX chips. However, building or buying a new rig at this time, I wouldn't touch an FX. Game development and implementation of DX12 isn't likely to favor the FX any more, either. The more the CPU is utilized, the bigger the disadvantage.

I like my FX, but spending money on one right now for gaming wouldn't be a decision I'd make. YMMV.
 
I didn't say the FX octacores were unplayable. They are, in every case shown, behind the i3 in that one article. By 40fps in one instance. That's a LOT. And at 1080p, medium quality, which is probably a pretty fair representation of how a lot of folks will play the game.

I get that but, if you have the GPU power to run this game at around 60fps at max quality, why would you play it at medium quality?
If you have a GPU that pushes 60fps at medium settings, it doesn't really matter if your CPU could do 90fps with a 980 Ti, since your GPU is not up for it.
 