
Fallout 4: memory speed making significant differences in benchmarks


EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
Perhaps the amount of system RAM used plays a role, but not always. There are other titles that use a ton of system RAM but don't respond well to speed increases. That said, if you have filled the RAM to capacity and are paging out, then perhaps faster memory would help. I wonder how much system RAM was used in these benchmarks. EDIT: I see 16GB... so I doubt it was maxing that out... hmmmm

Is this the only game to ever see significant improvements in FPS by increasing RAM speed?
The 1st article really leads you to this answer...


I'd also like to see some testing at a higher, GPU-bound resolution instead of the CPU-bound 1080p to check whether it still holds true.
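
For anyone who wants to see how much system RAM the game actually uses on their own rig, here is a minimal sketch, assuming Python with the third-party psutil package installed and a game process named Fallout4.exe (adjust the name if yours differs):

# Minimal sketch: report overall RAM use plus the game's working set.
# Assumes psutil is installed and the process is called "Fallout4.exe".
import psutil

mem = psutil.virtual_memory()
print(f"System RAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] == "Fallout4.exe":
        rss = proc.info["memory_info"].rss
        print(f"Fallout4.exe working set: {rss / 2**30:.1f} GiB")

Run it while playing; if the total is nowhere near 16GB, paging is unlikely to be the explanation.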
 
OP

magellan

Member
Joined
Jul 20, 2002
The 1st article really leads you to this answer...

I'd also like to see some testing at a higher, GPU-bound resolution instead of the CPU-bound 1080p to check whether it still holds true.

Good point. This article suggests it still does:
http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-fallout-4-4023

From the article (which tested Fallout 4 all the way up to 4K resolutions):
"Secondly, faster RAM will improve your lowest frame-rates - but remember that only boards using the top-end chipsets support overclocked memory."
 
OP

magellan

Member
Joined
Jul 20, 2002
ED, to increase graphics load in Fallout 4 you don't necessarily have to increase the resolution; you can increase the draw distance through config file manipulation. Since Fallout 4 is an open-world game, increasing draw distance can kill graphics performance quickly.
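
For reference, the draw-distance tweaks people usually point at live in the INI files (Fallout4.ini / Fallout4Prefs.ini, typically under Documents\My Games\Fallout4). The setting names and values below are the commonly cited ones and are quoted from memory, so treat them as an assumption and check them against your own files before editing; raising uGridsToLoad in particular has a reputation for breaking saves:

[General]
uGridsToLoad=5            ; default; higher odd values load more exterior cells and hammer CPU, RAM and GPU
uExterior Cell Buffer=36  ; usually kept at (uGridsToLoad + 1)^2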
 

GenericSauron

Registered
Joined
Feb 27, 2016
It's really not that surprising that DDR3 2400MHz with its high speed and low latency can make such a difference. To me, DDR3 2400MHz is a pretty premium memory. It would probably take DDR4 2866 to match it, but that's just a guess.
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
It's really not that surprising that DDR3 2400MHz with its high speed and low latency can make such a difference. To me, DDR3 2400MHz is a pretty premium memory. It would probably take DDR4 2866 to match it, but that's just a guess.
Depends on the timings of the sticks. As a side note, I don't think 2866 exists, but I get the point of the comment. :thup:
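
As a rough worked example of why the timings matter (illustrative speed/CL pairs, not the kits from the articles): first-word latency is roughly CAS latency x 2000 / data rate in MT/s.

DDR3-2400 CL10: 10 x 2000 / 2400 ≈ 8.3 ns
DDR4-2800 CL15: 15 x 2000 / 2800 ≈ 10.7 ns

So a tight-timing DDR3-2400 kit can have lower absolute latency than nominally faster DDR4, even though the DDR4 wins on raw bandwidth.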



@ Mags - You can do that, sure, but a lot of people won't mess with config files. The techspot article stated it took a 4.5GHz 6700K before they felt it was GPU limited (1080p).

I will check out the other link at home as it is blocked from my office. But that comment seems awfully generic/out of context as I do not see what resolution they are talking about from what you quoted here.
 
OP

magellan

Member
Joined
Jul 20, 2002
From what I've seen of Fallout 4 it doesn't seem to use hyperthreading. But it does use all 6 cores of my 5820. Will Fallout 4 use all 8 cores of the Haswell-E?
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
You have 12 threads on that 5820 unless you disabled HT... also, how do you know it is using real cores and not HT?

You can pretty easily test this and report back... cut your hex back to a quad and run without HT, then run it with HT on. Or, if you for some odd reason do have HT disabled, enable it and see if the load spreads out. ;)

Let us know (screenshots plz!) :)


EDIT: And you STILL have not updated your signature... Jebus man, update that thing already! :rofl:
 
OP

magellan

Member
Joined
Jul 20, 2002
You have 12 threads on that 5820 unless you disabled HT... also, how do you know it is using real cores and not HT?

You can pretty easily test this and report back... cut your hex back to a quad and run without HT, then run it with HT on. Or, if you for some odd reason do have HT disabled, enable it and see if the load spreads out. ;)

Let us know (screenshots plz!) :)


EDIT: And you STILL have not updated your signature... Jebus man, update that thing already! :rofl:

ED, when I check Task Manager, it looks like only 6 of the 12 logical processors are being used by Fallout 4, for example CPU 0, CPU 2, CPU 4, CPU 6, CPU 8 and CPU 10. Doesn't this imply the hyperthreading cores aren't being used?
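
One way to sanity-check that, assuming Python with the third-party psutil package: log per-logical-CPU load while the game is running. Note the numbering caveat: on most Intel HT systems Windows lists the two hardware threads of a physical core as adjacent logical CPUs (e.g. CPU 0/1), so load on only the even-numbered CPUs would be consistent with one busy thread per core, but that mapping isn't guaranteed.

# Sample per-logical-CPU utilisation once a second for ~30 seconds while the game runs.
# Assumes the third-party psutil package is installed.
import psutil

for _ in range(30):
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"CPU{i}: {pct:5.1f}%" for i, pct in enumerate(loads)))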
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
No idea which cores are what.... though that is a logical leap.

Looking forward to your results! :)
 
OP

magellan

Member
Joined
Jul 20, 2002
Memory speed tests on gaming FPS


They didn't include the Fallout 4 tests, in which gains of nearly 50% in FPS were seen from merely increasing the memory speed, but I guess Fallout 4 is an outlier at this point.

I've read elsewhere that memory speed affects physics simulations in games the most, though I don't know why.
 

illusi

Registered
Joined
Aug 22, 2016
I think the reason is that, as you move around, Fallout is constantly loading and unloading different cells to and from memory. Since RAM doesn't have such a high impact on Skyrim, it's more than likely an optimization issue on Bethesda's part. Hmm, I wonder if Minecraft or No Man's Sky would be affected by RAM speeds.
 

Kenrou

Member
Joined
Aug 14, 2014
I'm confused, why Minecraft? With the kind of graphics this game has, I would expect you to hit 500+ FPS without any issues unless it's extremely badly coded?
 

illusi

Registered
Joined
Aug 22, 2016
I'm confused, why Minecraft? With the kind of graphics this game has, I would expect you to hit 500+ FPS without any issues unless it's extremely badly coded?

Because it's another game that is constantly loading and unloading cells? Also, in Minecraft's case the cells are procedurally generated, which should benefit from fast memory. 500 FPS is not the point; what I'm curious about is whether, if I'm getting 500 FPS with average RAM, I would get 600 with a high-end kit, or whether the increase would only be 3-5 FPS.
 

[email protected]

Member
Joined
Apr 7, 2016
Location
Israel
Depends on the timings of the sticks. As a side note, I don't think 2866 exists, but I get the point of the comment. :thup:



@ Mags - You can do that, sure, but a lot of people won't mess with config files. The techspot article stated it took a 4.5GHz 6700K before they felt it was GPU limited (1080p).

I will check out the other link at home as it is blocked from my office. But that comment seems awfully generic/out of context as I do not see what resolution they are talking about from what you quoted here.

I've learned something while messing with the internet at school: if you use the TOR browser, nothing will be blocked ;).


About Minecraft benchmarks, I can try to do it, but the benchmarks will never be consistent because it's a different world every time.
 

illusi

Registered
Joined
Aug 22, 2016
I've learned something while messing with the internet at school: if you use the TOR browser, nothing will be blocked ;).


About Minecraft benchmarks, I can try to do it, but the benchmarks will never be consistent because it's a different world every time.

Any kind of VPN will be able to penetrate the block. The fact that you are allowed to install TOR says that your school's IT is not the best :)

You can just use the same save game for each configuration and set a route you'd run each time. It might not be 100% accurate, but you'd still get a general idea.
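
If you do log frame times along that fixed route (most capture tools can export them), here is a small sketch for turning the log into average and minimum FPS, assuming a plain text file with one frame time in milliseconds per line (the filename frametimes.txt is just an example):

# Hypothetical example: average and worst-case FPS from a frame-time log,
# one frame time in milliseconds per line of "frametimes.txt".
with open("frametimes.txt") as f:
    frame_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)
min_fps = 1000.0 / max(frame_ms)  # slowest single frame
print(f"Average FPS: {avg_fps:.1f}   Minimum FPS: {min_fps:.1f}")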
 

[email protected]

Member
Joined
Apr 7, 2016
Location
Israel
Any kind of VPN will be able to penetrate the block. The fact that you are allowed to install TOR says that your school's IT is not the best :)

You can just use the same save game for each configuration and set a route you'd run each time. It might not be 100% accurate, but you'd still get a general idea.

I did it, but it still isn't consistent enough for good benchmarking. I noticed timings don't affect performance much, if at all.
8GB (2x4GB) dual channel results:
2400MHz: 105 minimum FPS (average of 2 tests)
2800MHz: 126 minimum FPS (average of 2 tests)
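
As a rough back-of-the-envelope check on those numbers: 126 / 105 ≈ 1.20, so that is about a 20% gain in minimum FPS from a roughly 17% memory-clock bump (2800 / 2400 ≈ 1.17), assuming the two runs per setting are representative.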


Timings of 2400MHz: [attachment: 2400MHz timings.png]

Timings of 2800MHz: [attachment: 2800MHz timings.png]


The tests were very inconsistent; it needs a proper test to see whether the difference is really that big or small.
 