
What is the cheapest high-end memory in terms of latency?

there are like 3 memory ICs on the market ... all of them run at the same timings below 3000, and only Samsung B-die shows up in kits above 3600 ... there is not much to think about

list of what is probably Samsung B-die ... ~70% certain

https://www.morele.net/pamiec-g-skill-ripjaws-v-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gvr-868803/
https://www.morele.net/pamiec-g-skill-ripjaws-v-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gvk-852811/
https://www.morele.net/pamiec-g-skill-ripjaws-v-ddr4-2x8gb-3600mhz-cl17-f4-3600c17d-16gvk-831834/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3600mhz-cl17-f4-3600c17d-16gtz-828910/
I think all the 3600 kits are still Samsung B-die regardless of whether CL is 16 or 17, but I don't know exactly what is in the new batches.

list of what is probably Samsung B-die ... ~95% certain

https://www.morele.net/pamiec-g-skill-ripjaws-v-ddr4-2x8gb-3200mhz-cl14-f4-3200c14d-16gvk-858852/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gtzkw-980792/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gtzsw-972624/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gtzky-1000360/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3200mhz-cl15-f4-3200c15d-16gtzsk-972622/
https://www.morele.net/pamiec-g-skill-ripjaws-v-ddr4-2x8gb-3200mhz-cl14-f4-3200c14d-16gvr-860229/
https://www.morele.net/pamiec-g-ski...x8gb-3200mhz-cl14-f4-3200c14d-16gtzr-1118488/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3600mhz-cl16-f4-3600c16d-16gtzsw-1022526/


these kits I have, and they are Samsung B-die:
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3200mhz-cl14-f4-3200c14d-16gtzkw-980793/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3600mhz-cl16-f4-3600c16d-16gtz-836224/
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3000mhz-cl14-f4-3000c14d-16gtz-860226/
also Super Talent 3600 and Ballistix 3466, which are not available in Polish stores

This one is listed as Samsung B-die on motherboard manufacturers' QVL lists:
https://www.morele.net/pamiec-kingston-predator-ddr4-2x8gb-3600mhz-cl17-hx436c17pb3k2-16-1378283/

there are many more kits, I just gave examples

regardless of which kit you get, to make it run faster you have to play with the timings ... if you don't want to, or don't know how, then a higher frequency usually gives better results, and the sweet spot is 3600 CL16
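To see why, you can compare true latency in nanoseconds: first-word latency is roughly CL times the memory clock period, and the real clock is half the DDR transfer rate. A rough back-of-the-envelope sketch (speeds/CLs taken from the kits above; purely illustrative):

# True CAS latency in ns: clock period (2000 / MT/s) times CL.
def cas_latency_ns(mt_s, cl):
    return 2000 / mt_s * cl

for mt_s, cl in [(3000, 14), (3200, 14), (3200, 15), (3600, 16), (3600, 17)]:
    print(f"DDR4-{mt_s} CL{cl}: {cas_latency_ns(mt_s, cl):.2f} ns")
# DDR4-3000 CL14: 9.33 ns
# DDR4-3200 CL14: 8.75 ns
# DDR4-3200 CL15: 9.38 ns
# DDR4-3600 CL16: 8.89 ns
# DDR4-3600 CL17: 9.44 ns

3600 CL16 lands near the best absolute latency while also offering the most bandwidth, which is why it is a reasonable default if you won't tune timings by hand.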
 
Thanx. That TechReport test is pretty good. The differences between 3000 and 3800 are surprisingly big, which suggests it might be worth holding off for a few months to be able to afford a really expensive 2x8GB kit (and just buying a cheap 2x8GB for now).
Of course the kit they used is unavailable in Poland, but I can assume 3600 15-15-15 will behave similarly to the ones they tested.

About the Google phrase - of course I tried the same thing, but as you can see, a ton of Ryzen tests pop up, since the topic has gotten a lot of attention recently (for the average Joe, memory matters for Ryzen gaming PCs but not for Intel ones).

BTW, I wish someone would push PC gaming toward stacked memory. Capacity doesn't matter so much nowadays, and we can only wonder how much of a boost would be possible if we moved on from 2006-era latency.
Yeah. DDR4 3600 15-15-15 now, versus DDR2 1200 5-5-5 available in 2006/7. We are basically stuck; only bandwidth increases, and that is not all that important, especially for high-framerate (100+) gaming. I think PCs would get a bigger gaming boost from something like Micron's HMC than from the next 5 years of CPU advancements. And I'd buy a 30% faster PC even if its factory-fitted 12-16GB of RAM were non-upgradable.
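For anyone who doubts the "stuck since 2006" claim, the arithmetic is easy to check with the same first-word-latency formula as above (illustrative only):

# Absolute first-word latency in ns: (2000 / transfer rate in MT/s) * CL
ddr2_ns = 2000 / 1200 * 5    # DDR2-1200 CL5, ca. 2006/7
ddr4_ns = 2000 / 3600 * 15   # DDR4-3600 CL15, current
print(f"DDR2: {ddr2_ns:.2f} ns, DDR4: {ddr4_ns:.2f} ns")  # 8.33 ns both - a decade, zero latency progress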
Sadly for big corporations, PC gaming doesn't matter. :(

I would not call 3 FPS a big difference from 3000 to 3800.
 
It's also one title that is known to respond well to memory speed increases. In the other tests I have seen, the numbers barely budge. Memory isn't holding Intel back much in most cases. It scales on Ryzen to a point due to its interconnect/Infinity Fabric.
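For context on that last point (my understanding of first-gen Ryzen; the function name is just for illustration): the Infinity Fabric clock is coupled to the real memory clock, i.e. half the DDR transfer rate, so faster RAM directly speeds up the CCX-to-CCX interconnect.

# On Zen 1 the fabric clock is tied to the memory clock (MT/s / 2),
# so raising DRAM speed also raises the interconnect speed.
def fabric_clock_mhz(transfer_rate_mts):
    return transfer_rate_mts / 2

for rate in (2400, 2933, 3200, 3600):
    print(f"DDR4-{rate} -> ~{fabric_clock_mhz(rate):.0f} MHz Infinity Fabric")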

I'd like to see the OP test his theory... :)

Sorry I can't help with better links.
 
Woomack

Big thanx for the effort you put in to help me :)
So the cheapest 100%-certain option would be this one:
https://www.morele.net/pamiec-g-skill-trident-z-ddr4-2x8gb-3000mhz-cl14-f4-3000c14d-16gtz-860226/

Just one question remains: I see that your thread with the OC tests is from 1.5 years ago. I wonder what's under the heatspreaders in the new versions/batches. I would hate to get a nasty surprise.
Gotta research some more, maybe I can get the info.

EarthDog

This again... OK then. Let's go.

1. I don't care about games/tests where there's no difference, since obviously those are not the cases where low latency and high single-threaded performance matter. But if 2 games out of 10 show any change, then I'll focus on those two, for many reasons.
2. It's for high-refresh-rate gaming with low persistence. In such scenarios, 100fps at 100Hz vs. 99fps at 100Hz is A HUGE difference, no matter how hard that is to believe. Also, average fps is useless; the minimum is what matters (see the sketch right after this list). If fps drops below the refresh rate once per 2 minutes instead of once per 10 seconds, that's a HUGE difference, and it makes spending extra money on RAM worth it.
3. I've been laughed at and proven right so many times that I've gotten bored of it by now.
Here are some of the cases where a whole forum laughed at me, because all the benchmarks showed something like a 3% difference, and then it turned out, a few years later, that there are games where the difference is big.
This taught me to always look at worst-case scenarios, to stay on the safe side.
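Before the history lesson, a quick illustration of point 2 - two made-up frametime traces with near-identical averages but very different minimums (all numbers invented for the example):

# Two hypothetical frametime traces in ms, 100 frames each.
trace_a = [10.0] * 98 + [10.5, 10.5]   # steady pacing
trace_b = [9.7] * 98 + [22.0, 22.0]    # occasional heavy stutter

def stats(frametimes_ms):
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    min_fps = 1000 / max(frametimes_ms)
    return avg_fps, min_fps

for name, trace in (("A", trace_a), ("B", trace_b)):
    avg_fps, min_fps = stats(trace)
    print(f"trace {name}: avg {avg_fps:.1f} fps, min {min_fps:.1f} fps")
# trace A: avg 99.9 fps, min 95.2 fps
# trace B: avg 100.5 fps, min 45.5 fps  <- "faster" on average, stutters hard at 100Hz

An average-fps bar chart would call B the winner; anyone on a 100Hz display would feel every one of those 22ms frames.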


Duron 800 vs. Athlon 800: a 5-10% difference in benchmarks. Then Comanche 4 (or some other number) showed a few-hundred-percent increase.

Athlon XP and 64 - I was going for max FSB, and I kept insisting that not only MHz matters, that the memory subsystem matters a lot. All the benchmarks showed close to zero difference, but eventually GPUs got faster and it turned out I was right.
With Far Cry, the "experts" said it had to be the engine's fault that in some areas you got 40fps, and swapping a 7600GT for a 7800GTX changed nothing, while overclocking some CPU (one architecture in the test! BAD IDEA!) also didn't help.
"It's like it is. It won't ever go above 40, even 20 years from now," they said.
Then a C2D E6600 was used in the same location and suddenly... 70fps. I was NOT surprised when I saw that.

C2D E6300 and E6600 at the same clocks. Benchmarks = tiny difference. I insisted that it's not some kind of magic that F.E.A.R., tested with no shadows, lets the 4MB-cache CPU outpace the 2MB one (same arch.) by more than 50%. People said, "That's just one game, dude! You are an idiot!" Fast forward a few years - and suddenly a 30% difference became the norm.

From C2D to Sandy Bridge - people laughed, because no benchmarks showed a bigger difference. "It's 5%!" they said. Then faster GPUs came out, and someone ran tests in GTA IV at appropriate settings (lower res, no AA, max geometry). Suddenly the difference rose to 25%. People were surprised. I was not.

And now the same thing. People test it WRONG. I don't care if someone thinks testing at 720p with a 1080 Ti is a waste of time. I have my knowledge, and I know it is the way to test if you want to minimize the GPU's influence and see exactly what difference faster RAM can make. People with an OCed i7-6700 cannot maintain 100fps in Dying Light at max geometry (draw distance etc.) settings. Same with The Witcher 3: if you are sensitive to flicker, you will want 120fps locked, and that's beyond what current PCs can do.
In exactly this type of scenario, we are seeing something like 2-4% progress per year. It might be that 4 years from now we'll be at just 110% of what is currently possible - maybe 140% if we are very lucky.
So, the usual "wisdom" goes: changing the whole platform for +14% faster CPU performance is smart, but buying expensive memory, which costs less than half of the whole platform (mainboard+CPU) and gives +7%, is stupid.
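A quick perf-per-dollar check of that "wisdom" (the percentages are from the paragraph above; the costs are normalized and purely illustrative):

# Performance gained per unit of money, platform cost normalized to 1.0.
platform_cost, platform_gain = 1.0, 0.14   # whole mainboard+CPU swap, +14%
memory_cost, memory_gain = 0.5, 0.07       # "less than half" the cost, +7%

print(platform_gain / platform_cost)  # 0.14 per unit of money
print(memory_gain / memory_cost)      # 0.14 per unit of money - equal value, better if RAM is even cheaper

So by the critics' own numbers, the "stupid" memory upgrade is at worst equal value for money.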
That's what all the Nelsons on all the forums are like (Nelson = the guy who goes "Ha-Ha!" in The Simpsons). I'm used to it, and I really hope you are not one of them. And yes, I do want to help people understand the issue, and I will do the tests when I can get my hands on a 1080 Ti or similar, but it could take a few months or even be postponed to 2018, so don't hold your breath. ;)
 