
Memory Speed for Ivy Bridge Build


maximillion82

I am going to buy a new system tomorrow, and I just noticed that G.Skill has new memory out. I want to spend about $100 on memory, so I could buy either the G.Skill Ripjaws X 16GB (2x8GB) 1600MHz kit or the G.Skill Trident X 8GB (2x4GB) 2400MHz kit.

The system I am building will have the following components if that helps:
i7 3770K
Asus P8Z77-V Pro (what's the difference from the P8Z77-V?)
2TB Seagate Barracuda
Cooler Master Silent Pro 850w
Sapphire Radeon 7950 OC

and either Ripjaws 16GB (2x8GB @1600MHz) or Trident 8GB (2x4GB @2400MHz)

I have not built a rig in over 6 years, so I am a little out of date on hardware and the differences; my main concern is future upgradability. With the Trident X I top out at 16GB of very fast memory; with the Ripjaws X I can get up to 32GB, but of slower memory. Which makes more sense?
I will be using the rig for graphics and video editing, and I want to learn to use Maya. I also want to use it for video games; that's why the Radeon.
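
For rough context, here is a quick back-of-the-envelope comparison of what the two kits offer on paper. This is only theoretical peak bandwidth, assuming dual-channel operation on the standard 64-bit DDR3 bus; real-world throughput is lower and, as the replies below discuss, the difference rarely shows up outside benchmarks.

Code:
/* Theoretical peak bandwidth of the two candidate kits.
 * Assumes a 64-bit (8-byte) DDR3 channel and dual-channel operation,
 * which is what a Z77 board gives you with two sticks. */
#include <stdio.h>

static double peak_gb_per_s(double mt_per_s, int channels) {
    /* transfers/s * bytes per transfer per channel * channels */
    return mt_per_s * 1e6 * 8.0 * channels / 1e9;
}

int main(void) {
    printf("Ripjaws X, DDR3-1600 dual channel: %.1f GB/s\n", peak_gb_per_s(1600, 2)); /* 25.6 */
    printf("Trident X, DDR3-2400 dual channel: %.1f GB/s\n", peak_gb_per_s(2400, 2)); /* 38.4 */
    return 0;
}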
 
I've only seen one review that covers this, and it showed that going over 1600 MHz had very little improvement in real-world usage. Sandy Bridge was the same way.
 
I had exactly the opposite experience.

1600 is the minimum I suggest for Sandy Bridge, with 1866 the minimum if you're clocking past 5 GHz.

As far as "real world improvement" no.. you are not going to add 20 FPS in games, but in all tests I did those two multipliers showed the largest gains.

Difference is 3-4% tops in benchmark improvements. "Real world" however is different then benchmarks.

General desktop feel is totally tied to faster RAM. Lower latency feels faster.

16GB is great though.. especially if you need that much RAM :)
 
So what would you suggest I do? I am not sure if I really need 16GB or if 8GB will do the job.
 
If you are only going to run stock or maybe a small OC, the 1600 will be fine. But if you plan to push for 5 GHz, I would go with the faster RAM instead.

8GB is still a lot of RAM, and you can always get two more sticks later on if you need 16GB.
 
That's true, I won't overclock right away, but as soon as I spend the extra dough on better cooling I am sure I won't be able to keep my fingers off it.
 
I'd go with 8GB at DDR3-1600; you will be just fine with that and you likely won't notice the difference outside of running benchmarks. And the odds of you pushing 5.0 GHz through an Ivy Bridge chip are extremely slim from the numbers that I have seen, so the benchmark improvements that Neuro speaks of would likely not be available or noticeable either.
 
Now it gets tricky again and I am not sure anymore. Any more ideas?
 
Trident X 8GB for games and benching; 16GB at 1600-1866 for anything else that needs more memory, like video editing, etc. (and it seems like you will need it).
 
I'd go with 8GB at DDR3-1600; you will be just fine with that and you likely won't notice the difference outside of running benchmarks.

See, that was exactly the opposite of my experience. Benchmarking (competitive benching aside) shows very little difference. It is the desktop FEEL that improves most when using the faster RAM. I used the benchmarks as an example because that is the only empirical data I had to explain the situation. The subjective desktop "feel" is not benchable, and the only correlative benchmark I can find is maybe latency.

[Attached screenshot: AIDA.jpg]

Roughly a 25% improvement in latency between 1333 and 2133, despite looser timings.
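
To see how latency can drop even with looser timings, here is a minimal sketch of the CAS-latency arithmetic. The CL values below are assumed typical timings for those speed grades, not taken from the screenshot, and AIDA measures the whole memory subsystem rather than just CAS, so the numbers are illustrative rather than a match for the graph.

Code:
/* First-word (CAS) latency in nanoseconds: CL cycles divided by the memory
 * clock, which for DDR is half the data rate. CL9 @ 1333 and CL11 @ 2133
 * are assumed typical timings for those speed grades. */
#include <stdio.h>

static double cas_ns(int cl, double data_rate_mt_s) {
    double mem_clock_mhz = data_rate_mt_s / 2.0;  /* DDR: clock = data rate / 2 */
    return cl / mem_clock_mhz * 1000.0;           /* cycles / MHz -> ns */
}

int main(void) {
    printf("DDR3-1333 CL9 : %.1f ns\n", cas_ns(9, 1333));   /* ~13.5 ns */
    printf("DDR3-2133 CL11: %.1f ns\n", cas_ns(11, 2133));  /* ~10.3 ns, about 24 percent lower */
    return 0;
}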

I know that when I got stuck using 1333 MHz RAM it was PAINFUL. I had to use an SSD to get any kind of decent desktop performance out of it. Even then it was not as snappy as my Thuban based rig.

BUT....

If you go 1600 and never go with faster RAM, you won't know what you are missing, so you won't be disappointed. It's like running Core 2 platforms: if you dropped from a quad to a dual there was microstutter, but if you never ran a quad you did not notice it (in fact, after two weeks I did not notice it anymore either).

1600 will work fine, and if you need more than 8GB of memory, more-but-slower RAM is better than not enough RAM, even if the faster kit were 3300 MHz.
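
For anyone who wants to put a number on the latency "feel" themselves, a minimal pointer-chase program along these lines is one way to approximate what AIDA's latency figure reports. The buffer size, iteration count, and build command are arbitrary illustration choices, not a standard tool, and results will vary with CPU, RAM speed, and timings.

Code:
/* Pointer-chase sketch: average memory-access latency over a buffer much
 * larger than the caches. Build with something like: cc -O2 latency.c */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N      (64 * 1024 * 1024 / sizeof(size_t))  /* 64 MiB, well past L3 */
#define STEPS  (10 * 1000 * 1000)

int main(void) {
    size_t *chain = malloc(N * sizeof(size_t));
    if (!chain) return 1;

    /* Build a single random cycle (Sattolo's algorithm) so the hardware
     * prefetcher cannot guess the next address. */
    for (size_t i = 0; i < N; i++) chain[i] = i;
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;  /* j in [0, i) */
        size_t tmp = chain[i];
        chain[i] = chain[j];
        chain[j] = tmp;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t p = 0;
    for (long k = 0; k < STEPS; k++) p = chain[p];  /* each load depends on the previous one */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("avg access latency: %.1f ns (p=%zu)\n", ns / STEPS, p);  /* printing p keeps the loop live */
    free(chain);
    return 0;
}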
 
I'd go with 8GB at DDR3-1600; you will be just fine with that and you likely won't notice the difference outside of running benchmarks. And the odds of you pushing 5.0 GHz through an Ivy Bridge chip are extremely slim from the numbers that I have seen, so the benchmark improvements that Neuro speaks of would likely not be available or noticeable either.
+1. Only in certain benchmarks (SPi 32M, Vantage, and Heaven) is anything faster than 1600 MHz needed.

I personally do not notice anything different on the desktop, but then again, I pay little attention to things going on there. :shrug:
 
My computer feels snappier with 2400 compared to 2133, but who knows. I really want to try DDR3 2800+. I think this MVG can support up to DDR3 3200.
 
I would rather spend the money on a nice SSD rather than faster RAM, much more noticeable difference when it comes to snappiness.
 
My computer feels snappier with 2400 compared to 2133, but who knows. I really want to try DDR3 2800+. I think this MVG can support up to DDR3 3200.

The board, yes, but most IMCs won't make it that high. At least from random comments I've seen, the best available kits on a standard IB chip will make ~2800. I'm not talking about the max under LN2 or sick voltages, but something that you can keep stable 24/7. So 2800 seems possible; 3000+, not really.
 
I went to the store and asked the guy again about it; he seemed not to care at all about answering questions and didn't even ask what I will be using the machine for. I didn't buy the computer, because many of the advertised discounts were not given and the rig would have cost me over $2000, while I planned on spending no more than $1800 total, including the monitor. So I am waiting for a sale to come around again and will just order it online, especially given the weak in-store support.

I was thinking of getting an SSD, but they are still very expensive; I plan on upgrading to a 240-250GB one later this year.
 
Benchmarking (competitive benching aside) shows very little difference. It is the desktop FEEL that improves most when using the faster RAM. I used the benchmarks as an example because that is the only empirical data I had to explain the situation. The subjective desktop "feel" is not benchable, and the only correlative benchmark I can find is maybe latency.
...

Damn, I now want to buy some G.Skill Trident X. In the unlikely event I do, I'll report whether I notice any improvement.

Off-topic comment: normal desktop usage seems snappier after overclocking my 3570K from 3.6GHz to 4.4GHz. No other changes were made. I never expected to see any difference in games, let alone the desktop environment, so this came as a complete surprise. Perhaps there is credibility in what Neuromancer said.
 