Currently I'm in the middle of an x264 stress test on my system with the following setup:
i7 6700K at:
4.4 GHz (core)
44x (ring ratio)
1.24 Vcore
LLC High
Trident Z 16 GB 3200 MHz RAM with XMP profile (3200 [email protected])
Currently I'm hitting a max of 70 Celsius on the cores in the x264 stress test. If I go back down to the stock ring ratio (40x), I could probably get 4.6 GHz easily at around the same Vcore while staying under 70 C. (I don't really want to go above that temperature because my ambient temps can hit 30 C at times, and I want enough headroom.)
My question is this: would running my core at 100x46 (4.6 GHz) @ 1.24 V with the stock 40x ring ratio (4.0 GHz) be worse for performance than just sticking with what I'm at now? I know cache frequency generally matters less than core frequency, but all the research I've done has people saying to keep it at a 1:1 ratio, or at least very close.
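To make the numbers concrete, here's a quick sketch of the frequency arithmetic (effective clock = BCLK x multiplier, with the stock 100 MHz BCLK; the resulting core:ring ratios are just computed from my settings, not measured performance):

```python
# Effective clock from BCLK (MHz) and multiplier -- simple arithmetic sketch.
def freq_ghz(bclk_mhz: float, multiplier: int) -> float:
    """Effective clock in GHz: BCLK x multiplier."""
    return bclk_mhz * multiplier / 1000.0

bclk = 100.0  # stock BCLK on Skylake

current_core  = freq_ghz(bclk, 44)  # 4.4 GHz, ring also 44x -> 1:1
proposed_core = freq_ghz(bclk, 46)  # 4.6 GHz
proposed_ring = freq_ghz(bclk, 40)  # stock ring ratio -> 4.0 GHz

# Ring-to-core ratio in the proposed setup (1.0 would be the 1:1 rule of thumb)
ratio = proposed_ring / proposed_core
print(f"current:  core {current_core:.1f} GHz, ring {current_core:.1f} GHz (1.00:1)")
print(f"proposed: core {proposed_core:.1f} GHz, ring {proposed_ring:.1f} GHz ({ratio:.2f}:1)")
```

So the proposed setup trades a 1:1 ratio for roughly 0.87:1 in exchange for +200 MHz on the cores.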