
And here I thought hyper-threading (HT) actually helped...

Depends on the game and what cores it can use. There is more sharing of the cache, etc., when you use more threads, so it depends. See this post: http://www.techpowerup.com/forums/t...700k-hyperthreading-test.219417/#post-3405620

Depends on how well DX12 can use logical (HT) cores versus physical cores. Nobody knows yet.

But yeah, this is the reason why we always tell people to get a 6600K for gaming, unless their budget can afford better or they are doing other things with their PC that can use the extra cores. Kind of generally known, really. :)

EDIT: Some of his results, or at least the comments, don't make sense either... That guy loves to test things that people already know, LOL!

That said, test this again with a dual core and a dual with HT and see how the results differ. ;)

He is also magnifying any result by testing at such a low resolution. I'd like to see that at 1080p/1440p and see if it changes anything.
 
That said, test this again with a dual core and a dual with HT and see how the results differ. ;)

He is also magnifying any result by testing at such a low resolution. I'd like to see that at 1080p/1440p and see if it changes anything.

"I've tested 21 game on 1920x1080 resolution with all graphical settings set on maximum, turned on, except: no anti-aliasing was used." - same for the video, 1080p with settings maxed and no AA.

"HT is not worthless in games however - it delivers awesome performance in Core i3 processors, but not in Core i7."

So even if I turn HT off for gaming, I gain ~10 fps in some games at most, but it will still beat the 6600K at the same speed. For future games, though, which will supposedly use more threads, the 6700K is still the best bet, correct? I would like to see it tested against X99, for example a 5820K with HT off. Would the 6700K still win by dint of higher IPC?
 
That still depends on how the game treats logical cores, kenrou. A 5820K without HT is 6 cores. It might just whoop you if the game doesn't use HT but will use "real" cores. FX CPUs ran into this problem.
 
He didn't talk much about his Windows setup...did he have all other "stuff" turned off?

My experience with 4 cores versus 4 cores and hyperthreading is that 4 cores gets "chunky" as you start to max out the CPU. Interesting that his results don't show this.

If the game is using most of the cores as threads, Windows will multitask less. However, with HT turned on, Windows will see the extra "cores" available, and serve up more OS specific threads to do "stuff". As the HT cores are not real cores, you lose some in the task switching and cache misses.
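To picture that "extra cores" view, here's a toy Python sketch (my own illustration, not real scheduler code) assuming a layout where logical CPUs 2n and 2n+1 are HT siblings on physical core n; the actual enumeration varies by OS and CPU:

```python
def physical_core(logical_cpu: int) -> int:
    """Map a logical CPU to its physical core, assuming siblings
    are numbered adjacently (2n and 2n+1 share core n)."""
    return logical_cpu // 2

def one_sibling_per_core(n_logical: int) -> list[int]:
    """Pick one logical CPU per physical core -- the affinity set
    you'd want for a workload that gains nothing from HT."""
    seen, picks = set(), []
    for cpu in range(n_logical):
        core = physical_core(cpu)
        if core not in seen:
            seen.add(core)
            picks.append(cpu)
    return picks

# A 4C/8T chip: the OS advertises 8 "cores", but only 4 are physical.
print(one_sibling_per_core(8))  # [0, 2, 4, 6]
```

The point is just that the OS counts 8 schedulable CPUs while only 4 have their own execution resources, which is where the task switching and cache misses come from.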

This just validates my general theory that "more cores is better"! :D As long as it's in your budget!
 
I think DX12 is going to help lower-power CPUs gain a lot more fps and put them closer to par with the higher core/thread count CPUs, but that's just me. IIRC it is supposed to take a lot of load off the CPU, to where the extra cores and threads don't really matter as much as they have previously.
http://www.pcworld.com/article/3039...es-you-really-need-for-directx-12-gaming.html
The only non-synthetic bench (albeit a game that is still in beta) seems to support this theory of mine.


edit: I'm not sure, but doesn't the Ashes of the Singularity benchmark seem like it would have a lot of physics loading the CPU rather than the graphics themselves? Physics has always benefited from higher core counts, which would explain why the better CPUs score higher in that synthetic benchmark, while in the real game (probably not very physics-heavy) it would be more even.
I don't know; this is all speculation we can't prove until more games are released using it.

If DX12 is doing what Mantle did and giving more low-level hardware control, then, as we saw with Mantle, even dual cores were getting great fps, on par with 4 and 8 cores with the same GPUs.

"This brings us to Direct3D 12, which is Microsoft’s entry into the world of low level graphics programming. "
http://www.anandtech.com/show/7889/...level-graphics-programming-comes-to-directx/2
The way they explain the changes in DX12 is very similar to how they explained how Mantle worked, and it really seems like Microsoft said "hey, these guys are doing this, we should probably copy them, push them out of the way, and keep people using our API over theirs." I mean, until Mantle came out, Microsoft had no reason to make their software better.


edit 2: http://techreport.com/review/29090/fable-legends-directx-12-performance-revealed/4
It's hard to find many people testing CPU performance, but I found one more bench testing CPUs.
edit 3: another one: http://www.anandtech.com/show/9659/fable-legends-directx-12-benchmark-analysis/3
 
Why would you disable the other 'stuff'? Is that a realistic gaming scenario? It's been over a decade since I disabled anything before playing a game.

My experience with 4 cores versus 4 cores and hyperthreading is that 4 cores gets "chunky" as you start to max out the CPU. Interesting that his results don't show this.
He doesn't mention it, but that doesn't mean it didn't happen (trying to see how many negatives I can cram in there, haha!). Honestly, there aren't many games out there that can 'max out' a modern quad core. Were you talking about games?
 
The only reason to disable the other "stuff" is to have an apples-to-apples comparison. I don't disable any stuff when I game personally (in fact, when I game, I run Folding@Home on 8 threads...no issues - hehe).

The gaming code could be running its threads at the highest priority, so the OS would postpone other, lower-priority threads as required. When you make more threads available to the OS, you no longer have the same test bed.
 
That still depends on how the game treats logical cores, kenrou. A 5820K without HT is 6 cores. It might just whoop you if the game doesn't use HT but will use "real" cores. FX CPUs ran into this problem.

That has nothing to do with the game and everything to do with the operating system. The game has no idea whether a CPU is "real" or not. All it does is tell the OS to schedule a thread. The problem with FX was Windows scheduling t1(m1), t2(m1), t3(m2), t4(m2), t5(m3), t6(m3), t7(m4), t8(m4) (two threads on module one before module two gets its first thread, so both threads contend for the module's shared FP unit) instead of t1(m1), t3(m2), t5(m3), t7(m4), t2(m1), t4(m2), t6(m3), t8(m4) like it should have.
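To make those two placement orders concrete, here's a toy Python sketch (my own illustration, not actual Windows scheduler logic) assigning eight threads to four two-thread FX-style modules both ways:

```python
def naive_order(n_threads: int, n_modules: int) -> list[int]:
    """Fill each module completely before moving to the next:
    t1,t2 -> m1, then t3,t4 -> m2, and so on."""
    return [t // 2 % n_modules for t in range(n_threads)]

def interleaved_order(n_threads: int, n_modules: int) -> list[int]:
    """Give every module one thread before any module gets a second."""
    return [t % n_modules for t in range(n_threads)]

# 8 threads on a 4-module chip (modules numbered 0-3):
print(naive_order(8, 4))        # [0, 0, 1, 1, 2, 2, 3, 3]
print(interleaved_order(8, 4))  # [0, 1, 2, 3, 0, 1, 2, 3]
```

The difference shows up with fewer threads than modules can hold: with only four threads, the naive order packs them onto two modules (two threads fighting over each shared FP unit) while the interleaved order gives each thread a module to itself.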
 
Everyone has 'stuff'. Typically a review OS is more devoid of 'stuff' than someone's OS at home anyway. Stuff doesn't need to be disabled. ;)

What game runs at real-time (highest) priority, or anything else raised? I don't check much, but the few I have looked at run at normal priority.
 
Best-case scenario I've come across puts an HT sibling as adding about 0.5 of a real physical core. So if an i3 system is CPU-limited, having HT could be like having a 3rd core. If you already have 4 physical cores, is going up to 6 equivalent going to help? Less likely. Also, there's a quirk with HT: for stuff that doesn't benefit from it, the OS scheduler can end up with both threads of one core partially active while other cores sit idle, which slows things down. If you don't prevent that through affinity, I've seen a ~10% CPU performance penalty in that case compared to HT off.

To recap: for HT-benefiting applications, each extra HT sibling can be worth up to 0.5 of a real core, so a 2C/4T chip can behave like 3C.
For applications that get no benefit from HT, OS scheduling may cut performance by about 10% compared to turning HT off or setting affinity manually.
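Putting rough numbers on that rule of thumb (the 0.5-per-sibling figure and the ~10% penalty are just the estimates from this post, not measured constants):

```python
def effective_cores(physical: int, ht: bool, app_benefits: bool,
                    ht_gain: float = 0.5, penalty: float = 0.10) -> float:
    """Back-of-envelope 'effective core' count for an HT-capable chip."""
    if not ht:
        return float(physical)
    if app_benefits:
        # Each HT sibling adds ~0.5 of a core's worth of throughput.
        return physical * (1 + ht_gain)
    # No HT benefit: sibling contention costs ~10% vs. HT off.
    return physical * (1 - penalty)

print(effective_cores(2, ht=True, app_benefits=True))    # 3.0 -> 2C/4T acts like ~3C
print(effective_cores(4, ht=True, app_benefits=False))   # 3.6 -> worse than HT off
print(effective_cores(4, ht=False, app_benefits=False))  # 4.0
```

Which is exactly why the answer to "should I leave HT on?" depends on whether the game actually scales past the physical core count.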
 
Hey, ED, are you moving or something? Weren't you "Stuck in Maryland" until recently, or am I thinking of someone else?
 
Hopefully/likely, yes. My wife is on her second interview and things look good... back home where the heart (family/friends) is. :)
 