
Fascinating Article about Intel's 10th Gen Desktop Lineup

HEDT is just a marketing term, really. Seems pointless to have that many cores/threads without quad channel RAM/throughput. It really feels like a 'just because' chip, honestly... I don't know who this is marketed towards.

Threadripper has quad channel; the Epyc line has 8 channels.

EDIT: Marketed at whales?
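
For rough context on why channel count matters: theoretical memory bandwidth scales linearly with channels. A quick back-of-the-envelope sketch, assuming DDR4-3200 (the speed and channel counts below are just illustrative):

Code:
# Theoretical peak = channels x transfer rate (MT/s) x 8 bytes per 64-bit channel.
# DDR4-3200 assumed for illustration; real-world throughput lands well below this.
MT_PER_S = 3200
BYTES_PER_TRANSFER = 8

for name, channels in [("dual channel (mainstream)", 2),
                       ("quad channel (Threadripper/HEDT)", 4),
                       ("octa channel (Epyc)", 8)]:
    gb_s = channels * MT_PER_S * BYTES_PER_TRANSFER / 1000
    print(f"{name}: ~{gb_s:.1f} GB/s")
# dual ~51.2 GB/s, quad ~102.4 GB/s, octa ~204.8 GB/s

That gap is exactly why feeding lots of cores from a dual channel platform can get awkward.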
 
Oops, yes, quad channel. Still, it's a 'just because' chip. More posturing than usefulness in the HEDT market.

AMD, stop flooding the market with useless cores, plz. :p
 
AMD, stop flooding the market with useless cores, plz. :p

NOOOOO!

Cores are like ketchup, more is always better. In all seriousness, with Intel adding more cores to their enthusiast line-up too, I'm hopeful the software devs will start taking advantage of the extra resources.
 
NOOOOO!

Cores are like ketchup, more is always better. In all seriousness, with Intel adding more cores to their enthusiast line-up too, I'm hopeful the software devs will start taking advantage of the extra resources.

I think the biggest push will come from the new consoles coming out, at least for gaming support of multithreading. Both are coming with 8-core CPUs, so games will start to be optimized for that.
 
NOOOOO!

Cores are like ketchup, more is always better. In all seriousness, with Intel adding more cores to their enthusiast line-up too, I'm hopeful the software devs will start taking advantage of the extra resources.
I'm hopeful, but only because of the consoles releasing in 7 months that are 8c/16t machines. We've had hex+ cores out for almost 10 years now... so I'm not holding my breath. Really, more than 8c/16t in mainstream is too much, to me. More than 16/18c/32/36t should be reserved for the server level.
 
We've had hex+ cores out for almost 10 years now...

But not in the mass mainstream. There were the AMD X6 and the "8 core" Bulldozer designs, but neither took off in great numbers. I wouldn't count HEDT as mainstream either. So basically back a decade plus to around Core 2 era, what the vast majority of people had was up to 4 cores.

IMO many things that can scale to more cores are already here, so they can make good use of more cores if you were to have them. Some things will never scale. Then we have a small zone where things might scale better but haven't quite got there yet. So basically more cores is more of a cost consideration. How much do those cores cost, and where do you draw your value line?
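
To put a rough number on "some things will never scale", Amdahl's law is the usual back-of-the-envelope. A quick sketch (the parallel fractions below are made-up examples, not measurements of any real workload):

Code:
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the work that can run in parallel, n = core count.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.99):        # assumed parallel fractions, purely illustrative
    for n in (4, 8, 16, 32):
        print(f"p={p:.2f}, {n} cores -> {speedup(p, n):.1f}x")
# Even at p=0.90, 32 cores only gets you ~7.8x, so cores past a point mostly sit idle.

Which is the cost-vs-value question in a nutshell: the less parallel the workload, the sooner extra cores stop paying for themselves.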
 
But not in the mass mainstream. There were the AMD X6 and the "8 core" Bulldozer designs, but neither took off in great numbers. I wouldn't count HEDT as mainstream either. So basically back a decade plus to around Core 2 era, what the vast majority of people had was up to 4 cores.

IMO many things that can scale to more cores are already here, so they can make good use of more cores if you were to have them. Some things will never scale. Then we have a small zone where things might scale better but haven't quite got there yet. So basically more cores is more of a cost consideration. How much do those cores cost, and where do you draw your value line?
Not in the mainstream? I'd certainly call Bulldozer's 6/8c mainstream, and being so cheap, it was available in 'great numbers'. Yes, the majority was on 4c/8t CPUs, but these were out, in significant numbers, for several years. Between X58 and AMD's mainstream platform, devs had plenty of time to code for CPUs that have been on the market for nearly 10 years. The only reason we'll finally start to see more momentum is the consoles.

I have a 16c/32t processor. For my uses, I disable HT. Nothing I use outside of benchmarks tickles this thing. That isn't to say others can't utilize (not use) it. But just because they are there, doesn't mean they are used or utilized.

So, yay, more c/t... but the problem is that it doesn't help the majority of users. :)
 
After some further thought, maybe it is better to think in terms of threads. Intel i7 quads would give you 8 threads, as does Bulldozer. So maybe that was/is the optimisation, rather than number of physical cores as such. I haven't checked, but I presume SMT will be enabled in the console CPUs, so 16 threads are the next evolutionary step.
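
If you want to see the core-vs-thread split on your own box, something like this works (a minimal sketch using the third-party psutil package; plain os.cpu_count() only reports logical threads):

Code:
import os
import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)  # real cores
logical = psutil.cpu_count(logical=True)    # hardware threads, SMT/HT included
print(f"{physical} physical cores, {logical} threads")
print(f"os.cpu_count() reports threads too: {os.cpu_count()}")

On a 4c/8t i7 that would print 4 physical cores and 8 threads, which is the kind of equivalence I'm getting at.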

I also tried to look at market share after the launch of Bulldozer, but was unable to find anything I'd consider reliable. I got two data points in rough agreement with AMD just under 30% share at the time (2012), one was based on PassMark submissions, the other was Steam Hardware Survey. That's higher than I remember actually.
 
Perhaps.

But think back at the time... quads, hexes, and octos weren't utilized either... dual cores were the thing. It's a recent trend that quads (w/ HT) are long in the tooth.
 
PC games have always been scalable though. If we look back to just before Ryzen, for example, people were still happily getting i5 4c/4t for mid-value gaming systems, with those wanting the reasonable best at the time going for i7 4c/8t (I'd consider HEDT the unreasonable best). Today we are seeing 4c/4t struggle some more with the latest big-name titles, and 4c/8t is becoming the starting point for a half-serious gaming system. When the 8086K came out, I finally made the move beyond 4 cores in my gaming system and currently run 6c/12t, and it definitely helped.

Hard to say what was needed when, but to me it feels like quads were required for serious gaming from at least the Sandy Bridge era, if not a little before. Since Ryzen, I feel like the sweet spot has moved to 6 cores, although I'll leave aside the argument of whether that's 6 real cores, or how to compare core/thread counts, since 2 SMT threads certainly don't equal two cores. We're getting very title-specific if we want to look at scaling much above that.
 
4c/8t is, to me, almost the bare minimum. That still puts a glass ceiling on some titles. Are they playable? Sure. But more and more, that weaker CPU will grow tiresome faster.

Sweet spot is 6c/12t for unhindered gaming.
 
Is there even any point to PCI-E 3.0? Or anything beyond 1.0? I mean, a heavily overclocked RTX Titan still cannot saturate a PCI-E 1.0 x16 slot, so anything beyond that is meaningless. We're on 4.0, but we don't even have anything that can fully make use of 1.0 yet. It's the same with USB 3.0 v2. We don't even have any components that can saturate USB 3.0 v1, so v2 is pointless. Also, the names are completely retarded. I say we just call them USB 3.0 and USB 4.0.
 
For consumer/gaming uses PCIe speed isn't really a limit, but for some more demanding use cases it starts to make a difference. While not something I've done much of (so correct me if my understanding is wrong), folding@home does transfer more data to the GPU, and PCIe speed can affect the work rate, e.g. if you run it like a miner on x1 connections, that can choke it. Some other GPU compute uses might be affected similarly.
 
Is there even any point to PCI-E 3.0? Or anything beyond 1.0? I mean, a heavily overclocked RTX Titan still cannot saturate a PCI-E 1.0 x16 slot, so anything beyond that is meaningless. We're on 4.0, but we don't even have anything that can fully make use of 1.0 yet. It's the same with USB 3.0 v2. We don't even have any components that can saturate USB 3.0 v1, so v2 is pointless. Also, the names are completely retarded. I say we just call them USB 3.0 and USB 4.0.
Uhh, yes. Look at any PCIe scaling review. A 2080 Ti, for example, loses a couple of percent going down to PCIe 3.0 x8 (equivalent to 2.0 x16), and more when running PCIe 3.0 x4 (equivalent to PCIe 1.0 x16). Also, AMD's RX 5500 XT, in particular its 4GB variant, responds very well to moving from 3.0 to 4.0... because its memory capacity is low, the PCIe bus is used to 'page out' other textures. But most certainly, if we were on PCIe 1.0/2.0 with any modern card, it would be choked. Here are a couple of articles:
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/
https://www.techpowerup.com/review/pci-express-4-0-performance-scaling-radeon-rx-5700-xt/
https://www.techpowerup.com/review/nvidia-geforce-gtx-1080-pci-express-scaling/

...that's gaming. As Mack said, some computational work may saturate that bus, and F@H uses it, but I don't believe it holds cards back (maybe at x1 speed, but I'm not sure; I haven't run F@H on the GPU for ages).

As far as USB goes, there are items that can saturate that bus...we're starting to see USB Type-C portable M.2 drives which will blow through that bandwidth.

Easy on using the word 'retarded'... people can be offended by that term. That said, I agree with many others that the USB names can be confusing. Just know that now they are all USB 3.2 Gen 2... one is 10 Gbps, the other is 20 Gbps, which is usually spelled out in the specs/text, or with a 2x2 naming convention after.
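
For anyone keeping score on the naming mess, this is my understanding of the current mapping (treat it as a best-effort summary, since the branding has been rebadged more than once):

Code:
# USB 3.x marketing names vs. signalling rate, to the best of my understanding.
usb_names = {
    "USB 3.2 Gen 1   (was USB 3.0 / USB 3.1 Gen 1)": "5 Gbps",
    "USB 3.2 Gen 2   (was USB 3.1 Gen 2)":           "10 Gbps",
    "USB 3.2 Gen 2x2 (two 10 Gbps lanes)":           "20 Gbps",
}
for name, speed in usb_names.items():
    print(f"{name}: {speed}")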
 
A GTX 1080 Ti in the 3DMark PCIe bandwidth test can hit up to 12.3 GB/s, so you already need PCIe 3.0 x16 to reach that, and there are faster cards.
In games it highly depends on the title and the graphics card used, so pretty much like ED said.
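
For reference, the theoretical x16 numbers work out roughly like this (the per-lane figures are the commonly quoted effective rates after encoding overhead, so treat them as approximate):

Code:
# Approximate effective per-lane bandwidth in GB/s (after 8b/10b or 128b/130b encoding).
per_lane_gb_s = {"PCIe 1.0": 0.25, "PCIe 2.0": 0.5, "PCIe 3.0": 0.985, "PCIe 4.0": 1.969}
lanes = 16
for gen, gb_s in per_lane_gb_s.items():
    print(f"{gen} x{lanes}: ~{gb_s * lanes:.1f} GB/s")
# 1.0 x16 ~4 GB/s, 2.0 x16 ~8 GB/s, 3.0 x16 ~15.8 GB/s, 4.0 x16 ~31.5 GB/s
# A measured 12.3 GB/s is well past anything a 1.0 or 2.0 x16 link can carry.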

[Attached screenshot: 3DMark PCIe bandwidth test result, ASRock X299 system]
 
A GTX 1080 Ti in the 3DMark PCIe bandwidth test can hit up to 12.3 GB/s, so you already need PCIe 3.0 x16 to reach that, and there are faster cards.
In games it highly depends on the title and the graphics card used, so pretty much like ED said.


Real-world performance for gaming isn't too heavily affected.

For longer than I'd like to admit, my GTX 1080 was running at x8 because I put my NVMe drive in the wrong M.2 slot and it was sharing PCIe lanes. Once I fixed this, I didn't see an increase in performance.
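
If anyone wants to sanity-check their own link before finding out the hard way, nvidia-smi can report it. A minimal sketch, assuming an NVIDIA card with the driver tools installed (verify the field names against nvidia-smi --help-query-gpu):

Code:
import subprocess

# Ask the driver what link the GPU has currently negotiated.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "GeForce GTX 1080, 3, 8" would mean PCIe 3.0 x8
# Note: cards drop to a lower link gen at idle, so check while the GPU is under load.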
 
I was just giving an example that PCIe 3.0 x16 bandwidth can be utilized even by older graphics cards, never mind a 2080 Ti or faster. In games it doesn't change much, but that has been covered multiple times already. As I said, it highly depends on the game.
 
The benefits will be the same as when PCIe 4.0 came out, I would guess. Not many uses until components can use the extra bandwidth.

Well, I just barely got a Ryzen at all this year; I couldn't even get one in 2019 because I couldn't afford a Ryzen build. And thus I have hardware from 2019 in 2020, LOL.
 