
FRONTPAGE Threadripper 2 Review Compilation (2990WX and 2950X)

Even for most graphics studios this CPU is pointless. I personally know some guys who do 3D rendering for TV commercials. They could buy 16+ core machines, but they say it's better to just add two or three more 6-core PCs instead. My point is that barely anyone needs this many cores; for it to be worth it, there has to be a huge project that's really profitable.

I'm not sure what to say when I see forum users claiming they need 16+ cores for home rendering, or that they want to start "professional" 3D work so they need 16+ cores. A lot of those programs use the graphics card anyway.
32 cores are great in a cloud server with a large amount of ECC memory, or in some other heavily multithreaded environment, or for running multiple VMs.

I just see that most people who read about 16-32 core CPUs have graphics work or servers in mind. In reality, even there barely anyone needs that. Servers will get Epyc anyway. The 32-core TR is a product that will make a lot of noise but sell even worse than the first TR, since most of those who wanted TR already invested in the 1000 series.

I'm not saying the 32c TR is bad. It's a great CPU. I simply see no point to its existence in the current market. AMD could instead focus on the new architecture and give us faster 8-16 core parts with higher frequencies and improved internal bandwidth, which would benefit other products too.
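To put rough numbers behind the "barely anyone needs so many cores" point, here's a quick Amdahl's law sketch (plain Python; the 10% serial fraction is just an illustrative assumption, not a measured figure for any real renderer):

```python
# Amdahl's law: the best speedup N cores can give a workload whose
# serial (non-parallelizable) fraction is s is 1 / (s + (1 - s) / N).

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Ideal speedup on `cores` cores for the given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a workload that is 90% parallel flattens out quickly:
for n in (4, 8, 16, 32):
    print(f"{n:2d} cores -> {amdahl_speedup(0.10, n):.2f}x")
# prints 3.08x, 4.71x, 6.40x, 7.80x -- doubling from 16 to 32 cores
# buys only ~22% more, which is why a couple of extra 6-core boxes
# can be the better deal for a small studio.
```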
 
Unfortunately, the members of the general public who think more cores = better outnumber us by a bunch. Although at $1800 per CPU, we're getting well into the territory of people who should know better.
 

Agreed, though I think you underestimate how many people have more money than brains. Many of those people feel they must have only the best, and that the best most assuredly must cost the most. You know damn well the pre-built high-end "gaming" PC companies will leverage that stupidity, bring towers to market with these chips in them, and sell $7K+ computers to six-figure earners who aren't all that bright but like to consider themselves gamers... i.e. celebrities and pro athletes.

A niche market, yes, but a market with enough profitability nonetheless. Mark my words: Alienware (Dell) and similar companies will put these 32-core chips into towers, market them as high-end gaming machines to exactly those people, and make a killing doing so, while you or I will still get comparable if not better gaming performance from our current 4-8 core systems...

And I suspect we will see them emerge after the next wave of GPUs begins to appear this year.

I would compare this, and Intel's side of the more-cores cold war, to exotic supercars: no real need for them, and they cost more than the vast majority of the world's populace can afford, but they exist nonetheless, because the small niche of individuals who simply want them and can afford them is profitable enough to justify making them every year.
 
Ferraris actually provide usable performance, though.
I know there are people out there snapping up 8 core Ryzens from Dell, HP, etc., who will never have a use for 8c/16t before their rig is obsolete. My girlfriend's FX 8350 blasts through everything she throws at it with more than enough speed to keep her happy, and my rig will likely be enough for my needs until I'm old enough to be drooling on the keyboard and making old man noises at the nurses. Sometimes we just want, and sometimes people (wrongly) think they need way more than they do. On the plus side it keeps companies creating flagships and we get to point at them and exclaim "Ooooh! Shiny!", and sometimes it's more like "SQUIRREL!". LOL
 
until I'm old enough to be drooling on the keyboard and making old man noises at the nurses.
wait... that isn't happening now? It is for me and I'm younger... greaaaaaaaaat. :rofl:

Remember when it was a MHz/GHz race, though? AMD changed their entire naming scheme to reflect 'relative' performance against the Intel processors of the time. Whole generations did that: A64 2800+ Newcastle/Venice, etc...

Now it's a core-count race, except it really doesn't matter for most users, LOL!
 

Ah, the old days. I remember my first computer was a hand-me-down '99-'00 Gateway with an 800 MHz Intel something-or-other, with onboard video comparable to having the monitor plugged into a potato.

Those first dual-core A64s were beasts back in the day, but if you were a kid you needed rich parents to have one, lol... In my case it was a Dell for Christmas with a very low-end Intel chip, more crappy onboard video, and a 15" monitor that weighed about as much as a Toyota Prius. My iPhone has significantly more power than that thing did :rofl:
 
My first rigs were Dells with lousy Pentiums around 2003. I didn't know any better and was a new father in my 40s, working 6 days a week. For fun I broke my XP install and figured out how to fix it. Weekly. LOL. Here's a handy tip: use Google before you start deleting folders on your C:\ drive, and write down pertinent info for when ipconfig brings up "???" and your wife hollers "You woke the baby up with that language!" :rofl:
 
I'm not saying the 32c TR is bad. It's a great CPU. I simply see no point to its existence in the current market. AMD could instead focus on the new architecture and give us faster 8-16 core parts with higher frequencies and improved internal bandwidth, which would benefit other products too.
Couldn't agree more. I think AMD is on the wrong track with all this ramping up of the core count. It seems to me gamers and enthusiasts want higher clocks and more performance per core, and AMD doesn't have their ear to the ground. I really don't think these chips are going to sell well, but then again, what do I know. I also don't believe Intel is sweating this release as much as some might think.
 
I believe most folks who buy PCs think spending more money gets you more performance. They don't have all the time in the world to learn how a PC works, and don't care about the complexities; they don't know that performance depends on the programs used, the clock speed, the core count, and the video card.

Most folks who purchase a Threadripper 2 2990WX box would just use it and never even notice that its performance in games and some other programs isn't great.

Ryzen Build Your Desktop Link: https://www.originpc.com/configurator/prov3/p3.aspx?SYSTEMID=96#
 
I'm sure the gaming performance is better than not great. It's not like AMD chips are "bad" for gaming. That's a fanboy myth that really needs to go away.
 
There was a difference between Ryzen and Skylake... that difference seems to have dissolved for the most part with the 2 series Ryzen CPUs.

Skylake-X CPUs were also slightly slower in gaming than Skylake (the non-inclusive cache, I think, played a role there). These CPUs aren't really designed for gaming. They can game, and do it well, but there can be penalties.
 

Sooo, if my Skylake is still perfectly good for gaming (and it is), then the Ryzen 2, "for the most part", should be perfectly acceptable. I think that's what I said. :rolleyes:
 
I'ma just drop this here

[embedded video]

EDIT: And my question to you all is: do you think that if the processor were built on something smaller than 12nm, we might have seen an improvement?
 
Looks like they're blaming the 2990WX's poor performance in games and some programs on the Windows scheduler, not on the 12nm node size.
 
Why would the node size matter anyway? I don't understand where that sentiment is derived from.
 
For the 2990WX, in the video he was speculating that maybe this time next year AMD will have a significantly better processor using the 7nm process. However, the 7nm node has nothing to do with the processor architecture. A node shrink lets more transistors fit in the same die area, uses less power, and can also allow higher clock speeds.
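For rough intuition on the density side of a shrink, naive geometry says transistor density scales with the square of the feature-size ratio (a back-of-the-envelope sketch only; real foundry nodes like GF's 12LP and the 7nm-class processes don't scale this cleanly, and marketing "nm" numbers don't map directly to physical dimensions):

```python
# Naive geometric scaling: if linear features shrink by old/new,
# area per transistor shrinks by (old/new)^2, so density grows by
# that same factor -- an idealized upper bound, not a foundry spec.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Idealized transistor-density gain going from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

print(f"12nm -> 7nm: ~{density_gain(12, 7):.1f}x density")  # ~2.9x
```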

He was also speculating that if the Windows scheduler were improved, that would help the 2990WX's poor performance in those cases.
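One workaround people tried along those lines was pinning a game or benchmark to a subset of cores so its threads stop bouncing between dies. A minimal sketch using the Linux-only standard-library calls (on Windows the rough equivalent is Task Manager's "Set affinity" dialog or `start /affinity`; the 8-core subset below is a made-up stand-in for "one die's worth" of cores, not the real 2990WX topology):

```python
import os

def pin_self_to(cores: set[int]) -> set[int]:
    """Pin the current process to `cores` and return the affinity in effect.

    os.sched_setaffinity is Linux-only; pid 0 means the calling process.
    """
    os.sched_setaffinity(0, cores)
    return os.sched_getaffinity(0)

# e.g. keep everything on the first 8 logical CPUs:
# pin_self_to(set(range(8)))
```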
 
Yeah, I get what a node is/does. :)

Node size has little to do with it (gaming performance). Sure, we all (should) know clock speeds and voltages can affect performance, but a node itself doesn't really hold anything back directly. TR2 reaches similar clocks to the previous generation and still gets raked over the coals in some titles (it's not a gaming CPU, really... can't hold that against it). That alone should lead users to believe it isn't the clocks, but the architecture (the CCX dies and the intercommunication between them, Windows scheduling, etc.). I have never heard of a process node being blamed for holding back gaming performance before. Maybe someone wrote that... I don't know.

Feels like ganged and unganged HDDs honestly. :p
 
I think the speculation about increased speeds on the new 7nm node is partly the process and partly architectural changes. The original Ryzen process was designed for low power. GF has already said their 7nm process is capable of 5 GHz, but that doesn't mean the architecture design will allow it. The 12nm node was only an optical shrink on the same silicon base, whereas 7nm is an all-new silicon base with some changes in design. I assume both are geared toward more speed and higher transistor density.
 
Even for most graphics studios this CPU is pointless. I personally know some guys who do 3D rendering for TV commercials. They could buy 16+ core machines, but they say it's better to just add two or three more 6-core PCs instead. My point is that barely anyone needs this many cores; for it to be worth it, there has to be a huge project that's really profitable.

I just see that most people who read about 16-32 core CPUs have graphics work or servers in mind. In reality, even there barely anyone needs that. Servers will get Epyc anyway. The 32-core TR is a product that will make a lot of noise but sell even worse than the first TR, since most of those who wanted TR already invested in the 1000 series.

Right! This CPU should have a space reserved in Ripley's Believe It or Not. Such a monstrosity of a CPU, it's almost cartoon-like. I wonder what the CPU landscape will look like in 20 years if AMD keeps going at this rate. 256 cores? 512 threads? 128 MB of cache?
 