
FEATURED AMD ZEN Discussion (Previous Rumor Thread)

Here's another video, better angle on the bench and shows some mobos etc...

 
Hmm, I wonder if they're doing the rumored quad-channel DDR4 a bit differently than Intel, with all four DIMM slots on the same side (like Intel's dual-channel layouts).
 
I'm assuming that's a prototype-style board in the end; when manufacturers get hold of it, it'll likely be laid out a bit differently, but who knows. I did notice that, and there are only 4 slots total.
 
I don't know how different they will be... there isn't a single X99 board with all four DIMM slots on one side of the board. There isn't a single Z170 board with the slots split either... boards will differ, but not THAT much. At least, we've never seen anything like that before.
 
Thanks for the videos Johan. Good to see some more recent footage.

As fast as Intel's current 8-core i7 offering? That's a good thing, right?
 
We have to be a bit cautious here: this is one test, and we don't know how they compare in other tests. Also, Broadwell is the current generation for server/HEDT, but the last generation on the consumer side. Intel is expected to release its next generation before Zen ships in any volume. So while this shows they could be competitive at 6+ cores, at 4 cores they will likely be behind once again.

IPC aside, we still know nothing about final clocks. If they clock faster, it could negate IPC differences.
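Back-of-envelope, single-threaded performance scales roughly as IPC × clock, so a modest clock advantage can cancel a modest IPC deficit. A quick sketch (all numbers below are hypothetical, purely for illustration):

```python
# Rough model: relative single-thread performance ~ IPC x clock.
# The IPC and clock figures here are made up for illustration only.

def relative_perf(ipc: float, clock_ghz: float) -> float:
    """Relative performance score in arbitrary units."""
    return ipc * clock_ghz

baseline = relative_perf(ipc=1.00, clock_ghz=4.0)   # hypothetical competitor
challenger = relative_perf(ipc=0.95, clock_ghz=4.3) # 5% IPC deficit, higher clock

print(challenger / baseline)  # ~1.02 -> the clock bump more than offsets the IPC gap
```

So a ~5% IPC gap disappears with a ~5% clock advantage; the open question is what clocks Zen actually ships at.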
 
Within a given power budget, 4 cores will clock higher than 8 cores, but if you throw enough cooling at it I'd hope both would reach the same max clocks. My earlier comment about 6+ cores being a stronger region for AMD was because Intel's server-based CPUs have always lagged behind desktop, and you need to go to those to get the core count up. But at 4 cores on the desktop, Intel is expected to be two generations past Broadwell by the time we have Zen.
 
Why limit to 4? Intel sells up to 10-core i7s. Sure, they're not exactly cheap, but they're still "desktop", and if AMD really means what they said about not trying so hard to undercut Intel's prices anymore, seems that's a place they'd want to be. After all, if some 8-core chips have defects, there's always binning for lesser parts.
 
No limit as such, but the market as it exists is dominated by quad cores in the mid-to-high end. The comparison AMD gave, with 8 cores on each side, used an Intel CPU a generation behind what quad-core users enjoy, and Intel will move on another generation by the time Zen is here. I think AMD will want to use 8 cores to claim the high (but maybe not the extreme) end of the market, but it will be a harder fight at the quad-core level. Intel could still surprise us and move beyond 4 cores in the mainstream, and if they do we would have to thank AMD for reawakening the competition between them.
 
AMD has created a product that will work across multiple platforms. Zen is going to be a very good product for PCs, but the money maker will be the server business. Once Zen starts shipping with an integrated GPU, I expect them to start breaking into the server space again. With basically no market share now, they are surely going to make a splash.
 
You've said the exact same thing earlier.

Why would an iGPU make any major difference for the vast majority of server use cases? Also, what does having almost no market share now have to do with 'making a splash', unless you're at AMD and selling 5 units counts as a large % increase? Why would companies (and in particular the OEMs that sell the servers: Dell, IBM, HP, etc.) switch entire product lines away from Intel, which they've used primarily for over a decade?
 
Price.
If broken down further, price per core, without sacrificing performance.
 

Servers rely heavily on CPU AND GPU depending on the task. There's no single supplier of either: Intel makes good CPUs and Nvidia makes good GPUs, but those are still two individual components from two individual companies, and motherboard makers must build boards that take them both. AMD will be the first company to supply both on the same socket, which will create a server that's much more efficient, with fewer components and parts. More integration = better performance and lower costs. AMD has already been in talks with several major server users, like Google, to take advantage of this technology. Unless Intel buys out Nvidia, they cannot match what Zen is bringing currently.

The server market is ripe for the picking because Intel pushed AMD out in the past. Server markets are HUGE for a company, and people want competition there.

Also, once Zen comes with an integrated GPU, you'll see the likes of Apple, Xbox, and Sony on board as well. The iMacs use AMD cards due to the lower TDP, and the next-gen Xbox/Sony consoles will be based on this new technology.
 
First, an iGPU does not and cannot "improve price per core without sacrificing performance". The iGPU's mere existence will either require additional die space (higher cost) or lower core count (lower performance). And, as already stated, your render farm ain't using iGPUs. They're utterly irrelevant here and I can't see why some few keep bringing them up as if they were.

Second, 90% of servers need a remote management interface, and that GPU lives on the same BMC chip that provides IPMI, network KVM, and sensors. Nobody is going to remove that chip, and its GPU is part of the whole "sideband" design, so no, we don't want to move it to the primary CPU. Adding a GPU onto the CPU just swaps usable CPU cores for silicon that would sit practically entirely unused.

If you're talking about AMD's long-term HSA goals, then that's DEFINITELY not reducing costs for a long time. Take a moment to consider the die size needed to put a full-power GPU on the same die as the CPU, plus embedding at least a few GB of RAM, because going out to system RAM is still 10-25+ times slower than a GPU's onboard RAM (DDR4-2400 = 19.2 GB/s per channel, GTX 1080 GDDR5X = 320 GB/s, Fury X HBM1 = 512 GB/s).
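For concreteness, here's the arithmetic behind that 10-25+x gap (a sketch using the peak figures above; the DDR4 number is per channel at DDR4-2400, so multi-channel setups narrow the ratio accordingly):

```python
# Peak memory bandwidth comparison, in GB/s.
# DDR4-2400 per channel: 2400 MT/s x 8 bytes per transfer = 19.2 GB/s.
ddr4_2400_per_channel = 2400e6 * 8 / 1e9   # 19.2 GB/s
gtx_1080_gddr5x = 320.0                    # GB/s (vendor peak spec)
fury_x_hbm1 = 512.0                        # GB/s (vendor peak spec)

print(gtx_1080_gddr5x / ddr4_2400_per_channel)  # ~16.7x faster than one DDR4 channel
print(fury_x_hbm1 / ddr4_2400_per_channel)      # ~26.7x faster than one DDR4 channel
```

Even generous quad-channel DDR4 (76.8 GB/s) is still several times behind a discrete card's local memory, which is why an on-die server GPU would need its own embedded RAM to be useful.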
 

If you're trying to tell me that two individual components from two individual companies are better and cheaper than an integrated solution from one company then I've got a lot of really neat stuff I'd like to sell you.

Your information is based on current offerings and the current old-school way of doing things. Take a step back from that and look at the potential of integration and the capability of an on-chip GPU. I guess I should listen to you and sell all the stock I have that's made me a killing since I bought it earlier this year.

Don't look at it from your current point of view of how things are done today. You don't reinvent the wheel; you show them why this magical device with wings is better. "Take a moment to consider the die size"? Like how they've already pushed that button with the Xbox One and PS4? Code can be rewritten and optimized to use GPUs, and there's a big market out there for them. AMD already has the die made; I don't have to imagine it, I've looked at it. Integration of these two technologies is something neither Intel nor Nvidia can do, and it definitely has a place in the market. You may well know computers better than I do, but from a market standpoint I can tell you there's definitely one waiting for an integrated product.
 
If you're trying to tell me that you think a motherboard consists entirely of components from a single company, I've got a bridge to the moon to sell you. That Aspeed or Renesas chip is going to be on that server board regardless. You're telling me that SuperMicro and Gigabyte and Asrock and Asus and Aspeed and Renesas should all redesign their sideband hardware (which, FYI, is the same stuff used on Intel boards, even for Xeons with iGPUs) because AMD might add an iGPU to their Opteron CPUs?
 
Lol, no, that would be completely silly. How did you come to that conclusion? I never once stated anything close to that. Ever. Anywhere. Read again.

Instead of having components from both Nvidia and Intel, they can get a product with both merged and realize cost savings, greater efficiency, and basically less drama in just about every way possible, so long as fabrication doesn't suffer issues. This isn't a new concept; it's taught in basically every institution of higher learning as well as many upper-level high school courses. Economies of scale and miniaturization and all that. It's why computers aren't running off a room full of vacuum tubes anymore. As most have pointed out, this isn't a mere product refresh but an entire leap that's multi-platform and well needed to break the status quo. You remind me of someone arguing that smartphones don't make sense when their current phone already makes calls, or that texting is a fad. We are surrounded by leaps in technology and completely new concepts of how things will be done moving forward. Your notion that separate components are better has been proven wrong time and again.

Just sit in your corner and thumb your nose at everything because you know better and it can't happen. Since it's not just a thought but has already been built (try using Google to look at it), I'll stick with the side of innovation and common sense. AMD has turned things around dramatically by creating a product that won't really have rivals. They don't need to dominate any market; they just need to find their place in it, which is exactly what they are doing.
 
As an AMD shareholder (almost back to breaking even on the $8 I bought in at, back when they were supposedly innovating and had secured the Sony and Microsoft deals... which did pretty much nothing for their stock price, besides maybe preventing it from tanking worse these last few years), I do desire some competition. I've just learned to take most everything AMD says with a huge grain of salt until unbiased, widespread reviews are available.
 

WTF do vacuum tubes and transistors have to do with merging two products? You are conflating so many tangentially related things... Sideband is sideband; the name says it all. If it were in the primary CPU, it wouldn't be sideband, and that would defeat the entire point.
 