Yeah, this. If they clock faster, it could negate IPC differences.
Within a fixed power budget, 4 cores will clock higher than 8 cores, but if you throw enough cooling at it I'd hope both would reach the same max clocks. So my earlier comment about 6+ cores being a stronger region for AMD was based on the fact that Intel's server CPUs have always lagged behind their desktop parts, and you have to go to those to get the core count up. But at 4 cores on the desktop, Intel is expected to be two generations past Broadwell by the time we have Zen.
AMD has created a product that will work across multiple platforms. Zen is going to be a very good product for PCs, but the money maker will be in the server business. Once Zen parts start shipping with an integrated GPU, I expect them to start breaking into the server space again. With basically no market share now, they are surely going to make a splash.
You've said the exact same thing earlier.
Why would an iGPU make any major difference for the vast majority of server use cases? Also, how does their having almost no market share now point toward them "making a splash", unless you're at AMD and selling 5 units counts as a large % increase? Why would companies (and in particular the OEMs that sell the servers: Dell, IBM, HP, etc.) switch entire product lines away from the Intel parts they've used almost exclusively for over a decade?
First, an iGPU does not and cannot "improve price per core without sacrificing performance". The iGPU's mere existence will either require additional die space (higher cost) or lower core count (lower performance). And, as already stated, your render farm ain't using iGPUs. They're utterly irrelevant here and I can't see why some few keep bringing them up as if they were.
Second, the vast majority of servers need a remote management interface, and basic display output already comes from the same BMC chip that provides IPMI, network KVM, and sensor monitoring. Nobody is going to remove that chip, so why on Earth would we want an extra iGPU adding cost to our CPUs?
If you're talking about AMD's long-term HSA goals, then that's DEFINITELY not reducing costs for a long time. Take a moment to consider the die size needed to put a full-power GPU on the same die as the CPU, plus embedding at least a few GB of RAM, because going out to system RAM is still over 25 times slower than a GPU's onboard RAM (PC4-19200, i.e. DDR4-2400, is 19.2 GB/s per channel, versus 512 GB/s for HBM1).
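For anyone who wants to check the "over 25 times slower" claim, the arithmetic is straightforward; a quick sketch using the per-channel DDR4-2400 figure and the full 4-stack HBM1 interface bandwidth quoted above:

```python
# Bandwidth comparison from the figures cited in the post.
ddr4_gbps = 19.2   # PC4-19200 (DDR4-2400), one channel, GB/s
hbm1_gbps = 512.0  # HBM1, full 4-stack interface, GB/s

ratio = hbm1_gbps / ddr4_gbps
print(f"HBM1 is ~{ratio:.1f}x faster than one DDR4-2400 channel")
# ~26.7x, so "over 25 times slower" holds for a single channel
```

Note that a dual- or quad-channel DDR4 setup narrows the gap proportionally, but HBM1 still comes out an order of magnitude ahead.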
If you're trying to tell me that two individual components from two individual companies are better and cheaper than an integrated solution from one company then I've got a lot of really neat stuff I'd like to sell you.
Lol, no, that would be completely silly. How did you come to that conclusion? I never once stated anything close to that. Ever. Anywhere. Read again. If you're trying to tell me that you think a motherboard consists entirely of components from a single company, I've got a bridge to the moon to sell you. That ASPEED or Renesas chip is going to be on that server board regardless. You're telling me that Supermicro and Gigabyte and ASRock and Asus and ASPEED and Renesas should all redesign their sideband hardware (which, FYI, is the same stuff used on Intel boards, even for Xeons with iGPUs) because AMD might add an iGPU to their Opteron CPUs?
Lol no that would be completely silly. How did you come to that conclusion? I never once stated anything close to that. Ever. Anywhere. Read again.
Instead of having components from both Nvidia and Intel, they can get a product with both merged and realize cost savings, greater efficiency, and basically less drama in just about every way possible, so long as fabrication doesn't run into issues. This isn't a new concept; it's taught in basically every institution of higher learning as well as many upper-level high school courses. Economies of scale and miniaturization and all that. It's why computers aren't running off a room full of vacuum tubes anymore. As most have pointed out, this isn't a mere product refresh but an entire leap that's multi-platform and badly needed to break the status quo. You remind me of someone arguing that smartphones don't make sense because their current phone already makes calls, or that texting is a fad. We are surrounded by leaps in technology and completely new concepts of how things will be done going forward. Your notion that separate components are better has been proven wrong time and again.
Just sit in your corner and thumb your nose at everything because you know better and it can't happen. Since it's not just a thought but has already been built (try using Google to look at it), I'll stick with the side of innovation and common sense. AMD has dramatically turned things around by creating a product that won't really have direct rivals. They don't need to dominate any market; they just need to find their place in it, which is exactly what they are doing.