
Hell freezing over: Intel to release CPU with AMD integrated graphics

Who cares, everyone uses PCI-E dedicated graphics cards anyway. I'd prefer it if CPUs had no video support at all and saved the extra cost of including a GPU. The only people using onboard GPUs are those doing web browsing and the like, for which they could literally pick any GPU and it would work fine for displaying the Windows environment. This affects absolutely nothing, as onboard GPUs target a non-existent market.

Laptops? Thin clients, tablets and phones? All of those portable devices need graphics hardware to drive a screen. In the end I believe GPU cards, or any add-in cards, will die out. Remember that regular consumers decide what a company will produce, not the enthusiast.
 

If nothing else, heat dissipation requires external cards (or at least chipsets in the case of "gaming" laptops, though I wish something like MXM would come back so I could upgrade my laptop GPUs). SoC doesn't work for high-end. It never has, it never will. There are applications that will ALWAYS require additional GPU-type processors.
 
I won't claim the GPU will disappear; rather, it would become onboard like the CPU. Marketing-wise this could be beneficial for, say, Intel if they start producing GPUs in the future: motherboard X supports 2-3 generations of GPUs, then you are forced to upgrade your whole system or lag behind. My old Sandy Bridge lasted 6 years and probably could do another 2 at 1080p without bottlenecking the GPU.
 
There are applications that will ALWAYS require additional GPU-type processors.

While a good motivator, it doesn't necessarily stand to reason that a "need" will automatically be filled. The smaller the niche, the more likely those buyers will be on their own. I "need" a PS/2 mouse at the moment. I can imagine somewhere, somebody in the future will want one for whatever reason and be $h1 out of luck. LOL
 

No, you don't. Hardware exists that provides the same or better function than a PS/2 mouse (and your example is bad anyway; USB-to-PS/2 adapters are still sold even at Best Buy and Walmart, so you can still get a "PS/2 mouse"). Hardware does NOT exist that provides the same or better function as a discrete GPU for high-power rendering. You've actually reinforced my point: if PS/2 still hasn't gone away, how on Earth does Richo999 think discrete GPUs, let alone expansion slots for everything else, will go away?
 
Certainly not apples to apples; my point was that an industry or product isn't necessarily going to exist just because somebody wants it. Market size will affect it, that's all. The mouse came to mind because I actually had to buy one yesterday.
 
I don't follow all the hardware development trends, but I can tell you with high confidence that dedicated GPUs are not going away. SoCs are on the rise though, and you will probably see more PC systems with APU/SoC setups, as they are a better fit for the average consumer (even those that do game). Remember, not everyone can afford Nvidia's or AMD's high end. Go look at the Steam statistics and you will see most people still game at 1080p with medium/high visual settings. An APU/SoC with a Vega core can easily cover this market.

DO expect that lower-end cards will be sucked into the CPU, as we are seeing (the market will adjust as sales go more toward APU/SoC or independent CPU+GPU).
 

Not saying APUs taking a bigger share of the market is a bad thing... Just as too many people buy 1500W PSUs for systems that top out at 400W full load, plenty of people (maybe fewer here on this forum, hopefully) buy quad-core-with-HT-or-better CPUs and midrange GPUs when all they're ever doing is playing Facebook games that would do just fine on a measly GMA 600 and a dual-core. And that's ignoring the utter mismatch between CPU and GPU power in a lot of systems. I realize that fixing that silliness means the high-end components are going to be even more expensive due to less demand, but I'll happily eat that cost if I don't have to see friends-of-friends on Facebook building machines with 32 GB of RAM to play Garden Warfare... :bang head
 
My guess is that this move has more to do with Qualcomm than it does AMD. Intel is about to get hammered if the press reports are accurate about SD835 in Windows.
 
I won't claim the GPU will disappear; rather, it would become onboard like the CPU. Marketing-wise this could be beneficial for, say, Intel if they start producing GPUs in the future: motherboard X supports 2-3 generations of GPUs, then you are forced to upgrade your whole system or lag behind. My old Sandy Bridge lasted 6 years and probably could do another 2 at 1080p without bottlenecking the GPU.

Yep, I'm still rocking the 2500K. I've upgraded the GPU like 5 or 6 times and I'm on a 1060 6GB now. At 2560x1080 the CPU is starting to become a bottleneck; in a couple of games I play it's sitting at like 99% the whole time, PUBG being one of them.
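(Not from the posts above, just an illustrative sketch of how you could check that kind of bottleneck yourself: log per-core CPU utilization while the game runs and see whether cores sit pinned near 100% the whole session. A minimal Python sketch, assuming the third-party psutil package is installed; the function name and thresholds here are made up for the example.)

# Illustrative sketch: log per-core CPU utilization to spot a CPU bottleneck.
# Requires the third-party psutil package (pip install psutil).
import time
import psutil

def log_cpu_usage(duration_s: float = 60.0, interval_s: float = 1.0) -> None:
    """Print per-core utilization once per interval for duration_s seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent blocks for interval_s and returns one percentage per core
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        pinned = sum(1 for pct in per_core if pct >= 95.0)
        print(f"cores at 95%+: {pinned:2d} | " + " ".join(f"{pct:5.1f}" for pct in per_core))

if __name__ == "__main__":
    log_cpu_usage()

Run it in the background during a session; if a few cores stay at 95-100% while the GPU has headroom and framerates drop, the CPU is the limiting factor.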
 
My guess is that this move has more to do with Qualcomm than it does AMD. Intel is about to get hammered if the press reports are accurate about SD835 in Windows.

No, it's not. It's part of Intel's eternal goal to move everything onto one chip. Intel believes it needs to be the single source for all things computational (GPU/CPU/APU/SoC/tensor/FPGA/AI/etc.).

It's good though to see healthy competition.
 