
Does Intel have any published roadmap for their Arc video cards?


magellan

I'd be interested to hear/read about where Intel intends to go with their Arc video cards. Their A770 wasn't very impressive to me; it barely beats out a 1080 Ti and has even less memory.
 
A 1080 Ti is a flagship GPU, albeit from three gens ago. These were never meant to be high-end cards; mainstream at best. Loads of VRAM isn't needed for 1080p gaming, really.

Here's what Google turned up - https://www.tomshardware.com/news/intel-arc-gpu-roadmap-2022-2023-leaked

 
ARC is effectively dead as a discrete high-end GPU. They will keep using it as an iGPU in newer CPUs and will probably keep producing mid- to low-end discrete cards for OEMs, as well as high-end accelerators for HPC/datacenter clients.

I still hold out hope that they can compete, but the fact is they blew the launch badly; it was late, buggy, and crap, on top of being overhyped by Intel themselves.


MLID often tracks progress and updates for ARC if you want to stay on top of the news.
 
ARC is effectively dead as a discrete high end GPU.
It was never alive as one. The new one, according to that roadmap, will have some kind of 'high-end' (read: enthusiast-class) GPU. So it's being born next gen... maybe, lol. But I don't think it will ever reach AMD or NV heights. Perhaps just a 2560x1440/120 or 4K/60+ card.
 
You have to walk before you run. Intel have found it no simple job to catch up to AMD and nvidia. We've only seen the start. With Arc going into their iGPUs from Meteor Lake onwards, more people than ever will be using it in some form.

Personally, while next gen should be an improvement, it's going to take more generations for them to get closer to the top end. Battlemage would have been largely designed while Alchemist was releasing, so Celestial may be the first gen incorporating hardware feedback from Alchemist. Battlemage still benefits from software experience over time, so it should do better on release.
 
Intel's expertise in iGPUs doesn't seem to have helped them much in developing a high-end discrete GPU. Was the A770 even competitive with the duopoly's mid-range offerings?
 
Was the A770 even competitive with the duopoly's mid-range offerings?
Most reviews have that information. :)
 
Intel's expertise in iGPUs doesn't seem to have helped them much in developing a high-end discrete GPU. Was the A770 even competitive with the duopoly's mid-range offerings?
Sadly, not really. It had some innovation and could have been competitive had it worked and shipped on time. If they had released it when they said they would, it would have landed right before the GPU drought caused by Covid in 2020.
 
I think, considering how forward-looking the card is by design, you can't really fault them. They have had to build "emulators", if you will, to run DX9/DX10 games, and Intel even admits it still has a lot of work to do on its drivers. Considering their mid-range card lands around a high-end 1080 Ti says something, imo. As we move away from DX9/DX10 games more and more, Intel is more on the right track focusing that way; NV and AMD already have the legacy DX9/DX10 support built in. Who is to say where the A770 will stand once the drivers are polished and all the popular games are on DX12?

One thing that really does hurt Arc is the need for ReBAR. That is why I am confused to see AMD setups with ReBAR (or their name for it, SAM) turned off. Would AMD's version of ReBAR even work with Arc? Even now, when I see X vs X vs X 20-game test videos on YouTube, they do not tell you if ReBAR is on. I hope with this next release of cards they can get it sorted, to where ReBAR adds to performance rather than being required for it; I mean, in some cases you get like half the FPS without it. ReBAR on Arc should be a boost to performance/FPS. The whole Arc ecosystem is about to expand even more with the new mobile CPUs sporting it.

I personally think people are a bit hard on Intel right now with Arc. It doesn't matter whether it was a standalone card or an iGPU; they would have had driver growing pains either way. There are always issues with architecture changes along the way, and drivers always need to be updated. Intel focuses on the popular titles for its drivers first, then works down to the others. I don't think it will take Intel long for its mid-range to catch up to or match the next mid-range cards. When Intel gets the drivers polished and the price/performance ratio beats either competitor, more people might buy Arc. I would have been on the early bandwagon for the A770, but I don't game as much as I used to, and the games I do play are older DX titles, so there is that. Intel's bout with standalone video cards this time is leaps above its i740 release back in the day.
 
I don't wish Intel any ill will with the Arc or Battlemage(?) series of discrete GPUs; I hope they do challenge the duopoly.

It is weird how Intel tied performance for their Arc discrete GPUs to having Resizable BAR/SAM, though. Do Intel's iGPUs rely on Resizable BAR/SAM?
 
It is weird how Intel tied perf. for their Arc discrete GPUs to having resizable BAR/SAM though. Do Intel's iGPUs rely on resizable bar/SAM?
Intel's, being a newer design, uses it to optimise data flow. At the opposite end, Nvidia's base design never relied on it, so they're less bothered. AMD is kinda in between, since they used it as a marketing/competitive advantage when they introduced it. I'm now curious: has anyone done SAM on/off testing for RDNA3 in recent games? Is there much fall-off from not having it?

Anyway, to the question at hand: the feature is about data transfers over PCIe. When you have a dGPU connected to the CPU, that's the path you use. With an iGPU, the data does not go over PCIe to get between the CPU cores and the GPU, so it is irrelevant.
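To make that a bit less abstract: Resizable BAR just lets the CPU map the card's whole VRAM through one large PCI BAR instead of the classic 256 MB window. Here's a minimal sketch, assuming Linux and its sysfs PCI layout (the device address in it is hypothetical; find yours with `lspci | grep -i vga`), that lists a GPU's BAR sizes so you can see whether the large mapping is actually in effect on your box:

```python
# Minimal sketch, assuming Linux sysfs. Each line of the "resource" file is
# "start end flags" in hex, one per PCI region. If the largest memory region
# is roughly the card's full VRAM (e.g. 16 GB) rather than 256 MB, Resizable
# BAR is in effect for that device.
from pathlib import Path

GPU_BDF = "0000:03:00.0"  # hypothetical PCI bus/device/function of the dGPU

def region_sizes(bdf: str) -> dict[int, int]:
    """Return {region index: size in bytes} for the device's populated regions."""
    sizes = {}
    lines = Path(f"/sys/bus/pci/devices/{bdf}/resource").read_text().splitlines()
    for idx, line in enumerate(lines):
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:                      # unused regions read back as zeros
            sizes[idx] = end - start + 1
    return sizes

if __name__ == "__main__":
    for idx, size in sorted(region_sizes(GPU_BDF).items()):
        print(f"region {idx}: {size / 2**20:8.0f} MiB")
```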
 
they would have still had driver growing pains
This argument always falls flat for me. Intel has a massive software team, they have to build drivers for iGPUs anyway, and ARC was already based on their datacenter accelerators, which already had drivers. Granted, those are not game drivers, but all the iGPU drivers are.

It's clear it was a hardware issue, not a driver issue, and they held off shipping too long because they could not fix it in software. All the cards were already manufactured and sitting in warehouses a year before they launched.
 
Drivers were certainly a part of it; just look at the significant increases since launch. You don't see that with NV/AMD drivers, which tend to launch a bit more 'mature' for discrete graphics and don't see the type of improvements Intel has (and more across the board).
 
Drivers were certainly a part of it, just looking at the significant increases from launch. You don't see that with NV/AMD drivers, which tend to launch a bit more 'mature' for discrete graphics.
Drivers were part of how they saved their ***, but they used DXVK to do all the translation for everything DX9 and down because performance was terrible.


So rather than write drivers, they just emulate/translate DX9. It's honestly smart, but don't kid yourself that Intel had a bigger lift; that's just apologizing for a bad product. Don't get me wrong, I really, really want them in the market and making discrete GPUs, and I have several times seriously thought about buying an ARC just for the AV1 encoding alone, but Intel is a company like any other and should be held to the same standard. They cannot release a crap, buggy, late
 
They cannot release a crap, buggy, late
I mean, true, but they all lay some eggs at some point. I'm simply saying that it's clear (to me) that the drivers weren't prime-time ready. So while there may be some hardware issues (?), software helped.
 
Drivers were part of how they saved their *** but they used DXVK to do all the translation for everything DX9 and down because performance was terrible.


So rather than write drivers they just emulate/translate DX9. Its honestly smart, but dont kid yourself that that Intel had a bigger lift, thats just apologizing for a bad product. Dont get me wrong I really really really want them in the market and making discrete GPUs, and I have several times seriously thought about buying an ARC just for the AV1 encoding alone, but Intel is a company like any other and should be held to the same standard. They cannot release a crap, buggy, late
I think what a lot of people fail to realize is that it's about the GPU design handling DX9/10. It would be the same case with 3dfx: you had better performance using Glide, or the Glide-to-OpenGL wrapper (as they called it then). Performance is also somewhat game-based as well; Diablo 2 was built around using Glide, but 3dfx went down fast, and the half-baked DX support they threw in there is why the game stutters when DX is used, even on today's PCs, in large fights with lots of effects. I wish I knew more about why the game does that under DX; it looks worse in software mode, yet it doesn't stutter in the same situations. What we are kind of comparing with Arc vs AMD/NV, though it's not the best way I can think of to put it, is like 32/64-bit CPUs: AMD/NV carry both, while Intel is only focusing on the future, aka 64-bit. An emulator in that case would be needed to run 32-bit programs, just like a 64-bit Windows install uses an emulation mode for 32-bit apps.

Also, to talk about drivers: if NV or AMD drivers were so perfect, then playing games on new cards would never have any glitches or artifacts. Even NV and AMD can optimize their drivers further to get more performance out of a given game. People forget how important driver development is for any device. If the argument is that NV/AMD are perfect, OK, then they should only need one driver release for the entire life of the card, Windows would never need updates because it should be perfect the day it ships on new PCs, and Tesla's in-car software should be perfect and never need OTA updates.
 
Technically, DXVK is a translation layer, like Rosetta for Apple's move to ARM. It's not an emulator; it just listens for the calls and changes them into the appropriate ones for the target architecture.

Also like how WINE is not a Windows emulator but a reverse-engineering of its system stack, so that when you make a Windows call it responds as though you're running a Windows computer, on the Linux kernel.
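Purely as an illustration of that distinction (none of these class or function names are real DXVK, D3D9, or Vulkan APIs; they're made up for the sketch), a translation layer takes each incoming legacy call and immediately re-issues it as the equivalent call in the target API, rather than simulating the old hardware or instruction set:

```python
# Toy illustration only -- these names are invented, not real DXVK/D3D9/Vulkan
# APIs. The point: a translation layer re-expresses each incoming call in the
# target API at call time; nothing is emulated instruction by instruction.

class FakeVulkanBackend:
    """Stand-in for the modern API the layer actually targets."""
    def draw_indexed(self, index_count: int, first_index: int) -> None:
        print(f"backend draw: {index_count} indices starting at {first_index}")

class FakeLegacyDevice:
    """Stand-in for the old interface a DX9-era game expects to call into."""
    def __init__(self, backend: FakeVulkanBackend) -> None:
        self.backend = backend

    def draw_indexed_primitive(self, prim_count: int, start_index: int) -> None:
        # Translate the legacy call's units (triangles) into the target API's
        # units (indices) and forward it straight away -- no emulation loop.
        self.backend.draw_indexed(index_count=prim_count * 3,
                                  first_index=start_index)

game_device = FakeLegacyDevice(FakeVulkanBackend())
game_device.draw_indexed_primitive(prim_count=2, start_index=0)  # old-style call
```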
 
also to talk about drivers, if nv or amd drivers were so perfect. then playing games with new cards would never have any glitches or artifacts. even NV and amd can optimize the drivers more to get more performance from X game. people forget how important driver development is for any device. the argument would be nv/amd are perfect, ok, so they should only need one driver release for the entire life of the card. windows would never need updates because it should be perfect when it starts to sell and come on pc's. telsa's software in the car should be perfect and never need OTA updates.
Agree. But this isn't about AMD or NV being perfect/better/worse. I'm simply saying drivers had a lot to do with the performance increase of Arc, contrary to what was stated earlier. For me, the telltale sign is the difference in performance over time (with drivers, no h/w changes), where Intel's gains were more substantial than AMD's or NV's. :)
 
@Evilsizer I had thought one of the reasons 3dfx failed was that they didn't want to do 32-bit color; they figured 16-bit was enough for everybody.
 