> ARC is effectively dead as a discrete high end GPU.

It was never alive as one. The new one, according to that roadmap, will have some kind of 'high-end' (read: enthusiast-class) GPU. So, it's being born next-gen... maybe, lol. But I don't think it will ever reach AMD or NV heights. Just perhaps a 2560x1440/120 or 4K/60+ card.
> Was the A770 even competitive with the duopoly's mid-range offerings?

Most reviews have that information.
> Intel's expertise in iGPUs doesn't seem to have helped them much in developing a high-end discrete GPU. Was the A770 even competitive with the duopoly's mid-range offerings?

Sadly, not really. It had some innovation and could have been competitive had it worked and shipped on time. Had they released when they said they would, it would have landed right before the GPU drought caused by COVID in 2020.
> Do Intel's iGPUs rely on resizable BAR/SAM?

Anything before Arc does not use ReBAR/SAM.
> It is weird how Intel tied performance for their Arc discrete GPUs to having resizable BAR/SAM, though. Do Intel's iGPUs rely on resizable BAR/SAM?

Intel's architecture, being a new clean-sheet design, uses it to optimise data flow. At the opposite end, Nvidia's designs never relied on it, so they're less bothered. AMD is kind of in between, since they used it as a marketing competitive advantage when they introduced it. I'm now curious: has anyone done SAM on/off testing for RDNA3 in recent games? Is there much fall-off from not having it?
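For anyone who wants to run that kind of SAM on/off comparison, it helps to first confirm Resizable BAR is actually active. A minimal sketch for Linux, assuming the `pciutils` tools are installed (the PCI address `03:00.0` is a placeholder for your own GPU):

```shell
# Find your GPU's PCI address first (03:00.0 below is a placeholder).
lspci | grep -i vga

# Dump the device capabilities and look for the Resizable BAR entries.
# With ReBAR active you should see a "Physical Resizable BAR" capability
# and a BAR whose current size matches the card's full VRAM
# (e.g. "BAR 0: current size: 8GB") rather than the legacy 256MB window.
sudo lspci -vvv -s 03:00.0 | grep -iA2 "Resizable BAR"
```

If the grep returns nothing, the card or platform either doesn't expose the capability or it's disabled in firmware, and any SAM on/off numbers you collect would be comparing "off" against "off".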
> They would have still had driver growing pains.

This argument always falls flat to me. Intel has a massive software team, they have to build drivers for iGPUs anyway, and Arc was already based on their datacenter accelerators, which already had drivers. Granted, those are not game drivers, but all the iGPU drivers are.
> Drivers were certainly a part of it, just looking at the significant increases from launch. You don't see that with NV/AMD drivers, which tend to launch a bit more 'mature' for discrete graphics.

Drivers were part of how they saved their ***, but they used DXVK to do all the translation for everything DX9 and down because performance was terrible.
> They cannot release a crap, buggy, late

I mean, true, but they all lay some eggs at some point. I'm simply saying that it's clear (to me) that the drivers weren't prime-time ready. So while there may be some hardware issues (?), software helped.
> Drivers were part of how they saved their *** but they used DXVK to do all the translation for everything DX9 and down because performance was terrible.

I think what a lot of people fail to realize is that it's about the GPU design handling DX9/10. It was the same case with 3dfx: games had better performance using Glide, or a Glide-to-OpenGL wrapper (as they called it then). Performance is also somewhat game-based. Diablo 2 was built around Glide, but 3dfx went down fast, and the half-baked DX support they threw in is why the game stutters when DX is used, even on today's PCs, in large fights with lots of effects. I wish I knew more about why the game does that under DX; in software mode it looks worse but does not stutter in the same situations. The best way I can think of to put what we're comparing with Arc versus AMD/NV: AMD/NV are like a 32/64-bit CPU, while Intel is one that only focuses on the future, i.e. 64-bit. An emulator would be needed to run 32-bit programs, just like a 64-bit Windows install uses an emulation layer for 32-bit apps.
Intel Arc Driver Optimizations Leverage Valve's DXVK Translator
"That's one way to improve performance." (www.tomshardware.com)
So rather than write drivers, they just emulate/translate DX9. It's honestly smart, but don't kid yourself that Intel had a bigger lift; that's just apologizing for a bad product. Don't get me wrong, I really, really, really want them in the market and making discrete GPUs, and I have several times seriously thought about buying an Arc just for the AV1 encoding alone, but Intel is a company like any other and should be held to the same standard. They cannot release a crap, buggy, late
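As a concrete illustration of the DXVK route being discussed, here is a hedged sketch of running a DX9-era title through DXVK under Wine with the overlay enabled, so you can see the translation layer doing the work. This assumes Wine and DXVK are already installed; `Game.exe` is a placeholder, not a real title:

```shell
# DXVK_HUD draws an in-game overlay showing the active Vulkan device and
# frame rate, confirming D3D9 calls are going through DXVK's Vulkan path.
export DXVK_HUD=devinfo,fps

# Raise DXVK's log verbosity so translation activity shows up on stderr.
export DXVK_LOG_LEVEL=info

# Launch the game under Wine (Game.exe is a placeholder DX9 title).
wine Game.exe
```

If the HUD appears, the game's D3D calls are being translated to Vulkan rather than hitting the driver's native D3D9 path, which is exactly what Intel leaned on for pre-DX11 titles.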
> Also, to talk about drivers: if NV or AMD drivers were so perfect, then playing games with new cards would never have any glitches or artifacts. Even NV and AMD can optimize their drivers further to get more performance from a given game. People forget how important driver development is for any device. If the argument is that NV/AMD are perfect, OK, then they should only need one driver release for the entire life of the card, Windows would never need updates because it should be perfect when it ships, and Tesla's in-car software should be perfect and never need OTA updates.

Agree. But this isn't about AMD or NV being perfect/better/worse. I'm simply saying drivers had a lot to do with the performance increase of Arc, contrary to what was stated earlier. For me, the telltale sign is the difference in performance over time (with drivers, no hardware changes), where Intel's gains were more substantial than AMD's or NV's.