
Looking for new GPU


Dravenspur

Member
Joined
Dec 13, 2015
I’ve been using an RX 480 since it launched back in 2016. I like AMD and I have been using their cards for a long time (even before the 400 series). However, since I am running a 1080p display, I don’t need the 7900 series of cards. I have been waiting for them to release info on a midrange card in the 7000 series, but I haven’t seen anything.

Now I’m reading that AMD might not release a midrange GPU in the 7000 series at all. I haven’t seen anything official from AMD; this is speculation from other sites.

I really don’t want to switch to Nvidia to feed that machine, but the 4080 is a possibility. I like the idea of the A770 from Intel, but since no one knows about long-term support yet, I don’t know if I should go with it. I’ve read that Intel GPUs aren’t selling right now, although Intel has come out and said they’re committed to discrete GPUs and have more cards planned for the future.

For the first time in years, I’m not sure what to do with a GPU upgrade. I’d like the card to last a while like the 480 has. I guess spending over 1k for a card from Nvidia wouldn’t be bad if it lasts me around the same amount of time.

I’d like to continue playing games at high to ultra settings. I play a mix of games: Spider-Man, Midnight Suns, Gotham Knights (when I have a card that will run it), King Arthur: Knight’s Tale, and Cyberpunk (when I have a GPU that will work). I want to play Street Fighter when it comes out later this year.

Has anyone bought an Arc GPU? Should I just go with the 4080? Any input is appreciated.
 
If I remember correctly, the Arc A770 falls between the RTX 3060 and 3060 Ti in performance. I don't know if that's enough horsepower to push 1080p games at ultra.

The RTX 4080 makes no sense to me for 1080p, considering its performance is between the 7900 XT and XTX and it is more expensive than both. The RTX 4070 Ti would be a better choice from Nvidia than the 4080. Another option is the RX 6900 XT or RX 6950 XT at $650 to $750, which is pretty good value IMO. The RX 6800 XT is a good value too at ~$500; it's just hard to find them in stock.
 
Thanks for the reply. I thought about the rx 6000 series, but I saw a video (and checked out AMD’s driver page) and AMD hasn’t released drivers for the 6000 series for over a month. The video I watched from UFD Tech said they quit making drivers for the 6000 series to focus on 7000. The 4070ti would be fine, but I wanted the card to last a while, as I mentioned. I didn’t know if the current low end from Nvidia would last me 6 or 7 years. I know they may release the 4060 at some point, but I haven’t seen speculation on that yet. I will do more research into the 4070ti. It’s that, the 4080 or the A770.
 
The A770 isn't in the discussion if you're pooh-poohing a 4070 Ti for longevity, as the Intel card is considerably slower. A 4070 Ti is roughly a 3090 Ti.

The 4070 Ti would be plenty of card at 1080p for several years, and likely even at 2560x1440 if you decided to upgrade.

Note that a 4070 Ti is literally only a couple of percent slower than a 7900 XT, so don't dismiss the AMD card if a 4070 Ti is on the table... both will be plenty fast.

1080p needs CPU horsepower... you may want to consider an upgrade to a 5000 series CPU. You'd see a performance uptick there too, especially with a 4080 (and a 4070 Ti too... the 3090 Ti was limited a bit by the CPU at 1080p).
 
Thanks. I’m looking to build a whole new system. Intel 13700, 32GB DDR5, and a new video card. CPU shouldn’t be a problem.
 
Some leaks say the RTX 4060/Ti will have performance close to the RTX 3070, so not really high. I wanted to sell my RTX 3070 recently, but because of low auction prices, I decided to keep it; it's still good enough for what I need.

The A770 is already quite slow, as mentioned. You might only consider the A750 as a temporary card if you're undecided, have a lower budget, or want to wait for a newer GPU (a 4080 Ti/4090 Ti, or whatever gets released in the coming months). There was news in the last few days about price drops on A750 cards. It's probably still a bad idea, though; better to get an RTX 4070 Ti or RTX 4080.

If you want to keep it for longer, I would get at least an RTX 4070 Ti. I don't expect price drops anytime soon, especially since AMD isn't really pushing Radeon 7000 performance or prices, and their cards aren't selling well.
 
After watching a Moore's Law Is Dead Q&A video, I've made a new decision: I'm going with AMD or Nvidia. It turns out Arc may not even be supported in the near future, if what his video says is correct. I can't decide between the 7900 XT and the 4070 Ti.

I’m concerned about Nvidia’s 16 pin connector. I’ve never had a card that didn’t use the normal 8 pin connectors. I don’t think the connector will melt on a 4070 (I’ve only heard of that so far with the 4090), but has anyone had other issues with the connector?

I’m a little concerned about the VRAM too. The 4070 Ti, as we know, has 12 GB of GDDR6X, whereas the 7900 XT has 20 GB of GDDR6. Does anyone think the 4070 Ti they bought will become a problem sooner because the VRAM gets maxed out in games? System requirements do mention VRAM, but the numbers have been small so far (I’ve seen the 1060 6 GB listed a few times), so I guess 12 GB is plenty. I wonder why AMD would put 20 GB on a card. That seems like overkill. It looks good on the box, but I’ve been using an 8 GB card for years and haven’t had trouble maxing out VRAM.
 
There are no problems with the connector as long as it's properly seated. Don't bend the cable near the connector, or the pins may come loose and lose good contact; that's what causes the connector to heat up and melt.

Current graphics cards have better data compression than they did a few years ago, so we've moved from games that needed 6-8GB in the GTX 1000 era to more demanding games that still need 6-8GB. Barely anything needs more, so 12GB should be enough for a while.
You could argue that in 2-3 years there may be games that need 16GB+ at higher display settings and resolutions. True, but by then current graphics cards will probably be too slow to handle those settings and resolutions anyway.

If I'm right, Nvidia still has better texture compression than AMD (correct me if I'm wrong, as maybe something has changed), so it needs less VRAM for the same titles and settings.
I guess DLSS is the main thing that makes Nvidia the better option at the same price. Personally I like AMD more, but I don't think I'd pick it over Nvidia if I had to spend this much money nowadays. I've picked AMD a couple of times in the past for various reasons, like when I really wanted the R9 Nano. The last time I bought AMD was when an RX 6800 XT cost me not much more than an RTX 3070 (I was lucky with the first deliveries, as it was supposed to cost close to an RTX 3080). The RX 6800 XT will probably be good enough for the next 2 years. Right now I'm gaming on an RTX 3060, even though I use the RX 6800 XT and an RTX 4080 for tests.

I'm still wondering why anyone puts 20GB+ on graphics cards meant for gaming. 16GB is more than enough even for high-end cards, but the memory bus layout forces the use of more chips, so specific bus widths probably end up giving a higher total capacity than is actually required.
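To show what I mean about the bus layout dictating capacity: each GDDR6/6X chip connects over a 32-bit channel, so the bus width fixes the chip count, and the chip count times the per-chip density (commonly 2 GB these days; treat that as my assumption) fixes the total. A quick sketch:

```python
# Each GDDR6/6X chip occupies a 32-bit slice of the memory bus,
# so bus width determines chip count, and chip count * density
# determines total VRAM.
def vram_gb(bus_bits, gb_per_chip=2):
    chips = bus_bits // 32  # one chip per 32-bit channel
    return chips, chips * gb_per_chip

print(vram_gb(320))  # 7900 XT:  (10, 20) -> 10 chips, 20 GB
print(vram_gb(192))  # 4070 Ti:  (6, 12)  ->  6 chips, 12 GB
```

So a 320-bit card with 2 GB chips lands on 20 GB whether the games need it or not; the only alternatives are 10 GB (1 GB chips) or 40 GB (clamshell).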
 
MLID is quite divisive, and suffice to say I'm in the camp that is not a fan. Even with the bad financials Intel posted recently, they reiterated their support for Arc, so it has every chance of making it long term. Don't buy for the distant future; buy for what products offer today. Still, I wouldn't recommend Arc if it will be the only or main system, and it is safer to pick one of the more established players. I can't promise Arc will be around any more than I can say the same for AMD GPUs.

On VRAM, 12GB is a good point for a new build today. We are seeing games already that can't use their potential with 8GB cards. Some of the quantity might be a case of "more is better" marketing, when in practice it is more like enough or not enough. However even if you don't have enough to turn the settings right up, you can still continue with slightly reduced settings.

A case where VRAM quantity might matter more than gaming is video editing. I believe some popular ones can gobble it up if you throw on a lot of effects. But that's a specific niche.
 
After seeing more reviews of the 7900 xt and xtx and the driver issues AMD is having, I’m looking hard at Nvidia. My question is, how is the ray tracing performance of the 4070ti? Can someone who has a 4070ti let me know how the performance is? Should I look at a 4080?
 
RT performance is better than AMD's by far. Reviews detail the info you're looking for... it varies dramatically depending on the title, settings, and if DLSS is used.
 
Thanks. I’m having trouble deciding. I looked again at the 7900 xt and the performance was pretty close to the 4070 ti. I read with the driver update in January they fixed some things. I’m deciding between XFX or Sapphire Pulse 7900 xt and MSI or ASUS 4070ti. My question is, how much stock should I put in the PSU requirements?

The XFX says it needs an 800W PSU, but the Overclockers review used an EVGA 750W PSU. Would my 850W work with the XFX? Or should I look at Sapphire, MSI or ASUS which require a 750W PSU? I’d like to keep my current PSU if possible.
 
750W will be plenty for the XT and 4070ti. I'd like 850w for XTX or 4080+ (though I ran both on 750w).

I'd go Nvidia, especially if you're considering using RT in games.
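To put rough numbers on the PSU question, a back-of-envelope budget is just GPU board power plus CPU peak plus a buffer for everything else. The wattages below are ballpark assumptions (roughly 300W for a 7900 XT, 285W for a 4070 Ti, and ~250W peak for an i7-13700K class CPU), not official specs:

```python
# Rough worst-case system draw: GPU board power + CPU peak + ~100W
# for motherboard, RAM, drives, and fans (all figures are ballpark).
def system_draw(gpu_tbp, cpu_peak, rest=100):
    return gpu_tbp + cpu_peak + rest

for name, tbp in [("7900 XT", 300), ("RTX 4070 Ti", 285)]:
    total = system_draw(tbp, 250)
    print(f"{name}: ~{total}W peak -> {total / 850:.0%} of an 850W PSU")
```

Either way an 850W unit has comfortable headroom; the vendor "800W required" numbers bake in margin for cheap PSUs and transient spikes.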
 
I don't think I saw it, what resolution and framerate/refresh rate are you targeting?
 
Right now, I’m running 1080p 60 Hz. I realize that both the 7900 XT and 4070 TI are overkill for 1080p, but they are the same price currently (and cheaper in some cases) than the 6800 XT or the RTX 3000 series and are more forward looking with drivers than the 6000 or 3000 series. I’ve heard of Nvidia not making new drivers for older cards in the past and I’ve read AMD is doing it now as well with the 6000 series. The 4070 TI or 7900 XT would also allow me to upgrade to 1440p if I wanted and still have a great experience.

I think I will go with the 7900 XT. It beats the 4070 TI in rasterization. The ray tracing is better on the 4070 TI, but I’ve never had ray tracing and I don’t need it yet, I don’t think. I don’t know if ray tracing would be worth it unless I stepped up to a 4080, which I don’t need and is still severely overpriced.
 
The other thing that pushed me toward the 7900 XT is not the memory size (even though 20 GB is more than 12), it's the memory bus width. Nvidia is running 192 bits and AMD is running 320 bits. Nvidia uses GDDR6X vs GDDR6 for AMD, and I know 6X is faster and more efficient than 6, but the bus width has to count for something.
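Putting rough numbers on that: peak bandwidth is just bus width (in bytes) times the effective data rate. Using the commonly quoted rates of ~21 Gbps for the 4070 Ti's GDDR6X and ~20 Gbps for the 7900 XT's GDDR6 (treat those rates as my assumption):

```python
# Peak memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * effective data rate in Gbps
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(192, 21))  # 4070 Ti: 504.0 GB/s
print(bandwidth_gbs(320, 20))  # 7900 XT: 800.0 GB/s
```

So even with the faster GDDR6X, the wider bus gives the 7900 XT roughly 60% more raw bandwidth (the 4070 Ti leans on a big L2 cache to make up some of that).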
 
At 4K it's a few % difference for sure... at 1080p, it is what it is, right? I mean, WYSIWYG on benchmarks, bus widths be damned, lol. 4K widens the gap between the two for that reason, but otherwise...
 