
Did AMD/ATI ever take the gaming performance crown from Nvidia?


I had thought that at some point in the past either ATI or AMD took the gaming performance crown from Nvidia. I think it was a generation or so before Nvidia's Kepler series was released, but I'm really not sure it ever happened at all.
 
I'm not sure they ever did it in the AMD era. In the ATI era, I recall Nvidia's FX 5000 series was considered a dud, and ATI were ahead with whatever they had at the time.
 
Was that around the time of the 9800, 9800 Pro (that you could flash to a....), 9800 XT?

Not laser-cut hardware... crazy... lol.
 
I remember a time when price-to-performance was strongly in AMD/ATI's favor. I remember the ATI cards not being quite as fast, but undercutting Nvidia by a wide margin, and overclocking them would make up the difference... if they could.

I want to say that was around the HD 7870 / R9 270X era.

Nvidia has pulled this overpriced-video-card BS before, and like before, I'm thinking of going AMD again with the next-gen cards (8xxx or whatever).
 
Niku-Sama might have it right: it may be that AMD/ATI never took the performance crown from Nvidia but had the best price-to-performance ratio. For years all I could afford was ATI/AMD, and I never even bothered to look at what Nvidia had to offer. I seem to remember the performance delta between Nvidia and AMD increasing significantly when Nvidia released Kepler, because that's when I finally jumped ship.
 
but had the best price-to-performance ratio.
"If you're not first, you're last" - Reese Bobby

Jokes aside, I think TechPowerUp does $/FPS...
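For anyone curious, that metric is just price divided by average FPS; a minimal sketch with made-up numbers (not TechPowerUp's data):

Code:
# Dollars per frame: lower is better value.
# Prices and FPS figures here are made up for illustration.
cards = {
    "Card A": {"price_usd": 499, "avg_fps": 92},
    "Card B": {"price_usd": 649, "avg_fps": 108},
}

for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per FPS")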
 
I've done a bit of searching, and it sounds like they might have had an advantage around the HD 7000 to R9 200 era. This was when I was taking a break from keeping up with hardware, so I guess I missed that.

When talking about performance, I like to leave price out of it, since price is a choice whereas hardware is fixed. Also, perf/price of the GPU alone often fails to account for the rest of the system. If you're building a mid- to high-end system, an old GPU that is free can be worse value than paying for something with higher performance.
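A rough sketch of that point, with entirely hypothetical numbers, showing how a "free" GPU can still be the worse value once the rest of the build is counted:

Code:
# Whole-system $/FPS vs. GPU-only $/FPS (all numbers hypothetical).
rest_of_system = 900       # CPU, board, RAM, PSU, case, etc.

free_gpu_fps = 45          # an old GPU that costs nothing
paid_gpu_price = 400
paid_gpu_fps = 120

print("Free GPU:", round(rest_of_system / free_gpu_fps, 2), "$/FPS whole-system")
print("Paid GPU:", round((rest_of_system + paid_gpu_price) / paid_gpu_fps, 2), "$/FPS whole-system")
# The free GPU bottlenecks the money already spent on the rest of the build.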
 
My research seems to indicate the GTX 780 (which I bought way back when to replace my AMD 6970) beat out AMD's best: the 7970 GHz Edition.

This article even included 99th-percentile frame times (back in 2013, which surprised me):

https://www.techspot.com/review/675-nvidia-geforce-gtx-780/page3.html

Even more interesting was that the GTX 780 beat out all tested CrossFire setups (incl. the 7990) in terms of 99th-percentile frame times.
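For anyone unfamiliar with the metric: the 99th-percentile frame time is the time that 99% of frames come in under, so it captures stutter that averages hide. A minimal sketch with made-up sample data:

Code:
import math

# Made-up frame times in milliseconds; one stutter spike at 33.7 ms.
frame_times_ms = [14.2, 15.1, 13.8, 16.0, 15.5, 14.9, 33.7, 15.0, 14.4, 15.2]

def percentile(values, p):
    # Nearest-rank method: smallest value with at least p% of samples at or below it.
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

print("Average frame time:", sum(frame_times_ms) / len(frame_times_ms), "ms")
print("99th-percentile frame time:", percentile(frame_times_ms, 99), "ms")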
 
Weren't there at least rumors that the RTX 20-series Super cards were a response to the RX 5700 & RX 5700 XT beating the RTX 2060 & RTX 2070 in benchmarks? I wasn't paying close attention during that hardware cycle because I didn't have any software that justified an upgrade.
 
The last time I remember AMD beating Nvidia was the R9 290 series, especially when some cards could be modded to unlock extra shaders or something like that.
 
The last time I remember AMD beating Nvidia was the R9 290 series, especially when some cards could be modded to unlock extra shaders or something like that.
Yes, it looks like the R9 290X did briefly hold the performance crown over Nvidia's Kepler series, until Nvidia released the Titan Black, which beat out the R9 290X, particularly at 4K.
 
On £/performance, AMD are always better, especially if you're looking at 1440p. nVidia make quieter but more expensive GPUs. The 6900 XT is doing great.

RTX isn't full ray tracing like what's used in CGI movies...
 
RTX isn't full ray tracing like what's used in CGI movies...
While that's true, AMD is using the same or a similar process that also isn't full RT like what's used in CGI movies. I don't understand why you say this (so frequently, and often as a dig at NV when it's the same on both sides).
 
For most generations, ATI/AMD was trying to keep up and usually adjusted prices against Nvidia products released a bit earlier. The last few generations are not really cheaper. They're cheaper if you compare what is theoretically the same "target user shelf"... but Nvidia is faster anyway, so AMD ends up competing against lower-tier GPUs than it is supposed to. A couple of times, AMD had no real answer to Nvidia's top cards: they had no answer in the Vega or RX 5000 generations. I don't want to dig back into the old stuff, but even when Nvidia released the pretty bad FX 5000, it wasn't really slower than ATI's cards back then. It was just more expensive, so ATI had a great chance to win more gamers over to their products.
 
Path tracing and global illumination are pretty much there. They're computationally very expensive, so optimisations are still used to get us better-than-cinematic frame rates. Movies don't have to be rendered in real time, so they can turn the quality knob right up.
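The frame-budget arithmetic shows why: real-time rendering gets milliseconds per frame, while offline film rendering can take hours per frame. A quick sketch (the offline render time is an illustrative guess, not a figure from any studio):

Code:
# Real-time vs. offline render budgets (offline time is an illustrative guess).
realtime_fps = 60
realtime_budget_ms = 1000 / realtime_fps            # ~16.7 ms per frame

offline_hours_per_frame = 2                         # hypothetical film render time
offline_ms_per_frame = offline_hours_per_frame * 3600 * 1000

print(f"Real-time budget: {realtime_budget_ms:.1f} ms per frame")
print(f"Offline budget:   {offline_ms_per_frame:,} ms per frame "
      f"(~{offline_ms_per_frame / realtime_budget_ms:,.0f}x the compute time)")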
 
While that's true, AMD is using the same or a similar process that also isn't full RT like what's used in CGI movies. I don't understand why you say this (so frequently, and often as a dig at NV when it's the same on both sides).

At the moment it's a waste of resources, as its effect is subtle yet its hit on performance is massively unsubtle. As far as taking digs goes, well, nVidia have to look at their behaviour. If their customers are not happy, there is probably a valid reason. I've had a few nVidia-tech GTX cards (a 1070 most recently) & nForce chipsets...
 
At the moment it's a waste of resources, as its effect is subtle yet its hit on performance is massively unsubtle. As far as taking digs goes, well, nVidia have to look at their behaviour. If their customers are not happy, there is probably a valid reason. I've had a few nVidia-tech GTX cards (a 1070 most recently) & nForce chipsets...
By digs, I meant that you appear to shade NV for not doing "proper" RT, but AMD does the same thing... and AMD's performance when using it is even worse (their FSR implementation is also behind DLSS). But there are additional reasons as well (see Mack's post).

Well, the 1000 series was released almost 8 years ago, and I can't remember the last time an nForce chipset was used (2010-ish?). So... that speaks volumes about where you're coming from (dated).

What do you mean, their customers aren't happy? Nobody wants to pay AMD or NV prices for GPUs. Market share hasn't changed much in the dGPU market, with NV still dominating even with higher pricing in some brackets (as was mentioned, it also depends on what you're comparing it to and whether you only play raster games). I'm not defending NV so much as saying apply the same logic to all parties involved. In this case, the RT argument applies to both parties and doesn't show NV (or AMD) in a negative light. It is what it is.
 
For most generations, ATI/AMD was trying to keep up and usually adjusted prices against Nvidia products released a bit earlier. The last few generations are not really cheaper. They're cheaper if you compare what is theoretically the same "target user shelf"... but Nvidia is faster anyway, so AMD ends up competing against lower-tier GPUs than it is supposed to. A couple of times, AMD had no real answer to Nvidia's top cards: they had no answer in the Vega or RX 5000 generations. I don't want to dig back into the old stuff, but even when Nvidia released the pretty bad FX 5000, it wasn't really slower than ATI's cards back then. It was just more expensive, so ATI had a great chance to win more gamers over to their products.
AMD deliver at consumer price points. Who cares about the xx90 or Titan being the fastest card? Who cares about Rolls-Royce or Bentley? People look at a BMW 3 Series, a Mercedes C-Class, or an Audi A4 and see the best deal for their money.

I pushed the boat out on a 6900 XT last winter, but otherwise I would have been shopping around the £550 mark (the inflation-adjusted price of a 3Dfx Voodoo 1 Orchid Righteous 3D plus a 2D card)...
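The inflation adjustment itself is just a ratio of price indices; a minimal sketch, with placeholder index values rather than official CPI/RPI figures:

Code:
# Inflation-adjust a historical price: adjusted = old_price * (index_now / index_then).
# Index values are placeholders; plug in official CPI/RPI data for real numbers.
voodoo1_bundle_gbp_1996 = 300.0   # hypothetical 1996 price for the Voodoo 1 + 2D card
index_1996 = 72.0
index_now = 133.0

adjusted = voodoo1_bundle_gbp_1996 * (index_now / index_1996)
print(f"£{voodoo1_bundle_gbp_1996:.0f} in 1996 is roughly £{adjusted:.0f} today")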
 