
RTX 4090, maybe not that expensive when compared to past halo video cards


magellan

The Asus Mars II (a dual-GPU GTX 580 card) retailed for $1,447.47 in 2011, which is $1,975.86 in 2023 dollars.

The famed, outrageously expensive and power-hungry Titan Z retailed for $2,999 in 2014, which is $3,889.76 in 2023 dollars!!!

The also power-hungry R9 295X2 was much more affordable in 2014, at an MSRP of $1,499, which is $1,944.23 in 2023 dollars.
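
For reference, those 2023 figures are just the CPI ratio applied to the original MSRP. A quick sketch of that arithmetic (the CPI values here are approximate annual averages I'm assuming, so the results land within a percent or two of the numbers above):

```python
# Inflation adjustment sketch: new_price = old_price * CPI_target / CPI_original.
# The CPI-U annual averages below are approximations, not the exact values the
# online calculators use, so expect small differences from the quoted figures.

cpi = {2011: 224.9, 2014: 236.7, 2023: 304.7}  # approximate US CPI-U annual averages

def in_2023_dollars(price: float, year: int) -> float:
    """Convert a historical USD price to approximate 2023 dollars."""
    return price * cpi[2023] / cpi[year]

for name, price, year in [("Asus Mars II", 1447.47, 2011),
                          ("Titan Z", 2999.00, 2014),
                          ("R9 295X2", 1499.00, 2014)]:
    print(f"{name}: ${price:,.2f} ({year}) -> ~${in_2023_dollars(price, year):,.2f} (2023)")
```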

I can't imagine Nvidia moved many units of the Titan Z back in the day. The Titan Z didn't have any of the functionality of Nvidia's Quadro or Titan series, did it?
 
I mean, that's true. But you're comparing dual-GPU cards, where the BOM is a lot more expensive than on any single-GPU card like we see today. I see you're comparing fastest with fastest, but... that's a tough sell as 🍎 to 🍎. :)

I have to admit, I miss my R9 295X2... and what was AMD's around that time... the water-cooled one...

Multi-GPU has been sunset. I can't imagine what it would be like to cool 2x of Ms. Lovelace, AD102 @ ~450 W each. Ha!
 
The 1080 Ti was top-of-the-line when it was released. It cost me ~$850 incl. 23% VAT, maybe two weeks after launch. Now, for the same price, even with inflation added, you can buy a mid-range card.
Because of BTC mining, vendors noticed that people would pay much more, and prices, which were bumped 100-500% during the mining-era shortage, stayed at the 200%+ level afterwards. This is the regular price right now if we compare the same product tiers.
The whole RTX line, even accounting for inflation, should cost about 50-60% less. AMD is always adjusting its price/performance to Nvidia; its prices are a bit better, but still way too high to be reasonable. People simply buy whatever is available, as prices won't get better anytime soon, but many buy budget cards because they can't afford anything better. We somehow laugh at RTX 4060/RX 7600 8GB performance in AAA titles, yet these cards sell the best. It's because of the stupid pricing.

Quadro is a different product line; it's not for gaming. A Titan hasn't been released in a long time, and it was an enthusiast gaming card, aimed at "elite enthusiasts" rather than mass sale. Nvidia also modified its drivers some years ago: everything you could do on a regular GeForce could be done on a Quadro with the proper driver. Now Quadro drivers are different, and to get the best performance in graphics design software you simply need a Quadro (even though, looking at specs, it's slower than some gaming GPUs).

When they release a special version like a Matrix, Lightning, or KPE, or one with a fancy liquid cooler, it costs 20-100% more. Dual-GPU cards were 50%+ more, but we haven't seen those for a long time. However, check the new Matrix price right now; it's way beyond ridiculous for what it gives you. I mean the user experience, not counting the fact that you simply own one. An Xbox or PS5 gives a better experience than a graphics card at the same price... and you still need the other PC components. This is how stupid current graphics card prices are: to enjoy the same titles you could play on a console, you pay at least twice as much.
 
I have to try that... The Asus ROG Strix 1080 Ti was £770 when I bought it in August 2017. Looking it up, I see it was released in March, so the price may have changed a bit between then and when I bought it. According to the Bank of England inflation calculator, that's +27.7% from 2017 to October this year (I can't be more specific than that), which puts the effective price today at £983. That's pretty close to the sale price of the ROG Strix 4070 Ti at £950, so we're comparing the same model, not just the cheapest card. The cheapest 4080 is a stretch at £1,140.

Inflation is an indicator, not a driver. It is a measure of price increases, but by itself isn't the cause. Official figures are based on a wide variety of products, and individual products can run much higher or lower than others. Some specific food items I've seen go up 3x since the lockdown era, far more than the average figure here.

I'm cautious about attributing current GPU pricing to the last crypto era, because general costs to make things really have increased everywhere. Here's an illustration: what was Nvidia's profit in 2017 versus today? Gross margin is a commonly published figure in financials, and it's roughly the selling price of a product relative to manufacturing costs only, i.e. component materials and the work done on them; R&D and other operating costs are not factored in. When I say "sell", that means what Nvidia sells at, not what a retailer sells at, as there are additional layers of profit on top outside of Nvidia.

Nvidia's gross margin was pretty stable around 59% in 2017. 2023 is more variable, with the four quarters at 57%, 56%, 65%, and 70%. It has spiked in recent quarters, but as a year average (62%) it is not much different from 2017, and that spike is likely due to the AI boom. We should also keep in mind that Nvidia's product mix has changed over time. I can't find source data right now, but in 2017 Nvidia was still primarily a gaming company. In the latest quarterly results, non-gaming is something like 6x the size of gaming, and it's likely that the increase in gross margin is largely driven outside gaming.
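
To spell out the gross-margin arithmetic (standard definition; the example revenue/cost numbers are made up, and only the quarterly percentages come from the post above):

```python
# Gross margin = (revenue - cost of goods sold) / revenue.
# R&D and other operating costs are deliberately excluded, as described above.

def gross_margin(revenue: float, cogs: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - cogs) / revenue

# Made-up example: a product sold (to partners, not retail) for $100 that costs $40 to build.
print(f"Example GM: {gross_margin(100, 40):.0%}")  # 60%

# Simple average of the quoted 2023 quarterly margins. A revenue-weighted average
# would be more accurate, but quarterly revenue isn't given here; this reproduces
# the ~62% year figure mentioned above.
quarters_2023 = [0.57, 0.56, 0.65, 0.70]
print(f"2023 average GM: {sum(quarters_2023) / len(quarters_2023):.0%}")  # 62%
```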
 
I bought my old Strix 980 Ti new for £600 back when it was king, which is actually less than my current 3070 cost at £650 (bought way before prices started dropping)...
 
I thought the Titan Xp was the halo model for the Pascal series? Its MSRP was $1,200 in 2017, which is $1,503.19 in today's dollars (according to https://www.usinflationcalculator.com/). That's still a lot cheaper than the 4090, though.

Does anyone here know anyone who bought a Titan Z back when they were the halo model for Nvidia?
 
I remember buying one of the 3dfx cards and a Pentium II all at once. At the time, the 3dfx was the baddest thing on the market and the Pentium II was by FAR the most powerful CPU. I couldn't have paid more than $299 for the graphics card in 199X money. And you can adjust for inflation all you want (maybe $500 in today's money?).

That said... in terms of the modern era... The low end certainly seems cheaper than it used to be.

I remember, quite vividly, not being able to afford a GTX 1060. Then, once COVID hit and Germany wanted to boost its economy by cutting VAT (plus I had like a 75 euro coupon), I ended up with a 2060 Super for like 200 bucks.

Funny how the world works...

Just sold it... for around 200 bucks. THREE YEARS LATER.

I can't imagine paying more than 300 euros for a graphics card.

My entire LIFE would have to be different...

I'd have to be a different person.

And even then... Once you start talking "A Thousand Bucks"... The first thing that would come to mind would be "GUITAR" or "EXPENSIVE LENSES" or "USED MOTORCYCLE" "PRIVATE DETECTIVE" "SARAH SILVERMAN'S PHONE NUMBER"

Not "Graphics card."
 
Up until 2023, I was using the 1080ti I had bought in 2018. I'm just glad I didn't have to upgrade my motherboard to use a 4090, because then I wouldn't have been able to upgrade anything. Usually I would upgrade my video card every two years, not every 5.
 
Up until 2023, I was using the 1080ti I had bought in 2018. I'm just glad I didn't have to upgrade my motherboard to use a 4090, because then I wouldn't have been able to upgrade anything. Usually I would upgrade my video card every two years, not every 5.

Kudos to you for still rocking the z390. People just don't understand how resilient the 9000 series CPUs are. I've been gaming in 4K for YEARS now.

I'd only get something like a 4090 if I actually knew where all my computer monitors were... :D
 
People just don't understand how resilient the 9000 series CPUs are. I've been gaming in 4K for YEARS now.
I understand why you feel that way gaming in 4K... the CPU doesn't play as significant a role as it does at lower resolutions. Playing at a lower res, where a majority still plays, that glass ceiling from the processor creeps lower fast.
 
I understand why you feel that way gaming in 4K... the CPU doesn't play as significant a role as it does at lower resolutions. Playing at a lower res, where a majority still plays, that glass ceiling from the processor creeps lower fast.

Funny how that works... I went down to 1440p in Cyberpunk to try out the new 4000 Series AI quality thing... Somehow the experience was better, in every conceivable way, without it and in 4K.

Haven't played since then but the next time I play I'm definitely switching back. It's more "cartoonish" with whatever that thing is...
 
Funny how that works... I went down to 1440p in Cyberpunk to try out the new 4000 Series AI quality thing... Somehow the experience was better, in every conceivable way, without it and in 4K.
If you're talking about DLSS... that makes sense and is what it's supposed to do. And that work (the AI stuff) is all done on the GPU.

Here are some charts to illustrate that point about the CPU in this game. Notice how many CPUs are so close together at 4K? Then, as the resolution goes down, the CPU hierarchy becomes blatantly obvious. A lot of CPUs are 'resilient' when gaming at 4K. :)

[Attached charts: CPU performance comparisons across resolutions]

 
Since Rainless' 9600KF only has 6 cores, could it potentially overclock significantly better than the rest of the Coffee Lake series?
 
Since Rainless' 9600KF only has 6 cores, could it potentially overclock significantly better than the rest of the Coffee Lake series?
Like was said/alluded to in the other thread, if the game uses more c/t than the CPU has, it's not going to matter much. It will help, but cores rule if they're needed.
 
The 9000-series CPUs were heavily binned. In short, the 9900K overclocked much better than the 9700K and 9600K, and the same went for the memory controllers. If you had a well-overclocking 9600K/9700K, you were really lucky.
 
hwbot.org results certainly support Woomack's comments (strangely enough, I couldn't find any 9600K parts among the top 24 Coffee Lake overclocking champs):
https://hwbot.org/benchmark/cpu_fre...Id=processor_5773&cores=8#start=0#interval=20

The highest-ranked Coffee Lake (a 9900K) hit 5.7 GHz @ 1.52 V, supposedly on air:
https://hwbot.org/submission/4000705_protoaus_cpu_frequency_core_i9_9900k_5700_mhz

"The Cooler is a TrueSpirit 140 Power and it was Thermal Grizzly Liquid Metal.Since it's just a frequency record pretty much just having a massive block of copper is the only thing that matters."
 
If you're talking about DLSS... that makes sense and is what it's supposed to do. And that work (the AI stuff) is all done on the GPU.

I was talking about DLSS Ray Reconstruction.

BTW, for those with RTX 4***, you should really enable DLSS Ray Reconstruction; the image quality improvements on anything below the Quality setting are really something interesting (not quite sure how Auto works)...

...I lose about 20% performance and the game looks like a cartoon. Without it... the game looks much more realistic and runs faster.


Here are some charts to illustrate that point about the CPU in this game. Notice how many CPUs are so close together at 4K? Then, as the resolution goes down, the CPU hierarchy becomes blatantly obvious. A lot of CPUs are 'resilient' when gaming at 4K. :)

That's CRAZY! There's only like a 5fps difference between a 5950X and my 9600K in 4k... and a THIRTY-TWO fps difference in 1080p!

So what the HELL do you need, in terms of CPU, to hit 4K Ultra then??!
 
I was talking about DLSS Ray Reconstruction.
Yep. That's the AI stuff that the GPU and tensor cores handle. ;)
...I lose about 20% performance and the game looks like a cartoon. Without it... the game looks much more realistic and runs faster.
LOL, maybe you changed something else too, I don't know. It's supposed to improve FPS, and the IQ is supposed to look similar or better (DLSS 3.0+ has done a much better job, it seems).....at least according to a slew of reviews. But beauty/realism is in the eye of the beholder.

What AI settings did you have already that were 'better in every conceivable way'? And to confirm, enabling the DLSS RR made it look like 'a cartoon'?

That's CRAZY! There's only like a 5fps difference between a 5950X and my 9600K in 4k... and a THIRTY-TWO fps difference in 1080p!


Edit: For sure. It's where the glass ceiling starts: ~10% slower (like a tier of GPU slower) than a slew of other CPUs that are all within margin of error of each other. In a game like this, at a resolution where every fps counts, you can see why the CPU is critical to reaching your GPU's potential. Clearly not every title or situation is like this, but 4c/8t and 6c/6t CPUs several generations old do present an artificial limit compared to more modern options. Playing games at 4K hides that a lot better than 1080p.
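
A crude way to picture that glass ceiling (a toy model with made-up numbers, not benchmark data): the frame rate you actually see is roughly capped by whichever of the CPU or GPU is slower, and the GPU's cap falls as resolution rises, so the CPU's cap only becomes visible at lower resolutions.

```python
# Toy bottleneck model: delivered fps is roughly min(CPU cap, GPU cap at that resolution).
# Every number here is invented purely to illustrate the shape of the charts above.

cpu_cap_fps = {"older 6c/6t": 95, "modern 8c/16t": 200}   # frames the CPU can prepare per second
gpu_cap_fps = {"1080p": 180, "1440p": 130, "4K": 70}      # frames the GPU can render per second

for res, gpu_fps in gpu_cap_fps.items():
    for cpu, cpu_fps in cpu_cap_fps.items():
        print(f"{res:>5} + {cpu:<14}: ~{min(cpu_fps, gpu_fps)} fps")

# At 4K both CPUs end up at ~70 fps (GPU-bound); at 1080p the older CPU's ~95 fps
# ceiling shows, while the faster CPU rides the GPU's ~180 fps cap.
```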

Worth noting, a lesser GPU (say an upper-budget-class 4060 Ti) changes these charts: at 4K the CPUs would look even more similar, and the 1080p results bunch up a bit more.

Really, what matters is whether you're hitting your fps and IQ goals. If you have a 4K/60 monitor, so long as you're reaching that, it doesn't matter. Same with 1080p at 165/240/360 Hz...

@Kenrou - what GPU are they using in that review?
 