
Is Ray Tracing here to stay?? PCGamer thinks so...

So... passing fad or here to stay?
Yeah, I think it's here to stay which is why there's such a push to make better upscaling and better frame gen. Once they've gone RT all the time the entry level segment is going to need playable & not ugly access to the party. Pretty sure that's not the case on previous gen AMD (7600) & nvidia (4060) cards.
 
Here to stay, ok-ish tech IMO, my issue with it is the execution and implementation, some games do it well, others should get the firing squad... And even the ones that do it well, half the time you can't tell the difference, so :shrug:
 
Yeah, I think it's going to stick around for a while. Games could go with 16K textures and a brazillion polygons to make models look better, but RT is the "next big lighting development".

(Personally I wish there was a push for a proper unified physics engine first, but that's just me.)
 
It isn't going away. It will only get better as game devs get more experience balancing quality and performance, and as hardware performance generally increases. At the risk of repeating myself, there are multiple levels of RT, and not all of them need to be used. Some of the lower-level ones, like RT shadows and some reflections, can help a lot without the perf impact that the likes of full path-traced global illumination has.

Looking at the Steam Hardware Survey for RT-capable GPUs, named models cover about 61% of users. That's all RTX GPUs, Arc, and RDNA2 onwards. GPUs that don't make the individual reporting cut would push that proportion even higher.
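For the curious, that tally works roughly like the sketch below: classify each listed model as RT-capable or not, then sum the shares. The share numbers here are made-up placeholders, not real survey data, and the model-name matching is a rough heuristic, not how Valve reports it.

```python
# Hedged sketch of tallying RT-capable share from a hardware survey.
# The share percentages below are illustrative placeholders, NOT real data.
survey_share = {            # model name -> % of surveyed users
    "NVIDIA GeForce RTX 3060": 4.5,
    "NVIDIA GeForce RTX 4060": 4.0,
    "NVIDIA GeForce GTX 1650": 3.0,   # Turing, but no RT cores
    "AMD Radeon RX 6700 XT": 1.0,
    "AMD Radeon RX 580": 0.8,         # pre-RDNA2, no hardware RT
    "Intel Arc A750": 0.2,
}

# Any RTX, any Arc, and (roughly) AMD RDNA2 or newer count as RT-capable.
RT_CAPABLE_PREFIXES = ("NVIDIA GeForce RTX", "Intel Arc")
RT_CAPABLE_AMD_GENS = ("RX 6", "RX 7", "RX 9")

def is_rt_capable(model: str) -> bool:
    if model.startswith(RT_CAPABLE_PREFIXES):
        return True
    return model.startswith("AMD Radeon") and any(g in model for g in RT_CAPABLE_AMD_GENS)

rt_share = sum(pct for m, pct in survey_share.items() if is_rt_capable(m))
print(f"RT-capable share of named models: {rt_share:.1f}%")
```

With real survey rows in `survey_share`, the same loop would reproduce the ~61% figure.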
 
RTX is an nVidia technology to turn a 5090 into a 1080p card :ROFLMAO:

Pick ultra at 1440p and it's a smidge over 60 FPS, with 1% lows below 60 FPS...

Upscaling and fake frames are for consoles. You don't buy a V12 and then have to bolt on a super/turbo-charger because it doesn't make enough power. You expect it to be naturally aspirated and rev to wherever it needs to make power, like the T.50's 12K revs: no turbo, no supercharger; that's the secret to its blistering throttle response. Upscaling and fake frames are not identical to having the raw performance. No replacement for displacement...


Seems more inevitable as the months and years go on. We're seeing more games where it's baked in/default enabled too.

So... passing fad or here to stay?

I've been saying it's here for realzz for years, not changing now!
TBH I don't have the time of day for it; best just to apply some AI to get the effect in post-processing, if it's not too taxing. VR trumps it hands-down. I'd rather spend the BHP on driving a screen for each eye. I think the Index 2, with its WiFi connection between the PC and headset, could be a game changer.

I'm not against nVidia or path-tracing at all, I even had an nForce motherboard, but a 5090 has to run at 1080p; they're not on a different planet, they're in a parallel universe...
 
I'm not against nVidia or path-tracing at all, I even had an nForce motherboard, but a 5090 has to run at 1080p; they're not on a different planet, they're in a parallel universe...
Read as... I'm not against Nvidia, I've used something of theirs from 15 years ago. Is that like the 'I'm not racist, I have [name something that isn't you] friends' line?? :rofl:

LOL, nothing changed since the last drive-by NV hate of yours, lol. Cyberpunk without DLSS is hard on GPUs. There are other titles like this too. Water is wet. It happens. There are plenty of other titles where it works fine without DLSS/MFG, even at 4K. But RT without it is still a GPU killer.

But, it's not going away... :beer:


that's the secret to its blistering throttle response.
..or get a better tuner. ;)
 
:cheers:
 
I see that AMD is catching up with all the RT and upscaling features. I also notice that in some games, both brands experience issues with colors and textures when upscaling is enabled.
Somehow AMD is far behind with everything for more professional usage. Most of the software is CUDA-optimized, and it runs much worse on AMD. Recently, I was trying to set up the environment for AMD AI. It's a pain to gather all the required software to even make it work, and AMD doesn't link anything directly in their official articles; you have to search some FAQs and forums. It's like when they say a product is for gaming, it's only for gaming. When they say something is designed for servers, then it's for servers, though it will probably work on desktop OS/drivers, etc. In the case of Nvidia or Intel, it works on everything.
RT is already in consoles and widely used in games, so I highly doubt it will disappear... or at least not any time soon.
 
TBH I worked on AI stuff and, as you say, CUDA is king. I remember some firm got CUDA to work on AMD chips, but nVidia got them to stop, as the license says you have to use it with nVidia hardware...
 
Of course it's a fad. Everything in 3D is a fad. Remember 3Dfx?

...I rest my case.

Someone will simply invent something that does the same thing without the performance hit and raytracing will be GONE... Instantly.

It happens all the time.

Always has. Ever since they first started selling graphics cards.
 
Hmmm, I think other approaches and algorithms will come about. It's like video compression: the idea came about around 1920, the MPEG-1 codec around the late '80s, and now there are plenty of other codecs. Better lighting is not a bad thing, it's just the cost. Imagine a game came out that required a 4090/5090 to play at 1080p in extreme mode. With ray/path-tracing you also need good HDR.

This video does a good job of showing the differences between raster/RT/PT: -
 
Hmmm I think other approaches and algorithms will come about.
Sure they will. Eventually. But let's not be obtuse here guys. We're not talking 'till the end of time. :)


Performance improvements are needed. And maybe DLSS/MFG isn't the way (and it has come a long way (xxx)), but aren't we all being led down the path by NV, AMD, and Intel? AMD and Intel are in on RT, trying desperately to catch up, even. Is there a sign of slowing down I'm not seeing?

Image quality isn't identical to raster, but in what seems like an overwhelming majority of cases (title- and setting-dependent, of course) it's not noticeably different or worse (especially in action; comparing stills alone isn't the way), or it's as good or better (rare). There are plenty of articles showing it's all of the above (sometimes worse, too, but that seems as rare as 'better').

Anyway, it seems clear (at least to me) that RT is here to stay for quite a while. There's nothing out there, not even a twinkle in the eye (actively competing), to supplant something that hasn't peaked yet. It has only been 7 years since we could use it (hardware RT on GPUs; obviously I say 'use' with a grain of salt).
 
Someone will simply invent something that does the same thing without the performance hit and raytracing will be GONE... Instantly.
"simply" :D RT will probably kill raster before anything else kills RT. I can imagine a time in the future where RT becomes fast enough that it is the standard. Raster gets relegated to legacy and retro games, in a similar way to how new pixel-art games still exist today. What is there even to replace RT? It's a more realistic simulation of how light works. Oh, right, there's AI. Why simulate the world when you can imagine it?
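Since "a more realistic simulation of how light works" sounds abstract, here's a minimal Python sketch of the core step every ray tracer is built on: intersect a ray with a sphere and shade the hit point with simple Lambertian (diffuse) lighting. The scene values (sphere position, light position) are made up for illustration; a real renderer fires millions of these rays per frame, with bounces.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along the ray to the nearest intersection, or None if missed."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest of the two roots
    return t if t > 0 else None

def shade(origin, direction, center, radius, light):
    """Brightness in [0, 1] at the first hit point, lit by one point light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0                        # background: no light contribution
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    to_light = [l - h for l, h in zip(light, hit)]
    norm = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / norm for x in to_light]
    # Lambert's cosine law: brightness proportional to max(0, N . L)
    return max(0.0, sum(n, ) if False else sum(n * l for n, l in zip(normal, to_light)))

# One ray fired straight down -z at a sphere 5 units away:
b = shade(origin=(0, 0, 0), direction=(0, 0, -1),
          center=(0, 0, -5), radius=1.0, light=(5, 5, 0))
print(round(b, 3))  # -> 0.492
```

Repeat that per pixel (and per bounce, for reflections and GI) and you have a path tracer; the cost the thread is arguing about comes from how many of those rays modern scenes need.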

Don't mix up the "what" with the "how". RT being the "what", and GPUs are the current "how". There may be better future implementations, but they will be getting to the same destination. For example, below is an implementation of Doom for a quantum computer. The computer that would run it natively doesn't exist yet, but it can be simulated by conventional computers. https://github.com/Lumorti/Quandoom

I did an exercise not that long ago where I posed the question: how far back would I have to go for my laptop to be equivalent to or faster than the top supercomputer of the time? With a big "depends on workload", in pure FP64 compute terms (not storage) we're only looking at 25-30 years. If you accept lower precision like FP32, that gap could be much shorter, since consumer GPUs are great at FP32 but haven't been at FP64 for many generations.
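That exercise is easy to redo as a back-of-envelope script. The TOP500 #1 figures below are approximate and from memory, and the laptop number (~2 TFLOPS FP64) is an assumed round figure, not a measurement; swap in your own machine's specs.

```python
# Rough sketch of the "laptop vs historical supercomputer" comparison.
# All figures are approximate; the laptop throughput is an assumption.
TOP500_NO1_TFLOPS = {      # year -> approx. #1 machine FP64 TFLOPS
    1993: 0.06,            # CM-5, ~60 GFLOPS
    1997: 1.1,             # ASCI Red, first past 1 TFLOPS
    2002: 35.9,            # Earth Simulator
    2008: 1026.0,          # Roadrunner, first past 1 PFLOPS
    2022: 1_102_000.0,     # Frontier, first past 1 EFLOPS
}

LAPTOP_TFLOPS = 2.0        # assumed FP64 throughput of a modern laptop

# Most recent listed year whose #1 machine the laptop would match or beat.
beaten = [year for year, tf in TOP500_NO1_TFLOPS.items() if LAPTOP_TFLOPS >= tf]
latest = max(beaten)
print(f"A ~{LAPTOP_TFLOPS} TFLOPS laptop matches the {latest} #1, "
      f"about {2025 - latest} years back")
```

With those numbers it lands on the late-90s machines, which is where the 25-30 year estimate comes from; an FP32 figure in the tens of TFLOPS would pull the crossover well into the 2000s.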
 