
Is RTX A Con?


Haider

Member
Joined
Dec 20, 2012
My mate at uni used to do ray tracing, we're talking mid-90s. This is the kind of stuff you would see:
[Image: RayTracing.jpg]

I'm having a PC built for me. The guy building it runs a VR centre, with PCs running a mix of RTX 3080s & 3090s on Vive & Vive Pro headsets. I played Half-Life: Alyx on it, loved it, amazing experience. In his office he has an RTX 3090 PC which he plays on once customers are helmeted up and playing in the small cubicles. I asked him to put a ray-tracing game on; he put on Cyberpunk 2077 & Control and switched on RTX, but the graphics didn't become like the above. TBH I couldn't tell the difference between normal & RTX. I was expecting a HUUUGE difference, like the movie CGI effects in The Lord of the Rings or Gladiator - the shots of the Colosseum from the outside. Why wasn't it like that? The 3090 is a beefy card... It turned out to be a damp squib...
 
Lol, a con? No. Does it smack you in the face as significantly improved/different? Lol, no.

AMD 6000 series cards support the feature too, they just don't perform as well with it enabled. That said, the 7000 series should be notably improved in that department as well. There's also DLSS and FidelityFX (I think that's AMD's equivalent). Read some of our reviews... the 4080... the AMD 6000 series. They cover the technology.
 
Lol, a con? No. Does it smack you in the face as significantly improved/different? Lol, no.

AMD 6000 series cards support the feature too, they just don't perform as well with it enabled. That said, the 7000 series should be notably improved in that department as well. There's also DLSS and FidelityFX (I think that's AMD's equivalent). Read some of our reviews... the 4080... the AMD 6000 series. They cover the technology.
This is the thing: even going back to my Amiga A500 days, ray-traced graphics were like photos rather than computer graphics. TBH I was amazed by Half-Life: Alyx on the Vive Pro, and it wasn't even a v2 headset; it was just AWESOME. I have a PSVR and have even put it on the PC for racing games, but it is nowhere near as immersive as that. I couldn't tell the difference between the Vive Pro 1 and a good monitor.
 
Some early RTX games did put in "ooh, shiny" but maybe a touch too much. Also, there are many tricks built up over the years, like screen space reflections, that can give a similar illusion to what RT provides, just in a more limited fashion.
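To illustrate the limitation, here is a minimal toy sketch of the screen-space reflection idea in Python (my own example, not taken from any particular engine): march the reflected ray a few steps through the depth buffer and reuse whatever colour is already on screen at the hit point. If the thing being reflected isn't on screen, the trick has nothing to show, which is exactly where real RT pulls ahead.

import numpy as np

def ssr_trace(depth, origin, reflect_dir, steps=32, step_size=0.01):
    # Walk a reflected ray in screen space. depth is an HxW depth buffer;
    # origin and reflect_dir are (x, y, z) in normalized screen coordinates.
    h, w = depth.shape
    pos = np.array(origin, dtype=float)
    step = step_size * np.array(reflect_dir, dtype=float)
    for _ in range(steps):
        pos += step
        x, y = int(pos[0] * w), int(pos[1] * h)
        if not (0 <= x < w and 0 <= y < h):
            return None                # ray left the screen: nothing to reflect
        if pos[2] >= depth[y, x]:      # ray passed behind visible geometry: a "hit"
            return (x, y)              # reuse the colour already shaded there
    return None                        # no hit within the march budget

# e.g. ssr_trace(np.full((240, 320), 0.5), (0.5, 0.5, 0.1), (0.3, -0.2, 0.05))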

VR is a totally different topic. To me it's best seen as putting you in the action, rather than looking at it through a window. Games have to be made for it to really work well.
 
I'm not convinced it's true ray-tracing; the more I think about it, the more obvious it becomes. I messed around a bit with LightWave 3D, and it would take ages, like 7 hours, to render a simple image. I have the LotR extended Blu-ray and remember one of the extras was about the CGI. It took them nearly a week on a render farm to render a clip of one of the CGI scenes.

You basically have to follow the paths of millions of light rays coming from each light source (you can have multiple) and compute how they interact with the properties of each surface they come in contact with, which can be an enormous amount of work, as each light ray can bounce off multiple surfaces, get refracted, etc... I don't think even a 4090 is capable of that. Look at a scene from Cyberpunk with neon lighting and different surfaces with different properties; there's no way that could be done at 30 FPS...
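To put rough numbers on the gap between an offline render and what a GPU could plausibly trace at 30 FPS (the per-pixel budget figures below are my own assumptions, just to show the scale):

# Back-of-envelope ray counts; the per-pixel budgets are assumed figures.
width, height = 1920, 1080                # 1080p frame
offline_spp, offline_bounces = 1000, 8    # film-style path-tracing budget
realtime_spp, realtime_bounces = 2, 2     # typical real-time budget + denoiser

offline_rays = width * height * offline_spp * offline_bounces     # ~16.6 billion
realtime_rays = width * height * realtime_spp * realtime_bounces  # ~8.3 million

print(f"offline:  {offline_rays / 1e9:.1f}G rays per frame")
print(f"realtime: {realtime_rays / 1e6:.1f}M rays per frame, "
      f"{30 * realtime_rays / 1e9:.2f}G rays/s at 30 FPS")

The real-time figure is roughly what shipping RTX titles budget for: rays traced from the camera rather than from the lights, a couple per pixel, a couple of bounces, and a denoiser cleaning up the noise. That is a big part of why it doesn't look like a week-long render-farm job.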
 
Yea, I was really expecting more, like this.

There are RT demos from back in the day that Intel had their hands in. OpenGL games are easier to implement that kind of RT on compared to what is out there now; I can't recall what they call the other engines, but they are harder to implement RT on. Even in some of the games that really wanted to showcase RT on RTX, I could see a difference, but not enough to make me really want to get in on it. Back when Intel did it, they did say that doubling the cores increased the frame rate, and that FPS was not all that affected by resolution. Consider that the RT demos Intel was doing back in the day were all done on the CPU; I think the NV GPU released in 2008 was the GeForce GTX 280.

I still wish the guy that programmed and set up the RT Q3A demo would do the full version of it; I still have the disc around somewhere. I did enjoy that game even if it was multiplayer only, it was fun against AI. I never measured up to the people going full balls-to-the-wall competitive in that game. What killed me though was watching those guys running SLI Voodoo 2s with the lowest graphics settings on. This was back when my friend had regular weekend LAN parties.
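The core-scaling part makes sense once you remember that every pixel's ray is an independent job. A minimal toy sketch in Python (my own, assuming nothing about how Intel's demos were actually written):

from concurrent.futures import ProcessPoolExecutor
import math

WIDTH, HEIGHT = 320, 240  # small frame for the demo

def trace_pixel(idx):
    # Stand-in for tracing one primary ray; a real tracer would intersect
    # scene geometry here. No pixel depends on any other pixel, so the
    # work splits cleanly across however many cores you have.
    x, y = idx % WIDTH, idx // WIDTH
    return int(255 * (math.sin(x * 0.05) * math.cos(y * 0.05) * 0.5 + 0.5))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # one worker per CPU core by default
        frame = list(pool.map(trace_pixel, range(WIDTH * HEIGHT), chunksize=WIDTH))
    print(f"traced {len(frame)} pixels")

Double the cores and the map finishes in roughly half the time, which lines up with what Intel was reporting.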

While this is a current project of RT in Q3A, it does not look anything like the video I posted.

A few other links from the past:

A lot of the older articles I can't find anymore, since they've been buried by newer RT articles about RTX. There were some videos posted in 2008, but I noticed some pictures from those in the 2004 articles I linked to. Intel had a plan for Larrabee graphics, which I think is why they pushed RT-based games. It was said to have 1000 cores, but they would only be running at 1 GHz each at the time. It never released for us to use in gaming. Some of the architecture got changed and then put to work in server-type environments; I don't recall what function they served back then.
 

This is a good watch to show the same scenes with and without RTX on. It's definitely making me wait for the Ada Lovelace or RDNA3 architecture. The first language I programmed in commercially was Ada, and I don't live too far from her ancestral home; she was the first computer programmer. Hope nVidia sort their arse from their elbow over this...
 
Early in that video, around 1:50, is the point people made in the past about the whole thing of using raster engines and adding RT on top. On the other hand, take Doom/Doom Eternal: implementing RT there is easier. I would really love to see id do an update to the new Doom games adding RT to them. As for your comment, "Hope nVidia sort their arse from their elbow over this...", that is not something NV has control over. NV, just like AMD, goes "hey game devs, we have the ability for you to use RT on our video cards". How it gets implemented and on what game engine makes a difference with these raster engines.
 
What does this mean??? What's wrong??!
They are on the road to alienating their customers. The RTX 4080 12GB debacle is the latest example. It reminds me of the spivs/East End barrow boys trying to pass off fruit and veg as 'organic' when they're not...
 
I remember Project Larrabee; the interesting bit:
'AMD's reaction: We'll merge with nVidia – a marriage that never happened
AMD's reaction to Intel's split to CPU and cGPU and future fusion parts was quite simple: Hector J. Ruiz and his executive team began to discuss a merger with nVidia, which ultimately fell through in the second half of 2005. AMD knew nVidia's roadmaps just like nVidia knew AMD's, thanks to the now-defunct SNAP [Strategic nVidia AMD Partnership], formed in order to get the contract for the first Xbox. A few weeks after those negotiations went under [Jen-Hsun's major and unbreakable requirement was a CEO position, which Hector refused], AMD started to talk about the acquisition of ATI which ultimately became a reality a few months down the line [July 2006].'

 
Well, that article has nothing to do with RT in gaming.
This has been tough to follow....

They are on the road to alienating their customers. The RTX 4080 12GB debacle is the latest example. It reminds me of the spivs/East End barrow boys trying to pass off fruit and veg as 'organic' when they're not...
I mean, it's a turnoff... that poor labeling... but it has nothing to do with RT...

So confused, lol.
 
Yea, I'll stick with my point about RT and that's it: real-time RT is possible. It comes down to the use of raster-based game engines, that's it; any RT used on them is slapped on / a band-aid IMO.
 
Not sure what this thread was about. It started as drivel.... and only got worse. :p

Feels close to closing...........
 
@Haider and @Evilsizer - Does Portal count as a non-rasterized RT game?

 
I agree it's a waste of money, *for me*. I also have not played RTX, but from what I've watched of it, it doesn't seem worth it. But that's how most technical advancements begin. IMHO, as much as graphics have improved over the years, I don't feel like there is a huge bump in *realism* from, say, CoD Modern Warfare 2 (2009) to CoD Vanguard (2021). Sure it looks better, and sure I run one maxed out and the other not (and neither has RTX), but I don't have the experience of feeling like one is more real than the other, if that makes sense. Eye candy is nice, but a lot of things I could buy with that money are more important to me. I still feel fortunate to have as nice of a system as I do, even though it's antiquated by today's standards.
 