
Is Ray Tracing Good? [HWUB] - Comparing Raster vs Ray Trace in various games


Janus67

Benching Team Leader
Joined: May 29, 2005

It's a long video (warning @EarthDog lol)

You can skip to the 30-minute mark to see what they deem to be poor to good utilization of Ray Tracing in games.

But man, this further shows that I really do not get the point. There are some cases where it looks nice, but when asked to 'spot the ray-traced game' I could flip a coin to guess.

I just honestly don't see the point in the vast majority of situations. If I were watching a movie/cutscene, sure, extra lighting or reflections may look nice. But who is playing a game like Spiderman or Cyberpunk and sitting there saying 'wow, the floor is like a mirror' [while simultaneously asking... why is every floor wet/like a mirror?] instead of moving on to the next objective or fast-traveling somewhere? Especially given the performance hit required to run the games at the highest settings. Maybe with the 4090 [which I don't have] or the upcoming 5xxx series it will be even less of a hit, but if given the choice of 4K high/ultra at a steady 120fps or ~60 with uneven FPS because of RT, I would basically never pick the worse-performing settings.
 
I agree with your assessment. But you can get your FPS back with DLSS and FSR... then there is a chance the IQ is lower (or higher, depends, LOL). But that's the answer to the FPS.

It's funny now, because in some games (looking at you, F1 2x) RT is enabled by default. So you actually have to make an effort to disable it (or suffer the FPS loss). I'd imagine we're only going to see more of that in the future... especially as all camps involved get better at its implementation. The momentum in games supporting the feature is not dwindling.
 

Yep, that's true. I really like what DLSS/FSR can do, but I find it annoying that game devs/companies have become dependent on it being required to make a game's frame rate playable.

IIRC Star Wars Outlaws, or whatever it's called, is also RT-always-on, and hacking it off looks horrendous.
 
Didn't watch the video but skimmed to the table.

From what I recall, SOTTR was one of the first titles using RT in any form, and it only used it for shadows, so its impact is very limited in that sense. RT can be so much more than that. CP2077 overdrive is the dream. Has anything else come close in the year or so since that came out?

Part of the problem is that devs have had a very long time working with raster and know a lot of tricks to make it look nice in the best case. Unless you make a game RT-only, you have to support that fallback path too (a rough sketch of what that gating looks like is below). There was an argument going around that RT-only can reduce dev effort too, since you don't have to bake in fake lights everywhere and actual lights will just work. Screen-space reflections need to go away as soon as possible, as they are very noticeable in raster games.
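(Illustrative aside: here is a minimal C++ sketch, assuming a D3D12/DXR renderer, of how that fallback gating might look: query the raytracing tier and drop back to screen-space reflections when DXR isn't there or the user turned RT off. The ReflectionPath enum and chooseReflectionPath helper are made up for the example; only the feature-support query is the real D3D12 API.)

[CODE]
#include <windows.h>
#include <d3d12.h>

// Hypothetical reflection-path selector for the example.
enum class ReflectionPath { ScreenSpace, RayTraced };

// Query DXR support and honor the user's RT toggle; otherwise keep the
// raster fallback (screen-space reflections).
ReflectionPath chooseReflectionPath(ID3D12Device* device, bool userEnabledRT)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool dxrSupported =
        SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    return (dxrSupported && userEnabledRT) ? ReflectionPath::RayTraced
                                           : ReflectionPath::ScreenSpace;
}
[/CODE]

The point being: the SSR path doesn't disappear just because RT exists, so someone still has to maintain and art-direct both, which is exactly the extra dev effort being talked about.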

The fun part is, I don't even play any RT games. As in, none of the games I play regularly support it. But I really want it.

And on the perf argument: it isn't a direct linear relationship between fps and a better experience. For example, if a game ran at anywhere from 80fps to 160fps depending on practical settings, I personally wouldn't gain anything from the 160fps end. Those are wasted frames to me. I'd rather have higher quality at 80fps. Depending on the severity of the RT, it can dip below my comfort zone. I partially blame AMD for holding back gaming in that area due to their weak RT performance in the console space, making cross-platform games suffer for it.
 
Personally, apart from tech demos like RT Quake/Cyberpunk or a few choice indie games (whose implementations put AAA games to shame), I don't really see any game on the market today that I would play with RT enabled, suffer the FPS loss, and call it a win (even with DLSS) 🤷🏻‍♂️

Reflections are nice, but who really looks at them when you're getting shot at or driving at high speed? Shadows/lights can be REALLY nice when done right (which most games don't), but you need to be playing slow horror games to really take in the scenery and enjoy it, and there's precious few of those. And let's not forget you need a beast of a GPU to pull it off with decent FPS; for my eyes, 60fps should be classed as the bare acceptable minimum, 85-90fps as passable fluidity, and 120fps+ as optimal.

With the way things are going with games made with UE5...
 
I think this shows the difference between gamers. You look at games you like; I look at games I like. There is a very wide gamut of game types. I don't tend to play high-intensity action games, so I do have time to look at graphics, and triple-digit fps doesn't add any advantage for me. I do agree that 60 should be a target minimum, though. Games targeting 30fps can burn in hell, but that is more of a problem for console gamers.

Similarly, we can judge different aspects of image quality differently. There seems to be a growing complaint about modern games looking "soft", but I don't see that myself, and maybe there's a display-resolution factor in play here. I'm very sensitive to excess motion blur, which shouldn't be taken to mean I don't like motion blur at all. It's just that in most games the default is set way too strong compared to "cinematic" standards, which are basic knowledge in the videography industry. In short, if you can consciously see it, as opposed to just feeling it, it is probably too strong. I keep thinking maybe I should make a YT video on the topic, but it would be very time consuming to gather the evidence and make the measurements manually in a variety of games to give sufficient proof. BTW, as fps increases, the correct amount of motion blur decreases. Beyond a certain point you might as well not bother.
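(Quick aside on that last point: a minimal C++ sketch of the 180-degree shutter rule of thumb from videography, which is one common way to quantify "correct" blur; the blur window is half the frame time, so it shrinks as fps rises. The function name is just for the example.)

[CODE]
#include <cstdio>
#include <initializer_list>

// 180-degree shutter rule of thumb: the virtual shutter is open for half of
// each frame, so the per-frame blur window is 0.5 / fps seconds.
double blurWindowSeconds(double fps)
{
    return 0.5 / fps;
}

int main()
{
    for (double fps : {24.0, 60.0, 120.0})
        std::printf("%3.0f fps -> %.4f s of blur per frame\n",
                    fps, blurWindowSeconds(fps));
    return 0;
}
[/CODE]

At 24 fps that works out to roughly 21 ms of blur per frame, while at 120 fps it's closer to 4 ms, which lines up with the point that at high frame rates you might as well not bother.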
 
While I agree that a poor implementation makes it a useless feature, when it is implemented correctly RT is worth it, imo.

I bought a 2080 Ti and a 3080 Ti, both times hoping to get the best out of ray tracing. I'll get a 5090 just so I can use the RT and path-tracing features when they are implemented right.

I agree on the point that designers have to incorporate both rasterization and RT to satisfy the base, or at least to tick a marketing checklist. In this case, AAA and AA studios have shown that rasterization is still favored, probably because they know their target audience will not use RT too often.

With all that said, I think that gaming tools will evolve to where RT/PT/Lumen are the norm. It may take a few more generations to get there, and it will require an open standard for GPU compute so that everyone can agree on a pipeline format, but I think it will become the norm. It's just like when shaders were first introduced and gaming consoles had to help lead the adoption.
 
@mackerel Motion blur/depth of field/chromatic aberration are settings I always turn off by default, motion blur in particular; I never liked the effect or how devs use it to make animations look smoother (it looks especially bad on VA panels like mine in darker scenes). I always liked sharp images, but I think that came from playing too many games with bad or no antialiasing, like the original Mass Effect port, which I had to crank to 4K and inject SMAA into to (almost) remove all the jaggies. Oddly enough, I have zero issues with DOS pixelization or the "new" pixel JRPGs like Octopath or the FF remakes 🤷🏻‍♂️

@Dolk It most likely will become the norm if new tech isn't discovered in the meantime, but like I alluded to in my first post, right now we need a lot more horsepower AND a good graphics engine to make it work properly... Or everyone just starts using the same one, like the DOOM engine, for example?
 
Speaking of RT...

"Ray Tracing: Is The Performance Hit Worth It?"

23:40 - Overall Ray Tracing Performance Cost on GeForce
26:07 - Overall Ray Tracing Performance Cost on Radeon
27:44 - GeForce RTX 4090 vs Radeon RX 7900 XTX in Ray Tracing
29:36 - Is This Due to Game Sponsorship?
32:01 - Final Thoughts: Is the Performance Hit Worth It?

 
Obviously it looks like they went for the flagships, but absent further data I wonder if that could skew the results in some way.
 
I'd guess the story/conclusion wouldn't change much outside of the actual in-game FPS. I'd imagine the difference in the RT hit each takes would be similar, but the gap would be a little smaller.
 