
CoD MW Remastered Ray-tracing


Haider

Hi,

Just bought Modern Warfare Remastered in the Steam sale. I'm trying to switch on ray-tracing but can't find any option under graphics. System: MSI Tomahawk, MSI RX 6900 XT Gaming X Trio, 5800X3D, 16 GB DDR4; drivers are up to date... Can anybody help?

Thanks
Haider
 
This is what I pulled off the web...
Modern Warfare only has a single setting for its ray tracing support, and you'll find it buried deep in the Shadow & Lighting section of the main Graphics menu.
[Attached screenshot: Call-of-Duty-Modern-Warfare-RTX-setting.png]

EDIT: The screen literally says "THIS FEATURE REQUIRES SPECIFIC NVIDIA HARDWARE", so I'm not sure your AMD card supports it(?). That said, I thought DXR was DXR and it just needs to be supported in the driver for the game to use it. But that may be your problem(?). Have you tried deleting and reinstalling the game? Verifying the files in Steam?
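For what it's worth, DXR support is something any D3D12 app can query at runtime, and it isn't vendor-specific at the API level. Below is a minimal sketch of my own (not from the game, file name and output are just illustrative) that asks the driver which ray-tracing tier it reports; if the 6900 XT reports a tier here, the "NVIDIA hardware" message is the game gating the option rather than the card lacking DXR.

[CODE]
// Hypothetical standalone check (not from the game): ask D3D12 which DXR tier
// the installed driver reports. Build (MSVC): cl /EHsc dxr_check.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Default adapter, feature level 12_0 (roughly what DXR titles target).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // OPTIONS5 carries the ray-tracing tier (10 = Tier 1.0, 11 = Tier 1.1).
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR reported by the driver (tier enum value %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    }
    else
    {
        std::printf("D3D12 is fine, but the driver reports no DXR tier.\n");
    }
    return 0;
}
[/CODE]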
 
Same here; I thought as long as your card supports the DirectX/Direct3D feature you should be fine. I'll get home and give it a butcher's... TBH I'm not that bothered about it; from what I have read even a 4070 struggles with ray-tracing in Cyberpunk. Hopefully nVidia will get it working on mid/high-end cards like the 4070 in time for the Star Citizen release :!)
 
from what I have read even a 4070 struggles with ray-tracing in Cyberpunk. Hopefully nVidia will get it working on mid/high-end cards like the 4070 in time for the Star Citizen release :!)
It's not that it 'isn't working'... Cyberpunk isn't a good barometer to judge RT (read: any) performance with. It's known as a GPU killer ('can it play CP 2077?'), so it's a terrible reference point to extrapolate from. Everything struggles with CP 2077. Your AMD card would fare worse.

AMD is known to be behind in RT performance (comparing the AMD 7000 series with NV 4000, or AMD 6k with NV 3k)... so your concern is misplaced, especially when you own an AMD card whose RT performance trails NV's.
 
It's not that it 'isn't working'... Cyberpunk isn't a good barometer to judge RT (read: any) performance with. It's known as a GPU killer ('can it play CP 2077?'), so it's a terrible reference point to extrapolate from. Everything struggles with CP 2077. Your AMD card would fare worse.

AMD is known to be behind in RT performance (comparing the AMD 7000 series with NV 4000, or AMD 6k with NV 3k)... so your concern is misplaced, especially when you own an AMD card whose RT performance trails NV's.
TBH that's tongue-in-cheek...
 
That said, if you look at the new Jedi, Hogwarts or Callisto Protocol games, the ray-tracing is extremely demanding; even a 4070 Ti was benchmarked at 69 FPS @ 1440p (Win11 x64) with ray-tracing in The Callisto Protocol. NB the 7900 XT (£780 Sapphire) comes out faster than the 4080 (£1199 Palit)...
 
even a 4070 Ti was benchmarked at 69 FPS @ 1440p (Win11 x64) with ray-tracing in The Callisto Protocol. NB the 7900 XT (£780 Sapphire) comes out faster than the 4080 (£1199 Palit)...
It depends on the title, of course, but it's generally held that RT performance on AMD isn't as good as NV's. Perhaps that will change in time.

I can't help pricing... ;)
 
It depends on the title, of course, but it's generally held that RT performance on AMD isn't as good as NV's. Perhaps that will change in time.

I can't help pricing... ;)

I agree with you on that, nVidia is better for ray-tracing. On the pricing front, my bugbear is the damage it's going to do to the PC gaming scene in the long run... TBH I'm quite impressed with the MSI RX 6900 XT Gaming X Trio; it was meant to tide me over till the 7900 XT, but I'm thinking RDNA4 may have more architectural improvements for ray-tracing, and the 6900 XT hasn't really been found wanting...
 
That said, if you look at the new Jedi, Hogwarts or Callisto Protocol games, the ray-tracing is extremely demanding; even a 4070 Ti was benchmarked at 69 FPS @ 1440p (Win11 x64) with ray-tracing in The Callisto Protocol. NB the 7900 XT (£780 Sapphire) comes out faster than the 4080 (£1199 Palit)...
All the games you mentioned are well known for having port/optimization problems, so it's no wonder FPS are so inconsistent :shrug: 6xxx/7xxx AMD cards are rasterization monsters and Nvidia cards have higher driver overhead, so the only place you're missing out is ray tracing, which, to be fair, only a handful of games do well, so your 6900 XT should be more than good enough for many years to come...
 
All the games you mentioned are well known for having port/optimization problems, so it's no wonder FPS are so inconsistent :shrug: 6xxx/7xxx AMD cards are rasterization monsters and Nvidia cards have higher driver overhead, so the only place you're missing out is ray tracing, which, to be fair, only a handful of games do well, so your 6900 XT should be more than good enough for many years to come...
You can't really optimise on IBM compatibles. The hardware is not fixed. What do you optimise for: which cache size, which architecture, RDNA1 or 2 or 3, or do you target a GTX 1060? It triggers me when people say it's not optimised. Personal computers like the VIC-20, C64, Amiga & ST were fixed hardware; IBM compatibles are anything but... When I write code, it can be optimised because the hardware we are coding for is fixed beforehand, along with the version and service packs of the software/OS etc... What needs to happen is the hardware vendors need to optimise their hardware and drivers to fit the application...

I remember physically pulling the 512KB of RAM chips off the motherboard sockets on an Atari 520 STFM to upgrade it from 512KB to 1MB; doubled the memory. Why don't the GPU manufacturers just socket the RAM chips so they can be upgraded and optimise via the drivers? These cards aren't exactly cheap and offer no expandability...
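Just to illustrate the point: on the PC you don't even know what you're running on until runtime, so a D3D12 app has to ask the driver. A rough sketch of my own (purely illustrative, file name and output are made up) querying the wave/SIMD width, which differs between RDNA generations and NVIDIA parts:

[CODE]
// Rough sketch (mine, purely illustrative): a D3D12 app asking the driver what
// it is actually running on, rather than assuming fixed hardware.
// Build (MSVC): cl /EHsc gpu_probe.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    // Wave (SIMD) width differs between vendors and even between generations
    // (e.g. wave32/wave64 on RDNA), so code queries it instead of hard-coding it.
    D3D12_FEATURE_DATA_D3D12_OPTIONS1 opts1 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS1,
                                              &opts1, sizeof(opts1))))
    {
        std::printf("Wave lanes: %u-%u, total lanes across the GPU: %u\n",
                    opts1.WaveLaneCountMin, opts1.WaveLaneCountMax,
                    opts1.TotalLaneCount);
    }
    return 0;
}
[/CODE]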
 
You can't really optimise on IBM compatibles. The hardware is not fixed.
If the game can be patched to work properly a day or two after release (sometimes the same day), it could have been properly optimized beforehand; nothing changed on the game/OS/hardware side, but everything changed on the consumer side, with people leaving bad reviews, returning pre-purchases, and new customers not buying the game. It has everything to do with timetables and the willingness of the devs, and everyone working with them, to release a fully working product. It's all about quality control. There will always be small things, and no one is really complaining about those; people are complaining because apparently it's now common to release AAA games in beta, or worse, alpha, and call it a finished product even though it's unplayable.

This has been stated many times by almost every gamer I know or have spoken with: we would rather wait (hold the release) until it's in a playable state than have games come out like they have in the last few years, especially AAA releases...
 
Games are written for Windows utilising DirectX, and specifically Direct3D. nVidia & AMD claim compatibility with Direct3D. Direct3D allows (or used to allow) software rendering; before 3Dfx came out with the Voodoo1 I used to software-render 640x480 @ 30fps. That's all the programmers should optimise for: Win 11, DirectX 12 & x64. It should be up to the hardware manufacturers to make their hardware performant. It's not Microsoft's fault the 3070 hasn't got 16GB of VRAM. Maybe in the future MS should specify VRAM, features and performance targets, or the manufacturer doesn't get an 'approved for Windows DirectX 12' badge. The hardware vendors are responsible for the performance of their hardware, e.g. ray-tracing not being as good on AMD hardware; looking at you, AMD/Radeon, to sort it out via drivers and hardware...
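For the curious, the software-rendering path still technically exists: D3D12 ships with WARP, Microsoft's software rasterizer. A quick sketch of my own (just to illustrate; it's more of a testing path than something games ship with, and the file name/output are made up) creating a D3D12 device on WARP instead of a GPU:

[CODE]
// Quick sketch (mine, just to illustrate): create a D3D12 device on WARP,
// Microsoft's software rasterizer, instead of a real GPU.
// Build (MSVC): cl /EHsc warp_device.cpp d3d12.lib dxgi.lib
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    // Ask DXGI for the WARP adapter (CPU rasterizer) instead of a real GPU.
    ComPtr<IDXGIAdapter> warpAdapter;
    if (FAILED(factory->EnumWarpAdapter(IID_PPV_ARGS(&warpAdapter))))
        return 1;

    ComPtr<ID3D12Device> device;
    if (SUCCEEDED(D3D12CreateDevice(warpAdapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&device))))
        std::printf("Software (WARP) D3D12 device created - no GPU required.\n");
    return 0;
}
[/CODE]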
 
Why don't the GPU manufacturers just socket the RAM chips so they can be upgraded and optimise via the drivers? These cards aren't exactly cheap and offer no expandability...
1) because the current solution is very likely far more efficient and offers significantly more bandwidth than a socketed version.

2) it lets them differentiate products and upsell the ones with more
 
CP2077, I think, is a good example of what games should be doing now. I know its start was pretty bad, but with the latest RT Overdrive update, this is the dream, is it not? Yes, it is demanding, but more than playable on 40-series GPUs. The other games mentioned perform worse for far worse graphics. There's a lot of performance possible if game devs have the time to do a good job, but I feel they're rushed to hit arbitrary deadlines. Too many games have a bad launch experience but are much better after a month or two of patches. A delay of a month or so for a better experience would be better all around.
 
1) because the current solution is very likely far more efficient and offers significantly more bandwidth than a socketed version.

2) it lets them differentiate products and upsell the ones with more
1) Socketing a chip wouldn't decrease efficiency. In the past I have desoldered chips, fitted sockets, and then swapped the integrated DACs/ADCs/op-amps to change the sound signature of a HiFi system. A socket gives you flexibility and costs a couple of quid; nVidia would buy them by the plane load, so it would be even cheaper. Obviously upgrading favours users rather than the company, but it creates less environmental impact. Imagine selling a memory module upgrade for 3070s to go up to 16GB. You're extending the life of those cards...
 
CP2077, I think, is a good example of what games should be doing now. I know its start was pretty bad, but with the latest RT Overdrive update, this is the dream, is it not? Yes, it is demanding, but more than playable on 40-series GPUs. The other games mentioned perform worse for far worse graphics. There's a lot of performance possible if game devs have the time to do a good job, but I feel they're rushed to hit arbitrary deadlines. Too many games have a bad launch experience but are much better after a month or two of patches. A delay of a month or so for a better experience would be better all around.

It's money. As a dev, you're given a piece of code and a time expectation to turn it around in. The project gets funded by the IT department based on the estimated time to complete the feature, not the actual time it took you. So if you spend extra time sorting stuff out, the project goes over cost and the difference comes out of the IT department's overall budget, or the devs do it for free over lunch or after hours. At the end of the year you're measured as a dev: your salary is X, you've completed so many features worth so much, therefore you're profitable. When you're developing a product, it's about getting the minimum viable product out and earning money, so the money coming in can make the case to the backers for further development funding, or go to actually fund further development. Money is the root of all evil, or the lack of it ;)

Games in the early 80s were developed by one guy in a bedroom; now there are armies of people working on them. This is the big problem. Just look at Star Citizen: 10 years of development, and how much has that cost? If it were just Chris Roberts in his bedroom it would be far cheaper, and the barrier to knocking out a game would be a lot less...
 
Any project will have a schedule and cost budget. My point was more that many big name games are launched in a poor state, which gets fixed post launch. They do have the ability and resources to do this, just maybe not in the original desired timescales. So the ask is, delay the release and give us that fixed up version. With the possible exception of TLOU wanting to tie in with the TV series, most releases are not time critical. Or they should be more honest. Steam has Early Access for in development games. Ship it under that until it is ready. Or maybe they really do need the whole internet shouting at them to fix it in order to actually do so.
 