
Silent Hill 2 Remake PC/Unreal **** Engine 5


Haider

Member
Joined
Dec 20, 2012
Hi,

Getting 1080p HDR ultra at 75-90 FPS (120 Hz FreeSync), with stutters, on my 6900 XT OC (2.5 GHz GPU / 2.13 GHz RAM / 1080 mV / +12% power limit) and a stock 5800X3D, with the game on a 990 Pro (PCIe 4.0 x4). I knew Unreal Engine 5 was crap; wish Konami had just used the FOX Engine from Metal Gear Solid V: Ground Zeroes. That had excellent lighting, graphics and performance... Anyone else have issues with Unreal **** Engine 5?


Thanks
Haider
 
I don't have any real issues with my current (5090) or previous (3080) system with Unreal Engine 5 at 4K.
 
I've been getting into RoboCop: Rogue City recently, which is built on it. So far, no real issues on my 2080.

The only one, a couple of weeks back, was a crash to desktop during a map change with an "out of VRAM" error, but other than that it's been pretty smooth.
 
UE5 games... I guess I only play Fortnite, bench BM: Wukong, and Delta Force... no issues here with the 4090 or the 5090 so far (nor any 50-series card in BM:W).

As far as performance goes, it looks like you're on par with the testing, so... yeah.


Why are we cussing about this? :sly: :rofl:
 
May just need to back down from Ultra settings to High and see if it makes a difference.
 
I was getting 80/90 FPS at 1440p ultra in The Last of Us Part 1; SH2 doesn't seem to look any better...
 
Unbelievable, a 4090 at 4K and you still don't get 60 FPS...
Are you reading the same benchmark you quoted????

The MINIMUM for a 4090 at 4K is 88 FPS. Average is 95 for The Last of Us Part 2.



EDIT: You're talking about SH2.........but quoted my TLoU benchmark, LUL.
 
Yes, SH2 is a lot slower than TLoU... I'm getting about 3090 Ti speeds in 1080p raster SH2 in terms of FPS, with the OC. The 6900 XT loses ground as you go up in resolution... Maybe a 5080 Ti or Super could be on the cards, no pun intended...
 
I updated my AMD drivers and disabled vsync, and that seemed to make it smooth and keeps the frame rate at pretty much 80 FPS. If anybody is having problems, it could work for you too...
 
Stuttering is fairly common in Unreal Engine overall; it has to do with shader compilation and asset loading. There's no "cure" for it, but game patches usually tend to smooth things over time. Vsync or a frame limiter also helps sometimes. You don't really notice it in some games because they do a pre-compile pass at the start.
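To make the frame-limiter point concrete, here's a toy Python sketch (not engine code, just an illustration of the idea): instead of letting frames render as fast as possible and then hitching, a limiter pads every frame out to a fixed time budget, which evens out pacing.

```python
import time

def run_capped(render_frame, target_fps=80.0, frames=10):
    """Call render_frame in a loop, padding each iteration out to
    1/target_fps seconds -- the same idea as an in-game frame limiter:
    consistent pacing instead of fast frames followed by hitches."""
    frame_budget = 1.0 / target_fps
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep away whatever is left of this frame's time budget.
        remaining = frame_budget - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
        frame_times.append(time.perf_counter() - start)
    return frame_times

# A "render" that's much faster than the cap still gets paced to ~10 ms/frame:
times = run_capped(lambda: None, target_fps=100.0, frames=5)
```

Real limiters (RTSS, in-engine caps) use higher-resolution timers and busy-waits instead of a plain sleep, but the principle is the same.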
 
That's a good read at that link; it gave me a better understanding of shader-compilation-related stutters. However, that isn't unique to UE and can happen in any modern-ish 3D engine. There are also other forms of stutter beyond it. If anything, I'd consider shader stutter somewhat solved: it's understood, and there are ways to mitigate it. None of them are ideal solutions, but they get much of the way there.

Actually, this is one way I think faster CPUs have improved the gaming experience: not by giving more average FPS, but by making shader compilation faster and reducing load times. I noticed quite a big change in overall load times going from a 7980XE to a 7800X3D.
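That's also why stutter barely shows up in average FPS but wrecks the "1% low" number. A small Python sketch of the arithmetic (hypothetical frame times, not a real capture):

```python
def fps_stats(frame_times_ms):
    """Average FPS and 1%-low FPS from per-frame times in milliseconds.
    The 1% low is the FPS implied by the mean of the slowest 1% of frames."""
    n = len(frame_times_ms)
    avg_ms = sum(frame_times_ms) / n
    k = max(1, n // 100)                          # slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)[:k]
    low_ms = sum(slowest) / k
    return 1000.0 / avg_ms, 1000.0 / low_ms

# 990 smooth 12.5 ms frames (80 FPS) plus 10 shader-compile spikes of 100 ms:
avg_fps, low1_fps = fps_stats([12.5] * 990 + [100.0] * 10)
# Average barely drops (about 75 FPS) while the 1% low collapses to 10 FPS.
```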
 
I had lots of trouble with RoboCop: Rogue City too... and I've been HAVING lots of trouble with Monster Hunter Wilds (though that uses the RE Engine... same as Resident Evil 7).

I generally play every game (Elden Ring, Baldur's Gate 3, Indiana Jones, Cyberpunk, etc...) at 4K, no problem. The only games that have stopped, slowed down, or crashed have been Robocop and Monster Hunter Wilds (Wilds is the only game I've played... EVER... where I've had to go down to 1440).

I don't think Unreal Engine is to blame... I think some PC ports are just BAD.

Come to think of it... Elden Ring did crash on me... randomly... in the very beginning. And it also switches to desktop whenever I try to take a screenshot (unless I do it with the controller).

Silent Hill 2 would certainly be a game where the PC version is more of an afterthought.
 

As far as ports go, the way I see it PC hardware isn't a fixed target: even the 3060 came in 8 GB and 12 GB variants (Ampere), the 4060 Ti in 8/16 GB (Ada Lovelace), the 5060 Ti in 8/16 GB (Blackwell), and the 6900 XT in 16 GB (RDNA 2). If I were working on the project, I'd kick off by asking: what exactly are we porting to? The way I see it, a developer should write the game for Windows/Direct3D and let the hardware manufacturers who claim compatibility with Direct3D make their hardware run well with it. That's their job; they (NVIDIA/AMD) are the ones saying their hardware runs Windows/DirectX. I can also see that the software house and publisher want to sell games, so I'd ask what the largest target audience is (integrated graphics, RTX 3060, etc.) and make sure the game runs well on that user base...

If the other users with 3080s, 3090s, 4080s, 4090s and so on are willing to pay for a further "High Quality Super Ti" version optimised for their systems, then we can do something; but why should the majority of users pay for optimisations they're never going to use? Makes no sense to me.
 
Sure, but in that respect coding for a *60-class card means 1080p/60 at medium settings (on a good day), and very often a targeted 30 FPS otherwise, since you're running something roughly comparable to the current console generation, which itself matches cards from a generation ago.
 
You could optimise to a lower level, as you'd be coding specifically for the Ampere and Ada Lovelace code branches, with the specific specs of those cards...
 

As mentioned earlier: I play all my games, except for Wilds and Robocop, in 4K. Every day.

Besides some random bugs (not related to graphics settings), Elden Ring runs smooth as butter for me in 4K. As do BG3, Cyberpunk, etc...
 
I enabled hardware ray tracing at 1080p and get about 50-65 FPS, but with horrible stutters... Hardware ray tracing lets you actually see when you're inside the buildings, but otherwise the image is too dark...
 