
Cyberpunk 2077...sooooo excited

If the benchmark works fine, then everything else isn't important.
I would play it if it was a new Fallout.


That's literally, totally, and COMPLETELY wrong. :rofl:

Of COURSE the Benchmark works! That's what it's designed to do... CD Projekt Red wouldn't have been down like 300 million dollars or whatever it was if the actual GAME functioned as well as the Benchmark.

So I think some of that "everything else" might be important.

No. I thought I was clear about that, lol. I wasn't talking about experience playing the game(s), but how people read, interpret, and present information to others. You don't need to play the game to know if something got 'stomped' (or...didn't), for example. ;)

Let's leave this thread for actually discussing the game. My bad, folks. I may start a thread to crowdsource some info at a later time. :)

Actually discussing... the game... that you haven't played? I mean the irony here is...

ANYWAY!

I have actually played the game...

I'd have to look back at my earlier posts to see if I was still on the 2060 Super when I had the lag problem in that one major battle where I was constantly falling through the floor and the "boss" of that battle kept lagging in and out of my sight.

I'm curious whether, as Kenrou suggested, my old, terrible Samsung Q0 was the cause of that lag, or maybe the problems I was having, back then, with my power supply.

If I'm correct, since then, I have replaced the Samsung garbage with a brand new M2 drive and the old 550 watt power supply with an XFX 850 watt one (or 750... I forget...) and of course the 2060 Super with the 4060.

Your previous theory was that, somehow, my "low core count" was hurting me in this game. (Though likely, we agreed, not at 4K).

I guess the only way to know for sure would be to run that mission again and see what happens.

My day is already kinda weird... so maybe I'll do that right now.
 
I'm curious whether, as Kenrou suggested, my old, terrible Samsung Q0 was the cause of that lag, or maybe the problems I was having, back then, with my power supply.

If I'm correct, since then, I have replaced the Samsung garbage with a brand new M2 drive and the old 550 watt power supply with an XFX 850 watt one (or 750... I forget...) and of course the 2060 Super with the 4060.

Your previous theory was that, somehow, my "low core count" was hurting me in this game. (Though likely, we agreed, not at 4K).

Oh, your problem was definitely the Q0. My old 7200rpm, 256MB-cache WD Blue was much faster installing/patching Cyberpunk than what you told us about (yes, I actually tried it). As for the core count, that came straight from the devs: they explicitly stated that anything below an 8c/16t CPU would hurt performance from patch 2.0 onward, and yes, below 4K 👍🏻
 
Actually discussing... the game... that you haven't played? I mean the irony here is...
....the irony is........misplaced.

It's a review of a VIDEO CARD... NOT the game.

Your previous theory was that, somehow, my "low core count" was hurting me in this game.
The charts posted there show your CPU is putting a glass ceiling on performance. Is it the source of all your issues? No idea. But it is holding back performance. Less so at 4K, but, it's still there. 6c/6t CPUs are long in the tooth these days. ;)
 
One thing to note, as always, is that the benchmark may not represent all aspects of gameplay. Digital Foundry, for example, use a different area for their testing because they focus more on worst-case scenarios.
 
There are several spots around town that give me FPS/frame-time spikes (at least on my system): the roller coaster near the GIM, the new DLC area (it drops hard when you're driving around on the main street), and anywhere with a ton of NPCs (I actually gain 10-20fps by dropping NPC density in the town center). Oddly, fights don't really mess with frame timing.

Oh, a really good spot to benchmark is the corpo starting area, the building you walk through right after the mirror scene. There's a really good mix of NPCs/reflections/shadows/lighting, and FPS fluctuates quite a lot depending on what you're looking at. It also kinda behaves like the benchmark: roughly the same number of NPCs in the same spots every time.
 
That's the problem for reviews with uncanned benchmarks... repeatability/variability. While integrated benchmarks may not represent all aspects of gameplay, they're wholly repeatable, with the same information to process each and every time. Same number of NPCs, same everything.

With benchmarking specific scenes, the potential for an increase in run variance is higher (even looking at a different area can throw FPS off). The longer you run the bench/scene, the more it irons that out, however. :)
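
If it helps to picture it, here's a throwaway Python sketch with completely made-up FPS numbers (not from any real test run) showing why averaging longer or repeated runs irons out the spread:

Code:
# Toy example only: the FPS values are invented for illustration.
from statistics import mean, stdev

canned_runs = [61.8, 61.9, 62.0, 61.7, 61.9]   # same scene every time -> tiny spread
manual_runs = [58.3, 64.1, 60.7, 66.2, 57.9]   # random traffic/NPCs -> bigger spread

for name, runs in (("canned", canned_runs), ("manual", manual_runs)):
    print(f"{name}: avg {mean(runs):.1f} FPS, run-to-run spread {stdev(runs):.1f} FPS")

# Averaging more and more manual runs shows the result settling down.
for n in range(1, len(manual_runs) + 1):
    print(f"average of first {n} manual runs: {mean(manual_runs[:n]):.1f} FPS")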
 
That's the problem for reviews with uncanned benchmarks... repeatability/variability. While integrated benchmarks may not represent all aspects of gameplay, they're wholly repeatable, with the same information to process each and every time. Same number of NPCs, same everything.

With benchmarking specific scenes, the potential for an increase in run variance is higher (even looking at a different area can throw FPS off). The longer you run the bench/scene, the more it irons that out, however. :)

However, you CANNOT RELY ON BUILT-IN BENCHMARKS, for the same reason you wouldn't trust an NVIDIA benchmark to review an NVIDIA graphics card. The benchmark is the one thing they KNOW will work as planned.

And, in the Cyberpunk benchmark especially... there aren't any bullets flying, or cars moving, or cops rappelling from ropes...

It's the worst gauge imaginable of actual game performance.

It's better to take the average FPS of a few runs of actual gameplay than it is to take the word of this benchmark, which gives you absolutely zero real game-world performance.
 
While it may seem to you like integrated benchmarks exist under a dubious pretense, in my years of experience doing just that, I've found most give a decent relative idea of in-game performance and aren't intended to mislead people. It would be a disservice to anyone running one if the results were wildly off compared to actual gameplay. Are some? Sure! That said, I do understand there are faster and slower portions of the game compared to the benchmark, depending on what scene you're benchmarking; I'd imagine most users understand that, too. It's also a lot easier to get inconsistent results in manual runs. But on average, canned benchmarks are generally close to in-game results (there are always exceptions).

Canned benchmarks do an especially good job of defining a hierarchy among graphics cards/CPUs/settings... whatever you're testing... because the scene is repeatable, with no (or minimal) differences between runs. You can rest assured that if card A is ~5% faster and gets ~60 FPS in the canned benchmark at those settings, it's going to be ~5% faster in-game, too, and get around 60 FPS if you average things out. I'd have to imagine that if reviewers all benched manual gameplay runs instead, results would be all over the place depending on which scenes each reviewer chose. Who do you trust in that case? We're back to averaging the averages from a slew of reviewers to get an idea. Sounds cumbersome, especially for a negligible increase in accuracy.
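
To put rough numbers on that hierarchy point, a quick hypothetical in Python (the cards and FPS figures are invented purely to show the math, they're not from our testing):

Code:
# Hypothetical numbers only, to illustrate why the relative gap carries over.
card_a_bench = 60.0   # FPS in the canned benchmark
card_b_bench = 57.1   # FPS in the canned benchmark

gap = (card_a_bench / card_b_bench - 1) * 100
print(f"Card A is ~{gap:.1f}% faster in the canned run")

# If card B averages, say, 52 FPS across real gameplay, card A should land near:
card_b_ingame = 52.0
print(f"Expected card A in-game: ~{card_b_ingame * card_a_bench / card_b_bench:.0f} FPS")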

I see value in both methods for benching and in-game FPS.


Here's a quick snip from an article...
The benchmark lasts just over a minute and involves swirling around the El Coyote Cojo bar in Heywood before heading up the alley behind it and then settling peacefully on some palm trees above reasonably populated streets. There are no firefights, so you'd be forgiven for thinking it wouldn't be representative of the game's action scenes, but the averages do line up with in-game frame rates, even if the minimums are a bit off.


Until now anyone trying to work out the best settings for their system had to make do with charging around Night City trying to find a spot that was representative of the game's various environments and play styles. This generally led to complicated runs that took way too long and were all too easy to derail with some random gang fights or pile-ups. It's a living breathing city after all, and one that has a penchant for violence.
 
OKAY... New M2 drive... New video card... New motherboard... New CPU...

...and the game is STILL glitchy as hell!

So it ain't the hardware.

I just redid the gas station fight. An enemy will be right in front of you then suddenly slide back to where they were two seconds ago as if they were moonwalking. This goes for both humans and robots.

Doesn't happen ALL of the time... but very, very often. Almost always when there are multiple enemies on the screen.

Now that I've got the 12900K and some DDR5 RAM... the game runs MUCH more smoothly overall. I get about 80fps, on average, in 4K.

Such a shame that combat is so glitchy. It's like I'm playing online on a dial-up connection.

It's interesting to see at what point a CPU holds you back and what point a GPU holds you back.

I had no trouble hitting 60fps in 4K with my 2060 Super... The 4060 didn't really add much in that regard... but the combination of that and the new CPU seems to have stabilized things.

Just like in Baldur's Gate 3... it's suddenly much faster to navigate a map or look in a bunch of different directions (guess that's more a CPU thing...) That's nice.
 
Not sure what to tell you, I rarely have fight glitches :shrug:

4k 60fps with a 2060, what? That's low settings with DLSS ultra performance mode (720p) I assume?
 
Interesting read, but that's from 2021, patch 2.0 should've changed the values quite a bit?
Oh ****, that is older. With the changes they made, I'd only imagine the gaps would get bigger, though, considering the native 8c/16t support, etc.

EDIT: Regardless, it depends wildly on resolution (1080p more affected than 4k), and settings.

I'd also like to see that retested... but there's no way they'll do it. That had to take so much time, lol... so many processors.
 
Not sure what to tell you, I rarely have fight glitches :shrug:

4k 60fps with a 2060, what? That's low settings with DLSS ultra performance mode (720p) I assume?

With MY ego??! Are you kidding??? Have you NOT been paying attention to who I am? :p

(Attached screenshots: cyberpunk_graphics.jpg, cyberpunk_video.jpg)

High and Medium, baby! :D

My best friend found some website that had like the best settings for the 2060 Super when Cyberpunk first came out. I haven't touched the settings since. (besides the new stuff that they added and everything I posted here.) You can go back to the beginning of this thread and see... (You actually don't even have to go back that far... I only got the 4060 a couple of months ago... I was rocking the 2060 all this time before that... and yes: It was always 60fps in 4K.)

In fact... I probably could've gotten more like high 60s if I had V-Sync turned off. Now I'm getting mid 80s.

But no... I'm not doing "Ultra" or anything like that. I think it looks fine as is... Though I did find this interesting:


They're only getting 18 FPS in 4K at all-Ultra settings. I would, of course, expect to take a huge hit at Ultra... but I'm not sure my performance would drop THAT low. But maybe they aren't using DLSS :shrug:
 
They're only getting 18 FPS in 4K at all-Ultra settings. I would, of course, expect to take a huge hit at Ultra... but I'm not sure my performance would drop THAT low. But maybe they aren't using DLSS
I'd imagine they are not.

I'm going through right now updating our gpu tests/data and ran a 4060 Ti 8GB (TPU ran a 4060) with our settings (days ago). We run 'default' ultra (14900K ddr5-6000) as raster only, rt only, then rt with dlss. Here are our 4K results. :)


Raster - 22 FPS
RT Only - 11 FPS
RT + DLSS Balanced - 29 FPS

We're publishing the new GPU testing article this morning. You can see the screenshots and details there if you choose. I'll drop the link now, and it will be live @ 08:00. :)

 
I'd imagine they are not.

I'm going through right now updating our gpu tests/data and ran a 4060 Ti 8GB (TPU ran a 4060) with our settings (days ago). We run 'default' ultra (14900K ddr5-6000) as raster only, rt only, then rt with dlss. Here are our 4K results. :)


Raster - 22 FPS
RT Only - 11 FPS
RT + DLSS Balanced - 29 FPS

We're publishing the new GPU testing article this morning. You can see the screenshots and details there if you choose. I'll drop the link now, and it will be live @ 08:00. :)


Ok... I see all the setup... but when do the results go live?
 
Ok... I see all the setup... but when do the results go live?
There's a 4070 Super review coming up Wednesday with just the NV cards. The AMD data will be up with the 7900 GRE review (in the next couple of weeks). Just waiting on the vendor to ship it. I'll post the CP charts for you in the meantime... (I'll edit them in a bit).

 
There's a 4070 Super review coming up Wednesday with just the NV cards. The AMD data will be up with the 7900 GRE review (in the next couple of weeks). Just waiting on the vendor to ship it. I'll post the CP charts for you in the meantime... (I'll edit them in a bit).

You're such a sweetheart... :love:

So you guys are reporting 11fps with RT on DLSS off and Ultra @ 4K on a 14900K... and TPU is reporting 12.7fps, RT on, DLSS off on a 13900K... on the 4060TI?

That's... interesting?

I wonder what other factors come into play... Cooling? RAM timings? Drivers? Power supplies (that was certainly a HUGE factor for me back around Thanksgiving :rofl:)...

I mean this is the built-in demo to Cyberpunk, right? Which we ALL have access to.

Everything seems to indicate that your 4060TI should have a higher score... test hardware being what it is.
 
That's...... likely due to a ton of variables, yep! RAM speed/timings, drivers, OS (not power supplies), motherboards and how they treat the CPU, cooling... but more specifically, the version of CP 2077 tested and the settings. Mine is on the latest and greatest of everything, versus theirs, which was done a while ago (but post Phantom Liberty). There are way too many variables between test systems to compare across sites. For sanity checks, sure, but the point you should walk away with is that, yes, enabling Ultra with RT kicks most GPUs in the pants. Your GPU, too, would suffer the same losses w/o DLSS enabled.

Everything seems to indicate that your 4060TI should have a higher score... test hardware being what it is.
It's 4K and we're talking 11 and 12 FPS. Wholly unplayable no matter who reviews it or whose system is 1 FPS faster/slower. Also, at 4K the CPU doesn't matter quite as much. Assuming the settings and everything else are the same, I'm getting 95 FPS at 1080p on just raster... they are getting 80. The CPU/my system doesn't make that much of a difference, so there's (a lot of) something else going on (I'm getting 56 FPS vs. their 50 at 2560x1440, FTR).
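
For anyone who wants the gaps spelled out, here's the quick math in Python on the numbers above (same figures I just quoted, nothing new):

Code:
# Percentage gaps between our results and theirs, using the FPS quoted above.
results = {
    "1080p raster":     (95.0, 80.0),
    "2560x1440 raster": (56.0, 50.0),
    "4K RT, no DLSS":   (11.0, 12.7),
}
for label, (ours, theirs) in results.items():
    delta = (ours / theirs - 1) * 100
    print(f"{label}: ours {ours} FPS vs theirs {theirs} FPS ({delta:+.0f}%)")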

EDIT: I checked the article, and it looks like they took the FOV from 70 to 100... perhaps that has something to do with why he's slower at 1440p and 4K?
 
That's literally, totally, and COMPLETELY wrong. :rofl:

Of COURSE the Benchmark works! That's what it's designed to do... CD Projekt Red wouldn't have been down like 300 million dollars or whatever it was if the actual GAME functioned as well as the Benchmark.

So I think some of that "everything else" might be important.



Actually discussing... the game... that you haven't played? I mean the irony here is...

ANYWAY!

I have actually played the game...

I'd have to look back at my earlier posts to see if I was still on the 2060 Super when I had the lag problem in that one major battle where I was constantly falling through the floor and the "boss" of that battle kept lagging in and out of my sight.

I'm curious whether, as Kenrou suggested, my old, terrible Samsung Q0 was the cause of that lag, or maybe the problems I was having, back then, with my power supply.

If I'm correct, since then, I have replaced the Samsung garbage with a brand new M2 drive and the old 550 watt power supply with an XFX 850 watt one (or 750... I forget...) and of course the 2060 Super with the 4060.

Your previous theory was that, somehow, my "low core count" was hurting me in this game. (Though likely, we agreed, not at 4K).

I guess the only way to know for sure would be to run that mission again and see what happens.

My day is already kinda weird... so maybe I'll do that right now.
I think the game is best played at 1440p, but if you own a 4080 or a 4090, it should play without any problems no matter what settings you use... just my opinion. ;) I have two 1080 Tis and I can play the game, not maxed out, but balanced nonetheless.
 