
RTX 40x0 talk


Lmah2x

Registered
Joined
Apr 27, 2003
Is there any way to do the firmware update without an iGPU/secondary GPU?
 

Brando

Member
Joined
Jan 9, 2006
I decided to say F it and move ahead. Just ordered the MSI Ventus 4080, it being the cheapest one readily available and having one of the best scores of all the cards on the Egg. It looks like it's a little smaller than the other ones according to at least one review, and it runs super quiet. I don't think it's going to have extra juice available for overclocking, but as far as I can tell that really doesn't matter much. If I get 4% extra FPS from overclocking instead of 5%, I'm not going to cry about it, since I'll still have an extra 200 bucks in my pocket and use less power.

As far as the 4090 goes, it's very impressive, and I still sort of wish I could have gotten one for MSRP, but I'm not going to be sad about an almost 50% jump in VRAM and performance over my previous card. I'm also not sure my home insurance covers video card fires, so I feel a little better passing on it; 450 W or more just doesn't sit right with me. Another factor is that it looks like I can still get over 900 bucks for my 3080 Ti FTW3 Ultra on eBay for some weird reason. Not sure if people are actually paying that, but it kind of seems that way if you do a search, so time is of the essence. Will post results.
 

Woomack

Benching Team Leader
Joined
Jan 2, 2005
For a firmware update you don't need another graphics card; you only need one for recovery if the card gets bricked for whatever reason.
The firmware can be updated under Windows, and even if the flash fails, the card will keep working until you restart the PC ... so if it fails, you just flash it again.
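If you want to confirm the flash actually took, one option is to read back the vBIOS version string with nvidia-smi and see whether it changed (I haven't verified that this particular updater bumps that string). A rough Python sketch:

[CODE=python]
# Read back the vBIOS version reported by the driver; handy for checking
# whether a flash actually took. Assumes nvidia-smi is on the PATH.
import subprocess

def vbios_versions():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,vbios_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    for gpu in vbios_versions():
        print(gpu)  # e.g. "NVIDIA GeForce RTX 4080, 95.02.xx.xx.xx" (placeholder)
[/CODE]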

The RTX 4080 is surprisingly quiet ... at least all the coolers with a vapor chamber design. Most manufacturers use better coolers than in previous generations.
 

Brando

Member
Joined
Jan 9, 2006
MSI Ventus 4080 OC Edition: rough first results with Afterburner and Unigine Superposition at 4K. At stock it runs the core around 2775 MHz, IIRC. Currently maxed out around +150 core / +1350 VRAM with the core sitting around 2925 MHz, but I'm not sure it'll hold up in games long term yet. The cooler holds the line at 65°C and the core clock stays steady almost all the time thanks to said gigantoid cooler. Haven't gamed yet since my controller broke, but I'll try some FPS games to see how it holds.

First run at stock

4080 unigine stock everything.jpg

And after some tweaking at +150 +1350

4080unigine150-1350.jpg
 

Woomack

Benching Team Leader
Joined
Jan 2, 2005
The Colorful RTX 4080 (front-page review, screenshot below) went up to a flat 3000 MHz core, but 2900-2930 MHz is what I saw on some other cards. The MSI Suprim X in our review hit 2910 MHz, so that's about as much as these cards can do. That +/- 50-70 MHz is worth maybe 1% performance, so it doesn't really matter.
I also see these cards adjust their fans to stay at about 65-67°C, with about 70°C max. In short, temps are similar, but the noise depends on the cooler/fans. Most higher-series RTX 4080s are very quiet and have a 0 RPM mode at low load. Just the price could be lower ;)


iGame4080_oc1.jpg
 

Brando

Member
Joined
Jan 9, 2006
I did a quick non-scientific test with and without the OC enabled in Control at 4K, with the main character standing in place doing the exact same thing both times for a minute or so. Without the OC, FPS hovered at 100-101. With the max OC it was 100-102 FPS :eek:
 

Lmah2x

Registered
Joined
Apr 27, 2003
You're most likely CPU bound; you'll see that a lot on 40-series cards in RT games.

I'll probably upgrade my platform when the 7800X3D comes out.
 

Lmah2x

Registered
Joined
Apr 27, 2003
Not so much at 4K. Most of these new cards, you only get a couple of percent increase from overclocking anyway.
Depends on the game and the overclock. I typically see about 7-8% more performance over stock with an overclock of +225 core / +1700 mem in GPU bound scenarios.

In games where I'm CPU bound, like Spider-Man and Hitman 3, there's no gain at all.
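A rough way to check which side is the limit: watch GPU utilization while the game runs. If FPS is flat and the GPU sits well below ~95% load, the CPU (or the engine) is the limiter. A quick sketch using nvidia-smi; the 95% cutoff is just a rule of thumb, not anything official:

[CODE=python]
# Crude bottleneck check: sample GPU utilization for ~10 s while the game runs.
# Low average load with flat FPS usually means a CPU/engine limit, not the card.
import subprocess
import time

def gpu_utilization_pct() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.splitlines()[0].strip())

if __name__ == "__main__":
    samples = []
    for _ in range(10):
        samples.append(gpu_utilization_pct())
        time.sleep(1)
    avg = sum(samples) / len(samples)
    verdict = "likely GPU bound" if avg >= 95 else "likely CPU/engine bound"
    print(f"average GPU load: {avg:.0f}% -> {verdict}")
[/CODE]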
 

Woomack

Benching Team Leader
Joined
Jan 2, 2005
The CPU mainly counts at lower display resolutions, when you want to pass 144+ FPS for specific monitor setups. Many games work like more FPS = faster animation; you can see it especially in some MMORPGs, so people buy top hardware to play at 1080p ;)
At 1440p/high details or 4K+, the CPU counts significantly less. What's more, even graphics card overclocking isn't as visible as you'd think, since the max boost doesn't mean the card runs like that all the time.
A 1-2 FPS difference in some games/tests was about what I saw on my RTX 4080 before and after OC at 1440p with maxed-out details, where some titles had 100+ FPS. In theory it should be a 5-7% performance gain; in reality it's 1-3% on average. 4K+ reacts better to GPU OC.
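To put the theory-vs-reality gap in numbers: the clock offset sets the ceiling, and the FPS ratio is what you actually got. A tiny sketch with ballpark figures pulled from earlier posts (2775 MHz stock, +150 offset, 100 vs. 102 FPS), combined only for illustration:

[CODE=python]
def oc_scaling(base_clock_mhz: float, offset_mhz: float,
               stock_fps: float, oc_fps: float) -> tuple[float, float]:
    """Theoretical ceiling from the clock ratio vs. the gain actually measured."""
    theoretical = offset_mhz / base_clock_mhz * 100.0
    measured = (oc_fps / stock_fps - 1.0) * 100.0
    return theoretical, measured

# Ballpark numbers from earlier posts, combined here only for illustration:
theory, real = oc_scaling(base_clock_mhz=2775, offset_mhz=150,
                          stock_fps=100, oc_fps=102)
print(f"theory ~{theory:.1f}%, measured ~{real:.1f}%")  # theory ~5.4%, measured ~2.0%
[/CODE]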
 

Brando

Member
Joined
Jan 9, 2006
+150 didn't hold in game sadly, but +125 seems fine. It sits around 2910 MHz in Cyberpunk 2077 @ 1440p without really budging. The card seems pretty beastly though. Getting 80-100 FPS in the city with tons of NPCs walking around. Not sure if these are "fake" frames or not, but it feels smooth with ray tracing on Psycho, DLSS Quality, and all high settings. I'm sure it'll drop in a big battle outside, but it's definitely a big step up from the 3080 Ti in this game (y)
 

Lmah2x

Registered
Joined
Apr 27, 2003
Here's a video showing an example of a CPU bottleneck on a 4090. It typically happens in games with RT. In my experience, CPU-bound scenarios show up at 4K in Hitman 3, Spider-Man Remastered, Spider-Man: Miles Morales, Fortnite, Cyberpunk 2077, Witcher 3 Next-Gen, and Forza Horizon 5.

Also, I'm not finding 1-3% performance gains; it's 7-8%. These benchmarks were done at the settings I actually play at, typically maxed, except for Hitman 3 (I don't use RT because of the performance... that game was run with DLDSR 2.25x, which is the way I play since DLSS sucks in it). Cyberpunk was done at DLSS Quality, max settings.

The overclock used was +135 core / +1600 memory, which is my max 100%-stable overclock. I could have run these at +225 without a crash, but +225 isn't Metro Exodus stable, so I don't use it for gaming.

CP2077: +8.17%
GotG: +8.40%
Hitman 3: +7.28% (was CPU bound for about 5 seconds)
SottR: +7.81%

OC scaling is fine as long as you're not CPU bound; it doesn't have anything to do with resolution.


Try +135 instead of +125; ever since Maxwell (I think) it's been 15 MHz increments.
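If the 15 MHz granularity holds, a +125 request effectively lands on the step below it. A throwaway helper (assuming the driver rounds down, which I haven't verified):

[CODE=python]
def snap_offset(requested_mhz: int, step_mhz: int = 15) -> int:
    """Snap a requested core offset down to the driver's clock-step size.
    Rounding down is an assumption; the exact behavior isn't documented here."""
    return (requested_mhz // step_mhz) * step_mhz

for req in (125, 135, 150, 225):
    print(f"+{req} -> +{snap_offset(req)}")  # +125 -> +120, the rest are already on a step
[/CODE]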
 

Attachments

  • Stock and overclocked 4090 screenshots for CP2077, GotG, Hitman 3, and SottR

Brando

Member
Joined
Jan 9, 2006
I checked the OC performance difference in heavily modded Skyrim out in the tundra, and it made more of a difference there: it went from 49 to 52 FPS, or about 6%, so you were just about on the money.
 

Lmah2x

Registered
Joined
Apr 27, 2003
After further testing, GPU core overclocking does next to nothing at all. I saw a 0.7-1.4% improvement (within margin of error) between +0, +135, and +225. (3DMark Speed Way)

Also tested it undervolted: +210 [email protected] versus +0 [email protected], and saw no performance difference. (3DMark Speed Way)

I also reran CP2077 at the same settings as above but without the core clock increase (120% PT, +0 core, +1600 memory) and got 79.86 FPS.

The 40 series seems to clock stretch with any additional core offset, so it doesn't actually do anything meaningful. Overclocking the VRAM, on the other hand, does add performance, so if you want more, definitely bump your VRAM frequency. I might do some further testing, but it seems like a waste of time.
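One way to put a number on the clock-stretching suspicion is to check how much of the reported clock increase actually shows up as score. A sketch with made-up Speed Way-style numbers, not the actual runs from this post:

[CODE=python]
def scaling_efficiency(stock_clock, oc_clock, stock_score, oc_score):
    """Fraction of the reported clock gain that shows up as benchmark score.
    Near 1.0 = the OC is doing real work; near 0 = likely clock stretching."""
    clock_gain = oc_clock / stock_clock - 1.0
    score_gain = oc_score / stock_score - 1.0
    return score_gain / clock_gain if clock_gain else float("nan")

# Made-up numbers for illustration only:
print(f"{scaling_efficiency(2800, 2940, 10000, 10100):.2f}")  # 0.20 -> most of the clock bump isn't doing work
[/CODE]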
 

Brando

Member
Joined
Jan 9, 2006
The performance gain is almost meaningless. I'll take it because it's free, but 2 FPS obviously won't be felt. Makes me feel better about getting the cheap one, tbh. The overall package has been pretty impressive. Started playing Guardians of the Galaxy today at 4K with everything cranked and DLSS in Quality mode. Best graphics I've seen, and it stays over 100 FPS for the most part. I forgot about DLSS at first and still got 60-70+ FPS with all the eye candy at 4K. Looks amazing.
 

Brando

Member
Joined
Jan 9, 2006
I messed up my Cyberpunk settings when I was at 80 FPS in 4K, oops. I had DLSS on Performance. If I put it on Quality with Psycho ray tracing and High on pretty much everything, it's about 55 FPS or so in the city with lots of traffic and people around. Setting RT to Ultra and DLSS to Balanced gets me 65-70 FPS in the same-ish situation. Balanced DLSS still looks like 4K to me. I might see RT differences if I look hard, but I'm not noticing them right away. Looks pretty awesome.
 

TerranBrackiatt

Member
Joined
Nov 1, 2009
Location
Austin, TX
The 4070 Ti had Jay cursing up a storm.

Oh boy... looking solely at Port Royal benchmarks, Forgotten Legend gets 11900 with a pair of 2070 Supers.
So a 4070 Ti getting 14200 is a waste of money, especially for being two generations more advanced, and considering the 3070 Ti was basically the same performance as a pair of 2070 Supers.
A 4090 has enough improvement in benchmarks and framerates to warrant consideration, but the price is way too high... ugh.
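Putting those two scores side by side, the gap works out to roughly a 19% uplift; quick check with the same numbers:

[CODE=python]
# Using the Port Royal scores quoted above: how much faster is the 4070 Ti
# than the pair of 2070 Supers?
pair_2070s = 11900
rtx_4070ti = 14200
print(f"{(rtx_4070ti / pair_2070s - 1) * 100:.0f}% higher")  # 19% higher
[/CODE]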

P.S. [insert ellipses here]