
rtx 40x0 talk


Evilsizer

Senior Forum Spammer
I am not sure which forums to go to, to see if I can find answers or just chat about things that seem not to have been brought up in "talks" of the new 40x0 line of cards.

Power draw/TDP:
1) Increased power: from what I see in the "specs", the increase in L2 can/does increase power draw. How much does it?
2) Some have pointed to clock speeds, which clearly do increase power draw.
3) If you look at the power draw of each x060 card, there are steady increases:
  • 1060 6GB TDP 120 watts
  • 2060 6GB TDP 160 watts
  • 3060 12GB TDP 170 watts
  • 4060 8GB TDP 200 watts
Looking over the power increases: 1060 to 2060 is 40 watts, then 3060 to 4060 is 30 watts. L2 goes from 1.5MB on the 1060 6GB to 3MB on the 2060, then from 3MB on the 3060 to 48MB on the 4060. Does the L2 cache really cause those kinds of increases? I still don't fully understand why the 4060 would need that much L2, while the x060 line is being priced out of the midrange segment (depending on what you consider mid-range pricing now). For the money, there had better be more added to the card if it's around $400. How much lower would the TDP be if it were 24MB of L2, and what would that do to the cost? We don't know what the exact L2 will be; the 48MB figure may or may not hold. We have no real comparison to make, meaning: how much is the L2 going to affect gaming performance? Is that large L2 more for crypto mining, RT, or an anti-aliasing performance boost? A quick tally of these deltas is sketched below.
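
Here's a back-of-napkin script tallying those numbers, in case it helps anyone eyeball it. The 4060's 200W/48MB figures are still rumored specs at this point, so treat them as placeholders:

```python
# Tally the x060-class numbers quoted above. TDPs and L2 sizes are the
# figures from this post; the 4060 entries are rumors, not confirmed specs.
cards = [
    ("GTX 1060 6GB",  120,  1.5),   # (name, TDP in watts, L2 in MB)
    ("RTX 2060 6GB",  160,  3.0),
    ("RTX 3060 12GB", 170,  3.0),
    ("RTX 4060 8GB",  200, 48.0),   # rumored
]

for prev, cur in zip(cards, cards[1:]):
    watts_up = cur[1] - prev[1]
    l2_factor = cur[2] / prev[2]
    print(f"{prev[0]} -> {cur[0]}: +{watts_up} W, L2 x{l2_factor:g}")

# GTX 1060 6GB -> RTX 2060 6GB:  +40 W, L2 x2
# RTX 2060 6GB -> RTX 3060 12GB: +10 W, L2 x1
# RTX 3060 12GB -> RTX 4060 8GB: +30 W, L2 x16
```

The mismatch is kind of the point: the generation with the giant L2 jump (x16) isn't the one with the biggest wattage jump (+40W was Pascal to Turing), so the L2 alone probably isn't the whole power story.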

GPU benchmarking will be one of the pitfalls in the computer/gaming world, as we have no way to limit core count or the amount of L2 used. This takes me back to the Core 2 days: we could limit cores like today, only different CPUs would have more L2, so we could lock the cores and clock speeds to see the real difference the L2 made. I would imagine L2 on GPUs will have more impact, given the flexibility CUDA cores have for doing work. A sketch of the closest thing we can do today is below.
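
You can't mask off SMs or shrink the L2, but you can at least pin clocks so frequency stops being a variable between runs. A minimal sketch, assuming a reasonably recent NVIDIA driver and admin rights; "./my_benchmark" and the clock values are hypothetical placeholders:

```python
# Hedged sketch: pin GPU core and memory clocks via nvidia-smi before a
# benchmark run, then restore defaults afterwards. This removes frequency
# as a variable; it does NOT let you vary core count or L2 size.
import subprocess

def sh(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

try:
    sh("nvidia-smi", "--lock-gpu-clocks=1500,1500")     # pin core clock (MHz)
    sh("nvidia-smi", "--lock-memory-clocks=7000,7000")  # pin memory clock (MHz)
    sh("./my_benchmark")                                # hypothetical workload
finally:
    sh("nvidia-smi", "--reset-gpu-clocks")              # restore defaults
    sh("nvidia-smi", "--reset-memory-clocks")
```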
 
Yea, that's something I've noticed Nvidia is pretty good at. It's happened a few times in the past: fastest at all costs.

FX 5xxx cards were in that boat; for the time, they used a lot of juice.
[Image: blower2.jpg]

Ahh, I had to go digging for that pic... memories.

I've also noticed that as time goes on, ATI... err, AMD and Nvidia flip-flop over power efficiency. Although you don't see AMD go as far with it as Nvidia does sometimes.

But it seems to go like this these days:
go faster and faster, then get efficient with smaller performance improvements, rinse and repeat.

And when a 4060 is supposedly going to be as fast as a 3080, there's probably a good reason for that power draw. Although a quick search shows 3080s pulling about 340 watts, so that's about equal performance (again, supposedly) for 140 watts less in that case...

It just depends on how you look at it.
 
I'm still waiting for a high-performance ~100-150W card, and all I see is that the wattage is going up while next-gen cards are good for about the same display resolution. They are just adding transistors, making the die larger and more power-hungry. In the end, you can live with a cheaper and lower-wattage Core i5/Ryzen 5 CPU even for the most demanding games, but the graphics card is large and noisy and would be best off with its own water cooling loop. I don't know if that's really what most gamers want/need. Put another way: you could use a PC the size of a typical NUC with a 100-150W PSU, but the graphics card itself will soon need a full PC case with a 500W+ PSU. Maybe well-designed eGPUs with full PCIe-bandwidth connectors would be a good idea.
 
Yea, I am still curious how much the increase in L2 contributes to performance. Paper numbers tell me it should be 3070/3070 Ti performance. If, like you said, it is going to be 3080 performance, then that kind of increase in L2 may be what's putting it there. The current specs at https://www.techpowerup.com/gpu-specs/geforce-rtx-4060.c3891
show it has a lower pixel rate than the 3070.

Memory BW is down by about half, a little less than half (rough math below). It will be interesting to see the benchmarks when it hits. I am still on the fence about whether I should wait; I feel like if I wait, the 4060 may not show up on the market till mid or late 2023, given they want to get rid of 3000-series stock.
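
For anyone who wants to check the "a little less than half" claim, peak memory bandwidth is just bus width times effective data rate. The 3070 numbers are the known spec; the 4060 line uses the rumored 128-bit/18Gbps config from that TechPowerUp page, which may still change:

```python
# Peak memory bandwidth: bytes per transfer * effective transfers per second.
def mem_bw_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bw_gbs(256, 14))  # RTX 3070 (known spec): 448.0 GB/s
print(mem_bw_gbs(128, 18))  # RTX 4060 (rumored):    288.0 GB/s
```

288 vs 448 GB/s is a ~36% drop, so "down a little less than half" checks out; presumably the big L2 is meant to soak up enough memory traffic to hide it.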
 
So far, we can see news that the RTX 4000 will be delayed because of too-large 3000 stock (which you already know). Hard to say how much it's delayed, but 1-2 months is suggested, or if AMD doesn't push, it could be more (their cards will probably be delayed too). As always, on premiere day we will probably only see 2-3 high-end versions, and as you said, the 4060 will probably come half a year later, if not more. We are in the worst time to buy anything, as current hardware is already old and all the premieres are expected in 1-3 months.
It's also quite weird, as everything we have in stores is faster than most users will need for the next 2-3 years, but I feel like buying "old" technology is not the best idea.
I already sold what I could from the last/current generation while it was still expensive. RTX 3000 has already been on the market a long time. AMD CPUs are quite old too. Intel is releasing beta products, and I'm tired of motherboard switching only to find out that another one has some new problems. I don't expect Z790 and X670 to be perfect. I only hope we won't have to wait 6-8 months for BIOS updates or other fixes.
 
Cards below the top end already provide all the performance most people need. If you run 1080p or 1440p, the bigger question is how cheap you can get it. Only at 4K, maybe, is there still some demand for more performance. Right now I'd consider the 3070 entry-level 4K, about equal to a 2080 Ti, giving 4K60-class performance at high+ settings. I'm wondering if/when the 4070 eventually arrives, whether it'll finally give me that bit more than I get from a 3070, without going to the expense of a 4080+.

Not being the latest gen is, I think, only a "problem" for tech enthusiasts who live close to the cutting edge. For the masses, it doesn't matter.

High end coming first makes sense, as there is marketing value in holding the halo product of a generation. If you're not going for performance at any cost, you're basically making a tradeoff between performance and value.

Pricing of next-gen GPUs, I feel, will mainly be determined by how AMD behaves this round. They did not go for volume with RDNA2, so pricing isn't so different from Nvidia's. Only if AMD makes more of an attack on market share do I think pricing will be brought down by competition between them.

I'm still hoping to get an Arc to play with. From various Intel online events I've already got an Arc T-shirt, hoodie, and water bottle; still looking for the silicon. I've also got a discount waiting from the earlier scavenger hunt. If that can be applied to the lower-end models, I could get one for very little and it'll be a great plaything.
 
Nvidia just missed earnings pretty badly, so I wouldn't be surprised if they try to push the timeline up. They talked about changing pricing to address "challenging market conditions", which hopefully means lowering prices on current offerings.

Also interesting to see their comments on mining hardware. It makes me wonder if they will remove the LHR on future cards to boost sales.

Source
 
I'm still hoping to get an Arc to play with. From various Intel online events I've already got an Arc T-shirt, hoodie, and water bottle; still looking for the silicon. I've also got a discount waiting from the earlier scavenger hunt. If that can be applied to the lower-end models, I could get one for very little and it'll be a great plaything.
Man, I'm jealous. Yea, I'm wondering about Arc as well. I forget which video I watched on YouTube, but they were talking about emulating DX9 to run those titles; just hoping it doesn't hit performance that much. One thing Intel did not talk about: are they going to offer something akin to SLI/Crossfire?

One thing I realized I missed in the first post, really what I meant to post about: reading all the talk about the power increase being huge on the 40x0s, I was looking at the x060 range, since those are the most common cards people have bought for budget gaming.
 
So far, Arc's results are quite pathetic. I was counting on much more, but everything released/leaked so far matches the performance of the lowest AMD/Nvidia cards, or in the best case something like a GTX 1060. This week there were huge driver updates for Arc, fixing a long list of bugs. We still haven't seen the highest chips, but I heard that something from the upcoming chips was recalled.
 
Yea, I think people get stuck on the 380 thinking it's some big powerhouse card, and it's really not. For ~$150 it's a little less than an EVGA 1650, so in all honesty I think it's right where it should be, and once they get their drivers in order it'll be a decent little card.

I remember I had one of the old Intel AGP cards from a long while ago; I think it was only 2x with 4 megs of DRAM. It was bad for the time, and they clearly didn't read the market. This time around, they did a bit more and better research before jumping in.

Anyway, that old 2x Intel card was my backup for a little while in case I had a problem with my Voodoo 3, but when we found out a PCI S3 ViRGE was better than it, we shot it with a BB gun.
 
Arc was expected to hit the stores two months ago. Intel has huge problems with performance and/or stability, and they only released the lower chips. Now I see that the higher GPUs are not even listed on the Intel website for Q3 2022; there are some lower GPUs from the Pro series, and that's all. The only not-so-bad GPU is the A770M, but it's for laptops.
I find everything below ~RTX 3050 performance pointless, as anything as slow as a GTX 1650 (the A380 is directly compared to that) can nowadays be replaced by an IGP/APU (maybe still faster, but both are equally useless for modern games). The next-gen APUs (which should arrive soon) are expected to perform better.

In my ITX PC, I have an EVGA RTX 3060 XC, which I probably won't replace for a while longer. If I decide to buy anything from the RTX 4000 series, it will be something significantly faster than the RX 6800 XT (I'm using one for tests right now) but not a $2k, 400W+ card.
 
I'm probably skipping the 4000 series. My 3080 Ti has yet to see a game in my collection that even warms it up, the couple of times it's even seen 3D duty.
 
If this is right
and the 4060 is the same price as the 3060, then it will be worth it. For the 200-watt TDP it looks pretty good vs the 3060 Ti; what I think is holding it back is the lack of RAM bandwidth. That is looking at the Time Spy numbers; in-game benchmarks are different, of course.

Long term, for the money, I may actually look at the 4070 card. I tend to keep my monitors a long time, and at the much slower rate I upgrade, this should last me a while. The latest and greatest games I wait on till they go on sale for about half off, give or take.

Watching this video:
if the 4070 card is 260 watts, that's easier to swallow for the price/performance (toy ratio math below).
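
Price/performance and perf-per-watt are just ratios, so here's a toy comparison. The scores, wattages, and prices below are made-up placeholders until real benchmarks and MSRPs land:

```python
# Back-of-envelope perf/W and perf/$ comparison. All numbers hypothetical:
# relative perf is normalized, and TDPs/prices are guesses, not announcements.
cards = {
    #                   (relative perf, watts, price USD)
    "RTX 3080":           (100, 320, 700),
    "RTX 4070 (rumor)":   (100, 260, 600),   # "as fast as a 3080" scenario
}

for name, (perf, watts, price) in cards.items():
    print(f"{name}: {perf / watts:.2f} perf/W, {perf / price:.3f} perf/$")

# RTX 3080:         0.31 perf/W, 0.143 perf/$
# RTX 4070 (rumor): 0.38 perf/W, 0.167 perf/$
```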
 
Not till 2023, though... maybe that's been delayed because of the "stock issues" that have been in the news lately.

Although I wonder if a 4080 Ti would come out around the same time. I am looking at a 4080 so I can ride it for a while, like I tried to do with my 1080 till it died suddenly.

Also, a 16-pin power connector? Is that the same that's been on the 3xxx cards? Seems like it's more.
 
Why did I read the comment section on that linked wccf post? Lol, fanboyism at its worst.

On my desire for a 3080+ performance upgrade, it sounds like my options will be, depending on pricing and availability timing: 3080 (10 or 12GB), 4060 Ti, or 4070. Again, from my side, the first target is "good enough"; then, secondarily, how cheap can I get it?

Also, a 16-pin power connector? Is that the same that's been on the 3xxx cards? Seems like it's more.
I'd have to guess it is the new PCIe 5.0 spec version, which extends the Nvidia 12-pin from last time with four extra sense pins (sketched below).
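
For what it's worth, my understanding of the 16-pin (12VHPWR) connector is 12 power pins plus a 4-pin sideband, where two sense pins tell the card how much power the PSU can deliver. The mapping below is my reading of the ATX 3.0 spec, so double-check against the spec before relying on it:

```python
# Sketch of the 12VHPWR sideband signaling as I understand it (ATX 3.0 /
# PCIe 5.0 CEM). SENSE0/SENSE1 grounded-or-open combinations advertise the
# cable/PSU's sustained power capability to the card. Treat as unverified.
SENSE_TO_WATTS = {
    # (SENSE0, SENSE1): sustained power limit in watts
    ("gnd",  "gnd"):  600,
    ("open", "gnd"):  450,
    ("gnd",  "open"): 300,
    ("open", "open"): 150,
}

print(SENSE_TO_WATTS[("gnd", "gnd")])  # 600: a fully wired 16-pin cable
```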
 
As soon as I can, I will pre-order the RTX 4090 from EVGA. I'm thinking it will be soon; Nvidia is going to announce the release date in September... right? Just doubled my RAM to 64GB for a piddly $150. The one set of 32GB cost me almost $400!

Will my PSU be enough for the rig in my sig and a 4090? I have a Corsair RM 1000-watt.
 
I hope you're using close to/over 32GB... otherwise that 64GB was a waste!

PSU will be plenty. :)
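
Rough power-budget math backs that up. These component numbers are ballpark assumptions (the 4090's TDP was still a rumor at this point), not measurements from that rig:

```python
# Crude "RM1000 + RTX 4090" headroom estimate. All figures are ballpark
# assumptions for a typical gaming build, not measured values.
budget = {
    "RTX 4090 (rumored ~450 W)": 450,
    "CPU under gaming load":     200,
    "board/RAM/drives/fans":     100,
}

total = sum(budget.values())
psu = 1000
print(f"estimated load {total} W of {psu} W -> {psu - total} W headroom")
# estimated load 750 W of 1000 W -> 250 W headroom
```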
 
As soon as I can, I will pre-order the RTX 4090 from EVGA. I'm thinking it will be soon; Nvidia is going to announce the release date in September... right? Just doubled my RAM to 64GB for a piddly $150. The one set of 32GB cost me almost $400!

Will my PSU be enough for the rig in my sig and a 4090? I have a Corsair RM 1000-watt.
Do they pre-order for MSRP? I just want the basic-level card they have, since they have FTW etc. Assuming the 4070 will be out at the same time, or at least up for preorder. Still, all indications are that the 4060 will be in 2023. Really don't want to wait anymore!
 