
4070 GDDR6 impending


mackerel

Member
Joined
Mar 7, 2008
For those that don't keep up on the rumours, the standard 4070 has 21 Gbps GDDR6X. Due to shortages, it is in the process of being switched over to 20 Gbps GDDR6. I'm seeing some stupid takes elsewhere. Why do I even look? Some places are losing their heads over this "massive" performance nerf. How could Nvidia do this? Slash the prices to compensate!

I decided to run some numbers myself. I don't have one of the new models, nor do I have any intent of getting one, but it seems easy enough to turn down the memory speed to match. Since I did testing for KB5041587 recently, I was already halfway there before I even started! I only needed to rerun the benchmarks at the lower memory speed. The drop in memory speed is 4.8%, so if all else were equal and a workload were purely bandwidth limited, we could expect to see a 4.8% drop in performance.
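The back-of-envelope maths is simple enough to show. A quick sketch, assuming the 4070's 192-bit memory bus from the public specs (not something I measured here):

```python
# Rough bandwidth numbers for a 192-bit bus (bus width assumed from
# public 4070 specs). Per-pin rates in Gbps, bandwidth in GB/s.
bus_bits = 192
gddr6x = 21  # Gbps per pin, standard 4070
gddr6 = 20   # Gbps per pin, rumoured replacement

bw_old = gddr6x * bus_bits / 8
bw_new = gddr6 * bus_bits / 8
drop = (bw_old - bw_new) / bw_old * 100
print(f"{bw_old:.0f} GB/s -> {bw_new:.0f} GB/s, {drop:.1f}% drop")
# 504 GB/s -> 480 GB/s, 4.8% drop
```

So that 4.8% is the worst case, only reached if a workload scales purely with bandwidth.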

Steel Nomad does whatever it does. The rest I run at 4k output since that is my display.

3DMark Steel Nomad - 3819 vs 3763 (-1.5%)
Black Myth Wukong benchmark - 4k very high, motion blur off, 50% scaling, RT medium - 61 vs 60 fps (-1.6%)
SOTTR - 4k high - 122 vs 120 fps (-1.6%)
Watch Dogs Legion - 4k Ultra, RT off - 51 vs 50 fps (-2.0%)
FFXIV DT benchmark - 4k max - 70.75633 vs 70.63608 (-0.2%)
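The percentages above come straight out of the raw before/after numbers; anyone wanting to sanity-check them can do it in a few lines:

```python
# Recompute the drops from the raw numbers posted above (old, new).
results = {
    "Steel Nomad": (3819, 3763),
    "Wukong": (61, 60),
    "SOTTR": (122, 120),
    "WD Legion": (51, 50),
    "FFXIV DT": (70.75633, 70.63608),
}
for name, (old, new) in results.items():
    print(f"{name}: {(new - old) / old * 100:+.1f}%")
```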

There we are. A drop of under 2%, which no one is likely to ever notice outside of benchmarks. I should add, I'm no benchmark monkey. I ran each of these once, and they looked in the ballpark of the old numbers, so no repeatability and no double-checking. Also, the middle three only report to the nearest whole fps, so there's some quantisation error there. Dropping the speed of GDDR6X might not exactly replicate GDDR6 either, since there might be differences in timings or power usage. If GDDR6 uses less power, the core could possibly boost a bit more. I'll leave that to the usual suspects after the GDDR6 cards hit the shelves.
 
Yeah, I saw some of those reactions... I'll leave it at that. :)

Thanks for taking the time to approximate the difference and get some ballpark figures!
 

A test of a GDDR6 model is out. WCCF did the testing, but their presentation really needs work. Videocardz tried to put it into a more digestible table. The average perf drops at 1080p, 1440p and 4k were 0.2%, 1.0% and 2.3% respectively.

Looking at the detail raises some interesting questions. As mentioned before, if something were purely bandwidth limited, we could expect up to a 5% drop. Metro Exodus saw drops of 4%, 9% and 10%, but not when RT was used. I'd treat that as an outlier, and it warrants further investigation to check repeatability. There are a couple of other individual points around 5%, but generally it is less. I suspect this is run-to-run variation; I'm not familiar with how WCCF test.

I think it would be more interesting to see a median value, as that would resist outliers better. I would use Videocardz's table, but they seem to have implemented some anti-copy measure on their site and I'm not motivated enough to get around it. In the past, text was still easily extractable from the page source, but I'm not sure that would go so well for a table.
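To illustrate the point with my own five results from earlier in the thread (not WCCF's data), the median shrugs off the FFXIV near-zero result while the mean gets pulled by it:

```python
from statistics import mean, median

# The five drops (in percent) from my earlier 4k run.
drops = [-1.5, -1.6, -1.6, -2.0, -0.2]
print(f"mean:   {mean(drops):.2f}%")
print(f"median: {median(drops):.2f}%")
```

With an odd outlier or two in a larger result set like WCCF's, the same effect would be stronger.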
 

I can't copy the text without digging into the code, but I can take a simple screenshot.

r47.jpg
 
The screenshot then needs to be fed into OCR to extract the numbers, and the formatting and values need to be checked. That's more than I'm motivated to go through right now. If it weren't protected, it would be a simple copy and paste into a spreadsheet and I'd be done already.
 