
New 3080 with 12GB of RAM


rainless

Old Member
Joined
Jul 20, 2006

I never understood why the original only had 10GB... that's such a WEIRD amount of RAM...

I mean, I guess they could come out with a card that has 190GB of RAM for about 30 bucks... but it wouldn't matter, since they'd only be able to make like FOUR of them.
 
On the original, I guess they went with a target RAM bandwidth: with faster chips they could meet it without needing to go as wide. Now the direction is paying more attention to capacity.
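For concreteness, here's that tradeoff as quick math, a minimal sketch using the 3080 10GB's commonly quoted specs (320-bit bus, 19Gbps GDDR6X, ~760 GB/s) as the assumed figures:

```python
def bus_bits_needed(target_gb_s: float, gbps_per_pin: float) -> float:
    """Bus width (bits) needed to hit a bandwidth target.
    Peak bandwidth (GB/s) = bus_bits * per-pin rate (Gbps) / 8."""
    return target_gb_s * 8 / gbps_per_pin

TARGET = 760.0  # GB/s, the 3080 10GB's quoted bandwidth

print(bus_bits_needed(TARGET, 19.0))  # 320.0 -> 19Gbps GDDR6X fits a 320-bit bus
print(bus_bits_needed(TARGET, 14.0))  # ~434  -> plain 14Gbps GDDR6 would need a
                                      #          much wider (costlier) bus
```

Faster chips on a narrower bus also means fewer memory packages, which is how you land on a capacity like 10GB.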

I've seen a LOT of hate on it elsewhere, but IMO it doesn't matter. It's going to be close enough in performance, and the pricing will be whatever the market dictates regardless of MSRP. And it will still sell out before I get to see one.

I was looking at new Nvidia GPU prices earlier. I don't know why. In the 200 currency unit* class were the 1650/1050. In the 400-500 currency unit class were the 1660 Ti/Super and the 2060 in 6GB or 12GB; the 12GB exists, and it's on sale! The 3070 Ti is nudging 1000, and the 3080 Ti ~1800. If my 3070 dies I'd be screwed, as I'm not replacing it at current pricing; I'd have to make do with my backup 2070 or 1080 Ti.

*currency unit is GBP, but for whatever reason it is often numerically close to the USD pricing.
 
Yeah, the 1080 Ti had 11GB... The good news is the old 3080s are now obsolete and we need to upgrade.... :ROFLMAO:


Yeah, this is what I mean... I never got why a 3080 could have less RAM than a 1080 Ti... (Even though 11GB is an even WEIRDER amount of RAM! :D )

I was looking at new Nvidia GPU prices earlier... In the 400-500 currency unit class were the 1660 Ti/Super and the 2060 in 6GB or 12GB; the 12GB exists, and it's on sale!

Wait a minute... There's a 2060 Super with 12GB of RAM?!? That's news to ME!

I don't really care about gaming on PC anymore ("PHILISTINE!!!"). I've got a PC loaded with all the last few games I cared about (Cyberpunk, Watch Dogs: Legion, etc.), and all I actually play is Disco Elysium. I've just stopped thinking of my desktop as an entertainment device; I use it almost exclusively to edit videos in DaVinci Resolve. And in Resolve... the more vRAM the better...

...or so I thought! Until the M1 MacBook came along! Somehow, with 8GB of shared RAM, my M1 MacBook performs about the same as my desktop with an 8GB 2060 Super, 32GB RAM, etc., etc., etc...

I had the 3060 12GB that I posted a photo of. I returned it because the performance seemed too similar to the 2060 Super that I already had. (That turned out to be a wise move, because I used the money to get my M1 MacBook.)

A couple of weeks from now (if I'm not obliterated from the earth and there's still, somehow, a place for me in the universe) I'll finally be reunited with my desktop. Then I'll be able to run some tests and see which machine encodes faster. If the M1 actually outperforms my desktop... then that's a WRAP for me in terms of video cards this generation.

I won't be looking for anything new until the 4000 series. Hopefully, by then, COVID-19 will finally be at bay and the chip shortage will have ended.
 
Yeah, the 1080 Ti had 11GB... The good news is the old 3080s are now obsolete and we need to upgrade.... :ROFLMAO:
The 1080 Ti was a flagship card. The 3080 is not (well... that depends on whether you think the 3090 was a 'Titan', I guess, lol).


The thing it matters for is the memory limit for loading high-resolution textures in some games... There may be a workaround, but with my RTX 3080 I could not load high-res textures.
I have to wonder if this is outdated? I say that because we test FC6 with the HD Texture pack. For my freelance work I use an 8GB RTX 3070... no errors, and it runs fine (getting the FPS expected, no notices).

EDIT: That FC6 article was from 10/2021, so it's 'old'... but I don't recall ever running into that issue, and I started benchmarking on this title shortly (less than a month?) after. I'd imagine that was patched. If I load it up now at 1080p/Ultra/HD Textures on this 8GB card, it says 5.56GB of 7.85GB will be allocated.


People poking at the 3080 for a lack of vRAM (not at this site) were, IMNSHO, off their rocker, unless their intent was to use the card at 4K with modded textures... otherwise, we can see from tests that even at 4K, few titles even come close to that amount of vRAM use. TPU has a great database of game reviews that covers vRAM use. Last year, when the 3080 came out, the last 10 titles averaged around 6GB at 4K, with one title reaching 9GB. I'd imagine a similar story if we looked at the last 10 there now.

On top of that, there's also a difference between allocation and use. Some games may allocate more vRAM but not actually utilize it, so just because a title shows 9GB "used" doesn't mean it affects performance in any way. We've seen cards with different RAM capacities perform the same even though the vRAM was 'full'. We've also seen games scale with available vRAM (without detriment to the lower-vRAM cards). There's a lot to consider here.
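If you want to watch that number yourself while a game runs, here's a minimal polling sketch using the pynvml bindings (assumes an Nvidia card and the nvidia-ml-py package installed). Note that NVML reports *allocated* memory, not the true working set, which is exactly the allocation-vs-use gap described above:

```python
# pip install nvidia-ml-py  (assumed; provides the pynvml module)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)          # bytes
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)   # percent
        print(f"vRAM allocated: {mem.used / 2**30:.2f} / "
              f"{mem.total / 2**30:.2f} GiB | GPU load: {util.gpu}%")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Run it in a second window while gaming: a card that sits near "full" here but holds its FPS is allocating, not starving.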

To me, a 3080 12GB is marketing preying on the unadorned gamer. It's useful for those who want to use that card at 4K and may have modded games... or who use it for creative-type work where the increase in vRAM matters. But for gaming... save your money, IMO. :)
 
I saw these on EVGA last night; oddly, there is no queue to sign up for it. I have a 3080 and don't really see the need for a 12GB version. I run every game I play maxed out with DLSS and RTX on with no problems. Granted, I don't play Crysis-style games, but COD Vanguard, Warzone, and Black Ops all run great maxed out. I guess if someone couldn't get a 3080 10GB and the 12GB comes available then it makes sense, but not coming from a 3080 10GB.
 
To me, a 3080 12GB is marketing preying on the unadorned gamer. It's useful for those who want to use that card at 4K and may have modded games... or who use it for creative-type work where the increase in vRAM matters. But for gaming... save your money, IMO. :)
Is that a gamer who games naked? 😄

The only way to get GPUs at close to retail was to be smart enough to be in the EVGA notify queue (in other words, have someone like Janus recommend it (y)) when it came out, or the Newegg Shuffle... (I guess you could wait in line at MC or Best Buy every morning.) You have to take whatever you can get. Or, like smart folks are doing, skip them altogether.

It is not like you can ponder which GPU to choose at a store shelf... I'll take the 12GB if it comes up in a shuffle (at the right price).

According to Ubisoft you still need a lot of vRAM for FC6... Did you download the larger texture packs for your tests? I certainly wanted to run it with higher textures for it to look its best. I switched from 4K to 2K for performance... but I can see a time in the near future when my next monitor/TV will be 4K again.


There are also recommendations available for playing the game at higher resolutions.

Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM.

If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing.
 
Is that a gamer who games naked? 😄



According to Ubisoft you still need a lot of vRAM for FC6... Did you download the larger texture packs for your tests? I certainly wanted to run it with higher textures for it to look its best. I switched from 4K to 2K for performance... but I can see a time in the near future when my next monitor/TV will be 4K again.

:rofl:... just someone who doesn't know. No experience. Likely not the best word there. How about... 'those in the know'? :p

I'm not sure what to tell you there. I absolutely have the HD Texture pack downloaded and installed, as the option in settings would otherwise be greyed out. But yeah, try it yourself and see; the game tells you how much vRAM it uses in Video settings. At 1080p/Ultra/HD enabled, a 3070 runs ~124/~135 FPS (min/avg) with that ~5.4GB allocated (according to the game).

[Screenshot: FC6 video settings showing the vRAM allocation readout]

Note, this could also be one of those games that allocates more if it has more available. But yeah, the FPS are obviously solid, and in my limited time playing on that system I didn't notice any hitching or performance issues (though it was literally about 10 minutes of gameplay). I just run the benchmark, lol.


If you're still trying to game at 4K, drop AA down to off or 2x. At the higher res you don't need the same amount of AA, and it can be an FPS killer. ;)

It is not like you can ponder which GPU to choose at a store shelf... I'll take the 12GB if it comes up in a shuffle (at the right price).
Oh, no doubt... just saying that someone who doesn't know generically believes that 12GB is better than 10GB. And while that can be true, it surely isn't always (read: gaming at 4K or less without modding). If you don't have a choice, you don't have a choice. It isn't going to hurt anything, lol.
 
Last edited:
Wait a minute... There's a 2060 Super with 12GB of RAM?!? That's news to ME!
No, it's a regular 2060 with 12GB of VRAM. The internet did a collective "why?" when it was revealed. My best guess was Nvidia's previous statement that a modern gaming GPU needs 8GB minimum, and the old 6GB wasn't cutting it. So without messing up RAM bandwidth, 12GB is the way to go. I think they made similar comments when the 3060 came out.
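For what it's worth, the "weird" capacities fall straight out of the bus width: each GDDR6 chip sits on a 32-bit channel, so a 192-bit bus means six chips, and with 1GB or 2GB chips your only options are 6GB or 12GB, with no 8GB in between. A quick sketch of that arithmetic (bus widths are the commonly quoted specs, treat them as assumptions):

```python
def vram_options(bus_width_bits: int, chip_gb=(1, 2)) -> dict:
    """Possible VRAM sizes for a bus: one 32-bit channel per GDDR chip."""
    chips = bus_width_bits // 32
    return {f"{cap}GB chips": chips * cap for cap in chip_gb}

# 2060 12GB / 3060: 192-bit bus -> 6 chips -> 6GB or 12GB, never 8GB
print(vram_options(192))   # {'1GB chips': 6, '2GB chips': 12}
# 3080 10GB: 320-bit bus -> 10 chips -> 10GB or 20GB
print(vram_options(320))   # {'1GB chips': 10, '2GB chips': 20}
# 1080 Ti's odd 11GB: 352-bit bus -> 11 chips x 1GB
print(vram_options(352))   # {'1GB chips': 11, '2GB chips': 22}
```

So keeping bandwidth intact (the full bus populated) forces the capacity to double rather than creep up by a gigabyte or two.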

As for the Apple side, they have the advantage of being able to optimise at a system level, whereas in PC land we pick at the component level. Of course, there are tradeoffs either way. Apple chose where they wanted to fight, and in that, of course, they're doing well. But Apple are useless for high-end gaming, so for that I'll have PCs for the foreseeable future. The nearest I get to Apple is an iPad mini for mobile games, where again they're "winning" in performance against Android hardware.
 
My issue with this new 12GB card is that the cost [per an article I was reading] is around $1300 starting, which is just insane. Granted, in this market many beggars can't be choosers, but I sure wouldn't be paying that premium for an extra 2GB of memory on one of my 3080s. The 10GB, even at 4K, has been fine as far as I could tell. Especially since all of these new cards will be LHR as well.
 
I think Nvidia has always played with odd RAM sizes. I stopped looking at that back when I got suckered into buying a 5200 with double the vRAM back in the day. I feel that the speed and/or amount of components within a card matters far less than the performance of the card itself. I don't care if my card has 11 or 12GB as long as it does what I want.

Hear me out. I know that some folks will have requirements that lean heavily on memory bandwidth in a card; the key to that statement is the word "some". For the rest of us, manufacturers' marketing departments use such numbers to persuade us to buy one card over another, even if those numbers have no meaning for our intended use. Saying that one car can go 160 mph vs another car's 170 mph means nothing if I never drive more than 70 mph. Do I care if my card has 11 or 12GB? No. I only care if it can play Crysis.
 
Saying that one car can go 160 mph vs another car's 170 mph means nothing if I never drive more than 70 mph. Do I care if my card has 11 or 12GB? No. I only care if it can play Crysis.

To build on your statement: it isn't just about top speed, it's how quickly you get there.

3080 - 8,704 CUDA cores
3080 12GB - 8,960 CUDA cores
3080 Ti - 10,240 CUDA cores

EVGA FTW3 Ultra Gaming pricing:

3080 - $920
3080 12GB - $1,300
3080 Ti - $1,430



There is a $380 stretch from the 3080 to the 3080 12GB, which nets you 20% more RAM and 2.9% more CUDA cores, but comes at a price increase of 41.3%.


Going from a 3080 12GB to a 3080 Ti is $130 more, which nets you the same RAM size and 14.3% more CUDA cores, at a price increase of only 10%.


Looking at the cost against the increase in RAM size and core count, I am left scratching my head. A $380 difference from a 10GB card to a 12GB card is a large jump in my opinion, and one I don't think is worth making, especially when you consider the next jump is only another $130. This card makes little sense to me. If it came in around $1100 to $1200 it would be more understandable.
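For anyone who wants to verify the deltas, here's the arithmetic as a quick sketch (prices and core counts from the list above; percentages are relative to the cheaper card):

```python
# (cuda_cores, price_usd) for EVGA FTW3 Ultra cards, per the list above
cards = {
    "3080 10GB": (8704, 920),
    "3080 12GB": (8960, 1300),
    "3080 Ti":   (10240, 1430),
}

def pct_more(new: float, base: float) -> float:
    """Percent increase relative to the baseline value."""
    return (new - base) / base * 100

for a, b in [("3080 10GB", "3080 12GB"), ("3080 12GB", "3080 Ti")]:
    (cores_a, price_a), (cores_b, price_b) = cards[a], cards[b]
    print(f"{a} -> {b}: +{pct_more(cores_b, cores_a):.1f}% cores, "
          f"+{pct_more(price_b, price_a):.1f}% price (${price_b - price_a})")

# 3080 10GB -> 3080 12GB: +2.9% cores, +41.3% price ($380); RAM +20%
# 3080 12GB -> 3080 Ti:   +14.3% cores, +10.0% price ($130)
```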


But then I look at it like this: 3080 12GB chips are failed 3080 Ti chips, so create a new SKU with a hefty price increase to maximize profit from a piece of silicon. 256 more CUDA cores alone does not justify a large price increase, but add 2GB of RAM and then you can justify it.



Bottom line to me is that this card would not sell if shelves were full of 3080s and 3080 Tis. From a consumer standpoint it does not fill a need.


EDITED to add: I wonder how many 3080s released before this were capable of using 8,960 cores? I wonder if someone will come out with a hack to unlock all the cores?
 
My issue with this new 12GB card is that the cost [per an article I was reading] is around $1300 starting, which is just insane. Granted, in this market many beggars can't be choosers
See that's the thing: Beggars can and SHOULD be choosers.

Pricing has gotten completely insane. (Especially given that most of these cards can BARELY handle ray tracing at higher resolutions, and we basically have to wait till the next gen to see the technology fully realized anyway...)

If people stopped BUYING these cards... and paying scalpers and putting up with all this weird lottery insanity... then the manufacturers would change their tune.

It's like when the game companies announced they were going to raise the price of games from $59.99 to $69.99. I wasn't worried at all, because I knew no one in their right mind would ever pay that much.

So what happens now? There's a sale like EVERY... SINGLE... WEEK.

Especially on AAA titles. (I wish there was some metric out there where you could track what percentage of people have actually paid $70 for a game...)

They could make the retail price $180... doesn't mean anyone would ever actually pay that.

The same thing would happen with graphics card prices if system builders could use the "chill-the-f-out" feature.
 
See that's the thing: Beggars can and SHOULD be choosers.

Pricing has gotten completely insane...
Understood that gamers should not buy these... but those of us not gaming with them actually save money with them....

It took three 2060s to do the science work of one 3080. Over the course of a year, that is half the electricity.
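Rough sketch of the electricity math behind that; the wattages and rate here are assumptions for illustration (nominal board powers, not measured draw under a compute load):

```python
# Hypothetical example: compute work (folding/BOINC-style) running 24/7.
watts_2060, watts_3080 = 160, 320  # assumed nominal board powers
hours_per_year = 24 * 365
rate_usd_per_kwh = 0.15            # assumed electricity rate

def annual_cost(watts: float) -> float:
    """Electricity cost for a year of continuous load."""
    return watts * hours_per_year / 1000 * rate_usd_per_kwh

three_2060s = annual_cost(3 * watts_2060)  # 480W doing the same work
one_3080 = annual_cost(watts_3080)         # 320W
print(f"3x 2060: ${three_2060s:.0f}/yr  vs  1x 3080: ${one_3080:.0f}/yr")
# ~$631/yr vs ~$420/yr at these assumptions: about 2/3 the power on paper;
# getting to "half" depends on the actual draw each card sees under load.
```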
 
In 2018, Toyota sold 2.1M vehicles. In 2021, Toyota sold 2.3M vehicles. In a chip shortage?

It feels to me like 2021 was the year of the upgrade. The year of "no matter the cost, the upgrade is imperative."

I've said it many times here and elsewhere: buying from scalpers has shown the manufacturers that the product they produce is worth more than they were selling it for. So here we are, complaining that newly released cards are going for scalper prices.
 