I brought up the issue of how much VRAM usage has increased over time at a different site (admittedly in an unrelated thread) and was not at all satisfied with the answers. I started the argument by bashing TweakTown for posting VRAM consumption figures and calling them B.S. I now regret that, since their data turned out to be very useful.
So here is what I posted:
The article by TweakTown is sensationalist and rather irresponsible. By the end of it, they are basically having uninformed consumers run out and buy GPUs with 12+ GB of VRAM "in order to play tomorrow's games," since ME:SoM is shown to use 8.5 GB. This claim could easily have been validated by showing a large performance drop between the 6 GB 980 Ti and the 12 GB Titan, or even between a 4 GB and an 8 GB 390X. They also did not compare how different architectures and memory speeds affect VRAM consumption, which would be especially important with HBM since it is an entirely new beast.
Then I looked into it a little further. At first glance their data suggests that VRAM consumption doesn't scale with resolution, but in a way I showed that it does:
I found it fascinating that 4K does not use 4 times as much memory as 1080p. It's like there is a
'y factor', where y is memory reserved regardless of resolution. Bear with me...
Take, for example, Tomb Raider, which uses 1.5 GB of VRAM at 1080p and 3.1 GB at 4K.
Now use 2.07 megapixels for 1080p and 8.3 for 4K.
Let's call 'x' the amount of VRAM needed per megapixel,
and 'y' the VRAM overhead I mentioned earlier that is not affected by resolution.
Starting with 4K:
8.3x + y = 3.1 .... and now 1080p:
2.07x + y = 1.5 .... using substitution,
8.3x + 1.5 - 2.07x = 3.1 ...solve for x and
x = 0.257, or in other words 0.257 GB needed for each megapixel.
Now we can solve for y:
2.07x + y = 1.5
y = 1.5 - 2.07x ....replace x with 0.257 and
y = 0.97
Now let's test with 1440p, or 3.7 megapixels, which was said to use 1.94 GB:
3.7x + y = 1.92 ---> pretty damn close!!
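If you'd rather let a computer do the algebra, here's a minimal Python sketch of the same two-point solve; the only inputs are TweakTown's Tomb Raider readings from above:

```python
# Solve vram = x * megapixels + y from two (megapixels, GB) readings.
# Inputs: TweakTown's Tomb Raider numbers at 1080p and 4K.
mp_1080, vram_1080 = 2.07, 1.5
mp_4k,   vram_4k   = 8.3,  3.1

# Subtracting the 1080p equation from the 4K equation eliminates y.
x = (vram_4k - vram_1080) / (mp_4k - mp_1080)  # GB per megapixel, ~0.257
y = vram_1080 - mp_1080 * x                    # the 'skeleton', ~0.97 GB

print(f"x = {x:.3f} GB/MP, y = {y:.2f} GB")
print(f"predicted 1440p usage: {3.7 * x + y:.2f} GB")  # vs. 1.94 GB reported
```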
Just for fun, I tested again, first using ME:SoM, and came up with
x = 0.1 and y = 4.56, where again x is GB/megapixel and y is the "game skeleton".
I tested at 1440p and got 4.93 GB vs. the 4.97 GB posted by TweakTown.
Then Metro: Last Light gave me:
x = 0.115 and y = 1.06.
Testing the formula at 1440p gave me 1.49 GB vs. the 1.46 GB that TweakTown recorded.
ARE YOU FRICKIN KIDDING ME!!!
So if it weren't for this mysterious 'y-factor' that has NO explanation,
we could play all these games at 4K with 2 GB or less.
What was really fascinating was that the VRAM hog ME:SoM actually required LESS
VRAM per megapixel than Tomb Raider!!!
Last one, I promise: Far Cry 4. This time I used 8.3 MP for 4K and 2.07 for 1080p (thanks rumartinez).
x came out to 0.43 GB/megapixel (the highest so far!), with "the skeleton" being 2.17 GB.
Again testing at 1440p:
3.7x + 2.17 = 3.76 GB. Wait for it..... TweakTown reported 3.77 GB.
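Since it's the same two-point solve every time, here's a sketch that re-runs the 1440p sanity check for all four games at once; the (x, y) pairs are the ones derived above, and the "reported" numbers are TweakTown's 1440p readings:

```python
# Re-run the 1440p check for each game: predicted = x * MP + y.
# (x, y) pairs come from the two-point solves above.
games = {
    # name:            (x GB/MP, y GB, reported 1440p GB)
    "Tomb Raider":      (0.257, 0.97, 1.94),
    "ME:SoM":           (0.100, 4.56, 4.97),
    "Metro Last Light": (0.115, 1.06, 1.46),
    "Far Cry 4":        (0.430, 2.17, 3.77),
}
MP_1440 = 3.7  # 2560x1440 is ~3.69 megapixels

for name, (x, y, reported) in games.items():
    predicted = MP_1440 * x + y
    print(f"{name}: predicted {predicted:.2f} GB vs. reported {reported} GB")
```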
The other forum members argued that things like AA and Z-buffering were the cause of this large 'y-factor', or,
as I now like to call it, the "skeleton".
So what I did then was use an older game that still used some of these modern-day effects...
It took a lot of digging, but here is a link to VRAM usage in Crysis 1:
http://hardforum.com/showthread.php?t=1456645
defaultluser reports 0.31 GB @ 0.48 MP, 0.36 GB @ 0.79 MP, 0.45 GB @ 1.31 MP,
and 0.575 GB @ 1.9 MP.
Using 1.9x + y = 0.575 and 0.48x + y = 0.31,
.....using substitution....
x = 0.187. Yep, that's right: still in the same ballpark of GB/megapixel as today's games.
y = 0.22 GB, which is much more reasonable. This was with 4xAA, btw.
Checking with 0.79 megapixels, or 1024x768.....
0.79x + y = 0.368, compared to the 0.36 GB that they recorded.
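With four data points instead of two, you can also do a least-squares fit rather than picking any single pair; here's a sketch (this one assumes NumPy is available, and uses defaultluser's numbers as-is):

```python
# Least-squares fit of vram = x * megapixels + y across all four
# of defaultluser's Crysis 1 readings (4xAA), not just two of them.
import numpy as np

mp   = np.array([0.48, 0.79, 1.31, 1.90])   # megapixels
vram = np.array([0.31, 0.36, 0.45, 0.575])  # GB of VRAM used

x, y = np.polyfit(mp, vram, 1)  # slope = GB/MP, intercept = skeleton
print(f"x = {x:.3f} GB/MP, y = {y:.2f} GB")  # ~0.186 and ~0.21, close to the
print("fit check:", np.round(x * mp + y, 3))  # hand-solved 0.187 / 0.22 above
```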
So in review, all of these games use a similar amount of memory per megapixel. What changed is how much the base skeleton grew:
going from Crysis to ME:SoM, it increased 20 times over! The only good news is that it will only take a couple more
GB of VRAM to get to 8K in the future. Unfortunately, if the skeleton keeps growing at this rate, you will need 30 GB just to play at 1080p.
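To put a number on that 8K claim, here's the same model extrapolated forward; purely illustrative, and it assumes ME:SoM's x and y stay fixed at the values derived above:

```python
# Extrapolate the linear model to 8K, assuming ME:SoM's per-megapixel
# cost and skeleton stay at the values derived above. Illustrative only.
x, y = 0.10, 4.56  # ME:SoM: GB/MP and skeleton GB

mp_8k = 7680 * 4320 / 1e6  # ~33.2 megapixels
print(f"ME:SoM at 8K: ~{x * mp_8k + y:.1f} GB")  # only a few GB over its 4K usage
```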