
AMD 7nm Vega 20 Benchmarked


deathman20

High Speed Premium Senior
Joined
Aug 5, 2002
Saw this article this morning. https://wccftech.com/amd-7nm-vega-20-32gb-gpu-3dmark-benchmark-leaked-up-to/

Quick cut to the point: if true, and AMD can actually turn this out in a timely fashion, this card goes head to head with a liquid-cooled Vega 64 at 70% of the clock speed. Of course, that's assuming 3DMark is reading the right numbers from the card. I think it's nuts to have 32GB of HBM2 on the card, though... 16GB I could maybe see, still overkill, but 32GB is unnecessary overkill unless it's for workstation-class cards.
 
I'm guessing with that much vRAM on board, it's not a 'gaming' card.

Either way...

1. Wccftech
2. Nice results for whatever that thing is assuming those clocks are remotely right (I doubt it).
 
Maybe that much vRAM could be used as a "superfetch" cache or pagefile? It would definitely be faster than a mechanical HDD.
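On the speed point: even bottlenecked by the PCIe link, VRAM used as swap would dwarf a mechanical disk. A quick back-of-envelope comparison, using assumed round-number bandwidths (roughly PCIe 3.0 x16 and a typical spinning HDD, not measured figures):

```python
# Time to move 1 GB of pagefile data, with assumed round-number bandwidths
GB = 1.0
pcie3_x16_gbs = 16.0   # assumed approx. PCIe 3.0 x16 throughput, GB/s
hdd_gbs = 0.15         # assumed typical HDD sequential throughput, GB/s

print(GB / pcie3_x16_gbs)  # ~0.06 s over the PCIe link
print(GB / hdd_gbs)        # ~6.7 s from a mechanical disk
```

So even without touching the HBM2's native bandwidth, paging over PCIe would be around 100x faster than the disk it replaces.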
 
Ehh, I don't think it can be used that way. This is likely a workstation level card with that much vRAM.
 
Couldn't this just be a dual-GPU workstation-class card based on the current Vega platform? That would explain the performance at the lower clocks as well as the 32GB of vRAM.
 
That's thinking outside of the box!

Though, I don't think we'll see a dual GPU card..........just a hunch.
 
Everyone should be looking into UMA topologies and watching for articles (when they become available) on xGMI and Gen-Z interfaces.

https://genzconsortium.org/
https://en.wikipedia.org/wiki/Cache_coherence
https://en.wikipedia.org/wiki/Uniform_memory_access

Additionally, this is most likely the very first 7nm Vega that AMD has received from GlobalFoundries' risk production. It is most likely one of their earliest samples of Vega 20 and definitely not close to the final product. I would be shocked if 32GB is used on the consumer-grade cards, but the workstation- and server-class Vegas definitely need it. However, it is worth noting that AMD has been on track to move more memory handling into the GPU.
 
I assumed this was a test card for what could become the next Vega FE, but I have heard the new Vega 64 model would be 32GB and the new Vega 56 model would be 16GB. It just seems like overkill for today, unless it was done to double the bandwidth by moving to quad stacks of HBM2 on a 4096-bit bus. I do not, however, want to know what power envelope that stuffs this card into.
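The bandwidth-doubling math checks out if you assume four HBM2 stacks, each with the standard 1,024-bit interface, at roughly 2 Gbps per pin (the pin rate here is an assumption, not a confirmed spec):

```python
# Rough HBM2 aggregate bandwidth estimate (assumed figures, not official specs)
stacks = 4                   # quad-stack configuration
bus_width_per_stack = 1024   # bits per HBM2 stack (standard interface width)
pin_rate_gbps = 2.0          # assumed data rate per pin, Gbit/s

total_bus_bits = stacks * bus_width_per_stack       # aggregate bus width in bits
bandwidth_gbs = total_bus_bits * pin_rate_gbps / 8  # GB/s

print(total_bus_bits)  # 4096
print(bandwidth_gbs)   # 1024.0, i.e. ~1 TB/s
```

Double the stacks of a two-stack Vega 10 and you double the bus width, so bandwidth doubles at the same pin rate.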

I also hope that driver development on it goes better than the current Vega FE card as I am horribly disappointed in that, especially when you drop it into a mixed system or want more than one.
 
Wouldn't be the first time AMD got ahead of themselves (and the need) with new tech.
This. Considering GDDR5X has plenty of bandwidth for 4K, HBM is fine... doubling it up at the consumer level is asinine. By the time it's needed, the core won't be able to play the games anyway....

But seriously, this better be a workstation-class card. If more than 16GB hits the consumer on this round, I'm going to cry.
 

Finally! Crossfire finally makes sense! I'm laughing my a** off right now.
 

Yes, please, no more than 16GB for a consumer card. I think 10GB is the most I've seen used so far, and that is a rare case. I enjoy the 11GB on my card, having come from a 4GB card before, but I still think it's a wee bit overkill from a gaming perspective.

Why not, sir? Run 3 different games simultaneously at 4K. Fun for the whole family

Forget the 3x 1080p or 1440p panels... Let's make it 3x 4K, 5K, or 8K panels!


Really, the only sad part about this: nVidia will come out with something this summer, in theory, and we'll still be waiting until April/May at the earliest for this to hit. I wouldn't mind getting back on the AMD boat for GPUs, as I ran with them for a very long time in the past and do like their openness more than nVidia's. I just needed more power this time, but would be happy to switch back.
 
You guys forget that we are no longer using GPUs just for gaming and video rendering. People are using these cards for high-performance accelerated computing. This demand is being seen not only in the server market, but also at home and in small business.

Additionally, we are looking at the same approach AMD has taken in the CPU market for many years: why produce, validate, and support two different memory configurations for the same GPU when you can lower cost and produce one? Even if it's overkill for a small portion of the market?

Who gives a butt if it's HBM or GDDR10X? If the end goal is the same, but the result is differentiated by cost and time, why not go the route that best suits your company's needs?
 
I don't think anyone cares if they call it a potato - you missed the point.

From a business perspective, a 16GB chunk of HBM isn't cheap, so it makes complete sense to separate the markets. I don't want to pay a $150+ premium because a company decides to slap on enough memory to run 3 games at once..........at 4K, with AA, on Ultra. ;)
 

Who says that 32GB of HBM2 won't cost the same as 16GB of HBM2 did 2 years ago? If the cost is the same, who cares?
 