
AMD RDNA2/Big Navi Rumors thread

I wonder if the clocks are being tweaked behind the scenes on Navi now that they have an idea of Ampere's performance.

I really wonder how much behind-the-scenes spying goes on between them. Did they need to wait until the public announcement? I haven't worked for team red, green or blue, but in my previous employment I did have to regularly sign a code of conduct along the lines of "I won't try to get information by less-than-legitimate methods". Still, there were times when we had access to competitors' pre-release products at customer sites. I'm sure our competitors got our products similarly. If they copied us, it would have set them back!

Also, I have to wonder how much scope AMD have for adjustment. OK, there's always some room to go up or down the performance curve, but at some point it will become too hot to handle, even if they follow Nvidia's 350W limit. Competition has been literally hotting up on both of AMD's fronts, CPU and GPU, and I feel they're constantly operating close to the upper limit of what the silicon has to offer. There might not be more to give, other than unlocking more of the die for the cut-down versions.
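Rough numbers to illustrate why: the usual approximation is that dynamic power scales with frequency times voltage squared, and higher clocks need more voltage. Every figure below is made up for the sake of the curve, not real RDNA2 data:

```python
# Back-of-envelope: dynamic power roughly scales as P ~ f * V^2.
# All figures below are illustrative assumptions, not real RDNA2 data.

base_clock_mhz = 2000   # assumed shipping boost clock
base_voltage_v = 1.00   # assumed voltage at that clock
base_power_w   = 250    # assumed board power at that operating point

# Pushing clocks usually needs extra voltage; assume +50 mV per +100 MHz.
for extra_mhz in (0, 100, 200, 300):
    clock = base_clock_mhz + extra_mhz
    voltage = base_voltage_v + 0.05 * (extra_mhz / 100)
    power = base_power_w * (clock / base_clock_mhz) * (voltage / base_voltage_v) ** 2
    print(f"{clock} MHz @ {voltage:.2f} V -> ~{power:.0f} W")

# ~2000 MHz -> 250 W, but +200 MHz already lands near 333 W and
# +300 MHz blows past 350 W. A 350 W ceiling leaves little room on the curve.
```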

I wish I knew what kind of lead time manufacturers need from having all components on hand to a product ready for customers in a shop (online or physical). That might give some indication of when they would have to freeze the design at the absolute latest, unless you're counting on last-minute BIOS updates to make changes.

In short, whatever AMD have decided on in technical specs, they've probably already frozen it, and they aren't likely to radically change it. What they could still do, of course, is optimise pricing. This is in part why I like to understand the technical differences outside of any cost considerations, to get to know what a product does. Only then does pricing/value come into it, and price/performance isn't the sole area of consideration.
 
I had high hopes for a worthy replacement for the 2080ti this gen. Hopefully AMD has secretly created Infinity Fabric for GPUs and will have a 6700xt single core as fast as a 2080ti for $400, and a 6900xt with two cores linked and acting as one, at double the speed, for $800. Since the RTX 3080 has less VRAM and the 3090 isn't worth it, I think AMD need to come to the rescue here. Please Ms. Lisa, save my fps.
 
Ok, not quite what I wanted to post.

You have to look at performance as performance per dollar, whatever metric you use for performance: FPS, benches, hashrates, etc.

The 5700XT has really come up since release and thrown Nvidia for a loop. At release its FPS performance was behind the 2070 Super, but it has now exceeded it. Hardware Unboxed just made a YouTube video comparing the two.


In my opinion AMD has done well in the FPS per dollar comparison versus Nvidia.


Moving forward, AMD will likely keep (or only slightly lose/gain) market share if the 6000 series cards are close to their respective 3000 counterparts. I.e., if the 6700xt performs at 3070 levels for nearly the same price, they have no reason to earn any more market share than they have now.

When I bought my 5700XT it was $400, versus $500 for the 2070. To me the $100 savings trumped the slight FPS advantage that the 2070 had. The continuing improvement of AMD drivers versus stale Nvidia drivers was also a draw.
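To put that trade-off in numbers (prices as above; the average FPS figures are just assumptions to show the shape of the comparison):

```python
# FPS-per-dollar comparison. Prices are the ones mentioned above;
# the average-FPS numbers are assumptions for illustration only.
cards = {
    "RX 5700 XT": {"price_usd": 400, "avg_fps": 95},   # assumed avg FPS
    "RTX 2070":   {"price_usd": 500, "avg_fps": 100},  # assumed avg FPS
}

for name, c in cards.items():
    value = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {c['avg_fps']} FPS / ${c['price_usd']} = {value:.3f} FPS per dollar")

# Even giving the 2070 a ~5% FPS lead, the 5700 XT comes out
# ~19% ahead on FPS per dollar (0.238 vs 0.200).
```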


If you've read my AMD driver complaints around our forums, you've likely noticed that I will probably go Nvidia this time around. To earn my slice of market share, AMD will have to create a huge FPS-to-$ gap over Nvidia. And even then I will likely go with Nvidia, simply because regardless of my FPS and what I pay, if I am halfway through a game and get bumped to the desktop by a driver crash, my effective FPS is now zero and my frustration level is at its max.

So you guys can argue about past performance of cards and whether AMD will stay relevant with a month-later release... It really comes down to me and people like me. Are we sick of having our games interrupted by AMD's driver issues? The answer will probably be "yes".

Now, moving on to high-end cards. Why would I want to consider a $1,000+ AMD flagship card if they couldn't sort out their $400 flagship card? Seriously, think about that. Leading up to a huge release for both AMD and Nvidia, they still don't have their drivers sorted. Again, in my opinion AMD has an ostrich-sized egg on their face.


If AMD wants my money for the 6000 series, give me a 3080 competitor at a 3070 price. Who am I kidding, I'll probably still go with Nvidia. This $400 bug in my PC is still too fresh. I'm salty.
 
I have no complaints about my 5700xt; it cost £375 when the 2070S was £600+, and it never really gave me any issues apart from the hotspot temps, which were a completely new thing for me (I moved up from a 1060 6GB) and were quickly brought under control. A great investment at the time, and one I would do again given the chance, which seems to be coming now with the 3070 between £400-£500 - that is, assuming the benchmarks come back positive that it really is faster than the 2080ti in EVERYTHING, not just (fairly useless for me) ray tracing.

EDIT: I see the point of gambling with AMD hardware/drivers, but you also need to consider long-term potential: the 5700xt was barely on par with the 2060/2060S when it came out, and now it's trading blows with the 2070S, while the nVidia cards barely gained any performance overall in the same timeframe. I was told at the time I bought it (no idea if it's true or not) that this has been the trend with AMD GPUs for a while.
 
> EDIT: I see the point of gambling with AMD hardware/drivers, but you also need to consider long-term potential: the 5700xt was barely on par with the 2060/2060S when it came out, and now it's trading blows with the 2070S…

Software has probably been optimised for the underlying nVidia architecture, and it looks like AMD code their drivers to mitigate the calls that were slow... This is where AMD is in a pickle: on the consoles, the software is developed and optimised for their (AMD) hardware. You can see that a PlayStation or Xbox runs faster than a PC with equivalent hardware specs.

I might look at AMD graphics cards if their half-precision capability is good, coupled with a good amount of RAM...
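For reference, peak half-precision throughput can be ballparked from shader count, clock, and whether the architecture packs two FP16 ops per FP32 lane (as RDNA does); the part below is hypothetical, with roughly 5700 XT-like numbers:

```python
# Rough peak-throughput estimate: shaders * clock * 2 (an FMA = 2 ops)
# * FP16 packing factor. The 2560-shader / 1.9 GHz part is hypothetical.
def peak_tflops(shaders, clock_ghz, fp16_rate=1.0):
    """fp16_rate=2.0 assumes packed math (two FP16 ops per FP32 lane)."""
    return shaders * clock_ghz * 2 * fp16_rate / 1000

print(f"FP32: {peak_tflops(2560, 1.9):.1f} TFLOPS")                # ~9.7
print(f"FP16: {peak_tflops(2560, 1.9, fp16_rate=2.0):.1f} TFLOPS") # ~19.5
```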
 
> EDIT: I see the point of gambling with AMD hardware/drivers, but you also need to consider long-term potential: the 5700xt was barely on par with the 2060/2060S when it came out, and now it's trading blows with the 2070S…
Depends on if you're drinking their juice. There have been some improvements on some titles over time due to drivers, sure. Some significant; others, none/not so much. I don't recall it consistently jumping up an entire SKU, however. Fine Wine turned into Mad Dog 20/20 for some. :(
 

They surely cannot come out with a card that is on par with the 3080 but more expensive? It just makes the 3080 the obvious choice if that's the case. They have to come in under the price of all of the Nvidia cards at each level. They have to follow the same strategy as they did with Ryzen: undercut prices and sell 90-95% of the performance for 80% of the price. If they don't, then it's a green team win again.


 
True BT&D; until AMD can take back the performance crown with proper efficiency/power/noise and a better/more consistent software experience, they are going to have to play the value game.
 
From what I've watched so far, it's aiming at being faster at 1080p/1440p but slower at 4k (because of the Ampere architecture), but AMD's sharpening works so well even with the current gen that going 1440p->4k will likely make almost no difference visually for the average user. I don't think anyone is expecting it to be on par with Nvidia at RT, but the usual lower price (fingers crossed) should more than make up for it.
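The back-of-envelope behind that: 4k is 2.25x the pixels of 1440p, so rendering at 1440p and sharpening up to a 4k output skips a lot of work. Best case only, since not all frame time scales with resolution:

```python
# Pixel counts per frame; the GPU cost saving is a rough upper bound,
# since not all frame time scales with resolution.
res_1440p = 2560 * 1440   # 3,686,400 px
res_4k    = 3840 * 2160   # 8,294,400 px

ratio = res_4k / res_1440p
print(f"4k renders {ratio:.2f}x the pixels of 1440p")  # 2.25x

# If a card manages 60 fps at native 4k, rendering at 1440p and
# upscaling could, at best, approach 60 * 2.25 = 135 fps.
print(f"Best-case fps scaling: 60 -> ~{60 * ratio:.0f}")
```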
 
> From what I've watched so far, it's aiming at being faster at 1080p/1440p but slower at 4k…

I guess that gamers won't complain, as most of them stick with 1080p and a really low % go for 4k. The same as with the RX 5600/5700/XT, AMD can have pretty good sales in this price range. I guess the RTX 3060/Ti is what most want, plus something at a similar price/performance from AMD.
 
I'm already looking forward to the arguments on another forum if this rumour/leak/whatever is true. Could this be in part why AMD are (apparently) going high VRAM? It'll make direct comparisons more difficult.

It was also interesting that when Turing was launched, everyone expected Ampere to blow it out of the water (for RT), and AMD to jump in at 2nd-gen Nvidia levels. Have people finally accepted that RT is not easy and will come at a performance cost? It will still be interesting to see what happens on the AMD side, given that consoles will have some version of RT.

> I guess that gamers won't complain, as most of them stick with 1080p and a really low % go for 4k…

I have to wonder about 4k; it has been widely available on TVs for some time now, and even consoles support it to some level. It won't satisfy the "high fps" crowd maybe, but for many other types of games 4k makes enough of a visual difference over 1440p that it shouldn't be ignored.
 
> It was also interesting that when Turing was launched, everyone expected Ampere to blow it out of the water (for RT), and AMD to jump in at 2nd-gen Nvidia levels…

> I have to wonder about 4k; it has been widely available on TVs for some time now, and even consoles support it to some level…

I think AMD expected RDNA2 to be on par with Turing at ray tracing, but I also think everyone expected Nvidia to double ray-tracing performance, which they haven't done. So ray tracing may be less of an issue for people when they look at buying a new card. I know it is for me now; honestly, if Ampere was incredible at RT I wouldn't have looked at AMD.

4K definitely shouldn't be ignored, but it's certainly getting far more attention than it probably deserves, especially when it comes to PC gaming. I know laptops skew the figures, but when you look at the Steam hardware survey, over 65% of people are on 1080p, only 7% are on 1440p, and a very small 2.2% are on 4K. That will change in the coming years, but we are probably looking at 5 years at least for 1440p to reach the same levels as 1080p, let alone 4K. I have a good 1440p screen and I can't see myself upgrading it for 4-5 years, as I don't need anything bigger than 27" and a 4K 27" isn't worth it IMO.

Which also brings me onto DLSS. Amazing technology, but is it needed? With the majority of people on 1080p or 1440p you could argue it isn’t needed at all. My 5700XT pushes on average 120fps @1440p at the moment. When I do finally upgrade to 4K the cards that are out will do it natively at 120fps at least.

It just seems Nvidia have spent a lot of time and money on features that may or may not be needed, instead of spending all that energy on improving basic performance. That could leave the door ever so slightly open for AMD to sneak in and offer better 1440p performance and possibly match 4K performance. Only two weeks till we find out if this is going to be another Vega/Radeon VII.


 
> Which also brings me onto DLSS. Amazing technology, but is it needed?…

> It just seems Nvidia have spent a lot of time and money on features that may or may not be needed…

I agree with most of the post, but this part I think is more open to debate. I'd argue that AMD have not tended to push feature sets. They are quite happy to let other companies lead the way before adopting things themselves in some form. I don't know if this is by choice, or by necessity because they don't have the resources to do so effectively. Features don't happen with just hardware or just software support; both are needed, and someone has to break that cycle. That someone is not AMD. They let Nvidia drive DXR into the mainstream and will benefit from that work. DLSS is a bit more arguable, I guess, but it offers gains across the board. If it means I can run 4k output at 1440p hardware-requirement levels? Wouldn't say no to that!

I wonder if low 4k adoption is in part due to hardware limits again. It's not so much the display, and it's not the CPU; the GPU hardware to drive it well hasn't quite been there. As higher-performance affordable GPUs like the 3070 come out, 4k will become viable for more people.

BTW, my main gaming system has a 1440p 144 Hz display. I'm using that more often than not, but I also game on a 4k 120 Hz TV, which I need a 30-series card to drive above 60 Hz at 4k. For some games where 60 fps is fine, it's no problem, or I can drop down to 1440p and put up with the increased blurriness to get higher framerates.
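The limit is the link, not the TV: HDMI 2.0 tops out around 18 Gbps, which isn't enough for 4k at 120 Hz without compression or chroma subsampling, while the 30 series brings HDMI 2.1 (~48 Gbps). A rough check, ignoring blanking overhead:

```python
# Uncompressed video bandwidth ~ width * height * refresh * bits-per-pixel.
# Blanking intervals are ignored, so real link requirements are higher.
def gbps(width, height, hz, bpp=24):  # bpp=24 for 8-bit RGB
    return width * height * hz * bpp / 1e9

print(f"4k @ 60 Hz:  ~{gbps(3840, 2160, 60):.1f} Gbps")   # ~11.9 Gbps
print(f"4k @ 120 Hz: ~{gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9 Gbps

# ~24 Gbps of pixel data already exceeds HDMI 2.0's ~18 Gbps limit,
# but fits within HDMI 2.1's ~48 Gbps, hence needing a 30-series card.
```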
 
Skip to 16m50s


I saw this earlier. Fairly interesting, and I think it is a legitimate leak. Can't remember where I saw it, but there was also a leaked frequency table showing all this information. AMD then pulled the AGESA code from the website that leaked it and it disappeared, so it looks like they didn't want the information out there.


 