
nVidia Ampere (3000-series) GPU Rumors and Discussion

380W on a 2080 Ti standard PCB and a reflashed BIOS. :p

That's about the most you can 'safely' pull on dual 8-pins. And man, it's hard to get rid of that heat if you push it.

With an A card you can flash a 600W BIOS on it, and that just has the standard 8+8 pin connectors. People use it and draw 600W over those connectors and it still does fine. Those 75/150W ratings for the connectors are totally false and complete BS. A real 8-pin connector with 16 AWG wire can handle 360W per connector, and a 6-pin can handle 240W per connector.

https://xdevs.com/guide/2080ti_kpe/#cbios
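For what it's worth, those 360W/240W figures fall out of a simple pins-times-amps estimate if you assume roughly 10 A per 12 V contact. That per-pin current is my assumption for a good Mini-Fit style crimp on 16 AWG wire, not anything from the spec, so treat this as a rough sketch:

```python
# Back-of-the-envelope PCIe connector capacity.
# AMPS_PER_PIN is an assumed figure for a good Mini-Fit style contact
# on 16 AWG wire; real safe limits depend on contact quality and temps.
AMPS_PER_PIN = 10.0
VOLTS = 12.0

def connector_watts(hot_pins: int) -> float:
    """Raw power through a connector with the given number of 12 V pins."""
    return hot_pins * AMPS_PER_PIN * VOLTS

print(f"6-pin (2 hot pins): {connector_watts(2):.0f} W")      # 240 W
print(f"8-pin (3 hot pins): {connector_watts(3):.0f} W")      # 360 W
print(f"8+8-pin total:      {2 * connector_watts(3):.0f} W")  # 720 W
```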
 
No, they don't support 'pretty much every FreeSync monitor'. Actually, only maybe 5-8% of FreeSync monitors truly work with G-Sync. To phrase it another, more accurate way: very few FreeSync monitors will support G-Sync correctly. Just because you can get G-Sync to say 'active' in the Nvidia control panel doesn't mean it's working right. In some cases it says it's active but in fact does absolutely nothing at all. In other cases it says active, but it causes the monitor to drop frames, so really it's just modifying the frame rate of the monitor through frame skipping, which is NOT G-Sync or anything even close to it.

There are multiple parts to the question here. If the question is whether FreeSync is as good as full G-Sync, the answer is/was generally no. G-Sync has a much higher set of image quality requirements. Thus, the officially "supported" FreeSync monitors are the ones Nvidia has certified as G-Sync Compatible. The non-certified displays by and large do work, within the limits of the monitor's implementation. Open question: does anyone know of monitors that work with an AMD card but not similarly with an Nvidia card?


Those 75/150W ratings for the connectors are totally false and complete BS. A real 8-pin connector with 16 AWG wire can handle 360W per connector...

There's a world of difference between what something is rated at and what it can do in practice. That doesn't make the rating BS. The rating takes multiple factors into consideration, including reliability, ageing, and a good safety margin. That you can get away with running more doesn't mean it's a good idea to do so.
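One way to read the official ratings is as the raw copper capacity divided by a derating factor. A tiny sketch using the numbers from this exchange (the "raw" figure is the earlier back-of-envelope estimate, not a measured value):

```python
# Implied safety margin if the spec rating is a derated raw capacity.
raw_8pin_w = 360   # back-of-envelope estimate from earlier in the thread
official_w = 150   # official PCIe 8-pin rating
print(f"Implied derating factor: {raw_8pin_w / official_w:.1f}x")  # 2.4x
```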
 
I'm also interested to see what kind of cards this new series will bring. I think NVIDIA got away with murder with Turing. It was a very lackluster series: a price increase on every model just because of a self-hyped feature that is supported in maybe 10 games in total. The 2080 Ti is 35% stronger than the 1080 Ti, yes, but it comes at a higher price point, don't forget that. Yet the GPU situation is so absurd that AMD only managed to put out a card that could compete with NVIDIA's previous flagship (5700 XT vs 1080 Ti), and failed to even do that because of driver issues or bad batches or whatever they call non-functioning cards nowadays. I have two PCs, one with a 1080 Ti, rock stable. The other, with a 5700 XT, green-screens once a week.
 
It has triple 8-pins.

Which is what Nvidia should do if they truly think they will need it. Creating a new connector is stupid and just means more god damn connectors in an industry that already has WAY too many different connector standards. There is absolutely no reason whatsoever to create a new connector. Just add a third 6-pin plug if they truly think they need it. If they do create a new connector, I hope PSU manufacturers give them the middle finger by just not making any PSUs with it. That would be fantastic, because then no one would buy the thing.
 
Creating a new connector is stupid and just means more god damn connectors in an industry that already has WAY too many different connector standards...

Agreed. Why change something that already works?

And that much power is only needed for top-of-the-line, power-hungry GPUs, a niche...
 
Maybe Nvidia is getting ready to start selling PSUs.
 
With COVID-19 supply chain shortages worldwide, any launch will closely resemble a paper launch, and finding stock anywhere will be difficult.

Given they never gave a date anyway, if availability is going to be too big a pain point they could push it back. That's not to say there won't be some shortage when it comes out. Having enough day-one stock is hard to predict, and shortages when something is new are not unusual.
 
I've seen that as well... but the numbers don't seem to add up to me. At a high level: we have a 250W 2080 Ti that is (in FE form) ~45% faster than a 5700 XT (225W reference). AMD is bringing an updated/new arch to the table on the same node, but perhaps a tweaked process. To reach NV's flagship level from 2018, they are already 45% in the hole. If Ampere is 30-35% faster than the 2080 Ti (similar to the 1080 Ti vs 2080 Ti jump), those gains compound, so the new card needs to be roughly 90% faster than the 5700 XT just to match Ampere (1.45 × 1.30 ≈ 1.89). I don't recall seeing such a significant jump from flagship to flagship in one generation... and I think a 30% increase is tiny for what they are saying power use is going to be. Where does big Navi perform with this in mind? To reach Ampere, is it going to be 300W? Does it need to run closer to the limit like the 5700 XT does?

That said, on the NV side... we're already at 250W with the 2080 Ti. Ampere is a node shrink AND a brand-new architecture. With that come performance/W improvements out of the box. So if Ampere is to use 20% more power even after a node shrink and have IPC/architectural improvements... where will its performance land? Do we expect 30%? 50%? More?

That's how I'm thinking of things. :)
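Making the compounding explicit (the ratios below are the assumed/rumored figures from this post, not confirmed numbers):

```python
# Generational gains compound multiplicatively, not additively.
r_2080ti_over_5700xt = 1.45  # 2080 Ti FE ~45% faster than 5700 XT
r_ampere_over_2080ti = 1.30  # rumored ~30% Ampere uplift over 2080 Ti

needed = r_2080ti_over_5700xt * r_ampere_over_2080ti
print(f"Big Navi would need to be ~{(needed - 1) * 100:.1f}% faster "
      f"than the 5700 XT to match Ampere")  # 88.5%, not 45 + 30 = 75
```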
The GTX 1070 to RTX 2070 was only a 26% average increase in performance at 1920x1080, for an overpriced $500 video card :mad: just for games. LINK: https://www.techpowerup.com/review/evga-geforce-rtx-2070-black/33.html
The GTX 1080 Ti to RTX 2080 Ti was only a 26% average increase in performance at 2560x1440, for a large fee of $1,200. LINK: https://www.techpowerup.com/review/asus-geforce-rtx-2080-ti-strix-oc/33.html

This is an interesting specification regarding the PCI-E power cable.

"The PCI Express 2.0 specification released in January 2007 added an 8 pin PCI Express power cable. It's just an 8 pin version of the 6 Pin PCI Express power cable. Both are primarily used to provide supplemental power to video cards. The older 6 pin version officially provides a maximum of 75 watts (although unofficially it can usually provide much more) whereas the new 8 pin version provides a maximum of 150 watts. It is very easy to confuse the 8 pin version with the very similar-looking EPS 8 pin 12 volt cable." link: http://www.playtool.com/pages/psuconnectors/connectors.html#pciexpress8

So with both 8-pin connectors at 150 watts x 2 = 300 watts, plus 75 watts from the PCI-E slot, that's a total of 375 watts available to the video card.
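As a quick sanity check of that addition, using only the spec numbers quoted above:

```python
# Official PCIe power budget for a dual 8-pin card.
SLOT_W = 75        # PCIe x16 slot, per spec
EIGHT_PIN_W = 150  # per 8-pin cable, per spec

total = SLOT_W + 2 * EIGHT_PIN_W
print(f"Slot + 2x 8-pin = {total} W")  # 375 W
```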
 
Yep... that's the spec. There's also the rumor about the new 12-pin for Ampere... but not sure if it's real, if it's only on a datacenter part, or what. So many rumors... lol.
 
I've seen that as well... but the numbers don't seem to add up to me... Where does big Navi perform with this in mind? To reach Ampere, is it going to be 300W?
It's unfortunate, but I agree with you. As in the previous four years, AMD will have a hard time competing against NVIDIA in the high-end market. What they can do is win the mid-range market. The 2060S and 2070S are, in my opinion, horrible-value cards for the mid-range. The 3060 and 3070 will probably be bad mid-range cards too. However, even if some miracle happens and AMD releases a 2080 Ti-like card at a 3060 price point, I wouldn't buy it within the first few months of release. The 5700 XT is pretty much a 1080 Ti at 2060S price, and it's still a card I would never recommend to anyone. Driver issues? Bad units? I don't care what they want to call it, but as someone who experienced it firsthand, I was very disappointed.
 
Oh, I think they'll be a lot closer than they are now... which is great news for everyone. RDNA2 and big Navi should be their first high-end card in a few years. :)
 
Oh, I think they'll be a lot closer than they are now... which is great news for everyone. RDNA2 and big Navi should be their first high-end card in a few years. :)
Here's hoping it's not competing against 2-year-old hardware as the benchmark!

 
A new GPU appeared on UserBenchmark with a 2100 MHz core, 19 Gbps memory, and 10 GB of total memory. The 2080 Ti, for comparison, has 14 Gbps memory. Might be pretty good. The 1080 Ti, for comparison, is at about 11 Gbps, and the 2080 Ti is roughly 30% faster than it. If we follow the same trend, this new unknown GPU is gonna offer a nice leap. I wonder if it's the 3080, and if so, how much it's gonna cost.
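Per-pin data rates only turn into bandwidth once you multiply by bus width. The bus width of the new part is unknown; a 320-bit bus would fit a 10 GB card (ten 32-bit channels), but that's my assumption. A quick sketch:

```python
# Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# The 320-bit figure for the unknown GPU is an assumption based on its
# 10 GB capacity; the 352-bit widths for the two Ti cards are known.
def bandwidth_gbs(gbps: float, bus_bits: int) -> float:
    return gbps * bus_bits / 8

print(f"Unknown GPU (19 Gbps, 320-bit?): {bandwidth_gbs(19, 320):.0f} GB/s")  # 760
print(f"2080 Ti     (14 Gbps, 352-bit):  {bandwidth_gbs(14, 352):.0f} GB/s")  # 616
print(f"1080 Ti     (11 Gbps, 352-bit):  {bandwidth_gbs(11, 352):.0f} GB/s")  # 484
```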
 
Just been doing some reading on my dinner break at work and I came across this.


https://www.google.com/amp/s/www.tw...0-40-50-faster-than-2080-ti-for-1399/amp.html

Nothing in it that hasn't been rumoured before by other people. What does annoy me, though, is the way they talk about it.

They're like: $1,400 for the 3090, what a great price! The 2080 Ti was $1,200 but only had 11GB of RAM, and this new card has 24GB and is 40-50% faster.

I just wish the tech media would stop this. New tech is meant to be better than old tech. That doesn't mean it should cost more; replacing like for like should be the same price, apart from outside factors such as inflation, which hasn't been 10% over two years.

I just really hope this doesn't turn out to be true, as we could easily see $1,000 3080s and $600-700 3070s. I was targeting a 3070 as an upgrade card when it releases, but if it is above $500 I just don't think I can justify it. Last time round, the 2070 was already a rip-off and a massive jump in cost from the 1070, from roughly $400 to $500.

It's like they are just trying to push people to the consoles, if it is true of course.

We shall just have to hope AMD come out fighting. Even if they don't have the best card, they have to be aggressive on pricing. AMD cannot release a card that is only as powerful as a console for more money. Which gives me hope we shall see 2080/2080 Ti performance for $500. But I think that's a dream, and PC gamers will have to bend over and take it if they want the new stuff.


 
A 3070....2080Ti performance for half the price? Where do I sign up? :)

Being more serious, I agree with the sentiment. GPU prices are getting crazy. Turing was a huge jump in price... I'm hoping Ampere isn't going to be quite as big of a kick to the pants.
 
A 3070... 2080 Ti performance for half the price? ... GPU prices are getting crazy... I'm hoping Ampere isn't going to be quite as big of a kick to the pants.

Don't get me wrong, it is great that a 3070 may have that kind of performance, but like you say, those prices just seem to keep rising.

Not too bad for you guys across the pond; $500 isn't too bad when your wages are on average higher than in the U.K. But for whatever reason tech seems to be priced 1:1, so if it is $500 it will be £500, which is silly when the dollar-to-pound rate is 0.75 right now.
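To put a number on the 1:1 complaint (exchange rate from the post above; this ignores UK VAT, which would close part of the gap):

```python
# What a $500 card 'should' cost in GBP at the quoted exchange rate.
GBP_PER_USD = 0.75              # rate quoted in the post
usd_price = 500

fair_gbp = usd_price * GBP_PER_USD   # £375
markup = usd_price / fair_gbp - 1    # overcharge from pricing 1:1
print(f"Fair price: ~£{fair_gbp:.0f}; 1:1 pricing adds ~{markup:.0%}")  # ~33%
```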


 