
Thoughts on nVidia's future method of releasing GPU's?


Vishera

So Pascal is, by far, the best GPU line released to date. It's extremely power efficient, extremely powerful, and some might argue that the current outrageous prices are justified by the performance you get. They'd be wrong, but it's a nice sentiment.

What I'm curious about is your thoughts on the future of the numbering scheme they use. For three generations, there was x50, x50 Ti, x60, x60 Ti, x70, x80, and either x90 or x80 Ti. But with Maxwell, there was no Ti refresh for the 950 or 960. And now there's doubt as to whether there will be a 1080 Ti, given the small gap between the 1080 and Titan XP, or whether there will be a 1050 at all. What do you guys think they're going to do? Do you think that, with the increased efficiency and performance, they're going to ditch the Ti versions of their cards, and/or ditch the 50 cards altogether? I personally don't see why they wouldn't release a 1050; if they don't, they risk losing people to AMD and the RX 460. A 1080 Ti would be waaaaay unnecessary, and even if the major show-offs still buy the Titan just because it's the Titan, I think it's kind of silly to COUNT on that happening. Because if it doesn't... then nVidia might find themselves having to cancel the Titan so as not to waste money.
 
I don't see why Nvidia needs to release a GTX 1050 when future games are going to be more demanding. GTX 1050 leak: http://wccftech.com/nvidia-geforce-gtx-1050-ti-specs-performance-leak/

True, very true. But there's a form of gaming that doesn't require a POWERHOUSE GPU, yet still requires a dedicated one. If you play console emulators, specifically emulators of consoles that used optical media, chances are the emulator needs a bit of "oomph" that an iGPU just can't offer. Other games, like League, DOTA, CS:GO, and Minecraft, also benefit from a dedicated GPU, and if you mod, or your CPU (and therefore its iGPU) is old enough, you might NEED one to get a playable frame rate. But you don't need a 1080 for those games, and really, for the fraction of it that would be utilized in that scenario, even the 3GB 1060 is a bit pricey. I was one such person, still technically am, and that's why I never bothered upgrading from the 750 I bought last year: a 50-SKU card is more than enough, and although a 3GB 1060 would offer the option of expanding your horizons down the road, if you never do, you've wasted money. A $150 GTX 1050 with 1, 1.5, maybe even 2GB of VRAM is perfect for that scenario, and a great potential competitor for the RX 460. Right now AMD has the crown for GPUs aimed at these low-powered scenarios simply because nVidia is seemingly uninterested in a 1050.

The only reason I can see for them being uninterested in doing this is that even a 1050 would be an impressively powerful card compared to the 750 and 950 before it, and they don't want to put that sort of power under $200. Seems unlikely, though. Releasing a 1050 just so people on older GPUs can upgrade and stay in support isn't necessarily a bad idea either.
 
I think they will release lower GPUs, in the same way there have always been lower end GPUs. They can't use previous generation cards to fill the low end forever, and need a new low end. Right now the sub-75W category (no PCIe power connector needed) really needs a refresh.

As for a 1080Ti, I personally want it to happen. There's certainly enough of a gap between the 1080 and Titan XP for it, and if it offers near-XP performance for not much more than 1080 money, it will be my next major upgrade. They might also finally see some viable competition from AMD in the mid to high end by the time it's out, so they can't rest.
 

I was thinking AMD was sticking to the lower end of the range with Polaris and that Vega wouldn't be out for a while. I might have been misinformed on that one. I think as we go on we'll see more and more efficiency and performance gains, until 1440p is the new standard that replaces 1080p. At that point, the low end cards will be so powerful that only the Titans of today will be their closest equals, and the rest of today's cards, even the 80-SKU ones, will be smoked in comparison. Eventually, 1440p will become what 1080p is today, 4K will become what 1440p is today, and something else will replace 4K as the ultra badass resolution, like 5K or maybe even 8K. By the way, how crazy is it to think that just last generation, 1080p was still a stretch on low end cards as far as serious gaming goes, and now the 1060 can max everything out?
 
AMD for now are sticking to the low-to-mid range, but Vega might be fighting with a hypothetical 1080Ti by the time it's out. Things will continue their move upwards, but outside of hardcore gamers, you don't need an upper-end card to play even modern games. No, you won't have Ultra everything, but you can still achieve good enough framerates at good enough quality as long as the tech isn't too old.

Take my current gaming laptop... it has a 970M, which is roughly comparable to a desktop 960. Do I expect 980Ti or 1070 performance like I do from my other desktops? No, but can it give me playable 1080p? Yes, and it likely will for some years yet before the pain gets too bad and I feel a need to upgrade. My previous gaming laptop lasted about 5 years (AMD HD-era GPU) before it could no longer run the latest games adequately. It still works perfectly fine for older games and less demanding modern titles. I've given it to a friend, and I don't think it runs anything more stressful than LoL or WoW.
 
I'm really incredibly indifferent with their naming conventions. I stopped trying to predict and care a couple gens ago. :)



As far as the product stack, they will, as always, fill it out from bottom to top. In Kenrou's 1050 thread, we talked about the reasons why that is a viable product.

I guess at this point, well... for several years now, I've been a PC elitist. Meaning, I have a PC because I expect it to look a lot better than the latest consoles. I don't find turning many settings down to get 'playable' framerates acceptable. I didn't invest in a solid [insert res here] IPS monitor to turn settings down. I like to run default ultra settings or bust. Typically, even before reviewing, I would rock upper midrange or above.
 
They're not going bottom to top. It's all over the shop. It was more top down to start with, with 1080, 1070, 1060 6GB. Then bang, Titan XP, and down again to 1060 3GB. Rumours are 1050 isn't far off, with 1080Ti possibly after that.

On desktop, I guess I'm kind of a value elitist. That is, I want high end, but not stupid-price high end. You can argue exactly where to draw the line, but for example I'm not going to get the current Titan X, in the same way I didn't get the old Titan X. The 980Ti had close enough to the same performance, but was far cheaper. That's my position, and I'm hopeful for a repeat with a 1080Ti. Then again, I'm not finding the 980Ti lacking at the moment... running a single 3440x1440 ultra-wide.
 
In the reality of things, I think the Titan is entirely unnecessary. Why they started doing it is beyond me; the Ti SKU cards should have all of the CUDA cores active, keep the high clock rate, and just cost a bit more. The Titan line is the most unjustifiable, idiotic, and wasteful decision they've ever made, imo.
 
The Titans aren't unnecessary, they're simply not targeted at gamers.
 

The reason is that when they bake a GPU wafer, the Pascal wafers come out with different yields of dies, meaning there are defects. The dies (chips) are sorted as best they can: some are deliberately cut down, and some are discarded because they can't be used at all. When they make wafers, they separate the premium dies from the ones with problems in certain areas; some dies come out less than premium and some come out top tier, so they run from top to bottom in usability. That's the way it has always been done in the silicon industry.

Of course, just like in everyday life, there are far fewer dies fully functional enough for a Titan X Pascal, compared to plenty of dies for a GTX 1060.
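
To put rough numbers on that idea, here's a minimal Python sketch using the classic Poisson yield model (chance of a defect-free die ≈ e^(−area × defect density)). The die areas are approximate public figures, and the defect density is purely an assumed value picked for illustration, not anything NVIDIA or TSMC has published:

import math

# Assumed defect density, chosen only for illustration: 0.1 defects per cm^2,
# i.e. 0.001 per mm^2. Real process numbers are not public.
DEFECT_DENSITY = 0.001  # defects per mm^2

# Approximate die areas for the Pascal family (mm^2).
dies = {
    "GP102 (Titan X Pascal)": 471,
    "GP104 (GTX 1080 / 1070)": 314,
    "GP106 (GTX 1060 6GB / 3GB)": 200,
}

for name, area in dies.items():
    # Poisson model: probability that a given die has zero defects.
    perfect = math.exp(-area * DEFECT_DENSITY)
    print(f"{name}: ~{perfect:.0%} of dies come out fully functional")

# Dies that lose the defect lottery aren't all scrapped: if the flaw only
# kills one block of shaders or a memory controller, the die gets fused down
# and sold as a cheaper SKU instead of the full-fat part.

Run it and the big GP102 comes out noticeably worse than the little GP106, which is exactly why the fully enabled big chips end up in the scarce, expensive cards.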
 

You know, I never knew that. So the only thing differentiating a die used for a 1060 and a die used for a 1080 or Titan XP is the possibility of some defects on the "1060 die" that make it fall below the standards it needs to meet to qualify as a die for the 1080 or Titan?
 
Sort of... There are different tiers of silicon. Meaning, off the same wafer, you can't have both a Titan and a 1050. You can have a 1080Ti and a Titan off GP102 (the name of the core). You can have a 1080 and 1070 from GP104, and the 1060 6GB and 3GB are based off GP106. But you cannot make a 1050/1060/1070/1080 from GP102/Titan silicon.

Yield rates will be different for each tier of GPU/silicon.

https://en.m.wikipedia.org/wiki/GeForce_10_series
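
As a quick summary of those relationships, here's a small sketch of the die-to-SKU mapping as described above. The lists are illustrative, based on this thread and the linked Wikipedia page, not an official table:

# Which consumer SKUs can be built from which Pascal die, per the post above.
DIE_TO_SKUS = {
    "GP102": ["Titan X (Pascal)", "GTX 1080 Ti (rumoured)"],
    "GP104": ["GTX 1080", "GTX 1070"],
    "GP106": ["GTX 1060 6GB", "GTX 1060 3GB"],
}

def possible_skus(die):
    """A binned die can only become one of the SKUs built on that same die."""
    return DIE_TO_SKUS.get(die, [])

# A flawed GP102 can be cut down within its own family, but it can never be
# sold as a 1080 or 1070 -- those come from the physically smaller GP104.
print(possible_skus("GP102"))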
 

An example that answers your question: the GTX 980Ti uses the same 601 mm² die (code name GM200) as the Titan X. Link: https://en.m.wikipedia.org/wiki/GeForce_900_series Also, as has already been said, there are tiers.
 