NVIDIA Says No to Voltage Control

It has come to our attention that NVIDIA is making sure there will be no voltage control for its GK104-based GTX 600 series lineup. In a post on the EVGA forums, EVGA employee Jacob (EVGA_JacobF) responded to a question about why a new GTX 680 Classified shipped without an EVBot port.

Picture Pulled from the Original Post, Courtesy Sticky622 @ EVGA Forums

Members initially assumed this was just a manufacturing defect that slipped past QA. However, here's Jacob's response to the initial question.

“Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution.”

Of course, many members wondered why EVGA had decided to remove the signature feature of their flagship GTX 680, to which Jacob responded…

“Unfortunately we are not permitted to include this feature any longer.”

So, they are not permitted by NVIDIA to include the feature; NVIDIA is the only entity that could "not permit" them from doing something with their own product. Members then asked "Why?" once again, trying to coax out a less vague answer, and were supplied with…

“It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device.”

From this quote, it's obvious that NVIDIA does not want its partners to supply any means of voltage control with the GK104/Kepler series of GPUs. This is a slap in the face to the many enthusiasts and everyday overclockers who enjoy pushing hardware for that extra performance. That leaves the extreme, warranty-voiding modders who hardmod their GPUs as the only ones able to increase voltage on Kepler cards and have a stress-free overclocking experience.

EVGA's only fault throughout this process of removing EVBot ports is that there was no official announcement before cards without EVBot ports were shipped or listed on their site. Also, an EVGA forums member pointed out that the picture of the GTX 680 Classified in their product section seems to have the EVBot port blacked out using something like MS Paint. So, from the outside looking in, it appears EVGA was trying to hide the fact that the GTX 680 Classified would no longer have EVBot support, hoping no one would notice.

From EVGA's Product Page

We sent an email to EVGA asking for more details about why this happened, along with some Classified-specific questions. Jacob was able to answer the Classified-specific ones.

Will you be reducing the price of the GTX 680 Classified?

“No plans at the moment.”

What makes your Classified worth the premium being charged if it is now limited to NVIDIA’s (low) Power Target limits with no additional voltage control?

“Higher power target, better cooling, higher out of box performance, binned GPU, superior voltage/power regulation, 4GB memory.”

Doesn’t that make all the records Vince keeps setting kind of worthless for anybody but him and EVGA? Not that they aren’t astounding and take a ton of skill, but if only he has access to cards that can do it, what’s the point?

“Anybody can do the same, just need to have the expertise like he does to modify the cards manually (this will VOID warranty though)”

Now, there may be ways of getting around even this if you can do it. Will EVGA be willing to supply a diagram / explanation for making your own EVBot port or directly soldering on the EVBot lead?

“Not from EVBot, but there are other documented ways to override voltage, again this will void warranty though.”

The questions that could not be answered were "Why is NVIDIA doing this?" and "Are they [NVIDIA] experiencing an increased level of RMAs? …as in, does real voltage control kill Kepler GPUs excessively fast?" It would have been nice to know the answers to these. Only NVIDIA knows exactly why it is holding back the potential of its GPUs by limiting the cards so much.

All this information makes it seem like it's just a matter of time before NVIDIA snuffs out other manufacturers' voltage control features as well. We know MSI and Galaxy have been having trouble getting NVIDIA to budge on allowing voltage control. ASUS has their GPU Hotwire feature, which can control GPU voltage when combined with their high-end motherboards (similar to EVGA's EVBot). I haven't heard or read anything about ASUS removing Hotwire from NVIDIA cards, but it looks to be inevitable. We've sent an email to our contacts at ASUS asking about this and will update with any information we get from them.

So, the AIB partners are not to blame here; it's all NVIDIA.

- Matt T. Green (MattNo5ss)


94 Comments:

txus.palacios's Avatar
Oh, well, back to ATi then.

If they want to stop us from overclocking, I might as well stop buying their products.

Sadly, if this "no overclocking" policy is nVidia's new way of thinking, these Fermis will be the last green thing I unbox.
TimoneX's Avatar
Lame move nv.

Just reminds me of the old days with chopped surface traces to disable SLI on nv chipsets and drivers that wouldn't allow SLI if certain chipsets were detected.
EarthDog's Avatar
Well, no overclocking may be a bit harsh, LOL... But it's incredibly dumbed down and easier compared to AMD now...

Nvidia OC in a nutshell: increase the power limit slider to the highest it will go, increase fan speed to keep temps under 70C, push the core and memory clocks. The voltage is limited and goes to the max regardless. Done. They just limit the voltage that can go to those GPUs.
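For the curious, here's roughly what the "power limit slider" part of that workflow does under the hood. This is a minimal sketch only, assuming NVIDIA's NVML management library via its Python bindings (the nvidia-ml-py package), GPU index 0, and admin rights to change the limit; clock offsets and fan curves are normally applied through vendor tools like Precision or Afterburner instead. Tellingly, NVML exposes no voltage knob at all.

```python
# A sketch of the "slider" overclock described above, via NVML's Python
# bindings (pip install nvidia-ml-py). Hypothetical usage; GPU index 0
# and root/admin privileges are assumed.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Step 1: push the power limit "slider" to the highest the board allows.
# NVML reports limits in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, max_mw)

# Step 2: watch temperatures; GPU Boost backs clocks off as temps climb,
# hence the advice to keep the core under ~70C with fan speed.
temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
print(f"Power limit: {max_mw / 1000:.0f} W, core temp: {temp_c} C")

# Step 3 (core/memory clock offsets) happens in vendor tools; note there
# is no voltage-setting call in NVML -- voltage simply isn't exposed here.
pynvml.nvmlShutdown()
```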
trekky's Avatar
same for me. im the builder for all my friends and family.
my last build i did 2 560 Tis in SLI and did a nice (safe) overclock
and the only reason i was getting ATI was the bitcoins, but thats ending soonish so i guess i have another good reason to stay now........
p.s. Nvidia is crazy!
hokiealumnus's Avatar
Hah, NVIDIA is clamping down across the board. Looks like MSI got caught trying to circumvent NVIDIA's restrictions too: http://www.tomshardware.co.uk/MSI-GT...ews-40278.html . Different situation, same end result - reduced overclocking for the consumer.
EarthDog's Avatar
Oh yes, this isn't only for EVGA...

I can confirm that other partners went to Nvidia asking for MOAR voltage and, AFAIK, were turned away at the door. MSI was one of them with the 680 Lightning (which I believe needs an unreleased BIOS and version of AB in order to get past the 1.21v limit).
bmwbaxter's Avatar


That is all I have to say.
PolePosition's Avatar
I think WARRANTY is the big key determining factor here. I'm sure there have been several warranty claims on modified cards, which Nvidia honored to keep face, but their focus may be shifting toward eliminating the optional mods, where when a card fails, the user simply removes the mods and returns the card in stock condition to get the warranty claim. I can see where that might add up to a significant profit loss in the grand scheme of things, with consumers getting two or more cards for the price of one over the warranty period, but at the same time, it's driving enthusiasts to the other side with the changes and limitations. Nvidia holds its ground on being the fastest yet most expensive, while AMD provides comparable performance sold in larger quantities and is more user friendly. Nvidia is sounding more like another very big company.
Convicted1's Avatar
Buttttt... In reality NVidia isn't the fastest anymore.

The 7970 ROFLstomps it in almost all benchmarks...

Anddd... In the end... That's really who we are talking about here... Benchers.
hokiealumnus's Avatar
I think it should matter a bit. AMD sells their GPUs to AIB partners. Said partners can do whatever the heck they want to them, as they OWN the GPUs.

NVIDIA sells their GPUs to AIB partners. Then they say 'if you want to use our GPUs, you have to do so as we tell you. If you don't, we just won't sell them to you. Neener neener boo boo.'

Which strategy do you like better?
moocow's Avatar
The main thing is that nvidia doesn't want people overclocking their cheaper cards to match the performance of the top end cards. Why would someone want to purchase a 680 when they could buy a 660 or 670 and overclock it to the same levels?

As far as the 680 being limited, I'd say that it has to do with the next-gen cards coming out soon and Nvidia wanting people to buy the latest and greatest (that is limited) rather than stick with the older generation card and overclock it.
PolePosition's Avatar
I guess it depends on how much of the market share you own. When you're the big dog, you can afford to dictate policy to those reselling your products.
The question I would ask is, in the case where one of these 2nd and 3rd party vendors sells a card and it craps out, who is footing the bill in the end? Let's say you buy an MSI Lightning with an Nvidia chip and it craps out. Of course you would return the card to MSI for a warranty claim, but does MSI in turn get their money back from Nvidia if the failure is chip related?
Convicted1's Avatar
I look at this a bit differently...

Sure... You could buy a 670 and make it outperform the 680 @ stock clocks... But the 680 when OC'd should be able to reach higher clocks than the 670... Thus the 680 is still superior.

Unless I'm just completely lost in how the new 6 series cards work that is.
EarthDog's Avatar
EXACTLY. Some are looking at this from a different perspective. Not to mention, people who physically modify high end cards are what, one in a million? That is not affecting their bottom line as much as the moron who thinks the card doesn't work but forgot to plug in both power connectors. Seriously, compare ID10T errors to the people who modify cards...
Convicted1's Avatar
Yes. That's exactly how it works.

MSI passes the buck up to NVidia...
thideras's Avatar
I don't see how this is any different from how Intel or AMD do their RMAs. If you overvolted the processor, beat on it hard, and it dies, how would they know? It all comes down to being honest. What I would like to know is how many actually killed their cards through overvoltage out of the total that got returned. It can't be that high, so this comes down to profits and bean counters. Someone inside nVidia saw this as a way to recoup money being "lost" through RMAs.

From seeing their past decisions, they are either going to renege on this really fast if there is enough "kick back" or stick with it because they are hard-headed like that. Once ATI drivers for Linux become a bit more mature and easy to deal with, I will probably switch if they want to continue this route.
EarthDog's Avatar
Perhaps I do not understand how warranties work within companies... This excerpt was from the article Hokie linked a few posts up:

Why is MSI changing the voltage a problem for Nvidia? Does MSI get a kickback from NV for returning bad cards? I mean, I see this as MSI should shoulder the cost of making that change, not NV.
txus.palacios's Avatar
So, summing this thread up, nVidia has just lost 10 potential enthusiast customers (for now) due to their new policy.

Oh, just wait until this escalates...
hokiealumnus's Avatar
FWIW, I do not think MSI was "cheating". They just weren't doing what NVIDIA told them to do. Others were. I think they were doing consumers a favor, assuming it didn't cause problems, which it apparently could.
thideras's Avatar
That is a good point. I was under the impression that nVidia designed the chip and the reference board, but didn't actually build or distribute them. Maybe this is a reputation thing?
EarthDog's Avatar
Did it though? Considering 99.99% of users likely don't have a problem with it? How many benchers are there (the only group this affects)? I mean, from a stock 1006MHz to the mid-1200s on average for air overclocks, it's nothing to shake a stick at. Perhaps 1300MHz is in the cards with more voltage, but to what end? We all know the GHz Edition + new AMD drivers = the 680 and 7970 trading blows. The only reason the 7970 wins hands down is when it's overclocked past a 680.

AMD goes to 1.2v (right?) and also ends up in the mid-1200MHz range on average. With 1.3v would it see 1300+? To what end there?
PolePosition's Avatar
I would totally agree with that. I'm assuming cranking up the voltage does lead to possible frying of the chip, though, which is where MSI looks to Nvidia. It would be better if Nvidia just sold the chips outright with no warranty to 2nd parties, leaving the 2nd party to do as they please and suffer the consequences, but then I doubt companies like MSI would dare buy from them "as is", since that would leave them holding the bag should a chip fail, even when the failure has nothing to do with adding more voltage.
Bobnova's Avatar
I've seen a few builds by non-benchers that were using EVBots, for whatever that is worth.
hokiealumnus's Avatar
But aren't we the trendsetters? Granted, we're a small subset. But if you want to see what a GPU is truly capable of, what it can do with no limitations at all, where do you look? Extreme overclockers and/or web sites that show off what GPUs are capable of. Sure, most people won't take advantage as you say, but the end is showing off. Showing that you're better than the other guy dangit, and you'll do what it takes to get there.

As it stands, for the 99%, with the tools they have available, the overclocked HD 7970 is better than the overclocked GTX 680 simply as a result of these limitations. If they'd give voltage control (and the GPU can take it, which might well be the problem here; I don't know), the GTX 680 could be better.
EarthDog's Avatar
I agree with you 100% Hokie...

Signed,

Devil's Advocate.

All I can say about that is... I bet they had an X79 build with a 3820 too... GAH, the irony kills meh!
Convicted1's Avatar
Yes... But they are doing it for no purpose other than to have more toys on their table.

Anddd... If they are truly using the EVBot for what it was DESIGNED for and doing so under air or water cooling... They are idiots and deserve to fry their cards and be out $600.
wagex's Avatar
agree hokie, these few guys are the ones who do the reviews that everyone reads and gets their facts from, right?
PolePosition's Avatar
Not me, I'm all STOCK, just like PRO STOCK on the race car scene.

I guess that makes me the ideal Nvidia customer, and target audience.
I see comparisons between the 7970 and 680, but what about the 690?
Let's do a comparison of Nvidia's and AMD's flagship cards and see who comes out on top. I'm talking best of the best here, with comparable specs such as the same VRAM, etc.

I'd be curious to know if that 680 you're running has the EVBot connected.

EDIT: I'd also be curious to know what percentage of Nvidia sales is related to OCF members. 10%? 20%? 50%?
EarthDog's Avatar
Pro Stock cars are HIGHLY modded in the 'race car scene', FYI.

The 690 has nothing to compare to at this second... it's a dual-GPU card vs the single-GPU 7970 (I suppose two 7970s vs one 690 is valid). The 7990 is supposed to come out soon? Perhaps then it would be a more apt comparison.

My 680 CAN'T have an EVBot because it's not an EVGA brand 680.

I'm just not sure what that has to do with this thread though... LOL! I think we are a bit off the path here.

I'd imagine WELL less than .001%... not that many members here compared to other sites, and between ALL sites, OEMs rule. That number is WAY WAY WAY smaller than you think.
Bobnova's Avatar
What percentage of Nvidia sales are OCF, or what percentage of OCF people buy Nvidia?

I run an nvidia daily, a 660ti I was sent for review.
It replaced (for 24/7 stuff) a GTX580 I bought (used) for benching purposes, as it's the top card for 3d01.

EDIT:
I wouldn't say Pro Mod is modified in the slightest. Every piece in that thing is specifically built for racing, there's no "stock" there to start with :P
wagex's Avatar
so the 7990 vs the 690? they're actually pretty much neck and neck, but the 7990 seems to pull ahead for the most part @ 5760x1080
according to a couple benches out there, but idk
http://www.hardwareheaven.com/review...rclocking.html
hokiealumnus's Avatar
ASUS cannot issue an official comment on whether GPU Hotwire will remain on their cards right now. What they will say is that "ASUS currently is maintain[ing] channel availability of all of its current GeForce GTX series of cards."

Considering GPU Hotwire required soldering anyway and they're just blank solder points on the PCB (albeit very handily labeled), I think that might mean they plan on keeping it. Is that how you guys read that?
Bobnova's Avatar
I read Nvidia as saying "Any voltage over X must require soldering".
PolePosition's Avatar
My point is you don't buy a Pro Stock car at a dealership any more than you buy a custom build at a Best Buy. Of course they're modded compared to dealership vehicles, duh.
No turbocharger (overclocking), no nitrous oxide (LN2), yet all engine. Just a metaphor.
A top fuel dragster would be more like your typical overclocked, maxed-out water loop rig.
Yep, you baited me, hook, line, and sinker.
Has an Nvidia card ever been reviewed here? Like the latest 660 Ti? Or better yet, the 680?
wagex's Avatar
.....yes? look on the front page? lol, there's three of them: the Superclock, the DirectCU, and the Power Edition
http://www.overclockers.com/asus-gtx...cs-card-review
a top fuel dragster would be more like LN2, not water.
Bobnova's Avatar
Top fuel = CPUz benching on LN2. Maaaaybe SPI1M.

Lots of Nvidia reviews here. Plenty of AMD reviews too. OCF does pretty much everything.
Robert17's Avatar
There have been something like 365 million PCs sold worldwide so far this year (Gartner Research), of which most (this is a complete and total guess) are probably business desktops with on-board graphics chips. Another unknown but probably large number are Lenovo, Dell, HP, and who knows who else's name brands with on-board graphics sold at Wally World or elsewhere. NV and ATI provide most of those. Willing to bet that less than .002% of the folks that buy these computers even know they have a graphics chip, much less that it takes some amount of electrical voltage to make it work.

My point is this: I don't think discrete graphics cards make up the kind of numbers, such as 365 million, that should get NV or ATI excited in the big scheme of things. The point about bragging rights is valid. I'd think any card manufacturer would like to see enthusiasts smoke 'em, bend 'em, and break 'em while doing a double back flip, and include it in pictures on the box. So NV limiting an enthusiast's fun and potential seems like a marketing mistake to me.

Certainly there are limits, which a puff of smoke will sort out. Kind of like any hobby that pushes the limits: stuff breaks. What do they care? They just sell another one. They actually don't have to honor any warranty related to abuse. Ask your insurance company about it the next time you try to jump the Grand Canyon in a stock mini-van; both they and the manufacturer will tell your widow it's not their problem.
bmwbaxter's Avatar
Lots of Nvidia cards have been reviewed here, although I believe they have all come from AIB partners, not from the mouth of the beast itself.
PolePosition's Avatar
Very simple question, as stated.

How can one hate Nvidia so much when they've given you a $600 graphics card? I can certainly understand why you'd want to switch in the future though, as that is the direction I'm leaning towards as well come Christmas, when maybe I can get a 7970 on sale or some special.
wagex's Avatar
they are given for non-biased reviews. just cuz they give someone something doesn't mean they have to like them. i don't see where you are going with this.
Bobnova's Avatar
Quite easily, really.

Even when you get to keep it you have to remain impartial. If the fact that they send you something sways your viewpoint you are a crap reviewer. It's as simple as that.
I really do not like Nvidia nor Nvidia's practices. The end for me was the GTX 480's "launch", where the Head Dude held up a "Fermi" that was a chop-sawed PCB (right through some stickers) attached to a heatsink with wood screws. They haven't improved since then.
They make some fantastic cards; my personal feelings about the company don't change the fact that the 660 Ti is a hell of a card.
If there's even the slightest hint that maybe we're biased, we get jumped on by everybody for being a terrible reviewer.

To answer the previous question, as stated, maybe 0.00001%.
PolePosition's Avatar
Not what I asked, and the question was directed at Bobnova exclusively.
If you don't see, don't respond.
Bobnova's Avatar
As a sidenote, the GTX660Ti that Gigabyte sent me retails for $300.
Nvidia hasn't sent me anything and is not likely to, they'd far prefer that it go to someone that will give it a biased review due to it being free :P
wagex's Avatar
if i don't see what? that is exactly what you asked: how can you hate someone even after they give you free stuff, which would then make them biased. how about you chill for a little while.
PolePosition's Avatar
It all boils down to money and reputation IMO. Nvidia has a great reputation for being one of the best in tech support and customer warranty claims, even when it might be the customer who abused the graphics card. Somewhere something's gotta give, and rather than Nvidia stop honoring warranty claims, which would certainly affect their reputation, they shift the focus to product alteration, prohibiting and limiting modification.

I'm sure Nvidia also has a research and development department where they put cards through all sorts of stress tests and abuse to see what the limits are, and perhaps they'd rather not use that as a marketing ploy and encourage others to follow suit. They don't need consumers doing that for obvious reasons, those primarily being RMAs and profit loss. With electronics, it can often be hard to tell abuse from plain ol' failure. When someone drives off into the Grand Canyon, that's pretty obvious and intentional.
EarthDog's Avatar
I see your point... but Nvidia doesn't sell cards to consumers anymore (do they? Looking at Newegg I don't see Nvidia brand cards... if I missed it let me know, I only looked at 680s). All this stuff you are talking about is from their AIB partners (Giga, EVGA, Galaxy, MSI, etc)... That's what I asked originally: HOW does the warranty work on the AIB-to-Nvidia side? Do the AIBs get a kickback for any failed units? I would imagine not... but..........

AFAIK - You buy an XXXXXX brand card, you return the card to XXXXXX and they take the hit. Can someone in the know shed light on this?
Convicted1's Avatar
Here's the way it works...

NVidia sells GPU Core to "X GPU CARD Manufacturer"...

X GPU CARD Manufacturer Puts said GPU Core on GPU card and sells to consumer.

GPU CARD Craps out...

X GPU Card Manufacturer looks card over upon return... If GPU CORE is found faulty... A note is sent to NVidia... You owe us one GPU Core.

Next shipment of stuff from NVidia comes with a credit for that faulty core on the bill.

ETA: This is the short version... AND... Note the difference between a GPU CORE... And a GPU CARD.
PolePosition's Avatar
If Nvidia were not taking a hit, it would seem logical not to bother with removing the EVBot and prohibiting voltage changes, discontinuing support for such practices, especially since they don't sell graphics cards to consumers! They'd have nothing to lose!

Nvidia, however, should take a page out of the Intel business model. That is, provide both overclockable chips (K models) and fixed models. That way, they satisfy all parties and don't find themselves losing market share down the road.
EarthDog's Avatar
Yeah, not sure why I thought that looking back. Thanks Convicted for the clarity.

Regardless, it makes sense to me to let the AIBs float their own warranty on the cards if they want to do that, and absolve Nvidia of all liability.
madman7's Avatar
I wonder what MSI will do with the one I have. Mine works now, but what happens if it fails? I wonder if MSI will take the affected cards back and replace them with updated ones.
PolePosition's Avatar
That will be at MSI's discretion. If they can repair it, great. If not, then a replacement is probably what you can expect unless they still have some originals in stock. As time passes, the latter would be expected.

So much for a next generation MSI Lightning w/Nvidia. That will probably only be available in AMD form.
bmwbaxter's Avatar
They will be replaced with whatever they have available.
M33Cat's Avatar
this is madness

I overvolted my 1st 6950 in 2 games early on, but then stopped since I gained very little; then once I CFed I didn't need to OC
Pavin's Avatar
ATI all the way, guys. Looking to build a new rig in a year or so, and I'm pretty sure that I'll go with the next gen ATIs.
BTW, I have only used Nvidia my whole life.

If they wanna mess with our lives, we might as well mess with their goddamn lives. Just don't buy Nvidia anymore.

Here's hoping the next gen AMD Radeons just leave the Nvidia cards in the dust.
Brando's Avatar
I can kind of sympathize with them on this. I work in a car parts place, and it's annoying when people keep returning defects that have obviously been abused, like guys that try to use stock shocks and struts on lowered/lifted vehicles, or taxi companies that warranty radiators once every couple months because they fail from being run 18 hours a day instead of maybe an hour a day like a normal person going to and from work with a stop at the store here and there.
EarthDog's Avatar
Nvidia has commented on the subject...

Source: Brightsideofnews

So it looks like it's the companies that are choosing to be in warranty with Nvidia and not sticking their necks out and supporting the returns themselves. Understandable from a business perspective. Well, I mean, take that side and the AIBs' side, and I'm guessing the truth lies in the middle?
PolePosition's Avatar
The proof is in the pudding. Nvidia certainly does influence the AIBs and MSI by the simple tactic of money, in terms of warranty. No one wants to bite the bullet. I say one only has to boil it down to the source to see who is dictating policy. Really though, I can't imagine OVing accounts for a huge amount of RMAs in the grand scheme of things, as I would tend to believe only a small portion of owners ever bother to OV their cards.

Let's just hope this doesn't become a trend among all the super chip makers

What impact would that have on enthusiasts?
hokiealumnus's Avatar
Thanks for posting that ED. My $.02 just posted over @ XS:

EarthDog's Avatar
+1... edited my post before yours was up, Hokie.

Well, the same effect this has on extreme clockers already... they went to AMD to get the 'globals' off the 7970, since it's a lot easier to push that card with software voltage than to hard mod a 680 for most extreme overclockers.

Quite frankly, I wish the AIBs would 'man up' and offer these things on their top of the line cards. Perhaps pass the additional cost on to the consumer? That doesn't make great business sense, as it segregates the tiny TINY extreme cooling market further, and most of us can't afford a premium on top of the top end cards in the first place, so... not sure. I wish for that, but at the same time the wallet I'm sitting on is screaming Noooooooooooooooooooooooooooooo!
hokiealumnus's Avatar
Heh, with the Classified retailing for $630, I think it's safe to say they already passed the cost on to the consumer!
DarthGrantius's Avatar
Why don't they release a 680k edition that's unlocked? And charge an extra 20 bucks?
EarthDog's Avatar
Did they though? Prices didn't drop without that feature... It's $630 WITH a warranty (meaning Nvidia to EVGA). With the onus of warranty placed on EVGA, I would imagine that number would skyrocket. My guess is $100 or so to have it fully unlocked with AIBs supporting the warranty. But again, a complete guess based on nothing.

$20 won't cover it (not knowing a darn thing about failure rates of the cards locked, then with extra voltage control).
Bobnova's Avatar
I'd say minimal difference for most enthusiasts.

Excluding the EVBot, all the software voltage controllable cards have had limits on the core voltage, some in software, some in hardware. Even including the EVBot, as you needed a special EVBot firmware to take the GTX 580 Classy over 1.3v.
Manufacturers with access to the Volterra / CHiL / Digi+ / IR datasheets could make special NDA software that could go higher, but it was just that: special NDA software.
The Fermis had a BIOS lock on normal voltage control; you could edit the BIOS to get up to 1.21v, but if you wanted higher you had to hardmod the card or use very special software.

I'd say benchers use the 7970 because the 680 is junk for HWBot benchmarking in all benchmarks but 3d11, and not that much better for 3d11 either.

Recently the AIB companies have been more open with their software, allowing higher limits before the NDA bits are required. This probably concerned Nvidia, so it was time for some incentive towards new limits.
It's also possible that over the long term Keplers are fragile to voltage where Fermi and older were not. This wouldn't surprise me at all, really, and would be an excellent reason for Nvidia to limit voltage controls.
A card that dies slowly due to voltage will kill the core, the part that Nvidia has to warranty. A card that dies quickly generally kills the RAM or detonates a MOSFET; those are AIB problems (except reference cards, I don't know whose problem it is in that case. Foxconn's maybe) rather than Nvidia problems.

I think it's probably more along the lines of "If you allow more voltage on one (1) card, you get no warranties on ANY card".
That fits with how the companies are acting and also fits with what the PR flunky said.

In any case this should be a boost for AMD, which AMD will hopefully plow right back into R&D.
wagex's Avatar
that's what the Classified is to begin with, that's why it already costs like 100+ bucks more than your average 680
sysane's Avatar
yea, but that only means EVGA will not sell many of the Classified cards, as who in the world would pay more for a card that's no better than the stock 680?
txus.palacios's Avatar
You're a little bit late to the party, ED already informed us about this. Thanks anyway.
Robert17's Avatar
This thread got some play at The Tech Report, with attribution:


http://techreport.com/news/23678/nvi...voltage-limits
hokiealumnus's Avatar
NVIDIA response per Jacob @ the EVGA forums

The emphasis is mine. That tells me any voltage above what they set causes serious damage. If they're as high as they can go without physically damaging the GPU relatively quickly, I think I'm OK with this answer.

It certainly doesn't seem like a CPU, where Intel will even warranty the thing for $25. If it were as simple as that, I think they might actually allow overclocking. It seems they already needed to push the GPUs as far as they could go, likely so they would beat AMD. At stock, of course.

Because AMD can apply more voltage, they can overclock farther and end up coming out ahead. It would seem there isn't enough headroom left in Kepler GPUs without causing damage.
hokiealumnus's Avatar
Heh, then there's the other side: http://www.bit-tech.net/news/hardwar...ing-partners/1

Thanks to Nelly @ XS for posting that one. Now you have both sides.
mjw21a's Avatar
If this is still true when I next upgrade, then I guess I'll be sticking with AMD cards.
wagex's Avatar
wow nvidia are kicking themselves in the butt.
bmwbaxter's Avatar
Yes and no. I am sure there are lots of people out there that run their cards at stock, so to them no voltage control doesn't matter. It only matters for benchmarking and lower tier cards, since for the most part I believe lots of people buy mid range and then overclock to achieve the stock performance of top of the line cards.

Personally, this won't affect my purchases much since I run my GTX 680 at stock...

It is more than enough for any game on the market without overclocking. I will continue to buy whoever has the best performance out of the box for my daily use. so it remains completely clear with my conscience if it ever dies. my benching stuff doesn't stand a chance in


EDIT: just to be clear, this post isn't singling you out for your opinion. just the one I responded to is all.
wagex's Avatar


i know, idk. i mean if they can't put out pretty much anything but reference, all amd will have to do is be like ha, ours is clocked higher and can oc further. but i guess i see both sides, as most people don't even know what an overclock is

p.s. we need a smaller version of that smiley lol
txus.palacios's Avatar
Then you should also ban me and... I think it was hokie who ran stock?

Most times my GPUs are stock too.
Bobnova's Avatar
This isn't a new thing. It's a return to the old stuff.

I continue to remind everybody that there was no software voltage for Fermis over 1.21v if you didn't have the (very) special, restricted-use software.
Many generations didn't have any software voltage control at all!
Convicted1's Avatar
THIS.

And...

The fact of the matter is... This affects NOBODY but us Extreme Benchers... Or approximately .0000000002% of nVidia's market.

Otherwise there is PLENTY of voltage available for normal people to blow up their GPUs on air... Or even water.

It's just that us Extreme Benchers have to go to the old school methods of volt modding the hard way.
hokiealumnus's Avatar
Yes, absolutely.

However, (you knew that was coming, didn't you?) the scope of this discussion has gone beyond the reason this article was written in the first place - the removal of EVBot. EVBot != software voltage control. IIRC, you could exceed 1.21V with EVBot on a Fermi GPU, no?

Yes, software was mentioned but the reason this is upsetting (to me, anyway) is that they've forced removal of EVBot as an option. Well, that and EVGA is keeping the price the same.
Bobnova's Avatar
I don't think Nvidia saw the EVBot coming on the fermi, they just locked the BIOS and left it at that. Assuming people wouldn't go over it I guess.
It could be said that there is software on the EVBot that interfaces with software on the GPU.

I think that Nvidia forcing the removal of the EVBot is rather rude, I also think that EVGA leaving the price high is rather rude as well.

If we're believing Nvidia to an extent and believing that they really do warranty the cores they sell (lease?) to manufacturers this move makes sense to me. At least it makes sense if the Kepler core is very weak to sustained voltage.
It looks to me like Nvidia knows that the core will die if over-volted for any meaningful length of time and is trying to prevent that from happening. Ideally while hiding it.
ivanlabrie's Avatar
Sounds like that... a mid-range Kepler card rushed in, overvolted and overclocked from the factory to keep up with AMD's big dogs at the moment, till they could tweak and launch their leaky big Kepler.
hokiealumnus's Avatar
TiN has outdone himself this time, with a great guide to manually taking your Classified to the max: http://kingpincooling.com/forum/show...14&postcount=3

Warning: it is not for the faint of heart.
MattNo5ss's Avatar
I really want to learn electronic soldering...
Jacob says you can't just solder an EVBot connector in the empty spot, then TiN does it and it works

Also, the GPU Flasher is awesome! No more EVGA EVBot-supported mobo required for flashing
hokiealumnus's Avatar
Jacob is PR, he can't say it because that's the company saying it. If TiN says it, he's an engineer helping out fellow enthusiasts who happens to work for EVGA.
ivanlabrie's Avatar
Thanks for sharing that!
Looks promising...
voodoo do'er's Avatar
buys gpus from NV
starts temp comp
adds overvolting support
NV tells me to stop
tell them to kiss my butt, I bought the gpus; they don't stop comps from making non-ref cards
txus.palacios's Avatar
AIBs can add overvolt support if they drop nV's warranty. That means that if one of the cores dies, it's all the AIB's fault. AIBs don't want that.
voodoo do'er's Avatar

most warranties prohibit overclocking
so what's to lose? the owner broke the card, not the aib
txus.palacios's Avatar
How can the AIB prove the owner overclocked the card?
wingman99's Avatar
It looks like Nvidia was having a problem with warranties, so they needed to save some money, from what I've been reading. Sounds like it was a problem that needed to be solved.

+1. How can they blame the customer if the customer doesn't say they overclocked the card?
voodoo do'er's Avatar
other way around
Robert17's Avatar
Here's a link to NV's and the AIBs' versions of events, good read:
http://vr-zone.com/articles/nvidia-s...ons/17318.html