NVIDIA Says No to Voltage Control

It has come to our attention that NVIDIA is making sure there will be no voltage control for its Kepler (GTX 600 series) lineup. In a post on the EVGA forums, EVGA employee Jacob (EVGA_JacobF) responded to a question about why a new GTX 680 Classified shipped without an EVBot port.

Picture pulled from the original post, courtesy of Sticky622 @ EVGA Forums

Members initially assumed that this was just a manufacturing defect that had slipped past QA. However, here's Jacob's response to the question.

“Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution.”

Of course, many members wondered why EVGA had decided to remove the signature feature of its flagship GTX 680, to which Jacob responded:

“Unfortunately we are not permitted to include this feature any longer.”

So, they are not permitted by NVIDIA to include the feature; NVIDIA is the only entity that could refuse to "permit" them to do something with their own product. Members then asked "Why?" once again, trying to coax out a less vague answer, and were given this:

“It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device.”

From this quote, it's obvious that NVIDIA does not want its partners to supply any means of voltage control with the Kepler series of GPUs. This is a slap in the face to the many enthusiasts and everyday overclockers who enjoy pushing hardware for that extra bit of performance. That leaves only the extreme, warranty-voiding modders who hardmod their GPUs with the ability to increase voltage on Kepler cards; everyone else can forget a stress-free overclocking experience.

EVGA's only fault in this process of removing EVBot ports is that there was no official announcement before cards without the port were shipped or listed on their site. Also, an EVGA forums member pointed out that the picture of the GTX 680 Classified in their product section appears to have the EVBot port blacked out with something like MS Paint. So, from the outside looking in, it looks like EVGA was trying to hide the fact that the GTX 680 Classified no longer has EVBot support and was hoping no one would notice.

From EVGA's Product Page

We sent an email to EVGA asking for more details about why this happened, along with some Classified-specific questions. Jacob was able to answer the latter.

Will you be reducing the price of the GTX 680 Classified?

“No plans at the moment.”

What makes your Classified worth the premium being charged if it is now limited to NVIDIA’s (low) Power Target limits with no additional voltage control?

“Higher power target, better cooling, higher out of box performance, binned GPU, superior voltage/power regulation, 4GB memory.”

Doesn’t that make all the records Vince keeps setting kind of worthless for anybody but him and EVGA? Not that they aren’t astounding and take a ton of skill, but if only he has access to cards that can do it, what’s the point?

“Anybody can do the same, just need to have the expertise like he does to modify the cards manually (this will VOID warranty though)”

Now, there may be ways of getting around even this for those capable. Will EVGA be willing to supply a diagram/explanation for making your own EVBot port or directly soldering on the EVBot lead?

“Not from EVBot, but there are other documented ways to override voltage, again this will void warranty though.”

The questions that could not be answered were "Why is NVIDIA doing this?" and "Are they [NVIDIA] experiencing an increased level of RMAs? …as in, does real voltage control kill Kepler GPUs excessively fast?" It would have been nice to know the answers to these. Only NVIDIA knows exactly why it is holding back the potential of its GPUs by limiting the cards so much.

All this information makes it seem like it's just a matter of time before NVIDIA snuffs out voltage control features from other manufacturers as well. We know MSI and Galaxy have been having trouble getting NVIDIA to budge on allowing voltage control. ASUS has its GPU Hotwire feature, which can control GPU voltage when combined with their high-end motherboards (similar to EVGA's EVBot). I haven't heard or read anything about ASUS removing Hotwire from NVIDIA cards, but it looks to be inevitable. We've sent an email to our contacts at ASUS asking about this and will update with any information we get from them.

So, the AIB partners are not to blame here; it's all NVIDIA.

– Matt T. Green (MattNo5ss)

Discussion
    Oh, well, back to ATi then.
    If they want to stop us from overclocking, I might as well stop buying their products.
    Sadly, if this "no overclocking" policy is nVidia's new way of thinking, these Fermis will be the last green thing I unbox.
    Lame move, nv.
    Just reminds me of the old days with chopped surface traces to disable SLI on nv chipsets and drivers that wouldn't allow SLI if certain chipsets were detected. :rolleyes:
    Well, "no overclocking" may be a bit harsh, LOL... But it's incredibly dumbed down and easy now compared to AMD...
    Nvidia OC in a nutshell: increase the power limit slider as high as it will go, increase the fan speed to keep temps under 70C, then push the core and memory clocks. The voltage is limited and goes to its max regardless. Done. They simply cap the voltage that can reach those GPUs.
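    For the curious, the power-target half of that recipe can be reproduced against NVML, NVIDIA's own management library. Below is a minimal Python sketch using the pynvml bindings (pip install nvidia-ml-py); it assumes a single NVIDIA GPU at index 0, admin/root rights, and a board whose driver exposes these calls, and it is only an illustration of the concept, not what Precision or Afterburner actually call under the hood. Tellingly, NVML offers a getter and setter for the power limit plus temperature readouts, but no voltage setter at all.

    ```python
    # Illustrative sketch only: drive the "power target" step of the recipe
    # through NVML's Python bindings. Note that voltage is absent from this
    # API; the card boosts to its capped voltage on its own.
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetPowerManagementLimit,
        nvmlDeviceGetPowerManagementLimitConstraints,
        nvmlDeviceSetPowerManagementLimit,
        nvmlDeviceGetTemperature, NVML_TEMPERATURE_GPU,
    )

    nvmlInit()
    try:
        gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

        # Current power limit and the board's hard floor/ceiling, in milliwatts.
        current = nvmlDeviceGetPowerManagementLimit(gpu)
        lo, hi = nvmlDeviceGetPowerManagementLimitConstraints(gpu)
        print(f"power limit: {current / 1000:.0f} W "
              f"(board allows {lo / 1000:.0f} to {hi / 1000:.0f} W)")

        # "Increase power limit slider to the highest it will go": NVML
        # rejects anything outside [lo, hi], so hi is the real ceiling.
        nvmlDeviceSetPowerManagementLimit(gpu, hi)

        # "Keep temps under 70C": poll the core temperature while pushing clocks.
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
        print(f"core temp: {temp} C")
    finally:
        nvmlShutdown()
    ```

    Past that ceiling, software is out of options; everything further is the warranty-voiding hardmod territory discussed above.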
    txus.palacios
    Oh, well, back to ATi then.
    If they want to stop us from overclocking, I might as well stop buying their products.
    Sadly, if this "no overclocking" policy is nVidia's new way of thinking, these Fermis will be the last green thing I unbox.

    Same for me. I'm the builder for all my friends and family.
    My last build was two 560 Tis in SLI with a nice (safe) overclock.
    The only reason I was getting ATI was the bitcoins, but that's ending soonish, so I guess I have another good reason to stay now...
    P.S. Nvidia is crazy! :screwy:
    Oh yes, this isn't only for EVGA...
    I can confirm that other partners went to Nvidia asking for MOAR voltage and, AFAIK, were turned away at the door. MSI was one of them with the 680 Lightning (which I believe needs an unreleased BIOS and version of Afterburner in order to get past the 1.21v limit).
    Quoting Bobnova from the previous thread on this subject:
    Bobnova
    Nvidia has a huge say in what the manufacturers do; they can simply refuse to sell them the cores!
    There was already a ban on software overclocking >1.3v, but EVGA dodged that with the hardware EVBot.
    Guess Nvidia didn't like that much.
    They're shooting themselves in the foot marketing-wise, IMO. On the other hand, it allows them to sell more 680s, as the limits prevent the 670 from stomping all over the 680, the 660 Ti from stomping the 670, etc., etc.
    I can safely say that I won't be buying anything Nvidia (new, anyway) any time soon.
    Admittedly, I wouldn't be buying a new GPU any time soon anyway.
    EDIT:
    From the picture in the thread (http://s13.postimage.org/mcu6xgzhj/photo.jpg), the holes are still there. It becomes a simple matter of buying the proper header from Digikey and soldering it on. Of course, there goes the warranty, which is exactly Nvidia's plan.
    This is more or less the header; you'll need to cut one pin off.
    http://www.digikey.com/product-detail/en/87230-3/A26593-ND/353085
    EDIT 2:
    That said, there are non-solder ways to attach pins/wires there :sn:
    I think WARRANTY is the big determining factor here. I'm sure there have been several warranty claims on modified cards, which Nvidia honored to save face, but their focus may now be on eliminating the optional mods, where, when a card fails, the user simply removes the mods and returns the card in stock condition to make a warranty claim. I can see how that might add up to a significant profit loss in the grand scheme of things, with consumers getting two or more cards for the price of one over the warranty period. At the same time, the changes and limitations are driving enthusiasts to the other side. Nvidia holds its ground on being the fastest yet most expensive, while AMD provides comparable performance sold in larger quantities while being more user friendly. Nvidia is sounding more like another very big company.
    Buttttt... In reality NVidia isn't the fastest anymore.
    The 7970 ROFLstomps it in almost all benchmarks...
    Anddd... In the end... That's really who we are talking about here... Benchers.
    I think it should matter a bit. AMD sells their GPUs to AIB partners. Said partners can do whatever the heck they want to them, as they OWN the GPUs.
    NVIDIA sells their GPUs to AIB partners. Then they say 'if you want to use our GPUs, you have to do so as we tell you. If you don't, we just won't sell them to you. Neener neener boo boo.'
    Which strategy do you like better? :shrug:
    The main thing is that nvidia doesn't want people overclocking their cheaper cards to match the performance of the top end cards. Why would someone want to purchase a 680 when they could buy a 660 or 670 and overclock it to the same levels?
    As far as the 680 being limited, I'd say that it has to do with the next-gen cards coming out soon and Nvidia wanting people to buy the latest and greatest (that is limited) rather than stick with the older generation card and overclock it.
    I guess it depends on how much of the market share you own. When you're the big dog, you can afford to dictate policy to those reselling your products.
    The question I would ask is: in the case where one of these 2nd- or 3rd-party vendors sells a card and it craps out, who is footing the bill in the end? Let's say you buy an MSI Lightning with an Nvidia chip and it craps out. Of course you would return the card to MSI for a warranty claim, but does MSI in turn get its money back from Nvidia if the failure is chip related?
    moocow
    The main thing is that nvidia doesn't want people overclocking their cheaper cards to match the performance of the top end cards. Why would someone want to purchase a 680 when they could buy a 660 or 670 and overclock it to the same levels?
    As far as the 680 being limited, I'd say that it has to do with the next-gen cards coming out soon and Nvidia wanting people to buy the latest and greatest (that is limited) rather than stick with the older generation card and overclock it.

    I look at this a bit differently...
    Sure... You could buy a 670 and make it outperform the 680 @ stock clocks... But the 680 when OC'd should be able to reach higher clocks than the 670... Thus the 680 is still superior.
    Unless I'm just completely lost on how the new 6 series cards work, that is.
    Convicted1
    Buttttt... In reality NVidia isn't the fastest anymore.
    The 7970 ROFLstomps it in almost all benchmarks...
    Anddd... In the end... That's really who we are talking about here... Benchers.
    EXACTLY. Some are looking at this from a different perspective. Not to mention, people who physically modify high-end cards are what, one in a million? That is not affecting their bottom line as much as the moron who thinks the card doesn't work but forgot to plug in both power connectors. Seriously, compare ID10T errors to the people who modify cards... :)
    PolePosition
    I guess it depends on how much of the market share you own. When you're the big dog, you can afford to dictate policy to those reselling your products.
    The question I would ask is: in the case where one of these 2nd- or 3rd-party vendors sells a card and it craps out, who is footing the bill in the end? Let's say you buy an MSI Lightning with an Nvidia chip and it craps out. Of course you would return the card to MSI for a warranty claim, but does MSI in turn get its money back from Nvidia if the failure is chip related?

    Yes. That's exactly how it works.
    MSI passes the buck up to NVidia...
    PolePosition
    I think WARRANTY is the big determining factor here. I'm sure there have been several warranty claims on modified cards, which Nvidia honored to save face, but their focus may now be on eliminating the optional mods, where, when a card fails, the user simply removes the mods and returns the card in stock condition to make a warranty claim. I can see how that might add up to a significant profit loss in the grand scheme of things, with consumers getting two or more cards for the price of one over the warranty period. At the same time, the changes and limitations are driving enthusiasts to the other side.
    I don't see how this is any different from how Intel or AMD handle their RMAs. If you overvolted the processor, beat on it hard, and it died, how would they know? It all comes down to being honest. What I would like to know is how many cards were actually killed through overvoltage out of the total that got returned. It can't be that many, so this comes down to profits and bean counters. Someone inside nVidia saw this as a way to recoup money being "lost" through RMAs.
    Judging from their past decisions, they will either renege on this really fast if there is enough "kick back" or stick with it because they are hard-headed like that. Once ATI drivers for Linux become a bit more mature and easier to deal with, I will probably switch if they continue down this route.
    Perhaps I do not understand how warranties work within companies... This excerpt is from the article Hokie linked a few posts up:
    In other words, MSI was cheating. Perhaps no one would ever have known if it hadn’t been for one side effect. The increased voltage can cause the system to refuse to POST.

    Why is MSI changing the voltage a problem for Nvidia? Does MSI get a kickback from NV for returning bad cards? I mean, as I see it, MSI should shoulder the cost of making that change, not NV.