NVIDIA Says No to Voltage Control

It has come to our attention that NVIDIA is making sure that there will be no voltage control for its GTX 600 series (Kepler) lineup. In a post on the EVGA forums, EVGA employee Jacob (EVGA_JacobF) responded to a question about why a new GTX 680 Classified shipped without an EVBot port.

Picture Pulled from the Original Post, Courtesy Sticky622 @ EVGA Forums

The initial assumption among members was that this was just a manufacturing defect that had slipped past QA. However, here's Jacob's response to the question:

“Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution.”

Of course, many members wondered why EVGA had decided to remove the signature feature of its flagship GTX 680, to which Jacob responded:

“Unfortunately we are not permitted to include this feature any longer.”

So, they are not permitted to include the feature, and NVIDIA is the only entity that could “not permit” them to do something on their own product. Members then asked “Why?” once again, trying to coax out a less vague answer, and were supplied with:

“It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device.”

From this quote, it’s obvious that NVIDIA does not want its partners to supply any means of voltage control with the GTX 600 series of GPUs. This is a slap in the face to the many enthusiasts and everyday overclockers who enjoy pushing hardware for that extra performance. It leaves the ability to increase voltage on Kepler cards solely to the extreme, warranty-voiding modders who hardmod their GPUs; the stress-free overclocking experience is gone for everyone else.

EVGA’s only fault throughout this process of removing EVBot ports is that there was no official announcement before cards without them were shipped or listed on their site. Also, an EVGA forums member pointed out that the picture of the GTX 680 Classified in their product section seems to have the EVBot port blacked out using something like MS Paint. So, from the outside looking in, it looks like EVGA was trying to hide the fact that the GTX 680 Classified no longer has EVBot support and hoping no one would notice.

From EVGA’s Product Page

We sent an email to EVGA asking for more details about why this happened, along with some Classified-specific questions. Jacob was able to answer the Classified-specific ones.

Will you be reducing the price of the GTX 680 Classified?

“No plans at the moment.”

What makes your Classified worth the premium being charged if it is now limited to NVIDIA’s (low) Power Target limits with no additional voltage control?

“Higher power target, better cooling, higher out of box performance, binned GPU, superior voltage/power regulation, 4GB memory.”

Doesn’t that make all the records Vince keeps setting kind of worthless for anybody but him and EVGA? Not that they aren’t astounding and take a ton of skill, but if only he has access to cards that can do it, what’s the point?

“Anybody can do the same, just need to have the expertise like he does to modify the cards manually (this will VOID warranty though)”

Now, there may be ways of getting around even this if you can do it. Will EVGA be willing to supply a diagram / explanation for making your own EVBot port or directly soldering on the EVBot lead?

“Not from EVBot, but there are other documented ways to override voltage, again this will void warranty though.”

The questions that could not be answered were “Why is NVIDIA doing this?” and “Are they [NVIDIA] experiencing an increased level of RMAs? …as in, does real voltage control kill Kepler GPUs excessively fast?” It would have been nice to know the answers to these, but only NVIDIA knows exactly why it is holding back the potential of its GPUs by limiting the cards so much.

All this information makes it seem like it’s just a matter of time before NVIDIA snuffs out voltage control features from other manufacturers as well. We know MSI and Galaxy have been having trouble getting NVIDIA to budge on allowing voltage control. ASUS has its GPU Hotwire feature, which can control GPU voltage when combined with their high-end motherboards (similar to EVGA’s EVBot). I haven’t heard or read anything about ASUS removing Hotwire for NVIDIA cards, but it looks to be inevitable. We’ve sent an email to our contacts at ASUS asking about this and will update with any information we get from them.

So, the AIB partners are not to blame here, it’s all NVIDIA.

– Matt T. Green (MattNo5ss)



txus.palacios (Member):

Oh, well, back to ATi then.

If they want to stop us from overclocking, I might as well stop myself from buying their products.

Sadly, if this "no overclocking" policy is nVidia's new way of thinking, these Fermis will be the last green thing I unbox.


TimoneX (Closet Elitist Member):

Lame move nv.

Just reminds me of the old days with chopped surface traces to disable SLI on nv chipsets and drivers that wouldn't allow SLI if certain chipsets were detected. :rolleyes:


EarthDog (Gulper Nozzle Co-Owner):

Well, no overclocking may be a bit harsh, LOL... But it's incredibly dumbed down and easier compared to AMD now...

Nvidia OC in a nutshell: increase the power limit slider to the highest it will go, increase fan speed to keep temps under 70C, then push the core and memory clocks. The voltage is limited and goes to its max regardless. Done. They just limit the voltage that can go to those GPUs.
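To put that recipe in concrete terms, here is a minimal sketch of the power-limit and temperature steps using NVIDIA's NVML management library through the pynvml Python bindings. This is purely illustrative and not something from the thread: it assumes a GPU and driver that actually expose these NVML calls (many GeForce boards do not) and elevated privileges for the "set" call. Tellingly, NVML has no call for GPU core voltage at all; voltage and clock offsets still have to come from vendor tools like Precision or Afterburner, which is exactly the limitation being discussed.

# Minimal sketch: raise the GPU power limit and read the core temperature
# via NVML (pynvml bindings). Assumes the driver exposes these calls and
# that the script runs with root/administrator rights for the "set" call.
from pynvml import (
    NVML_TEMPERATURE_GPU,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceGetTemperature,
    nvmlDeviceSetPowerManagementLimit,
    nvmlInit,
    nvmlShutdown,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # Step 1: push the power limit to the maximum the board allows.
    # NVML reports limits in milliwatts.
    min_mw, max_mw = nvmlDeviceGetPowerManagementLimitConstraints(gpu)
    print("Power limit range: %.0f W to %.0f W" % (min_mw / 1000.0, max_mw / 1000.0))
    nvmlDeviceSetPowerManagementLimit(gpu, max_mw)  # needs elevated privileges

    # Step 2: watch temperatures while core/memory clock offsets are raised
    # with a vendor tool; there is no NVML call for voltage.
    temp_c = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
    print("Current GPU temperature: %d C" % temp_c)
finally:
    nvmlShutdown()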


trekky (Member):

Quoting txus.palacios:

Oh, well, back to ATi then. If they want to stop us from overclocking, I might as well stop myself from buying their products. Sadly, if this "no overclocking" policy is nVidia's new way of thinking, these Fermis will be the last green thing I unbox.

Same for me. I'm the builder for all my friends and family.
My last build had two 560 Tis in SLI with a nice (safe) overclock,
and the only reason I was getting ATI was the bitcoins, but that's ending soonish, so I guess I have another good reason to stay now...
P.S. Nvidia is crazy! :screwy:


hokiealumnus (Water Cooled Moderator):

Hah, NVIDIA is clamping down across the board. Looks like MSI got caught trying to circumvent NVIDIA's restrictions too: http://www.tomshardware.co.uk/MSI-GTX-660-670-overvolting-PowerEdition,news-40278.html. Different situation, same end result: reduced overclocking for the consumer.


EarthDog (Gulper Nozzle Co-Owner):

Oh yes, this isn't only for EVGA...

I can confirm that other partners went to Nvidia asking for MOAR voltage and, AFAIK, were turned away at the door. MSI was one of them with the 680 Lightning (which I believe needs an unreleased BIOS and version of Afterburner in order to get past the 1.21v limit).


hokiealumnus (Water Cooled Moderator):

Quoting Bobnova from the previous thread on this subject:

Nvidia has a huge say in what the manufacturers do, they can simply refuse to sell the cores to them!

There was already a ban on software overclocking >1.3v, but EVGA dodged that with the hardware EVBot.
Guess Nvidia didn't like that much.

They're shooting themselves in the foot marketing wise IMO. On the other hand it allows them to sell more 680s as the limits prevent the 670 from stomping all over the 680, and the 660ti from stomping the 670, etc. etc.

I can safely say that I won't be buying anything Nvidia (new, anyway) any time soon.
Admittedly, I wouldn't be buying a new GPU any time soon anyway.

EDIT:
From the picture in the thread (http://s13.postimage.org/mcu6xgzhj/photo.jpg) the holes are still there. It becomes a simple issue to buy the proper header from digikey and solder it on. Of course, there goes the warranty, which is exactly what Nvidia's plan is.
This is the header more or less, you'll need to cut one pin off.
http://www.digikey.com/product-detail/en/87230-3/A26593-ND/353085

EDIT 2:
That said, there are non-solder ways to attach pins/wires there :sn:


bmwbaxter (Member):

:facepalm:

That is all I have to say.


PolePosition (Member):

I think WARRANTY is the key determining factor here. I'm sure there have been several warranty claims on modified cards, which NVIDIA honored to save face, but their focus may be shifting to eliminating the optional mods, where when a card fails, the user simply removes the mods and returns it in stock condition to make the warranty claim. I can see how that might add up to a significant loss in the grand scheme of things, with consumers getting two or more cards for the price of one over the warranty period, but at the same time, these changes and limitations are driving enthusiasts to the other side. NVIDIA holds its ground on being the fastest yet most expensive, while AMD provides comparable performance sold in larger quantities and is more user friendly. NVIDIA is sounding more like another very big company.


Convicted1 (Member):

Buttttt... In reality NVidia isn't the fastest anymore.

The 7970 ROFLstomps it in almost all benchmarks...

Anddd... In the end... That's really who we are talking about here... Benchers.
