
FRONTPAGE NVIDIA GEFORCE GTX 780 Graphics Card Review

Great Review Hokie!!! :thup::thup::thup:

I am proud to say that in the VERY near future I will be the owner of a GTX 780 Classified Hydro Copper or Hydro Copper. I'm just waiting for the release and for EVGA to answer the questions I've asked about the EVBot port. From the rendered images I've seen, it seems the dual-fan version of the Classified has an EVBot port, but the Classified Hydro Copper and Hydro Copper do not. They aren't giving us definite answers yet and we've been told to "wait and see". My worry is, if I did go with a Classified Hydro Copper that has the EVBot port while not owning an actual EVBot, is there any point in buying it without using that feature and just OCing it via Precision? Will I get the same OCs as an EVBot user, or are those cards specifically made for EVBot users only?
 
Damn, this is getting technical, but insulting someone's grammar, etc. merely shows your lack of respect for that person, and likely for everyone else with opposing views around you.

We can argue till the end of time, but throwing your dollars and support behind nVidia is the strongest move you can make. Sig your rig proudly and be part of the green team. The whole AMD vs. Intel/nVidia thing is a lot like politics: the losing team won't change until something affects them directly, in their camp, and that is when they will change their tune.

Just wait a year, a couple of dead 7970s will teach em a thing er two =P

And if they continue after that, then it can be said that they are indeed fanboys.

I would get a GTX 780, but the cost is too high for me, and I'm an Nvidia fan.

I don't think people need to worry about AMD cards. If you get one with a 3-year warranty you will be safe; they make them to last.
 
Holy hell, a 780 with a functional EVBot port would be glorious!
I'd be rather surprised if it went over 1.21v, but if it does... zomg!
 
I would get a GTX 780, but the cost is too high for me, and I'm an Nvidia fan.

Me too, but I'm getting another identical model of my 670 instead, because two 670s > Titan anyway. Hopefully the prices will go down soon...
 
Damn, this is getting technical, but insulting someone's grammar, etc. merely shows your lack of respect for that person, and likely for everyone else with opposing views around you.

We can argue till the end of time, but throwing your dollars and support behind nVidia is the strongest move you can make. Sig your rig proudly and be part of the green team. The whole AMD vs. Intel/nVidia thing is a lot like politics: the losing team won't change until something affects them directly, in their camp, and that is when they will change their tune.

Just wait a year, a couple of dead 7970s will teach em a thing er two =P

And if they continue after that, then it can be said that they are indeed fanboys.

What on earth is that kind of BS?

How can you keep turning a blind eye to the information in front of you, and keep sticking to your fanboy points of view?

AMD vs. Intel/nVidia? Wth? I understand YOU ARE an Intel/nVidia fanboy to the core.

Personally, I prefer Intel CPUs over AMD. Back in the day I had both Intel and AMD, but now I'm all Intel on the CPU side, and I don't even bother learning about current AMD CPUs, in the same way that I think you don't even bother learning about AMD/ATI GPUs. You keep throwing around all these facts about cards that are, at the end of the day, pointless.

You cannot sit and compare heat on a card when you have nothing to compare it to.

I have 2x 7970 6GB cards, and I currently Bitcoin mine with them, which means they are being stressed quite a lot. My main GPU sits around 76°C under full load, and my second one sits at 65°C. Under full load, that's great, but if you stick my cards into a rig that hasn't got the same airflow, then the temps would be different. Until you test cards under identical environments, you have no personal experience to base anything on.

Like I said earlier in the thread, I have owned a GTX 590 and it had to be RMA'ed, so stop claiming nVidia cards don't break. It has been said very clearly in earlier posts that AMD mass produces and nVidia doesn't. If one out of 500 AMD cards fails and one out of every 100 nVidia cards fails, then it's nVidia drawing the short straw in your argument, but you don't know the production numbers, you don't know the RMA numbers, and not every single person on earth posts online when they have a hardware issue; in fact, most people just drag the PC to a store to get it fixed.

You are basing all your facts on the innermost belief that nVidia is god; you look at an nVidia GPU and you see a damn holy light around it.

It's OKAY to be a fanboy, but sometimes you are going to have to agree to disagree, which BTW is something you seem unable to do.

We have always known AMD runs hotter; this is nothing new, but what if AMD is aware of it and designs the products to run with that amount of heat?
 
Holy hell, a 780 with a functional EVBot port would be glorious!
I'd be rather surprised if it went over 1.21v, but if it does... zomg!

that has SOOOOO much win in it.

But I think it's more of a dream-list item than reality. :clap:
 
It's OKAY to be a fanboy, but sometimes you are going to have to agree to disagree, which BTW is something you seem unable to do.

We have always known AMD runs hotter; this is nothing new, but what if AMD is aware of it and designs the products to run with that amount of heat?

Yes, I know it is okay.

Some people around the forums, however, choose to hide it, and act like they're not biased when they really are.

As for the second point about the heat, yeah, they know. It's a smart business plan, I admit, but it doesn't put the consumer first. More heat = faster death, less heat = slower death.
 
Agreed. This discussion, while fruitful in some respects, is really an exercise in theorycrafting, and one side is drawing a conclusion that doesn't follow from the facts at hand. So let's concede it does run warmer; like bobn said, to what end? Does it matter if the AMD card dies in 10 years versus 15?
 
Agreed. This discussion, while fruitful in some respects, is really an exercise in theorycrafting, and one side is drawing a conclusion that doesn't follow from the facts at hand. So let's concede it does run warmer; like bobn said, to what end? Does it matter if the AMD card dies in 10 years versus 15?

I concur, there's no way to know.
Based on historical records, AMD cards have been rock solid in builds.
Yes, there are some complainers, but that's true for both camps.

Only time will tell. :) Theorycrafting is OK and fun to discuss, but until it's proven it's really only a hypothesis. Time will do justice; we need not judge now. :)
 
More heat = faster death, less heat = slower death.

This is wrong.

Higher temperature = faster death.
Higher voltage = faster death.
More heat production (if temperature and voltage and all other variables are the same) does NOT equal faster death.
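If it helps, here's a rough Python sketch of the Arrhenius-style acceleration model that reliability folks commonly use for thermal wear-out. The 0.7 eV activation energy is just an assumed, typical value for illustration, not anything measured on these cards; the point is that the variable that matters is the chip's temperature, not how many watts of heat it dumps into the room.

```python
import math

K_BOLTZMANN_EV = 8.617e-5    # Boltzmann constant in eV/K
ACTIVATION_ENERGY_EV = 0.7   # assumed "typical" value; varies by failure mechanism

def thermal_acceleration(temp_use_c: float, temp_ref_c: float) -> float:
    """How much faster wear-out proceeds at temp_use_c compared to temp_ref_c."""
    t_use = temp_use_c + 273.15
    t_ref = temp_ref_c + 273.15
    return math.exp((ACTIVATION_ENERGY_EV / K_BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t_use))

# Same card, same voltage, only the operating temperature differs:
print(thermal_acceleration(76, 65))  # roughly 2x faster aging at 76C vs 65C, under these assumptions
```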

I don't get why there were like 10 posts just on trying to understand the above.

Also, anyone trying to argue that one company inherently has better products than another (regardless of which companies and what they make; it doesn't even have to be about computers) is wrong. I'd also argue that fanboyism is inherently a bad thing, but that's just my opinion.

Also, random theorycrafting typically tends towards biased, terrible arguments more than anything productive. Theorycrafting in itself isn't bad, but the results of trying it usually are.
 
And AMD cards use more voltage than nVidia?

Sure, but I'm not sure; I don't personally own, and haven't overclocked, anything remotely modern.

However, the statements I made above were all meant to be ceteris paribus, which means all other things being equal. Since AMD and NVIDIA (obviously) have different core designs, among other things, they're not directly comparable in some aspects. Apples and oranges, if you will.

In other words, if you take two completely identical cards from either company and run one at a higher voltage than the other, the higher-voltage one will probably die sooner (by some amount that may or may not be significant, who knows). That's all my post was meant to say. You can't use voltage or power consumption or any measure like that to claim one GPU will last longer than a GPU of a completely different design; it's just illogical. If they were the same GPU, then yes, it makes sense.
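To put toy numbers on the "identical cards, different voltage" point, here's a quick sketch using the exponential voltage-acceleration model that reliability texts often use. The slope value is purely an assumption for illustration, and it would differ between designs, which is exactly why cross-architecture voltage comparisons say nothing.

```python
import math

GAMMA_PER_VOLT = 8.0  # assumed acceleration slope; design/process dependent, illustration only

def voltage_acceleration(v_use: float, v_ref: float) -> float:
    """Relative wear-out speed of an IDENTICAL part run at v_use instead of v_ref."""
    return math.exp(GAMMA_PER_VOLT * (v_use - v_ref))

# Two identical cards, one overvolted:
print(voltage_acceleration(1.21, 1.10))  # ~2.4x faster aging, with this assumed slope

# Comparing 1.21 V on one architecture against 1.10 V on a different architecture
# tells you nothing, because the slope itself changes with the design.
```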
 
They give you the option to. But higher voltage is relative to stock voltage within a design. I have a 7-ish year old Athlon XP that runs 1.75V stock on a small air cooler. Try doing that to an Ivy Bridge and see if it lasts 7 years. Or even 7 months.
 
I find the comparison there about which card dies first inconclusive at best.

Yes, higher voltage is not as good.
But look at the following example, which I'll use to help illustrate why this is pointless.

In the US, they use 110V light bulbs.
In Japan, we also have 200V light bulbs to use.
The estimated life of these light bulbs is usually the same number of hours.
Why's that? Because the 200V light bulbs come with 'thicker' tungsten filaments, built for that voltage.

(Now there's a trick we use: we buy 200V light bulbs in 110V countries, and those light bulbs almost NEVER burn out.)

So think of AMD as the 200V bulb and NVDA as the 110V.

The voltage usage there is not meaningful,
because AMD would have the higher voltage use in mind
and make those chips with 'thicker' silicon transistors, which can 'potentially' last longer.

*** I am trying to explain the theory with something that isn't exactly comparable to the semiconductor side of things; there are differences,
*** but I hope with this example you see why I say the voltage use is not really meaningful. :)

It will ONLY be meaningful IF you find the following info:
- AMD GPU and NVDA GPU (current gen): what electrical tolerance are they built for? Is it in fact the same?

Only if so should we be comparing voltage highs/lows and their potential impact on the lifespan of the chip. :)
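For anyone curious, the bulb trick actually works out numerically: incandescent lamp life is commonly quoted as scaling with roughly the 13th power of (rated voltage / applied voltage). That exponent is just an approximate rule of thumb, but it shows why the part designed for the higher stress isn't the one that dies first.

```python
LIFE_EXPONENT = 13.0  # approximate rule-of-thumb exponent for incandescent lamps

def relative_bulb_life(v_rated: float, v_applied: float) -> float:
    """How many times its rated life a bulb lasts when run below its rated voltage."""
    return (v_rated / v_applied) ** LIFE_EXPONENT

# A bulb rated for 200 V, run on a 110 V circuit:
print(relative_bulb_life(200.0, 110.0))  # on the order of a few thousand times the rated life

# Flip side of the analogy: the part built for the higher voltage isn't worse off,
# because it was designed with that stress in mind.
```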
 
Dual GK110?

I dunno if that'll happen; I feel like it'll just cannibalize their own Titan sales (unless it's significantly more powerful and more expensive than the Titan).
 