
G80 incredible but is it worth it?


dreamtfk

Member
Joined
Feb 1, 2002
Location
Orlando FL
Just read an article on the Inquirer about the demo Nvidia is going to display at the sample launch. It will render a real Playboy playmate, and it's supposedly going to be the best render yet (surpassing the old pixie render).

My question is: realistically, when will we see any games that can take advantage of this behemoth of a card? I really want to get the 8800GT, but $500? I am assuming that the 7900GTX and GX2 will drop some in price after the release, which sounds tempting.
 
I don't think it's worth it, particularly when you consider the power consumption. The cost of ownership will be far higher than is obvious for these power-guzzling cards. Obviously, only you can decide if it's worth it to you.
 
I will be waiting to see if it can actually pull off what they say it can, but I am starting to lose interest in the DX10 cards until a refresh can bring power consumption down.

~jtjuska
 
jtjuska said:
I will be waiting to see if it can actually pull off what they say it can, but I am starting to lose interest in the DX10 cards until a refresh can bring power consumption down.

~jtjuska

Yeah, that is a little ways off; I think DX9 still has a lot of life left in it. Although I did read that the next-gen cards will greatly improve on the best DX9 performance we currently have (i.e. SLI).
 
Was the last gen's high end worth it? To some yes, to others no.
Will this gen's high end be worth it? To some yes, to others no.

Simple as that.
 
Mr.Guvernment said:
Was the last gen's high end worth it? To some yes, to others no.
Will this gen's high end be worth it? To some yes, to others no.

Simple as that.

Amen to that.

Maybe we can relate this to:

Some guys like slim and athletic, some like thick and curvy. (I'm not sure which card is which type, but uhh yeah :) )
 
lol, I just never thought of it that way. I was all for DX10, but with them eating up that much power I just don't personally see it being worth it. We will see in about 6 or 7 months when I end up having to get rid of the X1950s I'll have coming next week (I hope...), but at this moment it isn't worth it.

~jtjuska
 
I agree that the power/cooling requirements are getting a bit out of control, but if you want the best performance, you have to suck it up!

And hey, you've got a PCP&C 750, so I think you have enough power there :) (as long as you can afford it on your power bill)

But then, if you have two X1950s coming, those will surely pull quite a bit of power themselves.
 
It's interesting how everyone is complaining that GPUs use a lot of power now. When the 6800/X800 XT PE came out, everyone complained that it used a bunch of power. Then the X1900 came out and people started saying "hey, this power thing is getting out of hand," but they still bought them like crazy. Really, the 8800 is just another step in the "faster, faster, faster, no matter what the power requirements" progression. It's funny how everyone is finally slowing down and saying, "Hey, two years ago I had a 400-watt PSU; what am I doing with this 750-watter now?"
 
speed bump said:
It's interesting how everyone is complaining that GPUs use a lot of power now. When the 6800/X800 XT PE came out, everyone complained that it used a bunch of power. Then the X1900 came out and people started saying "hey, this power thing is getting out of hand," but they still bought them like crazy. Really, the 8800 is just another step in the "faster, faster, faster, no matter what the power requirements" progression. It's funny how everyone is finally slowing down and saying, "Hey, two years ago I had a 400-watt PSU; what am I doing with this 750-watter now?"

Who's this "everyone" you keep referring to? I've always considered power consumption an important factor in my GPU selection and I'm certain I am not alone. Sure there are those that want performance at any cost and don't mind(or don't pay) the electric bill, but that ain't me. I think with each generation consuming more and more power, and electricity costs increasing faster than inflation there will be more people recognizing the cost of owning/operating some of these high end cards. Grab yourself a wattmeter and hook it up to your main rig. It will become all too obvious where the power is going when the electric bill comes.
 
I mean, from my current card to either one of these cards, I could be looking at a 2 to 3 times increase in power consumption in the GPU alone.

Truly, I guess I don't have as much of a problem, but considering I just have a 520W PSU with an OCed rig, it becomes a slight concern how much power the GPU will take. My main thing is I hope a lot of that hot air spills out the back of the case instead of into my case heating up the rest of my components, and that I won't need to buy a new case because of the increased card length.
 
Well, I wasn't an "Everyone" that complained about power consumption until I built an OCed opteron 165 system with SLI 7900GTX's and all the other bells and whistles. It took a few months to realize that it was raising the electric bill by a good $20-$30 a month!

So, while I still want the best performance I can afford (which may or may not be the 8800), I am more conscious of power draw now that I'm familiar with how notable it is.

EDIT: I guess one good thing is that you could probably throw some ductwork on the card and use it to heat your house!
 
The overall power draw of the G80 is absolutely ridiculous. It seems that to gain more performance, we have to up the ante on power. Like I said before, the design of the G80 is robust at best. It sure as heck is not something I plan to buy come the next generation; rather, I'd wait for the refined version that will hopefully use much less power. 400W for one card is absolutely friggin ridiculous; things are bad enough with OC'ed rigs, but 400W just for operating is not something to laugh at. Imagine if you ran an overclocked Opteron rig with two 8800GTs in SLI. The power requirement is already a big enough barrier to break, let alone the fact that you would want to OC later on. How much more of a power barrier do the guys at Nvidia and ATI think we're going to deal with before a large majority just say, "Forget it, I've had enough. No more of this bs from either company"? Honestly, it's absolutely outrageous at a 400W barrier. :mad:
 
I am glad that they keep releasing these more-than-powerful, super duper ridiculous OMGFTW!!1! cards. It drops the price of the cards that are "good enough to play any game on the market" to a reasonable $200-or-less level.

By the time I need to get a G81/RD9-million-whatever card... the power supply that I need to power it will be right back in my price range... under 100 bucks :)

I won't ever be cutting edge... but the top 10% of the market is good enough for me :) (thanks to all I have learned from OCF and XS :))
 
InsaneManiac said:
The overall power draw of the G80 is absolutely ridiculous. It seems that to gain more performance, we have to up the ante on power. Like I said before, the design of the G80 is robust at best. It sure as heck is not something I plan to buy come the next generation; rather, I'd wait for the refined version that will hopefully use much less power. 400W for one card is absolutely friggin ridiculous; things are bad enough with OC'ed rigs, but 400W just for operating is not something to laugh at. Imagine if you ran an overclocked Opteron rig with two 8800GTs in SLI. The power requirement is already a big enough barrier to break, let alone the fact that you would want to OC later on. How much more of a power barrier do the guys at Nvidia and ATI think we're going to deal with before a large majority just say, "Forget it, I've had enough. No more of this bs from either company"? Honestly, it's absolutely outrageous at a 400W barrier. :mad:


The card itself does not use 400W; Nvidia says you should have a minimum 400W PSU in a system with one 8800 card.
 
G80: It's huge... we've just got to wait till we see what it can do.

DX10: Same here, just wait, but I have to say, it's going to be awesome if it really delivers on its promises.
 