
Why the X800 XT / 6800 Ultra aren't worth $500


avesta

Member
Joined
Apr 22, 2004
I was really disappointed with the fact that we will only be getting 30-40 fps average in Doom 3 with the X800 XT and 6800 Ultra at max settings and resolution. You would figure a $500 card would run today's high-end games a lot better than that. I would expect these $500 cards to run the highest-end games with everything maxed out at 60-70 fps, so that the framerate never dips below 30 fps when there are a lot of monsters on the screen at one time, for example.
To the best of my knowledge, when we start playing Doom 3 and Half-Life 2 at max details, we are going to see the framerate dip way below 30 fps at certain stages. (Please correct me if you feel otherwise; I'm judging this from the Doom 3 benchmarks.) This is not something you want from your $500 card, which is designed to be played at MAX DETAILS. (Face it, no one pays 500 bucks for a card and then turns down the settings to achieve playable framerates in today's high-end games.)

I do realize that these games are VERY demanding on the GPU and have very high-quality textures, lighting, physics, etc... BUT we are being charged $500 for a card which cannot achieve any more than 30-40 fps in upcoming high-end games. IMHO, this is why these cards aren't worth 500 bucks.

Ideally, I'd like to play these games with everything maxed out at no less than 50-60 fps, so that I'm never dipping below 30 fps or so.

Let me know how you guys feel about this... and sorry for the long post
 
I completely agree.

I always thought hardware was way ahead of software. The simple fact is, it's not.

I hope to play Doom III at 1280x1024 with max detail, 4xAA and 8xAF.
 
You can't run Doom3 at max settings yet... hardware limitations.

And I disagree with you, because supply and demand is what determines price. Not only that, but I'd rather have room to grow in a game than force the game to look crappier just to run a constant 60fps.
 
But wouldn't you like to run the game in its FULL glory the FIRST time that you play it, rather than waiting for the next two generations of cards before you can do that?
 
Nope, I'd rather not play it at the highest detail the first time, because if game developers worked around the cards' limitations, we'd never get better cards. I'd rather the game be two generations ahead of the card than vice versa. Also, if a game was truly meant to be seen in the highest detail, then it should be good enough to last two generations' worth.

Oh, and yes, I don't doubt id Software's word one bit, especially coming out of Carmack's mouth. As we know, it's not the games holding back the cards, it's the cards holding back the games.
 
Well, it appears you haven't been messing with computers very long, because this is normal in the game/hardware cycle. When Q3 came out, most hardware was doing 30-40 fps and there were $500 video cards around as well. Yes, it kinda sucks, but as the leading edge is always pushed, you can expect it to go like this for the foreseeable future.

avesta said:
I was really disappointed with the fact that we will only be getting 30-40 fps average in Doom 3 with the X800 XT and 6800 Ultra at max settings and resolution. You would figure a $500 card would run today's high-end games a lot better than that. [...]
 
Pake said:
Nope, I'd rather not play it at the highest detail the first time, because if game developers worked around the cards' limitations, we'd never get better cards. I'd rather the game be two generations ahead of the card than vice versa. Also, if a game was truly meant to be seen in the highest detail, then it should be good enough to last two generations' worth.

What the ****?????

So you'd rather not play at the highest detail because it takes longer to get better cards??

LMAO, go away dude. :eek:
 
I figure that Doom III is more of a showcase of what the engine can do for developers, similarly to the Quake engines. If you look at Quake III, that engine is still being used in some fairly new games, and in three years games will be using the Doom III engine, the hardware then will be getting 200+ fps in it, and people will complain that the new hardware is a waste because 200, 300 or 400 fps is not really needed in games. It's just the cycle: first the software is ahead, then after a few years hardware passes the software, then the next new engine brings all the current hardware to its knees, and the cycle repeats.

I figure if you want to have Doom III at high settings, some AA/AF and high resolution, then you should have to pay the premium. It's a game engine that will be powering games for the next 3+ years. The engine is meant for the future, and probably even more eye candy than Doom III will come out of it in the years to come. I believe Call of Duty was a Quake III engine game, but look how much better Call of Duty is than Quake III.

I am happy with my 9700 Pro and should be able to play it at medium settings at 1024x768. In a year and a half or so, when the next-gen cards are coming out and give something like a 50% performance increase over the current lineup and there are Athlon 64 4xxx's, we will start to see Doom III at 1600x1200, 8xAA/16xAF and ultra-high settings at 60+ fps.
 
Falcon-K said:
In a year and a half or so, when the next-gen cards are coming out and give something like a 50% performance increase over the current lineup and there are Athlon 64 4xxx's, we will start to see Doom III at 1600x1200, 8xAA/16xAF and ultra-high settings at 60+ fps.

True.
You see, that's called playing Doom 3 in its full glory. It's a shame that we don't even have the option of playing it in its full glory at the present time, even if we pay 500 bucks for a card. You have to either compromise the gaming experience by turning down the settings on a $500 card (which, btw, is designed for high res / high AA / high AF), OR you have to wait until the next gen of cards comes out and THEN play Doom 3 for the first time.
 
avesta said:
True.
You see, that's called playing Doom 3 in its full glory. It's a shame that we don't even have the option of playing it in its full glory at the present time, even if we pay 500 bucks for a card. You have to either compromise the gaming experience by turning down the settings on a $500 card (which, btw, is designed for high res / high AA / high AF), OR you have to wait until the next gen of cards comes out and THEN play Doom 3 for the first time.


I would like to play it in its full glory, but this engine is so advanced, and id made it purposely to be way ahead of the hardware manufacturers yet still playable on lower machines. They did this so they can sell their engine for years to come instead of six months. The engine is designed to scale, and in a few years the graphics of games using the Doom III engine are going to be insane, and probably far better than the graphics in Doom III.

Just like the Unreal III engine, when it comes out, is supposed to be an even more advanced engine than Doom III. The Unreal III engine will bring whatever hardware is out at the time to its knees.

When an engine like this, Unreal III, or any other new engine debuts, it's meant to cripple the current hardware, and boy do the engines do that. The software coming before the hardware is just something that we are going to have to live with.
 
LOL, this is the same argument every 18 months...

"Why a GeForce3 isn't worth $500!"

"Why a GeForce4 isn't worth $500"

"Why a GeForce5800 isn't worth $500" <-- well at least that argument had some merit ;)

"Why a Radeon 9800Pro isn't worth $500"

"Why a X800 or GF6800 isn't worth $500"

Every cycle, someone complains about how the newest video card isn't worth its salt because it cannot play XYZ game at full details.

GeForce3 days, it was probably either Quake 3 or Unreal or something. "It can't play Q3 at 1600x1200x32 at 4xAA and 8xAF trilinear... This card SUCKS!" Oh really, did it?

GeForce4 days, I think it was like Unreal Tournament, Counter-Strike, Half-Life, that sort of thing, right? I dunno, but I'm sure there was someone in the corner muttering "A GeForce Ti4600 can't play CS at 100fps at 1600x1200x32 at 8xAA and 8xAF trilinear. What crap!"

And then in our NV30 / R300 days, there was UT2K3 and RTCW and whatever else I'm forgetting. And again, there was someone whining: "Dude? I can't be l33t playah at 1600x1200x32 6xAA and 16xAF trilinear on UT2K3 at 100fps. WTF Sux0r!"

Gee, here we are again. Someone again is crying "OMG!!11!1!one! Doom3 isn't even out yet, and I still can't play it at 1600x1200x32 8xAA 32xAF trilinear with every setting absolutely maxxed to the ceiling! What an egregious collage of fecal material!"

No matter what the hardware costs, how advanced it is, or what features it can perform, there will always be software that can bring it to its knees. Everyone has been crying for DX9-level games to come out; now you have them, and NOW you know what kind of performance they require.

It's kinda like asking why a Ford Excursion can't get 30 mpg... Gee, we've only been building internal-combustion four-stroke motors for 80+ years, why the hell can't they get 30 mpg? What you're forgetting is, we've come a SERIOUSLY long way from where we started.

Just curious, but why doesn't anyone ask this about processors?

"Gee, the new AMD64 Opterons SUCK. They still can't complete PiFast 1M in under 30 seconds. What a pile of crap! I refuse to spend $400 on a processor if it can't do PiFast in under 30 seconds!"

What kind of failed logic is this, really?
 
The cards are worth $500; it's Doom 3 that isn't worth $50, since nothing can play it fully atm. Reverse logic, but it seems a lot more fitting.
Gaming is the only area of computing where software is ahead of hardware.
 
I agree with you somewhat, but you should perhaps proofread your post a little next time to make sure you haven't made the same point like six times. 40 fps is NOT impressive, I'll admit. These games were made on the old high-end hardware from ages past. How long has Doom 3 been in development now? Years? It should be able to run on a 6800 Ultra a lot better, since it was being made on GeForce 3s and the like.

I know I will not pay $500 for a card to run this game, but it's more the situation the market is in atm. It would be a really dumb thing to go buy a $500 (~$800 CDN) card that will be obsolete, like, next week. With PCI Express taking the market this fall, and new CPUs and memory that will require a new board, users who are thinking of buying these machines are going to have a $500 dead weight on their hands.

Ah well. I think a 9800 XT is going to be an optimal card for this game and others like it. I highly doubt anyone has a nice enough monitor to run these games at 1600x1200 with 70 fps; in fact, I don't believe you can buy monitors that will do that at any higher than 60 Hz... 1024x768 with aniso (maybe some AA) will run perfectly fine at 85 Hz, and I'm sure the game will be speedy on a 2.5+ GHz AMD or 3 GHz Intel.
AGP 8x for these new chips is kind of a money grab, I believe, for those who want to have the best of the best right now. I know a guy at work who is dead set on buying an $800 GeForce 6800 Ultra. I'm sooo trying to convince him against it, since these cards are going to be a lot better when PCI Express boards are released and have matured.

I don't know. I think at reasonable resolutions, these games will run fine on $300 cards (9800 XT, 5900 XT).
Well, I know for a fact I don't have a $1100 monitor <_<
 
Valk said:
I don't know. I think at reasonable resolutions, these games will run fine on $300 cards (9800 XT, 5900 XT).
That $300 would be better spent on a 6800 NU (if you're thinking inside the $300 range).
 
OK... let's point our fingers in the right direction. *IT IS NOT THE VIDEO CARDS*. If you are gonna blame someone, blame the CPUs.

All these vcards are heavily CPU-limited. The vcards are *not* to blame.
 
deathstar13 said:
The cards are worth $500; it's Doom 3 that isn't worth $50, since nothing can play it fully atm. Reverse logic, but it seems a lot more fitting.
Gaming is the only area of computing where software is ahead of hardware.


Exactly - my $450 x800PRO plays all of the games that are actually OUT @ max resolutions - so why complain about something that is not even out yet?

:D

Valk said:
[...] AGP 8x for these new chips is kind of a money grab, I believe, for those who want to have the best of the best right now. I know a guy at work who is dead set on buying an $800 GeForce 6800 Ultra. I'm sooo trying to convince him against it, since these cards are going to be a lot better when PCI Express boards are released and have matured.

Some recent benchmarks have shown PCIe vs AGP - there was like a 2-3 fps difference - the cards of today don't use the full AGP 8x bandwidth they've got.


Sentential said:
OK... let's point our fingers in the right direction. *IT IS NOT THE VIDEO CARDS*. If you are gonna blame someone, blame the CPUs.

All these vcards are heavily CPU-limited. The vcards are *not* to blame.



Not according to another review I saw - the difference between a 2.4C and a 3 GHz C was maybe 5-10 fps - that is not a huge bottleneck, considering if you own a 3.2 or 3.4 GHz CPU there should be no bottleneck, and most games are becoming more GPU-dependent - in the old days games were more CPU-intensive because vid cards did not have the power.

As has been explained - it is the developers pushing their coding as far as it can go.
 
A few counterpoints for Valk:
Valk said:
How long has Doom 3 been in development now? Years? It should be able to run on a 6800 Ultra a lot better, since it was being made on GeForce 3s and the like.
That's somewhat like saying the Unreal 3 engine should have optimal implementation on a GF6800, when clearly Gabe stated that the GF6800 will be the bare minimum technology needed to run it. Same thing goes for Carmack and the D3 engine: GeForce 3 hardware is what started this engine, but that's considered the bare minimum technology requirement. And trust me, it will run much better on a 6800U than it ever would on a GF3 :)

Valk said:
I highly doubt anyone has a nice enough monitor to run these games at 1600x1200 with 70 fps; in fact, I don't believe you can buy monitors that will do that at any higher than 60 Hz...
I have a four-year-old NEC P1150 that will do 1600x1200 at 80 Hz -- I think it cost me $250 slightly used. You can buy brand new ones for $350 that will do 1600x1200 at 85+ Hz.

Valk said:
1024x768 with aniso (maybe some AA) will run perfectly fine at 85 Hz, and I'm sure the game will be speedy on a 2.5+ GHz AMD or 3 GHz Intel.
I'm sure it will too, especially if you've got an R9600 or GF5600 or higher. I think that's perfectly playable on "reasonable" hardware, don't you? But then again, is your R9600 going to play at 1600x1200 at 4xAA and 8xAF at playable framerates? Maybe you don't care, but some of us actually do :)
Valk said:
AGP 8x for these new chips is kind of a money grab, I believe, for those who want to have the best of the best right now. I know a guy at work who is dead set on buying an $800 GeForce 6800 Ultra. I'm sooo trying to convince him against it, since these cards are going to be a lot better when PCI Express boards are released and have matured.
Someone already corrected you on the merits of PCI-E, but I'll do it again. The performance difference between AGP and PCI-E is around 2%, and not always in PCI-E's favor. And why do you suggest that AGP will simply fall off the planet in three to six months? Will everyone on this blue-green globe simply throw away their AGP motherboard? Just like everyone threw away their Slot 1 motherboards, right? And just like everyone threw away their ISA motherboards when PCI was introduced? Yeah, JUST like that, I'm sure :)

I'm sick of all this Doom and Gloom (pun intended) about how the new video cards are a waste. Just like I said earlier, why doesn't anyone make this same lame argument against processors?

I don't hear anyone crying about how slow SuperPi or PiFast is running on their brand-spanking-new $400 AMD64 processor, or their brand-spanking-new $400 Prescott... That software's been around for HOW long, and we still can't do 1M digits of pi under 30 seconds? WHAT UTTER CRAP!
 
I agree, I wouldn't pay $500 so I can still lag. But with a CPU of at least 3 GHz (I'd recommend 3.2+), a 6800 GT or X800 Pro, and 1 GB of RAM, you should be able to run it at 1024x768 with all max details. Why do you REALLY want the highest res possible? Sometimes that just seems so... bleh. Not to mention refresh rates. Most monitors support 1280x1024, but at either a 60 Hz or 85 Hz refresh rate. IMO 100+ Hz is way better on the eyes.

Just my .02
 