
I thought the R600 was supposed to kill the 8800GTX

I wasn't expecting it to kill the 8800GTX, but I am a bit confused now. From what I heard, since R600 is made on a smaller manufacturing process the card would be cheaper, so it could be that they are comparing based on price. Or maybe there is supposed to be an XTX or something faster soon after the XT is out.

edit: I'm right, they will be releasing an XTX version; it's at the very end of the article.
 
I think the two cards are theoretically at the same price point. The HD 2900XTX is the top card (like the 8800GTX) and the HD 2900XT is the 2nd in line (like the 8800GTS). But I do think you are right, IIRC it was stated the HD 2900XT was supposed to trump the GTX, but ATM it's hard to see if it has.

I think DT needs to re-run the benchmarks at a higher resolution and then make a comparison to both the GTX and the GTS... We should probably just sit back and see how it looks once we get more data in, but I'm sure there are a lot of people who can't hold their speculation in.


EDIT: It looks like the XTX has been pushed back. Also, I think some of the first R600 cards are still 80nm, but they should be switching to 65nm partway through... I haven't heard anything definite though.
 
Well, in that article it's shown against the GTS, because that's what it's up against at its price point. The XTX is supposed to go against the GTX. The XT is supposedly nearly as fast as the GTX, though.
 
It's slightly worrying considering that nVidia has left space for an 8900-named card while ATi has gone for the top end straight off, so if nVidia do release an 8900 of sorts, will ATi have a response, or consider their cards good enough, or use aggressive pricing? There are lots of unanswered questions here. I guess ATi could use the XTX PE name, but I understood they adopted the HD moniker in its place, and they haven't used PE since the X800 (at least not to my knowledge). Then again, they have room for an X2950 if they choose to use it. Seems as though they haven't given themselves much room to maneuver though. Looks as though nVidia has learnt well from the last generation.
 
From what I've been reading even if nVidia comes out with something faster they wouldn't be able to make enough of them to make a difference.
 
Avg said:
From what I've been reading even if nVidia comes out with something faster they wouldn't be able to make enough of them to make a difference.

Link?
 
I'm not so thrilled about the XT's performance over the GTS. I thought it would be a lot faster. Didn't ATI come out last month saying that the 2900 will be the fastest DX9 card ever??? Maybe they were referring to the XTX, which I've heard is delayed till the 3rd quarter and will be a limited-quantity card, so expect to pay a lot.
 
rainless said:

It's been mentioned around the web that there will be very limited quantities, but no "concrete" evidence. The INQ had it up early last week if I'm not mistaken.
 
boonmar said:
I'm not so thrilled about the XT's performance over the GTS. I thought it would be a lot faster. Didn't ATI come out last month saying that the 2900 will be the fastest DX9 card ever??? Maybe they were referring to the XTX, which I've heard is delayed till the 3rd quarter and will be a limited-quantity card, so expect to pay a lot.

I wouldn't take that test over anything. That's a relatively low resolution, since these cards have a lot more in them than what was shown. Give it a 1600x1200 resolution and then we should see the real results. And of course these are still beta drivers as well. There's still more hope for it, but the question is how much.

Either way, it performs better than the 8800GTS in that test, but it's also nearly a direct competitor for the 8800GTX. Their test system seems screwy for such low scores though.
 
I'll just wait until there's more varied results, a few games and 3DM at low-mid resolution without any IQ effects doesn't mean much. There is one thing everyone seems to forget or gloss over about NVs 8800s though, that the stream processors are clocked much higher than the main core speed. In terms of operations/second purely on the stream processors (assuming all other things are equal, which they aren't but...) that means ATi needs twice the stream processors at half the speed. So the 'wowza 320 stream processors' needs to be considered in light of that fact - unless ATi's are clocked higher as well.
 
MadMan007 said:
I'll just wait until there's more varied results, a few games and 3DM at low-mid resolution without any IQ effects doesn't mean much. There is one thing everyone seems to forget or gloss over about NVs 8800s though, that the stream processors are clocked much higher than the main core speed. In terms of operations/second purely on the stream processors (assuming all other things are equal, which they aren't but...) that means ATi needs twice the stream processors at half the speed. So the 'wowza 320 stream processors' needs to be considered in light of that fact - unless ATi's are clocked higher as well.

Well, to that extent it's been mentioned that ATI's and nVidia's stream processors are not the same. Even if the 320 are clocked lower at 740MHz, nVidia would still need a much higher clock on their 128 to make that up, and that's assuming they were even equal in per-processor performance to begin with.
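Just to put rough numbers on it (taking the rumored 740MHz R600 shader clock and the GTX's 128 SPs at 1.35GHz, and naively assuming one op per SP per clock, which is a big if since the architectures are nothing alike):

```python
# Back-of-envelope theoretical shader throughput.
# Clock/count figures are rumored or approximate, and per-clock work
# is almost certainly NOT equal between the two architectures.
def sp_ops_per_sec(num_sps, clock_mhz):
    """Naive ops/second: one op per stream processor per clock."""
    return num_sps * clock_mhz * 1e6

r600 = sp_ops_per_sec(320, 740)    # rumored R600: 320 SPs @ 740MHz
g80  = sp_ops_per_sec(128, 1350)   # 8800GTX: 128 SPs @ 1.35GHz

print(f"R600: {r600 / 1e9:.1f} G ops/s")   # 236.8
print(f"G80:  {g80 / 1e9:.1f} G ops/s")    # 172.8
print(f"ratio: {r600 / g80:.2f}x")         # ~1.37x
```

So even on paper the raw numbers are closer than "320 vs 128" makes them sound, and real performance depends entirely on what each SP actually does per clock, plus drivers.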
 
And you know that they perform that much differently based on what, all the varied benchmarks which are out there? And yes, I know they aren't equal; there's also drivers and individual application implementation to consider, which is why I put in that part in parentheses ;) It's just something which occurred to me the other day that I figured was worth posting someplace or another, because it rarely gets mentioned, that's all. At this point none of us really know; the benchmarks up there are *meh* in terms of comparison usefulness, although the 3DM freaks will probably be all giddy.
 
deathman20 said:
I wouldn't take that test over anything. That's a relatively low resolution, since these cards have a lot more in them than what was shown. Give it a 1600x1200 resolution and then we should see the real results. And of course these are still beta drivers as well. There's still more hope for it, but the question is how much.

Either way, it performs better than the 8800GTS in that test, but it's also nearly a direct competitor for the 8800GTX.

That's a bit of a stretch. It BARELY outperforms the 8800GTS, which means it is not even close to a direct competitor to the 8800GTX.

Avg said:

And here's the link! Thanks AVG :beer:

EDIT: Not the link I was expecting, but whatever... I like links :)
 
Everyone seems to forget that the R600 core is designed closer to the specifications of the DX10 standard set out by M$. nVidia went out and did their own thing, which works for both DX10 and DX9 using unified shaders and such. From what I know, that means the R600 has more driver overhead when running DX9, so the fact it performs as well as the 8800s is pretty amazing, considering the DX9 API has to be converted to DX10 before rendering, which doesn't need to happen quite so much with nVidia cards.

What does all this mean? Well, it means that theoretically (and only that at this stage) the R600 should whip the pants off the 8800s in DX10 mode...

but only time will tell. Anyway, if I'm wrong about what I've said (and it's likely I am), I'd like someone to set it straight, for my benefit as much as everyone else's.
 
We all know that Crysis runs amazingly well on the 8800 series already; the GDC 07 demos were reported to be running on a single-core Intel processor and an 8800 variant on a 30" screen at a ridiculous 2560x1600 resolution. Seems as though DX10 is a giant leap forward in the performance area: increased fps and eye candy.
 
Wait till the R600 comes out. I've been hearing that the 2900XT bested the 8800GTS in many benchmarks, but I never saw enough concrete evidence to prove it. You'd figure that ATI would use the moniker 2800XT rather than 2900XT. That does not give them a lot of room to work with.
 
I'm guessing they changed it to 2900XT (from 2800XT) once they saw how late they would be coming to market. "2900XT" visually looks more like a direct competitor to an 8900GTS, and I doubt they wanted to look like they are a full generation behind (I personally view it as more like half a gen).
 