
Performance: X1800XT Crossfire vs. nVidia 7800GTX SLI


d94

"
We were pretty excited as we ran a head-to-head of ATI X1800XT Crossfire (5.11 Drivers) on the ATI RD580 Dual x16 to nVidia 7800GTX SLI (81.87 drivers) on the Asus A8N32-SLI Deluxe Dual x16. This is, after all, the comparison everyone would like to make if all the parts were available. Unfortunately ATI asked us not to publish specific benchmarks since RD580 and Crossfire x1800XT have not been officially released and there may be more changes before the products launch. However, we will talk about relative performance after detailing the setup and some new findings.
[Image: ATI X1800XT]
ATI was roundly criticized for the inability of X850XT Crossfire to run at resolutions above 1600x1200, even though most end-users are not actually able to run at higher resolutions with today's most common 19" and 20" flat-panel displays. We confirmed that the new compositor chips used in X1800XT Crossfire do indeed run fine in Crossfire mode at 2048x1536. ATI tells us there is no limitation in the X1800XT compositor that would prevent even higher resolutions from working as they should.
[Image: Crossfire setup]
While the exact performance results achieved comparing X1800XT Crossfire with 7800GTX SLI cannot yet be published, we can tell you we benchmarked with F.E.A.R., Quake 4, Splinter Cell - Chaos Theory, Doom 3, Far Cry, and 3DMark05 at 1600x1200 resolution with 4X AA and 8X AF enabled. ATI X1800XT Crossfire won every benchmark over nVidia 7800GTX SLI in these tests.

We also ran standard scores (1024x768) for Aquamark 3, 3DMark03, and 3DMark05. Once again, Crossfire X1800XT outperformed nVidia 7800GTX SLI in every benchmark.

We then ran all the same tests in single video mode, comparing a single X1800XT on the ATI RD580 to a single 7800GTX on the Asus A8N32-SLI. Benchmarks were run under the same conditions as Crossfire/SLI - 1600x1200/4xAA/8xAF in games and "standard" scores in the 3DMarks and Aquamark 3. Once again the ATI X1800XT on the RD580 was the winner in every benchmark. It is clear the new 5.11 drivers do make a difference in OpenGL games like Quake 4 and Doom 3. Even more exciting, the RD580/X1800XT Crossfire will be a potent graphics combination.

There is no doubt that the nVidia 7800GTX 512MB, which everyone expected would launch 2 days ago as a 7800GTX Ultra, would likely win a head-to-head performance test as single or SLI when compared with the single X1800XT or Crossfire. However, ATI clearly believes the competitor for X1800XT is the $499 7800GTX and not the $700 7800GTX 512MB. ATI was quite clear they will be introducing a "PE version" of X1800XT to compete with 7800GTX 512."
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2609&p=4
 
Oh yes... let the battle begin...

Can't wait to see the actual results when ATI finally says they're ready to compete... this has dragged on long enough...
 
So 2x 512MB X1800XTs beat 2x 256MB 7800GTXs... that's not a surprise at all... but the X1800XTs would get spanked by 512MB GTXs... Last time I checked, in the UK a 256MB GTX is a lot cheaper than an X1800XT, and a 512MB GTX is only a bit more expensive due to short supply and price gouging...
ATI releases an XT PE, nVidia releases an Ultra...
 
Vrykyl said:
So 2x 512MB X1800XTs beat 2x 256MB 7800GTXs... that's not a surprise at all... but the X1800XTs would get spanked by 512MB GTXs... Last time I checked, in the UK a 256MB GTX is a lot cheaper than an X1800XT, and a 512MB GTX is only a bit more expensive due to short supply and price gouging...
ATI releases an XT PE, nVidia releases an Ultra...

Problem is, the 7800GTX 512s are too expensive. :shrug:...

Maybe benchmark two 256MB X1800XT cards?

I guess that's the card I'll be getting next year or so...

dan
 
hahahaha ATi was mouthing off about having a card out how long ago now? I forget since it's been put off so many times.

Now they are saying that it was made to compete with a card that is two generations older than it is? LOL

Then they actually have the gall to come out and say they are gonna make a new card made to compete with the 512 card when that is what they said about this one over and over again? LMAO!!!!

I only have this to say to ATi...

STOP THE MADNESS ATi! You change your story so much, and don't deliver even then, and nobody believes you anymore. You talk MUCH $#!^, but don't back it up!
 
The 512 card was released before the x1800r520??<>()*()%$#@! (or whatever they are calling it now).

The 7800GTX 512 is one generation, the 7800GTX would be the other. :)
 
7800GTX, 7800GTX 512...They are different models in the same generation.
If they tried comparing it to a 6800GT Ultra that would be a different story.
 
3DFlyer said:
The 512 card was released before the x1800r520??<>()*()%$#@! (or whatever they are calling it now).

The 7800GTX 512 is one generation, the 7800GTX would be the other. :)

All the 7800 models are the same generation...

Two generations ago would be the dreaded FX series... or ATI's 9xxx series...
 
Though I'm an ATI fan, I still don't think it's fair comparing a 512MB X1800XT to a 256MB GTX.

And ATI boasts about the XT-PE version to beat the 512MB GTX. What will they do about the GTX Ultra? Launch an XT-PE OC or Ultra version?

I'm sorry ATI, but this is not a fair battle UNLESS you score more than 50% better than the GTX; otherwise it's not a fair game. What we need is a head-to-head comparison between the 256MB X1800XT and the 256MB 7800GTX.
 
Until we see results, let's not judge. For all we know the X1800XT in Crossfire killed SLI, but we don't have numbers; we just have someone saying that one is better than the other. Someday we'll actually know which one is better, but in the end, both will be great cards and 'top of the line'.
 
OK, bad choice of words... let's call it "models" then. Whatever you want to call it, they are backpedaling from their previous talk, and are gonna try and compare their card to an older card. They got beat again, and their next one will get beat. Why? Because nVidia has had all this technology, and all they have to do is release it after they see ATi talking about their next release, then wait, and wait, and wait on them, and then release it a month before they even get the hardware out. They're stuck in "perpetual behindness" and can't get out of it.

What they really need to do, is back pedal on the $#!^ talking and get to work, and start talking again when they think they are about 4 or 5 models ahead, and then release that. All they are doing now is giving nVidia the advantage.

I know this from being sponsored in another hobby I'm into. You don't run your mouth about stuff. You keep it shut until you know what the other guy is gonna do. nVidia is playing this game nicely, and they are winning too. See how that works?
 
3DFlyer said:
OK, bad choice of words... let's call it "models" then. Whatever you want to call it, they are backpedaling from their previous talk, and are gonna try and compare their card to an older card. They got beat again, and their next one will get beat. Why? Because nVidia has had all this technology, and all they have to do is release it after they see ATi talking about their next release, then wait, and wait, and wait on them, and then release it a month before they even get the hardware out. They're stuck in "perpetual behindness" and can't get out of it.

What they really need to do, is back pedal on the $#!^ talking and get to work, and start talking again when they think they are about 4 or 5 models ahead, and then release that. All they are doing now is giving nVidia the advantage.

I know this from being sponsored in another hobby I'm into. You don't run your mouth about stuff. You keep it shut until you know what the other guy is gonna do. nVidia is playing this game nicely, and they are winning too. See how that works?



K, Maybe I'm a little out of line...but I'm going to say it anyways :)
I've noticed quite a bit of "harsh" words coming from you on ATi's release of this generation of cards. Does it keep you up at night thinking about why it hasn't been released yet? I mean honestly... who gives a sh*t.

It's almost a guarantee to hear something negative from you about either ATi, or AMD on a daily basis and to be quite frank it's getting pretty old...

TBH the only one that needs to backpedal on the "$#!^" talking is you.

/end rant
 
Nexus Realized said:
K, Maybe I'm a little out of line...but I'm going to say it anyways :)
I've noticed quite a bit of "harsh" words coming from you on ATi's release of this generation of cards. Does it keep you up at night thinking about why it hasn't been released yet? I mean honestly... who gives a sh*t.

It's almost a guarantee to hear something negative from you about either ATi, or AMD on a daily basis and to be quite frank it's getting pretty old...

TBH the only one that needs to backpedal on the "$#!^" talking is you.

/end rant

You don't come around here much do you? If you did, you'd know that I'm watching ATi closely, and may very well be using them in crossfire on my next machine. So yes, you are definitely way off base and out of line.
 
I'm glad you are considering Crossfire, but it seems that you are very passionate about this. Instead of waiting to see the actual results that 5.11 brought for ATI, you'll just assume that they are still very behind and "stuck in 'perpetual behindness' and can't get out of it." I'm very sorry, but they aren't so far behind that they will forever be doomed to be in nVidia's shadow.

It does annoy me that they chose to compare 512MB to 256MB, but that doesn't really say anything until we get numbers. Personally I am looking forward to seeing how they really compare, with less speculation. I didn't respond to your last post because I didn't feel like I had anything good to say. But I do feel we all should sit back, just see what happens, and not start letting our mouths (or fingers) fly.

Flame away if you want, but I just don't like seeing people who have their mind set and nothing can change it. I wish the best for both ATI and nVidia and I'm looking forward to what the competition brings forth.
 
I love the bias expressed in this review, and did anyone notice how Anandtech is the only one that gets to see Crossfire early?

Second, this is all vaporware. Remember, ATI promised Crossfire in March; when did we get it? October, when they needed it to avoid getting completely killed.

Personally, my opinion is that Crossfire is a trick to convince tier-one mobo manufacturers to use ATI chipsets, which, at least right now, suck in comparison to the NF4 in everything other than the ability to use two ATI cards.
 
OK, for the flamebaiters of this thread here is a post I made, and I stick to my guns when I say something...

3DFlyer said:
http://www.anandtech.com/cpuchipset...doc.aspx?i=2569

If nVidia does not step forward and quit hindering progress for the Intel platform, they will lose a long-time user (me), and they will also lose the support of the entire Intel OC'ing community. Not supporting SLi on this new 975X chipset is a very stupid thing to do.

I will be upgrading to Conroe when it becomes available, and if they do not have SLi support by then, I will be moving to ATi and Crossfire.

and here is a direct link to that same post so we don't start this stuff about "making up" quotes...

http://www.ocforums.com/showpost.php?p=4054864&postcount=14

What I don't like is companies that blab and blab and blab, and then don't deliver. If they do, that's great, but there are folks on here who follow stuff and actually wait on things to be released, and when they (me included) find out that the stuff didn't measure up to all the hype, they have to start looking elsewhere and wait even longer.

If they ***show us*** that's great, and then they have the right to talk the stuff up all they want, and more power to them when they do that. Right now, they are a legend in their own minds, but they are not delivering what they say they will.
 
What's the issue with comparing 256MB cards to 512MB cards? Most games do not fill the 256MB frame buffer. Supposedly, Doom 3 can use 500MB "in a few scenes", and from the review at www.xbitlabs.com, COD2 shows a decent performance boost from 512MB, but other than those two instances... I just don't see why it's not fair to compare those cards.
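For what it's worth, the render targets themselves account for only a small slice of that 256MB; it's textures that eat the rest. Here's a rough back-of-the-envelope sketch (assuming RGBA8 color, a 32-bit depth/stencil buffer, 4x MSAA, and no compression; real drivers add padding and extra buffers, so take it as illustrative only):

```python
# Rough VRAM estimate for the render targets alone at 1600x1200 with 4x MSAA.
# Assumptions (not from any vendor spec): RGBA8 color (4 B/px), D24S8
# depth/stencil (4 B/px), no framebuffer compression or alignment padding.

def buffer_mb(width, height, bytes_per_pixel, samples=1):
    """Size of one buffer in MiB."""
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

W, H = 1600, 1200
color_msaa = buffer_mb(W, H, 4, samples=4)  # 4x multisampled color buffer
depth_msaa = buffer_mb(W, H, 4, samples=4)  # 4x multisampled depth/stencil
resolve    = buffer_mb(W, H, 4)             # resolved back buffer
front      = buffer_mb(W, H, 4)             # front buffer

total = color_msaa + depth_msaa + resolve + front
print(f"~{total:.0f} MiB for render targets")  # roughly 73 MiB
```

Even under these generous assumptions, the benchmark settings quoted above leave well over 150MB of a 256MB card free for textures and geometry, which is why only texture-heavy titles like Doom 3 and COD2 show a real gain from 512MB.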
 