
CrossFireX (3870x2, times two) - Prelim Benches!

ALLRIGHT~! CUT THE CRAP SLINGING AND NAME CALLING RIGHT NOW!

Next flame bait peep out of you and you get a 3 day vacation.

There are 2 other threads here where you have flame baited members.

krag, OCF Moderator

LoL, nice reply ;)

I already had a chat with two other mods about the other threads, both of whom conveniently stopped responding to me after I pointed out several things that they seemingly got squeamish about answering. PM me if you too want to have any sort of legitimate conversation on this topic; perhaps you can pick up where the other two stopped.

Now, back to the actual reason for this post:

I've yet to see any other benchmarks, and I'd like to assume this isn't the ONLY website to get ahold of these drivers. So that leaves me with more questions than answers...

Important things like timeframe - do we have any general idea when these are supposed to be available to the public? Two months? Four months? Six months? Twelve months? Only after NVIDIA already has something similar for their GX2? (Well wait, I believe they already DO support tri-SLI in their current driver set, so they can't be that far off...)

Important things like further game support - I couldn't care less about 3DMark06 optimization, although it looks like low-hanging fruit to me (these benches show NO scaling in SM2/3 and like 35% scaling in the HDR test). But I'd love to see some heavy-hitter benches, like Crysis at full tilt, or some CoD, or what's the new one about the Russians coming after us in a nuclear WW3? These are the games that are dragging our video cards into the dirt; I wanna see these at 2048x1536 with 4xAA :drool:

I'd also like to see some power draw numbers; I assume the worst under load, but what about at idle? And then does the scaling continue nicely when overclocking? And what about those games that suck with current crossfire? While few and quite far between, I can only imagine it being FAR worse with four GPU's rather than "only two".

I wouldn't mind seeing some comparos between two of these in a 16x / 4x board versus another pair in a 16x / 16x board either.
 
Looks like "one of the other mods" thought you needed a vacation...;)
 
I guess someone sent some harsh PMs, or XS is looking a bit better after all...

I'm waiting for some benchmarks, and it looks like we'll have an OCF member jumping the gun and getting two 3870X2's to inform us of how good CrossFireX, QuadFire can be.

So let's direct our attention here in the next couple days: http://www.ocforums.com/showthread.php?t=546872
 
first off, i want to say that i'm not supporting the flame wars that albuquerque may have tried to start or whatever PMs he may have sent.

but, i will say that i thought he was a very helpful contributing member of our community. i hope that his ban isn't permanent.
 

He'll be back in a couple of days or so :)
 
when dealing with 4 videocards, you're gonna be dealing with a massive amount of bottlenecking.
if a benchmark relies heavily on the CPU to do part of the work, then the videocards will only be held back by the CPU, even if that CPU is at 4.2 ghz.
I've seen people with 5.5 ghz conroe benches where they didn't even need to overclock the videocards, because the cards had so much left that the extra few hundred points wouldn't have made much of a difference.
ati designs their cards to have massive amounts of available bandwidth.

4 x 3870 cores in 3dmark2006 at 4.2 ghz isn't enough to really allow the cards full power.
a game like fear, which is heavily GPU based, will allow the cards more room to show off what kind of potential really is there.

for 3dmark2006 to really scale with the videocards, you need a hell of a lot more CPU power.

plainly put, 2 x 3870 x2's is just a hell of a lot of GPU power, but you better have a lot of CPU power to back it all up... or you're dealing with a hell of a lot of bottlenecking - the CPU holding the rest of the system back.
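the bottleneck argument above can be sketched as a toy model (all numbers hypothetical, just to illustrate the idea): your effective frame rate is capped by whichever of the CPU or the combined GPUs is slower, and extra GPUs never scale perfectly.

```python
def effective_fps(cpu_fps, gpu_fps_per_card, num_gpus, scaling_efficiency=0.8):
    """Toy bottleneck model: frame rate is capped by the slower of the CPU
    and the combined GPUs. scaling_efficiency is a made-up 80% per extra card."""
    combined_gpu_fps = gpu_fps_per_card * (1 + (num_gpus - 1) * scaling_efficiency)
    return min(cpu_fps, combined_gpu_fps)

# Hypothetical numbers: a CPU that can feed 120 fps, cards that render 50 fps each.
# Four GPUs could push ~170 fps combined, but the CPU caps the result at 120 -
# which is why the benchmark stops scaling no matter how many cards you add.
print(effective_fps(cpu_fps=120, gpu_fps_per_card=50, num_gpus=4))  # CPU-bound: 120
print(effective_fps(cpu_fps=120, gpu_fps_per_card=50, num_gpus=1))  # GPU-bound: 50
```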

What scaling?

[attached 3DMark06 screenshot: shoes_xfirex1_3dmark.jpg]
 
So after a few days of hiatus, I'm back. Yay :) And I wanted to respond to this first:
4 x 3870 cores in 3dmark2006 at 4.2 ghz isn't enough to really allow the cards full power.
Depends on your video settings though, doesn't it? I can enable 8x Wide-tent AA, adaptive supersample in quality mode, and 16xAF at a resolution of 1680x1050 and drag my CF setup into the dirt on something as "old" as Oblivion. Another video card or two would make a world of difference at those settings. Of course, I know you know this, as you specifically mentioned more video-dependent games. My point is: you can make a LOT of games far more video dependent by slapping on some high-quality AA modes.
for 3dmark2006 to really scale with the videocards, you need a hell of a lot more CPU power.
Well, for your CPU score to jump, yes. For your SM2 and HDR scores to jump, no. If you go back to that review, you'll notice that "quadfire" wasn't really even doing anything for those two score portions -- in fact, it reduced the SM2 score straight away. Obviously a driver problem, which means there's much more to be had in 3DMark06 than those scores indicate. In fact, that entire bump was driven only by the 3000-point jump in HDR score, which is actually pretty good IMO.
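To make the sub-score argument concrete, here's a quick sketch of the per-test scaling math (the sub-scores below are hypothetical stand-ins, not figures from the review - they're just picked to match the pattern described: SM2 flat-to-negative, HDR up roughly a third):

```python
def pct_scaling(single, quad):
    """Percent change in a sub-score going from one X2 to two."""
    return 100.0 * (quad - single) / single

# Hypothetical sub-scores illustrating the pattern: SM2 actually drops a bit
# under "quadfire" (driver problem), while HDR gains ~3000 points.
sm2_single, sm2_quad = 6000, 5800
hdr_single, hdr_quad = 8500, 11500

print(f"SM2: {pct_scaling(sm2_single, sm2_quad):+.1f}%")  # slightly negative
print(f"HDR: {pct_scaling(hdr_single, hdr_quad):+.1f}%")  # roughly +35%
```

With numbers like these, the entire overall-score bump is carried by the HDR test alone, which is why there's clearly more headroom left once the drivers stop hurting the SM2 score.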

I think a 4GHz E8400 would have no problem pushing a pair of 3870X2's for someone who has a nice big LCD screen and likes to play games at full AA and AF settings :) Just like I'm now LOVING all my "old" games again because of all the features I can crank up to uber-max. Yet I still have more settings that could be enabled if my GPU power could deal with it, and I've already got a pair of overclocked 3870's on a PCIe 2.0 board :)
 