
The Official 3870X2 Reviews Thread

That is... if you consider 3-5 FPS a spanking.

True, but I consider the extra $200 for the 8800GTX a spanking. $200/5fps = $40 per additional frame per second... OUCH!

I'd sure like to know where you went to school...

Radeon HD 3870 X2: ~$450

GeForce 8800 GTX: ~$470-530 (+$20 to +$80)
GeForce 8800 GTS 512: ~$350-400 (-$100 to -$50)
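
To put a number on the "cost per frame" argument with the street prices above, here's a quick back-of-the-envelope sketch (Python). The 3-5 fps gap is the figure claimed earlier in the thread, not a measured result, so treat it as an assumption:

```python
# Cost per extra frame per second: the 8800 GTX premium over the 3870 X2.
# Prices are the approximate street prices listed above; the 3-5 fps gap
# is the figure claimed earlier in the thread, not measured data.
x2_price = 450                    # Radeon HD 3870 X2, ~USD

for gtx_price in (470, 530):      # low/high end of the 8800 GTX range
    premium = gtx_price - x2_price
    for fps_gap in (3, 5):
        print(f"GTX @ ${gtx_price}: ${premium} premium / {fps_gap} fps "
              f"= ${premium / fps_gap:.2f} per extra fps")
```

At these prices the premium works out to roughly $4-27 per extra frame per second; the $40-per-frame figure only holds if the GTX really costs $200 more.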

"Let the beatings commence!"

You fanATics really need to get a grip on reality.
 
So then, I assume you're going to have the exact same complaints about the 9800GX2?
I expect to have the same complaints about power and heat. As far as price and performance are concerned, I'll have to wait and see the numbers.

I don't disagree that two chips is a bit ungainly, but at the same time, the whole concept of a single ever-bigger, ever-faster chip is going to go the same way that CPUs have. In other words, at a certain level of complexity, it will be more efficient (in terms of power, production rate, time and cost) to have multiple smaller cores than a single larger core.

We may not be at the end of the single-GPU era, but it is much closer now than it was five years ago. Just five years ago, multi-core CPUs were essentially non-existent in the consumer space. Keep in mind that multi-GPU platforms have been in the hands of big business for many years, just as multi-core CPU platforms were. As the costs continue to make more sense, those multi-GPU arrangements will indeed make their way into the consumer platform.

Just like right now: would you consider someone with a single-core CPU cutting edge? Would you consider someone with a dual-core CPU wasteful and deserving of an ***-kicking? Think about it.
I don't have a problem with multi-core GPUs or CPUs. But I find it laughable for one to argue that ATi has caught up with NVidia in terms of performance because they strapped two of their flagship GPUs onto one card and it almost equals a single NVidia's flagship GPU. I will, though, certainly admit that a single card solution is much more convenient than a xFire or SLI setup in that it allows simpler installation and removes the headaches involved in getting two cards to work in tandem. It also removes most motherboard restrictions for those that will only allow either xFire OR SLI--but not both (though, you will likely have to invest in a beefier PSU).
 
Well, I just tried to match the Anandtech GPU test on Crysis...

On my current rig at my daily-driver settings I get 33 fps at 1680x1050... so this card is not as promising as I had hoped (the X2).
 
I expect to have the same complaints about power and heat. As far as price and performance are concerned, I'll have to wait and see the numbers.
Why on earth did you pick that color to reply in? In any case, I wholeheartedly agree. Glad we see eye to eye on this one :)

I don't have a problem with multi-core GPUs or CPUs. But I find it laughable for one to argue that ATi has caught up with NVidia in terms of performance because they strapped two of their flagship GPUs onto one card and it almost equals a single NVidia's flagship GPU. I will, though, certainly admit that a single card solution is much more convenient than a xFire or SLI setup in that it allows simpler installation and removes the headaches involved in getting two cards to work in tandem. It also removes most motherboard restrictions for those that will only allow either xFire OR SLI--but not both (though, you will likely have to invest in a beefier PSU).
I don't necessarily disagree with any of this, but at the same time, performance is what it is. Not everyone likes NVIDIA hardware; you may find this hard to believe, but my previous video card was a 7900GT-on-AGP card from Gainward :) Before that, I had an R350 (9800Pro 256MB) and even earlier, an R300 (9500np 128MB softmodded). If we continue going backwards, previous models were a GF3Ti200, a Voodoo2 SLI and a Riva 128.

The only reason I mention ALL of that is this: I have used a considerable amount of NV hardware over my years of toying in 3D, and I have used at least a few ATI cards also. The ATI cards were my personal favorites, for reasons that are likely only relevant to me. That is to say, my opinions don't necessarily dictate the opinions of everyone else.

I moved to the 7900 because I wanted more performance, and the 1900 series just didn't do much for me. But after using that card for two years, I wanted -- no, I needed an ATi video card again -- for all the same opinionated reasons I had before.

This is what ATI is giving those who want their hardware, and I'm not about to tell them to stuff it. The performance of my pair of 3870's is excellent, the video quality is precisely what I expect and want, and I don't have to fidget with NVIDIA's control panel or drivers.

These things may not relate to you personally, but they do to me. So while you might label people as fanATIcs (and you know the NV version of this, don't you? ;) ) it's likely because we really don't want NVIDIA's hardware. Maybe if NVIDIA had picked up the ball, they'd have something better to come back with too rather than a rehashed pair of their flagship processors strapped to a single PCB?
 
Why on earth did you pick that color to reply in? In any case, I wholeheartedly agree. Glad we see eye to eye on this one :)
Fixed. (You like cyan better?)
I don't necessarily disagree with any of this, but at the same time, performance is what it is. Not everyone likes NVIDIA hardware; you may find this hard to believe, but my previous video card was a 7900GT-on-AGP card from Gainward :) Before that, I had an R350 (9800Pro 256MB) and even earlier, an R300 (9500np 128MB softmodded). If we continue going backwards, previous models were a GF3Ti200, a Voodoo2 SLI and a Riva 128.

The only reason I mention ALL of that is this: I have used a considerable amount of NV hardware over my years of toying in 3D, and I have used at least a few ATI cards also. The ATI cards were my personal favorites, for reasons that are likely only relevant to me. That is to say, my opinions don't necessarily dictate the opinions of everyone else.

I moved to the 7900 because I wanted more performance, and the 1900 series just didn't do much for me. But after using that card for two years, I wanted -- no, I needed an ATi video card again -- for all the same opinionated reasons I had before.

This is what ATI is giving those who want their hardware, and I'm not about to tell them to stuff it. The performance of my pair of 3870's is excellent, the video quality is precisely what I expect and want, and I don't have to fidget with NVIDIA's control panel or drivers.

These things may not relate to you personally, but they do to me. So while you might label people as fanATIcs (and you know the NV version of this, don't you? ;) ) it's likely because we really don't want NVIDIA's hardware. Maybe if NVIDIA had picked up the ball, they'd have something better to come back with too rather than a rehashed pair of their flagship processors strapped to a single PCB?
I have to admit, I do so love to poke (good-natured) fun at both fanATIcs AND NVIDIOTS. These people can't stand to believe that they might not have made the best decision or have the best card. My job is to inject a few facts into their heads and see if they will explode. Let it be known that I never make fun of anyone's purchases--only their logic/conclusions. I love it that we live in a capitalist country where people can spend their money however they like.

Like you, I have owned my share of both NVidia (which used to be nVidia) and ATi products through the years. I was around when SLI meant "Scan Line Interleave" (RIP 3dfx). I purchased whatever made the most sense at the time. Historically, NVidia has offered better performance while ATi has offered better image quality. Comparing xFire vs SLI, ATi GPUs seem to scale better for now. At this point in time, however, NVidia has such a performance edge that I can turn on all the AA, filtering, HDR (and whatever other eye candy is offered) and still get ridiculously high framerates for the games I play at 1920 x 1200 (though I haven't yet picked up Crysis...). Furthermore, I can do this with a single-slot, non-flagship GPU for relatively low cost compared to other GPUs.

For me, this is a watershed era, as this is the first time that I have actually spent more on a GPU than on the CPU in my rig. Even though I managed to get an EVGA 8800GT SC for $208 from Dell, this is still more than I paid for my E6750 (Fry's deal: E6750 + basic ECS mobo for $168; sold the mobo for $35 = ~$133 for the CPU).

All that being said, it must be known that I want both ATi and NVidia to succeed--just as I want both Intel and AMD to succeed. Competition drives innovation and lowers prices. Right now, it seems that, like AMD, ATi has some catching up to do--and I hope they do it quickly. It looks like NVidia is pulling an Intel by delaying product introductions and inflating prices because they don't respect their competitors' product lines at the moment.
 
Well, I just tried to match the Anandtech GPU test on Crysis...

On my current rig at my daily-driver settings I get 33 fps at 1680x1050... so this card is not as promising as I had hoped (the X2).

Doesn't Anandtech use Vista and a quad core?
 
The 3870X2 has a faster GPU than the 3870, but GDDR3 RAM instead of GDDR4. Sort of evens it out. I really wish ATI had used GDDR4 memory in the X2. It's rather puzzling why they didn't. Guess it was a cost thing.
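
For a rough sense of what the GDDR3-for-GDDR4 swap costs in raw bandwidth, here's a quick sketch. The clocks below (2250 MHz effective GDDR4 on the 3870, 1800 MHz effective GDDR3 on the X2, 256-bit bus per GPU) are the commonly cited figures and are assumptions for illustration, not spec-sheet quotes:

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
# Clock figures are the commonly cited ones for these cards; treat them
# as assumptions for illustration, not verified spec-sheet values.
def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    """Peak bandwidth in GB/s given bus width (bits) and effective clock (MHz)."""
    return (bus_bits / 8) * effective_mhz / 1000

print(f"HD 3870 (GDDR4):     {bandwidth_gb_s(256, 2250):.1f} GB/s")  # ~72.0
print(f"3870 X2 (GDDR3/GPU): {bandwidth_gb_s(256, 1800):.1f} GB/s")  # ~57.6
```

So each X2 GPU gives up roughly 20% of peak memory bandwidth versus a GDDR4 3870, which lines up with the "sort of evens it out" observation: faster core, slower memory.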
 
The 3870X2 has a faster GPU than the 3870, but GDDR3 RAM instead of GDDR4. Sort of evens it out. I really wish ATI had used GDDR4 memory in the X2. It's rather puzzling why they didn't. Guess it was a cost thing.
It was exactly that: 1 GB of GDDR4 would cost a lot more, and you need a lot of chips for that card.

I'm hoping someone will roll out an improved 3870X2 soon with GDDR4 (and maybe 2 GB); then I'll get one.
 
If they were in the $300 range they would sell like hotcakes, but alas, they are just an expensive small step up from the G92s.
 
Volt mods are now figured out for the 3870X2.

http://www.quantum-force.net/tutorials/T000000006/

The GDDR4 memory of the regular 3870 is clocked faster, but I believe the timings are not as tight compared to the GDDR3 of the 3870X2. At least that seems to be the opinion of some over at the XS forum.

The benchmarkers are breaking records with the 3870X2, so I'm not sure two 3870 cards in CrossFire would be a much better option. It's probably six of one, half a dozen of the other.
 
Given equal modding treatment, I'd still think a pair of 3870s will outperform a 3870X2. But there's still the motherboard interface to consider, and of course all the "randomness" that makes overclocking a bit of a hair-puller. With individual 3870 cards, you might find one that's less overclockable even at stock, so you sell it and buy another (and another, and another, ...) until you get the one that works best. On an X2, if only one core is a little grouchy and the second core is awesome, you've got to sell both to swap out. :(

However, I'd also have to say that a pair of 3870X2s modded like the above would very likely do FAR better than any sort of quad-CrossFire on individual 3870 cards. I'm sure we'll see some UBER-benches of that kind of thing shortly.

I also have a deep-down desire to see someone get a quad-CrossFire motherboard (PCIe 8x lanes times four) and throw two 3870s and two 3870X2s into the mix. You ask why? Because even though each 3870X2 has a single CF connector, each 3870 card has two. So you put the 3870s in the middle two slots and the 3870X2s in the outside slots. Link each 3870 with its neighboring 3870X2, and then in the middle use the second connector on the 3870s to link them both together.

Eeeeeevil :beer:
 
If it takes two of ATI's GPUs to equal one NV GPU, IMO that is still a spanking--even if we're talking only in terms of performance. This is not even taking into account the power and heat issues with 2x the GPUs.

I have similar issues with ATI fans who like to compare an O/Ced 3870 to a non-O/Ced 8800.
That'd be similar to the example of a 4-cylinder car that comes close to beating a V8: "Well, I have four fewer cylinders, so technically I still win, because it was close." No, you don't. At the end of the day it's the V8 that finished first, and that's all that really matters.

It was a good idea for AMD because the R&D is much less than for a new GPU, they have a competitive product, and it's not a thrown-together hunk of ****; it's a dual-GPU card done right.

At the end of the day, for the next few weeks, it's hands down the fastest thing regardless of how they did it. Run what ya brung.


Yes, I understand AMD/ATi really needs to get their **** together to actually be competitive with NVIDIA's next series of GPUs. I understand that fully. I'm just pointing out that AMD found a nice, cost-effective way of taking the crown for a short time, and it shouldn't be so quickly denounced because it's a dual-GPU card. It's done right and effective for the 95% of the population that doesn't have a CrossFire mobo. The performance of this card is only going to increase more and more with each optimized CrossFire driver release. Not to mention it also gets AMD to really optimize CrossFire drivers, which will only help in the long run when SLI and CrossFire become more of a reality in the mainstream.
 
That'd be similar to the example of a 4-cylinder car that comes close to beating a V8: "Well, I have four fewer cylinders, so technically I still win, because it was close." No, you don't. At the end of the day it's the V8 that finished first, and that's all that really matters.

It was a good idea for AMD because the R&D is much less than for a new GPU, they have a competitive product, and it's not a thrown-together hunk of ****; it's a dual-GPU card done right.

At the end of the day, for the next few weeks, it's hands down the fastest thing regardless of how they did it. Run what ya brung.

Ah, I see another fanboi got his feathers ruffled.

Your car analogy would be fine--except for the fact that ATi's 8-cylinder LOST to NVidia's 4-cylinder. Also, in case you missed it, they never compared the X2 to the ULTRA, just the GTX.

spanked++;
 
I got the X2. It's close enough to the Ultra for me and like $200+ less. I gladly spent that $200 on an extra 2 gigs of RAM (especially since I went Vista).
 
Ah, I see another fanboi got his feathers ruffled.

Your car analogy would be fine--except for the fact that ATi's 8-cylinder LOST to NVidia's 4-cylinder. Also, in case you missed it, they never compared the X2 to the ULTRA, just the GTX.

spanked++;

My feathers aren't ruffled; I don't own either card and won't be purchasing an ATi or NVIDIA card soon, as my PC is garbage at the moment and I am currently putting money into my car. Twin-disk clutch > PC for me.

I just like how ATi went about it and don't feel it should be so quickly denounced; I simply gave my opinions. Do as you wish; it's not going to ruin my day.

It also would seem you are the biased one, as you got so defensive over my post, which was never really offensive. That, and the fact that I see a lot of benchmarks that include both the GTX and the ULTRA--where have you been reading?

http://www.hexus.net/content/item.php?item=11520&page=5
http://www.anandtech.com/video/showdoc.aspx?i=3209&p=7
http://techreport.com/articles.x/13967/7
http://www.fpslabs.com/reviews/video/amd-radeon-hd-3870-x2-review/page-6
http://www.tweaktown.com/reviews/1279/8/page_8_benchmarks_f_e_a_r/index.html

And don't forget, a lot of the reason it's losing is the drivers. This isn't a single GPU that will gain a few fps from tweaks; it's a CrossFire card, and CrossFire has a ways to go in really getting the drivers to where they need to be. With that said, the same goes for SLI, and we should see good gains in driver development for the 9800GX2 also. Hence the reason I like this card and the future 9800GX2: they will really force NVIDIA and ATi to get on with developing CrossFire/SLI drivers to where they should be, instead of the afterthought they have been.
 