Crossfire?

batboy

Senior Moment
Joined
Jan 12, 2001
Location
Kansas, USA
Always wanted to try crossfire, just something cool about a pair of big vid cards, even though they really aren't all that practical. I got a chance to buy another RX 580 just like the one I have (it was on sale at the egg). I know the GTX 1080Ti still whips me, but at least I can slap a 1070 around now and match or beat the plain vanilla 1080 (sometimes, depending on the bench). Sadly, I'll probably whoop Vega's butt on many benchies too. My other choices were to grin and bear it or buy a 1080Ti (if you can't beat 'em, join 'em). So, I have $200 less invested in this setup than if I had bought the 1080Ti (if you can find them).

crossfire and new fans.jpg
 
Oh, I wouldn't beat yourself up over it; 580s are great cards, especially if you got a good deal. The Geforce 10 series is wildly overrated.
 
Overrated how?

@batboy have fun benching tonight

Folks make them out to be the best card NVidia has ever made, and they're really not. All they've done is strip out its compute ability and turn it into a strictly gaming processor. If you try doing any kind of compute or any sort of application beyond gaming, like VM deployment, they really show their weaknesses. This isn't like the FX versus 9xxx days where it's a total blowout, or even Sandybridge vs Piledriver bad.
 
1060s and 1070s seemed to be as rare as AMD cards during the last mining craze. I don't think they're all that bad. And for my needs, there isn't anything my 1070 HOF can't handle at 1080p@60Hz.
 
:) The second card arrived yesterday. I got maybe 3 hours of sleep last night. Yippee!

Here is just one example of how much CrossfireX improved scores for the 3Dmark Time Spy benchmark. Official personal best for a single RX 580 was 4818 points compared to 8411 points laid down by the 2x RX 580 in crossfire.
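Those two scores work out to nearly ideal scaling; a quick sanity check on the numbers above:

```python
# CrossFire scaling check using the Time Spy scores quoted above
single_card = 4818  # one RX 580
crossfire = 8411    # two RX 580s in CrossFire

scaling = crossfire / single_card
gain = (crossfire - single_card) / single_card  # benefit from the second card

print(f"overall scaling: {scaling:.2f}x")    # overall scaling: 1.75x
print(f"gain from second card: {gain:.0%}")  # gain from second card: 75%
```

Roughly 75% extra from the second card is about as good as alternate-frame rendering gets in this benchmark.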

Sent, you're right about the RX 580s being more well-rounded cards rather than only for gamers. That's definitely one big factor that appeals to me.

I like how Tir thinks! This setup will hold me over until Volta arrives probably sometime next year.
 
That's pretty nice - I can't even get 8400 with two Vega 56s in TimeSpy! Granted, it was much cheaper to buy two video cards than build a whole new system - the Ryzen 1700s and up all do very well - but that's another $500 minimum for board + DDR4 RAM.
 
Alaric, it depends on the program/app/game. Obviously, I picked one of the more dramatic examples to post.

Like I said, I've never done crossfire before. I heard about lots of problems early on, so I stayed away. It was painless to add another graphics card to my system. These newer generation Radeon cards don't use a bridge link like before; it's basically plug and play. If you decide to try Crossfire, I suggest you do the following:

First, make sure your motherboard supports two PCIe 16X cards. I have 3 PCIe slots in my motherboard. The top two slots are the full 16X lanes, but the bottom PCIe slot is "only" 8X. Sometimes only the top slot is designated as 16X and the others are 8X.

Next make sure your drivers are up to date. If not, install them now.

Shut down, install second card, reboot.

I had to reinstall the display drivers once crossfire was enabled, for some reason.

That's it, tada, Crossfire.
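If you want to confirm the negotiated link width after installing the second card, `lspci -vv` shows it on Linux (on Windows, GPU-Z's "Bus Interface" field reports the same thing). A minimal parser sketch - note the sample output below is illustrative, not from this system:

```python
import re

def link_widths(lspci_vv: str) -> list:
    """Pull the negotiated link width (LnkSta line) for each device."""
    return re.findall(r"LnkSta:.*?Width (x\d+)", lspci_vv)

# Illustrative sample of what two cards running x8/x8 would report
sample = """\
01:00.0 VGA compatible controller: Advanced Micro Devices [AMD/ATI] Ellesmere
\tLnkSta:\tSpeed 8GT/s, Width x8
02:00.0 VGA compatible controller: Advanced Micro Devices [AMD/ATI] Ellesmere
\tLnkSta:\tSpeed 8GT/s, Width x8
"""
print(link_widths(sample))  # ['x8', 'x8']
```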

- - - Updated - - -

That's pretty nice- I can't even get 8400 with two Vega 56s in TimeSpy!

That makes me feel all tingly inside.
 
I definitely have 2x16 PCI-e lanes - my CPU is just a little slow. It's an FX8320E, not clocked super fast like the FX8350, simply because I don't use it for gaming most of the time, just demos and casual benching. My board's specs: https://www.newegg.com/Product/Product.aspx?Item=N82E16813128514

Using an EVGA BQ-850 PSU and the latest 17.9.3 drivers that support mGPU (the new Crossfire term for the Vega generation).
 

Yes, my score was around 8036.
https://www.3dmark.com/spy/2452560
Graphics Test 1 83.52 fps
Graphics Test 2 65.61 fps
CPU Test 9.35 fps
Though if you look at your results, your CPU score was much better, while my GPU score had better frames. Games don't always use that physics realism and seem to benefit more from raw graphics processing when there is no CPU bottleneck. Your CPU will be able to support faster graphics cards than mine, but I have extended the life of my system for a few more years.
https://www.3dmark.com/spy/2454941
Graphics Test 1 59.53 fps
Graphics Test 2 49.04 fps
CPU Test 22.43 fps

For the price though, two 580s now aren't a bad deal - the mining craze drove prices higher earlier, but they have settled a bit. I was willing to spend more on Vega since I don't plan on upgrading again for a while.
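Lining the two runs up side by side (fps as posted above) makes the trade-off explicit - the Vega pair wins on graphics, the 7740X rig wins big on the CPU test:

```python
# fps per Time Spy test, taken from the two results linked above
vega56_fx = {"GT1": 83.52, "GT2": 65.61, "CPU": 9.35}    # 2x Vega 56 on the FX rig
rx580_7740x = {"GT1": 59.53, "GT2": 49.04, "CPU": 22.43} # 2x RX 580 on the 7740X rig

for test in vega56_fx:
    a, b = vega56_fx[test], rx580_7740x[test]
    winner = "Vega rig" if a > b else "RX 580 rig"
    print(f"{test}: {winner} leads, {max(a, b) / min(a, b):.2f}x")
```

The CPU test gap (about 2.4x in the 7740X's favor) is why the lower graphics fps still produced the higher overall score.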
 
Yea, the 580 was a solid card for sure; all Vega needs is a die shrink and it'll be fine.


That's pretty nice - I can't even get 8400 with two Vega 56s in TimeSpy! Granted, it was much cheaper to buy two video cards than build a whole new system - the Ryzen 1700s and up all do very well - but that's another $500 minimum for board + DDR4 RAM.

Really? I mean, unless you're really CPU bound, here's mine for comparison (6126):
https://www.3dmark.com/spy/2368391

EDIT:
Yea, you're REAAAAAAALLLY processor bound; when a laptop on a dirty run gets a 36% faster CPU score, you've got a problem.
 
If I didn't already have the RX 580, I might have tried a pair of Vega 56 cards. But I'm about maxed out on my power supply now, so I couldn't do crossfire with anything that uses more power (the Vega 56 does use more than my RX 580).
 
Exactly, the CPU affects the general score more than actual graphics frames per second. I was hoping to score at least 9000 (you get an award for that, which I achieved with Firestrike), but since I did some research before buying, I realized the GTX 1070 and 1080 still gave a huge boost to FX processors, similar to what Vega could do for my system coming from a Radeon 7950. So while I lost a few frames to a minimal CPU bottleneck, I still gained 3-4x the frames thanks to the underutilized 2x16 PCI-e lanes on my AM3 motherboard. I just bought a bigger power supply, upgrading from my 620-watt Seasonic, to handle the higher power draw of both Vega 56s; Vega 64 was out of the question for me.
 
If I didn't already have the RX 580, I might have tried a pair of Vega 56 cards. But I'm about maxed out on my power supply now, so I couldn't do crossfire with anything that uses more power (the Vega 56 does use more than my RX 580).

Not much more to be honest, maybe 100-200W more realistically speaking
 
I've discovered something that I should have researched before now. I mentioned I have 3 PCIe slots--two are 16X and one is 8X. I was poking around in the BIOS and saw this:

PCIEX16_1
Card: AMD GPU
Type: running at X8 native

PCIEX16_2
Card: AMD GPU
Type: running at X8 native

I studied my MB manual, and for the X299 platform the CPU has to have 44 lanes to run two GPUs at X16/X16. In other words, you need an i9. You can also run a third card at X8.

For the CPUs with 28 lanes (i7 with at least 6 cores), it'll do X16 in slot 1 and X8 in slot 2. Slot 3 is disabled.

For the 16 lane CPUs like my 7740X, unfortunately it only allows X8 for slot 1 and X8 for slot 2. Slot 3 is disabled. A single graphics card in slot 1 will run at X16.

Intel really made the 7740X a red-headed stepchild. It won't do quad channel memory either. Perhaps it's time for a 7920X?

I'm puzzled why I still do so much better in the 3D benchmarks. Maybe slot 1 doesn't get to use all of the theoretical 16 lanes when it comes down to real life? My M.2 drives might be stealing some bandwidth?
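The manual's allocation rules boil down to a small lookup. A sketch of just the rules described above (width per populated GPU slot; this is this board's behavior, not a general X299 reference):

```python
def x299_slot_widths(cpu_lanes: int, cards: int) -> list:
    """PCIe width per GPU slot, per the board manual's rules above."""
    if cpu_lanes >= 44:  # i9: x16/x16, optional third card at x8
        return [16, 16, 8][:cards]
    if cpu_lanes >= 28:  # 6+ core i7: x16/x8, slot 3 disabled
        return [16, 8][:min(cards, 2)]
    # 16-lane Kaby Lake-X (7740X): single card gets x16, two cards split x8/x8
    return [16] if cards == 1 else [8, 8]

print(x299_slot_widths(16, 2))  # [8, 8]  -- the 7740X situation above
print(x299_slot_widths(44, 2))  # [16, 16]
```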
 

Yep, the AMD AM3 chipset allows 40 lanes of PCI-e 2.0, so that's enough for my 2x16 Vega 56 setup.
 
Yep, you're a smart fellow, Neb. Maybe I need to "look before I leap," as my mom used to tell me.

Well, I've been saying all along that this Kaby Lake-X was just to hold me over until I got a Coffeelake or an i9. Now it looks like no Coffeelake-X for at least a year, or I can get a 7920X right now. Quad channel memory and X16/X16 Crossfire, here I come. Although, I'll surely miss the high clock speeds of this 7740X.

Intel 16 lane CPU with X8/X8 still whoops AMD 40 lane CPU with X16/X16, interesting.
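It's less surprising when you compare per-slot bandwidth: PCIe 3.0 runs 8 GT/s with efficient 128b/130b encoding, while PCIe 2.0 runs 5 GT/s with 8b/10b encoding, so a 3.0 x8 slot delivers almost the same bandwidth as a 2.0 x16 slot:

```python
# Approximate usable bandwidth per lane, in GB/s
pcie2_lane = 5e9 * (8 / 10) / 8 / 1e9     # 5 GT/s, 8b/10b   -> 0.5 GB/s
pcie3_lane = 8e9 * (128 / 130) / 8 / 1e9  # 8 GT/s, 128b/130b -> ~0.985 GB/s

print(f"PCIe 2.0 x16: {pcie2_lane * 16:.1f} GB/s")  # PCIe 2.0 x16: 8.0 GB/s
print(f"PCIe 3.0 x8:  {pcie3_lane * 8:.1f} GB/s")   # PCIe 3.0 x8:  7.9 GB/s
```

So the Intel x8/x8 setup isn't really lane-starved relative to the AM3 board's x16/x16; the CPU difference dominates.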

EDIT: I got to thinking, WWND (What Would Nebulous Do)? Why, he'd wait until after the Oct. 5th release of the Coffeelake processors to see what might shake down. Maybe there will be some price changes when they upgrade the line-up? Ok, waiting a week sounds like a plan. Thanks, Neb!
 