Some of you may have heard of Lucid Hydra and its implementation on motherboards to allow GPUs from AMD and NVIDIA to play nicely together. We have been testing a unique take on this from HIS and Lucid: HIS decided to try putting one of Lucid's chips on an HD6970 to allow this AMD and NVIDIA intermingling, and they call this feature MIX.
AMD Eyefinity is enabled via HDMI without using expensive DisplayPort monitors or active adapters, and can also be enabled via an HDMI to DVI adapter. With the HIS 6970 IceQ MIX, you can easily set up Eyefinity with any three of its outputs, or all five, for multi-display gaming.
MIX allows running both an AMD and an NVIDIA GPU in the same system to enjoy the graphics features and performance benefits of both.
Thanks to HIS, we have their 6970 IceQ Mix to test out this new and unique feature. Below are a couple of pictures of the GPU; one out-of-the-box, and one with the heatsink removed so we can see the Lucid chip on the PCB beside the GPU die.
The following information is also available at Lucidlogix.
HydraLogix 200 Series
Lucid’s LT22102 is a system on a chip (SoC) device providing an affordable generic multi GPU solution for motherboards as well as add-in boards. The LT22102 is a unique PCIe 2.0 device designed to scale graphic performance in a multi GPU environment.
The LT22102 has 48x 5 Gbps Serializer/Deserializer (SERDES) lanes, a full PCI-Express switch with one x16 upstream port and two x16 downstream ports, and an embedded end point. The upstream port is typically connected to the system’s chipset north bridge or directly to the CPU and the two downstream ports are connected to up to two GPUs from any vendor. The embedded end point incorporates a reduced instruction set computing (RISC) processor and direct memory access (DMA) engines used by Lucid’s proprietary algorithms. Together with the unique HydraLogix driver, the LT22102 provides an adaptive and dynamic, parallel graphics load-balancing scheme, which optimizes the usage of GPU resources with minimal power consumption.
HydraLogix Control Panel
The driver software is really easy to figure out. There are three tabs: Main, Games, and About. On the Main tab, one can enable or disable HydraLogix by clicking on the green button in the first screenshot. The button will change from green to gray and from saying enabled to disabled when clicked. The Games tab shows a list of games (and benchmarks) supported by the current driver. Games can be added, edited, or removed by clicking the buttons near the bottom of the window. To add games or benchmarks, one has to tell the software what .exe will be used by browsing to its location, clicking on it, giving it a name, clicking OK, then clicking Apply.
- Intel i7-2600K CPU @ 3.4 GHz
- eVGA P67 FTW Motherboard
- 2 x 2 GB Corsair Dominator GT RAM @ DDR3-1600 8-8-8-24 and
- 2 x 2 GB Kingston HyperX RAM @ DDR3-1600 8-8-8-24
- HIS 6970 IceQ Mix @ 880/2750 MHz
- Asus Matrix GTX 580 @ 772/2004 MHz
- MSI GTX 580 Lightning @ 772/2004 MHz
- 50 GB OCZ Vertex 2 SSD
- SeaSonic X-750 Power Supply
- Windows 7 Pro x64 SP1
- SLI Drivers: NVIDIA 280.26
- MIX Drivers: AMD 11.3, NVIDIA 266.58, and Lucid HydraLogix 1.7.105
- Kill-a-Watt Meter to measure power consumption
You may notice that I have run quite a few more game benchmarks than are usually run for our GPU reviews. This is because I wanted to use as many games as I could that were already set up in the HydraLogix drivers with "official" support. If one of the games or benchmarks was not already set up, then I added the .exe for that game to the drivers as outlined in the HydraLogix Control Panel section, which would be "unofficial". To differentiate between the two, the tests with official support are also shown in their own graph.
Game benchmarks were typically run at a screen resolution of 1920×1080 with 8x antialiasing (AA) and other settings at maximum. There are a few exceptions: Alien vs Predator was run at default settings; Lost Planet 2 and Diablo III beta were maxed as much as allowed. Since Diablo III beta doesn’t have a benchmark, it was tested by making a King Leoric run while using MSI Afterburner’s logging feature to record frames per second at one second intervals. After the run was complete, all of the logged frames per second were averaged for the final result.
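The averaging step is simple, but for clarity, here is a minimal sketch of how the logged Diablo III values reduce to a single result. The sample values are placeholders, not actual logged data; in practice they would be parsed out of MSI Afterburner's log file:

```python
# Average the per-second FPS samples logged during a Diablo III run.
# The readings below are hypothetical placeholders for illustration,
# not measured results from this review.
def average_fps(samples):
    """Return the mean of a list of per-second FPS readings."""
    return sum(samples) / len(samples)

# Five hypothetical one-second readings from a King Leoric run:
run = [97.0, 103.0, 99.0, 101.0, 100.0]
print(f"Average FPS: {average_fps(run):.1f}")  # -> Average FPS: 100.0
```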
I realize that testing games with a play-through isn't consistent because each play-through will be different. Because of this, I did five runs on Diablo III to see how much variation there was between the runs. The difference between the lowest and highest performing runs was 3-4 frames per second, which works out to an approximately 3-4% difference since the average frames per second (FPS) was around 100. I think the Diablo III results are a pretty good estimate of performance. Results are displayed in the graphs as performance relative to the MIX setup; this was done so that the percentage performance difference between the setups is easy to see.
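As a quick illustration of that normalization (the FPS figures here are made up for the example, not actual test results):

```python
# Convert raw average-FPS results into performance relative to the MIX
# setup, which is pinned at 100%. The input numbers are placeholders.
def relative_to_mix(results):
    baseline = results["MIX"]
    return {setup: round(fps / baseline * 100, 1) for setup, fps in results.items()}

example = {"MIX": 98.0, "SLI": 127.4}
print(relative_to_mix(example))  # MIX is always 100.0 on this scale
```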
The various 3DMark software packages were run at default "performance" (Pxxxx) settings. Unigine Heaven was run with the Xtreme preset. The average frames per second result is used in comparing Heaven since a score wouldn't be shown while using MIX. The average frames per second can be found in the lower right hand corner during the Heaven bench, and the final value seen was used for the final result. Results are again displayed in the graphs as performance relative to the MIX setup, so that the percentage performance difference between the setups is easy to see.
System power consumption was determined by taking the maximum value seen on a Kill-a-Watt meter while the benchmarks were running. I mainly used system power consumption to tell whether or not both GPUs were being used in a multi-GPU setup. For example, both the GTX580 and HD6970 have a TDP around 250W, so if I see only approximately 250-300W, then I know that both GPUs were not being used much, if at all.
CrossFireX tests were done by EarthDog using our typical suite of benchmarks before I decided on additional tests for the MIX feature. Because of this, unfortunately, there aren’t CrossFireX performance results for every game. However, there are results for all the synthetic benchmarks.
System Power Consumption
I started off with power consumption; I know this seems like an odd place to start, but as I mentioned in the methodology section, system power consumption was used to tell whether or not MIX was using both GPUs. I think this is the best place to start so we will know what kind of results to expect.
The GTX580 and HD6970 have a TDP of ~250 W and the i7-2600K has a TDP of 95 W, so if everything was being stressed under 100% load, I would expect a little more than 600 W for the system power consumption. However, the tests do not stress all of the components to 100%, so what I looked for to tell whether or not the GPUs are being used was 400+ W and/or power consumption close to the SLI results.
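That rule of thumb can be written down explicitly. The TDP figures are the published ones, and the 400 W cutoff is the rough one used in this review:

```python
# Rough check used in this review: if measured system draw is well above
# what a single GPU plus CPU would pull under partial load, both GPUs are
# probably doing work. Thresholds are the approximate ones from the text.
GPU_TDP = 250             # W, approximate for both the GTX 580 and HD 6970
CPU_TDP = 95              # W, i7-2600K
DUAL_GPU_THRESHOLD = 400  # W, cutoff used in this review

def both_gpus_likely_active(measured_watts):
    """Return True if the wall draw suggests both GPUs are being used."""
    return measured_watts >= DUAL_GPU_THRESHOLD

print(both_gpus_likely_active(425))  # e.g. ~425 W under SLI -> True
print(both_gpus_likely_active(298))  # e.g. ~298 W, one GPU idle -> False
```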
Based on the power consumption alone, one can see that MIX and HydraLogix are not fully utilizing both GPUs in the following tests: Diablo III beta, Crysis, Just Cause 2, Dirt 2, Hawx 2 DX11, and Stalker.
Here is how I have deciphered the game results in my head: first, assume that MIX isn’t using both GPUs very well at all. Now, if that’s the case, then the additional performance of SLI/CFX is basically the two card scaling of SLI/CFX. For example, if MIX is at 100% and SLI is at 130%, then you know MIX is working, at least somewhat, since SLI scaling with two cards is much better than 30%. If SLI/CFX is above ~185%, then you know that MIX is hindering performance because SLI/CFX scaling isn’t typically 85% or greater.
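The reasoning above can be sketched as a simple rule. The thresholds are the rough ones from this paragraph, not exact figures:

```python
# Classify a MIX result from the SLI/CFX number on the same relative
# scale, where MIX = 100%. Thresholds follow the rough reasoning in the
# text and are approximations, not precise cutoffs.
def judge_mix(sli_relative_pct):
    if sli_relative_pct > 185:
        # SLI/CFX scaling is rarely 85%+ over a working dual-GPU setup,
        # so MIX must be running below single-card performance.
        return "MIX hindering performance"
    elif sli_relative_pct < 170:
        # SLI's lead is smaller than typical two-card scaling, so MIX
        # must be getting some benefit from the second GPU.
        return "MIX working at least somewhat"
    else:
        return "inconclusive"

print(judge_mix(238))  # Dirt 2 -> "MIX hindering performance"
print(judge_mix(130))  # -> "MIX working at least somewhat"
```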
It seems that MIX is hit or miss in games, and you just don't know what the results will be until it has been given a try. The worst games are obviously Dirt 2, Lost Planet 2, Hawx 2 DX11, and Stalker. It looks like MIX is hindering the performance in those games to less than what a single GPU would get you. This is because the difference in the percentages is greater than 100% and we know SLI/CFX scaling cannot exceed 100%, EVER.
Theoretically, in a perfect world, a single card would be 100% and two cards (SLI/CFX/MIX) would be 200%. So, when we see SLI at 238% of MIX (Dirt 2), a single card with perfect SLI scaling would sit at 238% / 2 = 119% of MIX, which means MIX is effectively running at only about 100 / 119 ≈ 84% of what a single card would deliver.
The games where MIX worked well were Hawx 2 DX10 and Alien vs Predator (HQ). Games that performed decently were Devil May Cry 4, Resident Evil 5, and Alien vs Predator (at default). Crysis performed pretty badly with MIX, but not nearly as badly as Dirt 2, Lost Planet 2, Hawx 2 DX11, and Stalker. Diablo III beta and Just Cause 2 didn't work well in either MIX or SLI. Diablo III is still in beta, so we have yet to see how that game will handle multiple GPU setups when finished. Just Cause 2 only used one GPU (298 W) in MIX, but SLI only did 27% better while using two GPUs (425 W). If I exclude those four games where MIX actually hinders performance, then MIX is 36% slower than SLI on average. With those extremely bad games included, the average jumps to MIX being 80% slower, which I think is skewed by those outliers. Either way, the game results aren't looking too good for MIX…
On to the synthetic tests. What I really like about the synthetic tests is that they show the potential of the HydraLogix technology. In all the synthetic tests, with the exception of 3DMark11, MIX and HydraLogix fare very well against SLI and CFX. In 3DMark06, all three configurations are within 4% of each other, with MIX even beating out SLI by a small margin of 2%. The synthetic benchmarks are definitely a good showing for MIX and HydraLogix. If only this scaling potential were applied to the game tests, then MIX would be doing really well so far.
The following graph shows the tests that are officially supported by HIS MIX, according to the support list supplied by HIS. As mentioned earlier, games seem to be hit or miss, and that's even the case with the supported games and benchmarks. I brought these officially supported tests together into one graph to show whether or not the supported titles have any advantage. However, the results are just more of the same: official support doesn't seem to have any effect on MIX performance, which is odd to me. Before testing and seeing the results first hand, I assumed the supported games and benchmarks would do really well because they must have been tested by HIS and/or Lucid before being put on that support list. That's just not the case, and MIX seems to be hit or miss whether the program is supported or not.
After running through all the games and benchmarks, MIX didn't do as well as I had hoped. Don't get me wrong, it does have a lot of potential as seen in the synthetic tests, but it doesn't consistently beat or even come close to CrossFireX or SLI. I didn't expect it to beat SLI, since two GTX580s should be more powerful than a GTX580 and an HD6970. However, I did expect it to beat or come really close to CrossFireX, since the CrossFireX setup is theoretically not as powerful as the MIX setup. If the purpose isn't to compete with SLI and CFX, then MIX seems like a neat, but pointless, feature. This is because I can't really think of a situation where adding a GPU from a different company would be more beneficial than just adding an identical GPU.
On the plus side, the MIX Eyefinity feature could be really useful for people without DisplayPort monitors or DP to DVI adapters. However, the adapters don't cost nearly as much as when Eyefinity was first released and can now be had for around $30.
If the scaling of MIX with HydraLogix was better and more consistent for games, and the list of supported games was larger, then HIS and Lucid would have a winner. As it stands now, there’s no reason for someone to opt for MIX over CrossFireX or SLI.
Thanks again to HIS for the review sample and for allowing us to check out their MIX feature.
– Matt T. Green (MattNo5ss)