So I'm goofing off on my computer last night playing around with memory timings to see how they affect my 3DMark06 scores and I noticed that I was scoring lower than I should be.
I went into the nVidia display properties and made sure that SLI was enabled but was still scoring lower than I ought to.
I went back into the nVidia control panel and went to the advanced section and noticed there was a setting that let you choose the method used to render your screen when SLI was enabled.
It turns out mine was set to Single GPU Only or some such crap like that.
When I selected Alternate Frame Rendering from the drop-down list, I got a EULA-like pop-up that I had to scroll through and accept before the system would apply my setting.
Utilizing AFR instead of the default improved my 3DMark06 score by 1617 points (9091 now instead of 7474), so I'm wondering why the default setting once SLI is enabled is Single GPU Only.
Anyone care to shed some light on this for me? I'm a bit confused as to why SLI isn't "really" SLI until you select the second option and click another EULA...