
GTX 260s in SLI, mini review


curtisbouvier

You know, I spent a lot of time looking for reviews of GTX 260s in SLI, and I think I found maybe two, both of which seemed shady because their numbers were way different from other reviews... so there were some big discrepancies.

I thought I'd just give you guys an idea of how it all runs from an average joe :cool: for anyone who may be interested in running these cards in SLI.

So I picked up an Intel E8400 at 3.0GHz, 4GB of Mushkin Redline memory (5-5-5-12, I believe), and two EVGA GTX 260 Core 216 Superclocked editions; they were on sale for $209 Canadian for a short time at NCIX. I was building a new computer, so the timing was convenient.

All of this is running on an EVGA FTW 750i SLI mainboard.

Image of the setup:

http://img34.imageshack.us/img34/8159/inside1a.jpg

I currently have this CPU running stable at 3.89GHz (1732MHz effective FSB) at 33C on air.
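As a quick sanity check on those numbers (a sketch assuming the E8400's stock 9x multiplier, which the post doesn't state): the bus is quad-pumped, so the 1732 figure is four times the actual bus clock.

    # Rough check of the overclock math quoted above.
    fsb_effective = 1732             # quad-pumped "effective" FSB, MHz
    bus_clock = fsb_effective / 4    # 433 MHz actual bus clock
    multiplier = 9                   # E8400 stock multiplier (assumption)
    print(bus_clock * multiplier)    # ~3897 MHz, i.e. the 3.89GHz quoted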

The 260s are running slightly overclocked, from the 626MHz factory core clock to 667MHz, with the shader at 1438MHz. Memory is at default.

Windows XP of course, latest NVIDIA drivers, gaming resolution is 1680x1050.

3DMark06 score was a tad over 19K
http://img9.imageshack.us/img9/3021/3dmarkl0lz.png

Vantage was 13.6K on Windows Vista. (I have no idea if this is good or not; I'm not familiar with Vantage at all. What do you guys think?)
http://img43.imageshack.us/img43/7176/vantagescore1a.png

Crysis on Windows XP (CFG files edited to enable the DirectX 10 "Very High" settings): about 56 FPS average (topped at 65, bottomed out at 47), with 16x AF and 4x AA.

http://img20.imageshack.us/img20/146/crymax4xaa.jpg

This was actually pretty disturbing, considering the DirectX 10 features are indeed fully working on XP: you can see the sun beaming through the trees, the random waves in the water, and the minor sepia photo filter applied to the game's overall color. All of these are supposedly only apparent under Vista with Crysis set to Very High. So this makes me wonder if the game is actually mainly running in DX9, or if it somehow bypasses the DirectX 9 files on Windows XP?
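For reference, the usual XP tweak is an autoexec.cfg dropped into the Crysis install folder that forces the engine's quality groups to their "Very High" values. A rough sketch, assuming the sys_spec_* console variables (0-4 per group); the real guides list more groups than shown here:

    con_restricted=0
    sys_spec_Shading=4
    sys_spec_Shadows=4
    sys_spec_Water=4
    sys_spec_PostProcessing=4

If that is what's going on, the sun shafts and color grading would be coming from these shader settings while the engine still renders through its DX9 path on XP, which would answer the question above.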

Edit: forgot to mention that Crysis under Vista runs horribly: 25-30 FPS with a one-second delay on all mouse movement.

Any ideas there?

What are your thoughts, my good friends?
 
Thanks for this review.
I found it very relevant, as I'm getting an E8400 and a single GTX260 with the intention of bringing my gaming graphics up.

19K in 3DMark06 is pretty high, much better than I'm getting at the moment.

One question though: why is your Vista Crysis performance "horrible"? I am unfortunately blighted with Vista, so what is it that makes the game go from silky smooth on XP to poor on Vista?
 
I honestly have no idea; all I know is everything just runs far slower than on XP. For example, 3DMark06 alone scores about 2,000 points lower with nothing changed other than the operating system. I'm running Service Pack 1 and the latest drivers on Vista.

I did a little more testing with Crysis on Vista. I can hit around 50 FPS in some areas, others about 25-30, and this was also without 16x AF, only 4x AA, so XP even had the disadvantage. I will have to try Windows 7 one of these days and see what happens.

World of Warcraft, Call of Duty 4, and BioShock also run a lot smoother on XP, although I don't have the numbers for those at the moment.

I cannot take screenshots of Crysis in Vista for some reason. My alt-tab is practically instantaneous on XP; with Vista there's a slight screen flicker and a 3-5 second delay tabbing back into the game.

I really have no clue what the problem is with Vista; maybe I need 64-bit or something.
 
Wow, that is weird; I have not noticed the same slowdown as you have in Vista. TBH, I didn't think people running setups like yours would still be interested in XP! :) I think something might be wonky somewhere. I have not used XP in quite some time, but I get 20,206 or so in '06 with my setup. Maybe it is your tweaked .cfg and .ini files and such that are giving you the slowdowns in Vista?


sweet pix tho man :thup:
 
Big differences in speed between OSes are weird, but yeah, if you're on a 32-bit OS you're basically only using about 2GB of RAM. Vista with some patches should report that you have 4GB in there, but back to the old 32-bit OS issue: it can only address 4GB total, and you are losing almost 2GB of that to video memory. Swapping to a 64-bit OS may make a bit of a difference. It may not, but it would be interesting to see. If you try Windows 7, I would definitely try the 64-bit as opposed to the 32-bit.
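To put rough numbers on that (a sketch; the exact reservations vary by chipset and driver):

    # Rough 32-bit address space math for this setup - illustrative only.
    address_space_gb = 4.0             # 32-bit addressing limit
    vram_gb = 2 * 896 / 1024           # two GTX 260s at 896MB each (~1.75GB)
    other_mmio_gb = 0.3                # chipset/PCI reservations, rough guess
    print(address_space_gb - vram_gb - other_mmio_gb)   # ~1.95GB RAM left usable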
 

I'm wondering if it's all that nonsense Vista has running in the background; from what I hear there is a ton of it.

I have not touched anything in Vista; it's pretty much a barebones fresh installation. All I did was disable the Security Center. I'm thinking there could be many other features or things running in the background that are robbing CPU/GPU power?

I just don't have the experience with Vista to troubleshoot it quite yet.

What system specs are you running?
 
I've been using Vista Ultimate x64 since it came out, pretty much. All of my scores in the 3DMark section are using V64; that's why I find it a little odd :confused: As for my specs, they are in my sig :)

Except those numbers were from a Zotac GTX 285 AMP running at 741/1548/1350 :p
 
If he's running 32-bit XP, wouldn't there be no difference between that and his 32-bit Vista? So performance to that degree may not matter.

And I've seen some data suggesting the difference between Crysis 32-bit and 64-bit is not significant at all, around 3 FPS. Can't speak for synthetic benchmarks, though.
 
Well, between Vista x86 and x64 I noticed a little more than a 1000MB/s increase in either memory writes or copies in Everest Ultimate; it was one or the other, heh. I personally think games feel better in 64-bit goodness :beer:
 
I'm a bit confused about the GTX260 SLI 3DMark06 scores in general as well. I have seen reviews showing very low scores, and your score sounds pretty good.

For comparison, I'm only running a single GTX260 with an E8400, and I got 16,062 in 3DMark06. Based on my score vs. yours, it doesn't seem like the GTX260 scales very well by going SLI.
 
Do you guys know of a good utility for stress testing a GPU?

Just to be sure an overclock is stable in all situations.
 
I'm only running a single GTX260 with an E8400, and I got 16,062 in 3DMark06. Based on my score vs. yours, it doesn't seem like the GTX260 scales very well by going SLI.

That's almost exactly what I got as well at 3.6GHz on my E8400 with one GTX260: just a tad over 16K, so a 3.1K jump with SLI.

I've been doing lots of testing with Crysis all day, and it seems to scale very well: wherever I get 30-35 FPS on one card, in SLI it's about 60-65.

40 turns into 80, and so on. Call of Duty 4 also practically doubles in frame rate.

Perhaps games just need to be coded with SLI in mind?

The bottom line is there are definitely some games that don't scale at all and some that scale really nicely. I'd love to know the reason behind this if anyone here is wiser.

I can tell you right now, World of Warcraft DROPS in performance by about 60-70% in SLI; that's quite the oddball. In a populated zone that normally gives me a rock-solid 60.0 FPS (Dalaran, for example), I get about 13-15 FPS in SLI, LOL.

http://img39.imageshack.us/img39/1985/wow1a.jpg
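To put rough numbers on the scaling reported in this thread (an illustrative sketch using the figures quoted above):

    # SLI scaling computed from the numbers quoted in this thread.
    def scaling(single, sli):
        # percent gained (or lost) by adding the second card
        return (sli / single - 1) * 100

    print(scaling(16062, 19000))   # 3DMark06: ~18%, CPU-bound benchmark
    print(scaling(32.5, 62.5))     # Crysis avg FPS: ~92%, near-ideal
    print(scaling(60, 14))         # WoW in Dalaran: ~-77%, negative scaling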
 
World of Warcraft isn't programmed to work with SLI.


Naturally, a game like Crysis which was designed to push cards to the limit would be coded with SLI in mind.

It's just up to the developer to decide whether they want to support it. Because of that, quite a few people advocate buying one stronger card rather than SLI/Crossfire, because you're guaranteed a performance increase across the board, not just in synthetics and some games.
 
You got an i7, of course it will be better :p
Also, you can SLI a 192 with a 216????
3DMark06 is more CPU bottlenecked; for example, with one 8800GTS and a Q9450 I got 16K, then 19K with a second 8800GTS 512MB.
Vantage scales more with SLI, as it is more of a GPU benchmark. 13.6K is average, but that score is low because your CPU drags it down.
With my GTX295 (two GTX275s in SLI) I have scored around 24-25K with PhysX off; having a quad core, especially a Core i7, can boost the CPU test loads.
 
Just want to add, @OP, that GPU score is pretty decent for those cards; I have a stock 285 and it hits the range of 12,500-13,000 points for the GPU score. (The overall score is skewed because it's an OC'd i7, which puts the CPU score at almost 50K :) )

@OP, what kind of FPS do you get in FurMark at 1680x1050 on this SLI setup?
 
I'm a bit confused about the GTX260 SLI 3DMark06 scores in general as well. I have seen reviews showing very low scores, and your score sounds pretty good.

For comparison, I'm only running a single GTX260 with an E8400, and I got 16,062 in 3DMark06. Based on my score vs. yours, it doesn't seem like the GTX260 scales very well by going SLI.

3DMark06 runs at a very low resolution and is not indicative of actual gaming performance.

I got 18.6K with a Q9550 and a single 8800GTS-512. A single 260, and especially two of them, will wipe the floor with that setup in actual gaming, but not so much in '06, which puts way too much weight on the CPU test and runs at a very low resolution by default. The low resolution means that most of the shader processors remain unused.

Crank up the resolution/settings in '06 and then run the tests again, or use Vantage's 'H' or 'X' presets, to draw better conclusions about actual modern gaming performance. That, or test with actual games.
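A toy bottleneck model of why that matters; this is an illustration, not FutureMark's actual scoring formula:

    # Whichever of CPU and GPU bottlenecks sets the frame rate (toy model).
    def fps(gpu_fps, cpu_fps):
        return min(gpu_fps, cpu_fps)

    # Low rez (stock '06): the CPU caps the result, so SLI barely registers.
    print(fps(gpu_fps=80, cpu_fps=90), fps(gpu_fps=160, cpu_fps=90))  # 80 -> 90
    # High rez: the GPU is the limit, so the second card roughly doubles it.
    print(fps(gpu_fps=40, cpu_fps=90), fps(gpu_fps=80, cpu_fps=90))   # 40 -> 80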
 