
Guru3D article on PCI Express scaling that has one interesting point


magellan
Member, joined Jul 20, 2002
http://www.guru3d.com/articles_pages/pci_express_scaling_game_performance_analysis_review,17.html

This pretty much confirms what ED and everyone else have been saying about the increased bandwidth of PCIe 3.0 being largely irrelevant, except for one comment at the end of the article:

"If this article invokes enough interest, then in the future we'll also spend some time testing AMDs latest Radeon cards in Crossfire. With the Crossfire bridge removed all composing data now runs over the PCIe bus."
 
Well, some of us have been saying that since PCIe 2.0 came out and people wanted to compare 1.0 to 2.0 via FPS numbers. The difference was really small and could be chalked up to variance between bench runs. Ironically, when PCIe 1.0 was released, no card could take full advantage of the bandwidth. A lot of the top cards that were offered in both AGP 3.0 and PCIe 1.0 versions didn't even max out AGP 3.0. As things with computers go, the next standard will always be pushed as a selling point. The nice thing about going to PCIe 3.0 over the earlier revisions is the lower encoding overhead, which helps if you're using multi-card setups, be it 2-way SLI with dual GPUs and a PhysX card or any variation you can think of (a quick back-of-the-envelope comparison is sketched below).

I think PCIe 3.0 mainly benefits servers with multiple RAID card setups.
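For context on the "lower overhead" point above, here is a rough back-of-the-envelope sketch (mine, not from the article) of per-direction x16 bandwidth across generations. It assumes only the published per-lane transfer rates and the line-code overhead (8b/10b for PCIe 1.x/2.0, 128b/130b for PCIe 3.0); real-world throughput is lower once packet framing and flow control are counted.

```python
# Rough per-direction bandwidth estimate for a x16 link, per PCIe generation.
# Assumes only line rate and line-code overhead; TLP/DLLP framing, flow control
# and platform limits reduce real-world throughput further.

GENS = {
    # generation: (transfer rate in GT/s per lane, payload bits per symbol bits)
    "PCIe 1.x": (2.5, 8 / 10),     # 8b/10b encoding -> 20% line-code overhead
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding -> ~1.5% overhead
}

def x16_bandwidth_gbps(gt_per_s: float, efficiency: float, lanes: int = 16) -> float:
    """Usable GB/s in one direction: GT/s * encoding efficiency * lanes / 8 bits per byte."""
    return gt_per_s * efficiency * lanes / 8

for gen, (rate, eff) in GENS.items():
    print(f"{gen}: ~{x16_bandwidth_gbps(rate, eff):.1f} GB/s per direction (x16)")

# Approximate output:
# PCIe 1.x: ~4.0 GB/s, PCIe 2.0: ~8.0 GB/s, PCIe 3.0: ~15.8 GB/s per direction
```

The jump from 2.0 to 3.0 is roughly 2x in practice: the line rate only goes from 5 to 8 GT/s, but the switch to 128b/130b encoding recovers most of the 20% overhead that 8b/10b spends.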
 
With modern GPUs, FPS isn't the main load on data transmission; something like HPC applications (probably) benefits from the increased bandwidth, though I have no hardware to test that with. The more devices you add to the motherboard, the more taxed the overall PCIe bus becomes, so I suspect the faster individual lanes help in that situation as well.
 

The article has clear evidence that modern games do benefit from PCIe 2.0 over PCIe 1.1 in SLI setups. They found up to a 20% difference, not only in FPS but also in the FCAT frame-time results, when going from PCIe 1.1 to PCIe 3.0 in their GTX 980 SLI setup. The difference was about 15% going from PCIe 1.1 to PCIe 2.0 for the same SLI setup.

Would a single dual-GPU card act the same as a high-performance single-GPU card with respect to PCIe bus revision (i.e., little to no change in performance)?
 
Right, but you're talking about modern cards. What kind of graphics cards did we have during the AGP to PCIe 1.0 swap, or the PCIe 1.0 to 2.0 swap? Cards with nowhere near the power they have now. That is what I was talking about: we always get some kind of interface improvement before GPUs ever need the extra bandwidth or can make use of it. As I was referencing in my post, it's been that way for a while.

You could call it the chicken-or-egg question: which came first?
 

That makes sense. So as GPUs get more powerful, the gaming performance differences between PCIe 2.0 and PCIe 3.0 might become significant?

Did you ever own a 3DFX card? I used to have their Voodoo 3 2000 PCI card, my first 3D accelerator card. It made a huge difference in games, both in performance and visuals. I still wonder why 3DFX stuck to their "16-bit color for gaming should be good enough for everyone" stance.
 

They never got past 16-bit because they had five generations of the same core. I remember reading about them as a kid and wanting one, but by the time I was in college and had money, the GeForce 2 was stomping all over them, so I got that instead.
 

The GeForce 2 was pretty much only stomping them in Direct3D, mostly due to blending functions. In emulation, it really seemed to only add a blending function that Direct3D required, which the Voodoo 3 didn't have.

The Voodoo 3 in Direct3D had an issue with color rendering under DirectX 7 and possibly with 6 as well.

I basically only got a GeForce 2 MX, because of color glitches under Direct3D plugins with Nintendo 64 emulation. LOL.

Received my first GeForce on October 5, 2001...
 

At that time, though, many games were dropping Glide support. Diablo 2 had it, but it also had Direct3D. Morrowind and Neverwinter Nights didn't support Glide, and of course MechCommander 2, being a Microsoft-published game, was running DirectX. I saw it as the writing on the wall. :D

That was my first graphics card too. I bought a GeForce 2 MX PCI for the computer I got as a high school graduation gift. It was a Compaq Presario with an AMD K6-2 @ 500MHz, a 56K modem, 32MB of RAM, a 4GB hard drive, and integrated graphics, so no AGP slot. With the GeForce 2 MX card I was able to play those games at low-to-mid settings at VGA resolution. LOL, the good old days.
 
Thanks for the confirmation Mags... I don't BS! Linked that a couple of times. ;)


Here is one from TPU -

Single card

Fury X - https://www.techpowerup.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/18.html
GTX 980 - https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/

I remember one of the articles you quoted, an older one, had one game that showed a clear difference in FPS as PCIe bandwidth increased. It was a Japanese title, one of those survival-horror, Silent Hill-type games. I think the article stated it had something to do with post-processing effects.
 
That makes sense. So as GPUs get more powerful, the gaming performance differences between PCIe 2.0 and PCIe 3.0 might become significant?

Did you ever own a 3DFX card? I used to have their Voodoo 3 2000 PCI card, my first 3D accelerator card. It made a huge difference in games, both in performance and visuals. I still wonder why 3DFX stuck to their "16-bit color for gaming should be good enough for everyone" stance.
Yep, my first one was a Voodoo 1 4MB, with the 2D card being a Matrox Millennium II 4MB with the 4MB add-on card. Then dual Voodoo 2 12MB cards, then I went to NV, but later I bought some V5 5500s.

They never got past 16-bit because they had five generations of the same core. I remember reading about them as a kid and wanting one, but by the time I was in college and had money, the GeForce 2 was stomping all over them, so I got that instead.
It wasn't really the same core; it had different improvements but combined the geometry engine and texture unit into one. The V5 5500 was the only card at the time that wouldn't take a dive into unplayable FPS in Quake 3 when FSAA was applied. Quantum3D systems used VSA-100 chips too, and they seemed to get things fixed before 3DFX did for the V6 6000. Now, the thing about the Quantum cards was that they could be SLI'd; some Quantum systems could have up to 8 cards, with each card carrying 8 VSA-100 chips. The VSA-100 generation was supposed to be the one that added 32-bit color. From what I recall the VSA-100 was 24-bit color dithered down to 16-bit, or maybe I'm thinking of another card that only lasted one generation.

The GeForce 2 was pretty much only stomping them in Direct3D, mostly due to blending functions. In emulation, it really seemed to only add a blending function that Direct3D required, which the Voodoo 3 didn't have.

The Voodoo 3 in Direct3D had an issue with color rendering under DirectX 7 and possibly with 6 as well.

I basically only got a GeForce 2 MX, because of color glitches under Direct3D plugins with Nintendo 64 emulation. LOL.

Received my first GeForce on October 5, 2001...
Yeah, the GeForce would have, since it was developed around DirectX, versus 3DFX, which had to make its own API for 3D. Glide was the first API; 3DFX made OpenGL and Direct3D wrappers to translate those calls into Glide.
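Purely as an illustration of what a wrapper layer like that does (this is not 3DFX code, and every name in it is hypothetical), the pattern is just re-expressing one API's calls in terms of another's:

```python
# Toy illustration of an API translation ("wrapper") layer.
# Names like generic_draw_triangle / native_submit_triangle are made up;
# a real MiniGL/Direct3D-to-Glide wrapper is far more involved.

class NativeDriver:
    """Stands in for the vendor's native API (the Glide side in the discussion)."""
    def native_submit_triangle(self, vertices, color):
        print(f"native: rasterize triangle {vertices} in color {color}")

class GenericAPIWrapper:
    """Stands in for a generic 3D API; each call is translated into the native driver's equivalent."""
    def __init__(self, driver: NativeDriver):
        self.driver = driver

    def generic_draw_triangle(self, v0, v1, v2, rgb=(255, 255, 255)):
        # Translate the generic call into whatever the native API expects.
        self.driver.native_submit_triangle((v0, v1, v2), rgb)

# A program written against the generic API runs on the native hardware unmodified:
api = GenericAPIWrapper(NativeDriver())
api.generic_draw_triangle((0, 0), (1, 0), (0, 1), rgb=(128, 64, 32))
```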

At that time, though, many games were dropping Glide support. Diablo 2 had it, but it also had Direct3D. Morrowind and Neverwinter Nights didn't support Glide, and of course MechCommander 2, being a Microsoft-published game, was running DirectX. I saw it as the writing on the wall. :D

That was my first graphics card too. I bought a GeForce 2 MX PCI for the computer I got as a high school graduation gift. It was a Compaq Presario with an AMD K6-2 @ 500MHz, a 56K modem, 32MB of RAM, a 4GB hard drive, and integrated graphics, so no AGP slot. With the GeForce 2 MX card I was able to play those games at low-to-mid settings at VGA resolution. LOL, the good old days.
What is rarely talked about: Diablo II was made from the ground up for Glide. At the time, 3DFX was starting to have problems and then closed up, so the devs went back in before release and added DirectX support. That is why you notice some stuttering when you use DirectX versus the non-DX modes; the game runs best in Glide.

I remember one of the articles you quoted, an older one, had one game that showed a clear difference in FPS as PCIe bandwidth increased. It was a Japanese title, one of those survival-horror, Silent Hill-type games. I think the article stated it had something to do with post-processing effects.
This would really depend on the card. While it's not PCIe related, on the topic of 3DFX: at the time AGP had just been released, so 3DFX had to use a middle-man chip to bridge PCI to AGP on the V3/V4/V5s. Those who modded the BIOS of the PCI Mac V4/V5s said they got a small FPS bump on the PCI-based cards versus the AGP ones in the same game. What bus the chip is designed to talk to makes a difference; just because a board has PCIe 3.0 doesn't mean a card built for 2.0 is going to use 3.0. There was an increase in the data that could be sent going from 1.0 to 2.0 to 3.0. I thought I read it was a PCIe bus speed increase, but everything I see lists 100MHz, so maybe what I read was labeled as transfers per second going up at the same bus speed, meaning a card that supported 2.0 would see an increase over 1.0.

You could also see an increase if you upped the PCIe bus clock. I used to run the PCIe speed at 106MHz without the SATA controller corrupting data on the hard drives. You might not think 6MHz would make a difference, but it sure did for my FPS in CS:S back in the day; my minimum FPS in that game went up enough to make battles way more playable.
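As a hedged back-of-the-envelope estimate (not a measurement), here is what raising the 100MHz PCIe reference clock to 106MHz does to raw link bandwidth, assuming the link rate scales linearly with the reference clock, as it typically did on boards of that era; actual minimum-FPS gains depend on much more than raw bus bandwidth.

```python
# Estimate raw x16 link bandwidth when the PCIe reference clock is overclocked.
# Assumes the link rate scales linearly with the 100 MHz reference clock and that
# encoding overhead is unchanged; game FPS will not scale 1:1 with this number.

def link_bandwidth_gbps(ref_clock_mhz: float, base_gt_per_s: float,
                        efficiency: float, lanes: int = 16) -> float:
    """Per-direction GB/s for a x16 link at a given PCIe reference clock."""
    scaled_rate = base_gt_per_s * (ref_clock_mhz / 100.0)
    return scaled_rate * efficiency * lanes / 8

# PCIe 2.0 x16 (5 GT/s, 8b/10b encoding) at stock vs. a 106 MHz reference clock:
stock = link_bandwidth_gbps(100, 5.0, 8 / 10)
oced = link_bandwidth_gbps(106, 5.0, 8 / 10)
print(f"stock 100MHz: ~{stock:.2f} GB/s, 106MHz: ~{oced:.2f} GB/s "
      f"(+{(oced / stock - 1) * 100:.0f}%)")
# -> roughly 8.00 GB/s vs 8.48 GB/s, i.e. about a 6% raw-bandwidth increase.
```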
 
What is rarely talked about: Diablo II was made from the ground up for Glide. At the time, 3DFX was starting to have problems and then closed up, so the devs went back in before release and added DirectX support. That is why you notice some stuttering when you use DirectX versus the non-DX modes; the game runs best in Glide.

From what I read, the Voodoo 5 was still, at its core, an overclocked Voodoo 1 with new chips added on. Wasn't that card mega huge for its day?

If that is the case with Diablo 2, then even though the 3DFX card was better, it was already over. Having to hold a game back to implement another rendering method because Glide was plummeting like a brick was not a sign of a healthy platform. Neverwinter Nights and Morrowind didn't support Glide at all.

This is a life lesson. One is never immune to being replaced. :D
 
The V5 5500 was, yes, and the V6 6000 was longer; the 5500 is probably about the length of current high-end GPUs. Also keep in mind the V5 5500 was basically two V4 4000s in SLI on one board, so the V6 6000 would be quad SLI.

I'm not sure, but I think Blizzard did a reverse Glide-to-Direct3D wrapper so there wasn't much work needed, which might also explain why the game still stutters every so often under Direct3D even on today's most powerful hardware.
 
Really interesting! It's off topic, but still :) I learned something today about 3DFX by reading your posts and would love to learn more! I currently have a Voodoo3 3000 AGP and I need a mobo with AGP 1.0 to test whether it even works (I hope it does). If anybody is willing to teach me about the history of 3DFX I'll be really happy :) (PM me if you do).
 
There are a few sites with history about 3DFX and some of their products, etc. Here are a few links to get started:
https://en.wikipedia.org/wiki/3dfx_Interactive
http://www.thedodgegarage.com/3dfx/
http://www.voodoofiles.com/3dfxhelp.asp
http://www.falconfly.de/3dfx.htm
http://vintage3d.org/3dfx1.php#sthash.LexKJBCn.dpbs

I also came across this:
http://www.anandtech.com/show/580/2
I would say start from page one, but on page two I wanted to point this out:
From the above description the VSA-100 doesn’t appear to be much more than a Voodoo3 with support for a few new visual features and 32-bit color rendering support, but the chip’s support for up to 32-way SLI scalability (hence the name Voodoo Scalable Architecture) is what truly defines it and sets it apart from the Voodoo3.
Man, that would have been some custom machine from Quantum back in the day; wish we got to see it. Wait, I kind of think they did do it. I recall Quantum came out with cards with 2-4 VSA-100s per side of the card, and in one system stacked 4 or 8 cards together. Now this is going to bug me till I find the name of that card. Will post later when I find it.
 
I'm not sure what that middle card is, but that is not a V5, because you can clearly get those fans anywhere, plus half of them are not plugged in. The 5500 had dual VSA-100s and the V5 6000 had four; that is a joke picture about the size of the Voodoo cards with the VSA-100 chips.

That last picture, going by the model name, is not a 3DFX product or a Voodoo 5, but a Quantum3D-built board using VSA-100s. Sorry, I don't like things being mislabeled; it might be old tech, but it doesn't help when trying to be accurate to history. As you can clearly see, that last card has PCI connectors, not AGP, and there was never more than one AGP slot on a board.

*edit*
Actually, that last pic is of a Voodoo II setup Quantum made; go to the Dodge Garage and see the Quantum Mercury "the HG brick". It consists of Quantum Obsidian2 200SBi boards with 100MHz RAM.

On the same site, see the AAlchemy 8164, one board carrying 8 VSA-100 chips.
 
Man, that would have been some custom machine from Quantum back in the day; wish we got to see it. Wait, I kind of think they did do it. I recall Quantum came out with cards with 2-4 VSA-100s per side of the card, and in one system stacked 4 or 8 cards together. Now this is going to bug me till I find the name of that card. Will post later when I find it.
The AAlchemy system was neat.

For some reason I'm thinking it was either the "Obsidian" or "Mercury" stacked platforms that you may be referring to: http://www.thedodgegarage.com/3dfx/q3d_mercury_brick.htm and http://www.thedodgegarage.com/3dfx/q3d_mercury_faq.htm Never mind, just saw your edit.

edit: Other than competing with their own AIB partners, 3DFX really messed up by skipping hardware T&L right when their competitors added it and games started coming out that looked significantly better with it enabled. That made sure all their recent screenshots looked garish by comparison.
 