That makes sense. So as GPUs get more powerful, the gaming performance differences between PCIe 2.0 and PCIe 3.0 might become significant?
Did you ever own a 3DFX card? I used to have their Voodoo 3 2000 PCI card -- my first 3D accelerator card. It made a huge difference in games, both in performance and visuals. I still wonder why 3DFX stuck to their "16-bit color should be good enough for everyone" stance on gaming.
Yep, my first one was a Voodoo 1 4MB, with the 2D card being a Matrox Millennium II 4MB with the 4MB add-on card. Then dual Voodoo 2 12MB cards, then I went to NV, but later bought some V5 5500s.
They never got past 16-bit because they had five generations of the same core. I remember reading about them as a kid and wanting one, but by the time I was in college and had money, the GeForce 2 was stomping all over them, so I got that instead.
It wasn't really the same core; it had different improvements, but it combined the geometry engine and texture unit into one. The V5 5500 was the only card at that time that wouldn't take a dive to unplayable FPS in Quake 3 when FSAA was applied. Quantum3D systems used VSA-100 chips, and they seemed to get things fixed before 3dfx did for the V5 6000. Now, the thing about the Quantum3D cards was that they could be SLI'd; some Quantum3D systems could have up to 8 cards, with each card having 8 VSA-100 chips. It was supposed to be the next generation that added 32-bit color. The VSA-100 was, from what I recall, 24-bit color dithered down to 16-bit, or I might be thinking of another card that only lasted one generation.
The GeForce 2 was pretty much only stomping them in Direct3D, and mostly because of blending functions. In emulation, it seemed to only add a blending function that Direct3D required and that wasn't in the Voodoo 3.
The Voodoo 3 had an issue with color rendering in Direct3D under DirectX 7, and possibly under 6 as well.
I basically only got a GeForce 2 MX, because of color glitches under Direct3D plugins with Nintendo 64 emulation. LOL.
Received my first GeForce on October 5, 2001...
Yeah, the GeForce would have, since it was developed around DirectX, whereas 3DFX had to make their own API for 3D. Glide being their first API, 3DFX made OpenGL and Direct3D wrappers to translate that information into the Glide instruction set.
At that time, though, many games were dropping Glide support. Diablo 2 had it, but it also had Direct3D. Morrowind and Neverwinter Nights didn't support Glide, and of course MechCommander 2, being a Microsoft-published game, ran on DirectX. I saw it as the writing on the wall.
That was my first graphics card too. I bought a GeForce 2 MX PCI for the computer I got as a high school graduation gift. It was a Compaq Presario with an AMD K6-2 @ 500MHz, a 56K modem, 32MB of RAM, a 4GB hard drive, and integrated graphics, so no AGP slot. With the GeForce 2 MX card I was able to play those games at low to mid settings at VGA resolution. LOL, the good old days.
What is rarely talked about is that Diablo II was made from the ground up for Glide. At the time, 3dfx was starting to have problems and then closed up, so the devs went back in before release and added DirectX support. That is why you notice some stuttering when you use DX versus the non-DX modes; the game runs best in Glide.
I remember one of the articles you quoted, an older one, had one game that showed a clear difference in FPS as PCIe bandwidth increased. It was a Japanese title, one of their survival-horror, Silent Hill-type games. I think the article stated it had something to do with post-processing effects.
That would really depend on the card. It's not PCIe-related, but staying on the topic of 3dfx: at the time, AGP had just been released, so 3DFX had to use a middle-man chip to convert PCI to AGP on the V3/V4/V5s. Those who modded the BIOS of the PCI Mac V4/V5s said they got a small FPS bump on the PCI-based cards versus the AGP one in the same game. Which bus the chip is designed to talk to makes a difference; just because a board has PCIe 3.0 doesn't mean a card built for 2.0 is going to use 3.0.

There was an increase in how much data could be sent going from 1.0 to 2.0 to 3.0. I thought I read it was a PCIe bus speed increase, but everything I see says 100MHz. Maybe what I read was labeled as transfers per second going up at the same bus speed, so a card that supported 2.0 would see an increase over 1.0.

You could also see an increase if you upped the PCIe bus itself. I used to run the PCIe speed at 106MHz without the SATA controller corrupting data on the hard drives. You might not think 6MHz would make a difference, but it sure did for my FPS in CS:S back in the day. My minimum FPS in that game went up enough to make battles way more playable.
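To put rough numbers on that bus-speed point, here's a minimal back-of-the-envelope sketch in Python. It assumes the published per-lane transfer rates (2.5/5/8 GT/s for PCIe 1.0/2.0/3.0), their line encodings (8b/10b vs 128b/130b), and that raising the 100MHz reference clock to 106MHz scales the link rate proportionally; real-world gains depend on the card and chipset, so treat it as illustrative only.

```python
# Rough per-lane PCIe bandwidth estimates across generations.
# Assumptions: published transfer rates and encodings per generation,
# and a reference-clock overclock scaling the link rate linearly.

GENERATIONS = {
    # gen: (GT/s at the stock 100 MHz reference clock, payload bits, total bits)
    "1.0": (2.5, 8, 10),     # 8b/10b encoding -> 20% overhead
    "2.0": (5.0, 8, 10),     # 8b/10b encoding -> 20% overhead
    "3.0": (8.0, 128, 130),  # 128b/130b encoding -> ~1.5% overhead
}

def lane_bandwidth_gb_s(gen: str, ref_clock_mhz: float = 100.0) -> float:
    """Usable per-lane bandwidth in GB/s, scaled by the reference clock."""
    rate_gt_s, payload, total = GENERATIONS[gen]
    scale = ref_clock_mhz / 100.0               # bus overclock scales the link rate
    return rate_gt_s * scale * (payload / total) / 8.0  # bits -> bytes

if __name__ == "__main__":
    for gen in GENERATIONS:
        stock = lane_bandwidth_gb_s(gen)
        oc = lane_bandwidth_gb_s(gen, 106.0)
        print(f"PCIe {gen}: ~{stock:.2f} GB/s per lane stock, "
              f"~{oc:.2f} GB/s at a 106 MHz reference clock")
```

This prints roughly 0.25, 0.50, and 0.98 GB/s per lane for 1.0, 2.0, and 3.0, which is why a 2.0 card on a 3.0 board still only moves data at 2.0 rates, and why a 6MHz reference-clock bump only buys about a 6% increase in raw link bandwidth.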