
the problem with GeForce FX 5200 is...


RichRymer10

Registered · Joined Jan 28, 2003 · Location: nYc
It just doesn't make sense to me: 250MHz core, 400MHz 3.6ns memory... and still 1,158 in 3DMark03 and 5,000-something in 3DMark01SE. Actually, those are the scores I got after I overclocked to 326/526. This is reminiscent of the old Radeons: the Radeon 8500 was good on paper when it came out, but it suffered from driver issues, and only now, with the newer Catalyst drivers, is it pretty good. So I think the GeForce FX is suffering the same fate. It's sad to say, though, that my FX 5200 barely outperforms my old GeForce2 MX400 32MB SDRAM, which was overclocked to 210/200.
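For context on those memory numbers: a chip's ns rating implies its maximum rated clock, roughly f(MHz) = 1000 / t(ns). A quick sketch of that arithmetic, assuming the usual conventions that the rating applies to the real clock and that DDR's effective rate is double it:

# Rated memory clock implied by chip access time: f (MHz) ~= 1000 / t (ns).
# Assumes the ns rating maps to the real clock and that DDR's "effective"
# rate is twice the real clock -- common conventions, not guarantees.

def rated_mhz(access_time_ns: float) -> float:
    return 1000 / access_time_ns

real = rated_mhz(3.6)    # ~277.8 MHz real clock
print(real, real * 2)    # ~277.8 real, ~555.6 effective DDR

# So 3.6ns chips shipped at 400MHz effective (200MHz real) have headroom,
# which fits the 526MHz effective overclock reported in this post.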
 
Mine came with 128-bit RAM; it's from PNY. I think I'm bottlenecked: it's on a 1.4GHz P4, the AGP slot is at 4x, and the AGP aperture (or whatever it's called) is at 64MB. I heard it's ideally supposed to be double the amount of video memory (256MB for me). I can't change any of these either, because the system is proprietary.
 
I'd say it sucks because the core is heavily crippled. I seriously doubt it's a driver issue. How does the FX5200 look good on paper at all, besides the "FX" name?
 
RichRymer10 said:
Mine came with 128-bit RAM; it's from PNY. I think I'm bottlenecked: it's on a 1.4GHz P4, the AGP slot is at 4x, and the AGP aperture (or whatever it's called) is at 64MB. I heard it's ideally supposed to be double the amount of video memory (256MB for me). I can't change any of these either, because the system is proprietary.

It does have 128-bit RAM, but there's no compression technology.
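For scale, peak memory bandwidth is just bus width times effective memory clock. A back-of-envelope sketch using the figures quoted in this thread; the GeForce2 MX line is an assumed 64-bit SDR configuration for comparison, not a confirmed spec:

# Peak theoretical bandwidth: (bus width in bytes) x (transfers per second).
def bandwidth_gb_per_s(bus_width_bits: int, effective_mhz: float) -> float:
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_per_s(128, 400))  # FX 5200, 128-bit @ 400MHz DDR: 6.4
print(bandwidth_gb_per_s(128, 526))  # same card overclocked: ~8.4
print(bandwidth_gb_per_s(64, 200))   # GeForce2 MX-class (assumed): 1.6

Raw bandwidth clearly isn't the FX 5200's weak point; as the post above notes, though, with no compression technology less of that peak gets used efficiently.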
 
Yeah, and a 1.4GHz P4 is like a 1GHz Athlon... dude, overclock that P4, or get a new one. It should help. But you'd still have a 5200, the first of a long line of bad jokes known as the FX series.
 
The GeForce FX 5200 is severely hampered by its horrid fill rate, a measly 1.4 gigatexels/second. If you're a serious gamer, get a GeForce4 Ti4x00 or a GeForce FX 5900/5900 Ultra, which has a fill rate of 3.6 gigatexels/second. :cool:
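Those fill-rate figures are easy to sanity-check, since theoretical texel fill rate is just core clock times total texture units. A rough sketch; the pipeline/TMU layouts below are assumptions for illustration (they were debated at the time), not confirmed specs:

# Texel fill rate = core clock (MHz) x pipelines x TMUs per pipeline,
# divided by 1000 to get gigatexels/s. Configs are assumed, not official.
def fill_rate_gtexels(core_mhz: float, pipelines: int, tmus_per_pipe: int) -> float:
    return core_mhz * pipelines * tmus_per_pipe / 1000

print(fill_rate_gtexels(250, 2, 2))  # FX 5200 at stock, 2x2 layout: 1.0
print(fill_rate_gtexels(450, 4, 2))  # FX 5900 Ultra-class, 4x2 assumed: 3.6

The exact quoted numbers depend on which clocks and layouts you assume, but the gap between the 5200 and the 5900 is stark either way.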
 
I cannot recommend any of the FX cards for what they cost.
I would recommend a Radeon... but this is an Nvidia forum, so I won't do that. While I like ATI, Nvidia's GF4 Ti line is very good if you get it at a good price. My last card was a GF3 Ti500, and I loved it. I still have it in my other rig.

But for the love of god, the 5200, even the Ultra, is horrific. Buying a new card should never be a downgrade. Too bad some people who don't know any better buy these crappy cards thinking they're getting something good.
 
Oops, no, I had the two benchmarks backwards. That's still lower than I had expected. Isn't it supposed to be a touch slower than a Ti4200?
 
OK, I've realized what a bad card this is. On Friday, it's going back to the store. Unless nVidia unleashes something really sick, I'm putting my money with ATI. It really ****es me off that a GeForce2 with 32MB SDRAM is only slightly slower.
 
Yes, the 5200 sucks... give it a rest, please... Let ATI have their day in the sun, and let that dead dog of a 5200 rest in peace.
If I read one more thread about how much the 5200 bites the big one, I will spam you all with "which GPU should I buy" questions :p

Yes, I agree Nvidia's marketing is sneaky and misleading... Yes, the Ultra lineup is a little disappointing... Things will change, but until then, how about a "buyer beware" sticky with regard to the Ultra lineup? That way consumers can make a more informed choice and better understand how the Ultra lineup stands up to other cards. I feel sorry for folks who bought a GeForce FX expecting a really good GPU, only to find out it is no better than an MX card!
I am not putting down Nvidia, but we can help others buy a better Nvidia card
:)
 
Ok, where to start...

Puer Aeternus had a point. It sucks. Well, sort of; it's not quite as bad as you might think... but only just.

The GeForce FX 5200 is based on the 5600 core, with some of the vertex shader hardware and the like cut down. It uses a 2x2 architecture, meaning 2 pipelines with 2 TMUs each, so both are capable of multi-texturing. The Ti4200, by comparison, is 4x1: 4 pipelines, each capable of single-texturing only.

What's the difference? Per clock, the Ti4200 can produce twice the number of texels the FX 5200 can if they are only single-textured; if they're multi-textured, the two work at the same speed. 3DMark2001 is mostly single-texturing based, so you will find that, clock for clock, the Ti4200 is twice as fast (hence why you scored 5,000 3DMarks while a Ti4200 will typically get 10,000). In 3DMark03, which is mostly multi-texturing, they will score more similarly (and I believe you will find that is true if you look).
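A quick sketch of that per-clock argument, taking the 2x2 and 4x1 layouts described above at face value and modelling each texture layer beyond a pipeline's TMU count as an extra pass (a simplification, but it reproduces both the 2x single-textured gap and the multi-textured parity):

import math

# Pixels emitted per clock for a given pipeline layout. A pixel needing
# more texture layers than a pipeline has TMUs costs extra passes.
def pixels_per_clock(pipelines: int, tmus_per_pipe: int, layers: int) -> float:
    passes = math.ceil(layers / tmus_per_pipe)
    return pipelines / passes

# Single-texturing (3DMark2001-style workload):
print(pixels_per_clock(4, 1, 1))  # Ti4200 as claimed (4x1): 4.0
print(pixels_per_clock(2, 2, 1))  # FX 5200 (2x2): 2.0 -> half the speed

# Dual-texturing (3DMark03-style workload):
print(pixels_per_clock(4, 1, 2))  # Ti4200: 2.0
print(pixels_per_clock(2, 2, 2))  # FX 5200: 2.0 -> parity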

So, as long as you only play multi-textured games (Doom III is the only one I know of, but that's not out yet), your performance won't suck. The only problem is that multi-texturing games are all so new that the card isn't powerful enough for them anyway. A Ti4200, by comparison, still couldn't play multi-texturing games, but at least it's OK in current games...

But yeah, RMA it :p
 
So, as long as you only play multi-textured games (Doom III is the only one I know of, but that's not out yet), your performance won't suck

No, it will still suck... just not because it can't multi-texture, but because it's friggin' slooooow.
 
I don't mean to sound cynical, but I highly doubt that your GeForce2 got what it did in both 3DMark03 and 3DMark2001SE. I'm on a GeForce4 Ti4600, and in 3DMark03 I get 1,500-1,900, and about 13,000 in 2001SE, depending on drivers (the new ones suck, hence my low score of 1,500). I don't even think the GeForce2 supports half the **** that 3DMark03 tests. I also have a GeForce2 MX400 with 64MB of RAM, and I don't remember ever hitting anything close to 5,000 in 3DMark2001SE. Oh well.

And as we all know, Nvidia is the IBM of the video card industry: they have been around forever and make a quality product. With this whole FX thing, I just think they got caught with their pants down, kind of like Sega did back in the day when they released the Dreamcast as fast as they could to get the jump on the PS2, and because of the rush they had an insane number of problems with bad batches of games and screwed-up consoles. Unlike Sega, Nvidia won't go under; I think we all know that. In my opinion, you should just wait a little while with that 5200, wait for some drivers that don't blow, and overclock that damn processor like ir1 said. He is totally right about the 1.4 being equal to a 1.0 Athlon; that's why AMD is the gamer's CPU. If you invest in anything, make it a new CPU, and an AMD at that. :D
 
Well, my GeForce2 beats the FX, not in benchmarks, but in FPS in a real game, America's Army, and THAT is more important to me than a benchmark. BTW, I got 3,066 with this card in 3DMark01SE; for some reason I can't publish the link. Also, if you look up the GeForce2 MX/MX 400 at www.futuremark.com, you'll see that the highest benchmark is close to 4,000.
 
I don't know what you guys' problem is; I get excellent FPS in Unreal Tournament 2003, try 100+ FPS with all details on and at the highest settings, and my other games run just fine. I mean, isn't the real reason we have these cards to run games well, not to sit around benchmarking them all day? Overclock it and go with it, and take my buddy jo3's advice: get an AMD machine and leave ATI alone (NO FLAMES PLEASE)
 
I don't know what you guys' problem is; I get excellent FPS in Unreal Tournament 2003, try 100+ FPS with all details on and at the highest settings, and my other games run just fine. I mean, isn't the real reason we have these cards to run games well, not to sit around benchmarking them all day? Overclock it and go with it, and take my buddy jo3's advice: get an AMD machine and leave ATI alone (NO FLAMES PLEASE)

What card? What resolution? AA/AF on? And on what planet?

I run UT2k3 at 1280x1024 with 4x AA / 8x anisotropic, all details maxed... and FPS are between 50 and 100.
 