
Please explain why the FXs are not as good as Radeons


wanna_b_rich_13

Hi,

I notice a lot of people mentioning that the GeForce FXs aren't worth the money when compared to the Radeons. Could someone explain why?

IMO I thought the FXs were pretty good, considering they use the .13 micron process while the Radeons are still trying to push as much speed as they can out of the .15 process.

Is it the speed of the Radeons' memory that makes them better? The 5950s are running 2ns memory, which is the fastest I have seen. I have heard about some manufacturers having problems getting the PCBs to handle such speeds, but that is the only problem I have heard of.
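For reference, here is a rough sketch of what a 2ns rating implies. This is a simplification on my part: it assumes the usual convention that the ns figure is the chip's minimum cycle time, and the 5950 Ultra's 475 MHz shipping memory clock is quoted from memory, so treat the numbers as approximate.

[code]
# Back-of-the-envelope sketch (assumed convention: the ns rating of a DRAM
# chip is its minimum cycle time, so the MHz ceiling is 1000 / rating_ns;
# DDR transfers on both clock edges, hence the doubled "effective" figure).

def max_clock_mhz(rating_ns):
    return 1000.0 / rating_ns      # 1/ns expressed in MHz

rating = 2.0                        # 2ns chips, as on the 5950s
actual = max_clock_mhz(rating)      # 500 MHz real clock ceiling
effective = 2 * actual              # 1000 MHz "DDR" rating

print(f"{rating}ns chips: {actual:.0f} MHz actual, {effective:.0f} MHz DDR")
# -> 2.0ns chips: 500 MHz actual, 1000 MHz DDR
# The 5950 Ultra ships its memory at 475 MHz (950 DDR), a bit under that
# ceiling -- which is exactly where PCB signal quality starts to matter.
[/code]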

Any opinions would be appreciated. I am just trying to see WHY people prefer one over the other.

Thanks
 
Does this tell you something, when a 380/340 9800 Pro can outperform a 450/850 FX 5900 Ultra by a lot? You see, Nvidia used a 4x2 pipeline arrangement while ATI used 8x1, and most games prefer 8x1, so Nvidia needs higher clocks, the .13 process, and driver "cheats" just to get close to ATI.
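If you want the arithmetic behind that, here is a minimal sketch. The per-clock model is the usual simplified pipes x TMUs fill-rate estimate, not a measurement, and the clocks are the ones quoted above.

[code]
# Simplified fill-rate model: one pixel per pipeline per clock, one texel
# per TMU per clock. Real games fall well short of these theoretical peaks.

def fillrates(core_mhz, pipes, tmus_per_pipe):
    pixel = core_mhz * pipes                   # Mpixels/s
    texel = core_mhz * pipes * tmus_per_pipe   # Mtexels/s
    return pixel, texel

r9800_pro = fillrates(380, 8, 1)   # (3040, 3040)
fx5900u   = fillrates(450, 4, 2)   # (1800, 3600)

print("9800 Pro  :", r9800_pro)  # higher pixel rate despite the lower clock
print("5900 Ultra:", fx5900u)    # texel rate only helps with multitexturing
[/code]

So even at 450 MHz the 4x2 design pushes fewer pixels per second than the 8x1 design does at 380 MHz; the extra texel rate only pays off in heavily multitextured scenes.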
 
Overclocker550 said:
Does this tell you something, when a 380/340 9800 Pro can outperform a 450/850 FX 5900 Ultra by a lot? You see, Nvidia used a 4x2 pipeline arrangement while ATI used 8x1, and most games prefer 8x1, so Nvidia needs higher clocks, the .13 process, and driver "cheats" just to get close to ATI.

Actually, the 380/340 you're referring to is 380/680 DDR, compared to 450/850 DDR. Also, according to our own oc3dmark team, the FX 5900 Ultra outperforms the 9800 Pro; however, the 9800 Pro performs better with AA and AF settings on. But as Xstatic said, there is not a big difference between the two.
 
The 9800 Pro gets much higher 3DMark scores than any FX 5900 Ultra card out there, especially if you use non-cheating drivers. Have you seen the 50.xx Dets that give the FX 5900s just over 100 fps in Nature at stock?
 
R4z0r4mu5 Pr|m3 said:


Actually, the 380/340 you're referring to is 380/680 DDR, compared to 450/850 DDR. Also, according to our own oc3dmark team, the FX 5900 Ultra outperforms the 9800 Pro; however, the 9800 Pro performs better with AA and AF settings on. But as Xstatic said, there is not a big difference between the two.

In your dreams.

And it is still flawed with the patch.

mica
 
Performance per dollar. Look at the 9600 Pro and the 5600 Ultra, direct competitors, and the 9600 Pro comes out on top. Look at the 9600 XT and the 5700 Ultra: they're neck and neck, yet the 5700 Ultra is $30 more and you won't get HL2 with it. The 5950 Ultra and the 9800 XT? The 9800 XT comes out on top again, in all games. It's been this way for a while now.
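To make that yardstick concrete, here's a quick sketch. The prices are the ones above; the fps values are placeholders, not real benchmark results, so plug in numbers from whatever review you trust.

[code]
# Performance per dollar = fps / price. The fps inputs below are
# PLACEHOLDERS, not measurements -- substitute real review numbers.

def value(fps, price_usd):
    return fps / price_usd

cards = {
    "9600 XT ($165)":    value(60.0, 165),  # placeholder fps
    "5700 Ultra ($195)": value(60.0, 195),  # placeholder fps; $30 more, per above
}

for name, v in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {v:.3f} fps per dollar")
# At equal fps ("neck and neck"), the cheaper card simply wins on value --
# and that's before counting the bundled HL2 voucher.
[/code]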

The only thing Nvidia is good at is making Quake 3 based games run well. And Doom 3, which even John Carmack admits is mostly a DX7-level game, because it doesn't really make much use of the features supported by DX8 and DX9. But it's not as if Nvidia cards were that great at Doom 3 to begin with. They've spent so much time optimising for Doom 3 alone that it's ridiculous, to the point where they still suck at HL2 and most other DX9 games. You don't want to buy a graphics card that has no future: developers are going to stop making Quake 3 based games, and while Doom 3 will power a number of engines, its licensees will undoubtedly upgrade it by adding DX9-class features.

Sure, after Doom 3 is released, DX9 might finally get some attention from Nvidia, but as it stands, not only can you get better-performing cards from ATI, they're cheaper too. Nvidia needs to look at real benchmarks, not its faked ones, and realize that if it's going to offer inferior performance, it needs to lower prices, because sales are going to start dropping fast.

After my Voodoo2 I've always owned an Nvidia product, be it a TNT2, a GeForce2 MX, or a GeForce3, but after much comparison of benchmarks and prices, you just can't beat ATI's 9000 series; across the board they have better deals going. I was going to buy a 5600 Ultra until I found out that Nvidia re-released it with upgraded parts because the first one wasn't very good, and now at most retailers you can't tell which one you're getting: the original lousy version or the new decent one.

And now you have some great promotions from ATI. An HL2 voucher? Hell yeah. I don't care that the game won't come out for a while; anyone who wants HL2 would find that a good deal. I'm buying a 9600 XT for $10 more than a 9600 Pro, which used to be the best value in the midrange market until the XT arrived, so I'm getting a $165 card with a free $50 game. What did Nvidia scrape up? Call of Duty? Hardly a graphical powerhouse, but I guess it suits their needs; what runs better on Nvidia hardware than a Quake 3 based game? I'm not interested anyway, because the game isn't that great. It's just an attempt to steal back the buyers they've lost to ATI, and it won't work, because once again the benchmarks show Nvidia losing to ATI in Call of Duty.

Nvidia may have higher texel fill rates, but that just can't compensate for inadequate performance in new games and a higher price than the competition it regularly loses to. Nvidia's last card to beat the competition was the Ti 4600; the whole FX line never made it. Unfortunately, Nvidia's sales department never noticed, and has priced all of their cards too high, consequently losing many a customer. This isn't Intel vs AMD: there aren't nearly as many fanboys here, and there aren't that many "loyal" customers. You buy whatever gives you the best bang for your buck, and Nvidia hasn't delivered that in more than a year.
 
While I agree with you that ATI has better products, you have missed a few points in your rant. First off, Doom 3 is an OpenGL game, not DX. OpenGL performance has always been better on Nvidia cards, and that trend continues even with the FX line; but the shader units in the FX line are crippled, so I have some serious doubts about how well it will perform in Doom 3.

I also agree that if you plan on buying HL2 anyway, it's a good idea to get the card that comes with it, if performance-wise there is no difference. But I don't believe HL2 should be the reason you buy a Radeon card; I think the numbers should speak for themselves on that one. And the fanboy comment: I only wish it were true...
 
There's a chance I might grab a $99 Radeon 9600 XT a year from now just for the HL2 voucher. $50 for the game and $50 for the card is a fine deal, but my Ti 4200 will remain my primary card.
 
That promotion will probably be over by then, my friend. And I still can't believe you're even comparing a 9600, much less the XT, to your lowly 4200. But whatever floats your boat; if you like gaming with sucky IQ (there really is no debating image quality between the two), then more power to you.
 
I wouldn't judge the merits of ANY computer chip by its clock speed and its silicon feature size. Saying a chip is "better" because it runs at 4 GHz and is built on a 0.06 micron lithography process is flawed logic. What exactly is "better" about either of those things if the chip isn't able to perform?

Using your logic, one of the quantum computers buried somewhere under MIT is better than any Intel or AMD processor, because it runs at some unbelievable number of terahertz (orbits of the electrons around the nucleus of the atom) and is built on a process smaller than 10 nanometers (0.01 micron). But is it actually BETTER?

No, not really. Unless you want to calculate sin(x-1) or pi to the 40-millionth digit and do nothing else.

Better is a purely subjective term; "not as good" is also a purely subjective term. So let's put these into hard facts:

The NV30 and NV35 are slower than the R300 and R350 processors when computing pixel shaders.
The NV30 and NV35 deliver less clean output than the R300 and R350 processors when rendering antialiased scenes.
The NV30 and NV35 are slower than the R300 and R350 processors when running at full DX9 pixel color depth.
The NV30 and NV35 produce much darker color saturation than the R300 and R350 processors in DX8 scenes. This may not be "bad" in someone's eyes; it's simply different.

There are many others that could be listed just as factually, but how you interpret those facts is what will eventually form your impression of the FX series of cards.
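To put the DX9 color-depth item above in numbers, here is a small sketch. The mantissa widths are the standard ones for these formats (FP16: 10 bits, ATI's FP24: 16 bits, FP32: 23 bits); the epsilon estimate is a deliberate simplification.

[code]
# Relative precision ~ 2^-mantissa_bits for each shader float format.
# NV3x runs fast at FP16 and slow at FP32; R3x0 runs everything at FP24.

formats = {
    "FP16 (NV3x partial precision)": 10,  # 10 mantissa bits
    "FP24 (R300/R350 native)":       16,  # 16 mantissa bits
    "FP32 (NV3x full precision)":    23,  # 23 mantissa bits
}

for name, mantissa_bits in formats.items():
    print(f"{name}: ~{2.0 ** -mantissa_bits:.1e} relative precision")
# NV3x has to choose between FP16 (visibly coarser than ATI's FP24)
# and FP32 (finer, but much slower on that hardware).
[/code]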

My general impression of the FX series is that NV was caught with their pants down. The latest NV35 cores are certainly better than the uber-crappy NV30s that came out in response to ATI's 9500/9700 line about 18 months ago. But better or not, the entire NV3x series leads me to believe that this hardware platform is fundamentally flawed and will require a full redesign to bring its performance back to a competitive level with ATI's offerings.

Generally speaking, the FX architecture seems very inefficient to me. It's similar to the ever-ongoing AMD vs Intel wars: do you want high MHz with less work per clock cycle, or lower MHz with a lot more work per clock cycle? The net result has come out essentially the same, at least in most cases, but how they got there is significantly different.

I'm a results *****. Give me the best results and I'll use you, your service, or your component. ATI is giving me the best results right now, so that's what I'm buying. It wasn't always so; ATI sucked pretty hard in their Rage128 and 7500 series of cards. It wasn't until they rolled out the 8500 that I even blinked in their general direction -- my previous four video cards were all NV equipment (Riva128, Riva TNT2, GeForce3 Ti200, GeForce4 Ti4200). I bought those because they were giving me the results I wanted.

Maybe in their next iteration or two, they will again start giving me the results I want.
 
...exactly. The 5900 will ALWAYS win over the 9800 in 3DMark. Why? Because NV has optimised for it. ATI doesn't need optimising whatsoever.

Just try to beat an overclocked 9800 in an actual game rather than 3DMark (can anyone say Halo?). The fact that ATI takes hardly any performance hit from aniso/AA is fantastic. Get rid of those jaggies and enjoy games "the way it's meant to be played".
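If you want to put a number on that AA/aniso hit from any review, it's just this (the fps values below are placeholders, not measurements):

[code]
# Percent performance hit from enabling AA/AF, given fps with and without.

def hit_percent(fps_off, fps_on):
    return 100.0 * (fps_off - fps_on) / fps_off

print(f"{hit_percent(100.0, 90.0):.0f}% hit")  # e.g. 100 -> 90 fps = 10% hit
[/code]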
 