
GF4MX Vs GF3


-=Mr_B=-

Member
Joined
Aug 18, 2002
Location
Sweden
I read the question "why does the GF3 always beat the GF4MX cards", or something similar, somewhere in the threads here.

The reason, I've been told, is a marketing trick.

Remember the old GF2MX chip? All they did was basically raise the speed and rename it... no wonder it got poor results, but the card sells better if it says "GF4" at the beginning, doesn't it? :- )

I wouldn't know; I just know what I've been told by an nVidia fan, and that's what he says... of course, he still says nVidia has the best card too, just wanted to mention that before anyone asks.

If anyone is able to 100% confirm or deny this, please do so, but I think my friend is right. He doesn't have a habit of spreading rumors or misleading info... other than about how much money he makes... he says "none, I'm in it for the fun".

Take care.
B!
 
The Geforce3 core was extremely different from older cores; its architecture was new, and a very large part of it was dedicated to pixel and vertex shaders, making it a DirectX 8 card. It performed very well compared to a GF2 Ultra, beating it by a large margin in most cases. The GF4 is basically an update of the GF3, with some refinements added along with a second vertex shader.

The GF4MX, on the other hand, is a refinement of the GF2. It's not really a GF2MX, since it's not crippled. However, it doesn't have pixel or vertex shaders. Thus, not only does it perform little better than a GF2, it's not even DX8 compatible.

However, the GF4MX is still significantly different compared to the GF2, and is MUCH faster than the GF2MX. While its architecture is similar to that of a GF2, it's not the same by far.
 
Ok, so in some ways it is the same as the GF2 but still a new thing; makes sense. It's still a marketing trick to name the card "GF4" if it's not based on the GF4 core... but then of course, they could claim it IS based on the GF4Ti core and that they just removed everything necessary to make it cheap enough.

Thanks for the clarification,
I'll mess with my friend about it later, but not too much; in general he was right.

Thanks!
B!
 
I have a Jaton GF4MX440. I must say it's a kick *** card, able to run the Unreal Tournament 2k3 demo at 1024x768 with all details maxed and 16 bots without a hiccup :)
 
GF4MX cards are made for general use and 2D multimedia (e.g. DVD, HDTV), not 3D.
Even so, I get about 100FPS in Descent 2 with my GF4MX 420.
It also runs Fear Factor smoothly with zero CPU usage.
 
Yes, the Geforce4MX is based on the Geforce2. It is basically a souped up GF2.

The GF3 is faster and more advanced (DX8 as opposed to DX7) in every aspect compared to any Geforce4MX card....
 
star882 said:
GF4MX cards are made for general use and 2D multimedia (e.g. DVD, HDTV), not 3D.
Even so, I get about 100FPS in Descent 2 with my GF4MX 420.
It also runs Fear Factor smoothly with zero CPU usage.

Descent II is a DX5 game. I would be VERY concerned if the MX420 didn't run it at 100fps, and if it weren't for vsync or built-in game fps limiters, you should be getting INSANE fps.

But anything with the Geforce name is made for 3D acceleration. Your card should even run UT2k3; not very well, but it should run.
 
Like I said, my GF4MX440 runs the UT2k3 demo at 1024x768 with max details on everything and runs as smooth as butter on a baby's butt.

It runs at 1280x1024 with everything set to max except something small, like blob shadows and character details and decals, or something.
 
Beast Of Blight said:
Like I said, my GF4MX440 runs the UT2k3 demo at 1024x768 with max details on everything and runs as smooth as butter on a baby's butt.

It runs at 1280x1024 with everything set to max except something small, like blob shadows and character details and decals, or something.

the MX420 is SDRAM ;)
 
Beast Of Blight said:
Like I said, my GF4MX440 runs the UT2k3 demo at 1024x768 with max details on everything and runs as smooth as butter on a baby's butt.

It runs at 1280x1024 with everything set to max except something small, like blob shadows and character details and decals, or something.

I don't want to call you a liar, so I'll just assume you've never played on a card capable of... more.
 
Beast Of Blight said:
Like I said, my GF4MX440 runs the UT2k3 demo at 1024x768 with max details on everything and runs as smooth as butter on a baby's butt.

It runs at 1280x1024 with everything set to max except something small, like blob shadows and character details and decals, or something.

:rolleyes:
 
Well, I don't wanna spoil your fun, guys, but he might be happy enough with it? Heck, I'm on an integrated i810 chipset, AND I have played both Unreals, both GTA3 games, and most of the new and fancy RPG games... There are games I can't play, games that force 32-bit mode, since Intel took the easy way out and uses 24-bit as max. No, nowhere near max graphics, but it kept me happy enough for years; I'm only getting a new card because I'm getting a new PC.

To each his own; if your taste costs you loads of cash, then my taste suits me better, I've got no gold.
B!
 
funnyperson1 said:


the MX420 is SDRAM ;)

The Radeon 9800Pro uses SDRAM too :rolleyes:

Ok, ok I see your point :D


The GF4MX is a souped-up GF2MX. It's the same old core with a multisampling AA unit added, a memory controller from the GF4Ti series (proper DDR support with a 128-bit bus), and considerably higher clockspeeds. The basic architecture is still the same 2x2. In its time the GF2MX was a wonderful low-cost card; it performed pretty much like a GeForce256 SDR (+ much better OC potential).
 
Well, I wouldn't say it's a GF2MX. After all, doesn't it still have all four pipelines enabled, as well as use DDR? I'd compare it more to a GF2GTS.
 
It is based on the GF2MX architecture, which is based on the GF2GTS ;) To my knowledge it has only 2 pipelines with 2 TMUs per pipeline. In multitexturing games it's almost as good as the 4x1 in the GeForce256. The GF2GTS is a 4x2 architecture.
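To put rough numbers on the pipeline × TMU layouts mentioned above, here's a quick theoretical fillrate calculation. The clock speeds are approximate stock values from memory, not figures from this thread, so treat it as a sketch rather than a benchmark:

```python
# Rough theoretical fillrate comparison for the cards discussed above.
# Clock speeds (MHz) are approximate stock values -- my assumption,
# not numbers taken from this thread.
cards = {
    # name: (pixel pipelines, TMUs per pipeline, core clock in MHz)
    "GeForce256": (4, 1, 120),
    "GF2MX":      (2, 2, 175),
    "GF2GTS":     (4, 2, 200),
    "GF4MX440":   (2, 2, 270),
}

for name, (pipes, tmus, mhz) in cards.items():
    pixel_mps = pipes * mhz          # Mpixels/s, single-texturing
    texel_mps = pipes * tmus * mhz   # Mtexels/s, multitexturing
    print(f"{name:10s} {pixel_mps:5d} Mpixels/s  {texel_mps:5d} Mtexels/s")
```

Even with its higher clock, the 2x2 GF4MX440 ends up with less texel fillrate than the 4x2 GF2GTS, which is why the pipeline layout matters more than the "GF4" name suggests.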
 
Ah, I don't care if you all call me a liar or not.

I am extremely satisfied with my MX440; it has performed well beyond what I expected from all the dissing I've heard about GF4MX cards.
 
I personally can't believe that a card that scores 6-7000 points in 3DMark2001 would run UT2003 at 1024x768 max detail at full speed. I've yet to try UT2003 though, so I could well be wrong, but seeing as I own a GF4 MX440, I can tell from my own experience (when it was sitting in my XP18700 system, which crushes a Celeron 2.0GHz) that it is highly unlikely.
 
Okay.

Just look at it this way. Why would I lie about it? What possible gain could I get from lying about it?

All I know is what I play and what I see, and what I see is what I've said. But as with anything, experiences may vary. So, ah well.

Just smile and nod
 