
This is a good article on how the GF4 MX series should be regarded.


ataxy
Member
Joined: Feb 12, 2002
Location: Montréal, Qc, Ca

Greg 'KillerG' Wood

Offered By:
Evil Smokin' Dude's House O' 3D
http://www.evil3d.net

Since NVIDIA introduced the GeForce4 line-up for Spring 2002, there has been a fair bit of discussion as to the validity of the naming conventions used for the various card configurations. The major bone of contention for some folks is that both the NV17 and NV25 GPUs have been included under the single "GeForce4" generation name. While the NV17-based configurations have been tagged "GeForce4 MX", the NV25 configurations have been designated "GeForce4 Ti", or Titanium if you will.
As you may or may not be aware, the NV17 does not offer the same programmable vertex and pixel shading capabilities as the NV25 GPUs. It does, however, offer some other features that were first introduced in the NV20, or "GeForce3", generation: several "in hardware" antialiasing modes and a new memory architecture. The memory architecture upgrades have helped to greatly increase the speed and efficiency of the MX cards.

The real question that we are going to try to settle here is whether the MX cards deserve to be included in the GeForce4 generation.

NVIDIA's generations can be said to be only six months in length, but every second "generation" announcement has not been a completely new GPU but rather what we have been referring to as a "refresh". A refresh has never included a change in the "GeForce" numbering, only an alteration of the specific card configuration designations. For example, in the generation previous to the GF4, the GF3's six-month refresh introduced the "Titanium" or "Ti" naming and brought a couple of slight alterations to the GF3 configurations: the GF3 Ti200 and GF3 Ti500. The Ti200 was a slightly slower, price-reduced version of the original GF3, while the Ti500 was a slightly juiced-up version that commanded a premium price. Does this actually make for a new "generation" of products? In our opinion, no, it does not. What makes for a new generation, in our opinion at least, is a significant addition to a GPU's functional assets. Mind you, this is just an opinion, but it is the premise that we will follow in this article.

The functional assets of both the MX and Ti configurations have been significantly changed: the MXs gain in-hardware AA and the new two-way memory controller, the Titaniums gain a second vertex shading unit, and both gain nView technology. These functional asset changes, again in our opinion, certainly make for a new generation.

Naming Criteria
What is important about all that business is that we have established what we mean when we use the term generation, and that NVIDIA, according to their brief history, have pretty much released a new generation about every twelve months. Along the way they have also made naming convention changes at each new generation launch: the TNT became the TNT2, the GeForce 256 became the GeForce2, and so on.
The second question that we are going to explore to some degree is whether the GF4 MX cards will carry the buyer through to the next major generation, or if said buyer will be left feeling a little jilted within the next year when they cannot play the latest game. As we do not have any crystal balls lying about with which to peer into the future of gaming, we will have to resort to the tools we do have at hand. The best tool we currently have for looking at a future where programmable vertex and pixel shaders are taken advantage of is DroneZ from Zetha gameZ.

If other developers take the approach that the DroneZ dev team has, they will do what they can to ensure their games are playable on non-current hardware. This is, and has been, pretty much the standard way to design games, and it only makes sense if developers wish to have their games appeal to the largest possible audience. DroneZ has done this by including emulation code that allows your system's CPU to emulate the functions carried out by the programmable shaders, as well as by adding rendering modes that take advantage of several combinations of rendering features for legacy cards.
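
To make that fallback idea a bit more concrete, here is a minimal sketch, our own illustration rather than anything from the DroneZ source, of how a game might probe the driver's OpenGL extension strings and pick a default rendering mode. It assumes an NVIDIA OpenGL driver and uses GLUT only to obtain a context; note that a driver can advertise GL_NV_vertex_program on an MX and emulate it on the CPU, which is why a game still wants to leave the final mode choice to the user.

[CODE]
// Our own minimal capability-probe sketch (not DroneZ code).
// Assumes GLUT is installed; the extension strings are the real NVIDIA ones.
#include <GL/glut.h>
#include <cstring>
#include <cstdio>

static bool advertised(const char* name) {
    // glGetString needs a current GL context, created below via GLUT.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != 0 && std::strstr(ext, name) != 0;
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("caps probe");

    bool vtxProgs   = advertised("GL_NV_vertex_program");     // programmable vertex shading
    bool texShaders = advertised("GL_NV_texture_shader");     // programmable per-pixel path
    bool combiners  = advertised("GL_NV_register_combiners"); // GF2-class configurable path

    if (vtxProgs && texShaders)
        std::printf("Default: GF3 High Quality style mode (may still be CPU-emulated!)\n");
    else if (combiners)
        std::printf("Default: GF2 Bump style mode\n");
    else
        std::printf("Default: basic legacy mode\n");
    return 0;
}
[/CODE]

The idea is simply that the game defaults to the heaviest path the hardware can genuinely handle, which is the comparison we set up below.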

For comparative purposes, we will be using the GF3 Bump High Quality and the GeForce2 Bump settings with a few tweaks of our own. These tweaks do not touch the rendering method options; they are adjustments to the texture and frame buffer settings. By default, the benchmark uses 16-bit textures and a 16-bit frame buffer; we have raised these to 32-bit textures and a 24-bit frame buffer. Aside from those changes, nothing has been altered in any way. All our testing was done, and all of the screenshots were taken, at a resolution of 1024x768.

One of the features that NVIDIA is leaning on to justify the inclusion of the new MX cards in the GF4 line is Antialiasing [AA]. Therefore we will be testing using the AA modes. Also up for grabs, and much talked about of late, is Anisotropic Filtering [AF]. As this does help to enhance the image quality offered by these cards, we will test with it in use as well.
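
These modes are typically forced through the NVIDIA driver control panel rather than requested by the game, but for the curious, the sketch below (again ours, not the benchmark's code) shows what a 2X anisotropic request amounts to at the OpenGL level when an application opts in per texture. It assumes the real GL_EXT_texture_filter_anisotropic extension and uses GLUT only to obtain a context.

[CODE]
// Our own illustration of an application-side 2x anisotropic filtering request
// (the benchmark itself may rely on driver panel settings instead).
#include <GL/glut.h>
#include <cstring>
#include <cstdio>

// Older gl.h headers may not define these; the values come from the
// GL_EXT_texture_filter_anisotropic specification.
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("af probe");  // a current GL context is required before any GL call

    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (ext == 0 || std::strstr(ext, "GL_EXT_texture_filter_anisotropic") == 0) {
        std::printf("Anisotropic filtering not supported on this card/driver.\n");
        return 0;
    }

    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    std::printf("Maximum supported anisotropy: %.0fx\n", maxAniso);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Ask for 2x (the Ti 200's "+2XAF" setting in this article), clamped to
    // whatever the card reports as its maximum.
    GLfloat requested = 2.0f;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    requested < maxAniso ? requested : maxAniso);
    return 0;
}
[/CODE]

Either way it is requested, the cost shows up in the +AF numbers below.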

So, now that we know where we are going with all this, let's take a look at the numbers and then try to assess just what they mean when you are at the computer parts store, standing in front of a shelf full of video cards with widely varying prices and features.

Performance Tests
First, let’s see what the performance comparison looks like when the GF4 MX must use CPU emulation in order to run the GF3 High Quality test.

GeForce4 MX 440: 19.28 fps
GeForce3 Ti 200: 86.71 fps

Ouch, now that is indeed a nasty performance hit: 19.28 fps versus 86.71 fps is roughly a 4.5x drop. And the P4 1.5GHz test system is certainly no slouch. CPUs obviously suck at this emulation thing, at least in this instance. Obviously, games making similar demands of your system would not be playable at all with an MX card. This is not the end of the story, though. Rather, this is where we get to one of the subjective parts of this article. Now we must figure out how the different hardware-specific modes compare in performance and appearance. We will be using the GF2 Bump mode for the GF4 MX 440 and the GF3 High Quality mode for the GF3 Ti 200 to test the speed at which the game runs in these different modes on the differing cards.

We will also provide several screenshots so that you can compare the results of the differing rendering approaches. As these screenshots are taken during the rolling demo, it is almost impossible to get each one exactly the same as the others, but we have done our very best to get them as close as possible. Simply click on a result to see the corresponding screenshot. We will run through all the various AA modes that are usable* [according to our testing], namely 2X and NVIDIA's Quincunx [QCAA], with the MX 440's single available AF mode [+AF] and the Ti 200's 2X AF mode [+2XAF] enabled, as well as with both features disabled, so that you can compare speed and quality results.

GeForce4 MX 440 [GF2 Bump] (fps)
NoAA/NoAF: 77.38
2XAA/NoAF: 60.41
2XAA/+AF: 54.30
QCAA/NoAF: 60.80
QCAA/+AF: 53.82

GeForce3 Ti 200 [GF3 High Quality] (fps)
NoAA/NoAF: 78.77
2XAA/NoAF: 58.76
2XAA/+2XAF: 52.82
QCAA/NoAF: 50.44
QCAA/+2XAF: 46.28

We can plainly see that the MX 440, when running in the mode designed for older GF2-based cards, completes these benchmarks with results that essentially match or better those of the Ti 200 in the GF3-compatible mode. Not too shabby, provided the resultant image quality does not suffer from the lack of advanced feature use.

Now, we will leave it up to you to decide just how the presentation of the game scenes compares from one mode to the other and how the AA and AF features serve to improve the quality of each. What we cannot do is show you what these game scenes look like in motion, which is a little unfortunate, as there is certainly opportunity for some things to escape a screenshot. However, what we can tell you is that KillerG has a working copy of the game itself and he swears that it remains a remarkable visual feast when using the MX 440. We have also played some AquaNox [as much as we could stand!] using the MX 440 with satisfactory results.

*Note: the MX 440 and the Ti 200 simply do not have the power to run this benchmark using the 4X AA mode, so we will waste neither our testing time nor your reading time with those results.

Conclusion
It has been noted around the net that John Carmack himself, the legendary guy behind the Doom and Quake series of games, has said that the GF4 MX is not what he had in mind for the upcoming Doom III. That comment aside, one must remember that the enthusiasts [who will be demanding the ultimate in game performance] are not purchasing MX cards, and that, as the DroneZ example testifies, developers will tend to look after those folks who have less than the ultimate in hardware as gaming advances.
Once you have taken a good look at the frame rate and screenshot evidence, you should be well able to form your own opinion as to whether the GeForce4 MX cards are deserving of the "4" or whether NVIDIA is trying to scam the innocent and unwitting with this naming convention.

You should, in conjunction with our reviews of the MX 440 and our investigation of AA and AF quality in our AA/AF performance article, also be well equipped to decide if the GeForce4 MX cards are the right solution for you.

If one researches these MX cards a bit, they will see that they are certainly not GF2s. They are a whole new animal, and they certainly deserve to be designated as such.

In our learned opinion, NVIDIA has not run afoul of anything here. They have done what should be done to bring their current cards under a common naming convention based on their abilities. The MX designation should be more than enough to distinguish the functional differences within the GF4 series. If it is not, then people are simply not checking out what it is they are considering for purchase. NVIDIA should not be expected to change their approach to generation naming for those too incompetent or too lazy to bother doing at least some research into what they are buying.

Plus, we fear that there are a lot of folks who have not had the opportunity to use a GF4 MX but are forming opinions on this issue just the same. We have had plenty of opportunity to run tests and play games with our MX 440, and we are simply floored by how well this card performs for under $150 [USD]. Would we buy one ourselves? No, we would not, but we are spoiled rotten brats and we never settle for less than the fastest and most completely equipped graphics cards available. Would we recommend them for purchase to our friends who are not as serious about gaming as we are? You are damned right we would. We would be foolish to do anything less for our friends. GeForce4 MXs all around for the gamers out there who don't desire to cough up a week's salary for a video card. This card is definitely a very good deal and is, in our opinion at least, completely deserving of the name "GeForce4". End of story.
 