
GF4-MX480 misleading!!!


Lancelot

Member
Joined
Feb 12, 2001
Location
the Netherlands
If you want the fastest GF4-MX and don't need AGP 8x, get the MX460!!! The name MX480 seems to indicate it's the fastest card of the GF4-MX line, but that's false! I've been searching the web on these new cards, and over and over again the specs show the name 'MX480' being used for a card with an MX440 chipset that can handle AGP 8x!!! So it may be the fastest MX in transfer rates, but NOT in core and memory speeds!

Here are MX480 specs from a German site:
__________________________________
VGA card Albatron GeForce4 MX480 64MB TV-Out DVI

Powered by nVIDIA GeForce4 MX440-8X GPU
AGP 8X/4X/2X with AGP Texturing and Fast Writes
64MB DDR Memory
Supports Twin-View & TV-OUT
D-Sub & DVI Ports
Software bundled: WinDVD
__________________________________

MX480: 270MHz core / 512MHz mem, AGP 8x
MX460: 300MHz core / 550MHz mem, AGP 4x
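For what it's worth, if you assume both chips sit on the usual 128-bit DDR memory bus (my assumption, the spec sheet above doesn't state the bus width), the raw memory bandwidth at stock clocks works out like this. Just a rough back-of-the-envelope sketch:
__________________________________
# Peak memory bandwidth sketch; assumes a 128-bit DDR bus on both cards
BUS_WIDTH_BITS = 128  # assumed, not taken from the spec sheet above

def bandwidth_gb_s(effective_clock_mhz):
    """Theoretical peak bandwidth in GB/s at the given effective (DDR) memory clock."""
    bytes_per_transfer = BUS_WIDTH_BITS / 8   # 16 bytes moved per transfer
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

print("MX480 @ 512MHz: %.1f GB/s" % bandwidth_gb_s(512))   # ~8.2 GB/s
print("MX460 @ 550MHz: %.1f GB/s" % bandwidth_gb_s(550))   # ~8.8 GB/s
__________________________________
So at stock speeds the MX460 still has more memory bandwidth, which is exactly the point.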
 

radadman

Member
Joined
Oct 15, 2001
What you say is true, but read on and you'll find that the 480s (with 2.8ns memory) clock easily to 400/700 and higher, making them a lot faster than any 4ns MX440/460.
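The 2.8ns rating is the key. Using the usual rule of thumb that rated clock ≈ 1000/ns (a sketch, not an exact spec), here's where that 700 memory figure comes from:
__________________________________
# Rated memory clock from the RAM's ns rating (rule-of-thumb sketch: MHz ~= 1000 / ns)
def rated_clock_mhz(ns_rating):
    return 1000.0 / ns_rating

for ns in (2.8, 3.3, 3.6, 4.0):
    real = rated_clock_mhz(ns)
    print("%.1fns RAM: ~%.0fMHz real, ~%.0fMHz DDR effective" % (ns, real, 2 * real))
# 2.8ns -> ~357MHz real / ~714MHz effective, which is why 700 on the memory is realistic
# 4.0ns -> ~250MHz real / ~500MHz effective
__________________________________
The 400 core figure is a GPU overclock on top of that; the memory rating doesn't tell you anything about the core.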
 
OP
Lancelot

Member
Joined
Feb 12, 2001
Location
the Netherlands
OK, the MX480 cards I saw had 3.3ns memory compared to 3.6ns for the MX460, but I figured that had something to do with the 8x interface. So you're saying you could get the 270MHz core up to 400!? I don't think so! Maybe with AGP 2x, yes, but that's not what one would buy this card for...
 

quegyboe

Member
Joined
Dec 20, 2000
Location
BC Canada
Maybe with AGP 2x yes, but that's not what one would buy this card for...
Why do you say that? Is it because of the 3.3V signal at 2x? Does that increase the OC limits on the GPU and mem?
 
OP
Lancelot

Member
Joined
Feb 12, 2001
Location
the Netherlands
Nope, the 3.3V never gets to the GPU! The vidcard has its own voltage regulator. When it gets 3.3V from the slot it steps it down to 1.5V; when it gets 1.5V (on newer mobos) it just uses that...