
Does a Quadro FX 1800 PCIe 2.0 x16 work in the PCIe x16 slot of a Dell Precision 470?


PositiveX

New Member
Joined
May 12, 2013
Hi all,
I am upgrading a Dell Precision 470 Workstation {shipping date 8/30/2005} that has -- I guess -- a first-gen PCI Express x16 slot,
and wonder if the nVidia Quadro FX 1800 graphics card, which has a gen 2.0 PCIe x16 interface, will work OK.
If it works, what are the performance implications of putting the newer card into the older slot?
As a humble noob :chair: I am formally soliciting any advice on this matter.
 
Hey there! Welcome to the forums! :D

First off, I'd like to give you the short and sweet answer to your question: yes, the 2.0 interface is backwards compatible with 1.0. It's also compatible with a 3.0 slot should you upgrade again, so you can bring your Quadro along into a newer build. I should also ask: what are you going to be doing with this computer? I find the Quadros to be not worth what you pay for them, when you could get a "gamer" series card for cheaper that performs better than a Quadro 4000.

Hope this helps! :D enjoy your stay

-Darren :D
 
Plan to use it for learning CAD in college -- never any gaming whatsoever.
-- solid modeling in FreeCAD, AutoCAD, 3DS, Pro/E / Wildfire & 3D printing
...
? So the newer-spec PCIe 2.0 card (Quadro FX 1800, circa 2008)
will operate at the lower speed of the old-spec PCIe 1.0 slot (Dell Precision 470, circa 2005)?
...
Is there any advantage to getting a newer card --
like smaller die process 60 nm vs 130 nm =>
less heat ?
 
For CAD, nothing is better than the Quadro series, but you will pay more for them than for a normal consumer GPU.
 
No expert on the Quadros myself, but yes, everything from PCIe 1.0 through PCIe 3.0 (including PCIe 2.0) is cross-compatible; this is true of every PCIe card (AFAIK).

Also, the newer, smaller die processes use much less power (especially jumping a couple of generations down to 60nm) and thus produce far less heat for the same amount of work. Additionally, of course, newer cards are usually faster, unless say you're comparing a top-end card of the older generation vs. a bottom-of-the-barrel low-end card of the current generation.
 
A wise man (EarthDog) shared this with me.

This link shows about a 5% drop in performance on current-gen GPUs when a PCIe 3.0 x16 card is inserted into a PCIe 1.1 x16 slot.
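To put that small performance hit in context, here is a quick back-of-the-envelope sketch of the theoretical one-direction bandwidth of an x16 slot per PCIe generation. The per-lane rates and encoding overheads are the usual published figures, not numbers from this thread:

```python
# Rough sketch of theoretical PCIe x16 bandwidth per generation,
# to show why a 2.0 card in a 1.0 slot just gets less bus headroom.
# Figures are the commonly published per-lane rates (my assumption,
# not taken from this thread).

def pcie_bandwidth_gbs(gen: str, lanes: int = 16) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    # raw gigatransfers/sec and line-encoding efficiency per generation
    specs = {
        "1.0": (2.5, 8 / 10),    # 8b/10b encoding
        "2.0": (5.0, 8 / 10),    # 8b/10b encoding
        "3.0": (8.0, 128 / 130)  # more efficient 128b/130b encoding
    }
    gt_per_s, efficiency = specs[gen]
    # GT/s * efficiency = usable Gbit/s per lane; divide by 8 for GB/s
    return gt_per_s * efficiency / 8 * lanes

for gen in ("1.0", "2.0", "3.0"):
    print(f"PCIe {gen} x16: {pcie_bandwidth_gbs(gen):.2f} GB/s per direction")
```

So a gen 1.0 x16 slot still offers roughly 4 GB/s one way, which is why the real-world hit for a 2008-era card is small.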
 
Thanks to all for the info. :grouphug:
- To recap:
- Quadro is made for CAD
- PCIe compatibility = small performance hit for newer cards in older slots
- Price of a used Quadro is OK (approx. $50 budget); also need to budget for ECC RAM
.
- System: Dell Precision 470 WS - 2.8GHz Xeon
- Still to do: load OS on (2) HDDs -- WindO$ on 1 & Linux on 2
- Must be single slot width -- the next PCI slot over has the sound card
(only have 4 slots: PCIe x16 ; PCI ; PCIe x4 ; PCI-X)
.
Considering (performance at GPU review vs. price):
Card     Year  PCIe  Process  Power  Memory       Bandwidth   Price
FX 1700  2007  1.0   130 nm   45W    512MB DDR2   12.8 GB/s   $30
FX 3450  2005  1.0   130 nm   83W    256MB GDDR3  32 GB/s     $30
FX 1800  2009  2.0   ....     59W    768MB GDDR3  38.4 GB/s (- PCIe version hit)  $70
FX 3700  2008  2.0   65 nm    78W    512MB GDDR3  51.2 GB/s (- PCIe version hit)  $85
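As a side note, the GB/s column is just arithmetic: peak memory bandwidth = effective memory clock x bus width / 8. A small sketch reproducing the figures above; the bus widths and effective clocks are the commonly listed specs for these cards (my assumption -- the thread only quotes the resulting bandwidth):

```python
# Where the GB/s column comes from: effective memory clock (MT/s)
# times bus width (bits), divided by 8 to get bytes, scaled to GB/s.
# Bus widths / clocks below are the commonly listed specs for these
# cards (my assumption, not stated in this thread).

def mem_bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return effective_mts * bus_bits / 8 / 1000  # MT/s -> GB/s

cards = {
    "FX 1700": (800,  128),  # DDR2
    "FX 3450": (1000, 256),  # GDDR3
    "FX 1800": (1600, 192),  # GDDR3
    "FX 3700": (1600, 256),  # GDDR3
}

for name, (clock, bus) in cards.items():
    print(f"{name}: {mem_bandwidth_gbs(clock, bus):.1f} GB/s")
```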
.
? I am a mechanical designer -- electronics is magic to me; please explain again how heat dissipation compares for the larger (old) die process vs. the smaller one :shrug:
? Which specs matter most for CAD -- memory, since there's no animation, only static rendering?
? Any other CAD card recommendations?
Thanks again.
 
I'd pick the FX 1800 from that list. Higher video RAM is better for your use case.
 
I always thought CUDA cores were what you were looking for when using CAD... Maybe I'm wrong...

Quadro GPUs still use nVidia's CUDA cores. They're coded differently than the GeForce GPUs though.
 
I was just thinking he should go with a GeForce because they have more CUDA cores, whereas the GPU he's looking at has something like 60-something CUDA cores?
 

It's a good thought, but Quadro runs OpenGL vastly better than GeForce.
 