All this stuff about the superiority of DVI is patently rubbish; at best it's a toss-up between them. Unless your converters are _very_ bad, you will never notice a difference. In fact, my panel (a 2005FPW) works better on the analog output, though perhaps that's uncommon.

If you don't believe me and have a dual-input monitor, download NTEST and switch between the inputs on the focus and moire tests, which draw very fine patterns on the display. No difference at all. In fact, my DVI input flickers like crazy on the left side of the moire test pattern.

From my understanding, at a per-pixel level the DVI picture actually is in better focus than the analog output (the analog nature of the signal 'smears' the picture very slightly), but the difference cannot be observed at a normal distance from the monitor; your eyes simply cannot pick up that much detail. It's analogous to (but different from) the fact that even the most powerful satellites cannot resolve a dime on the surface of the earth, thanks to the diffraction limit of their optics and the distortion of light passing through the atmosphere (rough numbers at the bottom of this post).

Our eyes aren't that good. DVI vs. VGA on small panels is a bull**** argument, just like SATA vs. PATA or AGP vs. PCIe. At extremely high resolutions, where the actual bandwidth of a VGA connection becomes a problem, DVI is obviously the only way to go, but if you don't have a 30" display that probably doesn't apply to you (again, numbers below).
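For anyone who wants to check the satellite analogy, here's the back-of-envelope version in Python. It's a minimal sketch: the mirror size and orbit altitude are my own assumptions (roughly Hubble-class optics in low earth orbit), not figures for any particular bird:

    # Rayleigh criterion: smallest resolvable angle ~ 1.22 * wavelength / aperture.
    # Aperture and altitude below are assumptions, not real satellite specs.
    wavelength_m = 550e-9  # green light, middle of the visible band
    aperture_m = 2.4       # assumed primary mirror diameter (Hubble-class)
    altitude_m = 300e3     # assumed low earth orbit

    theta_rad = 1.22 * wavelength_m / aperture_m
    ground_resolution_m = theta_rad * altitude_m
    print(f"best-case ground resolution: ~{ground_resolution_m * 100:.0f} cm")

    # -> ~8 cm, and that's before atmospheric turbulence makes it worse.
    # A dime is about 1.8 cm across, so reading one from orbit is hopeless.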
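And here are the rough numbers on the bandwidth point, same deal: the 25% blanking overhead is my ballpark assumption (reduced-blanking timings come in lower), but the 165 MHz single-link pixel-clock ceiling really is in the DVI spec:

    # Rough pixel clock: active pixels * refresh rate * blanking overhead.
    SINGLE_LINK_DVI_MHZ = 165  # max pixel clock for single-link DVI

    for w, h in [(1280, 1024), (1680, 1050), (2560, 1600)]:
        clock_mhz = w * h * 60 * 1.25 / 1e6  # 60 Hz, ~25% blanking (assumed)
        verdict = "fine on single-link" if clock_mhz <= SINGLE_LINK_DVI_MHZ \
            else "needs dual-link"
        print(f"{w}x{h}@60Hz: ~{clock_mhz:.0f} MHz -> {verdict}")

Everything a 2005FPW-sized panel runs lands comfortably under that ceiling, and well within what a decent VGA DAC handles. It's the 2560x1600 of a 30" panel, at roughly 300 MHz, where a VGA link really starts to struggle and even single-link DVI is out of spec, which is exactly why those panels ship with dual-link DVI.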