
DVI vs VGA Connectors

I tried it both ways with my new Samsung 940B 19", and there's definitely a difference.
I'm not a gamer at all; text is my life. Text over the analog cable (same computer, same AGP card, same settings) is without a doubt blurry.

Using DVI, it's not as perfect as my 21" NEC CRT was, but it's so much better than analog that even my wife (with astigmatism and expensive glasses) could see DVI was better.
 
slightly off-topic but...

One aspect that rarely enters the debate is the quality of the components used for VGA and DVI on the controller board. As with many other products, manufacturers often try to save money on parts (chips, inputs/outputs, and other odds and ends). And when a manufacturer chooses to "go cheap" on a DVI+VGA controller board, which side do you think they'll skimp on? (Hint: nowadays it's likely to be VGA.)

It's almost an apples-to-oranges debate IMHO; "DVI vs VGA" is too generic a framing for the topic. What people neglect to compare is a DVI LCD vs. a 100% VGA-only LCD. Of course you should use the DVI port if it's there, but not because it's inherently 100% better at the moment; rather, because the manufacturer likely gave its VGA implementation short shrift ;).

In my own experimentation I compared my CMV-522A VGA LCD against my BenQ FP591 DVI LCD, and let me assure you the CMV equals or surpasses it in *almost* every way. The CMV was also noticeably better than the BenQ in VGA mode. The BenQ FP591's VGA implementation was certainly given short shrift, even though it has a supposedly high-end Pixelworks chipset onboard.

BTW: I was made aware of all this by hardcore LCD enthusiasts, people who don't buy the prepackaged stuff but actually buy all the parts from Far East merchants and put everything together themselves. They mix and match the parts and features they need, such as desired resolution, types of inputs/outputs, SOG, and a host of other features; heck, they even buy their own film to coat the LCD panels with, and not the cheap film you've seen linked around here lately ;). I saved the links to some of these merchants but can't find them for the life of me...
 
An LCD is digital no matter which way you look at it. DVI is the best choice if the monitor has both VGA and DVI. Having a Dell 2405FPW and switching between the two, yes, you can tell a difference, and DVI is crisper.

Now, I have an older 17" NEC LCD that is analog-only. On that, the picture actually looked really good and still does. But now that I look at it after owning my 2405FPW, I would really love to compare the VGA and DVI models from when they were released 2+ years ago. I wonder if it would make any difference.
 
a test

I've connected my 2005FPW to my 7800 GTX via both analog (VGA) and DVI. The panel is at its native resolution, and the display has been 'cloned' across both outputs.

I've snapped a series of 8 images at the same focal length, exposure time, etc.: 4 of my desktop and 4 of an image test; 4 are DVI and 4 are analog. I will present 4 cropped images, two of which are DVI and two analog. I made no adjustments to any of the images, and if desired, they will be available in CRW format after the 'experiment'. I'll call them "X", "V", "D", and "R".

The PNG image follows each letter.

X
x3tu.png


V
v5ip.png



D and R are rather large in size, to prevent 56k death I've linked them instead.

D

R

Respond with the two you believe are DVI. I'll post the 'answers' tomorrow night and make the source images available.

EDIT: The images were shot on a Canon EOS 10D in raw format on a tripod and on manual settings, which were unchanged.
 
I think D is DVI without a doubt. The text samples I can't tell apart. I will guess X for DVI, LOL.
 
X and R are in fact the DVI images. I wish more people had responded with their guesses. I won't draw any conclusions, but I'll say this: you can lead a horse to water... I am in the process of uploading the CRWs to FileFront. You'll need a CRW processor to view them. Will upload PNGs later.
 
Something I'd like to add is that the resolution you're running may make a difference in how visible the gap between the connectors is. If you look through this thread, the people who claim not to notice a difference are mostly on 17" screens, probably at 1280x1024. My theory is that there isn't much loss with an analog signal at lower resolutions, but once you get up to a higher-resolution screen the difference becomes more apparent.

I thought D and X
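That resolution theory lines up with the raw numbers: the analog link has to carry every pixel as a voltage level, and the pixel clock grows quickly with resolution, so any cable or DAC/ADC weakness shows up sooner at higher modes. Here's a rough back-of-the-envelope sketch; the ~25% blanking overhead is my own approximation, not an exact VESA timing:

```python
# Rough pixel-clock estimate: active pixels * refresh * blanking overhead.
# Higher pixel clocks stress the analog VGA path (DAC, cable, panel ADC)
# harder, which is one plausible reason blur is more visible on big panels.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    """Approximate pixel clock in MHz; 'blanking' is an assumed overhead."""
    return width * height * refresh_hz * blanking / 1e6

for w, h, hz in [(1280, 1024, 60), (1920, 1200, 60), (2304, 1440, 80)]:
    print(f"{w}x{h}@{hz}Hz ~ {approx_pixel_clock_mhz(w, h, hz):.0f} MHz")
```

By this crude estimate a 17" panel at 1280x1024 needs roughly half the analog bandwidth of a 24" at 1920x1200, which would fit the pattern of who in this thread notices the difference.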
 
Man, as soon as you save the pics to your computer, crop them, convert formats, or resave them, your software re-interpolates the pixel data, period... so I haven't even looked at your pics.
I'm sorry that your experience doesn't seem to match the majority. I don't know if that's due to the cord, the card, the software, or Jupiter's alignment with the pipes in your basement, but it's not my experience.
DVI plainly wins over VGA on my desktop. There's none of the grey fringe on text that I get using VGA; just crisp, sharp black letters.

Are you running the monitor maker's recommended resolution with both? That made a huge difference in my DVI's clarity.
 
http://files.filefront.com/890CANONpngzip/;4755480;;/fileinfo.html

PNG format for everyone. Twice the size.

@fool: I've tried multiple different cables and even video cards (different machines too); the results are the same. CRW files are raw dumps from the camera's CCD; there is no 'interpretation' until you process them, and if you choose, you can leave them 'as shot'. Interpolation doesn't have anything to do with it; this isn't some cheap piece of **** 4-megapixel camera with 300x digital zoom. I posted the images as shot. Perhaps you would prefer I send the images directly to your optic nerves.


As to the slight colour difference between the images, which I believe many incorrectly used to differentiate them: I cannot directly attribute it to any of my settings. I checked my software settings (the display was cloned) and they were identical, as were the hardware settings; my only guess is that the VGA signal is slightly distorted in this manner (slightly oversaturated?). If you have CRW processing you can verify this for yourself; otherwise you can just look at the PNGs, which are unadjusted CRWs.

If you closely scrutinize the 'sharpness' of the images, you can see that the DVI is indeed better in that respect, as expected, but the difference is rather slight. All things equal, DVI *should* win hands down, but I don't think the difference is large enough to warrant paying extra for the DVI option. Perhaps some manufacturers purposely cripple their VGA models to push people in the know toward the better 'DVI' model.

CRT monitors are purely analog in nature and can produce images I would regard as far superior to anything an LCD-based display can achieve. Models like the GDM-FW900 can push 2304x1440@80Hz over a 15-pin D-sub (VGA) connector, and I don't think you'll find anyone who says they look 'bad'.
 
greenmaji said:
I could see it and my screen is a POS.. :/
The difference is usually more pronounced on cheaper/older screens because they can't convert the analog signal to digital as accurately. You can REALLY see how bad VGA is on my dad's 18.1". My 19" BenQ and my brother's 19" Samsung look almost the same between DVI and VGA.
 