
DVI vs VGA Connectors


mudpark41 (Member, joined Sep 25, 2004, New York)
I just got a 19" BenQ 4ms LCD monitor. It has both DVI and VGA connectors on the back. I have tried both; with DVI there is no auto adjust, while the VGA connector does auto adjust. Are there any other differences between DVI and VGA? I can't see any.
 
Also, the color balance is skewed by the needless D-A and A-D conversions inherent in using the analog input.
 
Go with DVI.

As larva already said, there's no conversion process involved, and because it's a digital signal you can run a longer cable than VGA without degradation.
 
DVI means crisper text. I have compared the two head to head, and quite simply, DVI is the only way to go.
 
At least with higher-end monitors, the difference between DVI and VGA is less noticeable. I have my 19" on VGA and my 15" on DVI, since my card doesn't support dual DVI and the 15" is DVI only. The 19" still seems pretty clear to me. At least it's not as bad as at my school... they have 17" LCDs with a native resolution of 1280x1024 on Dell Precision 670 workstations. They each cost over $3k new! Still, they INSIST on running at 800x600 (although they are using DVI)! It makes me sick!!!!
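For the record, here's roughly why running below native looks so bad. A quick sketch (Python just for the arithmetic; the resolutions are the panel and signal settings from above):

```python
# Why a non-native resolution looks soft on an LCD: the panel has to stretch
# the image by a non-integer factor, so most source pixels get smeared across
# several physical pixels by interpolation instead of mapping 1:1.

native = (1280, 1024)   # physical pixels on the 17" panel
signal = (800, 600)     # what the machines are actually set to

sx = native[0] / signal[0]   # horizontal scale factor
sy = native[1] / signal[1]   # vertical scale factor

print(f"horizontal scale: {sx:.2f}x, vertical scale: {sy:.2f}x")
print("non-integer (and unequal) factors -> interpolation blur, plus a")
print("distorted aspect ratio from stretching a 4:3 image over a 5:4 panel")
```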
 
I'm thinking maybe DVI vs. analog doesn't make much of a difference on smaller screens, like 19" and under? I've heard people say that on higher-resolution screens DVI makes more of a difference, but I wouldn't know; my monitor is 19" analog only.
 
If it's only analog, then stick with it; it's not worth buying a whole new monitor just to use DVI. Some say the higher-end monitors have better images on DVI, and it would make sense for that to be true. You can use one of those DVI-to-VGA adapters to connect your monitor to your video card, but then it's not a huge improvement.
 
I don't see any difference between analog and digital on my monitors. The only thing that is different is that DVI is automatically at the correct setting.
 
I know a guy who uses a 50" Hitachi LCD for his monitor (mostly for gaming). I don't know about you guys, but I wouldn't pay the $4,000 he paid to have a 50" monitor. Seems like overkill to me. And he still sits like 3 feet away.
 
sno.lcn said:
I know a guy who uses a 50" Hitachi LCD for his monitor (mostly for gaming). I don't know about you guys, but I wouldn't pay the $4,000 he paid to have a 50" monitor. Seems like overkill to me. And he still sits like 3 feet away.

Don't knock it till you've tried it. I run all my games on a 30" and I would go larger if I had the cash. It's all a matter of priorities. The same could be said for a guy who watercools for another quarter gig on his CPU. A hobby is a hobby is a hobby. :)
 
I've been thinking about going with something widescreen, but it's too hard to choose between the Gateway 2185 and the Dell 2005.
 
All this stuff about the superiority of DVI is patently rubbish; at best it is a toss-up between them. Unless your converters are _very_ bad, you will never notice a difference. In fact, my panel works better over the analog input (2005FPW), although perhaps that is uncommon. If you don't believe me and have a dual-input monitor, download NTEST and switch between the inputs on the focus and moire tests, which draw very fine patterns on the display. No difference at all. In fact, my DVI flickers like crazy on the left side of the moire test pattern.

Now, from my understanding, at a per-pixel level the DVI is actually in better focus than the analog out (the analog nature of the signal 'smears' the picture very slightly), but the difference cannot be observed at a normal distance from the monitor; your eyes simply cannot pick up that much detail. It is analogous to (but different from) the fact that even the most powerful satellites cannot resolve a dime on the surface of the earth due to the diffraction of light through air.

Our eyes aren't that good. DVI vs. VGA on small panels is a bull**** argument, just like SATA vs. ATA or AGP vs. PCIe. At extremely high resolutions, where the actual bandwidth of a VGA connection becomes a problem, DVI is obviously the only way to go. If you don't have a 30" display, this probably doesn't apply to you.
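To put a rough number on that bandwidth point, here's a back-of-the-envelope pixel-clock estimate (a sketch only; the ~25% blanking overhead is an assumption and real timings vary, but the 165 MHz single-link DVI limit is the standard figure):

```python
# Rough pixel-clock estimate: resolution + refresh rate -> required clock in MHz.
# Blanking overhead is assumed at ~25% (real CVT/GTF timings differ somewhat).

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for w, h in [(1280, 1024), (1600, 1200), (2560, 1600)]:
    clock = pixel_clock_mhz(w, h, 60)
    # Single-link DVI tops out at a 165 MHz pixel clock; that is also roughly
    # the range where an analog VGA signal starts to get visibly soft.
    verdict = "needs dual-link DVI" if clock > 165 else "fine on single-link or VGA"
    print(f"{w}x{h} @ 60 Hz -> ~{clock:.0f} MHz ({verdict})")
```

Anything 1600x1200 and below comes in comfortably under the limit, which is why the small-panel comparisons in this thread are mostly a wash; a 30" 2560x1600 panel is where the connection itself becomes the bottleneck.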
 
Well, I've been meaning to try VGA vs. DVI, I just haven't had the time. I've had issues with SLI and DVI connections at 1080i resolutions and thought maybe hooking up through the VGA port might correct those. I've been hesitant due to the drop in quality, but if, as you state, the difference is negligible, it might be worth a shot. I'll run benchies and try the switch this weekend and post results. Especially if the weather goes to crap as predicted.
 
SuperFarStucker said:
All this stuff about the superiority of DVI is patently rubbish; at best it is a toss-up between them. Unless your converters are _very_ bad, you will never notice a difference.
And since some people just don't notice much, there obviously can't be any difference ;)
 
larva said:
And since some people just don't notice much, there obviously can't be any difference ;)

Well, I do make the assumption that my eyes are as good as the average person's, and perhaps my panel isn't 'representative', but with two different cables on two different cards (in different machines) I noticed no difference. I'm sure if I asked my dog he would agree :)

Also, all the machines at my workplace are connected via VGA dongles (Hyundai L90D) and the picture is 'optimal' (though there are some hideous wallpaper selections!). The school also has a math computing lab full of flat panels connected to cheap Dell machines, which I'm willing to wager don't have DVI connections, and guess what, the image looks fine. My laptop with an integrated Intel accelerator outputs a perfectly clear and stable image over VGA as well.

If you want to appeal to scientific rigor, go right ahead; you might as well discard every last thing ever said on these boards :rolleyes:

I'm sure some cases exist where the monitor doesn't handle VGA connections well, but I haven't seen it. The converse is in fact true with my panel, but perhaps that is indicative of some sort of problem with my DVI port. In any event, the VGA is flawless.
 
My theory is that since the digital connection can auto-adjust many settings and the analog doesn't, this might be where many people see the difference.

They might plug in the analog, look at a screen that isn't at its native resolution and timing, and see the slight blurriness; then plug in the DVI, see it at its native settings with everything auto-adjusted, and see that it clearly looks better.

This is just another theory from a guy who's never used a digital connection, though.


I know one thing, though: I don't buy that one person's vision is so different from the next's that one may see a difference and another not, unless you're colorblind, in which case you're probably not participating in this debate.
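For what it's worth, here's a toy sketch of the mechanism behind that auto-adjust theory (purely illustrative; it assumes the monitor re-samples the analog scanline with simple linear interpolation, and the 3% clock error is made up just to show the effect):

```python
# Why analog needs "auto adjust": the LCD re-samples each analog scanline with
# its own pixel clock. If that clock (or phase) is slightly off from the source
# timing, sample points drift between pixel boundaries and adjacent pixels
# bleed together -- the blur/shimmer that auto adjust tunes out.

src = [0, 255] * 8  # alternating black/white source pixels (worst case for blur)

def resample(pixels, clock_error=0.0):
    out = []
    for i in range(len(pixels)):
        pos = i * (1 + clock_error)          # where the monitor actually samples
        lo = int(pos)
        hi = min(lo + 1, len(pixels) - 1)
        frac = pos - lo
        out.append(round(pixels[lo] * (1 - frac) + pixels[hi] * frac))
    return out

print("perfect clock:", resample(src))        # crisp 0/255 pattern survives
print("clock off 3%: ", resample(src, 0.03))  # grey in-between values = blur
```

Over DVI the pixel values arrive as numbers, so there is nothing to re-sample and nothing to adjust, which matches what the OP saw with auto adjust greyed out on the DVI input.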
 