
DVI vs Analog - LCD input


microfire

Member
Joined
Oct 8, 2001
DVI vs D-Sub - LCD input

I'm looking at getting a 17" LCD.
I've made up my mind what brand and model I will be getting. The problem now is there are two types. One has an analog D-Sub input. The other is a little more expensive and has a DVI input.
My video card has support for both of these connectors.
The real question is: what is the noticeable difference I'm going to see with my eyes between DVI and D-Sub LCD monitor inputs?
 
I'd get the DVI as it is a purely digital signal from your video card to your monitor.
If you look at the Nvidia 6800 Ultra card it is DVI only, and video cards will be moving in this direction. I buy LCDs that have DVI because I think the digital signal is better.
 
You've got a good point. DVI has got to be better, otherwise they wouldn't have it.
When you say better, what do you actually see that's different? The colours or what?

Highly impressed with the CMV 1515 that I got for cheap a few days ago, and the comments about it. This little screen is analog-input only and kicks the *** of my Philips 109P4 CRT monitor. I don't even want to see that CRT monitor any more; I can actually open my curtains and still see the screen and game very nicely with its 15ms response, 400 brightness, and good 500 contrast.
The Samsung you have looks damn appealing with 12ms, but it weighs in at about 50% more expensive than the CMV 17" screen I am looking at where I live.
This is the screen I'm thinking of getting for my main rig, the CMV CT-722: 16ms, brightness 400 cd/m2, contrast ratio 450:1.
http://www.cmv.com.tw/en/product.asp?pid={AAD6D4B5-F383-4B78-BDD8-17641D22BC00}

Let me know what you think. I'm leaning towards the DVI thanks to your advice; it's slightly more expensive and I will have to wait till Monday or Tuesday for it.
 
microfire said:
DVI has got to be better otherwise they wouldn't have it.

BS, I haven't noticed a difference with it. It's very, very, very minute. You can't judge the difference by comparing a monitor with DVI and one without, because they are different monitors to begin with. You have to try one with both native DVI and native D-Sub. But anyway, there is tons of technology that exists just because it does, not because it's better. Look at ATA133, it does nothing whatsoever. IIRC it's a 3% performance boost over ATA100, a barrier that 99.999% of drives can't even break anyway.
 
Someone convince me if I should go DVI or D-Sub.

The real difference at this stage is that the D-Sub-only LCD is about 15% cheaper than the DVI model at the local store. They don't have the DVI model there; I would have to order that.
The store has a D-Sub LCD model ready to go today, so I can get it right away.
That's two good reasons.
So the D-Sub-only model has a good price point and availability.

If DVI is going to be better then I'll order through a web retailer and wait a few days for them to process and deliver it.
The web retailer has the DVI model for about the same price as the D-Sub model at the chain store. I guess the online service will be nowhere near as good as the local chain store if there is ever a problem (e.g. a huge bunch of dead pixels, premature-death warranty).



Or maybe I should reconsider keeping my big bulky ***, power-sucking, curtain-closing, eye-straining, pixel-blurring CRT?
Any good reasons to keep the CRT are becoming minimal now. Hmm, let me think... colour (with curtains closed) maybe, umm, high-res gaming (with eye candy turned down some), higher refresh rate. There really isn't much reason at all these days, huh.
 
It really depends on the monitor. On some of the lower-quality monitors, DVI is 5x better than D-Sub. Personally, I'd go with DVI because it's digital. Plus you won't have to mess with the settings, at least in terms of the screen position/size.
 
As far as settings go, it seems OK with the D-Sub-only 15" LCD I have.
Five times better is a big claim to make.

Is there any real answer to this question? Does anyone really care? Is DVI just another fad? Is the advantage of DVI so minimal, if there at all, that it's not really an issue but just a way for LCD producers to make a few extra bucks?
 
I have a samsung 213T lcd, and I believe in dvi and here's why. After testing both the analog and the dvi inputs on my lcd I have come to this conclusion.

Colors were a bit more vibrant with dvi vs analog.
You don't have to always resize and fix the geometry and color with dvi as it is done automatically for you.
Ghosting (on my panel) was severely reduced when using dvi.
Fine levels of detail in the desktop and in dvd movies were noticed using dvi that were not present over the analog connection.
Text was a bit sharper.
Downsampling (running lower than the panel's native resolution) was handled better.

Those were the differences between dvi and analog on my pc and lcd that I was able to notice, hope this helps your decision.
 
Thanks, that helps heaps.

There's no doubt, after a good study with my small 15" LCD next to the 19" CRT, that both have advantages: the colour is more real-looking at any angle on the CRT, but the LCD is so much easier on the eyes with its sharpness and pixel definition.

This model appears to be the best in a higher price range:

Syncmaster 172X, 17", 1280x1024, TFT LCD, Silver

Any good? It has DVI and 12ms. It's going to cost up to 50% more than the CMV 17"; do you think it will be worth the extra?



btw, the thought of the CRT electron gun directed at my eyes is disturbing me now.
 
Actually the color on my Samsung is far superior, I believe, to my old ViewSonic G90. I've read some positive reviews of the 172X around the net, so the choice to pay more is always up to you.
 
I am a clarity and resolution freak, and I find it almost impossible to tell the difference between DVI and a standard analog signal. I would just buy the monitor you want, and if it happens to have DVI as well, that will be a bonus.
 
My LCD has DVI and is VERY clear and handles resizes without adjustments... My dad's 19" LCD is D-sub and while it is clear in some parts of the screen, it's also very blurry in other parts of the screen which drives me up the wall. Also, when the screen resizes, you have to readjust the screen, which takes a while. Also, I swear the color is better on mine...
 
I have a Princeton 19" with both D-sub and DVI input, and a FX5800 that supports both modes. I went back to analog mode when I found that you cannot use any "manual" adjustments in the all digital mode. If you mainly running programs with limited video adjustments, being able to use the old fashioned "brightness" and "Color" buttons helps a lot.
 
Like whitewale's Princeton, my LG782LE has both DVI and DSUB inputs, so I likewise can easily test both input types using the same monitor. As 12AM observed, DVI provides better color saturation and clearer text; ghosting is reduced and details are clearer. Note that the difference is subtle, but noticeable.

FWIW, the LG782LE allows brightness, contrast, and RGB value adjustment while in DVI mode. The only adjustments disabled are tracking and position.
 
For comparison only: when in digital mode I can only change the brightness setting on my monitor's OSD. Everything else has to be changed by programs/vid card. Stock colors and brightness were excellent though.
 
In case you didn't know, this is the difference:

When using an analog connection on the LCD screen:

The video card produces a digital signal. This signal is converted to analog by the card's integrated DAC. The signal travels to the screen through the D-Sub cable in analog form. When it reaches the screen, it is converted back to digital by the ADC in the screen.

When using a DVI connection on the LCD screen:

The digital signal produced by the graphics card is transported straight to the LCD screen in digital form.


So in other words, when you use a DVI connection, you won't have the two converters messing with the signal and making your image look worse. I think you should never even consider buying an LCD screen with only an analog connection when you have a digital output on your graphics card. You don't want to lose image quality to some pointless digital-to-analog conversion and back.
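
Purely as an illustration (this isn't real driver or monitor code, and the noise figure is a made-up assumption), here's a tiny Python sketch of that chain. It shows why the DAC -> cable -> ADC round trip can shift pixel values slightly, while the digital link delivers them bit-for-bit:

# Toy model of the two signal paths described above (not real hardware code).
import random

def dvi_path(pixel):
    # Digital end to end: the 8-bit value arrives unchanged.
    return pixel

def dsub_path(pixel, noise_mv=5.0, full_scale_mv=700.0):
    # Video card DAC: 8-bit value -> analog voltage (VGA uses roughly 0-0.7 V).
    millivolts = pixel / 255.0 * full_scale_mv
    # Cable, connectors and cheap circuitry add a little noise (assumed +/- 5 mV).
    millivolts += random.uniform(-noise_mv, noise_mv)
    # Monitor ADC: sample the voltage back to an 8-bit value.
    return max(0, min(255, round(millivolts / full_scale_mv * 255)))

pixel = 200
print("DVI  :", dvi_path(pixel))                       # always exactly 200
print("D-Sub:", [dsub_path(pixel) for _ in range(5)])  # e.g. 198, 200, 201, 199, 202

On a good short cable the error stays within a level or two, which fits the "subtle but noticeable" comments above; on long or noisy cables it gets worse.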
 
I'll pitch in...

Just got two Dell 2001FPs and the analog out on my 9800 Pro is definitely suckier than the DVI out...

I've already posted in the ATI section, but does anyone know if the PCI-E or the current X800s sport dual DVI outs? If so, which brand and which model?
 
Okay... as I've been looking into DVI vs. DSUB stuff lately, I'll end this debate.

To most eyes, there will only be a difference between DVI and DSUB if the monitor or graphics card uses cheaper components in the DSUB circuitry; otherwise there won't be a difference on a standard 1½ meter cable.

The true differences kick in when you have long cables, as used with projectors in meeting rooms and auditoriums, since you can convert the DVI signal to fiber-optic cabling more easily, where you would otherwise use signal repeaters for the DSUB signal.

In 9/10 cases, DVI isn't much different quality-wise for your usual gaming and desktop uses, but it offers some adjustable functions and less circuitry. It does offer a better, more scalable interface for use with digital components within A/V editing, projectors, etc., and as such is better.

Cheers, Flix
 
With D-Sub you run the risk of streaking, halos, interference, etc. The larger the LCD, the more likely these problems become.
 