
DVI vs Analog - LCD input

Yeah, I noticed that with D-sub on a 9700 card; in particular, the image ghosted very slightly to the right, which made the picture look muddy all the time.
I can see why anyone here thinking about choosing should make DVI an absolute must, especially in this day and age.
 
DVI for sure.

As for the difference - LCDs can only display 16-bit, or 16 million colors? - whereas CRTs can do the full-blown all-out!

This is why professionals use high-end CRTs and not LCDs.
 
Well, my Samsung 712N does 24-bit, which is 16.7 million colours. If you ask me, after about a million or so colours I tend to forget what the rest are. What about you? I personally like my LCD and have no complaints, especially since my CRT went to the dump - everything on it was blurry. To be honest, the room is much cooler too, since mine was a 21" Sony. The electric bill will go down and I won't have to change my contacts prescription anytime soon.
 
The 712N is not a 24-bit LCD (16.7 million colours). It is really an 18-bit LCD (16.2 million). Samsung made a mistake on their website. Samsung only produces a 12ms 6-bit TN-film panel; currently there is no 8-bit 12ms panel, only 6-bit ones. The only 8-bit 16ms panel that I know of is the Philips 20.1" panel used in various monitors such as the 2001FP. Also, if you look at the 710T, which uses the same panel as the 712N, you will see the correct specification listed.
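(For anyone wondering where the 16.2 vs. 16.7 million figures come from, here's a rough back-of-the-envelope sketch in Python - illustrative arithmetic only, not taken from any panel datasheet.)

```python
# Colours a panel can show natively, without dithering: 2^bits per subpixel,
# cubed across the three subpixels.
def native_colours(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"6-bit panel: {native_colours(6):,} native colours")   # 262,144
print(f"8-bit panel: {native_colours(8):,} native colours")   # 16,777,216 (true 24-bit / 16.7M)

# The "16.2 million" figure quoted for 6-bit panels comes from FRC dithering,
# which flickers between adjacent levels to fake the intermediate shades.
```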
 
Mr.Guvernment said:
DVI for sure.

As for the difference - LCDs can only display 16-bit, or 16 million colors? - whereas CRTs can do the full-blown all-out!

This is why professionals use high-end CRTs and not LCDs.
I just set my LCD to 16-bit color and could immediately tell a difference from 32-bit. The gradient in my wallpaper (the WinXP Pro logo) was no longer smooth. When I set it back to 32-bit, it looked fine. This is on an OLD Gateway 15" LCD, model # FPD1500.
 
Cyrix_2k said:
I just set my LCD to 16-bit color and could immediately tell a difference from 32-bit. The gradient in my wallpaper (the WinXP Pro logo) was no longer smooth. When I set it back to 32-bit, it looked fine. This is on an OLD Gateway 15" LCD, model # FPD1500.

LCDs can display better than 16-bit color. A lot of them can display 24-bit color (18-bit looks as good as 24-bit for the most part, until the image is in motion); however, they by no means do this perfectly. Pretty much every LCD made for consumer use has serious problems displaying correct shades of certain colors such as red (and I know that in Samsung's case everything leans toward blue). You can see almost all the colors (a few shades are sometimes entirely lost), but they won't be as accurate as they could be. Not many CRTs can display that accurately either; it takes a really nice CRT and some high-end calibration to get perfect colors. So unless you are a professional who relies on perfect color to make a living, any mid to high-end LCD should be fine for everyday usage and personal Photoshopping.
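(If you want to see the banding for yourself, here's a minimal sketch - it assumes Python with the Pillow library installed - that writes a smooth grayscale ramp. View it full-screen at 16-bit vs. 32-bit desktop colour, or on a 6-bit vs. 8-bit panel, and the stepping described above is easy to spot.)

```python
from PIL import Image

WIDTH, HEIGHT = 1024, 256
ramp = Image.new("RGB", (WIDTH, HEIGHT))
for x in range(WIDTH):
    shade = round(x * 255 / (WIDTH - 1))           # 0..255, left to right
    for y in range(HEIGHT):
        ramp.putpixel((x, y), (shade, shade, shade))
ramp.save("gradient_test.png")                     # open it and look for visible bands
```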
 
MassiveOverkill said:
OK, stupid question. If I use my DVI output of my Radeon 9800 pro via my DVI-Dsub adapter, am I using a digital signal or not?
That adaptor converts it to analog, so you're now using an analog signal.
 
microfire said:
As far as settings go, it seems OK with the D-sub-only 15" LCD I have.
"Five times better" is a big claim to make.

Is there any real answer to this question? Does anyone really care? Is DVI just another fad? Is the advantage of DVI so minimal, if it exists at all, that it's not really an issue but just a way for LCD makers to make a few extra bucks?


I'm not trying to offend you, and I'm not surprised by your responses, because you only have a 15" LCD. When you get the chance to compare D-sub vs. DVI on a 19" or bigger LCD at 1600x1200 or higher resolution, maybe your thoughts will change.
 
So I have both outputs of my 9800 pro hooked up to my 2001FP and am swapping back and forth while watching some video and looking at some photos and stuff.

The difference between the two is not terribly great. DVI is a bit sharper, but not significantly so. Contrast also seems a bit better on DVI.

I would spring for DVI simply because there is no reason not to at this point. It's a better format, and I believe that HDTV uses DVI as well. (Could be totally wrong on that; too lazy to do my homework on this one.)

Having moved from a CRT and analog to a good LCD, I will NEVER go back to an analog CRT. It's just a crying shame that nice LCDs like the 2001FP are still prohibitively expensive.

For those of you who have analog LCDs and the image is fuzzy in some places, play with your refresh rate. Most analog LCDs work best at 70 or 75 Hz, and when the refresh is set wrong it looks AWFUL.
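(For Windows users, here's a hedged sketch - it assumes the pywin32 package is installed - that just lists the resolutions and refresh rates your card reports, so you can pick one the analog LCD samples cleanly. Purely illustrative; your video driver's own control panel does the same job.)

```python
import win32api  # pywin32

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)   # None = current display device
    except win32api.error:
        break                                        # enumeration failed: no more modes
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz} Hz")
```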
 
I can't believe there is actually a debate about the merits of DVI over D-sub. There must be jealous (or blind) people who don't have the input and would rather not acknowledge its superiority. Could it be any simpler? A pure digital link - single-link DVI carries roughly 4 Gbit/s of video data (quick arithmetic after this post) - from the video card to the LCD, versus an outclassed, low-bandwidth digital-to-analog, analog-to-digital connection. DACs are great at screwing up your picture, and with all the cheap companies cutting corners, those are among the first things they fudge to reduce expenses!
Distance has no relevance to this argument. Whether it is 1' or 20', DVI will be superior. It's like saying, "My dialup modem only has a 6' phone cord whereas your DSL modem has a 12' cord, so my dialup modem must be as good if not faster!" :bang head
Believe me, I work for Monster Cable and we have done exhaustive tests validating both the current DVI format and the forthcoming HDMI format, and the difference is stark, assuming you have the eyes and equipment to appreciate it. Get a good cable and a name-brand video card (we all know the cheapo DVI cable Dell includes is laughable).
I hope that clears it up for you, man. If you have the option of running DVI, there is absolutely no reason to run D-sub instead. :cool:
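(Quick arithmetic behind the bandwidth point above - an illustrative Python sketch using the standard single-link DVI numbers: a 165 MHz pixel clock ceiling and 24 bits of colour per pixel.)

```python
def video_data_rate_gbps(pixel_clock_mhz: float, bits_per_pixel: int = 24) -> float:
    """Uncompressed video data carried by a TMDS link at a given pixel clock."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# 1600x1200 @ 60 Hz needs roughly a 162 MHz pixel clock once blanking is included,
# which just squeaks under single-link DVI's 165 MHz limit.
print(f"{video_data_rate_gbps(162):.2f} Gbit/s for 1600x1200 @ 60 Hz")   # ~3.89
print(f"{video_data_rate_gbps(165):.2f} Gbit/s single-link ceiling")     # ~3.96
```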
 
Needitcooler said:
I am a clarity and resolution freak, and I find it almost impossible to tell the difference between DVI and a standard analog signal. I would just buy the monitor you want, and if it happens to have DVI as well, that will be a bonus.

Either you aren't a clarity freak or you have bad eyes...

I switched my monitor to analog today because I was going to hook the DVI up to my TV, but then I sat down at my computer and instantly noticed that the image quality was far worse than with DVI. It was noticeable in images, and especially in text. Looking closely at the screen, I noticed that a letter one pixel thick would fade into the adjacent pixels (see the little sketch after this post for why). Needless to say, I switched back to digital immediately.

Then I came on to OCForums to tell how apparent the difference was.
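(What that post describes is what happens when the LCD resamples the analog waveform with a slightly wrong pixel clock or phase. Here's a tiny illustrative NumPy sketch - made-up numbers - showing how a one-pixel-wide stroke bleeds into its neighbour when the sampling phase is off; over DVI the pixel mapping is one-to-one, so this never happens.)

```python
import numpy as np

source = np.zeros(9)
source[4] = 1.0                            # one bright pixel: a thin letter stroke

phase_offset = 0.4                         # LCD samples 40% of a pixel "late"
positions = np.arange(9) + phase_offset
resampled = np.interp(positions, np.arange(9), source)

print(np.round(source, 2))     # [0. 0. 0. 0. 1. 0. 0. 0. 0.]
print(np.round(resampled, 2))  # the stroke smears across two pixels (0.4 / 0.6)
```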
 
shrinkydinx said:
In case anyone doubts DVI's superiority...

[attached comparison screenshot]
And is that result going to carry over to all monitors, or just your particular monitor's difference between analog and DVI? I was thinking about going for a 710N because I wanted something good for gaming, but after reading all of this I'm looking at the 710T and the 172X now for the DVI inputs.
 
I just heard that DVI has better quality but slower response than analog, so therefore analog is better for gaming over DVI. Is that true?
 
Droban said:
I just heard that DVI has better quality but slower response than analog, so therefore analog is better for gaming over DVI. Is that true?
How could it be true? Getting an image from a DVI signal is easy, since it is already digital. Going from digital to analog and back to digital must take some time, and it DOES hurt the definition of the image. DVI can't possibly be slower, since the signal is NEVER converted; analog would be the "slow" one.
 