
DVI vs Analog - LCD input

It's the Samsung 910T.

The 910T is a nice monitor. The fuzziness of analog is going to carry through to any monitor with a high resolution; at lower resolutions analog doesn't have as much trouble (though it will never be as sharp as DVI).
 
shrinkydinx said:
It's the Samsung 910T.

The 910T is a nice monitor. The fuzziness of analog is going to carry through to any monitor with a high resolution; at lower resolutions analog doesn't have as much trouble (though it will never be as sharp as DVI).

So at 1280x1024 it isn't as noticeable?

About D-Sub vs. DVI: on a few other forums that I searched, people were saying that gamers might want to go for the analog connection because it gives better response. That didn't make sense to me; a few review sites said switching to DVI eliminated the small amount of ghosting that occurred with analog.

BUT the reason they're saying analog is better for games is that with analog you can get a 75 Hz refresh rate, while with DVI you get 60 or 65.
 
Droban said:
So at 1280x1024 it isn't as noticeable?

About D-Sub vs. DVI: on a few other forums that I searched, people were saying that gamers might want to go for the analog connection because it gives better response. That didn't make sense to me; a few review sites said switching to DVI eliminated the small amount of ghosting that occurred with analog.

BUT the reason they're saying analog is better for games is that with analog you can get a 75 Hz refresh rate, while with DVI you get 60 or 65.
That's not a limit of DVI, it's the limit of the LCD.
 
So there are LCDs that run the analog connection at 75 Hz (1280x1024) and DVI at 65 Hz (1280x1024)?
 
Droban said:
So there are LCDs that run the analog connection at 75 Hz (1280x1024) and DVI at 65 Hz (1280x1024)?
You're not supposed to run at 75 Hz on most, if not all, LCDs; you'll probably fry the controller. If it does let you run at those rates, it's probably just for compatibility. 60 Hz / 60 fps is PLENTY anyway.
 
Droban said:
So at 1280x1024 it isn't as noticeable?

About D-Sub vs. DVI: on a few other forums that I searched, people were saying that gamers might want to go for the analog connection because it gives better response. That didn't make sense to me; a few review sites said switching to DVI eliminated the small amount of ghosting that occurred with analog.

BUT the reason they're saying analog is better for games is that with analog you can get a 75 Hz refresh rate, while with DVI you get 60 or 65.

No, the pic I posted was of DVI and analog at 1280x1024. I guess these days not many people use monitors where it wouldn't be noticeable...

Regarding fps, your eyes could never tell the difference between 60 and 75 Hz: partly because it's not a big difference, partly because your eyes can't really resolve changes that fast anyway :)
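The 60-vs-75 Hz point is easy to put in numbers. A quick sketch of the per-frame time at each rate (plain arithmetic, nothing monitor-specific assumed):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time the display spends on one frame, in milliseconds."""
    return 1000.0 / refresh_hz

print(f"{frame_time_ms(60):.2f} ms per frame at 60 Hz")  # 16.67 ms
print(f"{frame_time_ms(75):.2f} ms per frame at 75 Hz")  # 13.33 ms
# The entire gap between the two modes is only about 3.3 ms per frame.
```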
 
I'm pretty sure the general consensus is that LCDs still have Hz settings for compatibility's sake if nothing else. That being said, does it actually make a difference which Hz setting you're on? Does it affect the image on an LCD screen? Does it affect it whether you use DVI or VGA?
 
The Hz setting means very little on an LCD. It doesn't change how the panel renders the image; it's mainly there so your LCD stays compatible with your video card. LCDs don't refresh like CRTs: each pixel holds its state until it's told to change. To estimate how many fps your LCD can actually display, look at the response time. A 16 ms LCD can show about 62.5 full transitions per second; a 12 ms LCD somewhere around or over 80. If you set a 12 ms LCD to 60 Hz you will not be limited to 60 fps (unless you turn on vsync); you'll be limited to the 80-something fps that your LCD's response time limits it to. Also, anyone who says analog is as good as or comparable to DVI is blind. DVI is superior in every way.
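The response-time arithmetic in the post above can be sketched like this (a rough back-of-envelope only; note that quoted response times usually describe the worst-case full black-to-white transition, not typical ones):

```python
def max_transitions_per_second(response_ms: float) -> float:
    """Rough upper bound: one full pixel transition per response-time interval."""
    return 1000.0 / response_ms

print(max_transitions_per_second(16))            # 62.5
print(round(max_transitions_per_second(12), 1))  # 83.3
```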
 
shrinkydinx said:
in case anyone doubts dvi's superiority...

Are those images taken while moving the window around?

If so, then is it safe to say "ghosting" is caused by slow response time, and "blurring" is caused by the analog conversion?

Or are they the same thing?

I'm still very confused about the ghosting thing, because I don't think there is a monitor out there without totally contradicting reviews.

Even with the best monitors out there, like the Dell 2001FP, you can find reviews that say there is ghosting, while the review right under it says no ghosting at all, with no way of knowing which party to trust.

Though I tend to trust the ghosting party, as my own monitor blurs despite its 8 ms response time, and most reviews on sites like Newegg and TigerDirect for this monitor say "NO ghosting!" and such. I guess my ghosting could be due to the analog connection, but it's an analog-only monitor, so I know all the reviewers are using it. Or is analog even responsible for that kind of thing??

I wish I could figure this out.

Why do I bother asking? As soon as someone answers, someone else will step in and totally contradict his/her post :p
 
consumer9000 said:
I can't believe there is actually a debate about the merits of DVI over D-Sub. There must be jealous (or blind) people who don't have the input and would rather not acknowledge its superiority. Could it be any simpler? A pure digital multi-gigabit link from the video card to the LCD, versus an outclassed, low-bandwidth digital-to-analog, analog-to-digital connection. DACs are great at screwing up your picture, and with all the cheap companies cutting corners, those are among the first things they fudge to reduce expenses!
Distance has no relevance to this argument. Whether it's 1' or 20', DVI will be superior. It's like saying "My dialup modem only has a 6' phone cord whereas your DSL modem has a 12' cord, so my dialup modem must be as good if not faster!" :bang head
Believe me, I work for Monster Cable and we have done exhaustive tests validating both the current DVI format and the forthcoming HDMI format, and the difference is stark, assuming you have the eyes and equipment to appreciate it. Get a good cable and a name-brand video card (we all know the cheapo DVI cable Dell includes is laughable).
I hope that clears it up for you, man. If you have the option of running DVI, there is absolutely no reason to run D-Sub instead. :cool:

I can't tell the difference between DVI and D-Sub, even though I know DVI is way better, being a digital connection rather than something susceptible to interference. What about the cable that Dell includes? I have a 2405FPW and I don't see any difference between the two >< My guess is it's my eyes, or this laughable cable you speak of.
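For scale, it's easy to sanity-check how much data a single-link DVI connection actually carries at this thread's resolution. The numbers below assume the standard VESA pixel clock of 108 MHz for 1280x1024 at 60 Hz and 24 bits per pixel:

```python
# Single-link DVI: TMDS clock up to 165 MHz, 24 bits of pixel data per clock.
PIXEL_CLOCK_HZ = 108e6           # VESA timing for 1280x1024 @ 60 Hz (incl. blanking)
BITS_PER_PIXEL = 24
SINGLE_LINK_MAX_CLOCK_HZ = 165e6

needed = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9
limit = SINGLE_LINK_MAX_CLOCK_HZ * BITS_PER_PIXEL / 1e9

print(f"1280x1024 @ 60 Hz needs about {needed:.2f} Gbit/s of pixel data")  # ~2.59
print(f"single-link DVI tops out around {limit:.2f} Gbit/s")               # ~3.96
```

So single-link DVI has comfortable headroom for 1280x1024 at 60 Hz, with no digital-to-analog round trip involved.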
 
I have been using DVI and D-Sub connections for a year now; DVI is the way to go: exact reproduction of the signal.
Perfect pixels and color. I would never go back to a D-Sub connection.
End of story.
 
motherboard1 said:
Are those images taken while moving the window around?

If so, then is it safe to say "ghosting" is caused by slow response time, and "blurring" is caused by the analog conversion?

Or are they the same thing?

I'm still very confused about the ghosting thing, because I don't think there is a monitor out there without totally contradicting reviews.

Even with the best monitors out there, like the Dell 2001FP, you can find reviews that say there is ghosting, while the review right under it says no ghosting at all, with no way of knowing which party to trust.

Though I tend to trust the ghosting party, as my own monitor blurs despite its 8 ms response time, and most reviews on sites like Newegg and TigerDirect for this monitor say "NO ghosting!" and such. I guess my ghosting could be due to the analog connection, but it's an analog-only monitor, so I know all the reviewers are using it. Or is analog even responsible for that kind of thing??

I wish I could figure this out.

Why do I bother asking? As soon as someone answers, someone else will step in and totally contradict his/her post :p


Wow, this is an old thread.

Anyway, the reason you see different reviews make totally different claims about the same monitors is because everyone's eyes are different. My threshold is about 8ms. If a monitor is at least that fast then I can't see any ghosting, but on slower ones I do. But there are a lot of people who can't detect any ghosting on some popular 12ms panels (like the Dell 2001fp)... yet I look at that same monitor and can clearly see ghosting. It's literally all in the eye of the beholder.

In any event, I'm not sure if using DVI would help with ghosting or blurring at all. I actually kind of doubt it. DVI vs. VGA is all about clarity and accuracy, not really about speed.
 
I find it hard to believe that people's eyes can be so different from one another. And besides, what's the big difference between 16 ms and 8 ms?

You might say it's twice as much, but when you're talking thousandths of a second, I can't see there being much of a difference really.

Here's another quick question for anyone who wouldn't mind answering it for me (if you know the answer):

I just bought a little DVI-to-analog adapter to stick into the DVI port on my graphics card, so I could experiment with having two monitors hooked up: my new LCD and my old CRT.

I know I've been told that the digital-to-analog-to-digital conversion required with analog LCDs is a bad thing.

So I'm wondering: is this adapter in the DVI port going to cause an extra conversion and loss of data? So far it doesn't seem any different, but the change might be subtle. Whether or not there is any extra conversion, I want to know so I'm not paranoid about it.

Oooh oooh ohhh, and another question: do all LCD monitors use basically the same technology for digital/analog conversion, or are some monitors better at using their analog input than others?
 
Motherboard... 16 ms and 8 ms aren't actually as small as you'd think. If we were talking CRT/video-card refresh rates, that would be 62.5 Hz/fps compared to 125; it definitely matters. However, with LCDs, even if the pixel response is 16 ms, that's the worst-case scenario: pixels usually don't need to change from one end of the spectrum to the other, so ghosting usually only occurs from white to black (or vice versa).

There is no difference between running a DVI-to-VGA adapter into the LCD and running straight VGA into the LCD. The reason is that the DVI port outputs both digital and analog signals; the "adapter" just uses the analog set of pins, so it's no different than the VGA port.

Now, do some LCDs convert better than others? I doubt it. The monitor just samples the incoming VGA signal, converts the resolution and colors to a digital representation, and tells the pixels to change as appropriate. The technology gap is at the pixel level, making sure it changes fast enough; we can convert analog to digital fast enough not to worry about it.
 
motherboard1 said:
I find it hard to believe that people's eyes can be so different from one another. And besides, what's the big difference between 16 ms and 8 ms?

You might say it's twice as much, but when you're talking thousandths of a second, I can't see there being much of a difference really.

Here's another quick question for anyone who wouldn't mind answering it for me (if you know the answer):

I just bought a little DVI-to-analog adapter to stick into the DVI port on my graphics card, so I could experiment with having two monitors hooked up: my new LCD and my old CRT.

I know I've been told that the digital-to-analog-to-digital conversion required with analog LCDs is a bad thing.

So I'm wondering: is this adapter in the DVI port going to cause an extra conversion and loss of data? So far it doesn't seem any different, but the change might be subtle. Whether or not there is any extra conversion, I want to know so I'm not paranoid about it.

Oooh oooh ohhh, and another question: do all LCD monitors use basically the same technology for digital/analog conversion, or are some monitors better at using their analog input than others?

I have a BENQ 767 17" that has only an analog input. Using the adapter does make a difference: not in games, and possibly not at native res, but I run 1024x768, and with the adapter, text such as this forum is fuzzy in the left and right thirds of the screen. I use this resolution because I am at least two feet from the monitor and cannot read the text at the native 1280x1024. Without the adapter all text seems clear: no fuzz, no blur.
I do not know why this would be, but on my monitor, straight analog is better than the adapter.
 
motherboard1 said:
I find it hard to believe that people's eyes can be so different from one another. And besides, what's the big difference between 16 ms and 8 ms?

People's vision differs greatly. Some people are color blind, and there are even several types of that, so we don't all see colors the same. And not everyone has 20/20 vision. Those qualities alone will result in people forming opposing views on the same LCD panel.

Still not buying it? Abstract art or looking for shapes in clouds are another way people see the same exact thing differently. And then there are those weird hidden pictures, the kind you see in your local mall where there is a picture inside a pattern. Not everyone can see the picture, some people have to stare a while before they see the picture, and others can see it almost instantly. The reason it works this way is because the brain plays a large role in how we see things, and it's well established that we don't all think the same ;)

motherboard1 said:
You might say it's twice as much, but when you're talking thousandths of a second, I can't see there being much of a difference really.

Thousandths of a second sounds like a really small unit of time, and in the grand scheme of things it probably is. But it's certainly not an unnoticeable amount of time. I've watched races (big NASCAR fan) where two cars crossed the finish line mere thousandths of a second apart. And I could tell which car won without the aid of slow motion instant replay. So could many other people, including the announcer.

I have excellent vision, it was better than 20/20 in my youth. And I most certainly can see ghosting on the majority of 12 & 16 ms panels. Not all the time, but certain scenes in a game or movie will reveal it. This is why I still game on a CRT. But on most 8ms rated panels those same scenes look clear to me.

For reference, my favorite ghosting test is a scene from the original Jurassic Park. About 10 minutes into the movie there is a helicopter flying over the ocean, and the black blades are contrasted against the white-ish sky. On slow panels I can clearly see artifacting as the blades rotate, and that is ghosting at its finest.
 
shadowdr said:
I have a BENQ 767 17" that has only an analog input. Using the adapter does make a difference: not in games, and possibly not at native res, but I run 1024x768, and with the adapter, text such as this forum is fuzzy in the left and right thirds of the screen. I use this resolution because I am at least two feet from the monitor and cannot read the text at the native 1280x1024. Without the adapter all text seems clear: no fuzz, no blur.
I do not know why this would be, but on my monitor, straight analog is better than the adapter.

Either you're actually back at native res when you run straight analog, or your video card is damaged. Your video card outputs the same VGA signal through either port; the DVI adapter just taps the other copy of that VGA signal. So if one looks blurry and the other doesn't, either your mind is playing tricks on you (very possible), or the analog side of your card's DVI port is damaged.
 
I think most people's vision is better than, or at least, 20/20 in their youth.

Mine was better than 20/20 too, until I got my computer at age 19, and within a couple of years my vision just plummeted (didn't need an eye doctor to tell me it was deteriorating fast). I don't know what it is now; I'm nearly 24. It's still good when I haven't been in front of a computer for a few days, though :p

And this vision rating would not have anything to do with the ability to detect fast changes in an image, methinks, and has nothing to do with the brain.

I wonder what it is about a monitor that messes up your vision?

Maybe the bright light beaming into your eyes for hours causing some extreme fatigue.

Or maybe with the CRT it's the radiation working you over on a cellular level :shrug:
 