
Difference btwn Analog/DVI > $100?


bLack0ut
Is DVI really that much better than analog? I was reading over on FatWallet and a member stated that most analog-only LCDs work fine run through a DVI-to-analog adapter on the video card. However, everyone after him said that DVI is so much better than analog and it's worth the $100 extra. Or is this as imagined as the difference between 8ms and 4ms response times?
 
I'd say a little of both. I can tell a difference... but it's up to you if you want to spend the cash.
 
What kind of difference? I read an article that described it as a "shimmering" and "slightly off sync" difference.
 
Ya, I read the one you posted in, but that was almost a year ago. Also, there are too many conflicting opinions in that topic anyway. I need something more up to date, with hard facts like a benchie showing the difference.
 
Analog input doesn't always look awful and people don't always know exactly what they are looking at. You combine those two factors and you will run into some people who say it's not worth it. Often they have already bought or have already decided to buy an analog-only unit.

Analog input can look terrible, and it always imposes at least a slight penalty on image quality. And it nearly always seriously degrades color accuracy. I won't buy LCDs with no DVI input; I've seen you can end up wasting the $300 you spent by trying to save $50-100. It's false economy even if you aren't particularly aware of what it's doing to the IQ and color accuracy. You can put up with analog in most cases, but if you are really trying to improve the quality of your display, DVI is the best place to start.
 
This is a pretty interesting thread for me. I just bought a Samsung SyncMaster 914v from Staples on a Black Friday super-sale ($257 - $57 MIR), and it's a ton better than my 5-year-old 19" CRT monitor. That said, I looked at other higher-priced DVI monitors and figured I could live with analog. I figured Samsung's a good brand and it would be OK. I game just a little bit, and mostly surf the web with my PC, so it's not worth the extra 100 bones to me.

But from what I researched, Larva's right. DVI is preferable because the analog path forces a conversion: digital in the video card, analog over the cable, then back to digital in the monitor. CNET says that the difference isn't nearly so pronounced on modern monitors. My choice was between the 19" Norcent (CompUSA brand, I think) with DVI vs. the 19" Samsung with standard VGA in. I went with the Sammy.
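To picture why that round trip costs anything at all, here's a toy Python sketch (my own illustration with made-up noise and phase numbers, not a measurement of any real monitor) that runs one scanline digital -> analog -> digital and compares it to a path that stays digital the whole way:

[code]
import numpy as np

# Toy model of the VGA path: pixel values leave the video card as an
# analog waveform, pick up cable noise and a sampling-phase error, and
# get re-digitized by the LCD's ADC. DVI hands the same numbers over
# bit-exactly, so its error is zero by construction.
rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=1280).astype(float)  # one scanline

phase = 0.5                                     # half-pixel clock misalignment
noise = rng.normal(0.0, 2.0, size=pixels.size)  # ~2 LSB of cable noise (invented)
resampled = (1 - phase) * pixels + phase * np.roll(pixels, -1)
analog_out = np.clip(np.round(resampled + noise), 0, 255)

digital_out = pixels  # DVI: the bits arrive unchanged

print("RMS error, analog path :", np.sqrt(np.mean((analog_out - pixels) ** 2)))
print("RMS error, digital path:", np.sqrt(np.mean((digital_out - pixels) ** 2)))
[/code]

The noise and phase values are fabricated; the point is only that the analog leg has error sources the digital leg simply doesn't have.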

Is it the best solution? No. Is it the right solution for my budget? Yes. Does it look terrible? No. In fact, I'm kind of rediscovering the forums; the gray never came out clearly on my clapped-out 19" CRT.

Good luck!
 
Yeah, it's not to say that analog can't be acceptable. But then again there are cases where it proves not to be. I think the real question you need to ask is "do I want to risk X amount on a monitor without the assurance of DVI". If the monitor is cheap enough you aren't talking about much risk, but if it's a more expensive set you definitely would want it to have DVI.

And if you are just replacing an old CRT you probably will be impressed with the results either way. You just find there is a variability to the analog results that DVI eliminates, so if there's much money on the line you don't really want that variability mucking with your value. And if color matching is your goal, by all means, go DVI.
 
^^^ True - we have a Dell 15" analogue at work I use daily on a backup system and for me it looks fine, no issues - but I have also used a Dell 2004 20.1" and I tried both DVI and VGA as I was curious - and I could tell for certain!
 
[attached image: analog-digital.jpg]

same monitor, same vid card.

1280x1024
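Since hard numbers were asked for earlier in the thread: if anyone wants something beyond eyeballing JPEGs, here's a rough Python sketch (the filenames are placeholders for your own captures; assumes Pillow and NumPy are installed) that scores two same-resolution grabs of the same frame by per-pixel RMS difference:

[code]
import numpy as np
from PIL import Image  # pip install pillow numpy

# Load two captures of the same frame, one per input. The filenames
# below are hypothetical - substitute your own DVI and VGA captures,
# taken identically (same frame, same resolution, same camera setup).
dvi = np.asarray(Image.open("dvi_capture.png").convert("RGB"), dtype=float)
vga = np.asarray(Image.open("vga_capture.png").convert("RGB"), dtype=float)

if dvi.shape != vga.shape:
    raise SystemExit("Captures must match in resolution to be comparable.")

rms = np.sqrt(np.mean((dvi - vga) ** 2))
print(f"Per-pixel RMS difference: {rms:.2f} (0 = identical, 255 = maximum)")
[/code]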
 
I am assuming that the top is DVI and the bottom is analog, correct?

I would imagine that the upper picture would, as a whole, be clearer and crisper than the bottom. Although it would be entirely up to the individual whether they thought this was worth $100 extra.
 
My 930B has DVI and analog, I can switch them back and forth in UT2004 and not notice a difference.


I guess if you're taking 6x optical zoom pictures from 1" away it matters. Maybe on paper it matters. In real life it doesn't.
 
Well, what I meant was: is there a difference between analog-only monitors and ones that support DVI? Of course monitors that support both analog and DVI will have a better picture on DVI; that's what the manufacturers focus on.

Anyways, doesn't matter; I already bought a 19" analog LCD, 8ms, for $180. It's on its way.
 
Hello, could someone quickly tell me which of these monitors will give me better visual quality?
Here's my current monitor, which only supports VGA:
http://www.sceptre.com/Products/LCD/Specifications/spec_x7gKomodoIV.htm
and this one that's DVI:
http://reviews.cnet.com/NEC_MultiSync_LCD1860NX/4507-3174_7-20818334.html?tag=sub


^ The bottom one's my dad's. I was going to buy a DVI cable if I can confirm that it will give me better image quality. I noticed more aliasing in games on my LCD monitor than on my 17-inch (smaller-screened) CRT monitor at the same resolution.

Will I see an improvement if I trade monitors with my dad and use DVI cables? And also, should I get the male-to-male cables?
 
DVI cables are like 6 bucks (on eBay, and yes, male-to-male)... 6 bucks is worth it to try IMHO.
 
ajrettke said:
DVI cables are like 6 bucks (on eBay, and yes, male-to-male)... 6 bucks is worth it to try IMHO.

The best place to get any sort of cable is monoprice.com, which has very high-quality cables at dirt-cheap prices. Plus you know you are getting a better cable than you could get on eBay.
 
I bought some DVI cables... but they didn't fit. They were dual-link. They had 2 extra pins that my monitor didn't have holes for... but my video card does.
 
On my 2005FPW the analog input works better than the DVI input. I get shimmering with the DVI in on the left side only, which is really noticeable with gray colours. Tried out two different video cards (in different machines) on two different cables. No shimmer at all with the analog input in any configuration. I ran NTEST and I could not tell the difference between the inputs (you can't notice the shimmering on white backgrounds) on the moire & focus tests (which are really fine patterns). DVI is a non-feature unless you need the bandwidth for dual-link setups.
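For anyone who wants to run that kind of check themselves: fine patterns like the ones NTEST uses are easy to generate. Here's a minimal Python sketch (assumes Pillow is installed; the resolution and filename are just examples) that writes single-pixel alternating columns, roughly the worst case for an analog pixel clock and the pattern where VGA shimmer shows up first:

[code]
from PIL import Image  # pip install pillow

W, H = 1280, 1024  # set to your panel's native resolution

# Alternating 1-px black/white columns: the finest horizontal detail the
# panel can show, and the hardest pattern for analog clock/phase to track.
img = Image.new("L", (W, H))
img.putdata([255 if x % 2 else 0 for y in range(H) for x in range(W)])
img.save("pixel_grill.png")
print("Wrote pixel_grill.png - view it full-screen at 1:1, no scaling.")
[/code]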
 
I had a 19-inch Samsung 913v monitor sitting around that my dad got for his server, which I assumed was worse. After looking at the specs I realized it was much, much better than the 17-inch one I was using. With this monitor I actually see an improvement with AA and AF on. :) No more jaggies. :D

Everything looks very bright though... I mean too bright. I guess I need to fiddle around with the settings.

Here it is on Newegg. What do you guys think?
http://www.newegg.com/Product/Product.asp?Item=N82E16824001192
 
Digital and analog make little difference at native resolution. It's when you take a 1280x1024 panel and run it at 1024x768. With DVI it stretches the outer thirds of each side to match the number of pixels (like widescreen TVs do). With most things it is not noticeable, but when you view type like these threads, the left and right thirds are blurred. With analog input it is more like a CRT; pixels don't seem to matter, and the type is clear all across the screen.
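For what it's worth, the softness at non-native resolutions falls straight out of the pixel math regardless of input. A quick Python sketch (it assumes plain uniform nearest-neighbor scaling; real monitor scalers vary, and stretching only the outer thirds as described above would be one scaler's particular choice) counting how unevenly 1024 desktop pixels land on 1280 physical columns:

[code]
from collections import Counter

# Running a 1280-wide panel at 1024 wide is a 1.25x stretch, so the
# desktop pixels can't all map to the same number of physical columns.
native, desktop = 1280, 1024
mapping = [round(x * desktop / native) for x in range(native)]

# For each desktop pixel, how many physical columns does it occupy?
widths = Counter(Counter(mapping).values())
for copies, count in sorted(widths.items()):
    print(f"{count} desktop pixels occupy {copies} physical column(s)")
[/code]

Under these assumptions every fourth desktop pixel ends up twice as wide as its neighbors, which is exactly the uneven, slightly blurred text described above.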
 