
Going from DVI to VGA Adapter to DVI without restarting?


Foxie3a

Normal Member
Joined
Sep 7, 2003
I'm having trouble getting DVI to work. Both of my monitors (a Samsung 19" and an Acer 20") are pretty new, and both accept DVI or VGA. Neither of them works over DVI, so it's not the monitors. I have also tried an older DVI cable I had lying around and a new "premium" cable I bought, so the cables can be ruled out too.

The video card is a Radeon X1650, pretty new and not abused. It has one DVI output and one VGA output, and I am using both of them, one for each monitor. I am forced to run both monitors in VGA mode, so the DVI port goes through a DVI-to-VGA adapter. Things have been working fine in VGA for a long time.

The thing is, DVI is better, and I have already invested in the cables, so I want to switch. My current VGA cables are also not long enough, and I need the long DVI cables I bought, so I will be moving to DVI eventually, though maybe not until I get my new workstation up and running in about a month. I have an HD3850 with dual DVI sitting here waiting to be used, so both ports will get used then. For now, I wouldn't mind getting just one of them working to experiment. It would also let me move my monitors closer to where I want them. I sit off to one side of my desk, and I have an ergonomic LCD arm that lets the monitors float wherever I want them. The closer they are to where I want them, the less I have to crane my neck.

So I unplug the VGA cable and adapter and plug in the DVI cable. Both monitors complain that they aren't receiving a signal. Even if I turn a monitor off and back on, it never works when plugged into DVI. I think they are looking at the DVI input but aren't seeing a signal.

That is the problem: why isn't there a DVI signal? Do I need to reboot the system when going from DVI with a VGA adapter to actually using DVI? It seems weird that it would require a reboot. I would think the card is already using the DVI port and wouldn't know the difference between an actual DVI monitor and a DVI-to-VGA adapter, but maybe the adapter somehow tells the card to output analog instead and isn't really converting anything?

I know it sounds silly, but I just don't restart my computer much, and if I ever do, I'm probably not in the mood for reaching back there and rerouting wires. If it isn't a reboot that I need, what else is there? It sounds funny, but I have never used anything but VGA.
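For what it's worth, I could probably at least see which outputs the driver thinks are attached to the desktop without rebooting. Here's a rough little Win32 sketch of what I mean (assuming Windows and a C compiler; I haven't actually built or run it):

    /* Rough sketch only: list the outputs the display driver currently
     * exposes and whether each one is attached to the desktop.
     * Assumes Windows; build as a Win32 console app and link user32. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DISPLAY_DEVICEA dd;

        for (DWORD i = 0; ; ++i) {
            ZeroMemory(&dd, sizeof(dd));
            dd.cb = sizeof(dd);

            /* NULL device name = enumerate the adapter's outputs. */
            if (!EnumDisplayDevicesA(NULL, i, &dd, 0))
                break;

            printf("%s  %s%s%s\n",
                   dd.DeviceName,   /* e.g. \\.\DISPLAY1 */
                   dd.DeviceString, /* adapter name reported by the driver */
                   (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? "  [attached]" : "  [not attached]",
                   (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? " [primary]" : "");
        }
        return 0;
    }

If the DVI head never shows up as attached, that would at least tell me it's the card or driver side and not the monitors.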

Thanks! :)
 

CGR

Member
Joined
Jan 4, 2001
Location
Lower NY
Did you go into the monitor settings and see if you have to switch it from VGA to DVI?
 
OP
Foxie3a

Normal Member
Joined
Sep 7, 2003
I tried that. Both monitors won't even show their settings unless they have a signal, so while in analog mode I switch the input to DVI. When I hit OK it acts like it switched and the menu disappears, but when I go back in, it's still on analog. Also, I have both outputs plugged into one monitor, so if it went from analog to digital I would see it switching inputs, but it isn't.

Neither monitor is working, so it's got to be the video card or something. Maybe it needs a restart; I'll try that sometime.

I'm wondering if the monitors are DVI-D and the video card is regular DVI, or something weird like that. I just hope the HD3850 works perfectly.
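If I get curious before the new rig is together, I suppose I could also test the DVI-D theory from software: Windows caches each monitor's EDID in the registry, and byte 20 of the EDID says whether the panel claims a digital input. Another rough, untested sketch (again assuming Windows and a Win32 C compiler, linked against advapi32):

    /* Rough sketch: dump the cached EDID for every monitor Windows has
     * seen and report whether it claims a digital input (EDID byte 20,
     * top bit). Assumes the EDID is cached in the usual registry spot. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HKEY display;
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                          "SYSTEM\\CurrentControlSet\\Enum\\DISPLAY",
                          0, KEY_READ, &display) != ERROR_SUCCESS)
            return 1;

        char model[256], inst[256], path[600];
        for (DWORD i = 0; ; ++i) {
            DWORD mlen = sizeof(model);
            if (RegEnumKeyExA(display, i, model, &mlen, NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
                break;

            HKEY modelKey;
            if (RegOpenKeyExA(display, model, 0, KEY_READ, &modelKey) != ERROR_SUCCESS)
                continue;

            /* Each model key holds one subkey per time the monitor was seen. */
            for (DWORD j = 0; ; ++j) {
                DWORD ilen = sizeof(inst);
                if (RegEnumKeyExA(modelKey, j, inst, &ilen, NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
                    break;

                snprintf(path, sizeof(path), "%s\\Device Parameters", inst);

                HKEY params;
                if (RegOpenKeyExA(modelKey, path, 0, KEY_READ, &params) != ERROR_SUCCESS)
                    continue;

                BYTE edid[256];
                DWORD elen = sizeof(edid);
                if (RegQueryValueExA(params, "EDID", NULL, NULL, edid, &elen) == ERROR_SUCCESS
                    && elen > 20)
                    printf("%s: %s\n", model,
                           (edid[20] & 0x80) ? "digital input" : "analog input only");
                RegCloseKey(params);
            }
            RegCloseKey(modelKey);
        }
        RegCloseKey(display);
        return 0;
    }

It would list every monitor Windows has ever seen on this box, but if both the Samsung and the Acer report a digital input, the monitor side of the DVI-D theory is covered and it's back to suspecting the card or driver.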