
Radeon + DVI = no signal


Vfrjim1

Member
Joined
Sep 18, 2001
Location
Rhode Island, USA
I have a Radeon 7500 and my friend has an 8500, and we both seem to have the same problem. After approximately an hour or so of non-use of the PC (no keys touched or mouse moved), but with an application still running, I switch back to the monitor's DVI input and it says No Signal. If I remote into the machine, the desktop is active and I can reboot, but the signal will not come back to the monitor until I reboot the PC. I have the PC set to NO energy savings, so it should not be shutting the monitor off. Does anyone have a clue as to what is going wrong?

Thanks,

Jim
 

MospeadasDark

Member
Joined
Mar 28, 2002
Location
Lansdowne, VA
Are you turning off the monitor? If you let it sit there turned on and let Windows shut off the monitor, it's fine.

If you physically turn the monitor off, both the Radeon 7500 and the 8500 will drop their DVI signal. Most DVI cards need to detect a DVI monitor on the DVI connector at bootup for the DVI output to be active.
 
OP
Vfrjim1

Member
Joined
Sep 18, 2001
Location
Rhode Island, USA
The monitor I am using has multiple inputs and DVI is just one of the 7 inputs, so the monitor is not off but is being used on one of its other inputs (not DVI, though).

So what you're saying is that it needs to be connected exclusively to the DVI input for Windows to keep the DVI output running? 'Cause that would bite. Does Nvidia operate the same way, or is this an ATI thing?

Thanks,

Jim
 

MospeadasDark

Member
Joined
Mar 28, 2002
Location
Lansdowne, VA
Vfrjim1 said:

So what you're saying is that it needs to be connected exclusively to the DVI input for Windows to keep the DVI output running? 'Cause that would bite. Does Nvidia operate the same way, or is this an ATI thing?

Thanks,

Jim

It's not Windows, it's the card. The video card needs to detect the monitor on DVI before POST to keep the signal. Yeah, it does suck =) but I know my Parhelia doesn't operate this way. I haven't tried a GF4 yet because I lack time =\
 
OP
Vfrjim1

Member
Joined
Sep 18, 2001
Location
Rhode Island, USA
Well, I DO allow the monitor to be detected before POST and it still does this. The only thing I do while the PC is running is switch the monitor's input to another source, and it loses the DVI output anyway.
 

MospeadasDark

Member
Joined
Mar 28, 2002
Location
Lansdowne, VA
Yes, that is what I'm saying.

In order to get a DVI signal you must have the DVI monitor connected before POST.

If at any time you remove the DVI monitor's connection, the signal will die. Analogue still works, but there is no longer a DVI signal.
 
OP
Vfrjim1

Member
Joined
Sep 18, 2001
Location
Rhode Island, USA
Well, I found out how to solve the problem on the ATI cards, and it should work with Nvidia as well, but you need PowerStrip. Here are the instructions:

PowerStrip => Options => Monitor information, then choose "Write custom monitor driver" from the "Options" pulldown. Enter the info you want, turning off DPMS, and save the INF.

To make it take effect: Start => Run => devmgmt.msc (Device Manager) => expand "Monitors" => double-click your monitor => Driver => Update Driver => Install from a list or specific location => Don't search => Next => Have Disk => select the INF you just created.
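For reference, here is a rough sketch of what a custom monitor INF with DPMS disabled might look like. This is only an illustration, not the exact file PowerStrip writes; the hardware ID, model strings, and frequency ranges below are placeholders you would swap for your monitor's real values. The important part is the DPMS entry set to 0, which tells Windows the monitor does not support DPMS power-down.

; Hypothetical custom monitor INF with DPMS disabled (a sketch,
; not the file PowerStrip actually generates; IDs, strings, and
; frequency ranges are placeholders)
[Version]
Signature="$CHICAGO$"
Class=Monitor
ClassGUID={4d36e96e-e325-11ce-bfc1-08002be10318}
Provider=%PROVIDER%

[Manufacturer]
%MFG%=CustomMonitor

[CustomMonitor]
%MODEL%=Custom.Install, Monitor\CUSTOM01

[Custom.Install]
AddReg=Custom.Modes, Custom.Res, Custom.DPMS

[Custom.Modes]
; horizontal (kHz) and vertical (Hz) scan ranges -- use your monitor's
HKR,"MODES\1280,1024",Mode1,,"30.0-80.0,56.0-75.0,+,+"

[Custom.Res]
HKR,,MaxResolution,,"1280,1024"

[Custom.DPMS]
; 0 = monitor reported as not supporting DPMS, so Windows
; never sends the power-down signal that drops the DVI link
HKR,,DPMS,,0

[Strings]
PROVIDER="Custom"
MFG="Custom"
MODEL="Custom DVI monitor (DPMS off)"

Installing it through Device Manager as described above makes Windows treat the panel as a non-DPMS monitor, so it should never ask the card to cut the signal.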

So far, I've tried it and it has worked for 6+ hours, where before the DVI output would shut off after only an hour.

My thanks to "saleh" at www.avsforums.com for giving me the solution to this problem; I figured I would share it with all of you.

Jim