
Display mishandling input changes


trents

Senior Member
Joined
Dec 27, 2008
I had just about started an RMA process for my motherboard last night when I stumbled upon the real issue.

I was experimenting with building a Hackintosh on the system outlined in my sig. If you have never built a Hackintosh, you need to know that you generally need to begin the Mac OS installation process with a vanilla build. By that I mean you remove unnecessary add-on hardware like video cards, get the Mac OS installed first, and then add things back in. In this particular situation it meant removing my Nvidia GTX 1050 Ti and going with the IGP. To make a long story short, after I put my video card back in I no longer got a video signal, not in Mac OS or in Windows. A message popped up on the monitor saying no signal. Both the video card and the onboard video have HDMI output ports, so I wasn't switching to a different kind of signal. That wasn't an issue.

I took the card in and out several times but no joy. I concluded that the PCI-e component of the motherboard had failed, but before generating an RMA with ASRock I tried one more thing while the Nvidia card was still in place. I hooked up another monitor and bingo! Got a picture again.

So then I thought to myself, maybe the HDMI port or electronics on the monitor are bad. So I got a DVI cable out, switched over to the DVI port on the monitor and the video card, and got a signal on the original monitor. Okay, I thought, I'll just go with DVI, since I use external speakers anyway and don't need the HDMI audio.

But then for some reason I started monkeying with the control buttons on the monitor and checked the input setting. It was set to DVI, not HDMI. With the control I switched the input setting on the monitor back, and now HDMI was working again.

What I don't understand is why the monitor automatically switched to DVI input, since I was using HDMI on both the IGP and the Nvidia 1050 Ti to begin with.

Thought I'd share this experience so that others who run into the same phenomenon won't reach false conclusions about hardware failure like I initially did. What I missed early on was that the no-signal message on the display said, "No DVI signal." If I had caught that, I would have realized the monitor had falsely treated the change from video card to IGP and vice versa as a change in input signal type, even though I was using HDMI in both cases.
 
OP
trents

Senior Member
Joined
Dec 27, 2008
I was under the impression that modern displays should automatically detect the input type.
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
Don't you have an "auto-detect" on/off option in the monitor OSD?
Ahh yes, that manual. :)

I know my two knock-off (Yamakasi) monitors need to be switched manually. Hell, my Samsung Smart TV (2015) ONLY does it with DVD (didn't check the options to see otherwise, to be fair). But yeah, it isn't a leap to think it should automatically detect it... neither is manually checking, though. First tool out of the belt in my world. :)
 
OP
trents

Senior Member
Joined
Dec 27, 2008
Don't you have an "auto-detect" on/off option in the monitor OSD?

No. In the OSD there are only choices for the various input types, but no option for auto. And the OSD does not seem to show unless you already have a valid signal. That was part of the problem: I couldn't get into the OSD to change the input. The only way out turned out to be physically changing to a different kind of cable. So I'm wondering if the monitor is getting a little marginal. It's about three years old now.