
1080Ti doesn't like HDMI output.


yoadknux

Member
Joined
May 6, 2016
Hi, I've recently upgraded my MSI 1080 to an Aorus 1080 Ti because I managed to get a pretty good deal on it. It's a used card, so I ran a lot of stress tests and everything looks good. One weird issue I've been having is that it refuses to work with HDMI. Allow me to explain the situation:

My 1080 Ti connects to a BenQ 144Hz 1080p screen with a DVI-D cable. That works well and reaches the designated resolution and refresh rate. I can also connect the screen with a DisplayPort cable and that works too, but I prefer the DVI-D one. I recently tried connecting the GPU to an HDMI TV using an HDMI-to-HDMI cable and got no signal. It should be noted that the TV works with a laptop and worked with my old 1080 using the same cable. That seemed odd, so I ran another test: I connected my BenQ screen (which also has an HDMI input) using the HDMI cable, and again got no signal. So I thought, "Well, OK, I guess maybe the HDMI output of the GPU is bad," and bought a DisplayPort-to-HDMI adapter (the DisplayPort end goes into the GPU) annnd... no signal. I have no other devices with DP output, so I cannot test the adapter itself.

Allow me to sum up the problem and my attempted solutions:

The Problem: Unable to connect 1080ti to HDMI TV.

Attempted Solutions:
1. Connect TV to GPU with different HDMI cable - failed
2. Connect other devices to TV with HDMI - worked
3. Connect a different screen to GPU with HDMI - failed
4. Connect a different screen to GPU with DVI-D/DP - worked
5. Connect TV to GPU with a DP-HDMI adapter - failed

Just wondering if someone has ever run into that sort of thing? :confused:
 
That's good.
It's a big holiday weekend, so there are few people on the forum.
Did you install the NVIDIA audio driver as well?
If so, set the audio device to it in Device Manager and see if audio transmits via the card's HDMI port.
 

I'm not sure if this is the answer you're looking for, but I have almost the same problem.
I have an HDMI TV/monitor (32" RCA @ 1080p) that does not like older Nvidia cards. I put a DVI-D to HDMI converter on the cards to make them work with my TV. I have problems with all Nvidia 4xx and below, and some 5xx and 6xx.
Here is my problem:
Nvidia 4xx and below - will power my monitor, but every 5 to 8 seconds the display flashes. Sometimes it is only 1 flash, other times it's 4-5 flashes in a row, like disco lights :(
Nvidia 5xx - same thing, but every 20 to 25 seconds.
Nvidia 6xx - same thing, but every 40 to 45 seconds.
When the screen flashes, I get "HDMI Connection 2 1920x1080 @ 60hz" in the upper right corner.
I do not have this problem when I can use the video card's own HDMI connector.
I do not have any problems when I use my 24" computer monitor connected DVI-D to DVI-D.
 
Interesting. I have an EVGA model, but my HDMI works on my TV and my VR headset (Oculus Rift) without issues. On the VR side, I know some others with different brands had issues with HDMI to the headset. At least, thankfully, my experience has been pleasant.
 
I have an update. I got the HDMI to work, but the plot thickens. Just a reminder: I have a 144Hz BenQ screen which I can drive over either DVI-D or DP, and I have an HDMI TV. The original problem was that I was unable to get anything over HDMI to work.

So I did the following test: I disconnected the BenQ screen, connected the HDMI TV, and manually rebooted, because nothing was showing up on the screen. It worked! I could see the POST screen and everything was fine. Once in Windows, I connected the BenQ screen over DVI-D. The BenQ screen was not identified - only the TV.

I then rebooted with BOTH DVI-D and HDMI connected. It only identified the DVI-D connection and, oddly enough, did not let me change the refresh rate to 144Hz, only 65Hz (probably the TV's refresh rate?). I shut the computer down and unplugged the DVI-D.

Next, with the DVI-D unplugged and HDMI still connected, I hooked up the BenQ display via DP. So now I have DP->BenQ and HDMI->TV, and for the first time both of them work. Here is what I see in the NVIDIA Control Panel and the Windows display menu:

[screenshots: NVIDIA Control Panel and Windows display settings showing both displays]

Someone was wondering about the audio: when the HDMI TV works, it also carries audio, and I can switch the default audio device between the HDMI TV and other outputs.

This is relatively fresh and I am still testing things at this very moment. I tried shutting the TV off and turning it back on to see what happens, and it worked perfectly: the BenQ screen reverted to 144Hz and was the only display identified; then, upon turning the TV back on, the TV was detected again and the setup returned to extended displays.

I will now perform a reboot test to see what happens. I don't want to play with it too much, but I think the problem is either a) the GPU doesn't like "real-time" (hot-plug) connections, or b) DVI-D and HDMI don't like each other. Either way, both possibilities are very weird.

What do you guys think?
 
Sometimes on my PC, Windows does not detect hardware automagically. It's intermittent, and is not isolated to one type of device (could be USB, monitor, etc.)

So, if I don't hear the magic "do-da-deet" (that's the best I can do) from Windows when I plug in new hardware, I force a manual scan for hardware changes in Device Manager... then Windows sees the device changes.
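On that note about detection: over HDMI/DVI/DP, the OS only "sees" a monitor after successfully reading its EDID block over the DDC lines, so a flaky handshake can leave a powered-on display invisible until a rescan. As a rough illustration (synthetic data, not read from any of the cards discussed here), a base EDID block is 128 bytes, starts with a fixed 8-byte header, and must sum to 0 mod 256:

```python
# Sketch: the two easiest sanity checks a driver applies to a base EDID block.
# The block below is synthetic (header + zero padding + fixed-up checksum),
# purely to illustrate the format - it describes no real monitor.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    """A base EDID block is 128 bytes, starts with the fixed header,
    and all 128 bytes must sum to 0 modulo 256."""
    if len(block) != 128:
        return False
    if block[:8] != EDID_HEADER:
        return False
    return sum(block) % 256 == 0

# Build a synthetic block, then set byte 127 so the checksum works out.
block = bytearray(EDID_HEADER) + bytearray(120)
block[127] = (256 - sum(block[:127])) % 256
print(edid_block_valid(bytes(block)))  # True
```

If this read fails (bad cable contact, adapter quirks, hot-plug timing), the display simply never appears in Windows, which matches the "plugged in but not identified" symptoms above.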
 
How old is the HDMI cable? It may not be able to carry the bandwidth required for a 144Hz monitor. It's also possible the monitor just doesn't support 144Hz over HDMI. Another possibility is that the card's ports don't allow simultaneous HDMI and DVI, or that a certain HDMI port must not be used while the DVI port is in use - I think I saw markings on the backplate of my card indicating which ports can be active at the same time.
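To put rough numbers on the bandwidth point: the sketch below uses assumed blanking overheads and the commonly cited 165 MHz / 340 MHz TMDS clock limits for Standard vs. High Speed HDMI cables. These are back-of-the-envelope estimates, not measurements from this card or monitor:

```python
# Rough estimate of the pixel clock a video mode needs, including blanking.
# Assumed overheads (~8% horizontal, ~3% vertical) approximate CVT-style
# reduced-blanking timings; real modes vary.

def required_pixel_clock_mhz(w, h, hz, h_overhead=1.08, v_overhead=1.03):
    """Estimate the TMDS/pixel clock (MHz) for a w x h mode at hz Hz."""
    return w * h_overhead * h * v_overhead * hz / 1e6

STANDARD_MHZ = 165    # "Standard" HDMI cable certification limit
HIGH_SPEED_MHZ = 340  # "High Speed" HDMI cable certification limit

for mode in [(1920, 1080, 60), (1920, 1080, 144)]:
    clk = required_pixel_clock_mhz(*mode)
    cable = ("fits Standard" if clk <= STANDARD_MHZ
             else "needs High Speed" if clk <= HIGH_SPEED_MHZ
             else "exceeds High Speed")
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk:.0f} MHz -> {cable}")
```

By this estimate 1080p@60 sits comfortably under the Standard-cable limit, while 1080p@144 lands well above it, which would explain why an older cable works on a TV but not on a 144Hz panel.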
 
How old is the HDMI cable? It may not be able to carry the bandwidth required for a 144hz monitor. It's also possible the monitor just doesn't support 144hz over HDMI.
This part is probably irrelevant, because the refresh rate usually just drops when you change the cable; e.g., on my old 1080 the monitor worked fine over HDMI but lowered the refresh rate to 120Hz.
Another possibility is the ports on the card don't allow simultaneous HDMI and DVI, or it requires a certain HDMI port not be used while using the DVI port - I think I saw markings on the backplate of my card that indicated which ports could be active at the same time.
I didn't notice any markings, but this could very well be what happened here. See my previous post in this thread.

The DP+HDMI combination works well, so the problem is solved. In my opinion, this is either:
1. A compatibility issue with Windows 10
2. Some weird limitation, like you mentioned, on which outputs can be active at the same time
 
I had an issue like this when I UPDATED to Windows 10 a while ago. I had been running 8 for a while on one of my PCs but decided to run dual monitors on it with an HDMI cable. The 60Hz monitor worked fine plugged into the onboard graphics, but HDMI into a 970 Ti just wasn't having it.

Just wanted to share, because it might be a Win 10 thing.

HeheParty
 
The issue is more or less resolved for the moment. I don't know what causes it, but I was able to work around it. In short, DVI-D and HDMI don't work together; DP and HDMI, on the other hand, do.
 
Resolved? For a card that retails for ~$700 or more, I think not. Are you running 10? Are you giving it enough power? Try booting into a Linux live disc to see if it will drive both DVI and HDMI. You might also take it to a PC guy who can test it on a different system altogether. I don't even have a video card, and DVI and HDMI work perfectly. I just don't like that I can't press one button to switch displays. You should be able to have DVI with multiple HDMIs at once.
Personally I'd like to get a Radeon Pro WX 9100; they're only $1600.
Edit: Just found Monitor Profile Switcher on SourceForge. One click from the tray switches displays perfectly without having to okay anything.
https://sourceforge.net/projects/monitorswitcher/?source=typ_redirect
 
Sorry, I know this thread is a little old, but I ran into the same issue. What I figured out was that you need an HDMI High Speed cable instead of a Standard cable. Once I swapped them out, I had no issues.
 