
New build, EVGA GTX 1080 FTW video problem?


Smoknum
I seem to be having some trouble on a new build, and I'm hoping someone here can offer some insight.

I am having issues with video output on my GTX 1080 FTW card. I get video through DVI, but absolutely nothing using a DisplayPort or HDMI connection to my monitor. I'm hoping it's something simple I'm overlooking; I have fresh NVIDIA drivers installed. Everything else seems to be working fine other than this issue. I have built several computers, and this is the first time I have run into an issue like this.


Parts used are listed below:

Gigabyte Z170 Gaming 7
Intel Skylake 6700K
G.Skill DDR4 3000MHz RAM
EVGA GTX 1080 FTW
Windows 10 Pro
Acer XG270HU monitor


Any suggestions or ideas would be great.

Thanks
Chuck
 
+1 to what ATMINSIDE said.

Also, after you boot into Windows, try plugging the HDMI/DP cable in... the monitor should be detected automagically.

Are your drivers up to date?
 
When I switch from the VGA input to HDMI, I have to turn my monitor off and then on. The first time I switched, I thought something was defective, because the monitor is supposed to auto-detect the input. When I go back to VGA, I can just change the input and it works.

I think it has to do with the monitor's standby mode: when you use VGA or DVI, the card sends the monitor a signal to come out of standby, while HDMI only has a signal-lost state.
 
I don't use either of those inputs (VGA/HDMI), so no clue... I am a DVI and DP guy. That said, I don't need to shut my monitor off to switch from DVI to DP or vice versa.

I can't imagine too many people are using VGA these days either... it's kind of archaic.
 
Definitely all good information so far. The first thing is to check that the HDMI monitor is in fact set to the correct input, as everyone has already recommended. If that doesn't work, try swapping the HDMI cable for a newer one, or switching the HDMI port you use on the LCD screen.

But anyway, unplug everything except for your DVI monitor and sit on the Windows 10 desktop. Go ahead and turn your HDMI LCD monitor on and make sure it is on the correct HDMI input. Now, while you are still on the Windows 10 desktop with the PC on, plug the new cable into the back of the 1080 as well as the back-panel input on the LCD screen.

Now take a look at the image below and go into your Windows 10 display settings by right-clicking on the desktop. You should see two monitors. Please tell me what your screen is showing compared to the picture below. If both monitors are showing, one monitor is simply turned off in the driver settings. If you are still only showing your DVI monitor, there is some other issue going on.

[screenshot: Windows 10 display settings showing two detected monitors]
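If you would rather check from a script than eyeball the Settings page, here is a minimal sketch (my own illustration, Python 3 on Windows) that asks the Win32 EnumDisplayDevices API which display outputs the system currently sees; the flag values are the standard ones from wingdi.h. If the 1080's HDMI/DP output never shows up as attached here, the monitor isn't being detected at all, and it is more than a disabled-in-driver issue.

# Minimal sketch (Python 3, Windows only): list the display outputs
# Windows currently sees, via the Win32 EnumDisplayDevices API.
import ctypes
from ctypes import wintypes

# Standard StateFlags values from wingdi.h
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
i = 0
dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)  # the API requires cb to be set before the call
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    print(f"{dev.DeviceName}: {dev.DeviceString} "
          f"(attached={attached}, primary={primary})")
    i += 1

An output that Windows has detected but turned off in the display settings should still enumerate here, just with attached=False.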
 
At Best Buy, all the monitors they sell in store are VGA-compatible 1080p. Most of the OEM desktop PCs in stores at Best Buy and Sam's Club use a VGA connection to a 1080p monitor.

Having a VGA connection might be archaic; however, I like that the VGA plug can't be pulled out accidentally when cleaning around the cable, like HDMI can be. With VGA, the picture quality is the same when using the DAC and then the ADC, like they do for a 1080p LCD monitor.

My Samsung monitor is not archaic; it's only six months old. I purchased it from Best Buy; it is 27", PLS panel, 4 ms.
 
DVI and DisplayPort also can't be accidentally removed...

EarthDog wasn't referring to the monitor itself as archaic, simply the VGA interface.
 
It depends on the connector. I have DisplayPort cables that lock (and there are some that don't), though I can't say I have run across HDMI cables with a locking mechanism either. I can also say I have never yanked an HDMI cable out of anything, LOL!

Anyway, yes, I meant the port is archaic. It's gone the way of the dodo. I don't recall the last two gens of GPUs even having a VGA port on them. The fact that you have to use a converter on modern GPUs to plug into a monitor using VGA should tell you something! That said, most people aren't as up to date as we are... on the flip side, again, nobody is making them anymore in the GPU and motherboard world, or they are quite rare if they are still found on the latest boards. It's like a PCI slot, more of a legacy thing than commonplace.

So, you are telling me that you are using VGA over DVI on your monitor, Wingy? You may want to have a read: http://www.cnet.com/news/hdmi-vs-displayport-vs-dvi-vs-vga-which-connection-to-choose/
Don't use VGA, not if you can help it. While it is capable of fairly high resolutions and frame rates, it's an analog signal. You're not likely to get a pixel-perfect image with today's LCD monitors (hence why you'd use DVI).
 

My HDMI cable comes out of the GTX 970 video card with ease; on the monitor I have, however, it snags.

Today, the VGA analog interface is used for high definition video, including resolutions of 1080p and higher. While the transmission bandwidth of VGA is high enough to support even higher resolution playback, there can be picture quality degradation depending on cable quality and length. How discernible this degradation is depends on the individual's eyesight and the display, though it is more noticeable when switching to and from digital inputs like HDMI or DVI. https://en.wikipedia.org/wiki/Video_Graphics_Array

My VGA cable is the correct length for great operation.
 
I just don't understand why you wouldn't use a digital, modern interface... why you are going from HDMI to your VGA input. Why not HDMI all the way? Or DisplayPort? Or DVI?
 
I'm testing HDMI vs. VGA now. You kind of pushed me to read a lot about VGA and the RAMDAC and how it works, and it gave me an idea that maybe it would fix my web-browsing trouble. As of 2006, the RAMDAC only runs at 400 MHz; I think it causes back pressure and timing trouble from the CPU to the video card, with the RAMDAC backing up before the display. With HDMI, all digital, my web browser is more stable and scrolls faster and smoother; I'm still testing back to back.

The RAMDAC uses RAM, so in gaming the graphics do not change color as fast as when surfing the web.
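For scale on that 400 MHz figure (my own back-of-envelope arithmetic, using the standard CEA-861 timing for 1080p60): the pixel clock 1080p60 actually needs is about 148.5 MHz, so raw RAMDAC bandwidth is not the bottleneck at that resolution.

# Rough pixel-clock check for 1080p60 (standard CEA-861 timing:
# total raster of 2200 x 1125 including blanking, at 60 Hz).
h_total, v_total, refresh_hz = 2200, 1125, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(pixel_clock_mhz, "MHz")  # 148.5 MHz, well under a 400 MHz RAMDAC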

Random Access Memory Digital-to-Analog Converter (RAMDAC) is a combination of three fast DACs with a small SRAM used in computer graphics display controllers to store the color palette and to generate the analog signals (usually a voltage amplitude) to drive a color monitor. The logical color number from the display memory is fed into the address inputs of the SRAM to select a palette entry to appear on the data output of the SRAM. This entry is composed of three separate values corresponding to the three components (red, green, and blue) of the desired physical color. Each component value is fed to a separate DAC, whose analog output goes to the monitor.
https://en.wikipedia.org/wiki/RAMDAC
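
To make the quoted description concrete, here is a toy model (illustration only, nothing like real driver code) of that lookup: the logical color number indexes the palette SRAM, and each of the three components is scaled to an analog level; VGA's full-scale video signal is nominally 0.7 V.

# Toy model of the RAMDAC lookup described above: a palette (the SRAM)
# maps a logical color number to an (R, G, B) entry, and each component
# is scaled to an analog level. VGA video is nominally 0.7 V full scale.
FULL_SCALE_VOLTS = 0.7   # nominal VGA analog video level
MAX_LEVEL = 255          # assuming 8 bits per color component

palette = {              # the "SRAM": color number -> (R, G, B)
    0: (0, 0, 0),        # black
    1: (255, 255, 255),  # white
    2: (255, 0, 0),      # red
}

def ramdac(color_number):
    """Look up a palette entry and convert each component to volts."""
    return tuple(level / MAX_LEVEL * FULL_SCALE_VOLTS
                 for level in palette[color_number])

print(ramdac(2))  # (0.7, 0.0, 0.0): red DAC at full scale, green/blue at 0 V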
 

Cool story. Use a modern interface. Lol!
 
Sam's, LOL...

Listen, I'm not going in circles with ya. It's a legacy connection, and if you can help it, you shouldn't be using it. You surely shouldn't be intentionally adding an adapter to use that port. Cheers. :)
 