
Audio output question


Jeff G

Member
Joined
May 22, 2016
I'm looking to hook my gaming PC up to my home theater (want to try gaming on the big screen), and I'm wondering: A) does the GTX 1070 output audio through the HDMI port, and B) is HDMI my best bet for audio output? I have a higher-end receiver, so it's capable of any compressed/uncompressed signal. I just want to make sure I'm getting the best audio for my surround sound, and I'm going to order my cabling today (don't want to place two orders if I don't have to).
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
Yes... assuming you have the audio drivers installed from NVIDIA.

I would think the S/PDIF out on the motherboard would be your best source of audio. Not sure how big of a difference it makes vs HDMI though or how to switch sources other than manually selecting the S/PDIF out...
 
OP
Jeff G

Jeff G

Member
Joined
May 22, 2016
...or how to switch sources other than manually selecting the S/PDIF out...

I will have to look closer at the Creative dashboard when I get home, but I think it has an option to output as S/PDIF. Should be as easy as switching it from my current headphone setting (I guess some trial/error is in store). I think I have some good quality S/PDIF cables at home (too short for my end application) that I could do some testing with before ordering.
 
OP
Jeff G

Jeff G

Member
Joined
May 22, 2016
After quite a bit of reading, it looks like this for quality:
1st - Analog. Best quality, but I think my board only supports 5.1, so I'd lose some of the 7.1 setup. Not sure I'd notice; I only went 7.1 because my receiver supported it.
2nd - HDMI, but only if it's 1.3 or higher. My card supports 2.0, so I'm good on that end. I'll have to see what the receiver supports when I get home. Might finally be a reason to upgrade my receiver.
3rd - SPDIF. Almost every comparison review I read stated the shortcoming of the SPDIF to be lack of bandwidth.
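The bandwidth point actually checks out with some quick math. A rough sketch (the 24-bit/96 kHz and 24-bit/192 kHz figures below are illustrative assumptions, not numbers from this thread):

```python
def pcm_bitrate(channels, bit_depth, sample_rate_hz):
    """Raw (uncompressed) PCM bit rate in bits per second."""
    return channels * bit_depth * sample_rate_hz

# Uncompressed 7.1 PCM at 24-bit/96 kHz -- the kind of stream HDMI 1.3+ can carry
hdmi_71 = pcm_bitrate(8, 24, 96_000)       # 18,432,000 b/s, ~18.4 Mb/s

# S/PDIF tops out around 2-channel PCM at 24-bit/192 kHz; anything
# multichannel has to be lossy-compressed (DD/DTS) to fit
spdif_max = pcm_bitrate(2, 24, 192_000)    # 9,216,000 b/s, ~9.2 Mb/s

print(f"7.1 PCM needs {hdmi_71/1e6:.1f} Mb/s; S/PDIF carries at most ~{spdif_max/1e6:.1f} Mb/s")
```

So uncompressed 7.1 simply doesn't fit down S/PDIF, which is exactly the "lack of bandwidth" those comparisons were talking about.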
 
Last edited:
Joined
Dec 13, 2005
1st - Analog. Best quality, but I think my board only supports 5.1, so I'd lose some of the 7.1 setup. Not sure I'd notice; I only went 7.1 because my receiver supported it.

Where'd you read this? Unless you have a decent DAC, I'd wager having your receiver do any D/A conversions is your best bet.
 
OP
Jeff G

Jeff G

Member
Joined
May 22, 2016
Where'd you read this? Unless you have a decent DAC, I'd wager having your receiver do any D/A conversions is your best bet.

I googled HDMI vs SPDIF vs Analog, I read reviews from like 20 different sites and that seemed to be the consensus. If my receiver supports HDMI 1.3, I'll probably go that route.
 

EarthDog

Gulper Nozzle Co-Owner
Joined
Dec 15, 2008
Location
Buckeyes!
Not sure I see the point in analog unless you are listening to vinyl or a lossless format. Most sources are digital these days. I would use HDMI or Optical/SPDIF.
 
Joined
Dec 13, 2005
I googled HDMI vs SPDIF vs Analog, I read reviews from like 20 different sites and that seemed to be the consensus. If my receiver supports HDMI 1.3, I'll probably go that route.

Ah, after seeing ED's reply I think I see the confusion. Sound itself is analog, and any time you convert from analog to digital or back, there's some quality loss. If you have a true analog source, such as an old vinyl turntable, then yes, keeping everything analog is probably best (some exceptions, of course).

However, in this case, keeping the signal digital all the way to your receiver (over HDMI or S/PDIF, which it sounds like you're planning) and letting it convert the signal back to analog will be better: the conversion circuitry in the receiver is bound to be better than what's on your motherboard, and better isolated from electrical noise.
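As a side note on "some quality loss": the loss in a digital conversion is mostly quantization noise, and it's quantifiable. A quick sketch using the standard 6.02N + 1.76 dB rule of thumb for an ideal N-bit converter (the bit depths are just illustrative):

```python
def quantization_snr_db(bits):
    """Theoretical peak SNR of an ideal N-bit quantizer (full-scale sine input)."""
    return 6.02 * bits + 1.76

for n in (16, 24):
    print(f"{n}-bit PCM: ~{quantization_snr_db(n):.1f} dB SNR")
# 16-bit lands around 98 dB, 24-bit around 146 dB -- both far above the
# analog noise floor of a typical motherboard output stage
```

In practice the analog stage, not the quantization, is the limiting factor, which is why moving the D/A step into the receiver helps.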
 

dominick32

Senior Solid State Aficionado
Joined
Dec 19, 2005
Location
New York
Yes to all suggestions. Since we are talking about a signal that starts as digital, it's always 1000% better to output the pure digital signal and have the receiver do the DAC work for you. Remember fellas... digital is 1's and 0's. I would just go HDMI out from the GTX. The 1070/1080 offers 7.1-channel DTS/DD as well from your movies, so you would be able to use the BD/UHD BD or MKV file/etc. multi-channel audio soundtracks no problem. Good luck sir
 

NiHaoMike

dBa Member
Joined
Mar 1, 2013
Digital receivers do not use DACs in the traditional sense. They use delta-sigma modulators and an LC filter to implement the DAC, with the power stage in the middle. Any analog inputs go into an ADC, most likely a cheap one, and are only provided for compatibility purposes.
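For the curious, the delta-sigma idea is simple enough to sketch in a few lines of Python. This is a toy first-order modulator, not how any real receiver's DSP is implemented:

```python
def delta_sigma_1bit(samples):
    """Toy first-order delta-sigma modulator: PCM in [-1, 1] -> +/-1 bitstream."""
    bits, integrator, feedback = [], 0.0, 0.0
    for x in samples:
        integrator += x - feedback   # accumulate the error vs. the last output bit
        feedback = 1.0 if integrator >= 0 else -1.0
        bits.append(feedback)
    return bits

# A constant input of 0.5 comes out as a bitstream whose average is 0.5;
# in hardware, the LC filter does this averaging to recover the analog signal
stream = delta_sigma_1bit([0.5] * 1000)
print(sum(stream) / len(stream))
```

The point: the "DAC" is really a 1-bit switching stage plus a filter, which is why the power stage can sit in the middle of the conversion.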

Your best bet would be one HDMI (or DP) from the GPU to the display and a second HDMI from the GPU to the receiver. That avoids the receiver adding any lag. Set the outputs to mirrored.
 

dominick32

Senior Solid State Aficionado
Joined
Dec 19, 2005
Location
New York
Mike, anytime you take a digital signal and output it to your speakers, that is for all intents and purposes "DAC work"; we are not talking about an actual DAC device here. We are saying that the digital signal received by your receiver is being output to your speakers in analog form. This is the purest way to output from a computer to your 7.1 receiver/preamp IMO: an original digital signal straight from the source, assuming you're outputting from the receiver to traditional analog speakers.

Your idea to use two separate HDMI cables, one for audio and one for video, although a really nice idea, is in my opinion overkill for an average consumer who just wants some 7.1 audio from his PC. I have never experienced audio lag, and I only see that happening if you have some software glitches in the operating system, or you are pushing uncompressed 4K video with Dolby Atmos 32-channel audio :) and/or you are a mega-rich audiophile! Lol, other than that, I like your ideas. :attn:
 
Last edited:

Alaric

New Member
Joined
Dec 4, 2011
Location
Satan's Colon, US
You'll also have to change the default output device in Windows. Yeah, I know, Computers 101, but I still amaze myself when I forget that and then wonder where the sound went. LOL
 
OP
Jeff G

Jeff G

Member
Joined
May 22, 2016
You'll also have to change the default output device in Windows. Yeah, I know, Computers 101, but I still amaze myself when I forget that and then wonder where the sound went. LOL

Lol, it's always the simple things that seem to get overlooked the most!
I'm thinking of pulling the trigger on a new Atmos receiver now that they've come down in price a little; should make for some fun gaming!
I'm getting tired of swapping HDMI cables every time I decide to use a new device; 2 inputs just isn't cutting it anymore.
 

Alaric

New Member
Joined
Dec 4, 2011
Location
Satan's Colon, US
Sometimes it just gets too hot to run my amplifier, so I switch my audio to the HDMI and use my crappy TV speakers. Then the next day I fire up the tunes and stare at my electronics wall like the RCA dog for a second. My regular speakers are a couple feet from either side of the TV and it's simulated surround sound, so I first wonder how I blew every driver but my tweeters before I remember that setting. It's pretty funny. Afterwards. LMAO


You could always get a HDMI hub. http://www.newegg.com/Product/Produ...652&cm_re=HDMI_hub-_-9SIA50M1VN4652-_-Product
 

Alaric

New Member
Joined
Dec 4, 2011
Location
Satan's Colon, US
I've also heard some of them have issues with lag and video/audio sync problems. Cheaper than a new receiver, but sometimes you get what you pay for.
 

NiHaoMike

dBa Member
Joined
Mar 1, 2013
So when they advertise Burr-Brown or ESS DACs, they're not actual DACs in the receiver?
They're talking about the DSP that does the PCM to PWM/PDM conversion, which is basically half a DAC. Or it might be some old design that uses a traditional DAC along with an analog delta-sigma converter, but that only made sense prior to the availability of DSPs with built-in delta-sigma conversion. (Or it might be one of those "digital tube amps" that does in fact need to do a complete conversion to analog.)
Mike anytime you take a digital signal and output it to your speakers that is for all intents and purposes s"DAC WORK", we are not talking about an actual DAC device here. But we are saying that the digital signal received by your receiver is now being Outputed to your speakers in analog form. This is the purest way to output from a computer to your 7.1 receiver/pre amp IMO. An original digital signal straight from the source. Assuming you're outputting from the receiver to traditional analog speakers .

Your idea to use two separate HDMI cable's, one for audio, and one for video although a really nice idea , in my opinion is overkill for an average consumer that just wants some 7.1 audio from his PC. I have never experienced audio lag, and I only see that happening if you have some software glitches in the operating system or you are pushing uncompressed 4K video with Dolby atmos 32 channel audio :) and/or you are a mega rich audiophile! Lol other than that, I like your ideas. :attn:
There are "filterless" digital amplifiers that more or less just output a high level digital signal and let the speaker itself finish the conversion to analog. Granted, those are generally only used in speakers with built in digital amplifiers. (The fast edges cause EMI and standing wave issues on long lines.)

Every desktop GPU made nowadays has multiple outputs. Many receivers, especially the higher-end ones, add lag to the video output. It often also requires the receiver to be on just to get a display.
Sometimes it just gets too hot to run my amplifier so I switch my audio to the HDMI and use my crappy TV speakers. Then the next day I fire up the tunes and stare at my electronics wall like the RCA dog for a second. My regular speakers are a couple feet from either side of the TV and its simulated surround sound so I first wonder how I blew every driver but my tweeters before I remember that setting. It's pretty funny. Afterwards. LMAO


You could always get a HDMI hub. http://www.newegg.com/Product/Produ...652&cm_re=HDMI_hub-_-9SIA50M1VN4652-_-Product
A well designed amp should not get particularly hot with modest volume. (Unless it's a tube amp, of course.)

Those cheap HDMI switches and splitters rarely add a noticeable amount of lag since they're more or less dumb switches and buffers. If your problem is the receiver not having enough inputs but the display does, see if the receiver and display support HDMI ARC.
 

Alaric

New Member
Joined
Dec 4, 2011
Location
Satan's Colon, US
For music the amp is run in Class A at 25 wpc, drawing enough current to feed it at 95 wpc in A/B. The rest is turned to heat. With a healthy load on it, it builds enough heat to raise the (very small) room temperature. It gets very uncomfortable to the touch. :)
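That tracks with rough numbers. The per-channel dissipation below is a guess extrapolated from the post's figures, but the watts-to-BTU conversion itself is standard:

```python
def watts_to_btu_per_hr(watts):
    """1 watt of continuous dissipation = 3.412 BTU/hr of heat."""
    return watts * 3.412

# Assume each channel dissipates on the order of its 95 wpc supply
# capability at Class A idle -- an illustrative guess, not a measurement
per_channel_w = 95
total = watts_to_btu_per_hr(2 * per_channel_w)
print(f"~{total:.0f} BTU/hr into the room")
```

A few hundred BTU/hr is in the ballpark of a small space heater on low, so a small room warming up is entirely plausible.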
 
OP
Jeff G

Jeff G

Member
Joined
May 22, 2016
The receiver does not support ARC, it's about 8 years old now.
Lack of HDMI inputs was the driving force behind an upgrade, but I'm going to hold off on one until I get my media room rebuilt.
I'm going to give an Atmos setup a try, and I've been wanting to build a floating room for a while now anyway.
And in addition to getting more inputs on something new, I'll also be getting about twice the power output, built in wifi, smart phone control, etc. All the goodies that have come out in the last couple years.
It'll be a worthwhile upgrade for sure.