strube

Audioholic Field Marshall
Umm, it outputs the sound in PCM. The software does all the decoding before sending it out, so if anything the decoder is to blame. Digital is digital, so how are 0's and 1's from one source better than another?
I have to disagree with your "digital is digital" statement. Yes, it is 1's and 0's, but ATI cards are not capable of the same bitrates as other options, and I would argue that bitrate is a very important factor in selecting a digital audio device. Bitrate capabilities are hardware-dependent, unless I am gravely mistaken, which I am pretty sure I am not.
 
OttoMatic

Senior Audioholic
I have to disagree with your "digital is digital" statement. Yes, it is 1's and 0's, but ATI cards are not capable of the same bitrates as other options, and I would argue that bitrate is a very important factor in selecting a digital audio device. Bitrate capabilities are hardware-dependent, unless I am gravely mistaken, which I am pretty sure I am not.
I thought we already went through this.

Are you saying that you can hear differences between solution X and solution Y when transmitting two-channel digital audio over HDMI, optical or coax digital?

I do see your point, though. What are the supported bitrates for which formats on the ATI card vs. others (remember that I am no ATI fan)? If the ATI won't support bitrate x, then it's possible that there would be an audible difference on the applicable format. However, I doubt that the ATI card will have a problem passing 44.1/16-bit audio, which is what the OP requested (IIRC).
 
gus6464

Audioholic Samurai
I have to disagree with your "digital is digital" statement. Yes, it is 1's and 0's, but ATI cards are not capable of the same bitrates as other options, and I would argue that bitrate is a very important factor in selecting a digital audio device. Bitrate capabilities are hardware-dependent, unless I am gravely mistaken, which I am pretty sure I am not.
Once again, the card will send a signal out over HDMI in PCM. As far as bitrate is concerned, the audio has already been processed by the decoder, so the card will output whatever is given to it. If they are capable of sending a DTS-HD MA 24-bit/48kHz audio stream, I don't see how that is lacking. The cards support a maximum sample rate of 192kHz.
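
To put rough numbers on the bandwidth question, raw PCM bitrate is just channels × bit depth × sample rate. A quick back-of-the-envelope sketch in Python (purely illustrative):

```python
# Raw (uncompressed) PCM bitrate = channels * bit depth * sample rate
def pcm_bitrate_mbps(channels: int, bit_depth: int, sample_rate_hz: int) -> float:
    """Return the raw PCM bitrate in megabits per second."""
    return channels * bit_depth * sample_rate_hz / 1_000_000

print(pcm_bitrate_mbps(2, 16, 44_100))   # CD audio: 1.4112 Mbit/s
print(pcm_bitrate_mbps(8, 24, 192_000))  # 8ch 24-bit/192kHz LPCM: 36.864 Mbit/s
```

Either figure is well within what HDMI can carry, which is why the card just passes along whatever the decoder hands it.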
 
gus6464

Audioholic Samurai
I thought we already went through this.

Are you saying that you can hear differences between solution X and solution Y when transmitting two-channel digital audio over HDMI, optical or coax digital?

I do see your point, though. What are the supported bitrates for which formats on the ATI card vs. others (remember that I am no ATI fan)? If the ATI won't support bitrate x, then it's possible that there would be an audible difference on the applicable format. However, I doubt that the ATI card will have a problem passing 44.1/16-bit audio, which is what the OP requested (IIRC).
The Radeon 4600 and 4800 cards are capable of sending 8ch sound @ up to 192kHz.
 
OttoMatic

Senior Audioholic
The Radeon 4600 and 4800 cards are capable of sending 8ch sound @ up to 192kHz.
Yeah, then I agree that there should be no difference in audio quality on digital transmissions. Certainly, they should all be able to transfer CD-quality audio...
 
drummy83

Audiophyte
It's like those who buy a $1000 player to use its digital out... a cheap $50 player would send the same digital signal for a lot less.

Expensive audio equipment generally has better DACs, but for a digital out, a cheap audio card would do the same.

I challenge anybody to a blind test between a Xonar digital out and a motherboard digital out... :p

People hear differences often because they want to hear them.

Digital is digital, and 010001111000 is still 010001111000, even with a cheap source.
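
For anyone who wants to test that claim rather than argue it, and assuming you can capture the raw bitstream from each source, comparing checksums settles it. A minimal sketch (the capture file names are hypothetical):

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 of a file's raw bytes, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical captures of the same track from two different digital outs.
a = file_digest("capture_xonar.pcm")
b = file_digest("capture_onboard.pcm")
print("bit-identical" if a == b else "streams differ")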
 
strube

Audioholic Field Marshall
I do see your point, though. What are the supported bitrates for which formats on the ATI card vs. others (remember that I am no ATI fan)? If the ATI won't support bitrate x, then it's possible that there would be an audible difference on the applicable format. However, I doubt that the ATI card will have a problem passing 44.1/16-bit audio, which is what the OP requested (IIRC).
Well, I guess I didn't really process that when I was making my recommendations :rolleyes:, which doesn't make me the greatest advice-giver for this case, I suppose, so I must apologize to the OP. I am biased toward higher sampling rates and bit depths with sound cards - the more samples and data, the better it has to sound (human ability to resolve the differences aside ;))! All kidding aside, when it comes down to it, I have to admit I have probably listened to audio above 48kHz/24-bit from my PC for a total of three hours of my life :(.

Anyway, back to your question: IIRC, the ATI cards with HDMI have dumbed-down audio hardware/software from Realtek that claimed to support 192/24 but in operation could only support 44.1/16, and caused actual hardware crashes for anything more. This may have been fixed with driver or firmware updates by now, but it really made me mad at the time and soured me on the whole ATI experience (I sent the card back and got an NVIDIA). Another problem I had was that it had a relatively low SNR, though I don't recall exactly what it was.
 
itschris

Moderator
Seriously... no apologies. Listen... I'm sort of a "just in case" kinda guy. Yes, I know that there probably isn't a difference between A or B in a lot of things, but I don't mind paying a bit more for the "just in case" factor. When I buy aspirin, I could get the CVS brand, but I pay more for the Bayer. When my daughter has allergies, I pay for Zyrtec knowing the store brand is the same thing. I do know that, but there's a little part of me that says it's worth the extra couple of bucks just to be sure. It gives me peace of mind.

With the a/v stuff... it's kind of the same way. I mean, a decent sound card costs $150? Not a lot in the grand scheme of things when I consider myself sitting on the couch listening to my system wondering if I made a mistake by "cheaping out" on a sound card. It's not worth it to me. Besides, I do agree that when you look at things individually, there's little difference, if any, to be found... be it cables... whatever. But my thought is that there could be a cumulative effect of several things that can make a difference. Proof? I have none... just a mindset that draws me beyond a bare-bones approach to anything. For instance, I won't buy $500 speaker cables, but I won't use zip cord either, even if I read 100 studies telling me there's no difference. Silly? Maybe, but it makes me feel better about things.

So, in light of that long diatribe, let me say that I will be getting a sound card and video card... and probably go with what you mentioned. Do you think, though, it's worth getting two cards that you connect internally and just have HDMI carrying both audio and video (which will mostly be just computer stuff, recorded shows, and DVDs), or should I keep them entirely separate and run HDMI for the video and coax for the audio?

Well, I guess I didn't really process that when I was making my recommendations :rolleyes:, which doesn't make me the greatest advice-giver for this case, I suppose, so I must apologize to the OP. I am biased toward higher sampling rates and bit depths with sound cards - the more samples and data, the better it has to sound (human ability to resolve the differences aside ;))! All kidding aside, when it comes down to it, I have to admit I have probably listened to audio above 48kHz/24-bit from my PC for a total of three hours of my life :(.

Anyway, back to your question: IIRC, the ATI cards with HDMI have dumbed-down audio hardware/software from Realtek that claimed to support 192/24 but in operation could only support 44.1/16, and caused actual hardware crashes for anything more. This may have been fixed with driver or firmware updates by now, but it really made me mad at the time and soured me on the whole ATI experience (I sent the card back and got an NVIDIA). Another problem I had was that it had a relatively low SNR, though I don't recall exactly what it was.
 
OttoMatic

Senior Audioholic
I would use whatever digital audio connection is available on the motherboard. And I would get a video card that supports HDMI video output. If it supports audio, great - you might be able to use that in the future.

This is how I have it set up at home, and it works perfectly and sounds identical to a digital feed from my DVD or BD player. Of course, you can do whatever you want, but the extra audio card is redundant. Aren't you the guy who said you didn't want to have special software for one thing and a custom install for another (above in this thread)? That's what you're doing when you get the extra audio card, IMHO.
 
gus6464

Audioholic Samurai
Seriously... no apologies. Listen... I'm sort of a "just in case" kinda guy. Yes, I know that there probably isn't a difference between A or B in a lot of things, but I don't mind paying a bit more for the "just in case" factor. When I buy aspirin, I could get the CVS brand, but I pay more for the Bayer. When my daughter has allergies, I pay for Zyrtec knowing the store brand is the same thing. I do know that, but there's a little part of me that says it's worth the extra couple of bucks just to be sure. It gives me peace of mind.

With the a/v stuff... it's kind of the same way. I mean, a decent sound card costs $150? Not a lot in the grand scheme of things when I consider myself sitting on the couch listening to my system wondering if I made a mistake by "cheaping out" on a sound card. It's not worth it to me. Besides, I do agree that when you look at things individually, there's little difference, if any, to be found... be it cables... whatever. But my thought is that there could be a cumulative effect of several things that can make a difference. Proof? I have none... just a mindset that draws me beyond a bare-bones approach to anything. For instance, I won't buy $500 speaker cables, but I won't use zip cord either, even if I read 100 studies telling me there's no difference. Silly? Maybe, but it makes me feel better about things.

So, in light of that long diatribe, let me say that I will be getting a sound card and video card... and probably go with what you mentioned. Do you think, though, it's worth getting two cards that you connect internally and just have HDMI carrying both audio and video (which will mostly be just computer stuff, recorded shows, and DVDs), or should I keep them entirely separate and run HDMI for the video and coax for the audio?
That Xonar card has horrible drivers. Buy a Radeon 4850 first and see if the HDMI audio on it performs to your liking before going out and buying another sound card.
 
Nemo128

Audioholic Field Marshall
XFi hype... *sigh*

If you're streaming music from a PC, especially MP3s, I fail to see how the connection matters between a "high-end" sound card and an onboard audio processor. Typical MP3s are ripped at VBR with a ~192kbps average, so you're losing quality regardless.

I'd be more concerned about the rips, the source of them, and the encoder used than the choice between onboard and XFi. In my extensive experience, XFi tends to cause more issues than it's worth for the artificial increase in sound quality it provides.
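
If you don't know what your rips actually are, it takes a few lines to check. A sketch using the mutagen tagging library (the file name is hypothetical):

```python
from mutagen.mp3 import MP3

# Report an MP3's actual encoding parameters (file name is hypothetical).
audio = MP3("some_rip.mp3")
print(f"bitrate:     {audio.info.bitrate // 1000} kbps")
print(f"sample rate: {audio.info.sample_rate} Hz")
print(f"length:      {audio.info.length:.1f} s")
```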
 
Biggiesized

Senior Audioholic
If you want to build a music HTPC, then go with a cheap (sub-$100) AMD motherboard with a 780G integrated graphics processor. These boards also have onboard HDMI output with 7.1 PCM support. A cheap X2 processor is all you'll need to run it.
 
Nemo128

Audioholic Field Marshall
If you want to build a music HTPC, then go with a cheap (sub-$100) AMD motherboard with a 780G integrated graphics processor. These boards also have onboard HDMI output with 7.1 PCM support. A cheap X2 processor is all you'll need to run it.
I completely agree, but if increased processing power, lower power consumption, and smaller form factor are desired, I suggest a board based on the Intel G45 chipset.

The Intel DG45FC is incredibly capable when you pair it with the right Core2Duo CPU. Food for thought.
 
itschris

Moderator
The thing is, I don't really care about saving 20 or 50 bucks here or there. I just want a capable setup that I won't have to completely ditch a year from now to do something I'm not using it for today... like Blu-ray or some other streaming or whatever.

I'd rather overbuild than underbuild, I guess, is what I'm saying. If one card is $200 and another card that's just about as good costs $175, for instance, I'd rather just get the $200 one.
 
gus6464

Audioholic Samurai
XFi hype... *sigh*

If you're streaming music from a PC, especially MP3s, I fail to see how the connection matters between a "high-end" sound card and an onboard audio processor. Typical MP3s are ripped at VBR with a ~192kbps average, so you're losing quality regardless.

I'd be more concerned about the rips, the source of them, and the encoder used than the choice between onboard and XFi. In my extensive experience, XFi tends to cause more issues than it's worth for the artificial increase in sound quality it provides.
Also, Creative has yet to make decent, stable drivers for Vista 64-bit.
 
gus6464

Audioholic Samurai
BTW, NVIDIA has announced their ION platform, which uses an Intel Atom and fully supports 1080p video along with 8-channel LPCM over HDMI.

This tiny little box has the following inside:
2GB DDR3 RAM
Intel Atom
9400M graphics
500GB hard drive

[Chart showing CPU and memory usage while playing back The Dark Knight at 1080p.]

Also, this thing uses <30W at max TDP.
 
Nemo128

Audioholic Field Marshall
1. What is XFi?

2. The OP said he's doing lossless.
XFi is Creative's newest chipset after the Audigy2 series. It's overhyped, undersupported, and poorly implemented on Vista 64-bit, which is arguably the best OS to use for HTPC stuff.

If he's doing lossless, he'd better know what the source of the rips was, like I said. Lossless, unless you ripped things yourself, means nothing. Anyone can re-encode MP3s as lossless FLACs for the sake of having "lossless" files. I've seen this plenty of times, when my own lossless rips have far higher frequency ranges than the lossless rips I saw posted on Usenet.

The ONLY way to guarantee a best-quality lossless encode is to encode the original source material yourself.
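
Short of re-ripping, one rough screen for a lossy source is exactly the frequency-range check mentioned above: MP3 encoders typically discard almost everything above roughly 16-19kHz, so a "lossless" file with essentially no energy up there is suspect. A minimal sketch with numpy and soundfile, assuming the file decodes (name hypothetical):

```python
import numpy as np
import soundfile as sf

# Decode the suspect "lossless" file (hypothetical name) to PCM samples.
data, rate = sf.read("suspect_rip.flac")
mono = data.mean(axis=1) if data.ndim > 1 else data  # fold to mono

# Share of spectral energy above 17 kHz; a transcoded MP3 shows almost none.
spectrum = np.abs(np.fft.rfft(mono)) ** 2
freqs = np.fft.rfftfreq(len(mono), d=1.0 / rate)
ratio = spectrum[freqs > 17_000].sum() / spectrum.sum()
print(f"energy above 17 kHz: {ratio:.4%}")
```

It's only a heuristic - some quiet masters legitimately roll off up top - but it catches the blatant MP3-to-FLAC re-encodes.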
 
Nemo128

Audioholic Field Marshall
BTW, NVIDIA has announced their ION platform, which uses an Intel Atom and fully supports 1080p video along with 8-channel LPCM over HDMI.

This tiny little box has the following inside:
2GB DDR3 RAM
Intel Atom
9400M graphics
500GB hard drive

[Chart showing CPU and memory usage while playing back The Dark Knight at 1080p.]

Also, this thing uses <30W at max TDP.
Been keeping a close eye on the Ion, and will probably be snatching one up when they release it. It will turn the HTPC world upside down.
 
Nemo128

Audioholic Field Marshall
The thing is, I don't really care about saving 20 or 50 bucks here or there. I just want a capable setup that I won't have to completely ditch a year from now to do something I'm not using it for today... like Blu-ray or some other streaming or whatever.

I'd rather overbuild than underbuild, I guess, is what I'm saying. If one card is $200 and another card that's just about as good costs $175, for instance, I'd rather just get the $200 one.
Have you learned nothing from being on this forum? Spending more money != more capability or performance. Monster Cable anyone?!?!

You should look to fulfill a purpose, not spend money because marketing and hype tell you it's the best option.
 