Hello, Audioholics.
First-time poster here. I was wondering if someone could lend some insight into a question I have about audible differences between the "core" DTS audio extracted from a Master Audio soundtrack and the full lossless track. First, let me say that what sparked my curiosity was this post I found by a member in the OPPO BDP-83 thread, in which he describes the behavior of his Onkyo receiver (my equipment is similar to his, but I'll get to that). I have highlighted my central concerns in bold:
I have a first-hand account of an instance where this isn't 100% true.
I own an Onkyo TX-SR705 and, depending on the sampling frequency, LPCM and bitstream TrueHD/DTS-HD are not all handled equally.
With LPCM, complete processing (bass management, room correction, DPL IIx, etc.) is applied to any signal up to and including a 96 kHz sampling frequency. But if the LPCM is a 192 kHz signal, the 705 will not process it - it will only play it back exactly as it came in (with only the treble/bass "tone" controls available).
With a TrueHD bitstream, the 705 will completely process any signal up to and including a 48 kHz sampling frequency. But at 96 kHz, it will only play it back straight - no processing. And if it is a 192 kHz TrueHD bitstream, it will not play it at all!
With DTS-HD Master Audio, it will process up to 48 kHz signals. It will not process 96 kHz signals, but it will play them back straight. And it will play 192 kHz signals, but it will downsample them to 96 kHz in order to do so!
So with the Onkyo TX-SR705, everything is equal so long as the incoming signal is at a 48 kHz sampling frequency or less. But if I want to listen to a 96 kHz signal, I'm best off with an LPCM signal coming from the player, as the 705 can fully process a 96 kHz LPCM signal but cannot process TrueHD or DTS-HD MA at that high a sampling frequency.
I do not know for certain, but my educated guess is that the 705 basically has limited processing power. When it is receiving a TrueHD/DTS-HD bitstream, some of its processing power is "taken up" and used to decode the bitstream, leaving less processing power "left over" for things like bass management, room correction, DPL IIx, etc. With an LPCM signal, it doesn't have to "spend" any processor power on the decoding itself, so it is able to fully process a higher sampling frequency.
So that's a long explanation, but it's a first-hand account of an instance where a respectable receiver handles multi-channel LPCM slightly differently from TrueHD/DTS-HD bitstream. The "weird" thing though is that, in the case of my 705, having the player send LPCM actually holds the advantage!
The example I've seen most often of bitstream sounding better than LPCM is when people are comparing bitstream TrueHD/DTS-HD from a stand-alone player vs. the LPCM output from the PS3. I've seen several people claiming that a stand-alone player sending bitstream sounds noticeably clearer and more detailed than the PS3's LPCM output.
Now, one theory of mine is that those people haven't properly configured the audio output of the PS3. If you just go into the PS3's Sound Menu, select HDMI for the audio output and then have it automatically configure the output, it doesn't always select all of the various multi-channel LPCM output modes that are supported. Some people may also be mistakenly leaving the HDMI audio output setting under the BD/DVD menu set to "bitstream" - limiting them to regular DD/DTS output or only 2-channel LPCM. And then there are all the check boxes if you set up the Sound menu manually. Basically, there are just many possible ways to misconfigure the PS3's audio output, so it wouldn't surprise me if that were the cause of the "lower quality" audio in many cases.
So maybe the best test would be for Gene to compare the BDP-83's audio quality to the PS3's!
That's probably the biggest question out there and the one that is really on my mind. Set up a PS3 properly, have it do the decoding and output the multi-channel LPCM and compare its sound quality to the BDP-83's bitstream and also the BDP-83's decoded LPCM output. If the PS3 really is limiting the audio quality somehow, it should be rather obvious.
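Before I get to my own setup: to make sure I'm reading his report correctly, here is how I'd boil down the 705 behavior he describes. This is only my paraphrase of his observations (the format/frequency breakpoints are his, not anything I've confirmed with Onkyo), written out as a little Python sketch:

    # My paraphrase of the TX-SR705 behavior described in the quote above.
    # "processed" = full bass management / room correction / DPL IIx available,
    # "straight"  = plays back with no processing (tone controls at most),
    # "no play"   = will not play at all.
    def sr705_handling(fmt, fs_khz):
        if fmt == "LPCM":
            return "processed" if fs_khz <= 96 else "straight"
        if fmt == "TrueHD":
            if fs_khz <= 48:
                return "processed"
            return "straight" if fs_khz == 96 else "no play"
        if fmt == "DTS-HD MA":
            # 96 kHz plays straight; 192 kHz gets downsampled to 96 kHz first
            return "processed" if fs_khz <= 48 else "straight"
        return "unknown"

    for fmt in ("LPCM", "TrueHD", "DTS-HD MA"):
        for fs in (48, 96, 192):
            print(fmt, fs, "kHz ->", sr705_handling(fmt, fs))

If my 605 behaves anything like that, then anything at 48 kHz (which covers most film soundtracks) should be getting the same treatment regardless of format, and whatever differences I'm listening for would come down to the codec itself rather than the processing.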
Now, a lot of this doesn't relate to my equipment because I don't have a PS3 -- what I do have is a new OPPO BDP-83 connected via HDMI to an Onkyo 605, which decodes Master Audio and TrueHD. The problem is that on the last player I was using, I didn't have MA access, nor did the player bitstream TrueHD. When I played discs with MA soundtracks, the player extracted the core DTS signal and bitstreamed that over to the 605. Since playing the same titles on the OPPO, which does bitstream TrueHD and Master Audio (confirmed by my 605's display, which reads "DTS-HD MSTR" or "Dolby TrueHD"), I cannot hear any sonic differences between the core DTS signals and the fully lossless ones. Are there supposed to be big differences with the lossless tracks versus the core lossy versions, or are they more subtle, if anything?
The connection I made to the above member's post is that I am beginning to wonder whether my 605 is doing what he believes his Onkyo is doing (that is, not processing Master Audio or TrueHD tracks at full resolution), and whether that is why I am not hearing sonic improvements with the lossless bitstreams. It was suggested to me that, because of a "bug" the 605 series had in early runs, my 605 may still be decoding only the standard core DTS from these MA tracks even though the display reads "DTS-HD MSTR", and that is why I don't hear a difference. Furthermore, I am getting the issue the member describes with 192 kHz TrueHD: certain TrueHD tracks bitstreamed from the BDP-83 simply won't play back on my 605.
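For what it's worth, my (possibly wrong) understanding of why a core-only decode could go unnoticed: a Master Audio track is built as a standard lossy DTS core plus a lossless extension, so a decoder that ignores the extension still plays the track, just from the core. Roughly:

    # Rough sketch of how I understand a DTS-HD MA track to be structured:
    # a lossy DTS core (typically 1.509 Mbps on Blu-ray) plus a lossless
    # extension carrying the residual needed to rebuild the studio master.
    def decode(substreams_read):
        if "lossless_extension" in substreams_read:
            return "bit-for-bit reconstruction of the studio master"
        # Core-only fallback: still a healthy 1.5 Mbps DTS stream, which may
        # be part of why any difference is subtle to begin with.
        return "lossy DTS core only"

    print(decode({"core", "lossless_extension"}))   # a full MA decode
    print(decode({"core"}))                         # what a core-only (bugged) decode would give

If that picture is right, and if the early-605 "bug" story is accurate, it would explain how my display could read "DTS-HD MSTR" while I'm effectively still hearing the core.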
Can anyone weigh in on the lossless/core dilemma? Are the audible differences supposed to be very apparent when going from the core stream to the lossless Master Audio tracks, or should they indeed sound about the same? And should I be concerned that my 605 is doing what this other member's AVR is doing, namely not processing these signals at their full sampling frequency?
Thanks in advance.