I see many recommendations on this forum that the new lossless codecs available on the Blu-ray format are superior to the old lossy Dolby Digital and DTS formats. I would agree that they are better in a technological sense: I understand that they can carry more channels, at greater bit depths and higher sampling rates.
First, if we consider that most movies carry only a 5.1 soundtrack, we can for the moment set aside the greater channel counts.
Second, let's allow ourselves to believe that a higher sampling rate is meaningless per the Nyquist theorem, since we can accurately reconstruct the full 20 Hz to 20 kHz audible band from a 44.1 kHz sampling rate (Nyquist frequency 22.05 kHz).
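For what it's worth, here's a quick NumPy sketch of that claim: a 20 kHz tone sampled at 44.1 kHz and rebuilt with textbook Whittaker-Shannon (sinc) interpolation. This is the idealized math, not any codec's or DAC's actual filter, but it shows the band-limited signal coming back essentially intact.

```python
# Sketch of the Nyquist argument: a 20 kHz tone sampled at 44.1 kHz can be
# reconstructed nearly perfectly by sinc interpolation, since 20 kHz < fs/2.
import numpy as np

fs = 44100.0          # CD sampling rate, Hz
f = 20000.0           # test tone near the top of the audible band, Hz
n = np.arange(2000)   # sample indices
samples = np.sin(2 * np.pi * f * n / fs)

# Whittaker-Shannon reconstruction: x(t) = sum_n x[n] * sinc((t - n/fs) * fs)
t = np.linspace(0.01, 0.035, 500)   # stay away from the edges of the block
recon = np.array([np.sum(samples * np.sinc((ti - n / fs) * fs)) for ti in t])

# The residual comes only from truncating the infinite sinc sum.
error = np.max(np.abs(recon - np.sin(2 * np.pi * f * t)))
print(f"max reconstruction error: {error:.2e}")
```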
Finally, let's also assume that greater bit depths (16-bit vs. 24-bit, etc.) are relatively unimportant. While I do believe that 24 bits are better than 16 due to the increased dynamic range (roughly 144 dB vs. 96 dB), I also believe we can get by very well with 16 bits per sample.
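Those dynamic range figures fall straight out of the standard 20*log10(2^bits) formula for linear PCM, ignoring dither and noise shaping; a two-line check:

```python
# Back-of-the-envelope dynamic range for linear PCM: each bit adds
# about 6.02 dB, i.e. 20*log10(2). Textbook figure, no dither considered.
import math

for bits in (16, 24):
    dr_db = 20 * math.log10(2 ** bits)
    print(f"{bits}-bit PCM: ~{dr_db:.1f} dB dynamic range")
# 16-bit PCM: ~96.3 dB
# 24-bit PCM: ~144.5 dB
```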
I think that generally leaves us comparing the lossy DD and DTS compression algorithms against the MLP-based lossless scheme used in Dolby TrueHD and the lossless coding in DTS-HD Master Audio.
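To be clear about terms: "lossless" just means the decoder hands back the exact input bits. I obviously can't demo MLP itself here, so the toy round trip below uses zlib as a stand-in, but the bit-identical guarantee being tested is the same one TrueHD and DTS-HD MA make:

```python
# The defining property of any lossless codec (MLP, DTS-HD MA, FLAC, ...)
# is decode(encode(x)) == x, bit for bit. zlib stands in for the real
# codec here; the round-trip test is the point, not the compression ratio.
import math
import struct
import zlib

# Fake "PCM": one second of a 1 kHz sine at 44.1 kHz, 16-bit samples.
samples = [int(20000 * math.sin(2 * math.pi * 1000 * n / 44100))
           for n in range(44100)]
pcm = struct.pack(f"<{len(samples)}h", *samples)

compressed = zlib.compress(pcm, 9)
assert zlib.decompress(compressed) == pcm   # bit-identical: the lossless guarantee
print(f"{len(pcm)} bytes -> {len(compressed)} bytes")
```

A lossy codec like DD, DTS, or mp3 deliberately fails that assert: it throws away data its psychoacoustic model judges inaudible, which is exactly why the audibility question below matters.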
On these Audioholics forums, I very frequently see blatant dismissal of any non-blind claim of improved results in one's system. That is, to claim that any CD player sounds different from any other is quickly dismissed, and with attitude! Amps? As long as they're properly designed and operating within their parameters, they all sound the same. Similarly, there have been endless discussions on how medium- and high-bitrate mp3s are indistinguishable from the CD original, with exceptions made only for the most special cases.
So I'm curious why Dolby TrueHD and lossless DTS are so freely claimed to sound better than DD or DTS, and why those claims go unchallenged here. Technically they are better, but who has engaged in blind listening tests, and under what conditions? Is this stuff really, truly audibly better (and by what margin?!) than its predecessors? If so, how has it been measured, and where are the DBT results?
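And to be concrete about what a DBT result would even look like: an ABX run is typically scored against the odds of pure guessing. The trial counts below are hypothetical, just to show the arithmetic I'd want to see reported:

```python
# Scoring an ABX test: probability of getting k or more correct out of
# n trials by guessing alone (binomial with p = 0.5). Numbers are
# hypothetical examples, not results from any actual test.
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """P(k >= correct | guessing) for a two-alternative forced choice."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(abx_p_value(12, 16))   # 12/16 correct -> ~0.038, conventionally "significant"
print(abx_p_value(9, 16))    # 9/16 correct  -> ~0.40, indistinguishable from guessing
```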