DTS vs. Dolby Surround 5.1

MDS

Audioholic Spartan
pikers said:
Let Mtry hang himself countering this WELL KNOWN fact. It doesn't matter how the end result happens; the fact is that DTS will outstrip DD in nearly all SQ testing. EASILY DBTd.

And it isn't apples and oranges. DTS and DD are in one-on-one competition for your earspace. When the chips fell (and they did, eight years ago), DTS came out on top, for now.
We should just let this die, and yet I can't resist. Once again Pikers speaks in absolutes, with facts not in evidence.

A. Bitrate is only one part of the story; it is not automatically true that higher bitrate equals higher fidelity when you are talking about lossy compression.
B. DTS is 10 dB hotter than DD. Louder sounds better to most people's ears (see the level-matching sketch after this list).
C. Dolby and DTS have gone back and forth on these issues, and each has done extensive tests on the other's codecs. Might want to read those articles and learn more.
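Point B matters for any informal comparison: an unmatched level difference will bias listeners toward the louder track. Here is a minimal sketch of the arithmetic, assuming the 10 dB figure above (the real offset varies per disc):

```python
# Minimal sketch: converting a dB level offset into a linear gain so two
# soundtracks can be level-matched before a listening comparison.

def db_to_gain(db: float) -> float:
    """Convert a level difference in dB to a linear amplitude ratio."""
    return 10 ** (db / 20.0)

# If the DTS track really played 10 dB hotter (the poster's figure; the
# actual offset varies per disc), attenuate it by 10 dB before comparing:
offset_db = 10.0
scale = db_to_gain(-offset_db)
print(f"Scale DTS samples by {scale:.3f} to match levels")  # ~0.316
```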

Now if you prefer DTS, for whatever reason, that is fine, but at least get the facts before making outlandish claims like 'DTS is super duper and DD is an also-ran' - it's just not true.
 
j_garcia

Audioholic Jedi
I'll chime in too, and YES, this one needs to be left to die. It all depends on the mix. I've heard bad DTS mixes, and excellent DD mixes. If given a choice, I usually select the DTS mix, just because it's there. They are two different codecs, mastered differently, almost always by different engineers in different studios, and bitrate is not the deciding factor, as MDS notes. To be honest, most of the time I do prefer the DTS mix, but that has nothing to do with the quality of either format.
 
racquetman

Audioholic Chief
I'll chime back in and say that I never said bit rate was the deciding factor, although you better believe it is a factor. I just said that if you want to compare apples to apples, you better start with equal bit rates.
 
mtrycrafts

Seriously, I have no life.
MDS said:
Might want to read those articles and learn more.

What, are you favoring the end of the world? But we have nothing to fear :D
 
Cygnus

Senior Audioholic
I prefer Dolby Digital/Pro Logic II because most theatres (at least in my area) use Dolby Digital, and DD is found on most DVDs, whereas DTS isn't. I also have more options on my receiver when using PLII, and I think PLII sounds better than Neo:6.
 
j_garcia

Audioholic Jedi
racquetman said:
I'll chime back in and say that I never said bit rate was the deciding factor, although you better believe it is a factor. I just said that if you want to compare apples to apples, you better start with equal bit rates.
I'd agree with that. I didn't read the whole thread, so I'm not picking on a specific comment. If you compare full-bitrate DTS to pretty much anything else, there's no comparison - the 1509 kbps track will win every time in terms of detail, dynamics and depth. Comparing 448 kbps DD to 754 kbps DTS, though, isn't an exact comparison, because they use different algorithms to achieve their respective goals. It's like MP3 vs. ATRAC - ATRAC is a more efficient compression method, yet still retains higher-quality sound than a comparable MP3 file.
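To put those bitrates in perspective, here is a rough back-of-the-envelope calculation, assuming a 48 kHz / 16-bit / 6-channel PCM baseline (a common DVD-era reference; actual masters vary):

```python
# Back-of-the-envelope compression ratios against an assumed uncompressed
# 5.1 PCM reference (~4608 kbps). The lossy bitrates below are the
# DVD-era figures discussed in this thread.

PCM_KBPS = 48_000 * 16 * 6 / 1000  # 4608 kbps uncompressed 5.1

for name, kbps in [("DTS full-rate", 1509),
                   ("DTS half-rate", 754),
                   ("Dolby Digital (max)", 448),
                   ("Dolby Digital (common)", 384)]:
    print(f"{name:22s} {kbps:5d} kbps -> {PCM_KBPS / kbps:5.1f}:1")
```

Even full-rate DTS is roughly 3:1 compressed, while 384 kbps DD is 12:1, which is why the perceptual coding scheme, not the raw number, does the heavy lifting.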

Dolby PLII and Neo:6 aren't formats; they're both DSP processing modes.
 
Pyrrho

Audioholic Ninja
tbewick said:
I found the following link quite informative:

http://www.spannerworks.net/reference/10_1a.asp

It is a quite detailed technical evaluation of both formats. I personally agree with mtrycrafts that the two technologies are different and cannot be compared precisely.

From the above link, one thing that I thought was an important point is that DTS has most of the 'work' done in the studio, and is less affected by the receiver you're using.

I think DD is the more cunning technology, because it is quite an achievement to get such consistently good sound quality using such a small bit rate.
That is a very interesting link, and it is too bad more people don't read it before jumping to conclusions. Here are a couple of interesting quotes from it:

Unlike linear PCM systems, neither Dolby Digital nor DTS allocates a fixed number of bits to any channel. Instead, Dolby Digital and DTS feed their sub-bands/channels from 'global bit-pools'; the total number of bits allocated to any single channel constantly varies as a result. Sub-bands containing frequencies the human ear is more sensitive to are allocated more bits from the available bit-pool than sub-bands the human ear is less able to detect. Individual frequencies within these sub-bands are allocated data depending on their relative perceptibility when compared to neighbouring frequencies (as determined by the perceptual codecs' masking algorithms).

In DTS's case, a technique called 'forward-adaptive bit-allocation' is used. Using this technique, the allocation of data to each sub-band is pre-determined exclusively by the encoder. This information is explicitly conveyed to the decoder along with the actual bits to be used. Forward-adaptive bit-allocation's primary advantage is that the psychoacoustic model used resides exclusively within the encoder. Because the model is encoder-based, extremely complex psychoacoustic coding algorithms can be used (as decoder processing ability isn't a limiting factor). Forward-adaptive bit-allocation also allows psychoacoustic model modifications and improvements to be passed directly on to installed decoders, essentially 'future-proofing' DTS decoders from premature obsolescence.

Forward-adaptive bit-allocation's primary drawback is that explicit 'side-information', or 'metadata', is needed to direct and control the decoder's allocation of data to sub-bands; this extra information takes up space that might otherwise have been used for audio reproduction. Dolby Digital uses a hybrid technique incorporating elements of both forward- and backward- adaptive bit-allocation. Like DTS encoders, Dolby Digital encoders must also instruct their decoders to allocate bits to particular sub-bands, but don't need to transmit these instructions with such explicit detail. Dolby Digital decoders already include a very basic 'core' copy of Dolby Digital's perceptual coding algorithm. Because the decoder already 'knows' roughly how the bits should be allocated the encoder only needs to transmit information about specific variations from the decoder's own internal algorithm. Dolby Digital's metadata uses relatively little of the available bandwidth, leaving more data available for audio reproduction (which is a good thing, considering Dolby Digital's bit-pool is considerably smaller than DTS Digital Surround's).
In short, this means that part of the extra data of DTS is NOT audio information, but simply instructions to the decoder, which is something mtrycrafts said in his first post in this thread. This also means that if there were a DTS and a Dolby Digital soundtrack of equal data size, the DTS would be more compressed than the Dolby Digital soundtrack.
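The bit-pool mechanism the article describes is easy to illustrate. Below is a toy sketch (not either codec's real algorithm; the weights, pool size, and side-information costs are invented for demonstration) showing how explicit metadata in a forward-adaptive scheme eats into the bits available for audio:

```python
# Toy illustration of a 'global bit-pool' with explicit side-information,
# per the quoted article. This is NOT the real DTS or Dolby Digital
# algorithm; the weights, pool size, and metadata costs are invented.

def allocate_bits(pool_bits, weights, side_info_bits):
    """Split a bit pool across sub-bands in proportion to perceptual
    weight, after reserving bits for decoder instructions (metadata)."""
    audio_bits = pool_bits - side_info_bits  # metadata eats into the pool
    total = sum(weights)
    return [round(audio_bits * w / total) for w in weights]

# Hypothetical frame: 4 sub-bands, the ear most sensitive to band 1.
weights = [0.2, 0.5, 0.2, 0.1]

# Forward-adaptive (DTS-style): the full allocation is sent explicitly,
# so more of the pool goes to metadata.
print(allocate_bits(pool_bits=1000, weights=weights, side_info_bits=120))

# Hybrid (DD-style): the decoder holds a 'core' model and only deviations
# are transmitted, leaving more bits for the audio itself.
print(allocate_bits(pool_bits=1000, weights=weights, side_info_bits=30))
```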

When comparing DTS with 448kbps Dolby Digital (and even, to a lesser degree, 384kbps Dolby Digital), any noticeable difference can more likely be attributed to differences in mastering or production than to coding schemes. Under identical mastering conditions the two systems should be nearly indistinguishable from one another.

Any attempt to compare the domestic versions of Dolby Digital and DTS with one another is extremely difficult due to one major technical difference. The domestic version of Dolby Digital incorporates a feature, called 'dialog normalization', designed to maintain a consistent centre-channel volume from all Dolby Digital sources. The dialog normalization system is designed to ensure that the average centre-channel volume is always between -25 and -31dBFS (decibels below digital full-scale), regardless of source. As a result, if dialogue is recorded at a higher volume, the Dolby Digital decoder automatically attenuates the volume of all channels to the level at which the centre-channel outputs dialogue at the set 'dialnorm' level (usually -31dBFS for Dolby Digital on DVD). Most movies' centre-channels are recorded at -27dBFS, which results in an overall lowering of 4dB in all channels. Movies can be recorded at anything from -23dBFS (e.g. 'Wild Things') to -31dBFS (e.g. 'Air Force One', non-SuperBit and 'Twister: SE'), resulting in nominal overall volume attenuation of up to 8dB ('Wild Things') or more. All channels maintain their correct relative balance, so no detrimental sonic effects can be attributed to the dialnorm process. But, because the result can be up to an 8dB reduction in volume, there is no easy way to compare DTS and Dolby Digital versions of a film's soundtrack. The overall volume of the DTS version may be 8dB or more higher than the Dolby Digital soundtrack, making direct comparisons nearly impossible. As dialnorm is constantly variable in 1dB increments, the exact difference in overall volume between Dolby Digital and DTS soundtracks often varies from film to film.

Any argument for or against a particular system must be based on competing coding schemas. DTS's supporters claim that it is superior to Dolby's system because it uses a higher bitrate and less aggressive compression scheme. These two facts are essentially irrelevant in determining whether DTS is 'better' than Dolby Digital: neither automatically equates to higher sound quality. The quality of both systems stands or falls on the effectiveness of their respective compression and perceptual coding systems. Both systems use extremely effective coding systems. As both systems are based on completely different technologies, and rely on human perception, there is no technical or scientific means to determine which is 'better'. An apt analogy is that of the Porsche and the Corvette: the Corvette has a powerful V8, while the Porsche has a smaller engine but is turbo-charged. Both cars use very different power sources, yet both are extremely effective at performing their desired functions. Undoubtedly there will be those who argue for one system over another, but any such argument must be based on individual preference rather than scientific theory. There are no technically valid grounds for believing either audio system is inherently better sounding than the other.
It would be preferable from an audiophile perspective for 754kbps DTS's use to be restricted to longer duration films (i.e. over 160 minutes) that require 754kbps in order to be presented on a single disc. However, studios have found themselves unable to resist the practical advantages offered by 754kbps DTS, and this datarate is now the de facto standard for all DTS releases. Digital Theater Systems themselves have stated that 754kbps DTS is not "transparent", a claim they make for 1509 and 1235kbps DTS, and an assertion based largely on their format's higher bitrates.
Despite DTS Digital Surround's shaky introduction and uncertain future, I still believe it would be worth your while investing in a combination Dolby Digital/DTS Digital Surround amplifier/processor. Even though there are no compelling technical grounds for believing DTS is a better system, there are times when the DTS version of a soundtrack does sound considerably better than the Dolby Digital version. This can almost certainly be attributed to the way in which the DTS version has been mastered...
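The dialnorm behaviour quoted above reduces to simple arithmetic. A worked example using the figures from the passage:

```python
# The dialnorm arithmetic from the quoted passage: the decoder attenuates
# ALL channels so dialogue lands at the target playback level
# (-31 dBFS for Dolby Digital on DVD, per the article).

DIALNORM_TARGET_DBFS = -31

def dialnorm_attenuation_db(measured_dialog_dbfs: float) -> float:
    """dB of attenuation the decoder applies to every channel."""
    return measured_dialog_dbfs - DIALNORM_TARGET_DBFS

print(dialnorm_attenuation_db(-27))  # typical movie:    4 dB cut
print(dialnorm_attenuation_db(-23))  # 'Wild Things':    8 dB cut
print(dialnorm_attenuation_db(-31))  # 'Air Force One':  0 dB cut
```

Since DTS applies no such attenuation, the DTS track can play up to 8 dB louder than the DD track of the same film, which is exactly why casual A/B comparisons favour it.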
 
Thunder18

Senior Audioholic
jbracing24 said:
Just watched The Incredibles. My Denon displayed Dolby D + DPL IIx C. Wow!
I'm assuming you have a 7.1 speaker set-up. If that is the case, it's doing that because you have 7 speakers and Dolby Digital is a 5.1 source. My Pioneer does that with 5.1 Dolby Digital sources, and with 5.1 DTS sources it shows DTS + Neo:6 so that it can make use of the surround back speakers. My receiver will also do THX + Dolby Digital + PLIIx, or even THX + DTS + Neo:6.
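Here is a hedged sketch of the decision these receivers are making: when the source has fewer channels than the speaker layout, a matrix upmixer fills the extra speakers. The codec-to-upmixer pairings follow the posters' receivers; real units expose this as a user-selectable mode.

```python
# Sketch of a receiver's upmixer selection, under the assumptions above.
# The pairings (DD -> DPL IIx, DTS -> Neo:6) mirror the posters' units.

def playback_mode(codec: str, source_ch: float, speaker_ch: float) -> str:
    if source_ch >= speaker_ch:
        return codec  # enough source channels: no upmixing needed
    upmixer = {"Dolby Digital": "DPL IIx", "DTS": "Neo:6"}[codec]
    return f"{codec} + {upmixer}"

print(playback_mode("Dolby Digital", 5.1, 7.1))  # Dolby Digital + DPL IIx
print(playback_mode("DTS", 5.1, 5.1))            # DTS
```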
 
