I think that I am in need of an explanation.
When I read reviews of Blu-ray Disc (BD) players, they usually include comments on both the audio and video quality. My question is: how can there be any difference between players? The BD is decoded to an HDMI signal by the BD player. Isn't this process deterministic, i.e., shouldn't the output always be the same?
Many other compressed formats, such as JPEG for pictures or MPEG for movies, work this way. A JPEG file will produce the exact same RGB values for each pixel no matter on what computer it is decoded. In other words, if two different computers were used to show the same JPEG picture on the same screen (connected by a digital connection such as DisplayPort, DVI or HDMI), the picture should look exactly the same, regardless of the computer used.
Are the quality differences only present when using the analogue outputs of the BD player? Is there any filtering, noise reduction or other processing going on that would also affect the digital HDMI signal? If so, can such things usually be turned off?
Edit: I can understand that the quality differs if the player has to change things, for example converting a 24p (24 frames per second) recording on a BD to 30p. My question concerns the case where this is not done, i.e., where the same frame rate the BD is recorded in is also output over HDMI.
To sum it up: it seems very reasonable to me that if a digital data stream (the contents of the BD) is read and decoded into another digital data stream (the HDMI signal), one should get the exact same result with any BD player. If I am wrong, I would greatly appreciate it if anyone could explain how there could be a difference.
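To illustrate the intuition I'm describing: with a lossless decode step, determinism is guaranteed by construction. A minimal sketch (using zlib purely as a stand-in for the disc's actual video codec, which is an assumption for illustration, not how a BD player works internally):

```python
import hashlib
import zlib

# Hypothetical stand-in for the disc contents: any fixed byte stream.
payload = bytes(range(256)) * 64

# "Author" the disc: losslessly compress the stream once.
compressed = zlib.compress(payload)

# Two independent "players" decode the same compressed stream.
decoded_a = zlib.decompress(compressed)
decoded_b = zlib.decompress(compressed)

# A purely digital decode path is deterministic: identical input bytes
# yield identical output bytes, so the two hashes always match.
hash_a = hashlib.sha256(decoded_a).hexdigest()
hash_b = hashlib.sha256(decoded_b).hexdigest()
print(hash_a == hash_b)  # True
```

This is of course only an analogy: it shows that decoding a fixed digital stream gives bit-identical results no matter which machine runs it, which is exactly the property I'm assuming holds for BD players outputting over HDMI.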