HDCD is a bit of a mysterious spec. Microsoft claims it's 20-bit, but in order to be readable by regular CD players and fit on a regular CD, the data is still stored as 16-bit/44.1 kHz PCM, just like every other CD out there. I don't believe there is any open implementation documentation to prove or disprove this (typical Microsoft), but here's my understanding of how it works: there is no fancy codec or compression algorithm. After mastering, the original recording is encoded at 20-bit/44.1 kHz, then the lowest 4 bits are thrown away, since they're mostly insignificant anyway, and the remaining 16 bits are recorded on the CD. A player that recognizes HDCD will logically shift the bits back up to 20, filling in the bottom 4 bits with zeros, which provides greater dynamic range but is less accurate than a "real" 20-bit sample. A regular CD player will simply play the 16-bit samples as-is, with reduced dynamic range compared to an HDCD player, but they'll sound just fine.

With the dynamic compression applied to most CDs today, I doubt the increased range of HDCD matters much in practice. I think I have one or two HDCDs, but I've never noticed anything special about their sound.
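If I've got that right, the scheme is simple enough to sketch in a few lines of Python. To be clear, this is just my understanding of the truncate-and-shift idea, not the actual HDCD spec, and the function names and sample value are made up for illustration:

```python
def truncate_to_16(sample_20bit: int) -> int:
    """Drop the lowest 4 bits of a 20-bit sample so it fits 16-bit PCM."""
    return sample_20bit >> 4

def reconstruct_20(sample_16bit: int) -> int:
    """What an HDCD-aware player would do (in my understanding):
    shift back up to 20 bits, filling the bottom 4 bits with zeros."""
    return sample_16bit << 4

# Arbitrary 20-bit sample value, purely for illustration.
original = 0b1010_1100_1110_0101_1011
stored = truncate_to_16(original)      # what ends up on the disc
restored = reconstruct_20(stored)      # what the HDCD player plays

# The restored sample matches the original except that the
# discarded low 4 bits have become zeros.
assert restored == (original & ~0xF)
```

A non-HDCD player would just play `stored` directly as a normal 16-bit sample, which is why the discs remain compatible.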
Assuming the receiver can decode HDCD, I don't see any reason why the data coming out of the CD player's digital output couldn't be detected and decoded the same way. If the receiver doesn't understand HDCD, you'll just hear the disc the way any other CD player would play it.