I have done some more noodling and reading on bit-perfect ripping with iTunes and other rippers like XLD. On the surface, the differences I saw ripping a CD this evening would suggest that XLD somehow gives a more consistent and accurate rip. That's not necessarily true, however.
If I had the patience, I would do as this reviewer did and run 10 consecutive rips, then set all ten files up for a digital file compare. iTunes hit it dead on: 10 for 10 rips, all bit identical. That's not to say XLD couldn't go 10 for 10 just as well. The difference between the iTunes rip and the XLD rip may simply be the digital audio offset at the beginning of the CD, i.e., when does the application start recording music content?
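For anyone who wants to try that kind of compare at home, the quickest way I know is to hash each rip and see whether the hashes agree. Here's a rough Python sketch; the "rip_*.wav" file pattern is just a placeholder for wherever your ten rips ended up, not anything produced by iTunes or XLD itself.

import glob
import hashlib

def file_hash(path, chunk_size=1 << 20):
    """Return the SHA-256 hash of a file, read in chunks so big WAVs don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Placeholder pattern -- point it at your own set of consecutive rips.
rips = sorted(glob.glob("rip_*.wav"))
hashes = {path: file_hash(path) for path in rips}

for path, digest in hashes.items():
    print(path, digest)

if len(set(hashes.values())) == 1:
    print("All rips are bit identical.")
else:
    print("At least one rip differs.")

If every file produces the same hash, the rips are bit identical down to the last byte; any offset difference between applications would show up as differing hashes even when the music itself matches.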
The author of this article compares just that: once XLD and iTunes begin recording, are the results bit identical from that point on? In his sample of 10 files ripped with each, they were indeed bit identical once the music began. Apparently, if one is patient enough to lay the files out and do a very detailed compare, it all comes out in the end.
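In case it helps anyone reproduce that kind of check, here's a crude sketch of an offset-tolerant compare: read the PCM data out of each WAV, skip past any leading run of zero bytes (the digital silence before the music starts), and compare what's left byte for byte. The file names are made up, and this assumes plain PCM WAV files; it's only a sketch, not a polished tool.

import wave

def audio_after_silence(path):
    """Return a WAV file's PCM data with any leading all-zero bytes stripped off."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    start = 0
    while start < len(frames) and frames[start] == 0:
        start += 1
    return frames[start:]

# Placeholder names for the same track ripped with each application.
itunes_rip = audio_after_silence("track01_itunes.wav")
xld_rip = audio_after_silence("track01_xld.wav")

if itunes_rip == xld_rip:
    print("Bit identical once the music begins.")
else:
    print("The rips differ beyond just the leading offset.")

It's crude because it treats any leading zero bytes as silence, but for two rips of the same track that differ only by a whole-sample offset at the start, it should come back "bit identical", which matches what the article's author found.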
What I learned this evening is that unlike regular computer files in a traditional data format, CD audio gets laid down and read back in a fairly unique way that doesn't guarantee read accuracy to the bit level. I am still trying to figure that out and gather more data points. It seems to have something to do with the space at the start of each track, the inter-track gaps between songs, and the initial playback gap at the beginning of the CD.
I guess that's a very long-winded way of saying that all of this is a big surprise to me. I don't know that I have it figured out just yet, and I certainly won't claim any expertise here. There's more to know and understand than I have managed so far.