Here's something to contemplate. If ripping a CD results in data that is a bit-perfect copy of the original, and you store that ripped data on a hard drive, how could one ripping process sound different from another?
The answer is simple. It cannot.
Yet few among us would suggest one ripping process sounds the same as another. Thus, if all that evidence is true, the data have to be different: perhaps error correction kicked in during the rip. Whatever changed, it was something other than jitter or timing, because hard drives do not store clock data.
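One way to test the premise is to compare two rips bit for bit. Here is a minimal sketch in Python, assuming both rips of the same track were saved as uncompressed WAV files (the filenames are hypothetical); it hashes only the decoded PCM frames, so differing metadata tags don't muddy the comparison.

```python
import hashlib
import wave

def audio_checksum(path):
    """Hash only the decoded PCM frames, ignoring headers and metadata."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return hashlib.sha256(frames).hexdigest()

# Hypothetical filenames: two rips of the same track made with different programs.
rip_a = audio_checksum("track01_itunes.wav")
rip_b = audio_checksum("track01_other_ripper.wav")

print("identical" if rip_a == rip_b else "different")
```

If the two checksums match, the audio data are identical and any audible difference would have to come from playback, not the rip; if they differ, the two ripping programs really did produce different data.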
I wonder if there have been any studies or examples comparing ripped files to the original using different programs. I am often lazy when ripping CDs and use iTunes at its lowest copy rate. My friend Jason Serinus, who is also a reviewer at Stereophile, scolds me for this practice, suggesting the results are less than optimal. Bad, in fact.
I'd love to know what's true and what's not.