I write a lot about how little measurements matter, but that isn’t always the case. In early designs of amplifiers as well as CD players, many of the measurements were downright awful: the low frequency response of transformer coupled tube amplifiers, for example, or the square wave response of early CD players.
Take a look at this excerpt from a 2013 article written by John Atkinson in Stereophile.
In theory, because the CD system is limited to a maximum bandwidth of half the sampling rate—22kHz—it can’t actually reproduce squarewaves. A squarewave can be shown by Fourier analysis to comprise a series of odd-order harmonics regularly dropping in amplitude with increasing frequency. For perfect reproduction of a 1kHz squarewave, therefore, we would need to be able to reproduce the 1kHz, 3kHz, 5kHz, 7kHz, etc., components, all the way to infinitely high frequency. However, as the CD system will not reproduce the harmonics above the 21st, at 21kHz, the 1kHz squarewave in track 22 will not have a true square shape, but instead will look like fig.5. It looks as though there is overshoot and ringing before and after each transition from high to low and vice versa; in fact, what you see is what is termed Gibbs Phenomenon—the effect of omitting the high harmonics that would otherwise “square up” the waveform.
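The band-limited square wave John describes is easy to reproduce numerically. Here’s a minimal pure-Python sketch (the sample grid and parameter choices are mine, not from the article) that sums the odd harmonics of a 1kHz square wave up through the 21st, just as the CD’s roughly 22kHz bandwidth limit would, and shows the Gibbs overshoot he mentions:

```python
import math

f0 = 1_000.0                # square wave fundamental, Hz
fs = 192_000.0              # dense time grid used only to render the waveform
samples = int(2e-3 * fs)    # two full cycles of the 1kHz wave

# Fourier series of a square wave: odd harmonics n with amplitude 4/(pi*n).
# Truncating at the 21st harmonic mimics the CD's ~22kHz band limit.
def bandlimited_square(t, max_harmonic=21):
    return sum(4 / (math.pi * n) * math.sin(2 * math.pi * n * f0 * t)
               for n in range(1, max_harmonic + 1, 2))

wave = [bandlimited_square(i / fs) for i in range(samples)]

# Gibbs phenomenon: the truncated series overshoots the ideal flat top of
# 1.0 by roughly 9% of the total jump, so the peak lands near 1.18 -- the
# "overshoot and ringing" visible around each transition in fig.5.
print(f"peak = {max(wave):.3f}  (ideal square wave top = 1.000)")
```

Adding more harmonics narrows the ringing but never shrinks the overshoot; that’s the essence of Gibbs Phenomenon, and why no band-limited system can “square up” the corners.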
John’s article was written roughly 30 years after the CD was first introduced.
Here’s what I had to say about square wave measurements of CD players in a 1985 issue of The Absolute Sound.
Not sure what I was looking at when I made that comment, but apparently it was at least as ugly as what John posted.
What I wrote still stands true today.
Thanks to reader David Allen for sending me this photo.