When we use our very expensive Audio Precision system to measure distortion levels, what we're looking for is additive: harmonics generated in response to an input signal.
The simple explanation of how this works is that the AP generates a remarkably clean sinewave. It's pure to many zeros (you can check by running a distortion analysis on the signal itself, without the device under test inserted). The idea is that whatever the amplifier under test does to that pure input signal is called distortion, in the form of added harmonics: generated tones that were not part of the input signal.
Zero distortion would mean nothing had been added to the signal.
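The additive measurement described above can be sketched in a few lines of Python. This is a toy illustration of the principle, not how the Audio Precision actually works internally: we feed a clean sinewave through a stand-in nonlinearity (here a `tanh` soft-clipper, my choice for the example), then compare the FFT energy at the harmonic frequencies against the fundamental. The function name `thd_percent` and all the parameter values are illustrative assumptions.

```python
import numpy as np

def thd_percent(signal, fs, f0, n_harmonics=5):
    """Estimate total harmonic distortion via FFT: the ratio of the
    amplitudes at 2*f0, 3*f0, ... to the fundamental at f0, as a %."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def peak(f):
        # amplitude of the FFT bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]

    fundamental = peak(f0)
    harmonics = np.sqrt(sum(peak(f0 * k) ** 2
                            for k in range(2, n_harmonics + 2)))
    return 100 * harmonics / fundamental

fs, f0 = 48_000, 1_000                    # sample rate, test-tone frequency
t = np.arange(fs) / fs                    # one second of samples
pure = np.sin(2 * np.pi * f0 * t)         # the "remarkably clean" sinewave

# crude stand-in for a device under test: mild soft clipping,
# which generates harmonics that were not in the input
distorted = np.tanh(1.5 * pure) / np.tanh(1.5)

print(f"source THD: {thd_percent(pure, fs, f0):.5f}%")
print(f"device THD: {thd_percent(distorted, fs, f0):.2f}%")
```

Run against the bare source signal, the number sits down at the numerical floor; run against the clipped version, several percent of harmonic content shows up. That difference is exactly what the analyzer calls distortion.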
But what about subtraction? What does all this fancy test equipment tell us about missing information? About losing information as opposed to adding it? Other than a frequency test to confirm a steady-state sinewave is all there, the system basically cannot tell us if anything's missing.
And that's a real shame. There are tests possible, like a difference test where we compare the output against the input using a musical signal, but this test is neither routine nor are the results ever zero, meaning we can always measure that something's missing.
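A difference (null) test can also be sketched simply, under some assumptions of my own: the signals are already time-aligned, we level-match with a least-squares gain so a pure volume difference doesn't count as loss, and a one-pole low-pass filter stands in for a device that removes high-frequency information. The name `null_residual` is hypothetical.

```python
import numpy as np

def null_residual(reference, output):
    """Level-match the output to the reference, subtract, and report
    what's left over, in dB relative to the reference signal."""
    # least-squares gain match so level differences alone don't count
    gain = np.dot(output, reference) / np.dot(output, output)
    residual = reference - gain * output
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    # floor avoids log(0) when the null is perfect
    return 20 * np.log10(max(rms(residual), 1e-9) / rms(reference))

rng = np.random.default_rng(0)
music = rng.standard_normal(48_000)       # stand-in for a musical signal

# a device that *removes* information: a crude one-pole low-pass
lost_highs = np.copy(music)
for i in range(1, len(lost_highs)):
    lost_highs[i] = 0.6 * lost_highs[i] + 0.4 * lost_highs[i - 1]

print(f"perfect wire: {null_residual(music, music):.1f} dB")
print(f"lossy device: {null_residual(music, lost_highs):.1f} dB")
```

The point of the sketch is the asymmetry in the post: a perfect wire nulls down to essentially nothing, while any real device leaves a residual, and that residual contains what was taken away just as much as what was added.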
I think it's important to realize what's on offer when a manufacturer suggests their products haven't any distortion. What is really being said is that their products aren't adding anything.
We don't talk about what's missing.