Adding distortion to eliminate distortion seems as contradictory as trying to prevent war by preparing for it. Yet in audio practice, seeming contradictions abound: feedback making an amplifier sound less clean, or lower damping factors yielding better bass reproduction, for example. While those examples are controversial, adding negative distortion to distortion to yield no distortion should not be; yet this technique has few adherents. In fact, many do not know that it has ever been suggested. So after reading, in audioXpress's July issue, an article by Graham Dicker that asked why the lower distortion that occurs in amplifiers with an even number of stages, and the higher distortion with an odd number of stages, had not been noted before, I decided that the topic of distortion cancellation by complementary inverse pre-distortion was worth looking into again. (July--notice how long it takes me to get around to writing on a topic.)
Half a century ago was the last time this topic was seriously covered, and the following two articles are well worth a trip to the library: "Non-Linearity Distortion," Wireless Engineer, January 1956, and "Nonlinear Distortion Reduction by Complementary Distortion," IRE Transactions on Audio, Sept-Oct 1959. It is fascinating to read the angry responses to the latter article; but then anger often betrays a lack of understanding. (Paul Klipsch even got into the fray, and on the wrong side, if I remember correctly.) The arguments against lowering distortion by pre-distorting went along the following lines: an amplifier that distorts will only end up distorting the pre-distorted signal, yielding no advantage, as the result will be a distorted pre-distortion; or that while it is possible to cancel 2nd-harmonic distortion with this technique, it is not possible to cancel the higher harmonics. Truly odd arguments; doesn't a distorted inverse distortion equal no distortion? And why only the 2nd harmonic?
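The cancellation argument can be sketched numerically. The following is a minimal illustration, not any circuit from the articles above: assume a hypothetical stage with a pure second-order nonlinearity, y = x + a*x^2, and pre-distort the input by subtracting that same second-order term before the stage. The pre-distortion does get distorted by the stage, just as the critics said, but the residue is of order a^2, far smaller than the original distortion of order a.

```python
import math

A = 0.1  # assumed distortion coefficient of the hypothetical stage

def stage(x, a=A):
    """Hypothetical amplifier stage with 2nd-order nonlinearity: y = x + a*x^2."""
    return x + a * x * x

def predistort(x, a=A):
    """Complementary inverse pre-distortion: subtract the expected 2nd-order term."""
    return x - a * x * x

# Compare peak deviation from the ideal (y = x) over one cycle of a sine wave,
# with and without pre-distortion ahead of the stage.
N = 1000
plain_err = 0.0
pre_err = 0.0
for n in range(N):
    x = math.sin(2 * math.pi * n / N)
    plain_err = max(plain_err, abs(stage(x) - x))          # ~ a (order a)
    pre_err = max(pre_err, abs(stage(predistort(x)) - x))  # ~ 2*a^2 (order a^2)

print(plain_err, pre_err)
```

Expanding the algebra shows why: with p(x) = x - a*x^2 fed to the stage, the output is x - 2a^2*x^3 + a^3*x^4, so the leading error term shrinks from a to 2a^2; for a = 0.1 that is roughly a fivefold reduction, and the smaller the stage's distortion, the more complete the cancellation. Exact cancellation would require the exact inverse of the stage's transfer curve, not just its first-order approximation.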