There seems to be a lot of misinformation swirling around an amplifier's efficiency. Maybe I can help by explaining a few things.
First off, what do we mean by the term efficiency? It refers to how much of the power an amplifier consumes is delivered as sound versus how much is wasted as heat. Think of it in the same way you might think of fuel efficiency. How many miles per gallon does a car get? The more efficient the car, the more miles per gallon. The fuel consumed in a car has an energy potential. If all of that potential is converted into the forward motion of the car, we can say the car is 100% efficient.
If an amplifier were 100% efficient it would generate no heat and whatever wattage it pulled from the wall would be delivered to the speakers.
The typical class A/B power amplifier is about 50% efficient. This means that for a given number of output watts, let's say 100, the amplifier will produce 100 watts of heat.
I am asked all the time if this means a loss of power. To some, the idea that a power amplifier is only 50% efficient suggests that a 100 watt amp might only be able to output 50% of its rated power. After all, it's only 50% efficient. And that's where we get in trouble.
Hopefully, it helps to understand that an amplifier's efficiency refers to the input power lost while producing its rated power. An amplifier with a rated power of 100 watts always produces its stated output power level. Efficiency merely tells us the cost of producing that power.
Thus, when a class AB amplifier outputs 100 watts into the speaker, it draws 200 watts from the wall: 100 watts go into producing heat, and 100 watts are delivered to the load.
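The arithmetic above can be sketched in a few lines. This is just an illustration of the ratio, not a measurement of any real amplifier, and the function names are my own:

```python
def wall_draw(output_watts, efficiency):
    """Power drawn from the wall to deliver a given output power.

    efficiency is a fraction between 0 and 1 (0.5 for a typical class AB amp).
    """
    return output_watts / efficiency

def heat_watts(output_watts, efficiency):
    """Power lost as heat: total draw minus what reaches the speaker."""
    return wall_draw(output_watts, efficiency) - output_watts

# A class AB amplifier at 50% efficiency delivering 100 watts:
print(wall_draw(100, 0.5))   # 200.0 watts pulled from the wall
print(heat_watts(100, 0.5))  # 100.0 watts turned into heat
```

Note that the rated 100 watts still reaches the speaker; the efficiency figure only tells you how much extra power is consumed along the way.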
Tomorrow I'll confuse the issue more by explaining the mysterious efficiency of a class A amplifier.