Class A/B, as one might deduce, combines the best of Class A and Class B to create an amplifier without the drawbacks of either. Thanks to this combination of strengths, Class A/B amplifiers largely dominate the consumer market. So how did they do it? The solution is actually fairly simple in concept: where Class B uses a push/pull arrangement with each half of the output stage conducting for 180 degrees, Class A/B amplifiers bump that up to roughly 181-200 degrees. With that overlap there is far less potential for a “gap” in the cycle to occur, and consequently crossover distortion is pushed down to the point where it’s of no consequence.
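To see why the overlap matters, here is a minimal sketch of the idea (my own toy model, not taken from any article): an idealized complementary push/pull stage where each half conducts only while the drive exceeds a turn-on threshold. The threshold values and the 1 V drive level are made-up numbers purely for illustration; shrinking the dead zone plays the role of adding bias overlap.

```
import numpy as np

# Idealized complementary push/pull stage: each half conducts only while
# the drive exceeds its turn-on threshold v_on. Class B leaves a dead zone
# of +/- v_on (the crossover "gap"); Class A/B bias effectively shrinks
# v_on toward zero so each half conducts a bit more than 180 degrees.
def push_pull(v_in, v_on):
    top = np.clip(v_in - v_on, 0, None)      # upper half handles positive swing
    bottom = np.clip(v_in + v_on, None, 0)   # lower half handles negative swing
    return top + bottom

t = np.linspace(0, 1, 100_000, endpoint=False)
v_in = np.sin(2 * np.pi * t)                 # 1 V peak sine drive (arbitrary)

for v_on in (0.2, 0.05, 0.0):                # smaller dead zone = more bias overlap
    gap_deg = np.mean(np.abs(v_in) < v_on) * 360     # degrees/cycle spent in the gap
    residual = np.sqrt(np.mean((v_in - push_pull(v_in, v_on)) ** 2))
    print(f"v_on={v_on:.2f} V: {gap_deg:5.1f} deg/cycle in the gap, "
          f"RMS error {residual:.3f} V")
```

Running it shows the crossover error collapsing as the dead zone closes, which is all the extra conduction angle is really buying you.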
As AH's described, for argument's sake you can have a Class A/B amp biased so that each half conducts for 181 degrees, giving an overlap of 1 degree, and chances are that even at 0.01 W, depending on the load current, you may still end up with some crossover distortion.
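To put a rough number on "depending on the load current": assuming a hypothetical 8 ohm speaker load (my assumption, not stated above), 0.01 W corresponds to only about 50 mA peak, so a barely-overlapped stage spends a meaningful slice of every cycle right around the handoff region.

```
import math

# Back-of-envelope peak load current at low power levels, assuming a
# hypothetical 8 ohm load (my assumption; the post above only says
# "depending on the load current").
R_load = 8.0  # ohms

for p_watts in (0.01, 0.1, 1.0):
    i_rms = math.sqrt(p_watts / R_load)   # P = I_rms^2 * R
    i_peak = i_rms * math.sqrt(2)         # sine-wave assumption
    print(f"{p_watts:5.2f} W -> {i_peak * 1000:6.1f} mA peak")
```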
Enough on definitions and back to the topic at hand: if you look at Fig. 3 (distortion vs. watts for 7 bias currents) in the article you linked, with the output stage biased at 0.08 A, the distortion is 0.67% at 1 watt. The graph does not show output below 0.1 W, but by calculation you can see that if the bias current drops to 0.008 A (or even somewhat higher), you are going to have significant distortion at 0.01 W. I realize I am taking a shortcut, but to come up with a more rigorous argument or example I would have to spend more time revisiting and digesting parts of my textbooks.
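For what it's worth, here is one quick way to sanity-check that extrapolation without the textbooks. A standard push/pull result (e.g. in Douglas Self's treatment, not something from the linked article) is that a Class A/B stage stays in class A until the peak output current exceeds roughly twice the quiescent current, so the class-A power envelope can be estimated straight from the bias current. Again assuming an 8 ohm load:

```
import math

# Rough class-A envelope of a push/pull Class A/B stage: the stage runs
# class A while peak output current stays below ~2 * I_bias (standard
# push/pull result, e.g. per Douglas Self). Assumes an 8 ohm load.
R_load = 8.0  # ohms

def class_a_watts(i_bias):
    i_peak = 2.0 * i_bias                  # class-A limit on peak current
    return (i_peak ** 2) * R_load / 2.0    # sine power: I_peak^2 * R / 2

for i_bias in (0.08, 0.008):
    print(f"I_bias = {i_bias * 1000:4.0f} mA -> class A up to "
          f"~{class_a_watts(i_bias) * 1000:6.1f} mW")
```

With 80 mA of bias the stage stays in class A up to roughly 100 mW, so a 10 mW signal never touches the crossover region; drop the bias to 8 mA and the class-A envelope shrinks to about 1 mW, meaning a 0.01 W signal is riding through the crossover region on every cycle, which is consistent with the extrapolation from Fig. 3.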