My personal opinion is that it's pretty much a wash.
A class AB amp uses transistors in a push-pull configuration, biased in the active region except in the crossover region, which presents its own unique design challenges.
Class D amps typically use FETs that are either fully on or fully off, then use a reactive filter to "smooth" the switching waveform back out into analog.
In the class AB amp, the wasted power is dissipated by the transistors. In the class D amp, the losses show up primarily in the filter, with some additional loss through the RDS(on) of the FETs. The class D is more efficient because the filter is built from reactive components that dissipate power only in proportion to their equivalent series resistance (ESR), which is low compared to the dissipation of a transistor held in its linear region as in class AB.
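If it helps to put rough numbers on that, here's a quick back-of-envelope sketch in Python with made-up values (10 mΩ of RDS(on), a 10 V drop across the linear device, 5 A of load current) just to show why a low on-resistance wastes far less than a device sitting in its linear region:

```python
# Rough conduction-loss comparison with assumed (not measured) values.
# This is not an amplifier model, just the two loss mechanisms side by side.

def fet_conduction_loss(i_rms, rds_on):
    """Loss in a switching FET that is fully on: I_rms^2 * RDS(on)."""
    return i_rms ** 2 * rds_on

def linear_stage_loss(v_drop, i_avg):
    """Loss in a linear (class AB) device: voltage across it times current through it."""
    return v_drop * i_avg

i_load = 5.0  # amps of load current (assumed)

print(fet_conduction_loss(i_load, 0.010))  # 10 mOhm FET -> about 0.25 W
print(linear_stage_loss(10.0, i_load))     # 10 V across the transistor -> 50 W
```

The exact numbers don't matter; the two or three orders of magnitude between them is the point.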
But as far as sonic performance goes, I don't see a reason why a well designed class D amplifier can't be as good as a well designed class AB amplifier, although the class AB may need a bigger power supply. As an extreme example, consider an amp that is 50% efficient: for 100 watts out, its power supply has to provide 200 watts in. A 100% efficient amp (no such thing) would need only 100 watts in to provide 100 watts out. The performance of the two amps could be the same even though, obviously, the demands on their power supplies are vastly different.
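That supply-sizing arithmetic is nothing more than P_in = P_out / efficiency; a trivial sketch of the example above:

```python
# Required supply power for a given output power and efficiency.

def supply_power(p_out_watts, efficiency):
    return p_out_watts / efficiency

print(supply_power(100, 0.5))  # 50% efficient amp: 200 W in for 100 W out
print(supply_power(100, 1.0))  # ideal (impossible) 100% efficient amp: 100 W in
```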
So to me the Pioneer claim that they are "better" really boils down to the fact that they could design a less powerful and thus less expensive power supply.
As has been mentioned many times in this forum: Try it both ways and let your ears decide.
Sorry to go all electrical engineer on y'all, but that's who I am...