Basically, there are two recognized standards for rating power amplifiers: FTC and EIA.
The FTC standard, established in 1974 by the Federal Trade Commission, requires that a manufacturer's power specification be met with both channels driven over the advertised frequency range at the advertised total harmonic distortion (THD) rating. Manufacturers typically use the full audio bandwidth, 20 Hz to 20 kHz, for this rating.
The EIA rating, established by the Electronic Industries Association, is the power output of a single channel driven at mid-band (typically 1 kHz) at 1% THD. This standard tends to produce a power rating 10 to 20% higher than the FTC rating, because it takes more power to drive a full-bandwidth signal than a single-frequency one.
One method isn't necessarily better than the other, as long as you're comparing "apples to apples" from one amp to the next. Most audio enthusiasts consider the FTC method more "real world," since most audio program material is wide-bandwidth, carrying everything from low frequencies to high.
Notice too that with FTC ratings, many manufacturers can show that their amplifier's full-range power spec is met at THD levels significantly lower than the EIA's 1% figure (0.01%, 0.001%, etc.). An amp with an EIA power rating at 1% THD might be putting out significantly less power at a lower distortion level.
Hope this helps.
Regards,
Wayne