Under/overrated Amplifier Power Ratings, and the Effect of Varying Battery/Alternator Voltages
People often get excited when they find out that the amplifier they purchased in fact has an underrated power output.
Here is a hypothetical example:
You purchased an amplifier that is rated to output 1000w RMS into a given load. You got it for a good price and it suited the speaker(s) that you are planning on using. However, after purchase, you find out that this model of amplifier was tested to output 1256w RMS into that same load. You suddenly feel like you got a wonderful deal and that it will be much louder than you initially thought. 256w sure does sound like a lot of extra power, so it has to be a lot louder, right? On the other hand, if you paid for a 1000w RMS amplifier and it was only tested at 799w RMS, you would be rather disappointed, right? In the real world, these differences in power are not very large.
A 25% increase in power will, at best, give you a 1dB increase in output level. 1dB is roughly the smallest difference in volume you will notice over a reasonable listening period; it is not significantly louder, but barely audible. You generally want more than 3dB of extra output to notice a significant increase in volume, and a 3dB increase requires doubling the power (a 100% increase). This also assumes the speaker can actually handle the power linearly - if the speaker cannot handle the power, then the volume will increase even less.
So in the real world, compared to 1000w, 799w or 1256w is not really a big deal. In fact, you need approximately a 60% increase in power to gain just over 2dB. This means that if you paid for an 800w amplifier and got a 1300w amplifier, then you got a good deal. Therefore, you should not be too pedantic about the actual power output of your amplifier, because it takes a fairly large increase in power to make a worthwhile difference.
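These figures are easy to check with the standard decibel formula for a power ratio, dB = 10·log10(P_new / P_old). A minimal Python sketch, using the hypothetical wattages from the example above:

```python
import math

def db_gain(p_old, p_new):
    """Change in output level (dB) for a change in power (watts)."""
    return 10 * math.log10(p_new / p_old)

# The under/overrated amplifiers from the example, relative to a 1000w rating:
print(round(db_gain(1000, 799), 2))   # -0.97 dB -> barely quieter
print(round(db_gain(1000, 1256), 2))  #  0.99 dB -> barely louder
print(round(db_gain(1000, 2000), 2))  #  3.01 dB -> doubling power gives ~3dB
print(round(db_gain(800, 1300), 2))   #  2.11 dB -> the "good deal" scenario
```

Note that both the 799w and 1256w amplifiers land within about 1dB of the 1000w rating, which is right at the threshold of audibility.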
Another common theme is people comparing the power output of their amplifier at two different supply voltages: for example, the tested output at 12.5v (a common engine-off voltage) as compared to 14.4v (a common engine-on voltage). They say: my amplifier only puts out 99w at 12.5v, but will put out 115.2w at 14.4v, so clearly I should make sure that my voltage stays at 14.4v so I always have that extra power. Again, in reality it is not a significant increase in power, so maintaining the voltage at 14.4v is not worth worrying about for that reason.
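The same power-ratio formula, dB = 10·log10(P_new / P_old), shows how small the engine-off versus engine-on difference really is. A quick sketch using the wattages quoted above:

```python
import math

def db_gain(p_old, p_new):
    """Change in output level (dB) for a change in power (watts)."""
    return 10 * math.log10(p_new / p_old)

# Tested output at 12.5v vs 14.4v from the example above:
print(round(db_gain(99, 115.2), 2))  # 0.66 dB -> under the ~1dB audibility threshold
```

At roughly 0.66dB, the difference between the two voltages falls below even the barely-audible 1dB mark.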