Basically, what you want to understand is the effect that variations in the common-mode voltage and the supply voltage have on the differential output current. This can be done with an AC analysis where your AC source is connected either in series with the supply voltage source or in series with the common-mode voltage source.
This analysis should be done at the maximum (DC) differential output voltage, because that is the situation where your opamp is most "unbalanced".
If you run these analyses with zero DC output voltage, the two halves of the opamp (the positive-output half and the negative-output half) have exactly the same bias currents and voltages, and therefore the same gm, gds, parasitic capacitances, etc. In this situation your CMRR and PSRR will look very good, so you should perform these simulations in the worst case mentioned in the paragraph above.
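As a sketch of how the two AC analyses turn into PSRR and CMRR numbers: export the gain magnitudes from each run and take the ratios in dB. The gain values below are hypothetical placeholders, not results from a real simulation.

```python
import math

# Hypothetical gain magnitudes exported from the AC analyses
# (placeholder numbers standing in for real simulator output):
a_dm = 1000.0   # differential gain |A_dm|, AC source at the input
a_vdd = 0.5     # |A_vdd|, AC source in series with the supply source
a_cm = 0.2      # |A_cm|, AC source in series with the common-mode source

# PSRR = A_dm / A_vdd and CMRR = A_dm / A_cm, both expressed in dB
psrr_db = 20 * math.log10(a_dm / a_vdd)
cmrr_db = 20 * math.log10(a_dm / a_cm)

print(f"PSRR = {psrr_db:.1f} dB, CMRR = {cmrr_db:.1f} dB")
```

In practice you would evaluate these ratios across the whole frequency sweep, not at a single point, since both figures degrade at high frequency.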
That depends a lot on your application. To give you an example, imagine you are designing a sample-and-hold (S&H) for an ADC, and that the LSB (least significant bit) is 1 mV. Now suppose you know that, due to the inductance of the bonding wires, your supply voltage has oscillations around 100 MHz, and that these oscillations have a maximum amplitude of 100 mV.
If you want this 100 mV oscillation to appear as no more than a 1/4 LSB = 0.25 mV oscillation at the output of the S&H, you need a PSRR of at least 20*log10(100 mV / 0.25 mV) ≈ 52 dB at 100 MHz.
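The arithmetic above can be checked directly; the numbers are the ones from the example (1 mV LSB, 100 mV supply ripple, 1/4 LSB allowed at the output):

```python
import math

lsb = 1e-3           # ADC LSB: 1 mV
ripple = 100e-3      # supply oscillation amplitude at 100 MHz: 100 mV
max_out = lsb / 4    # allowed output oscillation: 1/4 LSB = 0.25 mV

# Required attenuation from supply to output, expressed in dB
psrr_req_db = 20 * math.log10(ripple / max_out)
print(f"Required PSRR at 100 MHz: {psrr_req_db:.0f} dB")  # → 52 dB
```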
Dear max,
As you know, the PSRR of an opamp is the ratio of A_dm to A_vdd. In your reply, is there an implicit assumption that the opamp is a unity-gain buffer, so that A_dm is one?
By the way, A_dm is the differential gain and A_vdd is the gain from the power supply to vout.