freewing
Member level 1
I'm simulating the 1 dB compression voltage of an amplifier. It's weird that the gain does not decrease monotonically as the amplitude of the input signal increases. For instance, from 25 mV to 150 mV the gain increases from 1.55 to 1.63, and then from 150 mV to 300 mV it decreases from 1.63 to 1.33. What is the reason for this, and how can I get the 1 dB compression voltage?
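To make the question concrete, this is roughly how I've been trying to read the compression voltage off the sweep (a minimal Python sketch using the numbers above as placeholder data; I'm taking the gain at the smallest input amplitude as the small-signal reference, which may be the wrong choice given the initial gain expansion):

```python
import numpy as np

# Hypothetical sweep data: input amplitudes (V) and simulated voltage gains (V/V)
vin  = np.array([0.025, 0.050, 0.100, 0.150, 0.200, 0.250, 0.300])
gain = np.array([1.55,  1.58,  1.61,  1.63,  1.55,  1.45,  1.33])

gain_db = 20 * np.log10(gain)          # convert voltage gain to dB
ref_db  = gain_db[0]                   # smallest-amplitude gain as the small-signal reference
drop_db = ref_db - gain_db             # compression relative to that reference

# Find the first point where compression reaches 1 dB and interpolate the crossing
idx = np.argmax(drop_db >= 1.0)
if idx > 0 and drop_db[idx] >= 1.0:
    v1db = np.interp(1.0, [drop_db[idx - 1], drop_db[idx]], [vin[idx - 1], vin[idx]])
    print(f"1 dB input compression voltage ~= {v1db * 1000:.1f} mV")
else:
    print("Gain never compresses by 1 dB over this sweep; extend the input range.")
```

Is referencing the smallest-amplitude gain the right convention here, or should the reference be taken elsewhere when the gain first expands before compressing?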