
1dB Compression voltage simulation

Status
Not open for further replies.

freewing

I'm simulating the 1 dB compression voltage of an amplifier. It's strange that the gain does not decrease monotonically as the input amplitude increases. For instance, from 25 mV to 150 mV the gain rises from 1.55 to 1.63, and then from 150 mV to 300 mV it falls from 1.63 to 1.33. What's the reason, and how can I extract the 1 dB compression voltage?
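Gain expansion before compression, as described above, is common in nonlinear amplifiers, so the 1 dB compression point is conventionally referenced to the small-signal gain (the gain at the lowest drive level), not the peak gain. A minimal sketch of that post-processing, assuming you have exported a table of input amplitude versus gain from the sweep (the numbers below are hypothetical, loosely based on the values quoted in the question):

```python
import numpy as np

def p1db_input(vin, gain_db):
    """Input amplitude at which gain drops 1 dB below the small-signal gain.

    vin      -- input amplitudes (V), ascending
    gain_db  -- corresponding gain in dB
    """
    ref = gain_db[0]            # small-signal gain = gain at lowest drive
    drop = ref - gain_db        # compression in dB (negative = expansion)
    idx = int(np.argmax(drop >= 1.0))  # first point past 1 dB compression
    if drop[idx] < 1.0:
        raise ValueError("sweep never reaches 1 dB compression")
    # linear interpolation between the two bracketing sweep points
    x0, x1 = vin[idx - 1], vin[idx]
    y0, y1 = drop[idx - 1], drop[idx]
    return x0 + (1.0 - y0) * (x1 - x0) / (y1 - y0)

# hypothetical swept data: gain expands, then compresses
vin = np.array([0.025, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30])   # V
gain = np.array([1.55, 1.58, 1.61, 1.63, 1.55, 1.45, 1.33])   # V/V
gain_db = 20 * np.log10(gain)

print(f"1 dB compression voltage ~= {p1db_input(vin, gain_db):.3f} V")
```

With these sample numbers the 1 dB point lands between the 250 mV and 300 mV sweep points. A finer amplitude sweep around the interpolated value would tighten the estimate.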
 

The problem is circuit nonlinearity.
What are the DC operating points? Keep them fixed when you sweep the input for the 1 dB compression voltage.
 

In order to get the 1 dB compression point, you need to run a PSS simulation.
 

I successfully obtained the 1 dB compression point using PSS. It's more convenient.
 

