Linearity improvement due to source degeneration


qwerty99

How does source degeneration help improve linearity in a common-source amplifier?
 

Re: Linearity improvement due to source degeneration

The effective gm of the transistor is roughly gm/(1 + gm*R), where R is the degeneration resistance and gm is the transconductance of the device. Thus, by making gm*R >> 1, the effective gm approaches 1/R. Since R is far more linear than gm, you get better linearity.
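You can see this numerically with a minimal sketch in Python (the square-law model and the values of k, Vt, R, the bias, and the drive amplitude below are illustrative assumptions, not from this thread): drive the gate with a sine, solve the degenerated square-law device for its drain current, and compare the harmonic distortion with and without R.

Code:
import numpy as np

# Illustrative square-law NMOS: id = k*(vgs - Vt)^2 (assumed values)
k  = 2e-3    # A/V^2
Vt = 0.5     # V
R  = 2e3     # source degeneration resistance, ohms
VG = 1.2     # DC gate bias, V

t   = np.linspace(0.0, 1e-3, 4096, endpoint=False)   # one 1 kHz period
vin = VG + 0.2 * np.sin(2 * np.pi * 1e3 * t)          # 0.2 V amplitude drive

def drain_current(vg, Rs):
    """Solve id = k*(vg - id*Rs - Vt)^2 for the physical (smaller) root."""
    u = np.maximum(vg - Vt, 0.0)      # overdrive if the source were grounded
    if Rs == 0.0:
        return k * u**2
    a = k * Rs**2
    b = -(2 * k * Rs * u + 1.0)
    c = k * u**2
    return (-b - np.sqrt(b*b - 4*a*c)) / (2*a)  # smaller root keeps vgs > Vt

def thd(x):
    """Total harmonic distortion from the FFT of one signal period."""
    X = np.abs(np.fft.rfft(x - x.mean()))
    f = int(np.argmax(X))             # fundamental bin
    return np.sqrt(np.sum(X[2*f::f]**2)) / X[f]

for Rs in (0.0, R):
    print(f"Rs = {Rs:6.0f} ohm -> THD of id = {100*thd(drain_current(vin, Rs)):.2f} %")

Running this, the degenerated current should show noticeably lower THD than the undegenerated one, consistent with the effective-gm argument above.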
 
Regarding the DC bias of a CS stage:
Suppose a noise overshoot at VG increases VGS by delta_VGS. This makes the drain current ID increase by delta_ID.
But delta_ID flows through RS, which raises the source voltage VS by delta_ID * RS,
so this rise counteracts the overshoot at the input and pulls VGS back toward its stable designed value.
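Putting the same argument in small-signal form (a sketch, using the same notation):

delta_VGS = delta_vg - delta_VS = delta_vg - delta_ID * RS
delta_ID = gm * delta_VGS

Substituting the second equation into the first and solving:

delta_VGS = delta_vg / (1 + gm*RS)

so any disturbance at the gate is attenuated by the loop factor (1 + gm*RS) before it appears across VGS, which is exactly the bias-stabilizing feedback described above.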

 

A MOSFET has a non-linear relationship between its gate-source voltage and its drain current, which makes the output voltage (across the drain load resistor) a non-linear function of the input voltage.
A source resistor reduces the effect of this non-linearity (at the expense of lower gain), since it makes the drain current a more linear function of the input voltage.
Essentially, the much larger AC voltage developed across the source resistor by the drain current swamps out much of the non-linear effect of the small gate-source voltage change.
The disadvantage is that this resistor voltage at the source also subtracts from the effective gate-source input voltage, reducing the circuit gain.
So you trade off gain for better linearity.
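That trade-off can be made explicit with the same small-signal picture (a sketch; RD here denotes the drain load resistor). Using delta_VGS = vin / (1 + gm*RS) from above:

vout = -delta_ID * RD = -gm * delta_VGS * RD = -(gm*RD / (1 + gm*RS)) * vin

so the gain is Av = -gm*RD / (1 + gm*RS), which tends to -RD/RS when gm*RS >> 1. The gain is then set by a ratio of linear resistors rather than by the non-linear gm: lower gain, better linearity.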
 
Is there a way to prove this analytically?
 
