AllenD
Member level 5
Hi team
I am having some trouble with the DC output voltage bias of a variable gain amplifier (VGA). The topology is shown in the attached picture.
![VGA.png](https://www.edaboard.com/data/attachments/80/80793-9a30ef128057c07c6a6e7a09d61bc876.jpg)
I understand that when Vb=Vdd, M3 is off and Vout_DC is set by the gm1/gm2 amplifier alone. However, as Vb gets smaller, M3 starts to turn on and draw current, so Vout_DC should rise. The more current M3 draws, the higher Vout_DC becomes.
Won't this limit the output voltage range? I could not find any paper that discusses this issue, and the authors don't seem to treat it as a problem, so I suspect I must be missing something here...
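To show what I mean, here is a minimal numeric sketch of the mechanism I described. It is not taken from the schematic: it just assumes M3 is a PMOS following a simple square-law, and that its drain current lifts Vout_DC through some effective output-node resistance R_out. All component values (Vdd, Vth, k, R_out, Vout0) are made-up illustrative numbers.

```python
# Hypothetical sketch of how current drawn by M3 could shift Vout_DC.
# All parameter values below are assumptions for illustration only.

Vdd = 1.8          # supply voltage (V), assumed
Vth = 0.5          # |Vth| of M3 (V), assumed
k = 1e-3           # square-law factor 0.5*mu*Cox*W/L (A/V^2), assumed
R_out = 5e3        # effective resistance seen at the output node (ohm), assumed
Vout0 = 0.9        # Vout_DC with M3 off, set by the gm1/gm2 amplifier, assumed

def vout_dc(Vb):
    """Square-law estimate of Vout_DC versus the gain-control voltage Vb.

    M3 is modeled as a PMOS with Vsg = Vdd - Vb; it stays off while
    Vb >= Vdd - Vth, and in saturation otherwise.
    """
    vov = (Vdd - Vb) - Vth       # overdrive of M3
    if vov <= 0:
        return Vout0             # M3 off: Vout_DC set by the amplifier alone
    i_m3 = k * vov**2            # current drawn by M3 (saturation)
    return Vout0 + i_m3 * R_out  # M3's current lifts the output DC level

for vb in (1.8, 1.4, 1.2, 1.0):
    print(f"Vb = {vb:.1f} V -> Vout_DC ~ {vout_dc(vb):.3f} V")
```

Under these assumed numbers, Vout_DC climbs toward Vdd as Vb is lowered, which is exactly the headroom loss I am worried about.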
Can anyone help, please.
Thanks
Allen