hi all,
I am new to analog design. How do I obtain the bias voltages for the transistors in my circuits? Do I have to build special circuits for them?
I mean, if the supply is 1.5V and I need 0.5V, do I have to make a potential divider or something? If that is the method, how can I make it consume minimal power?
This depends on where the bias voltages are needed.
For bias voltages used in current mirrors (e.g. wide-swing current mirrors), you can simply bias by passing a constant current through a diode-connected transistor and then adjust the sizing to give the required voltage (see Johns and Martin, Ch. 6, for more details).
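To make the sizing step concrete, here is a first-order square-law sketch of that technique: given a bias current, solve for the W/L that makes the diode-connected device's V_GS land on the target voltage. The process parameters (`kp`, `vth`) and the numbers in the usage example are illustrative assumptions, not values from the thread, and the model ignores channel-length modulation and short-channel effects.

```python
import math

def diode_connected_vgs(i_bias, w_over_l, kp=200e-6, vth=0.4):
    """Gate voltage of a diode-connected NMOS in saturation (square law):
    V_GS = Vth + sqrt(2*I_D / (kp * W/L)).
    kp = mu_n * Cox [A/V^2] and vth [V] are assumed process values."""
    return vth + math.sqrt(2.0 * i_bias / (kp * w_over_l))

def size_for_vgs(i_bias, vgs_target, kp=200e-6, vth=0.4):
    """Invert the square law to get the W/L that yields a target V_GS:
    W/L = 2*I_D / (kp * (V_GS - Vth)^2)."""
    return 2.0 * i_bias / (kp * (vgs_target - vth) ** 2)
```

For example, with 10 uA of bias current and the assumed parameters, `size_for_vgs(10e-6, 0.5)` returns W/L = 10, and feeding that back through `diode_connected_vgs` recovers the 0.5V target. In a real design you would refine this hand estimate in a simulator.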
Applications requiring a very stable voltage (as in ADCs) need a bandgap voltage. In that case, you need to design a bandgap voltage reference.
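The idea behind a bandgap reference can be sketched with first-order arithmetic: a CTAT base-emitter voltage (roughly -2 mV/K) is summed with a scaled PTAT term K * V_T * ln(N), where N is the emitter-area ratio of two BJTs, and K is chosen so the temperature coefficients cancel, giving roughly 1.25V. The numbers below (`vbe`, `K`, `N`) are illustrative assumptions.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant [J/K]
Q_E = 1.602176634e-19  # electron charge [C]

def thermal_voltage(temp_k):
    """V_T = kT/q, about 25.85 mV at 300 K."""
    return K_B * temp_k / Q_E

def bandgap_vref(vbe, K, N, temp_k=300.0):
    """First-order bandgap output: V_ref = V_BE + K * V_T * ln(N).
    The PTAT term's slope is K * (k/q) * ln(N); K is picked so it
    cancels the CTAT slope of V_BE (about -2 mV/K)."""
    return vbe + K * thermal_voltage(temp_k) * math.log(N)
```

With assumed V_BE = 0.65 V, N = 8 and K chosen near 11.2 to cancel a -2 mV/K CTAT slope, the output comes out close to the familiar 1.25V. A production design then deals with curvature, offset, and start-up, which this first-order sum ignores.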
Thanks!
But with this method I can only obtain one value of bias voltage (assuming one reference current source on the chip passing its current through the diode-connected device)... is this right?
Yes, if you use one diode-connected transistor you obtain one bias voltage. But you could stack MORE diode-connected transistors in the same branch (carrying the same current) and tap the intermediate nodes to obtain other bias voltages. If you are working in CMOS, you can try PMOS or NMOS devices according to which bias voltage you need.
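The stacking idea above can be sketched numerically: each diode-connected device in the branch adds its own V_GS, and the tap at node k sits at the running sum of the V_GS values below it. This is a first-order square-law estimate with assumed process values (`kp`, `vth`); in particular it ignores the body effect, which raises the threshold of the upper devices in a real NMOS stack.

```python
import math

def stack_tap_voltages(i_bias, w_over_l_list, kp=200e-6, vth=0.4):
    """Tap voltages (above ground) in a branch of stacked diode-connected
    NMOS devices all carrying i_bias. One square-law V_GS per device;
    body effect and channel-length modulation are neglected."""
    taps, v = [], 0.0
    for wl in w_over_l_list:
        v += vth + math.sqrt(2.0 * i_bias / (kp * wl))
        taps.append(v)
    return taps
```

For example, two identical devices with W/L = 10 at 10 uA (using the assumed parameters) give taps near 0.5 V and 1.0 V, so one reference current yields two bias voltages from a single branch. Individual W/L values can be adjusted per device to place each tap where it is needed, within the headroom the supply allows.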