arsenal
Full Member level 2
I have designed a bandgap reference in both 0.13 um and 0.09 um.
Simulation shows the 0.13 um one works well: the output variation is less than 5 mV while temperature sweeps from -50 to 150 C, at both 1.6 V and 3.6 V power supply.
The 0.09 um one, however, is much worse. When I compensate the temperature coefficient at a 3.6 V supply and get an error of less than 3 mV over the -50 to 150 C sweep, the variation grows to about 15 mV over the same temperature range at a 1.6 V supply, and the compensated temperature coefficient, which is slightly positive at 3.6 V, goes negative at 1.6 V.
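For context, the first-order compensation I mean is just the standard one (the numbers below are generic textbook values, not my actual sizing):

$$
V_{\mathrm{REF}} \approx V_{BE} + m\,\frac{kT}{q}\ln N,
\qquad
\frac{dV_{\mathrm{REF}}}{dT} \approx \frac{dV_{BE}}{dT} + m\,\frac{k}{q}\ln N
$$

With dV_BE/dT on the order of -1.5 to -2 mV/degC and k/q about 0.086 mV/degC, the product m*ln(N) has to land roughly in the 17 to 23 range for a flat output (again, rough numbers, not my design values).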
Can anyone tell me how to minimize this error across different VDD values?
Is there any trick to 90 nm design?
Thank you.
Added after 1 hour 34 minutes:
Problem solved.
Now I would like to know: what is the main concern in 90 nm design?