gehan_s
Member level 3
Hey all,
I'm designing a linear power supply (Vout up to 20 V, Iout up to 1 A). After going through some regulator ICs I've chosen the LM317. I've covered most of the basics, and this is my design.
View attachment power.jpg
I regulate the output voltage with a low-pass-filtered PWM signal, which is then given a gain of 4 and applied to the adjust pin of the LM317. This works really well, both in simulation and in the built circuit.
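To make the control path concrete, here is a minimal sketch of the duty-cycle-to-output relation. It assumes a 5 V PWM logic level, an RC filter that averages the PWM to duty × 5 V, the stated gain of 4, and the standard LM317 behaviour Vout ≈ Vadj + 1.25 V (the internal reference); adjust the constants to match the actual schematic.

```python
# Hedged sketch: PWM duty cycle vs. LM317 output voltage.
# Assumptions (not from the schematic): 5 V PWM amplitude, ideal RC
# averaging, non-inverting gain of 4, Vout = Vadj + 1.25 V.
VREF = 1.25    # LM317 internal reference (V)
V_PWM = 5.0    # assumed PWM logic-high level (V)
GAIN = 4.0     # op-amp gain stated in the post

def vout_from_duty(duty):
    """Predicted supply output for a given PWM duty cycle (0..1)."""
    v_adj = duty * V_PWM * GAIN   # filtered PWM, then amplified
    return VREF + v_adj

def duty_for_vout(vout):
    """Duty cycle needed for a target output voltage."""
    return (vout - VREF) / (V_PWM * GAIN)

print(vout_from_duty(0.9375))        # -> 20.0
print(duty_for_vout(10.0))           # -> 0.4375
```

With these assumed numbers, full-scale 20 V output needs roughly 94% duty, which leaves some headroom at the top of the PWM range.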
For sensing the current I use a 1 Ω resistor and a difference amplifier, so the amplifier's output voltage is numerically equal to the current flowing into the load. This also works really well in simulation, but when I build it, it doesn't work. What makes it more frustrating is that when I short-circuit the load, the current is sensed perfectly (it actually resolves 0.01 A increments). How can I resolve this?
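For reference, this is the conversion the current-sense path implies when read by a microcontroller. The ADC resolution and reference voltage are my assumptions (a 10-bit ADC with a 5 V reference is typical for small MCUs), not values from the schematic:

```python
# Hedged sketch: converting the difference-amp output to load current.
# With a 1-ohm shunt and unity-gain difference amp, V_amp == I numerically.
# The 10-bit / 5 V ADC figures are assumptions; match them to your board.
ADC_BITS = 10
V_ADC_REF = 5.0
R_SHUNT = 1.0    # ohms
AMP_GAIN = 1.0   # difference-amplifier gain

def current_from_adc(raw):
    """Convert a raw ADC count to load current in amps."""
    v_amp = raw * V_ADC_REF / (2**ADC_BITS - 1)
    return v_amp / (R_SHUNT * AMP_GAIN)

print(round(current_from_adc(205), 3))   # ~1.0 A near a count of 205
```

Note that with a 10-bit ADC and these assumed values, one LSB is about 5 mA, so the 0.01 A steps you see during the short-circuit test are consistent with this resolution.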
Thanks in advance!