I'm guessing the power supplies at my work don't have a built-in current-sensing circuit, because you can't set a current limit before you connect the supply to a PCB.
Instead, I have to put a 1 ohm, 10 watt resistor in series, measure the voltage across it, and convert that voltage to amps to set the current limit on the power supply.
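For reference, here's the Ohm's-law conversion I'm doing by hand; the 0.250 V reading is just a made-up example:

```python
SHUNT_OHMS = 1.0   # the 1 ohm, 10 watt series resistor

def shunt_current_amps(v_across_shunt):
    """Convert the DVM reading across the shunt to current: I = V / R."""
    return v_across_shunt / SHUNT_OHMS

# e.g. the DVM reads 0.250 V across the 1 ohm shunt:
print(shunt_current_amps(0.250))   # 0.25 (amps)

# Sanity check on the 10 W rating: P = I^2 * R, so this shunt is only
# good up to about 3.16 A before it dissipates the full 10 W.
```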
Or I put my DVM in current mode and connect it across the power supply's + and - terminals.
The digital voltage and current displays on the power supplies are way off compared to the output voltage and current measured with a DVM. Why is that?
For circuits that have current-sensing comparators, does the input current to the comparator have to be at a specified current-limit value or bias current? Otherwise, will the voltage trip point be out of spec or out of range?
Example:
Current-sensing comparator: the current-limit input bias is 100 mA and the voltage trip point is 1 volt.
If the current-limit input is instead set to 10 mA, will the comparator still trip at 1 volt?
You have to set the input current to the current-sensing comparator, then adjust the voltage with a variable pot on the power supply until you find the comparator's trip point. But if you current-limit the power supply at a different current, the voltage trip point will be different, because the comparator is a current-sensing comparator? Why is that?
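Here's how I picture it, which may be wrong: a "current-sensing" comparator is really comparing the voltage across a sense resistor against a fixed threshold, so the 1 V trip point itself never moves; what changes is whether the sense current can develop 1 V across that resistor. A toy model (the 10 ohm sense resistance is made up for illustration):

```python
R_SENSE = 10.0    # ohms -- hypothetical sense resistor, not from any datasheet
V_TRIP = 1.0      # volts -- the comparator's fixed threshold

def comparator_trips(i_sense_amps):
    """The comparator compares a voltage: it trips when I * R_SENSE >= V_TRIP."""
    return i_sense_amps * R_SENSE >= V_TRIP

print(comparator_trips(0.100))   # True:  100 mA * 10 ohm = 1.0 V, right at the threshold
print(comparator_trips(0.010))   # False: 10 mA * 10 ohm = 0.1 V, far below 1 V
```

If that model is right, then at 10 mA the comparator never sees 1 V at its input, so it won't trip; the threshold is still 1 V, you just never reach it.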
I have to put a 1 ohm resistor in series with the power supply to set the current limit. Also, the circuit only draws a few milliamps, so the power supply's digital display reads zero (0.000), but when you connect an external meter across the 1 ohm resistor, you can see the current is in the milliamps. The power supply's digital display is useless, and it's wrong.
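My guess, and it is only a guess, is that the panel meter's resolution is too coarse: if it only resolves to 10 mA, anything smaller rounds to zero. A toy illustration (the 10 mA resolution is assumed, not taken from the supply's specs):

```python
def panel_reading_amps(current_amps, resolution_amps=0.01):
    """Quantize a current to the panel meter's step size (10 mA assumed here)."""
    return round(current_amps / resolution_amps) * resolution_amps

i_load = 0.004   # a 4 mA load
print(f"panel display: {panel_reading_amps(i_load):.3f} A")       # 0.000 A -- reads zero
print(f"1 ohm shunt:   {i_load * 1.0 * 1000:.1f} mV on the DVM")  # 4.0 mV -> 4.0 mA
```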