grizedale
Home-made coaxial probe for output voltage ripple measurement?
Hello,
Please could you tell me if the following output voltage ripple measurement, which is the standard procedure at a HUGE international telecoms company, is correct?
It concerns the measurement of the voltage ripple on the SMPSs inside the base station.
This method is used for all the different SMPSs in the base station, which have switching frequencies from 100 kHz to 1 MHz and output voltages from 0.9 V to 60 V.
The method used was:
Take a 1 metre length of 50 Ohm coaxial cable.
Solder a 47R axial resistor to the inner conductor at one end of the cable.
Then solder this 47R resistor to the 'plus' trace of the output voltage.
Then "peel away" some of the outer conductor of the cable and solder that to the 'minus' trace of the output voltage.
Then fit a BNC connector on the other end of the cable.
Then plug this into the scope input.
The scope was a LeCroy LC684DM:
**broken link removed**
My query is that page 5-4 of the above scope manual says the input impedance of the scope channel can be set to 50 Ohms, and that is what it is set to for this measurement.
So my question is: the characteristic impedance of the 1 metre of coaxial cable is already 50 Ohms, so why are they soldering the 47R resistor to the end of the coaxial cable?
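For what it's worth, here is my own back-of-envelope figure (my numbers, not part of the company procedure): as I understand it, the 47R in series with the scope's 50 Ohm input forms a resistive divider, so the scope would only see about half of the actual ripple:

```python
import math

# Rough sanity check -- my own assumption, not from the company procedure.
# The 47R tip resistor and the scope's 50 Ohm input form a resistive divider;
# the 47R also roughly back-terminates the source end of the 50 Ohm cable.
R_series = 47.0   # series resistor soldered at the probe tip (Ohms)
R_scope = 50.0    # scope channel input impedance (Ohms)

divider = R_scope / (R_series + R_scope)   # fraction of the ripple the scope sees
atten_db = 20 * math.log10(divider)        # same thing in dB

print(f"divider ratio = {divider:.3f}  ({atten_db:.1f} dB)")
# divider ratio = 0.515  (-5.8 dB)
```

If that is right, the true ripple would be roughly double what the scope displays, so presumably the readings have to be scaled up by about 2 (or the 5.8 dB added back).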
Also, it is sometimes fiendishly difficult to solder the "peeled away" outer conductor of the cable to the PCB under test, so a short, 1 inch piece of twisted wires is then used at the end of this coaxial cable assembly to make it easier to solder it to the output voltage. Is this acceptable procedure, or would that 1 inch piece of twisted pair wire infect the reading with too much noise?
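My own rough guess at what that pigtail does, assuming a rule-of-thumb figure of roughly 20 nH per inch of wire (my assumption, not a measured value):

```python
import math

# Back-of-envelope impedance of the 1 inch twisted-wire pigtail vs frequency.
# 20 nH per inch is a common rule of thumb for thin hookup wire, not a measured value.
L_pigtail = 20e-9   # ~20 nH assumed for the 1 inch pigtail

for f in (100e3, 1e6, 100e6):        # SMPS fundamentals, plus an assumed fast-edge/ringing frequency
    Z = 2 * math.pi * f * L_pigtail  # |Z| = 2*pi*f*L for an ideal inductor
    print(f"{f/1e6:.1f} MHz : |Z| ~ {Z:.2f} Ohm")
# 0.1 MHz : |Z| ~ 0.01 Ohm
# 1.0 MHz : |Z| ~ 0.13 Ohm
# 100.0 MHz : |Z| ~ 12.57 Ohm
```

If that reasoning is right, the pigtail looks negligible at the switching fundamentals but could matter for the high-frequency switching spikes, and it also forms a small loop that can pick up radiated noise, which is part of why I am asking.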