per_lube
Advanced Member level 4
Hi all,
I have a zero-bias Schottky-diode-based power detector. It gives a usable DC output down to about -30 dBm input power.
I want to measure input impedance of this circuit.
I simply connected the input of the circuit to VNA port 1; the S11 magnitude plot shows about -3 dB.
1. Is the method I used to measure input impedance correct? If not, what is the proper method?
2. What can I do to improve the input matching?
If I use an impedance matching circuit, will it improve the sensitivity (i.e., will it detect signals below -30 dBm)?
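For context on question 1: an S11 measurement is a valid way to get input impedance, but you need the complex value (magnitude and phase), not just the magnitude. A short sketch, assuming a 50 Ω VNA reference and a hypothetical -3 dB reading with 0° phase for illustration, converts S11 to input impedance via Z_in = Z0(1 + Γ)/(1 - Γ):

```python
import cmath

Z0 = 50.0  # assumed VNA reference impedance in ohms

def gamma_from_db(mag_db, phase_deg):
    """Complex reflection coefficient from S11 magnitude (dB) and phase (deg)."""
    return 10 ** (mag_db / 20) * cmath.exp(1j * cmath.pi * phase_deg / 180.0)

def input_impedance(gamma, z0=Z0):
    """Input impedance from the reflection coefficient: Z = Z0*(1+G)/(1-G)."""
    return z0 * (1 + gamma) / (1 - gamma)

# Illustration only: |S11| = -3 dB, phase assumed 0 degrees
g = gamma_from_db(-3.0, 0.0)
print(abs(g))                 # |Gamma| of roughly 0.71 -- half the power is reflected
print(input_impedance(g))     # purely illustrative; real phase data is needed
```

Note that -3 dB corresponds to |Γ| ≈ 0.71, meaning about half the incident power is reflected, so the detector is indeed poorly matched; the actual Z_in depends on the measured phase.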
cheers,
per_lube