flamingo
Member level 4
For an IF variable gain amplifier (VGA) placed before an 8-bit ADC, what noise specification should the VGA meet? The VGA output is constant at 1 V peak-to-peak, and the channel bandwidth is about 8 MHz (DVB application).
Can I calculate the maximum acceptable output noise density like this? First, for an 8-bit ADC the ideal SNR is about 6.02×8 + 1.76 ≈ 50 dB, so the SNR at the VGA output should also be greater than 50 dB. Then:
SNRvga = 20·log10(vrms / (vn·√B)) > 50 dB,
where vrms is the rms output voltage of the VGA, i.e. 354 mVrms for a 1 Vpp sine; vn·√B is the total rms output noise; B is the channel bandwidth; and vn is the noise voltage density in V/√Hz. This gives vn < 400 nV/√Hz.
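As a quick sanity check, the same arithmetic can be worked through in a few lines of Python (a sketch assuming a sinusoidal 1 Vpp output and a flat noise density over the 8 MHz band):

```python
import math

# Assumed values from the post above
vpp = 1.0                       # VGA output, V peak-to-peak
vrms = vpp / (2 * math.sqrt(2)) # ~0.354 Vrms for a sine wave
snr_db = 50.0                   # target SNR (~ideal 8-bit ADC: 6.02*8 + 1.76)
bandwidth_hz = 8e6              # DVB channel bandwidth

# Total integrated noise allowed (rms), then spot density for flat noise
vn_total = vrms / 10 ** (snr_db / 20)            # ~1.12 mVrms
vn_density = vn_total / math.sqrt(bandwidth_hz)  # V/sqrt(Hz)

print(f"max output noise density: {vn_density * 1e9:.0f} nV/sqrt(Hz)")
```

This lands at roughly 395 nV/√Hz, consistent with the ~400 nV/√Hz figure quoted above.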
Is there any problem? Thank you!!