Question about spec in AD


segabird

Some ADCs are described with 80 MSPS but with a 200 MHz signal bandwidth.
Who can explain this?
3x
 

A 200 MHz signal bandwidth means the clock frequency is 2*200 = 400 MHz (Shannon law).

So each sample is delivered at the output of the ADC after 400 MHz / (80 MSample/s) = 5 clocks.
 

mandrei said:
A 200 MHz signal bandwidth means the clock frequency is 2*200 = 400 MHz (Shannon law).

So each sample is delivered at the output of the ADC after 400 MHz / (80 MSample/s) = 5 clocks.

Sorry, but I could not understand the figure of 80 Msamples/s exactly. What is this figure used for?

Thanks
 

Hi atmaca,

Here is my judgment:

The guys who produced the ADC are clever; they will put the maximum performance of their device in the data sheet.

For example, to measure the bandwidth of the ADC they will increase the sample clock frequency until the internal sample-and-hold circuit "collapses".

Say the maximum CLOCK frequency they observed is 400 MHz.

So the maximum input DATA frequency (which is termed the "bandwidth" of the input signal) will be at most half of the clock frequency (Shannon law).

They will write in the data sheet: Bandwidth = 200 MHz.

Now, they cannot deliver 200,000,000 samples per second at the output of the ADC, because the samples need some internal processing (that means internal delay).

They can deliver only 80,000,000 samples/second.

In terms of clock frequency they need

(400,000,000 clocks/second) / (80,000,000 samples/second) = 5 clocks/sample

to deliver a single sample at the output.
 

There is some truth in what mandrei said, but it is not quite correct.

First of all, you have to distinguish between the sample-and-hold circuit and the processing afterwards. What happens during the sampling phase is that a small capacitor is charged. The bigger this capacitor is, the longer this phase takes. So as the input signal frequency rises, at a certain frequency the capacitor no longer has enough time to charge fully during the sample phase. The frequency where the digitized signal drops to half of its passband power (the -3 dB point) is what we call the analog bandwidth, or bandwidth for short, of the ADC. This bandwidth depends on the sample-and-hold circuit.
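As a minimal numerical sketch of that point (with assumed, hypothetical switch-resistance and hold-capacitor values): while tracking, the sample-and-hold behaves roughly like an RC low-pass, and its -3 dB corner is the analog bandwidth.

Code:
import math

# Hypothetical sample-and-hold front end (assumed values, not from any data sheet).
r_switch = 50.0      # sampling-switch on-resistance, ohms
c_hold = 4e-12       # hold capacitor, farads

# While tracking, the S/H looks like an RC low-pass; its -3 dB corner
# is what the data sheet calls the analog bandwidth.
f_3db = 1 / (2 * math.pi * r_switch * c_hold)
print(f_3db / 1e6, "MHz")     # ~796 MHz for these assumed values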

Now suppose we have a very fast sample-and-hold circuit. We can use this sampler but digitize at a much lower speed. E.g., we can digitize a 150 MHz IF signal with a 16 MHz sampling clock, as long as the sample-and-hold circuit allows it. We call this subsampling. This is why the sampling clock can be lower than the bandwidth. But note that any two signals spaced a multiple of 16 MHz apart look like the same signal to the ADC. This is why anti-aliasing filters are needed: to suppress these aliasing components.
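Here is a rough Python sketch of that subsampling example, using the numbers from the post (the folding arithmetic only, no particular ADC assumed), showing where a 150 MHz IF lands when sampled at 16 MHz.

Code:
def alias_frequency(f_in, f_s):
    """Apparent (folded) frequency of a tone at f_in when sampled at f_s."""
    f = f_in % f_s              # fold into [0, f_s)
    return min(f, f_s - f)      # mirror the upper half back into [0, f_s/2]

f_s = 16e6      # sampling clock, Hz
f_if = 150e6    # IF from the example above, Hz

print(alias_frequency(f_if, f_s) / 1e6, "MHz")   # -> 6.0 MHz
# Any input at 150 MHz +/- k*16 MHz folds onto the same 6 MHz output,
# which is why a band-select (anti-aliasing) filter must precede the ADC.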
 

Hi,

The analog bandwidth (typically defined as the 3 dB bandwidth) of the ADC has to be large enough to leave the frequencies in the Nyquist band unaltered when Nyquist (baseband) sampling is used.
In passband sampling (digital down-conversion) the signal is sampled at an intermediate frequency, and the analog bandwidth of the ADC has to be correspondingly larger. This is a typical application in software-defined radio. ADCs designed for this have a very large bandwidth (compared with Fsampling/2).
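A small sketch of that idea, using assumed figures (the 80 MSPS rate from the question and a hypothetical 150 MHz IF), showing which Nyquist zone the IF falls into and why the analog bandwidth must be much larger than Fsampling/2:

Code:
def nyquist_zone(f_in, f_s):
    """1 = 0..fs/2, 2 = fs/2..fs, 3 = fs..3fs/2, and so on."""
    return int(f_in // (f_s / 2)) + 1

f_s = 80e6      # sample rate from the question, Hz
f_if = 150e6    # hypothetical IF for passband sampling, Hz

print("Nyquist zone:", nyquist_zone(f_if, f_s))            # -> 4
print("analog bandwidth needed: >=", f_if / 1e6, "MHz")    # -> 150 MHz
# Baseband (zone 1) sampling only needs about fs/2 = 40 MHz of analog
# bandwidth; sampling a 150 MHz IF needs the front end flat out to 150 MHz,
# which is why such ADCs quote a bandwidth far larger than fs/2.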
Regards

Z
 

segabird said:
Some ADCs are described with 80 MSPS but with a 200 MHz signal bandwidth.
Who can explain this?
3x

80 MSPS = 80 Msamples/s, so you get exactly one new sample every 1/(80M) seconds. The clock frequency of the ADC is usually 80 MHz.

The 200 MHz bandwidth means that you can sample input signals up to 200 MHz, but remember Nyquist: if your signal is higher than 80 MHz, you get an output at Fin - 80 MHz ...
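A quick numerical check of that statement (a sketch assuming NumPy and an arbitrary 100 MHz test tone): sampled at 80 MSPS, the tone shows up in the spectrum at 100 - 80 = 20 MHz.

Code:
import numpy as np

f_s = 80e6                       # 80 MSPS sample rate
f_in = 100e6                     # test tone above the sample rate (assumed)
n = 4096
t = np.arange(n) / f_s
x = np.cos(2 * np.pi * f_in * t)

spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1 / f_s)
print("peak at", freqs[np.argmax(spectrum)] / 1e6, "MHz")   # ~20 MHz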

Hope this helps.

Bastos
 


3x
 
