Welcome to EDAboard.com — an international Electronics Discussion Forum focused on EDA software, circuits, schematics, RF, analog design, PCB, and more.

ADC conversion time and acquisition time

Djaferbey · Junior Member level 1 · Joined May 20, 2024
Hello,

I have a couple of questions regarding ADC timing characteristics. Firstly, what controls the timing during the conversion process, and what determines the timing of the switch during the acquisition phase? How can I read the datasheet to determine the interval between each acquisition of a voltage by the ADC and the time it takes to convert this voltage to an output code?

I understand that the timing controlling the switch must be sufficient to charge the sample capacitor to the input voltage of the ADC with an accuracy of at least ±1/2 LSB.
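As a back-of-the-envelope check, that ±1/2 LSB requirement can be sketched in Python, assuming a single-pole RC input network (the 27.7 ns figure is the datasheet's typical acquisition time):

```python
import math

def time_constants_for_half_lsb(n_bits):
    # Single-pole RC settling: the residual error after k time
    # constants is e^-k. For an error <= 1/2 LSB we need
    # e^-k <= 2^-(n_bits + 1), i.e. k >= (n_bits + 1) * ln(2).
    return (n_bits + 1) * math.log(2)

k = time_constants_for_half_lsb(18)   # ~13.2 time constants for 18 bits
t_acq_ns = 27.7                       # typical acquisition time (datasheet)
tau_max_ns = t_acq_ns / k             # largest tolerable RC time constant
print(f"k = {k:.2f}, tau_max = {tau_max_ns:.2f} ns")
```

With the typical 27.7 ns acquisition window this gives a maximum RC time constant of roughly 2.1 ns for the driver-plus-sample-capacitor network.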

Thank you for your assistance.




[Attached image: datasheet timing specifications table]
 
Hi,

Did you see Figure 13 in the datasheet?

Firstly, what controls the timing during the conversion process, and what determines the timing of the switch during the acquisition phase?
Datasheet says:

Internal Conversion Clock
The LTC2387-18 has an internal clock that is trimmed to achieve a maximum conversion time of 63ns. With a typical acquisition time of 27.7ns, throughput performance of 15Msps is guaranteed.


I hope you don't want us to describe all the internal logic. There are papers explaining general SAR ADC operation, which is usually more than you need in order to use the ADC.
I also hope you don't want to copy this IC.

Acquisition takes place whenever the ADC is not busy converting (starting slightly after the conversion finishes).
You see the symbol is "tACQ". Just do a PDF search for "tacq", or only "acq", to find the related information.

the time it takes to convert this voltage to an output code?
The abbreviation for time is "t", and the abbreviation for conversion is "conv", so do a search for "tconv".
Of course you can't know all the abbreviations, but simply look for the obvious ones.

And since "tCONV" is rather important information, you'll find it in the table you already posted — 2nd line.

Klaus
 
Hello Klaus,

I'm definitely not trying to copy the IC. I'm doing a project for software-defined radio applications and I want to test this ADC's dynamic range. For that I need to understand the time it takes to acquire the data and to convert it.


The reason I'm asking relates to the driver for the ADC. The driver has a direct impact on how accurately the input signal is copied, based on the RC time constant (charging the sample capacitor).
 
RTFM.

The data sheet explicitly tells you, as Klaus pointed out, what the acquisition time and conversion time are, and you keep asking “what is the acquisition time and conversion time?”
 
First of all, you need to chill and be respectful. I have already read that statement and understood it, but there is clearly a reason why I'm confused:

"The LTC2387-18 has an internal clock that is trimmed to achieve a maximum conversion time of 63ns. With a typical acquisition time of 27.7ns."
[Attached image: datasheet timing diagram]

The acquisition time is 27.7ns when the cycle time is 66.6ns, but as you can see above there is clearly a variation in the acquisition time, and that's why I'm asking. If you don't have an answer to the question, then don't waste my time writing something useless.
 
Hi,

I see nothing disrespectful.

It's simply not clear what answer you expect.

--> Ask a clear question.
I can see no variation in acquisition time. It is always tCYC - 39ns. What do you mean?

Klaus
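That relationship can be tabulated quickly (a sketch: the fixed 39 ns is the conversion-plus-delay overhead implied by tACQ = tCYC − 39 ns above):

```python
def t_acq_ns(t_cyc_ns, overhead_ns=39.0):
    # Acquisition runs whenever the ADC is not busy converting,
    # so tACQ = tCYC minus the fixed conversion overhead.
    return t_cyc_ns - overhead_ns

for f_msps in (15, 10, 5):
    t_cyc = 1000.0 / f_msps   # cycle time in ns
    print(f"{f_msps} Msps: tCYC = {t_cyc:.1f} ns, tACQ = {t_acq_ns(t_cyc):.1f} ns")
```

At the full 15 Msps rate this reproduces the ~27.7 ns typical figure; slower sample rates simply leave more acquisition time.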
 
I did not see any "ready" signal that triggers the acquired data to be externally clocked out, and the LVDS clock rate must be at least 18 × 15 = 270 Mbps to transfer 18 bits at a 15 Msps rate.
[Attached image: datasheet timing diagram]

I found their conversion timing drawn NOT TO SCALE, and the LVDS timing somewhat lacking in details on streaming. But I understand it will draw a huge burst of current that can be problematic.

I could not answer his questions without more clarification.
I can imagine a lot of questions.
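The minimum serial data rate worked out above can be sanity-checked as follows (the `lanes` parameter is only an illustrative assumption, not a claim about this part's interface modes):

```python
def min_lvds_bit_rate_mbps(bits, f_s_msps, lanes=1):
    # Every conversion result (`bits` wide) must be shifted out
    # within one sample period, so each lane must carry at least
    # bits * f_s / lanes.
    return bits * f_s_msps / lanes

print(min_lvds_bit_rate_mbps(18, 15))     # 270.0 Mbps on a single lane
print(min_lvds_bit_rate_mbps(18, 15, 2))  # 135.0 Mbps per lane if split over two
```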
 
LTC2387 timing spec can be divided into two segments:
1. Analog input and acquisition relation to CNV input.
2. Digital interface, relation between CNV and LVDS interface.

I understand that you are asking about the first topic. The datasheet gives requirements for the analog input signal and CNV to achieve the specified performance. But it does not completely specify acquisition timing, e.g. expectable tACQ and tAP jitter. Besides the minimal tACQ, the paragraph below should be considered:
The LTC2387-18 is optimized for pulsed inputs that are fully settled when sampled, or dynamic signals up to the Nyquist frequency (7.5MHz). Input signals that change faster than 300mV/ns when they are sampled are not recommended. This is equivalent to an 8VP-P sine wave at 12MHz.
Acquisition behaviour is also implicitly specified by the sampling input circuit.
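The quoted 300 mV/ns limit and the 8VP-P / 12 MHz equivalence can be checked numerically: the peak slew of a sine of peak-to-peak amplitude Vpp is pi * f * Vpp.

```python
import math

def peak_slew_mv_per_ns(vpp, f_hz):
    # d/dt [(Vpp/2) * sin(2*pi*f*t)] peaks at pi * f * Vpp (in V/s);
    # 1 V/s = 1e-6 mV/ns, hence the scale factor.
    return math.pi * f_hz * vpp * 1e-6

print(f"{peak_slew_mv_per_ns(8.0, 12e6):.1f} mV/ns")  # ~301.6, right at the limit
```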
 
System design must create specs, top-down or bottom-up, for ACQ stability in terms of both analog- and digital-domain errors from DC to the Nyquist rate — in both the phase and amplitude of the spectral domain — and timing-domain errors including: excess slew rate, excess settling time, excess group delay, and noise interference.

Analog performance must have system design specs from bottom-up specs for an "Optimal Receiver" and top-down limits from ADC specs and > 400 Mbps LVDS timing.

If this were a parallel output, the timing might be trivial, but with serial LVDS and size reduction, the added complexity requires a skew-synchronized analog/digital state machine.
I believe the ACQ+ edge must be more stable than an analog LO's SNR using direct conversion. In an SDR system design, the phase-noise spectral envelope must be defined in order to define the required ACQ jitter. This would exclude an FPGA-generated ACQ, as warned in the p.19 CNV timing notes, and implies a stable synchronous state machine is needed. I expect this requires a 15 MHz VTCXO with a full environmental spec < 1 ppm and low phase noise to match the 66.6 ns CNV+ timing, not the typical uC XO. The CNV- edge occurs in the middle of the last bit (D0) being transferred at > 400 Mbps; this is needed to guarantee the last bit is transferred with adequate slack time before the next stable ACQ+ pulse. One may evaluate whether these two clocks must be phase-synchronized or not. Even though 18 bits every 15 MHz cycle needs > 270 Mbps, tACQ = 66.6 − 39 = 27.7 ns.

Concern over: "CNV interval: 63ns max. With a typical acquisition time of 27.7ns."

This ACQ time is the analog period for tracking a new value (which must not exceed the slew-rate spec), and CNV triggers a hold {tAP} in zero time: {track & hold} vs {sample & hold}.
My previous answer implies there is a 3.6 ns minimum of margin below 66.6 ns for the LVDS signals to achieve this 15 MHz sample rate with up to > 400 Mbps LVDS burst data transfer.

You have ~27 ns of analog capture time, so ensure your Optimal Receiver design's signal conditioners do not degrade group delay, yet protect against dV/dt interference with ESD diodes (which add ~2 pF of capacitance), and have a channel bandwidth that does not exceed the Nyquist BW — not just at −3 dB, but at your tolerance of −x dB for SNR and error-rate tolerance.
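To illustrate how the added ESD capacitance moves the channel corner frequency (a sketch with hypothetical values: a 50 Ω source and an assumed 10 pF load, neither taken from the datasheet):

```python
import math

def f3db_mhz(r_ohm, c_pf):
    # Single-pole RC corner: f_-3dB = 1 / (2 * pi * R * C), in MHz.
    return 1.0 / (2 * math.pi * r_ohm * c_pf * 1e-12) / 1e6

print(f"{f3db_mhz(50, 10):.0f} MHz")  # ~318 MHz without ESD diodes
print(f"{f3db_mhz(50, 12):.0f} MHz")  # ~265 MHz with 2 pF of ESD capacitance added
```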

The CNV timing must be as precise as a LO phase noise to have as good a performance as traditional Analog radios. { or better if desired}

I would consult with any Eval board guidelines and make no deviations without understanding the consequences.
 
