I use a 256 samples/sec ADC to measure a time-series voltage signal. The ADC stores the data points (N points in total) and, once the measurement is done, transfers them to the PC. We have a C++ program that does some DSP on the data. Specifically, it performs three stages of Kaiser-windowed FIR filtering, each followed by decimation (by 4, 2, and 2), and then computes a Hann-windowed FFT power spectral density (PSD) estimate.
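For context, here is how I understand the sample-rate bookkeeping through the three decimation stages. This is only a sketch based on the factors stated above (4, 2, 2) and the 256 samples/sec input rate, not the actual program; the Kaiser FIR filtering itself is omitted:

```cpp
// Sketch: track sample rate and record length through the decimation chain.
#include <cstdio>

int main() {
    double fs = 256.0;              // ADC sample rate, samples/sec
    long   n  = 50000;              // points in one measurement
    const int factors[] = {4, 2, 2};

    std::printf("input: fs = %.1f Hz, N = %ld\n", fs, n);
    for (int m : factors) {
        fs /= m;                    // each stage lowers the sample rate...
        n  /= m;                    // ...and shortens the record by the same factor
        std::printf("after decimate-by-%d: fs = %.1f Hz, N = %ld\n", m, fs, n);
    }
    // Result: fs = 16 Hz and N = 3125 points feeding the PSD stage.
    return 0;
}
```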
I noticed that with an N = 50,000 point measurement, the minimum frequency we get is 0.03 Hz.
Since the program was not written by me and is uncommented, it is hard to follow the details. My question: with 50,000 points at 256 samples/sec, the total sampling time is 195.3 sec. This should correspond to a minimum frequency of about 0.005 Hz, yet we get 0.03 Hz. What is the reason for the discrepancy?
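For reference, this is the arithmetic I am using (a minimal sketch, not the actual program). As far as I understand, decimation by itself should not change this figure, since it lowers the sample rate and shortens the record by the same factor, leaving the total record length T, and hence 1/T, unchanged:

```cpp
// Expected frequency resolution from the full record vs. the observed 0.03 Hz.
#include <cstdio>

int main() {
    const double fs = 256.0;    // ADC sample rate, samples/sec
    const long   N  = 50000;    // points in the measurement

    double T  = N / fs;         // total record length: 195.3 s
    double df = 1.0 / T;        // expected minimum frequency / bin spacing: ~0.00512 Hz

    std::printf("T = %.1f s, expected df = %.5f Hz, observed = 0.03 Hz\n", T, df);
    return 0;
}
```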
NOTE: the ADC card takes its input from the analog output of a lock-in amplifier, which has a 16-bit ADC and DAC.