I'm trying to characterize a transceiver, see post here. I'm not sure if this is an RF or a DSP problem.
The setup is as follows:
The transmitter and receiver are the same: an AD9361 connected to an FPGA.
I'm setting a single-tone transmit frequency, then setting the receive window to capture. The aim is to characterize the roll-off, so I move the transmit frequency and recapture the window each time. I believe I'm setting the gain (or attenuation) for both the transmit and receive AD9361s.
The first 1024, 2048, or even 4096 samples are fed into an FFT so I can read off the peak value at that frequency.
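For reference, the per-capture processing is roughly along these lines (a minimal numpy sketch; the sample rate, FFT length, and Hann window are placeholders of mine, and the synthetic tone just stands in for a real capture):

```python
import numpy as np

fs = 30.72e6   # assumed sample rate; substitute the actual AD9361 sample rate
nfft = 4096    # one of the window lengths mentioned above

def tone_peak(iq, fs, nfft):
    """Return (frequency offset, magnitude) of the strongest FFT bin."""
    x = iq[:nfft] * np.hanning(nfft)          # Hann window to tame leakage (my addition)
    spec = np.fft.fftshift(np.fft.fft(x))
    freqs = np.fft.fftshift(np.fft.fftfreq(nfft, d=1.0 / fs))
    k = np.argmax(np.abs(spec))
    return freqs[k], np.abs(spec[k])

# stand-in capture: a tone 1 MHz above centre plus a little noise
t = np.arange(nfft) / fs
iq = np.exp(2j * np.pi * 1e6 * t) + 0.01 * (np.random.randn(nfft) + 1j * np.random.randn(nfft))
print(tone_peak(iq, fs, nfft))
```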
The receive window is exactly the same; all that has changed is that the transmit frequency has moved from 1902 MHz to 1903 MHz, and the time at which the test is run.
The phasor diagram has some fuzz, which is why I plotted the magnitude on the left-hand side above. The mean value for both frequencies is roughly the same, so I'm assuming the transmit power isn't fluctuating. As you can see on the right-hand side, the FFT looks different. Remember, the exact same script and process are run; the only difference is that I've moved the frequency 1 MHz to the right.
If I plot all the different captures I end up with the following.
Playing around with Welch averaging generates the following inverted nipple.
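(For reference, the Welch averaging is essentially scipy.signal.welch run over the capture, along these lines; the segment length and overlap are values I've picked for illustration, not necessarily what produced every plot:)

```python
import numpy as np
from scipy.signal import welch

fs = 30.72e6        # assumed sample rate
nperseg = 1024      # segment length for the averaging (my choice)

# placeholder capture: replace with the real receive window
t = np.arange(16384) / fs
iq = np.exp(2j * np.pi * 1e6 * t) + 0.01 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

f, pxx = welch(iq, fs=fs, nperseg=nperseg, noverlap=nperseg // 2,
               return_onesided=False, scaling="density")
f, pxx = np.fft.fftshift(f), np.fft.fftshift(pxx)   # centre the spectrum on 0 Hz
```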
I'm at a loss to understand what's happening from a first-principles point of view.
If the signal I am capturing is out of phase on arrival, surely after so many samples it doesn't matter.
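A quick numpy sanity check of that reasoning: a constant phase offset on arrival multiplies every sample by the same unit-magnitude complex number, so the FFT magnitudes come out identical (a phase that drifts during the capture is another matter):

```python
import numpy as np

nfft = 1024
t = np.arange(nfft) / 30.72e6               # assumed sample rate
x = np.exp(2j * np.pi * 1e6 * t)            # tone as captured
y = x * np.exp(1j * 0.7)                    # same tone with an arbitrary constant phase offset

print(np.allclose(np.abs(np.fft.fft(x)), np.abs(np.fft.fft(y))))   # True
```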
--- Updated ---
Here are some pretty pictures that show the magnitude fluctuation. Does this mean my transmitter is rubbish?