Thank you all for your replies.
I have a few more questions, if you do not mind, especially for caosl.
As I understand it, the signal extracted by "sinusx.m" is just the sinusoidal component of the output bitstream of the SDM.
The toolbox then uses this extracted sinusoid to obtain the "noise" as follows:
[noise] = [output bitstream of SDM] - [sinusoid signal].
I guess [noise] above is really "noise component" + "distortion component",
but please correct me if I am wrong. If I am correct, I suppose I could also find the harmonic distortion by using the input signal.
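To check my own understanding of that extraction step, here is a rough numpy sketch of what I think "sinusx.m" does conceptually (this is my own toy code, not the toolbox; the names `fit_sinusoid`, the fake 1-bit "output", and all the numbers are mine):

```python
import numpy as np

def fit_sinusoid(y, f, fs):
    """Least-squares fit of A*sin + B*cos at the known input frequency f.
    My own sketch of the idea behind sinusx.m, not the toolbox code."""
    n = np.arange(len(y))
    s = np.sin(2 * np.pi * f * n / fs)
    c = np.cos(2 * np.pi * f * n / fs)
    # Project y onto the sin/cos basis at frequency f (least squares)
    A, B = np.linalg.lstsq(np.column_stack([s, c]), y, rcond=None)[0]
    return A * s + B * c

# Fake a 1-bit SDM output bitstream just for illustration:
# sign of (sinusoid + dither) -- NOT a real modulator model.
fs, f = 1.0, 0.013
n = np.arange(4096)
x = 0.4 * np.sin(2 * np.pi * f * n / fs)      # the applied input sinusoid
rng = np.random.default_rng(0)
bitstream = np.sign(x + 0.5 * rng.standard_normal(len(x)))

sinusoid = fit_sinusoid(bitstream, f, fs)
noise = bitstream - sinusoid   # [noise] = [output bitstream] - [extracted sinusoid]
```

If this matches what the toolbox actually does, please confirm.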
However, I do not understand why it works that way. Why do we not compute
noise = [output bitstream of SDM] - [input signal],
where [input signal] is the one used in Simulink, i.e. the signal applied to the SDM with an amplitude of (0.5-pi/OSR) at its specific frequency?
[Output bitstream of SDM] should consist of [input signal] + [noise], where [noise] here is the sum of the noise and distortion components.
When I use the real input signal to the SDM to obtain the noise, the SNR and ENOB get worse, but if I use the signal extracted by "sinusx.m", the results are fine, as expected.
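To make my observation concrete, here is a toy numpy sketch (no real SDM here; I just fake the output as a slightly gain- and phase-shifted fundamental plus white noise, and `snr_db` and all the numbers are my own invention):

```python
import numpy as np

fs, f, N = 1.0, 0.013, 4096
n = np.arange(N)

# The input I apply in Simulink:
x = 0.4 * np.sin(2 * np.pi * f * n / fs)

# Suppose the fundamental comes back in the output bitstream with a
# slightly different gain and phase (made-up values, just for illustration):
fund = 0.38 * np.sin(2 * np.pi * f * n / fs + 0.05)
rng = np.random.default_rng(1)
out = fund + 0.01 * rng.standard_normal(N)   # toy "output bitstream"

def snr_db(signal, noise):
    """Ratio of mean signal power to mean noise power, in dB."""
    return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

noise_vs_input = out - x      # subtracting the raw Simulink input
noise_vs_fit   = out - fund   # subtracting the extracted sinusoid

print(snr_db(fund, noise_vs_input))  # worse: leftover fundamental inflates the "noise"
print(snr_db(fund, noise_vs_fit))    # reflects only the true noise floor
```

In this toy case the SNR computed against the raw input comes out lower, which looks like what I see in my simulations, but I would like to know whether this is actually the reason.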
Could "caosl" or anybody explain what I am missing?
Thank you so much,