sampling theorem matlab
Hello ppl, I'm asking myself why this is happening.
I've implemented the following system in Simulink: basically I feed a 1 kHz pure sine wave into a multiplier together with a pulse train whose period is 10 samples and whose pulse width is 1 sample (10 us). So effectively the sine wave is being sampled at 10 kHz.
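For reference, the multiply-by-a-pulse-train setup can be sketched outside Simulink. This is my own minimal numpy stand-in (not the actual model), assuming a 10 us base step, i.e. a 100 kHz simulation rate, so a 10-sample pulse period gives the 10 kHz effective rate:

```python
import numpy as np

fs_sim = 100_000                      # assumed base rate: one step per 10 us
t = np.arange(0, 10e-3, 1 / fs_sim)   # 10 ms of simulation time
sine = np.sin(2 * np.pi * 1_000 * t)  # 1 kHz pure sine wave
pulse = (np.arange(t.size) % 10 == 0).astype(float)  # width 1 sample, period 10
y = sine * pulse                      # sine effectively sampled at 10 kHz

# Between pulses the product is zero; at pulse instants it equals the sine.
print(np.count_nonzero(pulse))        # number of pulses in 10 ms
```

Every 10th sample of y carries a sine value and the rest are zero, which is exactly the impulse-sampled signal described above.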
When I set the simulation time to 10 ms and obtain the frequency magnitude spectrum, the amplitudes of my frequency components vary (the sample indices here run from 0 to 1000, therefore 1001 samples).
When I set the simulation time to 9.99 ms and obtain the frequency magnitude spectrum, the amplitudes now remain constant! (0 to 999 equals 1000 samples)
So why does varying the number of samples (through the simulation time) have such a drastic effect? To transform into the frequency domain, MATLAB's fft function is used. I've tried checking doc fft to see whether the number of samples has an effect on the amplitude, but I can't actually figure out what's happening.
this is the MATLAB code being used,

stem(y, 'b.-')                       % time plot - OK
freq = fft(y);                       % change signal to frequency domain
plot([0:length(y) - 1], abs(freq))   % frequency magnitude spectrum

note: y is the vector of logged samples (its length is the number of samples)
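The two cases can be reproduced without Simulink. Here's a minimal numpy sketch of my own (standing in for the MATLAB code above, using the 1 kHz sine and the effective 10 kHz rate): with N = 1000 the record holds exactly 100 full periods, so the 1 kHz component lands exactly on bin 100; with N = 1001 it lands between bins (f0*N/fs = 100.1) and the magnitudes come out different:

```python
import numpy as np

fs = 10_000   # effective sampling rate from the model
f0 = 1_000    # sine frequency

for N in (1000, 1001):                  # 9.99 ms vs 10 ms worth of samples
    n = np.arange(N)
    y = np.sin(2 * np.pi * f0 * n / fs)
    mag = np.abs(np.fft.fft(y))
    # N = 1000: the tone sits exactly on bin 100, peak magnitude N/2 = 500,
    # and essentially all other bins are zero.
    # N = 1001: the tone sits at "bin 100.1", so its energy is smeared
    # across many neighbouring bins and the peak is lower than N/2.
    print(N, mag.max(), np.count_nonzero(mag > 1))
```

The count of bins above a small threshold jumps from 2 (the positive- and negative-frequency peaks) to hundreds when N changes from 1000 to 1001.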
attached is a link to the Simulink model
**broken link removed**