David83

Hello,
I am trying to simulate an OFDM system where the transmitted symbols are BPSK. The system goes like this:
- Generate a binary stream of N bits
- Convert it to a bipolar (BPSK) stream
- Take the oversampled IFFT (multiplied by K) with oversampling ratio Ns (K = N*Ns) and add the CP
- D/A conversion and baseband transmission
- Add noise
- A/D conversion, CP removal, and FFT (divided by K)
- Take the first N symbols
- Do ML detection (a code sketch of this chain is included right after this list)
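For concreteness, here is a minimal Python/NumPy sketch of that chain. No code was posted, so the parameter values (N = 64, Ns = 4, cp_len = 16, SNR = 10) and the ideal channel H_k = 1 are just placeholder assumptions, not the actual setup:

```python
import numpy as np

# ---- placeholder parameters (not from the original post) ----
N      = 64                  # number of subcarriers / BPSK symbols
Ns     = 4                   # oversampling ratio
K      = N * Ns              # oversampled IFFT size
cp_len = 16                  # cyclic prefix length (assumed)
SNR    = 10.0                # linear SNR (assumed)

# generate a binary stream of N bits and convert to a bipolar (BPSK) stream
bits = np.random.randint(0, 2, N)
X    = 2.0 * bits - 1.0                          # 0/1 -> -1/+1

# oversampled IFFT (multiplied by K): zero-pad the N symbols to K bins
X_pad = np.concatenate([X.astype(complex), np.zeros(K - N)])
x     = K * np.fft.ifft(X_pad)

# add the cyclic prefix
x_cp = np.concatenate([x[-cp_len:], x])

# "D/A + baseband transmission": ideal channel assumed here (H_k = 1)
y_cp = x_cp

# add complex noise with the scaling from the question below
n    = np.sqrt(N * Ns / SNR) * (np.random.randn(K + cp_len)
                                + 1j * np.random.randn(K + cp_len))
y_cp = y_cp + n

# A/D, remove CP, FFT (divided by K), keep the first N subcarriers
Y = np.fft.fft(y_cp[cp_len:]) / K
Y = Y[:N]

# ML detection for BPSK over an ideal channel: sign of the real part
bits_hat = (Y.real > 0).astype(int)
print("BER:", np.mean(bits_hat != bits))
```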
My question is: how do I generate the noise so that the SNR on subcarrier k is |H_k|^2 * SNR? I do it like this:
\[n = \sqrt{\frac{N\,N_s}{\mathrm{SNR}}}\,(n_R + j\,n_I)\]
where n_R and n_I are both standard (zero-mean, unit-variance) Gaussian random variables. Is this correct?
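To make the question concrete, here is the noise generation from the formula above, followed by an empirical check of the noise variance seen on a subcarrier after the FFT (divided by K) stage. The values of N, Ns and SNR are again placeholders; since BPSK symbols have unit power, the intended per-subcarrier SNR of |H_k|^2 * SNR corresponds to a noise variance of 1/SNR on each subcarrier (taking |H_k| = 1 here):

```python
import numpy as np

N, Ns, SNR = 64, 4, 10.0              # placeholder values, not from the post
K = N * Ns
trials = 2000

noise_var = np.empty(trials)
for t in range(trials):
    # noise exactly as in the formula: n = sqrt(N*Ns/SNR) * (n_R + j*n_I),
    # with n_R and n_I standard Gaussian per time-domain sample
    n  = np.sqrt(N * Ns / SNR) * (np.random.randn(K) + 1j * np.random.randn(K))
    Nk = np.fft.fft(n) / K            # noise as seen on the subcarriers
    noise_var[t] = np.mean(np.abs(Nk) ** 2)

print("measured per-subcarrier noise variance:", noise_var.mean())
print("variance implied by the |H_k|^2*SNR target:", 1.0 / SNR)
```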