How to model/estimate a channel in discrete time with a different sampling time?

siato

Hello,

I want to model and estimate a channel based on the 3GPP LTE EVA model, with a given power delay profile (a set of average powers and relative delays for the channel taps):
tau = [0, 30e-9, 150e-9, 310e-9, 370e-9, 710e-9, 1090e-9, 1730e-9, 2510e-9]; % relative delay (s)
pdb = [0, -1.5, -1.4, -3.6, -0.6, -9.1, -7.0, -12.0, -16.9]; % avg. power (dB)
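
For reference, this is how I currently generate one realization of the channel from that profile (just a rough sketch; the unit-power normalization and the independent complex Gaussian tap gains are my own assumptions):

plin = 10.^(pdb/10);                   % average tap powers, linear scale
plin = plin / sum(plin);               % normalize total power to 1 (my choice)
K    = numel(tau);
a    = sqrt(plin/2) .* (randn(1,K) + 1j*randn(1,K));   % complex Rayleigh tap gains
% one channel realization: h(t) = sum_k a(k) * dirac(t - tau(k))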

Unfortunately, the relative delays of the channel taps are not integer multiples of my sampling time \[T_s = 3.3333\times 10^{-7}\] s.
Here, I assumed that the number of taps to be estimated is \[ \frac{D_s}{T_s} \], where \[ D_s \] is the delay spread.
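
Concretely, continuing from the snippet above, this is how I build a sample-spaced reference channel at \[T_s\] (a sketch of my current thinking; rounding each delay to the nearest sample is my own simplification, and the sinc version is the band-limited alternative I am considering):

Ts = 3.3333e-7;                        % my sampling time (s)
L  = ceil(max(tau)/Ts) + 1;            % number of sample-spaced taps, roughly Ds/Ts
n  = 0:L-1;

% Reference 1: round each physical delay to the nearest sample
h_round = zeros(1,L);
idx = round(tau/Ts) + 1;
for k = 1:numel(tau)
    h_round(idx(k)) = h_round(idx(k)) + a(k);   % paths landing in the same bin add up
end

% Reference 2: band-limited (sinc) view of the same realization, i.e. the
% discrete-time equivalent channel after an ideal lowpass of bandwidth 1/Ts
h_sinc = zeros(1,L);
for k = 1:numel(tau)
    x = n - tau(k)/Ts;
    s = sin(pi*x)./(pi*x);  s(x==0) = 1;        % sinc(x)
    h_sinc = h_sinc + a(k) * s;                 % path k leaks into several nearby taps
end

The sinc version spreads each off-grid path over several taps, so I suspect it is the fairer "channel sampled at my rate" reference, but I am not sure.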

First of all, please let me know if there is anything wrong with my approach.
Also, how should I measure my estimation error in this case? Should I compare my estimates with the channel sampled at my sampling rate, or should I take the difference between the reconstructed frequency response of my estimate and the original frequency response?
Either way, because of the mismatch between the sampling grid and the tap delays, there is an intrinsic error in my estimate no matter how I measure it.
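
For the frequency-domain option, this is the comparison I had in mind (continuing from the snippets above; h_hat is only a placeholder for whatever my estimator actually produces, here set to the band-limited reference so the resulting number shows the intrinsic error that remains even with a perfect estimate of the L sample-spaced taps):

h_hat = h_sinc;                                  % placeholder estimate (hypothetical)
Nf = 512;
f  = (-0.5 + (0:Nf-1)/Nf) / Ts;                  % Nf frequencies across the sampling bandwidth
n  = 0:numel(h_hat)-1;
H_true = exp(-1j*2*pi*f.'*tau)  * a(:);          % true response from the continuous delays
H_est  = exp(-1j*2*pi*f.'*n*Ts) * h_hat(:);      % response reconstructed from my estimate
nmse   = sum(abs(H_true - H_est).^2) / sum(abs(H_true).^2)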

Thanks in advance
 
