Hello,
I was reading a paper on Resource Allocation in OFDM-based Cognitive Radios, in which the author states that the rate requirement of the CR users is uniformly set to 20 bits/symbol. What is the meaning of a rate expressed in bits/symbol? Isn't rate usually given in bits/second?
Please help.
This is probably referring to the spreading factor. If you encode each data symbol using a 32-bit code, then you have 32 bits per symbol (or, equivalently, we can say the spreading factor is 32).
Normally, to clarify the difference between data bits and bits transmitted into the channel, we instead say there are 32 "chips" per symbol.
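For example, just to put numbers on it: with a 32-chip spreading code, an information rate of 100 kbit/s goes into the channel at 3.2 Mchip/s, but the data rate is still 100 kbit/s.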
Hi, thanks for your reply.
But it's given that Rk = sum_{n=1..N} log(1 + p_k,n * H_k,n), and Rk = Rreq = 10 bits/symbol, which is the fixed rate requirement (equality constraint) of the CR users.
Rk is the rate of the kth user. N is the number of subchannels.
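To make sure I'm reading the formula right, here is a quick numerical sketch with made-up powers and gains (not values from the paper), assuming the log is base 2 since Rk is in bits:

```python
import numpy as np

# Illustrative values only (not from the paper).
# Assumption: the log in the rate formula is base 2, since R_k is in bits/symbol.
N = 8                                  # number of subchannels
rng = np.random.default_rng(0)
p = rng.uniform(0.1, 1.0, N)           # hypothetical power allocation p_k,n for one user k
H = rng.exponential(1.0, N)            # hypothetical channel gains H_k,n for the same user

R_k = np.sum(np.log2(1.0 + p * H))     # bits per OFDM symbol for user k
print(f"R_k = {R_k:.2f} bits/symbol")  # the equality constraint would demand R_k == R_req
```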
So does Rk refer to the data rate or the spreading factor as you mentioned? How do I interpret this?
If Rk is measured in bits/symbol, then my guess would be that it's a spreading factor (or something like that). Of course, this will be linked to data rate if the overall bitrate is constant (as is the case in many practical systems).
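In general the two units are tied together by the symbol rate: bits/second = (bits/symbol) x (symbols/second). A rough sketch with a made-up symbol rate (not a value from the paper):

```python
# Made-up OFDM symbol rate purely for illustration (not from the paper).
symbol_rate = 250e3                   # OFDM symbols per second (assumed)
R_req = 10                            # bits per OFDM symbol (the R_req quoted above)
bitrate = R_req * symbol_rate         # bits per second
print(f"{bitrate / 1e6:.2f} Mbit/s")  # 2.50 Mbit/s for these assumed numbers
```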