iVenky
In the paper "A Mathematical Theory of Communication" by Shannon, I couldn't understand a theorem (Theorem 4) under the topic "The Entropy of an Information Source":
Here 'H' is the entropy. Consider again the sequences of length N and let them be arranged in order of decreasing probability. We define n(q) to be the number we must take from this set, starting with the most probable one, in order to accumulate a total probability q for those taken.
Theorem 4:
\[
\lim_{N \to \infty} \frac{\log n(q)}{N} = H
\]
when q does not equal 0 or 1.
We may interpret log n(q) as the number of bits required to specify the sequence when we consider only the most probable sequences with a total probability q.
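To make the definition of n(q) concrete, here is a small numerical sketch of my own (the binary source, the probabilities P(0) = 0.9 and P(1) = 0.1, and q = 0.5 are assumptions for illustration, not taken from the paper): sort all length-N sequences by probability, count how many of the most probable ones are needed to accumulate total probability q, and watch log2 n(q) / N approach H as N grows.

```python
# Numerical sketch (my own illustration, not from Shannon's paper): for an
# i.i.d. binary source with P(0) = 0.9 and P(1) = 0.1, every length-N sequence
# containing k ones has the same probability, so n(q) can be counted by
# grouping sequences into these "types" instead of enumerating all 2^N of them.
from math import ceil, comb, log2

p0, p1 = 0.9, 0.1
H = -(p0 * log2(p0) + p1 * log2(p1))  # entropy, about 0.469 bits per symbol

def log_nq_over_N(N, q=0.5):
    """Return log2(n(q)) / N, where n(q) counts the most probable length-N
    sequences needed to accumulate a total probability of at least q."""
    # (probability of one sequence with k ones, number of such sequences)
    types = [(p0 ** (N - k) * p1 ** k, comb(N, k)) for k in range(N + 1)]
    types.sort(key=lambda t: t[0], reverse=True)  # most probable types first
    total_prob, n_q = 0.0, 0
    for seq_prob, count in types:
        if total_prob + seq_prob * count < q:
            total_prob += seq_prob * count  # take the whole type
            n_q += count
        else:
            n_q += ceil((q - total_prob) / seq_prob)  # take only part of it
            break
    return log2(n_q) / N

for N in (10, 100, 1000):
    print(f"N = {N:5d}   log2(n(0.5)) / N = {log_nq_over_N(N):.3f}   (H = {H:.3f})")
```

For this source H is roughly 0.47 bits per symbol, and the printed ratio should climb toward that value from below as N increases, which is what the theorem asserts in the limit.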
Can you explain to me how we get this theorem?
What is this n(q)? I couldn't understand the sentence I have underlined (the last sentence quoted above). How is log n(q) the number of bits required to specify the sequence?
Here's his paper:
**broken link removed**
It's under the topic "The Entropy of an Information Source" on pages 13 and 14. He has written the proof in the appendix, but I couldn't understand it.
Thanks a lot.