
What is the difference between entropy, differential entropy..


aliazmat

Entropy Rate

What is the difference between entropy, differential entropy, and entropy rate?

Anyone from information theory here?
 

Re: Entropy Rate

aliazmat said:
What is the difference between entropy, differential entropy, and entropy rate?

Anyone from information theory here?

Hi.

Entropy is a measure of the uncertainty of a random variable, expressed in bits. It depends only on the probability distribution of the random variable, not on the actual values the variable takes. In particular, the entropy of a discrete random variable is defined over a probability mass function.
When the random variable is continuous and described by a probability density function instead, the corresponding quantity is called the differential entropy.
The entropy rate applies to a stochastic process, i.e. a whole sequence of random variables rather than a single one. The joint entropy of the sequence grows as the number of variables increases, and the entropy rate is that joint entropy per symbol (in the limit as the sequence gets long).
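As a rough illustration of the three quantities (a minimal sketch; the fair/biased coins, the unit-variance Gaussian, and the two-state Markov chain are just assumed examples, not from the thread):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete pmf p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # treat 0 * log 0 as 0
    return -np.sum(p * np.log2(p))

# Entropy: depends only on the pmf, not on the symbol values.
print(entropy([0.5, 0.5]))            # 1.0 bit (fair coin)
print(entropy([0.9, 0.1]))            # ~0.469 bits (biased coin)

# Differential entropy: continuous case, defined over a density instead of a pmf.
# For a Gaussian with variance sigma^2 it equals 0.5*log2(2*pi*e*sigma^2).
sigma = 1.0
h_gauss = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_gauss)                        # ~2.047 bits for sigma = 1

# Entropy rate: for a stationary Markov chain with transition matrix P and
# stationary distribution mu, the rate is sum_i mu_i * H(P[i, :]).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])            # assumed two-state chain
evals, evecs = np.linalg.eig(P.T)     # stationary distribution = left eigenvector for eigenvalue 1
mu = np.real(evecs[:, np.isclose(evals, 1)]).flatten()
mu = mu / mu.sum()
rate = sum(mu[i] * entropy(P[i, :]) for i in range(len(mu)))
print(rate)                           # bits per symbol of the process (~0.56 here)
```

The entropy rate of this chain comes out below 1 bit per symbol because the next state is partly predictable from the current one.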
 

Re: Entropy Rate

:::
A measure of how much information can be transmitted and received with a negligible probability of error is called the channel capacity.

To determine this measure of channel potential, assume that a channel encoder receives a source symbol every Ts seconds.

With an optimal source code, the average code length of all source symbols is equal to the entropy rate of the source.

If S represents the set of all source symbols and the entropy rate of the source is written as H(S), the channel encoder will receive on average H(S)/Ts information bits per second.
"""

The lines above are from a paper. What I don't understand is this:

Does the entropy rate of the source, H(S), mean the information contained in one transmitted symbol, or the information contained in all of the possible source symbols the source has?
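Just to put numbers on the H(S)/Ts statement in that excerpt (a small sketch; the biased binary source and the 1 ms symbol period are my own assumptions, not values from the paper):

```python
import numpy as np

# Assumed example source: i.i.d. binary symbols with P(1) = 0.1.
p = 0.1
H_S = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # entropy rate in bits per source symbol
Ts = 1e-3                                            # assumed symbol period: one symbol every 1 ms

info_rate = H_S / Ts                                 # average information bits per second into the encoder
print(H_S)        # ~0.469 bits per source symbol
print(info_rate)  # ~469 information bits per second
```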
 

Re: Entropy Rate

aliazmat said:
Does the entropy rate of the source, H(S), mean the information contained in one transmitted symbol, or the information contained in all of the possible source symbols the source has?


Hi again.

You had better refer to "Elements of Information Theory" (Cover and Thomas).
Theorem 5.4.2 there gives the minimum expected codeword length per symbol. Consequently, the H(S) mentioned above can be written out as H(S) = H(s1, s2, ..., sn)/n (taken over a long block of n symbols), i.e. the joint entropy of the block divided by the block length. So it is the information rate per symbol, not the total information of all the source symbols.
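To see the "per symbol, not total" point numerically, here is a small sketch assuming an i.i.d. (memoryless) source, where the joint entropy of a length-n block is exactly n times the single-symbol entropy:

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a discrete pmf."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Assumed i.i.d. binary source with P(1) = 0.1.
p1 = np.array([0.9, 0.1])
n = 4

# Joint pmf over all length-n symbol blocks (product of marginals, since i.i.d.).
joint = np.array([np.prod([p1[s] for s in block])
                  for block in product(range(2), repeat=n)])

H_block = entropy(joint)       # H(s1, s2, ..., sn)
print(H_block / n)             # ~0.469 bits: joint entropy divided by n
print(entropy(p1))             # ~0.469 bits: single-symbol entropy, the same value
```

For a source with memory, H(s1, ..., sn)/n would be smaller than the single-symbol entropy, which is exactly why the per-symbol limit over n is used in the definition of the entropy rate.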
 
