
What is the entropy function and how to use it?

Re: Entropy - What is it?

Consider a hard drive. The file system is a linked list with a random nature: defragmentation recreates, or at least improves, the "order". What would be the equivalent of dQ/T here?
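For the information-theory side of that analogy, the usual measure of "order" is Shannon's H = -sum(p_i * log2(p_i)). A minimal Python sketch, with made-up disk-layout histograms purely for illustration (not real file-system data):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # convention: 0 * log2(0) = 0
    return float(-np.sum(p * np.log2(p)))

# Hypothetical histograms of where a file's blocks sit in 4 disk regions.
fragmented   = [0.25, 0.25, 0.25, 0.25]   # blocks scattered everywhere
defragmented = [1.00, 0.00, 0.00, 0.00]   # blocks in one contiguous region

print(shannon_entropy(fragmented))        # 2.0 bits: maximal disorder
print(shannon_entropy(defragmented))      # 0.0 bits: perfect order
```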
 

Re: Entropy - What is it?

I agree that entropy reveals the extent of randomness and uncertainty in a system. Nevertheless, if you pay attention to how MATLAB defines this notion for signals, you will see some strange things:

Shannon entropy (MATLAB's definition): E = sum(si^2 * log2(si^2)), where the si are the samples of the discrete-time signal.
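That formula is easy to reproduce. Here is a minimal NumPy sketch of it exactly as quoted above (the function name is mine, and zero samples are skipped under the usual 0 * log(0) = 0 convention):

```python
import numpy as np

def quoted_entropy(s):
    """The 'Shannon entropy' exactly as quoted: E = sum(s_i^2 * log2(s_i^2))."""
    s2 = np.asarray(s, dtype=float) ** 2
    s2 = s2[s2 > 0]                   # skip zeros: 0 * log2(0) := 0
    return float(np.sum(s2 * np.log2(s2)))
```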

1) According to the expression above, the entropy can be negative (less than zero), whereas this contradicts both the intuitive sense of entropy and its definition in information theory, which requires entropy >= 0!

2) Imagine a constant signal whose values are very large and equal at every time sample. By the formula it has enormous entropy, yet it is easily predicted (it is a constant) and therefore should have small entropy. Conversely, fast-oscillating noise with zero mean and unit standard deviation should have larger entropy than the constant signal; according to the formula, it is the other way around!
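Both anomalies are easy to check numerically. A minimal sketch, assuming the formula exactly as quoted (the quoted_entropy helper from above is repeated here so the snippet runs on its own):

```python
import numpy as np

def quoted_entropy(s):
    # E = sum(s_i^2 * log2(s_i^2)), skipping zeros (0 * log2(0) := 0)
    s2 = np.asarray(s, dtype=float) ** 2
    s2 = s2[s2 > 0]
    return float(np.sum(s2 * np.log2(s2)))

rng = np.random.default_rng(0)

# 1) Samples with |s_i| < 1 make every term negative.
print(quoted_entropy(0.1 * np.ones(1000)))      # approx -66.4: negative "entropy"

# 2) A huge constant signal vs. zero-mean, unit-variance noise.
const = 1000.0 * np.ones(1000)                  # perfectly predictable
noise = rng.standard_normal(1000)               # genuinely random
print(quoted_entropy(const))                    # approx 2e10: enormous
print(quoted_entropy(noise))                    # approx 1e3: far smaller
```

By this formula, the perfectly predictable constant indeed comes out with vastly more "entropy" than the noise, just as described.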

How can you explain these curious facts?

With respect.

Dmitrij
 
