Can anyone help?
It is a measure of the distance between two discrete probability distributions. A distribution has a relative entropy of 0 with itself, while a large relative entropy indicates a large difference between the two distributions. It is also known as the Kullback-Leibler divergence (or Kullback-Leibler distance). I believe it is often used to show that one probability distribution is converging to another via a shrinking relative entropy.
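For reference, the usual definition for two discrete distributions $P$ and $Q$ over the same set of outcomes is

$$
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_i p_i \log \frac{p_i}{q_i},
$$

which equals 0 exactly when $P = Q$. Note that it is not symmetric in $P$ and $Q$, so strictly speaking it is not a true distance metric, even though it is often used like one.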
"converging to another via a shrinking relative entropy" means that as you increase n, the relative entropy is decreasing (going to 0), so the distribution is converging. I do not know of a physical meaning to relative entropy; as I stated above, it is a measure of distance between two probability distributions.
Shug mentioned the meaning of relative entropy in statistics. You can review the topic on Wikipedia.
The original post doesn't mention a context, but apparently you are assuming relative entropy could have a meaning in thermodynamics, too. Any indications?
Curiously, you posted the question in the Digital Design and Programming forum.
Think about it this way: you define a distance metric for two numbers, say the absolute value of their difference. This one is straightforward to define. Another measure of distance is to take their difference squared; once again, this is pretty straightforward.
However, how do you define the distance between two vectors? One way to do it is to measure the 2-norm of the difference vector; once again, pretty straightforward.
Now, how do you define the distance between two distributions? One answer is to use the relative entropy. That is likely where the motivation comes from: people were trying to judge the distance between two distributions, and relative entropy is the measure they came up with.
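A rough sketch of that analogy in Python (again assuming NumPy; the numbers, vectors, and distributions below are arbitrary examples):

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D_KL(p || q) for discrete distributions p and q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Numbers: absolute difference, or difference squared
a, b = 3.0, 5.0
print(abs(a - b), (a - b) ** 2)

# Vectors: 2-norm of the difference vector
u = np.array([1.0, 2.0, 3.0])
v = np.array([1.5, 1.0, 3.0])
print(np.linalg.norm(u - v))

# Distributions: relative entropy (note it is not symmetric)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q), kl_divergence(q, p))
```

Each "distance" above is the natural measure for its kind of object; relative entropy plays that role for probability distributions.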