Probability
A compound symbol c_{ij} is defined as the simultaneous occurrence of the pair of symbols (x_i, y_j). Hence, its probability is the joint probability:

p(c_{ij}) = p(x_i, y_j)
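As a concrete illustration, here is a minimal Python sketch; the 2×2 joint table p_xy and the symbol names x1, x2, y1, y2 are assumptions chosen only for this example.

```python
# A toy 2x2 joint distribution (an assumption for illustration only).
p_xy = {
    ("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
    ("x2", "y1"): 0.2, ("x2", "y2"): 0.3,
}

# Each compound symbol c_ij is just the pair (x_i, y_j), so its
# probability is inherited directly from the joint table.
p_c = {f"c_{x}{y}": p for (x, y), p in p_xy.items()}
print(p_c)  # {'c_x1y1': 0.4, 'c_x1y2': 0.1, 'c_x2y1': 0.2, 'c_x2y2': 0.3}
```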
Joint Entropy
Since the set of all compound symbols, C = {c_{ij}}, has probabilities p(c_{ij}) = p(x_i, y_j), its entropy is

H(C) = −Σ_i Σ_j p(c_{ij}) log p(c_{ij}) = −Σ_i Σ_j p(x_i, y_j) log p(x_i, y_j) = H(X, Y)
In other words, H(C) and H(X, Y) are one and the same (≡) and not merely equal in quantity (=); the identity holds for all values of C, X and Y.
What does it mean, practically, to say H(C) ≡ H(X, Y)?
☛ It means that a pair of symbols produced jointly can be treated as a single symbol drawn from the compound alphabet C: computing the ordinary (single-source) entropy of C automatically gives the joint entropy of X and Y.
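The identity can be checked numerically. Below is a minimal sketch, reusing the same illustrative joint table as above: H(C) over compound symbols is, term by term, the exact same sum as H(X, Y).

```python
from math import log2

# Same illustrative joint table as above (an assumption, not given data).
p_xy = {
    ("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
    ("x2", "y1"): 0.2, ("x2", "y2"): 0.3,
}

def entropy(probs):
    # H = -sum p log2(p), skipping zero-probability outcomes
    return -sum(p * log2(p) for p in probs if p > 0)

H_C = entropy(p_xy.values())   # each pair treated as one compound symbol c_ij
H_XY = entropy(p_xy.values())  # joint entropy: the identical sum over the table
print(H_C, H_XY)               # both ~1.846 bits, identical term by term
```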
Equivocation
Since,

p(x, y) = p(x|y) p(y)
⇒ log p(x, y) = log p(x|y) + log p(y)
alternatively
p(x, y) = p(y|x) p(x)
⇒ log p(x, y) = log p(y|x) + log p(x)
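Averaging −log p(y|x) over the joint distribution defines the equivocation, H(Y|X) = −Σ_x Σ_y p(x, y) log p(y|x). A minimal sketch of this computation, using the same assumed joint table as in the earlier examples:

```python
from math import log2
from collections import defaultdict

# Same illustrative joint table as above (an assumption for this sketch).
p_xy = {
    ("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
    ("x2", "y1"): 0.2, ("x2", "y2"): 0.3,
}

# Marginal p(x) = sum_y p(x, y), needed to form p(y|x) = p(x, y) / p(x).
p_x = defaultdict(float)
for (x, _), p in p_xy.items():
    p_x[x] += p

# H(Y|X) = -sum over (x, y) of p(x, y) * log2 p(y|x)
H_Y_given_X = -sum(p * log2(p / p_x[x]) for (x, _), p in p_xy.items() if p > 0)
print(H_Y_given_X)  # ~0.846 bits of uncertainty remain in Y after observing X
```

Note that for this assumed table H(X) = 1.0 bit and H(X, Y) ≈ 1.846 bits, so H(X, Y) = H(X) + H(Y|X); this is exactly the chain rule taken up next.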
What is the practical meaning of H(Y|X)?
☛
Equivocation, or conditional entropy, H(Y|X) measures the uncertainty remaining in Y given knowledge of X. For instance, if Y = f(X), then knowing X tells us everything about Y, so no uncertainty remains: H(Y|X) = 0.
What does "uncertainty remaining in Y" mean, i.e., what does H(Y|X) measure?
☛
To answer, let us consider three cases (each is illustrated in the sketch after the list):
- If full knowledge of X tells us nothing about Y (X and Y are independent), then all the uncertainty in Y remains: H(Y|X) = H(Y).
- If full knowledge of X tells us everything about Y (Y is a function of X), then no uncertainty remains: H(Y|X) = 0.
- If knowledge of X tells us something, but not everything, about Y, then some uncertainty remains: H(Y|X) ≤ H(Y).
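Here is a minimal sketch of all three cases; each joint table below is an assumption constructed so that the corresponding case holds exactly (X and Y binary throughout).

```python
from math import log2
from collections import defaultdict

def cond_entropy(p_xy):
    # H(Y|X) = -sum p(x, y) log2 p(y|x), with p(y|x) = p(x, y) / p(x)
    p_x = defaultdict(float)
    for (x, _), p in p_xy.items():
        p_x[x] += p
    return -sum(p * log2(p / p_x[x]) for (x, _), p in p_xy.items() if p > 0)

# All three joint tables are assumptions, built so each case holds exactly.
independent = {(x, y): 0.25 for x in "01" for y in "01"}    # X tells us nothing
deterministic = {("0", "0"): 0.5, ("1", "1"): 0.5}          # Y = f(X)
partial = {("0", "0"): 0.4, ("0", "1"): 0.1,
           ("1", "0"): 0.1, ("1", "1"): 0.4}                # X tells us something

print(cond_entropy(independent))    # 1.0  -> H(Y|X) = H(Y)
print(cond_entropy(deterministic))  # 0.0  -> H(Y|X) = 0
print(cond_entropy(partial))        # ~0.72 -> H(Y|X) < H(Y) = 1.0
```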
Next:
Chain Rule of Entropy ➽