- Applying the rules to more than two symbols (e.g. three symbols x1, x2, x3)
- Chain Rule
H(x1, x2, x3) = H(x1) + H(x2|x1) + H(x3|x1, x2)
- Conditional Mutual Information
I(X; Y | Z) = H(X|Z) − H(X|Y, Z), where Z is considered side information.
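Both identities above can be checked numerically. The sketch below uses a small hypothetical joint distribution over three binary symbols (the probabilities are made up for illustration) and verifies the chain rule and the conditional-mutual-information formula by computing every entropy from marginals.

```python
from math import log2

# Hypothetical joint distribution p(x1, x2, x3) over binary symbols.
p = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.10,
    (0, 1, 0): 0.15, (0, 1, 1): 0.05,
    (1, 0, 0): 0.10, (1, 0, 1): 0.10,
    (1, 1, 0): 0.10, (1, 1, 1): 0.20,
}

def H(dist):
    """Entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

def marginal(joint, keep):
    """Marginalize a joint {tuple: prob} onto the index positions in `keep`."""
    out = {}
    for outcome, q in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + q
    return out

# Chain rule: H(X1,X2,X3) = H(X1) + H(X2|X1) + H(X3|X1,X2),
# using H(A|B) = H(A,B) - H(B) for each conditional term.
H123 = H(p)
H1 = H(marginal(p, (0,)))
H12 = H(marginal(p, (0, 1)))
H2_given_1 = H12 - H1
H3_given_12 = H123 - H12
assert abs(H123 - (H1 + H2_given_1 + H3_given_12)) < 1e-12

# Conditional mutual information: I(X1; X2 | X3) = H(X1|X3) - H(X1|X2,X3).
H13 = H(marginal(p, (0, 2)))
H3 = H(marginal(p, (2,)))
H23 = H(marginal(p, (1, 2)))
I_12_given_3 = (H13 - H3) - (H123 - H23)
print(f"I(X1; X2 | X3) = {I_12_given_3:.4f} bits")
```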
- Using the probability distribution to derive mutual information.
- Partial Mutual Information
- A directed graph is a convenient and useful way to represent an information channel: the symbols (inputs and outputs) are shown as nodes, and the directed arrows connecting them are labelled with the transition probabilities.
For example,
- Steps to determine information lost or gained:
- Compute entropy H(X).
- Compute mutual information I(X; Y).
- Compute information loss.
H(X|Y) = H(X) − I(X; Y), and H(X|Y) ≠ 0 implies that information is lost through the information channel.
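The three steps above can be carried out in a few lines for a concrete channel. The sketch below assumes a binary symmetric channel with a uniform source and crossover probability 0.1 (both values are hypothetical) and computes H(X), I(X; Y), and the loss H(X|Y).

```python
from math import log2

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

# Hypothetical binary symmetric channel: P(X=1) = 0.5, crossover eps = 0.1.
p1, eps = 0.5, 0.1

# Step 1: source entropy H(X).
H_X = h2(p1)                      # 1.0 bit for a uniform source

# Step 2: mutual information I(X; Y) = H(Y) - H(Y|X).
p_y1 = p1 * (1 - eps) + (1 - p1) * eps
I_XY = h2(p_y1) - h2(eps)

# Step 3: information loss H(X|Y) = H(X) - I(X; Y).
H_X_given_Y = H_X - I_XY          # > 0, so information is lost
print(H_X, I_XY, H_X_given_Y)
```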
- Hard Decision
- Soft Decision
- Both hard- and soft-decision models result in information loss, with
I(X; Y)soft > I(X; Y)hard and H(X|Y)soft < H(X|Y)hard,
so a hard decision loses more information than a soft decision. Thus, quantization is information-lossy.
- Mechanics of Information Loss
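The inequality I(X; Y)soft > I(X; Y)hard can be illustrated by quantization alone: start from a channel whose output has four "soft" levels (strong/weak 0, weak/strong 1) and merge them into one hard bit. The transition probabilities below are hypothetical; the ordering of the two mutual informations follows from the data processing inequality.

```python
from math import log2

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits from P(x) and P(y|x) given as nested dicts."""
    p_y = {}
    for x, row in p_y_given_x.items():
        for y, w in row.items():
            p_y[y] = p_y.get(y, 0.0) + p_x[x] * w
    return sum(p_x[x] * w * log2(w / p_y[y])
               for x, row in p_y_given_x.items()
               for y, w in row.items() if w > 0)

p_x = {0: 0.5, 1: 0.5}

# Hypothetical 4-level "soft" output: strong/weak 0, weak/strong 1.
soft = {
    0: {"s0": 0.60, "w0": 0.25, "w1": 0.10, "s1": 0.05},
    1: {"s0": 0.05, "w0": 0.10, "w1": 0.25, "s1": 0.60},
}

# Hard decision: quantize the soft output down to a single bit.
merge = {"s0": 0, "w0": 0, "w1": 1, "s1": 1}
hard = {}
for x, row in soft.items():
    hard[x] = {}
    for y, w in row.items():
        b = merge[y]
        hard[x][b] = hard[x].get(b, 0.0) + w

I_soft = mutual_information(p_x, soft)
I_hard = mutual_information(p_x, hard)
assert I_soft > I_hard  # quantization can only destroy information
print(f"I_soft = {I_soft:.4f}, I_hard = {I_hard:.4f}")
```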
There is less information in Y than in the combined source-and-noise pair C = (X, N) (i.e., H(Y) < H(C) = H(X, N)), but more information in Y than in either X alone or N alone (H(Y) > H(X) and H(Y) > H(N)).
Thus, noise information has been added to the signal information.
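This entropy sandwich can be verified for a simple additive-noise model. The sketch below assumes Y = X XOR N with X and N independent Bernoulli variables (the parameters 0.9 and 0.1 are hypothetical) and checks H(X) < H(Y) < H(X, N).

```python
from math import log2

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

# Hypothetical additive-noise model: Y = X XOR N, X and N independent.
p_x1, p_n1 = 0.9, 0.1          # P(X=1), P(N=1)

H_X = h2(p_x1)
H_N = h2(p_n1)
H_XN = H_X + H_N               # independence: H(X, N) = H(X) + H(N)

p_y1 = p_x1 * (1 - p_n1) + (1 - p_x1) * p_n1   # P(Y=1)
H_Y = h2(p_y1)

# The sandwich from the notes: H(X) < H(Y) < H(X, N), and H(N) < H(Y).
assert H_X < H_Y < H_XN and H_N < H_Y
print(H_X, H_N, H_Y, H_XN)
```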