- Understanding the inequality $H(Y) < H(C)$, where $C$ denotes the compound input symbol (e.g., $(x_i, \eta_j)$).
- Most functions are information lossy. This is because the entropy of a function's outcome is not necessarily the same as the entropy of the compound input symbol.
  For instance, $x_i + \eta_j = y_k = f(x_i, \eta_j)$.
- Confounding causes of information loss can be identified. For example, consider a mapping in which the output $y_2$ receives inputs from both $x_0$ and $x_1$: the output confounds the inputs, since observing $y_2$ cannot reveal which of them was sent. A numeric check of the resulting entropy loss follows below.
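A minimal Python check of the inequality (the uniform alphabets for $x$ and $\eta$ below are assumptions chosen for illustration): distinct compound symbols $(x_i, \eta_j)$ collide on the same sum, so $H(Y)$ falls below $H(C)$.

```python
# Minimal check of H(Y) < H(C): x and eta are independent and uniform
# over small alphabets (an assumption for illustration), and
# y = f(x, eta) = x + eta. Distinct compound symbols (x, eta) collide
# on the same sum, so the output entropy drops.
from collections import Counter
from itertools import product
from math import log2

xs = [0, 1, 2, 3]   # alphabet of x
etas = [0, 1]       # alphabet of the noise term eta

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

compound = Counter(product(xs, etas))                    # symbols (x, eta)
outputs = Counter(x + e for x, e in product(xs, etas))   # symbols y = x + eta

print(f"H(C) = {entropy(compound):.3f} bits")  # 3.000 = log2(8)
print(f"H(Y) = {entropy(outputs):.3f} bits")   # 2.250 < 3.000
```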
- The confounding of inputs can be minimized by normalization, which improves the signal-to-noise ratio (a rough simulation follows below).
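A rough sketch of that idea (the two-level signaling model, noise level, and nearest-level decoder below are assumptions, not from the source): rescaling poorly separated symbol levels to the full range widens their separation relative to the noise, so fewer received symbols are confounded.

```python
# Rough sketch (assumed model): two input symbols are sent as levels
# through additive Gaussian noise and decoded to the nearest level.
# Normalizing the levels to the full [0, 1] range widens their
# separation relative to the noise, cutting the confusion rate.
import random

random.seed(0)
NOISE_STD = 0.4

def confusion_rate(levels, trials=100_000):
    """Fraction of symbols decoded incorrectly under additive Gaussian noise."""
    errors = 0
    for _ in range(trials):
        sent = random.randrange(len(levels))
        received = levels[sent] + random.gauss(0, NOISE_STD)
        decoded = min(range(len(levels)), key=lambda i: abs(received - levels[i]))
        errors += decoded != sent
    return errors / trials

raw = [0.4, 0.6]                                  # poorly separated levels
lo, hi = min(raw), max(raw)
normalized = [(v - lo) / (hi - lo) for v in raw]  # rescaled to [0, 1]

print(f"confusion rate, raw levels:        {confusion_rate(raw):.3f}")
print(f"confusion rate, normalized levels: {confusion_rate(normalized):.3f}")
```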
- Quality of information means judging information by its usefulness or uselessness in a given context. For example:
  - The presence of noise is highly undesirable in communication systems.
  - Addition is information lossy ($2 + 3 = 3 + 2 = 5$, so the sum alone cannot recover the operands). Unlike noise in communication systems, this loss is useful for an adder because it is controlled (see the sketch after this list).
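A tiny sketch of the controlled loss in addition: many operand pairs collapse onto the same sum, so the sum alone cannot identify the operands.

```python
# Addition as controlled information loss: many operand pairs map to
# the same sum, so the sum alone cannot recover the operands.
from itertools import product

preimages = [(a, b) for a, b in product(range(10), repeat=2) if a + b == 5]
print(preimages)       # [(0, 5), (1, 4), (2, 3), (3, 2), (4, 1), (5, 0)]
print(len(preimages))  # 6 distinct inputs collapse onto the output 5
```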
- Some information channels may be decomposed into component information channels, as sketched below.
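A hedged sketch of one reading of this (the cascade interpretation of "component channels" is an assumption): for channels in cascade, the composite transition matrix is the matrix product of the component matrices. Two cascaded binary symmetric channels compose into another BSC.

```python
# Hedged sketch (the cascade reading of "component channels" is an
# assumption): for cascaded channels, the composite transition matrix
# is the matrix product of the component matrices. Two cascaded binary
# symmetric channels compose into BSC(p + q - 2*p*q).
def bsc(p):
    """Transition matrix of a binary symmetric channel with flip probability p."""
    return [[1 - p, p], [p, 1 - p]]

def cascade(a, b):
    """Channel obtained by feeding a's output into b (matrix product)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p, q = 0.1, 0.2
composite = cascade(bsc(p), bsc(q))
print(f"{composite[0][1]:.2f}")    # 0.26: flip probability of the cascade
print(f"{p + q - 2 * p * q:.2f}")  # 0.26: matches the closed form
```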
- The mutual information of the erasure channel is greater than that of the binary symmetric channel, i.e., $I(X; Y_3) > I(X; Y_2)$.
  This is because the erasure channel allows transmission without any information loss, at the cost of sending more data symbols (erased symbols can simply be retransmitted).
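A quick numeric check (uniform input $X$ and the parameter $p = 0.1$ are assumptions for illustration), using the standard results $I = 1 - p$ for the binary erasure channel and $I = 1 - H(p)$ for the binary symmetric channel with uniform input; since $H(p) > p$ for $0 < p < 1/2$, the erasure channel carries more.

```python
# Numeric check (uniform input X and p = 0.1 are assumptions for
# illustration). With uniform input, I(X; Y) = 1 - p for the binary
# erasure channel and 1 - H(p) for the binary symmetric channel.
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.1
print(f"I(X; Y3), erasure channel BEC({p}):   {1 - p:.3f} bits")    # 0.900
print(f"I(X; Y2), symmetric channel BSC({p}): {1 - h2(p):.3f} bits")  # 0.531
```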