Let us illustrate the determination of information gained or lost by considering the following example:

❶ *Compute entropy* *H*(*X*)

We know that the information (entropy) is given by

*H*(*X*) = −∑_{i} *p*(*x*_{i}) log_{2} *p*(*x*_{i})

Hence, substituting the source probabilities,

*H*(*X*) = 0.813417 bit
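As a quick sketch of step ❶, the entropy formula can be coded directly. The probabilities below are illustrative placeholders, not the source distribution of this example:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p(x_i) * log2 p(x_i), in bits.
    Terms with p = 0 contribute nothing (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative binary source -- NOT the distribution of the worked example.
print(entropy([0.75, 0.25]))  # ~0.8113 bits
```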
❷ *Compute mutual information* *I*(*X*;*Y*)

We know that the partial mutual information *I*(*x*_{i};*Y*) is given by

*I*(*x*_{i};*Y*) = ∑_{j} *p*(*x*_{i}, *y*_{j}) log_{2} [*p*(*x*_{i}|*y*_{j}) / *p*(*x*_{i})]

Hence,

*I*(*X*;*Y*) = *I*(*x*_{0};*Y*) + *I*(*x*_{1};*Y*) = 0.655884 bit
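Step ❷ can be sketched the same way: compute each partial mutual information *I*(*x*_{i};*Y*) from the joint distribution and add them. The joint probabilities below are hypothetical, chosen only for illustration; they are not the channel of the worked example:

```python
import math

def partial_mutual_information(joint, px, py, i):
    """I(x_i;Y) = sum_j p(x_i, y_j) * log2(p(x_i|y_j) / p(x_i))
                = sum_j p(x_i, y_j) * log2(p(x_i, y_j) / (p(x_i) p(y_j))), in bits."""
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for j in range(len(py))
        if joint[i][j] > 0
    )

# Hypothetical joint distribution p(x_i, y_j) for a binary channel --
# NOT the numbers from the worked example above.
joint = [[0.45, 0.05],
         [0.10, 0.40]]
px = [sum(row) for row in joint]                              # marginal p(x_i)
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]   # marginal p(y_j)

# I(X;Y) is the (unweighted) sum of the partial mutual informations.
I_XY = sum(partial_mutual_information(joint, px, py, i) for i in range(2))
print(round(I_XY, 4))  # ≈ 0.3973 bits
```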

❸ *Compute information loss*

Since *I*(*X*;*Y*) = *H*(*X*) − *H*(*X*|*Y*), the information lost about the original symbol *X* after observing the output *Y* is given by

*H*(*X*|*Y*) = *H*(*X*) − *I*(*X*;*Y*) = 0.813417 − 0.655884 = 0.157533 bit

Because *H*(*X*|*Y*) ≠ 0, we can say that there is information loss through the information channel.
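Step ❸ is just the identity *I*(*X*;*Y*) = *H*(*X*) − *H*(*X*|*Y*) rearranged; plugging in the two values quoted in the example recovers the mutual information:

```python
# The two values quoted in the worked example (in bits).
H_X = 0.813417          # source entropy H(X)
H_X_given_Y = 0.157533  # equivocation (information loss) H(X|Y)

# Rearranging H(X|Y) = H(X) - I(X;Y):
I_XY = H_X - H_X_given_Y
print(round(I_XY, 6))  # 0.655884

# A nonzero equivocation means the channel loses information.
assert H_X_given_Y > 0
```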