In the example, observe that:
- There is no feedback from the output back to the input.
Therefore:

y_{0} and y_{3} receive (red) arrows only from x_{0} and x_{1} respectively, with high probabilities p(y_{0}) = 0.32 and p(y_{3}) = 0.48.
Therefore, the decision resulting in the outputs y_{0} and y_{3} is called a Hard decision, and both y_{0} and y_{3} are called Hard-decision symbols.
On the other hand, y_{1} and y_{2} receive (red and blue) arrows from both x_{0} and x_{1}, resulting in low probabilities p(y_{1}) = 0.09 and p(y_{2}) = 0.11; the occurrence of y_{1} and y_{2} is not definite.
Therefore, the decision resulting in the outputs y_{1} and y_{2} is called a Soft decision, and both y_{1} and y_{2} are called Soft-decision symbols. Finally, communication theorists call such 2-input, 4-output models Soft Decision Models. The above communication system is an example of a 2-bit Digital-to-Analog converter.
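The figures above can be checked numerically. The input distribution and transition matrix are not shown in this excerpt; H(X) = 0.970950 is consistent with the assumption p(x) = (0.4, 0.6), and the matrix below is one plausible choice (an assumption, not given in the text) that reproduces the stated output probabilities exactly:

```python
from math import log2

# Assumed input distribution (consistent with H(X) = 0.970950):
px = [0.4, 0.6]

# Hypothetical transition matrix p(y|x) chosen to reproduce the stated
# p(y) values; the actual matrix is not given in this excerpt.
pyx = [
    [0.80, 0.15, 0.05, 0.00],  # p(y | x0): arrows mostly to y0
    [0.00, 0.05, 0.15, 0.80],  # p(y | x1): arrows mostly to y3
]

# Source entropy H(X) = -sum_x p(x) log2 p(x)
H_X = -sum(p * log2(p) for p in px if p > 0)
print(f"H(X) = {H_X:.6f}")                      # ≈ 0.970951

# Output distribution p(y) = sum_x p(x) p(y|x)
py = [sum(px[i] * pyx[i][j] for i in range(2)) for j in range(4)]
print("p(y) =", [round(p, 2) for p in py])      # [0.32, 0.09, 0.11, 0.48]
```

Note that y_{0} and y_{3} each receive probability mass from only one input, while y_{1} and y_{2} mix contributions from both, which is exactly the hard/soft distinction described above.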
Example of Hard Decision information channel
Consider the information channel,
- Compute entropy H(X)
- Compute mutual information I(X; Y)
I(x_{0}; Y) = 0.389461
I(x_{1}; Y) = 0.300642
Therefore,
I(X; Y) = I(x_{0}; Y) + I(x_{1}; Y) = 0.690103
- Compute information loss
The information loss from the original symbols X, given the output Y, is the equivocation:
H(X|Y) = H(X) − I(X; Y) = 0.280847
Thus H(X|Y) ≠ 0, and hence there is information loss through the information channel. This loss is due to the hard decision on the symbols y_{1} and y_{2}.
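The channel parameters are not shown in this excerpt, but the stated figures are reproduced by assuming p(x) = (0.4, 0.6) and a binary symmetric transition matrix with crossover probability 0.05. A sketch of the computation under that assumption:

```python
from math import log2

# Assumed parameters (not shown in this excerpt, but they reproduce
# the stated figures): p(x) = (0.4, 0.6), crossover probability 0.05.
px  = [0.4, 0.6]
pyx = [[0.95, 0.05],
       [0.05, 0.95]]

# Output distribution p(y) = sum_x p(x) p(y|x)
py = [sum(px[i] * pyx[i][j] for i in range(2)) for j in range(2)]

# Per-symbol mutual information, weighted by p(x) (the weighting
# convention assumed here matches the stated numbers):
# I(x_i; Y) = p(x_i) * sum_y p(y|x_i) log2( p(y|x_i) / p(y) )
def I_x(i):
    return px[i] * sum(pyx[i][j] * log2(pyx[i][j] / py[j]) for j in range(2))

I0, I1 = I_x(0), I_x(1)
I_XY  = I0 + I1                 # I(X; Y) = I(x0; Y) + I(x1; Y)
H_X   = -sum(p * log2(p) for p in px)
H_XgY = H_X - I_XY              # equivocation H(X|Y) = H(X) - I(X; Y)

print(f"I(x0; Y) = {I0:.4f}")   # ≈ 0.3895
print(f"I(x1; Y) = {I1:.4f}")   # ≈ 0.3006
print(f"I(X; Y)  = {I_XY:.4f}") # ≈ 0.6901
print(f"H(X|Y)   = {H_XgY:.4f}")# ≈ 0.2808
```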
Comparing the two examples above: the 2-bit Digital-to-Analog converter vs. the 1-bit Analog-to-Digital converter.
Soft Decision Model                Information Measurement   Hard Decision Model
(2-bit Digital-to-Analog conv.)                              (1-bit Analog-to-Digital conv.)
0.970950                           Entropy H(X)              0.970950
0.813417                           Mutual Info. I(X; Y)      0.690103
0.157533                           Info. Loss H(X|Y)         0.280847
Observe that:
- I(X; Y) for the soft decision model > that for the hard decision model
- H(X|Y) for the soft decision model < that for the hard decision model
Hard decisions lose more information than soft decisions.
In other words, quantization is information-lossy. ❸
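This claim can be illustrated numerically: merging the soft outputs into a 1-bit decision can only reduce the mutual information. The input distribution and soft transition matrix below are assumptions consistent with the stated output probabilities (the exact soft matrix is not given in this excerpt, so I_soft comes out near, but not exactly at, the tabulated 0.813417):

```python
from math import log2

def mutual_information(px, pyx):
    """I(X; Y) = sum_{x,y} p(x) p(y|x) log2( p(y|x) / p(y) )."""
    ny = len(pyx[0])
    py = [sum(px[i] * pyx[i][j] for i in range(len(px))) for j in range(ny)]
    return sum(px[i] * pyx[i][j] * log2(pyx[i][j] / py[j])
               for i in range(len(px)) for j in range(ny)
               if pyx[i][j] > 0)

px   = [0.4, 0.6]                      # assumed input distribution
soft = [[0.80, 0.15, 0.05, 0.00],      # assumed soft channel (4 outputs)
        [0.00, 0.05, 0.15, 0.80]]

# Hard decision: quantize outputs y0, y1 -> 0 and y2, y3 -> 1
# (a 1-bit decision, i.e. column sums of the soft matrix)
hard = [[row[0] + row[1], row[2] + row[3]] for row in soft]

I_soft = mutual_information(px, soft)
I_hard = mutual_information(px, hard)
print(f"I_soft = {I_soft:.4f}")        # ≈ 0.813
print(f"I_hard = {I_hard:.4f}")        # ≈ 0.6901
assert I_hard < I_soft                 # quantization is information-lossy
```

Merging the soft matrix's columns yields exactly the hard decision channel of the previous example, and the drop from I_soft to I_hard is the information destroyed by the 1-bit quantization.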
Next:
Mechanics of information loss (p:5) ➽