As a continuation of studying the mechanics of information loss starting from Lecture 04 and Lecture 05, recall that in our example of the Automatic Repeat reQuest (ARQ) system, the source *X* emits symbols *x*_{0}, *x*_{1}; both are treated as a single symbol ⟨*x*_{0}, *x*_{1}⟩ into an encoder *E* emitting the encoded sequence ⟨*x*_{0}, *x*_{1}, Π⟩, where the parity Π ≜ *x*_{0} ⊕ *x*_{1}. Since Π is a deterministic function of *x*_{0} and *x*_{1}, the total information for the DMS is therefore *H*(*C*) = *H*(*x*_{0}, *x*_{1}, Π) = 2*H*(*X*).
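As a quick numerical check of the identity *H*(*x*_{0}, *x*_{1}, Π) = 2*H*(*X*), the parity construction can be sketched in a few lines. The source probability P(1) = 0.6 here is an illustrative assumption, not something fixed by the text:

```python
from itertools import product
from math import log2

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Assumed source: i.i.d. binary symbols with P(1) = p (illustrative value).
p = 0.6
px = {0: 1 - p, 1: p}

# Joint distribution of the encoded triplet <x0, x1, PI> with PI = x0 XOR x1.
joint = {}
for x0, x1 in product((0, 1), repeat=2):
    joint[(x0, x1, x0 ^ x1)] = px[x0] * px[x1]

# PI is a deterministic function of (x0, x1), so it adds no entropy:
# H(x0, x1, PI) = H(x0, x1) = 2 H(X).
print(H(joint), 2 * H(px))
```

The two printed values agree because the parity bit, being fully determined by the data bits, contributes redundancy but no new information.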

Furthermore, the erasure channel is used 3× to transmit the 3-bit code, resulting in a coded information rate of 2*H*(*X*) × 1 ∕ 3. The erasure channel is represented in the directed graph as the *Y*^{3} erasure channel.

Note that *p*(*x _{i}*) and *p*(Π) for the encoder will be different.

Not every output of the three channel uses is decodable; whenever decoding fails, the sink detects the error and requests a re-transmit. For each of the 4 input cases ⟨*x*_{0}, *x*_{1}⟩ there are 4 decodable output patterns (no erasure, or exactly one erasure, which the parity Π can repair). Therefore, Decodable Outcomes = 4 × 4 = 16. The total probability of these decodable outcomes is *P _{s}* = 0.896.
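The value *P _{s}* = 0.896 can be reproduced by enumerating the erasure patterns of the three channel uses. The per-use erasure probability of 0.2 below is an assumption on my part; it is the value consistent with the quoted 0.896:

```python
from itertools import product

# Assumed per-use erasure probability; eps = 0.2 reproduces the quoted
# success probability Ps = 0.896 (an assumption, not stated explicitly).
eps = 0.2

Ps = 0.0
decodable_patterns = 0
for pattern in product((True, False), repeat=3):   # True = symbol erased
    if sum(pattern) <= 1:                          # parity repairs one erasure
        decodable_patterns += 1
        prob = 1.0
        for erased in pattern:
            prob *= eps if erased else (1 - eps)
        Ps += prob

# 4 decodable patterns per input case, 4 input cases -> 4 x 4 = 16 outcomes.
print(decodable_patterns, round(Ps, 3))  # -> 4 0.896
```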

Therefore, for all the remaining outcomes (for which the sink requests a re-transmit), the probability of a re-transmit will be *P _{r}* = 1 − *P _{s}* = 0.104.

The question then is, for perfect transmission (i.e., with no wrongly delivered symbols):

*What is the information rate penalty?*

In other words,

*How often, on the average must a triplet be transmitted (to successfully deliver the information)?*

To answer this, let us represent the transmission process graphically. Consider three states: ◯ Transmission state, ◯ Decode state and ◯ Accept state, such that the probability to transmit is *P* = 1. For each decoding case, the probability to decode is *P _{s}* = 0.896. Also, for each case there are four possibilities, but except for one, the remaining possibilities are requested as a re-transmit by the sink. The probability to re-transmit is *P _{r}* = 1 − *P _{s}*. *This is called a* **Markov Process**. With *P _{r}* = 0.104, the expected number of times the triplet ⟨*x*_{0}, *x*_{1}, Π⟩ has to be transmitted is *T* = 1 ∕ *P _{s}* = 1.11607, i.e., an ≈12% increase in the number of transmissions (*T* ≈ 1 + 0.12 = 1.12). ❷
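The expected attempt count follows because each transmission attempt succeeds independently with probability *P _{s}*, making the number of attempts geometrically distributed. A minimal sketch confirming *T* = 1 ∕ *P _{s}* both in closed form and by summing the series:

```python
# Success probability per attempt, from the decodable-outcome computation.
Ps = 0.896
Pr = 1 - Ps  # probability the sink requests a re-transmit

# Each attempt is independent, so the attempt count is geometric: E[T] = 1/Ps.
T_closed = 1 / Ps

# Cross-check by summing the series  E[T] = sum_k k * Pr**(k-1) * Ps.
T_series = sum(k * Pr ** (k - 1) * Ps for k in range(1, 500))

print(round(T_closed, 5))  # -> 1.11607
```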
Let us also define the **Effective Information Rate**. For the erasure channel, *I*(*X*; *Y*) = 0.77676.

[If *I*(*X*; *Y*) is maximized w.r.t. *p*(*x _{i}*) at the source so as to attain the channel capacity, then transmission with arbitrarily small information loss is possible (e.g., Turbo Codes).]
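The quoted figure *I*(*X*; *Y*) = 0.77676 can be recovered from the definition *I*(*X*; *Y*) = *H*(*X*) + *H*(*Y*) − *H*(*X*, *Y*). The parameter values below (source P(1) = 0.6, erasure probability 0.2) are my assumptions; they are the pair that reproduces the quoted number:

```python
from math import log2

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Assumed parameters: source P(1) = 0.6 and erasure probability 0.2.
# Together they reproduce the quoted figure I(X; Y) = 0.77676.
p, eps = 0.6, 0.2

px = {0: 1 - p, 1: p}
# Joint p(x, y) for one use of the binary erasure channel ('e' = erasure).
joint = {(x, x): q * (1 - eps) for x, q in px.items()}
joint.update({(x, 'e'): q * eps for x, q in px.items()})

py = {}
for (_, y), q in joint.items():
    py[y] = py.get(y, 0.0) + q

# I(X; Y) = H(X) + H(Y) - H(X, Y); for the erasure channel this
# collapses to (1 - eps) * H(X).
I = H(px) + H(py) - H(joint)
print(round(I, 5))  # -> 0.77676
```

The collapse to (1 − ε)*H*(*X*) reflects that an erasure destroys the symbol entirely while a non-erasure delivers it intact.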

Referring to the above directed graph, *H*(*X*) < *H*(*Y*); the excess (*H*(*Y*) − *H*(*X*)) is due to noise information. In other words, some kind of coding is necessary because of the above inequality.
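The inequality *H*(*X*) < *H*(*Y*) can be checked numerically. A minimal sketch, again assuming an illustrative source with P(1) = 0.6 and an erasure probability of 0.2 (both assumptions, not values fixed by the text):

```python
from math import log2

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

p, eps = 0.6, 0.2          # assumed source and channel parameters
px = {0: 1 - p, 1: p}

# Erasure-channel output: the input symbol w.p. 1-eps, the erasure 'e' w.p. eps.
py = {0: (1 - eps) * px[0], 1: (1 - eps) * px[1], 'e': eps}

# The output carries the source information plus the channel's own randomness,
# so H(Y) exceeds H(X); the excess H(Y) - H(X) is the noise information.
print(H(px) < H(py))  # -> True
```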

Recall that information may be classified in terms of quality as useful and useless information. Using this classification:

- Noise information is regarded as useless by a communication engineer
- Noise information is regarded as useful by a cryptographer. ❸