
Assignment 4

Student ID: 1W23CF13 Name: TAI Yungche Course: Information Theory

Question 1

Part (a)

Given the ternary confusion channel with input alphabet X = {a, b, c} and output alphabet Y = {M, N}, the transition probability matrix (rows indexed by outputs \(M, N\), columns by inputs \(a, b, c\)) is:

\(Q = \begin{bmatrix} 1 & 1/2 & 0 \\ 0 & 1/2 & 1 \end{bmatrix}\)

For a uniform input distribution \(P_X = (\frac{1}{3}, \frac{1}{3}, \frac{1}{3})\), we compute the mutual information via \(I(X;Y) = H(Y) - H(Y|X)\).

Conditional entropy \(H(Y|X)\):

\(H(Y|X) = \sum_{x \in X} p(x) H(Y|X=x)\)

For each input:

  • When \(x=a\): \(p(y|a) = (1, 0)\), so \(H(Y|X=a) = 0\)
  • When \(x=b\): \(p(y|b) = (1/2, 1/2)\), so \(H(Y|X=b) = 1\) bit
  • When \(x=c\): \(p(y|c) = (0, 1)\), so \(H(Y|X=c) = 0\)

Therefore: \(H(Y|X) = \frac{1}{3}(0) + \frac{1}{3}(1) + \frac{1}{3}(0) = \frac{1}{3}\) bits

Output probability distribution:

  • \(p(M) = \frac{1}{3}(1) + \frac{1}{3}(\frac{1}{2}) + \frac{1}{3}(0) = \frac{1}{2}\)
  • \(p(N) = \frac{1}{3}(0) + \frac{1}{3}(\frac{1}{2}) + \frac{1}{3}(1) = \frac{1}{2}\)

So \(P_Y = (\frac{1}{2}, \frac{1}{2})\) and \(H(Y) = 1\) bit.

Mutual Information:

\(I(X;Y) = H(Y) - H(Y|X) = 1 - \frac{1}{3} = \frac{2}{3}\) bits
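The computation above can be checked numerically. The following is a minimal sketch (helper names like `entropy` are our own, not from the source) that builds \(P_Y\), \(H(Y)\), and \(H(Y|X)\) from the transition matrix:

```python
import numpy as np

# Transition matrix Q[y][x]: rows are outputs (M, N), columns are inputs (a, b, c)
Q = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])
p_x = np.array([1/3, 1/3, 1/3])  # uniform input distribution

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_y = Q @ p_x                                   # output distribution P_Y = (1/2, 1/2)
H_Y = entropy(p_y)                              # H(Y) = 1 bit
H_Y_given_X = np.sum(p_x * np.array([entropy(Q[:, i]) for i in range(3)]))
I = H_Y - H_Y_given_X                           # I(X;Y) = 1 - 1/3

print(H_Y)          # 1.0
print(H_Y_given_X)  # ≈ 0.3333
print(I)            # ≈ 0.6667
```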

Part (b)

Let the input distribution be \(P_X = (p_a, p_b, p_c)\) where \(p_a + p_b + p_c = 1\).

The conditional entropy becomes: \(H(Y|X) = p_a(0) + p_b(1) + p_c(0) = p_b\)

The output probabilities are:

  • \(p(M) = p_a + \frac{p_b}{2}\)
  • \(p(N) = p_c + \frac{p_b}{2}\)

So the mutual information is: \(I(X;Y) = H(p_a + p_b/2, p_c + p_b/2) - p_b\)

The binary entropy term \(H(p_a + p_b/2,\, p_c + p_b/2)\) is at most 1 bit, and the penalty \(-p_b\) is largest when \(p_b = 0\); both conditions can be satisfied simultaneously, so we set \(p_b = 0\).

When \(p_b = 0\), the input distribution is \((p_a, 0, p_c)\) with \(p_a + p_c = 1\), and: \(I(X;Y) = H(p_a, p_c) = H(p_a)\)

This is maximized when \(p_a = p_c = \frac{1}{2}\), giving \(I(X;Y) = 1\) bit.

The channel capacity is \(C = 1\) bit, achieved by the optimal input distribution \(P_X^* = (\frac{1}{2}, 0, \frac{1}{2})\).
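As a sanity check on the capacity result, one can sweep input distributions on a grid and maximize \(I(X;Y)\) directly. This is an illustrative brute-force sketch (the grid resolution and helper names are ours), not part of the assigned solution:

```python
import numpy as np

# Same channel as before: rows are outputs (M, N), columns are inputs (a, b, c)
Q = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_info(p_x):
    """I(X;Y) = H(Y) - H(Y|X) for input distribution p_x over (a, b, c)."""
    h_y = entropy(Q @ p_x)
    h_y_given_x = sum(p_x[i] * entropy(Q[:, i]) for i in range(3))
    return h_y - h_y_given_x

# Grid search over the probability simplex (p_a, p_b, 1 - p_a - p_b)
best, best_p = -1.0, None
grid = np.linspace(0.0, 1.0, 201)
for p_a in grid:
    for p_b in grid:
        if p_a + p_b <= 1.0:
            p = np.array([p_a, p_b, 1.0 - p_a - p_b])
            I = mutual_info(p)
            if I > best:
                best, best_p = I, p

print(best)    # ≈ 1.0, the capacity
print(best_p)  # ≈ [0.5, 0.0, 0.5], the optimal input distribution
```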

Question 3

For the (7,4) Hamming code, use the Venn diagram representation where data bits are placed in intersection regions and parity bits in non-intersecting parts. The mapping is:

  • Data bits: \(w_1 \to c_3\), \(w_2 \to c_5\), \(w_3 \to c_6\), \(w_4 \to c_7\)
  • Parity bits: \(p_1 \to c_1\), \(p_2 \to c_2\), \(p_3 \to c_4\)

Each circle must have even parity.

Part (a)

For the message word \(w = 1010\), place the data bits:

  • \(c_3 = 1\), \(c_5 = 0\), \(c_6 = 1\), \(c_7 = 0\)

Parity bits:

Circle 1 (bits \(c_1, c_3, c_5, c_7\)): \(c_1 \oplus 1 \oplus 0 \oplus 0 = 0 \implies c_1 = 1\)

Circle 2 (bits \(c_2, c_3, c_6, c_7\)): \(c_2 \oplus 1 \oplus 1 \oplus 0 = 0 \implies c_2 = 0\)

Circle 3 (bits \(c_4, c_5, c_6, c_7\)): \(c_4 \oplus 0 \oplus 1 \oplus 0 = 0 \implies c_4 = 1\)

The encoded codeword is \(c = 1011010\).
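The three parity equations above translate directly into code. A minimal encoder sketch (the function name `hamming_encode` is our own) using the same circle memberships:

```python
def hamming_encode(w):
    """Encode 4 data bits (w1..w4) into a (7,4) Hamming codeword c1..c7.
    Data bits occupy positions 3, 5, 6, 7; parity bits positions 1, 2, 4."""
    w1, w2, w3, w4 = w
    c = [0] * 8  # 1-indexed for readability; c[0] is unused
    c[3], c[5], c[6], c[7] = w1, w2, w3, w4
    c[1] = c[3] ^ c[5] ^ c[7]  # Circle 1 (c1, c3, c5, c7) has even parity
    c[2] = c[3] ^ c[6] ^ c[7]  # Circle 2 (c2, c3, c6, c7) has even parity
    c[4] = c[5] ^ c[6] ^ c[7]  # Circle 3 (c4, c5, c6, c7) has even parity
    return c[1:]

print(hamming_encode([1, 0, 1, 0]))  # [1, 0, 1, 1, 0, 1, 0], matching c = 1011010
```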

Part (b)

For \(r = 1011101\), check the parity of each circle:

Circle 1: \(1 \oplus 1 \oplus 1 \oplus 1 = 0\) (even, correct)
Circle 2: \(0 \oplus 1 \oplus 0 \oplus 1 = 0\) (even, correct)
Circle 3: \(1 \oplus 1 \oplus 0 \oplus 1 = 1\) (odd, incorrect)

The error is in the region that's inside Circle 3 but outside Circles 1 and 2, which corresponds to bit \(c_4\). Flip the bit to get the corrected codeword \(c' = (1,0,1,0,1,0,1)\).

Extract the data bits from the corrected codeword: \(w_1 = c'_3 = 1\), \(w_2 = c'_5 = 1\), \(w_3 = c'_6 = 0\), \(w_4 = c'_7 = 1\)

Decoded message word: \(w = 1101\).
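The circle checks form a syndrome that points straight at the erroneous bit: with bit \(c_i\) at position \(i\), the three checks carry weights 1, 2, and 4, and their weighted sum is the error position. A decoder sketch under that convention (the function name `hamming_decode` is ours):

```python
def hamming_decode(r):
    """Correct up to one bit error in a received (7,4) Hamming word r = c1..c7,
    then return the 4 data bits from positions 3, 5, 6, 7."""
    c = [0] + list(r)                # 1-indexed for readability
    s1 = c[1] ^ c[3] ^ c[5] ^ c[7]   # Circle 1 check (weight 1)
    s2 = c[2] ^ c[3] ^ c[6] ^ c[7]   # Circle 2 check (weight 2)
    s3 = c[4] ^ c[5] ^ c[6] ^ c[7]   # Circle 3 check (weight 4)
    pos = s1 * 1 + s2 * 2 + s3 * 4   # error position; 0 means no error
    if pos:
        c[pos] ^= 1                  # flip the erroneous bit
    return [c[3], c[5], c[6], c[7]]

print(hamming_decode([1, 0, 1, 1, 1, 0, 1]))  # [1, 1, 0, 1], matching w = 1101
```

For \(r = 1011101\) only the Circle 3 check fails, so the syndrome is \(0 + 0 + 4 = 4\), flipping \(c_4\) exactly as in the Venn-diagram argument.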