Question 2

A discrete source transmits messages x1, x2, x3 with probabilities p(x1) = 0.3, p(x2) = 0.25, p(x3) = 0.45. The source is connected to a channel whose conditional probability matrix is given below:

P(Y|X) =
              y1     y2     y3
      x1  [  0.9    0.1    0.0  ]
      x2  [  0.0    0.8    0.2  ]
      x3  [  0.0    0.3    0.7  ]

Calculate all the entropies and the mutual information for this channel.

Hint: steps to be followed (a numerical sketch follows this list):
• Obtain the joint probability matrix P(X,Y).
• Obtain the output probabilities p(y1), p(y2), p(y3).
• Obtain the conditional probability matrix P(X|Y).
• Obtain the marginal entropies H(X) and H(Y).
• Calculate the conditional entropy H(X|Y).
• Calculate the joint entropy H(X,Y).
• Calculate the mutual information I(X;Y) = H(X) - H(X|Y).

[30 marks]
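A minimal Python/NumPy sketch of the hinted steps, assuming the channel matrix as reconstructed above (the original rendering was garbled) and base-2 logarithms so all entropies come out in bits:

```python
import numpy as np

# Source probabilities p(x) and forward channel P(Y|X) from the question
# (matrix values as reconstructed above).
p_x = np.array([0.3, 0.25, 0.45])
P_y_given_x = np.array([
    [0.9, 0.1, 0.0],
    [0.0, 0.8, 0.2],
    [0.0, 0.3, 0.7],
])

# Step 1: joint probability matrix, p(x, y) = p(x) * p(y|x).
P_xy = p_x[:, None] * P_y_given_x

# Step 2: output probabilities p(y) = column sums of the joint matrix.
p_y = P_xy.sum(axis=0)

# Step 3: backward channel P(X|Y), p(x|y) = p(x, y) / p(y).
P_x_given_y = P_xy / p_y

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Steps 4-7: marginal, joint, and conditional entropies, then I(X;Y).
H_X = entropy(p_x)            # marginal entropy of the source
H_Y = entropy(p_y)            # marginal entropy of the channel output
H_XY = entropy(P_xy.ravel())  # joint entropy H(X,Y)
H_X_given_Y = H_XY - H_Y      # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y      # mutual information I(X;Y) = H(X) - H(X|Y)

print("P(X|Y) =\n", np.round(P_x_given_y, 4))
print(f"p(y)   = {np.round(p_y, 4)}")
print(f"H(X)   = {H_X:.4f} bits")
print(f"H(Y)   = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits")
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
```

With this reconstruction the sketch gives p(y) = (0.27, 0.365, 0.365) and approximately H(X) ≈ 1.539 bits, H(Y) ≈ 1.571 bits, H(X,Y) ≈ 2.257 bits, H(X|Y) ≈ 0.686 bits, and I(X;Y) ≈ 0.854 bits; if the matrix values differ from the garbled original, the code carries through unchanged with the corrected entries.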

EXPERT ANSWER