Problem Set Content

4. (Total 18%) Consider a binary symmetric channel with input $X$, output $Y$, and transition probability $a$. More specifically, as shown in the figure below, the input and output of the channel may be "0" or "1", with $P(Y=0 \mid X=0) = P(Y=1 \mid X=1) = 1-a$ and $P(Y=1 \mid X=0) = P(Y=0 \mid X=1) = a$. The prior probability is $P(X=0) = p$.
[Figure: binary symmetric channel diagram, inputs/outputs "0" and "1", crossover probability $a$]
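As a point of reference (not part of the original question), the transition probabilities and prior above fully determine the output distribution of the channel; writing it out by the law of total probability gives:

$P(Y=0) = P(Y=0 \mid X=0)\,P(X=0) + P(Y=0 \mid X=1)\,P(X=1) = (1-a)\,p + a\,(1-p)$,
$P(Y=1) = a\,p + (1-a)\,(1-p)$.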

(d) (5%) Suppose the prior probability is not known in advance. We manage to produce an estimated prior probability $\hat{P}(X=0) = \hat{p}$, which may or may not be equal to the true prior probability $P(X=0) = p$. In this case, the cross-entropy between the true and estimated prior distributions $P$ and $\hat{P}$ is defined by $H(P, \hat{P}) = -p \log_2 \hat{p} - (1-p) \log_2 (1-\hat{p})$, which can be considered an approximated entropy of $X$. Please show that the cross-entropy $H(P, \hat{P})$ is always no less than the true entropy $H(X)$ of $X$, i.e., $H(P, \hat{P}) \ge H(X)$.
(Hint: You may use Jensen's inequality: $p \log_2 a + (1-p) \log_2 b \le \log_2\!\big(pa + (1-p)b\big)$ for $0 \le p \le 1$, $a > 0$, and $b > 0$.)
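A sketch of one way the hint can be applied (an illustrative outline, not necessarily the intended solution, and assuming $0 < p < 1$ and $0 < \hat{p} < 1$ so all logarithms are finite): substitute $a = \hat{p}/p$ and $b = (1-\hat{p})/(1-p)$ into Jensen's inequality,

$H(X) - H(P, \hat{P}) = p \log_2 \dfrac{\hat{p}}{p} + (1-p) \log_2 \dfrac{1-\hat{p}}{1-p} \le \log_2\!\left( p \cdot \dfrac{\hat{p}}{p} + (1-p) \cdot \dfrac{1-\hat{p}}{1-p} \right) = \log_2\!\big(\hat{p} + 1 - \hat{p}\big) = \log_2 1 = 0,$

hence $H(P, \hat{P}) \ge H(X)$, with equality when $\hat{p} = p$.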