GATE Questions & Answers on Information theory: entropy, mutual information and channel capacity theorem

What is the weightage of Information theory: entropy, mutual information and channel capacity theorem in the GATE exam?

A total of 10 questions have been asked on the Information theory: entropy, mutual information and channel capacity theorem topic of the Communications subject in previous GATE papers, for an average of 1.70 marks per question.

Consider a discrete memoryless source with alphabet $S=\{s_0, s_1, s_2, s_3, s_4, \ldots\}$ and respective probabilities of occurrence $P=\left\{\frac12, \frac14, \frac18, \frac1{16}, \frac1{32}, \ldots\right\}$. The entropy of the source (in bits) is _______
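
Since the probabilities are dyadic, the entropy works out to $H=\sum_{n=1}^{\infty} n\left(\frac12\right)^n = 2$ bits. A minimal numerical check in Python (truncating the rapidly converging sum is the only approximation here):

```python
from math import log2

# H = sum over n >= 1 of (1/2)^n * log2(2^n) = sum of n / 2^n = 2 bits
H = sum((0.5 ** n) * log2(2 ** n) for n in range(1, 60))
print(round(H, 6))  # -> 2.0
```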

A digital communication system uses a repetition code for channel encoding/decoding. During transmission, each bit is repeated three times (0 is transmitted as 000, and 1 is transmitted as 111). It is assumed that the source puts out symbols independently and with equal probability. The decoder operates as follows: In a block of three received bits, if the number of zeros exceeds the number of ones, the decoder decides in favor of a 0, and if the number of ones exceeds the number of zeros, the decoder decides in favor of a 1. Assuming a binary symmetric channel with crossover probability p = 0.1, the average probability of error is ________
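
The majority-vote decoder fails exactly when two or three of the three transmitted bits are flipped, so $P_e = 3p^2(1-p) + p^3$. A quick check in Python:

```python
# Majority decoding over a BSC: an error needs >= 2 crossovers in 3 uses.
p = 0.1
P_err = 3 * p**2 * (1 - p) + p**3
print(P_err)  # -> 0.028
```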

A discrete memoryless source has an alphabet $\{a_1, a_2, a_3, a_4\}$ with corresponding probabilities $\left\{\frac12, \frac14, \frac18, \frac18\right\}$. The minimum required average codeword length in bits to represent this source for error-free reconstruction is ________
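
Because the probabilities are dyadic, the entropy bound $H = 1.75$ bits is achieved exactly, for instance by the prefix code $\{0, 10, 110, 111\}$ (one valid Huffman assignment, shown here only as an example). A short check in Python:

```python
from math import log2

probs = [1/2, 1/4, 1/8, 1/8]
H = -sum(p * log2(p) for p in probs)            # entropy lower bound
lengths = [1, 2, 3, 3]                          # codewords 0, 10, 110, 111
L = sum(p * n for p, n in zip(probs, lengths))  # average codeword length
print(H, L)  # both 1.75 bits
```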

A binary communication system makes use of the symbols “zero” and “one”. There are channel errors. Consider the following events: 
x0 : a "zero" is transmitted
x1 : a "one" is transmitted
y0 : a "zero" is received
y1 : a "one" is received
The following probabilities are given: $P\left(x_0\right)=\frac12,\;P\left(y_0\vert x_0\right)=\frac34$ and $ P\left(y_0\vert x_1\right)=\frac12. $ The information in bits that you obtain when you learn which symbol has been received (while you know that a “zero” has been transmitted) is ________
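
Given that a "zero" was transmitted, the received symbol is $y_0$ with probability $\frac34$ and $y_1$ with probability $\frac14$, so the information gained is the conditional entropy $H\!\left(\frac34,\frac14\right)\approx 0.811$ bits. A one-line check in Python:

```python
from math import log2

cond = [3/4, 1/4]                    # P(y0 | x0), P(y1 | x0)
H = -sum(q * log2(q) for q in cond)
print(round(H, 3))  # -> 0.811 bits
```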

An analog baseband signal, bandlimited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities $p_1 = p_4 = 0.125$ and $p_2 = p_3$. The information rate (bits/sec) of the message source is __________
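
Since the four probabilities must sum to one, $p_2 = p_3 = 0.375$; the Nyquist rate is 200 samples/s, and the information rate is $R = 200\,H$ bits/s. A sketch of the computation in Python:

```python
from math import log2

fs = 2 * 100                            # Nyquist rate for 100 Hz bandwidth
probs = [0.125, 0.375, 0.375, 0.125]
H = -sum(p * log2(p) for p in probs)    # ~1.811 bits/symbol
print(fs * H)                           # -> ~362.3 bits/s
```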

A voice-grade AWGN (additive white Gaussian noise) telephone channel has a bandwidth of 4.0 kHz and two-sided noise power spectral density $\frac{\eta}{2}=2.5\times10^{-5}$ Watt per Hz. If information at the rate of 52 kbps is to be transmitted over this channel with arbitrarily small bit error rate, then the minimum bit energy $E_b$ (in mJ/bit) necessary is __________
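
With the one-sided density $N_0 = \eta = 5\times10^{-5}$ W/Hz, reliable transmission requires $R \le B\log_2\left(1 + \frac{S}{N_0 B}\right)$; solving at $R = 52$ kbps and $B = 4$ kHz gives the minimum signal power, and $E_b = S/R$. A sketch of the arithmetic in Python:

```python
B  = 4000.0            # channel bandwidth, Hz
R  = 52000.0           # required bit rate, bits/s
N0 = 2 * 2.5e-5        # one-sided noise PSD, W/Hz (two-sided is N0/2)

snr = 2 ** (R / B) - 1   # SNR at which capacity equals R: 2**13 - 1 = 8191
S   = snr * N0 * B       # minimum signal power, W
Eb  = S / R              # minimum energy per bit, J
print(Eb * 1e3)          # -> ~31.5 mJ/bit
```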

Consider two identically distributed zero-mean random variables U and V. Let the cumulative distribution functions of U and 2V be F(x) and G(x), respectively. Then, for all values of x
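
The useful relation here is $G(x) = P(2V \le x) = F\!\left(\frac{x}{2}\right)$; since $F$ is non-decreasing, $F(x) \ge G(x)$ for $x \ge 0$ and $F(x) \le G(x)$ for $x \le 0$. An empirical illustration in Python, assuming standard normal variables purely for the sketch:

```python
from statistics import NormalDist

F = NormalDist().cdf       # CDF of U (assumed N(0,1) for illustration only)
G = lambda x: F(x / 2)     # CDF of 2V equals F(x/2)

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(x, (F(x) - G(x)) * x >= 0)   # holds for every x
```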

A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount ε and decreases that of the second by ε. After encoding, the entropy of the source
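
Equal probabilities contribute the most to entropy, so making the first two symbols unequal by $\pm\varepsilon$ strictly lowers $H$ for small $\varepsilon > 0$. A numerical illustration in Python, with a four-symbol distribution assumed only for the sketch:

```python
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

eps = 0.01
before = [0.25, 0.25, 0.30, 0.20]               # first two symbols equiprobable
after  = [0.25 + eps, 0.25 - eps, 0.30, 0.20]
print(H(before) > H(after))  # -> True: the entropy decreases
```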

A communication channel with AWGN operating at a signal-to-noise ratio $\mathrm{SNR} \gg 1$ and bandwidth $B$ has capacity $C_1$. If the SNR is doubled keeping $B$ constant, the resulting capacity $C_2$ is given by
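
For $\mathrm{SNR} \gg 1$, $C_1 \approx B\log_2(\mathrm{SNR})$, so doubling the SNR adds $B\log_2 2 = B$ to the capacity: $C_2 \approx C_1 + B$. A numerical check in Python with assumed example values:

```python
from math import log2

B, snr = 4000.0, 1e4            # example values with SNR >> 1
C1 = B * log2(1 + snr)
C2 = B * log2(1 + 2 * snr)
print(C2 - C1)                  # -> ~4000, i.e. approximately B
```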

A memoryless source emits n symbols each with probability p. The entropy of the source as a function of n
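
For the probabilities to sum to one, each symbol must have $p = \frac{1}{n}$, so $H = \log_2 n$: the entropy increases logarithmically with $n$. A short check in Python:

```python
from math import log2

for n in (2, 4, 8, 16):
    H = -sum((1 / n) * log2(1 / n) for _ in range(n))  # = log2(n)
    print(n, H)   # -> 1.0, 2.0, 3.0, 4.0 bits
```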