
ESE EC : Communications Champion Quiz 3


Question 1

Source S1 produces 4 discrete symbols with equal probability. Source S2 produces 6 discrete symbols with equal probability. If H1 and H2 are the entropies of sources S1 and S2 respectively, which one of the following is correct?
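
Solution sketch: for a source emitting M equiprobable symbols, H = log2 M bits/symbol, so H1 = log2 4 = 2 bits/symbol and H2 = log2 6 ≈ 2.585 bits/symbol, giving H2 > H1.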

Question 2

Which one of the following statements is correct? Shannon's channel capacity formula indicates that in theory
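
Hint: Shannon's formula is C = B log2(1 + S/N). It implies that, in principle, error-free transmission is possible at any rate below C, and that bandwidth can be traded against signal-to-noise ratio.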

Question 3

An analog signal is band-limited to 4 kHz. It is sampled at the Nyquist rate and the samples are quantized into 4 levels. The quantization levels have probabilities … The information rate of the source is
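
Solution sketch (the probabilities are elided in this copy, so only the method is shown): the Nyquist rate is r = 2 × 4 kHz = 8000 samples/s, the entropy of the 4-level quantizer output is H = -Σ pi log2 pi bits/sample, and the information rate is R = r × H bits/s. If the levels were equiprobable, this would give H = 2 bits/sample and R = 16 kbps.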

Question 4

A source produces 4 symbols with probabilities … For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficiency of the code is
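
Solution sketch (method only, since the probabilities are elided here): compute the source entropy H = -Σ pi log2 pi bits/symbol from the given probabilities; the code efficiency is then η = H / L = H / 2, where L = 2 bits/symbol is the average codeword length.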

Question 5

A radio channel has a bandwidth of 10 kHz and an S/N ratio of 15 dB. The maximum data rate that can be transmitted is
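
Solution sketch: 15 dB corresponds to S/N = 10^1.5 ≈ 31.62, so C = B log2(1 + S/N) = 10^4 × log2(32.62) ≈ 10^4 × 5.03 ≈ 50.3 kbps, i.e. roughly 50 kbps.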

Question 6

Consider the following codes:
1). Hamming code
2). Huffman code
3). Shannon-Fano code
4). Convolutional code
Which of these are source codes?
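
Hint: Huffman and Shannon-Fano codes compress the source output (source coding), while Hamming and convolutional codes add controlled redundancy for error control (channel coding).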

Question 7

Assertion (A): Entropy of a binary source is maximum when the probabilities of occurrence of the two symbols are equal.
Reason (R): The average amount of information per source symbol is called entropy for a memoryless source.
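
Hint: for a binary memoryless source with symbol probabilities p and 1 - p, H(p) = -p log2 p - (1 - p) log2(1 - p); dH/dp = log2((1 - p)/p) vanishes at p = 1/2, where H reaches its maximum of 1 bit/symbol. Both A and R are true statements; the question turns on whether R is the correct explanation of A.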

Question 8

A good line code should have which of the following?
1). Favorable PSD (power spectral density)
2). Low intersymbol interference
3). Adequate timing content
4). Transparency
Select the correct answer using the code given below:
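
Hint: a favorable PSD (e.g. little or no DC content), low intersymbol interference, adequate timing content for clock recovery, and transparency to arbitrary bit patterns are all standard requirements of a good line code.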

Question 9

The entropy of a digital source is 2.7 bits/symbol, and it produces 100 symbols per second. The source is likely to be which one of the following?
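
Solution sketch: the information rate is R = r × H = 100 symbols/s × 2.7 bits/symbol = 270 bits/s. Since H ≤ log2 M for an M-symbol source, an entropy of 2.7 bits/symbol requires M ≥ 2^2.7 ≈ 6.5, i.e. a source with at least 7 symbols.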

Question 10

A discrete source produces 8 symbols and is memoryless. Its entropy is
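
Hint: the entropy of an 8-symbol memoryless source is at most log2 8 = 3 bits/symbol, with equality when all symbols are equiprobable.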