
3/3/15

Digital Communications Laboratory Viva Questions - Part 5



Theory Questions:


  1. Two binary random variables X and Y are distributed according to the joint distribution P(X=0, Y=0) = P(X=0, Y=1) = P(X=1, Y=1) = 1/3. Calculate the entropy of the random variable X.

  A. 0.82
  B. 0.72
  C. 0.92
  D. 0.80


  2. In the above problem, calculate the joint entropy of the random variables X and Y, H(X, Y).

  A. 1.5850
  B. 1.5750
  C. 1.4860
  D. 1.5840

  3. In the above problem, calculate the conditional entropy of Y given X, H(Y|X).

  A. 0.5661
  B. 0.6667
  C. 0.6468
  D. None

  4. In the above problem, calculate the average mutual information of X and Y, I(X; Y).

  A. 0.1165
  B. 0.5216
  C. 0.2516
  D. 0.2416
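Questions 1-4 all work from the same joint distribution, so the four quantities can be checked in one short script. This is a sketch using only the standard library; the helper name `H` is my own:

```python
import math

def H(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution: p(0,0) = p(0,1) = p(1,1) = 1/3, p(1,0) = 0
joint = [1/3, 1/3, 1/3]

px = [2/3, 1/3]              # marginal of X
py = [1/3, 2/3]              # marginal of Y

HX = H(px)                   # ~ 0.9183 -> option C of Q1
HXY = H(joint)               # = log2(3) ~ 1.5850 -> option A of Q2
HY_given_X = HXY - HX        # ~ 0.6667 (chain rule) -> option B of Q3
IXY = H(py) - HY_given_X     # ~ 0.2516 (mutual information) -> option C of Q4

print(round(HX, 4), round(HXY, 4), round(HY_given_X, 4), round(IXY, 4))
```

The chain rule H(Y|X) = H(X, Y) − H(X) and the identity I(X; Y) = H(Y) − H(Y|X) do all the work once the marginals are read off the joint table.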



  5. A discrete memoryless source emits three symbols X1, X2, X3 with corresponding probabilities P(X1) = 0.25, P(X2) = 0.25, P(X3) = 0.5. The random variable Y is defined by Y = 2X + 3. Calculate the entropy of the random variable Y.

  A. 1.8
  B. 1.5
  C. 2.5
  D. 1.45
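The key observation in Question 5 is that Y = 2X + 3 is an invertible (one-to-one) map, so Y is just a relabeling of X's outcomes and H(Y) = H(X). A minimal check:

```python
import math

p = [0.25, 0.25, 0.5]   # P(X1), P(X2), P(X3)

# Y = 2X + 3 is invertible, so the distribution of Y is a relabeling
# of the distribution of X and the entropy is unchanged: H(Y) = H(X).
HY = -sum(pi * math.log2(pi) for pi in p)

print(HY)   # 1.5
```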

  6. Consider a binary symmetric channel described by the transition probabilities P(1|0) = P(0|1) = 1/2. Calculate the channel capacity of the BSC.

  A. 1
  B. 1/2
  C. 0
  D. None of the above
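The capacity of a BSC with crossover probability p is C = 1 − Hb(p), where Hb is the binary entropy function; at p = 1/2 the output is independent of the input and the capacity drops to zero. A quick sketch:

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - Hb(p)."""
    return 1 - Hb(p)

print(bsc_capacity(0.5))   # 0.0
```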

  7. A discrete memoryless source emits symbols with probabilities 0.45, 0.35, 0.20. Determine which of the following values cannot be the average codeword length of a code for this source.

  A. 2.5123
  B. 1.5123
  C. 2.0123
  D. 2.4123
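Question 7 hinges on the source-coding bounds: any uniquely decodable code needs average length L ≥ H, and Shannon (or Huffman) coding guarantees L < H + 1. Computing the entropy gives the admissible range to test the options against:

```python
import math

p = [0.45, 0.35, 0.20]

# Source entropy: lower bound on the average codeword length.
H = -sum(pi * math.log2(pi) for pi in p)   # ~ 1.5129 bits

# Shannon coding guarantees an average length below H + 1.
print(round(H, 4), round(H + 1, 4))
```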

  8. A memoryless source has the alphabet X = {-5, -3, -1, 0, 1, 3, 5} with corresponding probabilities {0.05, 0.1, 0.1, 0.15, 0.05, 0.25, 0.3}. Assuming the source is quantized according to the quantizing rule
q(-5) = q(-3) = -4
q(-1) = q(0) = q(1) = 0
q(3) = q(5) = 4
find the entropy of the quantized source.
  A. 1.5060
  B. 1.4163
  C. 1.3061
  D. 1.4060
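The quantizer merges source symbols, so the probability of each quantized level is the sum of the probabilities mapped onto it; the entropy is then computed over those merged masses. A sketch:

```python
import math
from collections import defaultdict

alphabet = [-5, -3, -1, 0, 1, 3, 5]
probs = [0.05, 0.1, 0.1, 0.15, 0.05, 0.25, 0.3]
q = {-5: -4, -3: -4, -1: 0, 0: 0, 1: 0, 3: 4, 5: 4}   # quantizing rule

# Accumulate the probability mass mapped onto each quantized level:
# p(-4) = 0.15, p(0) = 0.30, p(4) = 0.55
pq = defaultdict(float)
for x, p in zip(alphabet, probs):
    pq[q[x]] += p

Hq = -sum(p * math.log2(p) for p in pq.values())
print(round(Hq, 4))   # ~ 1.4060
```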




  9. A bandpass signal X(t) passed through a bandpass system produces the output Y(t). If a lowpass system is used instead of the bandpass system, the input X(t) and the output Y(t) must be ____________ and ____________, respectively, for the arrangement to function like the bandpass system.

  A. Lowpass signal and bandpass signal
  B. Up-converted and down-converted
  C. Down-converted and up-converted
  D. Both should be down-converted
  10. In M-PAM, how many levels are needed if b bits are transmitted per symbol?

  A. M levels
  B. 2^b - 1 levels
  C. 2^(b-1) levels
  D. M + b levels



Answers
  1. C
  2. A
  3. B
  4. C
  5. B
  6. C
  7. A
  8. D
  9. B
  10. A
