DIGITAL COMMUNICATION MCQ

University

45 Qs

Assessment

Quiz

Physics

University

Medium

Created by

Duraibabu Ayyadurai

Used 17+ times

45 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

The process of converting an analog sample into discrete form is called:
a) Modulation
b) Multiplexing
c) Quantization
d) Sampling
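
Converting each sampled amplitude to one of a finite set of discrete levels is quantization. A minimal uniform-quantizer sketch (the step size and input range below are illustrative assumptions, not part of the quiz):

```python
# Uniform quantizer: maps each analog sample to the nearest discrete level.
# Step size and full-scale range are assumed values for illustration.

def quantize(sample, step=0.25, v_max=1.0):
    """Clamp the sample to [-v_max, v_max], then snap it to the
    nearest multiple of `step` (a discrete quantization level)."""
    clamped = max(-v_max, min(v_max, sample))
    return round(clamped / step) * step

samples = [0.11, -0.52, 0.87]
levels = [quantize(s) for s in samples]   # [0.0, -0.5, 0.75]
```

Note that sampling (option d) only discretizes time; quantization discretizes the amplitude axis, which is what the question asks about.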

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

The negative statement of Shannon's theorem states that:
a) If R > C, the error probability increases towards unity
b) If R < C, the error probability is very small
c) Both a and b
d) None of the above
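
Option a) is the converse (negative) part of Shannon's channel-coding theorem: transmitting at a rate R above the channel capacity C forces the error probability toward 1, while R < C (option b) is the positive part. For an AWGN channel, C follows from the Shannon-Hartley formula; the bandwidth and SNR below are assumed numbers for illustration:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits/s: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed example: 3 kHz telephone-like channel, SNR = 1000 (30 dB).
C = awgn_capacity(3000, 1000)   # roughly 29.9 kbit/s
# R > C: error probability is bounded away from zero (tends to unity
# for long blocks); R < C: arbitrarily small error probability is achievable.
```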

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

The technique that may be used to increase the average information per bit is:
a) Shannon-Fano algorithm
b) ASK
c) FSK
d) Digital modulation techniques
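
Among the options, only the Shannon-Fano algorithm is a source-coding technique, and source coding is what raises the average information carried per transmitted bit (likely symbols get shorter codewords). A compact sketch of Shannon-Fano splitting; the symbol probabilities are made up for illustration:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability) sorted by descending
    probability.  Recursively split the list into two groups whose
    total probabilities are as close as possible, prefixing '0' to
    one group's codewords and '1' to the other's."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    best_diff, split, running = float("inf"), 1, 0.0
    for i in range(len(symbols) - 1):
        running += symbols[i][1]
        diff = abs(2 * running - total)    # imbalance if we split after i
        if diff < best_diff:
            best_diff, split = diff, i + 1
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

probs = [("A", 0.4), ("B", 0.3), ("C", 0.2), ("D", 0.1)]
code = shannon_fano(probs)                         # {'A': '0', 'B': '10', ...}
avg_len = sum(p * len(code[s]) for s, p in probs)  # 1.9 bits/symbol here
```

With equal-length coding these four symbols would cost 2 bits each; the variable-length code averages 1.9 bits, so each transmitted bit carries more information.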

4.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

For a binary symmetric channel, the random bits are given as:
a) Logic 1 given by probability P and logic 0 by (1 - P)
b) Logic 1 given by probability (1 - P) and logic 0 by P
c) Logic 1 given by probability P² and logic 0 by (1 - P)
d) Logic 1 given by probability P and logic 0 by (1 - P)²
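
In this framing the bits are emitted with probability P for logic 1 and 1 - P for logic 0 (not P² or (1 - P)², which would not even sum to one in general). A quick empirical check; the seed and sample count are assumptions for reproducibility:

```python
import random

def random_bits(p_one, n, seed=0):
    """Generate n random bits: logic 1 w.p. p_one, logic 0 w.p. 1 - p_one."""
    rng = random.Random(seed)               # fixed seed: reproducible run
    return [1 if rng.random() < p_one else 0 for _ in range(n)]

bits = random_bits(p_one=0.3, n=100_000)
freq_one = sum(bits) / len(bits)            # should be close to 0.3
```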

5.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Information rate is defined as:
a) Information per unit time
b) Average number of bits of information per second
c) rH (symbol rate r times source entropy H)
d) All of the above
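
All three options describe the same quantity: R = rH, the symbol rate r (symbols/s) multiplied by the source entropy H (bits/symbol), i.e. the average number of information bits per second. A worked example with assumed symbol probabilities and symbol rate:

```python
import math

def entropy(probs):
    """Source entropy H in bits/symbol: -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]   # assumed symbol probabilities
H = entropy(probs)          # 1.5 bits/symbol
r = 2000                    # assumed symbol rate, symbols/s
R = r * H                   # information rate: 3000 bits/s
```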

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

The relation between entropy and mutual information is:
a) I(X;Y) = H(X) - H(X|Y)
b) I(X;Y) = H(X|Y) - H(Y|X)
c) I(X;Y) = H(X) - H(Y)
d) I(X;Y) = H(Y) - H(X)
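
Identity a), I(X;Y) = H(X) - H(X|Y), is the standard relation. It can be checked numerically on any joint distribution; the 2x2 joint p(x, y) below is an assumption for illustration:

```python
import math

# Assumed joint distribution p(x, y) over X, Y in {0, 1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginals p(x) and p(y).
px = {x: sum(p for (xx, _), p in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1)}

# H(X) = -sum p(x) log2 p(x)
HX = -sum(p * math.log2(p) for p in px.values() if p > 0)
# H(X|Y) = -sum p(x,y) log2( p(x,y) / p(y) )
HX_given_Y = -sum(p * math.log2(p / py[y]) for (x, y), p in pxy.items())
# I(X;Y) computed directly from its definition
I = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# The identity: I(X;Y) == H(X) - H(X|Y)
```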

7.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A memoryless source refers to:
a) No previous information
b) No message storage
c) Emitted message is independent of previous messages
d) None of the above
