NLP-B2-W5

University

10 Qs

Similar activities

Quiz-Session8

University

10 Qs

Data Science Quizz

University

14 Qs

Data Centers and IT Operations

University

11 Qs

Recap of Sessions 18 & 19

11th Grade - University

10 Qs

Lesson 1 Instrumentation System

University

14 Qs

MPU&MCU Quiz 1

University

10 Qs

Lesson 4 Interface Requirements and Communication Standards

University

14 Qs

MWD Technology Quiz

University

10 Qs

NLP-B2-W5

Assessment

Quiz

Engineering

University

Medium

Created by

Prashanthi Prashanthi

Used 4+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In Kneser-Ney smoothing, what is the primary reason for using discounted probabilities in higher-order n-grams?

To ensure that all n-grams, including unseen ones, receive some probability mass.

To reduce computational complexity in large language models.

To penalize frequent n-grams and favor less common ones.

To normalize the probability distribution so that it sums to one.
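
For reference, one standard way to write interpolated Kneser-Ney for bigrams (following the usual textbook presentation, e.g. Jurafsky & Martin) makes the role of the discount explicit: the mass subtracted from observed bigrams is handed to a continuation distribution, so every n-gram, including unseen continuations, receives some probability.

```latex
P_{\mathrm{KN}}(w_i \mid w_{i-1})
  = \frac{\max\bigl(c(w_{i-1}, w_i) - d,\ 0\bigr)}{c(w_{i-1})}
  + \lambda(w_{i-1})\, P_{\mathrm{CONT}}(w_i)

\lambda(w_{i-1}) = \frac{d}{c(w_{i-1})}\,
  \bigl|\{\, w : c(w_{i-1}, w) > 0 \,\}\bigr|

P_{\mathrm{CONT}}(w_i) = \frac{\bigl|\{\, w' : c(w', w_i) > 0 \,\}\bigr|}
  {\bigl|\{\, (w', w) : c(w', w) > 0 \,\}\bigr|}
```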

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is true about the backoff mechanism in Kneser-Ney smoothing?

It skips lower-order n-grams entirely if a higher-order n-gram has a non-zero probability.

It backs off to lower-order n-grams by subtracting a fixed discount and redistributing the remaining probability.

It only applies to n-grams that have been observed in the training data.

It applies a constant probability mass to all lower-order n-grams irrespective of their context.

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In the backoff smoothing technique, we use bigrams if the evidence for a trigram is insufficient.

TRUE  

FALSE
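
To make the backoff idea concrete, here is a minimal sketch of a count-based backoff model in Python. It follows the simple "stupid backoff" scheme (a fixed weight, no proper discounting), not full Katz or Kneser-Ney backoff, and the toy corpus is invented purely for illustration.

```python
from collections import Counter

# Toy corpus, invented purely for illustration.
tokens = "the cat sat on the mat the cat ran".split()

unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
trigrams = Counter(zip(tokens, tokens[1:], tokens[2:]))

def backoff_score(w1, w2, w3, alpha=0.4):
    """Score w3 given (w1, w2): use the trigram if it was observed,
    otherwise back off to the bigram, then to the unigram."""
    if trigrams[(w1, w2, w3)] > 0:
        return trigrams[(w1, w2, w3)] / bigrams[(w1, w2)]
    if bigrams[(w2, w3)] > 0:
        return alpha * bigrams[(w2, w3)] / unigrams[w2]
    return alpha * alpha * unigrams[w3] / len(tokens)

print(backoff_score("the", "cat", "sat"))  # trigram seen: relative frequency
print(backoff_score("cat", "on", "the"))   # trigram unseen: backs off to bigram ("on", "the")
```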

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is a common application of the Naive Bayes classifier?

Predicting stock prices using time series analysis.

Spam detection in email filtering.

Real-time object detection in video streams.

Image classification tasks with large convolutional layers.
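
As a concrete illustration of the spam-filtering use case, here is a minimal sketch using scikit-learn's CountVectorizer and MultinomialNB (assuming scikit-learn is available); the tiny labelled dataset is made up for the example.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Made-up training set: 1 = spam, 0 = ham.
emails = [
    "win a free prize now",
    "limited offer claim your reward",
    "meeting rescheduled to monday",
    "please review the attached report",
]
labels = [1, 1, 0, 0]

# Bag-of-words features: each email becomes a vector of word counts.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)

# Multinomial Naive Bayes is a natural fit for these count features.
clf = MultinomialNB()
clf.fit(X, labels)

tests = vectorizer.transform(["claim your free prize",
                              "report for the monday meeting"])
print(clf.predict(tests))  # expected on this toy data: [1 0]
```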

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following types of data is the Naive Bayes classifier particularly well-suited for?

Data with a large number of missing values.

Data with categorical features where the independence assumption holds reasonably well.

Data with continuous features without any discretization.

Data with a high level of interaction between features.

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following statements is true about the Naive Bayes classifier when applied to text classification tasks?

It requires feature scaling to work effectively in text classification.

It cannot handle large vocabularies and is prone to overfitting.

It is particularly effective because the independence assumption is often reasonably valid in the bag-of-words model.

It is less effective than other classifiers due to its simplicity.
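
The reason the independence assumption is tolerable for text is visible in the multinomial Naive Bayes decision rule: under the bag-of-words model the class-conditional probability of a document factorizes over its words, and the product is usually computed as a sum of log probabilities to avoid numerical underflow.

```latex
\hat{c}
  = \arg\max_{c \in C} \; P(c) \prod_{i=1}^{n} P(w_i \mid c)
  = \arg\max_{c \in C} \Bigl[ \log P(c) + \sum_{i=1}^{n} \log P(w_i \mid c) \Bigr]
```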

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is the primary goal of a classification algorithm?

To assign input data to one of several predefined classes or categories.

To reduce the dimensionality of the data.

To group data points into clusters based on similarity.

To predict a continuous output variable.
