
NLP-B2-W5
Authored by Prashanthi Prashanthi
Engineering
University

10 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
In Kneser-Ney smoothing, what is the primary reason for using discounted probabilities for higher-order n-grams?
To ensure that all n-grams, including unseen ones, receive some probability mass.
To reduce computational complexity in large language models.
To penalize frequent n-grams and favor less common ones.
To normalize the probability distribution so that it sums to one.
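A minimal sketch of the discounting idea behind this question. The corpus counts and the discount value D are made-up illustrations, not from the quiz; only the highest-order discounted term is shown, without the full lower-order interpolation.

```python
from collections import Counter

# Toy bigram/unigram counts (assumptions for illustration only).
bigram_counts = Counter({
    ("the", "cat"): 3,
    ("the", "dog"): 2,
})
unigram_counts = Counter({"the": 5})

D = 0.75  # absolute discount; 0.75 is a common default in Kneser-Ney

def discounted_bigram_prob(w1, w2):
    """Discounted highest-order term: max(c(w1,w2) - D, 0) / c(w1).
    The mass removed by D is what gets redistributed, via backoff,
    to n-grams that were never observed."""
    c = bigram_counts[(w1, w2)]
    return max(c - D, 0) / unigram_counts[w1]

discounted_bigram_prob("the", "cat")   # (3 - 0.75) / 5 = 0.45
discounted_bigram_prob("the", "fish")  # 0.0 -- its share comes from backoff
```

The subtraction frees exactly D per observed bigram type; that reserved mass is what lets unseen n-grams receive a nonzero probability.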
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following is true about the backoff mechanism in Kneser-Ney smoothing?
It skips lower-order n-grams entirely if a higher-order n-gram has a non-zero probability.
It backs off to lower-order n-grams by subtracting a fixed discount and redistributing the remaining probability.
It only applies to n-grams that have been observed in the training data.
It applies a constant probability mass to all lower-order n-grams irrespective of their context.
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
In the backoff smoothing technique, we use bigrams when the evidence for the trigram is insufficient.
TRUE
FALSE
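The backoff behavior in questions 2 and 3 can be sketched as follows. The counts and the backoff weight `alpha` are assumptions for illustration; this is a simple stupid-backoff-style rule, not full Kneser-Ney.

```python
from collections import Counter

# Hypothetical counts for illustration.
trigram_counts = Counter({("i", "like", "tea"): 2})
bigram_counts = Counter({("i", "like"): 4, ("like", "tea"): 2,
                         ("like", "coffee"): 1})
unigram_counts = Counter({"like": 3})

def backoff_prob(w1, w2, w3, alpha=0.4):
    """Use the trigram estimate when the trigram was observed;
    otherwise back off to the scaled bigram estimate."""
    if trigram_counts[(w1, w2, w3)] > 0:
        return trigram_counts[(w1, w2, w3)] / bigram_counts[(w1, w2)]
    return alpha * bigram_counts[(w2, w3)] / unigram_counts[w2]

backoff_prob("i", "like", "tea")     # trigram evidence: 2/4 = 0.5
backoff_prob("i", "like", "coffee")  # unseen trigram: 0.4 * 1/3
```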
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following is a common application of the Naive Bayes classifier?
Predicting stock prices using time series analysis.
Spam detection in email filtering.
Real-time object detection in video streams.
Image classification tasks with large convolutional layers.
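Spam detection, the application asked about here, can be sketched with a tiny multinomial Naive Bayes classifier. The training messages are invented for illustration, and equal class priors are assumed so they cancel out.

```python
import math
from collections import Counter

# Made-up training data for illustration.
spam = ["win money now", "free money offer"]
ham = ["meeting at noon", "see you at lunch"]

def train(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam)
ham_counts, ham_total = train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(msg, counts, total):
    # Multinomial model with add-one (Laplace) smoothing.
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in msg.split())

def classify(msg):
    # Equal class priors assumed, so they cancel out of the comparison.
    spam_score = log_prob(msg, spam_counts, spam_total)
    ham_score = log_prob(msg, ham_counts, ham_total)
    return "spam" if spam_score > ham_score else "ham"

classify("free money")  # -> "spam"
```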
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following types of data is the Naive Bayes classifier particularly well-suited for?
Data with a large number of missing values.
Data with categorical features where the independence assumption holds reasonably well.
Data with continuous features without any discretization.
Data with a high level of interaction between features.
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following statements is true about the Naive Bayes classifier when applied to text classification tasks?
It requires feature scaling to work effectively in text classification
It cannot handle large vocabularies and is prone to overfitting
It is particularly effective because the independence assumption is often reasonably valid in the bag-of-words model
It is less effective than other classifiers due to its simplicity
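The bag-of-words model referenced in this question discards word order and keeps only per-word counts, which is exactly where the Naive Bayes independence assumption lives. A minimal sketch with made-up documents:

```python
from collections import Counter

# Each document becomes a vector of word counts; order is discarded.
docs = ["the cat sat", "the dog sat on the cat"]
vocab = sorted(set(w for d in docs for w in d.split()))

def bag_of_words(doc):
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]

# vocab is ['cat', 'dog', 'on', 'sat', 'the']
bag_of_words("the dog sat on the cat")  # -> [1, 1, 1, 1, 2]
```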
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following is the primary goal of a classification algorithm?
To assign input data to one of several predefined classes or categories
To reduce the dimensionality of the data
To group data points into clusters based on similarity
To predict a continuous output variable
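The correct option here, assigning inputs to predefined classes, can be shown with the smallest possible classifier. The class names and centroid values are hypothetical; any rule that maps an input to one of a fixed set of labels counts as classification.

```python
# Nearest-centroid rule on 1-D data: two predefined classes.
centroids = {"low": 1.0, "high": 10.0}  # hypothetical class centers

def classify_value(x):
    """Assign x to whichever predefined class center is nearest."""
    return min(centroids, key=lambda c: abs(x - centroids[c]))

classify_value(2.3)  # -> "low"
classify_value(8.7)  # -> "high"
```

Contrast with the distractors: clustering has no predefined labels, and regression predicts a continuous value rather than a class.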