NLP-2

University

20 Qs

Similar activities

Moral philosophers (University, 20 Qs)

Power Electronic TEST (University, 20 Qs)

Material Handling (University - Professional Development, 15 Qs)

Vehicle Body Material (University, 20 Qs)

LR,DT,RF & CLUSTERING (University, 20 Qs)

Chapter 2 Forecasting (University, 20 Qs)

QUIZ PRAKTIKUM PTI-2 (University, 15 Qs)

UJIAN AKHIR KEGIATAN PRAKTIKUM PTI-2 (University, 15 Qs)


Assessment • Quiz • Education • University • Medium

Created by RANGASWAMY K


20 questions


1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is an N-gram in word-level analysis?

A method for visualizing data

A contiguous sequence of N words from a text

A machine learning model for classification

A type of encryption algorithm
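The correct answer above (a contiguous sequence of N words) can be sketched in a few lines of Python. This is an illustrative toy example, not part of the quiz itself:

```python
def ngrams(tokens, n):
    """Return the list of contiguous n-word sequences in tokens."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Bigrams (N = 2) of a four-word sentence:
print(ngrams("the cat sat down".split(), 2))
# [('the', 'cat'), ('cat', 'sat'), ('sat', 'down')]
```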

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In the context of N-grams, what does an "unsmoothed N-gram" model lack?

Accurate representation of word frequencies

A mechanism to handle zero probabilities for unseen word combinations

The ability to create sequences of words

The ability to identify unique words

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is a major limitation of unsmoothed N-grams?

They are computationally efficient.

They fail to account for unseen word combinations, assigning zero probability.

They require preprocessing of data.

They always generate incorrect probabilities for frequent words.
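The zero-probability limitation is easy to reproduce with an unsmoothed (maximum-likelihood) bigram model over a toy corpus. A minimal sketch, with a made-up six-word corpus chosen only for illustration:

```python
from collections import Counter

corpus = "the cat sat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def p_mle(w2, w1):
    """Unsmoothed (maximum-likelihood) bigram probability P(w2 | w1)."""
    return bigrams[(w1, w2)] / unigrams[w1]

print(p_mle("sat", "cat"))  # 0.5: "cat sat" occurs once, "cat" twice
print(p_mle("ran", "the"))  # 0.0: "the ran" never occurs in the corpus
```

Any sentence containing an unseen bigram is assigned probability zero, which is exactly the failure smoothing is meant to fix.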

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of smoothing in N-gram models?

To improve the visualization of text

To assign non-zero probabilities to unseen word combinations

To identify the most frequent words in a text

To reduce the length of the N-grams

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is a commonly used smoothing technique?

Bagging

Laplace (Add-One) Smoothing

Principal Component Analysis (PCA)

Tokenization

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How does Laplace (Add-One) Smoothing work?

It adds a constant to all probabilities to normalize them.

It adds one to the count of each word or N-gram and recalculates probabilities.

It removes all low-frequency words from the analysis.

It reduces the size of the vocabulary.
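The add-one recipe in the correct answer can be written out directly: add one to every bigram count and add the vocabulary size V to the denominator so the probabilities still sum to one. A hedged sketch on the same style of toy corpus (not from the quiz):

```python
from collections import Counter

corpus = "the cat sat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
V = len(unigrams)  # vocabulary size

def p_laplace(w2, w1):
    """Laplace (add-one) smoothed bigram probability:
    (count(w1, w2) + 1) / (count(w1) + V)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)

print(p_laplace("ran", "the"))  # unseen bigram now gets 1/6 instead of 0
```

Note that the distribution over the whole vocabulary still sums to 1 for any context word, which is why V (and not some other constant) appears in the denominator.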

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which smoothing technique is more advanced and accounts for the probability of unseen events in N-grams?

Laplace Smoothing

Backoff Smoothing

N-gram Filtering

Token Replacement
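Backoff can also be sketched in miniature. The version below is a simplified "stupid backoff" scorer (the fixed factor 0.4 and the toy corpus are illustrative assumptions, not the quiz's definition): when a bigram was never seen, it falls back to a discounted unigram score instead of returning zero.

```python
from collections import Counter

corpus = "the cat sat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
N = len(corpus)

def score_backoff(w2, w1, alpha=0.4):
    """Use the bigram estimate when available; otherwise back off
    to the unigram relative frequency scaled by a fixed factor alpha."""
    if bigrams[(w1, w2)] > 0:
        return bigrams[(w1, w2)] / unigrams[w1]
    return alpha * unigrams[w2] / N

print(score_backoff("sat", "cat"))  # seen bigram: uses the bigram estimate
print(score_backoff("ran", "the"))  # unseen bigram: backs off to the unigram
```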
