NLP_Quiz_1

Authored by Abbas Abbasi

Instructional Technology

University

20 questions

1.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

What is the primary goal of using the bag-of-words model in text classification?

To capture semantic meaning of words

To represent text data as a matrix of word counts or frequencies

To predict the next word in a sequence

To reduce dimensionality of text data
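
The option about a matrix of word counts or frequencies describes the bag-of-words view: each document becomes a row of counts over the vocabulary, ignoring word order. A minimal sketch using scikit-learn's CountVectorizer, with a made-up two-document corpus for illustration:

```python
# Bag-of-words sketch (illustrative corpus; assumes scikit-learn is installed).
from sklearn.feature_extraction.text import CountVectorizer

corpus = [
    "the movie was great",
    "the movie was terrible",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)   # sparse documents-by-vocabulary count matrix

print(vectorizer.get_feature_names_out())  # vocabulary terms (get_feature_names() on older versions)
print(X.toarray())                         # per-document word counts
```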

2.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

Which loss function is commonly used for binary text classification tasks?

Mean Squared Error

Hinge Loss

Cross-Entropy Loss

Kullback-Leibler Divergence
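
For reference, binary cross-entropy penalizes the predicted probability according to how far it is from the 0/1 label. A small hand-rolled version with made-up numbers:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average binary cross-entropy over (label, predicted probability) pairs."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

print(binary_cross_entropy([1, 0, 1], [0.9, 0.2, 0.6]))
```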

3.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

Which of the following statements best describes word embeddings like Word2Vec?

They use one-hot encoding to represent words

They are designed to replace traditional n-gram models

They encode the grammatical structure of sentences

They map words to fixed-size vectors in a continuous vector space based on context
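
The last option describes Word2Vec: each word is mapped to a dense vector of fixed dimensionality, learned from the contexts in which the word appears. A sketch using gensim (assumes gensim 4.x; the toy sentences and parameter values are arbitrary):

```python
# Toy Word2Vec training sketch (assumes gensim >= 4.0 is installed).
from gensim.models import Word2Vec

sentences = [
    ["i", "love", "this", "movie"],
    ["i", "hate", "this", "movie"],
    ["this", "film", "was", "great"],
]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=20)

vec = model.wv["movie"]                        # fixed-size dense vector (50 dimensions here)
print(vec.shape)
print(model.wv.most_similar("movie", topn=2))  # nearest neighbours by cosine similarity
```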

4.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

In the context of word embeddings, what does the term "cosine similarity" measure?

The cosine of the angle between two word vectors in the vector space, indicating their similarity

The frequency of co-occurrence of words in a corpus

The correlation between word vectors and document frequency

The Euclidean distance between two word vectors
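
Strictly, the measure is the cosine of the angle between the two vectors, so it ranges from -1 to 1 and is largest when the vectors point in the same direction. A short NumPy sketch with arbitrary example vectors:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

king = np.array([0.8, 0.3, 0.1])   # made-up 3-d "embeddings" for illustration
queen = np.array([0.7, 0.4, 0.1])

print(cosine_similarity(king, queen))  # near 1.0 for vectors pointing the same way
```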

5.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

In text classification, what does the term "n-gram" refer to?

A matrix representation of words

A model for predicting the next word based on previous words

A sequence of 'n' contiguous words or characters

A method for dimensionality reduction
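
An n-gram is just a contiguous window of n tokens (or characters), which a simple slice can produce, as in this sketch (the sentence is arbitrary):

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams over a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "natural language processing is fun".split()
print(ngrams(tokens, 2))  # bigrams: ('natural', 'language'), ('language', 'processing'), ...
```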

6.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

Which of the following is not a common text preprocessing technique?

Stemming

Lemmatization

Stop word removal

Part-of-speech tagging
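
The first three options are normalization steps routinely applied before modeling. A sketch of them using NLTK (assumes the stopword and WordNet resources have been downloaded; the token list is made up):

```python
# Preprocessing sketch with NLTK; run nltk.download("stopwords") and
# nltk.download("wordnet") once before using it.
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer, WordNetLemmatizer

stop_words = set(stopwords.words("english"))
stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

tokens = ["the", "movies", "were", "running", "longer", "than", "expected"]

filtered = [t for t in tokens if t not in stop_words]          # stop word removal
stems = [stemmer.stem(t) for t in filtered]                    # stemming
lemmas = [lemmatizer.lemmatize(t, pos="v") for t in filtered]  # lemmatization (as verbs)

print(filtered, stems, lemmas, sep="\n")
```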

7.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

In the context of logistic regression for sentiment analysis, what does the feature X1 represent?

The frequency of a word in all tweets

The frequency of a word in negative tweets

The frequency of a word in positive tweets

The difference in frequency between positive and negative tweets
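
In the feature design this question appears to reference (a frequency-dictionary approach common in introductory sentiment-analysis courses), x1 sums how often each word of a tweet appears in positive training tweets, and x2 does the same for negative tweets. A hypothetical sketch of that extraction; the freqs dictionary and example tweet are made up:

```python
# Hypothetical feature extraction for logistic-regression sentiment analysis.
# freqs maps (word, sentiment_label) -> count in the training tweets; values are invented.
freqs = {
    ("happy", 1): 120, ("happy", 0): 5,
    ("sad", 1): 3,     ("sad", 0): 90,
    ("movie", 1): 40,  ("movie", 0): 45,
}

def extract_features(tweet_tokens, freqs):
    """Return [bias, x1, x2]: x1 = summed positive counts, x2 = summed negative counts."""
    x1 = sum(freqs.get((w, 1), 0) for w in tweet_tokens)
    x2 = sum(freqs.get((w, 0), 0) for w in tweet_tokens)
    return [1.0, float(x1), float(x2)]

print(extract_features(["happy", "movie"], freqs))  # [1.0, 160.0, 50.0]
```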
