ML B2 CH3

University

10 Qs

Similar activities

ML Unit 3 Quiz • University • 10 Qs

PNL - Topic 5. Word Embeddings and Modern Models • University • 10 Qs

M S Word Imp Quiz • University • 10 Qs

4.6 Integrated Development Environment (IDE) • 9th Grade - University • 15 Qs

Indexing and Slicing in Python • 4th Grade - University • 10 Qs

1.4 Review, navigate, and edit a document • 8th Grade - University • 6 Qs

Network and Digital Communication Quiz • 4th Grade - University • 15 Qs

1.1.2b A-Level OCR CS • University • 9 Qs

ML B2 CH3

Assessment • Quiz • Computers • University • Medium

Created by Jhonston Benjumea

Used 1+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is one key problem of statistics-based NLP techniques?
They require GPU for training
They use real-time prediction
They create huge matrices and require full-batch learning
They ignore text data

Answer explanation

Statistics-based methods create large co-occurrence matrices and need full-batch learning, which is computationally heavy.
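
As a rough, illustrative sketch (the toy corpus, window size, and variable names below are assumptions, not from the quiz material): for a vocabulary of V words, a counting-based pipeline builds a V x V co-occurrence matrix and then factorizes it with SVD over the entire matrix at once, which is exactly the full-batch, memory-heavy step the question refers to.

import numpy as np

# Hypothetical toy corpus; in real corpora V can exceed 100,000 words,
# so the V x V co-occurrence matrix becomes enormous.
corpus = "you say goodbye and i say hello".split()
vocab = sorted(set(corpus))
word_to_id = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

co_matrix = np.zeros((V, V), dtype=np.int32)
window = 1
for idx, word in enumerate(corpus):
    for offset in range(1, window + 1):
        for j in (idx - offset, idx + offset):
            if 0 <= j < len(corpus):
                co_matrix[word_to_id[word], word_to_id[corpus[j]]] += 1

# Full-batch step: SVD has to process the whole V x V matrix in one go.
U, S, Vt = np.linalg.svd(co_matrix.astype(np.float64))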

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How do inference-based techniques differ from statistics-based techniques?
They predict scores for unseen programming code
They do not need context words
They use mini-batch learning and GPU acceleration
They require a thesaurus to function

Answer explanation

Inference-based techniques learn patterns using mini-batches, allowing GPU-based parallel training.
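
A minimal sketch of mini-batch training (the model, the placeholder gradient, and the hyperparameters below are illustrative assumptions, not the quiz's exact setup): each update step touches only a small batch of examples, which is what makes GPU-parallel training practical.

import numpy as np

rng = np.random.default_rng(0)
V, D = 100, 16                                 # hypothetical vocab size and embedding dim
W = 0.01 * rng.standard_normal((V, D))         # parameters updated batch by batch

# Hypothetical (context id, target id) training pairs.
contexts = rng.integers(0, V, size=1000)
targets = rng.integers(0, V, size=1000)

batch_size, lr = 32, 0.1
for epoch in range(2):
    perm = rng.permutation(len(contexts))
    for start in range(0, len(perm) - batch_size + 1, batch_size):
        idx = perm[start:start + batch_size]
        ctx, tgt = contexts[idx], targets[idx]
        # Only this mini-batch is processed per step; on a GPU the whole
        # batch is handled in parallel instead of the full dataset.
        grad = W[ctx] - W[tgt]                 # placeholder gradient for illustration
        W[ctx] -= lr * grad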

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does a one-hot vector represent in NLP?
A gradient value
A probability distribution
A vector with one element set to 1 and others to 0
A random binary pattern

Answer explanation

One-hot vectors represent words using binary vectors with only one active position.
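
A minimal sketch (the word list and indices are illustrative): a one-hot vector has the vocabulary's length, with a single 1 at the word's index and 0 everywhere else.

import numpy as np

vocab = ["you", "say", "goodbye", "and", "i", "hello", "."]
word_to_id = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    # Binary vector: exactly one element is 1, all others are 0.
    vec = np.zeros(len(vocab), dtype=np.int32)
    vec[word_to_id[word]] = 1
    return vec

print(one_hot("goodbye"))   # [0 0 1 0 0 0 0]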

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the CBOW model in word2vec?
To convert text into audio
To predict the next sentence
To predict a target word from context words
To translate between languages

Answer explanation

CBOW (Continuous Bag of Words) predicts the center (target) word using the surrounding context.
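
A minimal sketch of the CBOW forward pass (the sizes and weight values are illustrative; the structure, averaging the context words' input vectors and scoring every vocabulary word as the candidate target, follows the standard word2vec description):

import numpy as np

V, H = 7, 3                                   # illustrative vocab size and hidden size
rng = np.random.default_rng(0)
W_in = 0.01 * rng.standard_normal((V, H))     # input-side weights
W_out = 0.01 * rng.standard_normal((H, V))    # output-side weights

def cbow_predict(context_ids):
    # Average the input vectors of the context words ...
    h = W_in[context_ids].mean(axis=0)        # shape (H,)
    # ... then score every vocabulary word as a candidate target.
    scores = h @ W_out                        # shape (V,)
    return int(np.argmax(scores))             # id of the predicted target word

print(cbow_predict([0, 2]))                   # predict the word between ids 0 and 2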

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does the output layer in CBOW represent?
Loss values
Weights only
Scores for each word in the vocabulary
Sentence embeddings

Answer explanation

The output layer produces scores for each vocabulary word; softmax is applied to get probabilities.
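
A small self-contained sketch of just the output layer (sizes and values are illustrative): the hidden vector is multiplied by the output-side weight matrix, giving exactly one raw score per vocabulary word, which softmax then turns into probabilities.

import numpy as np

V, H = 7, 3                                   # illustrative vocab size and hidden size
rng = np.random.default_rng(0)
h = rng.standard_normal(H)                    # hidden-layer vector for some context
W_out = rng.standard_normal((H, V))           # output-side weight matrix

scores = h @ W_out                            # one raw score per vocabulary word
print(scores.shape)                           # (7,) == (V,)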

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which function converts output scores into probabilities in CBOW?
ReLU
Sigmoid
Tanh
Softmax

Answer explanation

Softmax converts the output scores into a probability distribution over all vocabulary words.
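
A minimal softmax sketch (the score values are made up): subtracting the maximum before exponentiating is a common numerical-stability trick and does not change the result.

import numpy as np

def softmax(scores):
    # Exponentiate and normalize so the outputs form a probability distribution.
    exp = np.exp(scores - np.max(scores))
    return exp / exp.sum()

scores = np.array([2.0, 1.0, 0.1, -1.5])      # hypothetical output-layer scores
probs = softmax(scores)
print(probs, probs.sum())                     # non-negative values summing to 1.0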

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Why are only the input-side weights often used as the word representations in word2vec?
They require less memory
They are easier to normalize
They provide a meaningful vector representation of word semantics
They include grammatical rules

Answer explanation

The rows of the input-side weight matrix in word2vec capture word meanings and are commonly used as the word vectors in vector space models.
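
A minimal sketch (the weights here are random, so the similarity value is meaningless; with trained weights, related words end up with similar rows): each row of the input-side matrix W_in is taken as that word's vector, and vectors are compared with cosine similarity.

import numpy as np

rng = np.random.default_rng(0)
V, H = 7, 3
W_in = rng.standard_normal((V, H))            # stand-in for trained input-side weights
word_to_id = {"you": 0, "say": 1, "goodbye": 2, "and": 3, "i": 4, "hello": 5, ".": 6}

def word_vector(word):
    # The word's row in W_in serves as its distributed representation.
    return W_in[word_to_id[word]]

def cosine_similarity(a, b, eps=1e-8):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

print(cosine_similarity(word_vector("hello"), word_vector("goodbye")))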
