NLP_6_7




Assessment • Quiz • Medium

Created by Hazem Abdelazim

Used 14+ times

11 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is NOT a type of word embedding technique?

a) Word2Vec

b) GloVe

c) Bag of Words

d) Skip-gram

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the purpose of the Continuous Bag of Words (CBoW) model?

a) To predict the context words given a target word

b) To predict the target word given a set of context words

c) To generate new text based on a given set of words

d) To predict the next word given previous words
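The behavior described in option b) can be sketched numerically. This is a minimal CBoW forward pass with a hypothetical toy vocabulary and dimensions (all names and sizes here are illustrative, not from the quiz): the context-word embeddings are averaged, and a softmax over the result scores every vocabulary word as the candidate target.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary (illustrative)
V, D = len(vocab), 4                        # vocabulary size, embedding dimension

E = rng.normal(size=(V, D))   # input embedding matrix: one row per word
W = rng.normal(size=(D, V))   # output projection from hidden state to vocabulary scores

def cbow_forward(context_ids):
    """Predict a target word from the average of its context embeddings."""
    h = E[context_ids].mean(axis=0)      # averaged context vector, shape (D,)
    logits = h @ W                       # one score per vocabulary word
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()               # probability distribution over the vocabulary

probs = cbow_forward([0, 2, 3, 4])  # context: "the", "sat", "on", "mat"
print(probs.shape)                  # one probability per vocabulary word
```

Training would then push the probability of the true target word (here, "cat") toward 1 via cross-entropy loss; only the forward direction is shown.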

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the difference between static embeddings and contextualized embeddings?

a) Static embeddings are trained on a specific task, while contextualized embeddings are trained on a general task

b) Static embeddings are the same as binary bag of words

c) Static embeddings are fixed for all instances of a word, while contextualized embeddings vary depending on the context of the word

d) Static embeddings are generated using CBoW, while contextualized embeddings are based on skip-grams

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

What is the purpose of word embeddings?

a) To capture the meaning of words in a numeric format

b) To create hand-crafted features for machine learning models

c) To collect specially-designed data for machine learning models

d) To compress a high-dimensional sparse representation of words into a compact form
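The compression in option d) can be made concrete. In this sketch (toy sizes, chosen for illustration), a word's sparse one-hot vector has one dimension per vocabulary entry, while looking up its row in an embedding table yields a much smaller dense vector carrying the same identity.

```python
import numpy as np

V, D = 10_000, 50  # toy vocabulary size vs. embedding dimension
E = np.random.default_rng(1).normal(size=(V, D))  # embedding table

word_id = 4242
one_hot = np.zeros(V)    # sparse representation: 10,000 dims, a single non-zero
one_hot[word_id] = 1.0

dense = one_hot @ E      # multiplying by the table is just a row lookup

print(one_hot.size, "->", dense.size)  # 10000 -> 50
```

The matrix product and the direct lookup `E[word_id]` give the identical 50-dimensional vector, which is why embedding layers are implemented as lookups rather than matrix multiplications.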

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the advantage of language models over most other machine learning models?

a) They need hand-crafted features and specially-collected data

b) They can be trained on running text in a self-supervised manner

c) They require a small corpus of text data

d) They are much smarter

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What type of loss function is used in both CBoW and Skip-Gram models?

A) Sigmoid

B) They exclusively use a mean squared error loss function.

C) The loss functions prioritize syntactic accuracy over semantic meaning.

D) They employ softmax functions.

E) Loss functions in these models are irrelevant as long as the embeddings are accurate.

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the dimension of the softmax layer in the CBoW model?

a) One, as this is a binary classification

b) The vocabulary size

c) Equal to the embedding dimension used

d) The same as the dimension of the weight matrix
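The key observation behind this question is that the output layer must score every word in the vocabulary, so the softmax has one unit per vocabulary entry regardless of the embedding dimension. A quick check with hypothetical toy sizes:

```python
import numpy as np

V, D = 5_000, 100  # toy vocabulary size and embedding dimension (illustrative)
W_out = np.random.default_rng(2).normal(size=(D, V))  # hidden state -> vocabulary scores

h = np.random.default_rng(3).normal(size=D)  # averaged context vector (CBoW hidden state)
logits = h @ W_out
softmax = np.exp(logits - logits.max())      # stable softmax over the vocabulary
softmax /= softmax.sum()

print(len(softmax))  # 5000 outputs: one per vocabulary word, not one per embedding dim
```

Changing `D` would change the embedding table and hidden state, but the softmax layer stays at size `V`, which is why the vocabulary size is the answer.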
