
NLP_6_7
Authored by Hazem Abdelazim

11 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which of the following is NOT a type of word embedding technique?
d) Skip-gram
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the purpose of the Continuous Bag of Words (CBoW) model?
To predict the next word given previous words
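A minimal sketch of the CBoW forward pass may help here: the context words' input embeddings are averaged, then projected onto the vocabulary to score the missing center word. All matrices below are random placeholders, and the vocabulary, embedding size, and word list are hypothetical, not from any trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
V, D = len(vocab), 8                     # vocabulary size, embedding dimension
W_in = rng.standard_normal((V, D))       # input (context) embeddings
W_out = rng.standard_normal((D, V))      # output projection

def cbow_scores(context_ids):
    h = W_in[context_ids].mean(axis=0)   # average the context embeddings
    logits = h @ W_out                   # one score per vocabulary word
    e = np.exp(logits - logits.max())
    return e / e.sum()                   # softmax over the whole vocabulary

probs = cbow_scores([0, 1, 3, 4])        # context: "the cat ... on mat"
print(probs.shape)                       # (V,): a distribution over the vocab
```

Training would adjust `W_in` and `W_out` so the center word gets high probability; this sketch only shows the shape of the computation.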
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the difference between static embeddings and contextualized embeddings?
b) Static embeddings are the same as binary bag of words
Static embeddings are generated using CBoW while contextualized embeddings are based on Skip-gram
4.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
What is the purpose of word embeddings?
To create hand-crafted features for machine learning models
To collect specially-designed data for machine learning models
To compress high-dimensional sparse representations of words into a compact form
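The "compact form" option above can be illustrated with a quick size comparison: a one-hot vector has one slot per vocabulary word (sparse and high-dimensional), while an embedding packs the same word into a short dense vector. The sizes below are hypothetical and the lookup table is untrained.

```python
import numpy as np

V, D = 50_000, 300                       # vocab size vs. embedding dimension
one_hot = np.zeros(V)
one_hot[123] = 1                         # sparse representation of one word
embedding_table = np.zeros((V, D))       # dense lookup table (untrained here)
dense = embedding_table[123]             # the word's compact embedding

print(one_hot.size, dense.size)          # 50000 vs. 300 numbers per word
```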
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the advantage of language models over most other machine learning models?
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What type of loss function is used in both the CBoW and Skip-gram models?
d) They employ softmax functions
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the dimension of the softmax layer in the CBoW model?
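A small sketch for this question: in CBoW the softmax layer has one unit per vocabulary word, so its dimension equals the vocabulary size, not the embedding dimension. The numbers below are made up for illustration.

```python
import numpy as np

V, D = 10, 4                              # hypothetical vocab and embedding sizes
hidden = np.ones(D)                       # averaged context embedding
W_out = np.full((D, V), 0.1)              # projection to vocabulary scores
logits = hidden @ W_out
softmax = np.exp(logits) / np.exp(logits).sum()
print(len(softmax))                       # V: one probability per vocab word
```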