C5M2

University

10 Qs

Similar activities

ISP and Data Packets Part 2
University • 15 Qs

IATB - Quiz 1-1
University • 10 Qs

Sumatif 2-TIK Kelas 8
8th Grade - University • 15 Qs

ComNet Lecture 1-1
University • 12 Qs

Contpaqi Contabilidad Sesión 5 ultima
University • 12 Qs

REMID PTS AK1
10th Grade - University • 15 Qs

Microsoft Word 2019 Advanced – Unit 1 (Review of Basic Concepts)
10th Grade - University • 15 Qs

Back to School – Brain Recall Quiz (Class X)
10th Grade - University • 10 Qs

C5M2

Assessment • Quiz

Information Technology (IT)

University

Practice Problem • Medium

Created by Abylai Aitzhanuly

Used 1+ times

10 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Suppose you learn a word embedding for a vocabulary of 10000 words. Then the embedding vectors should be 10000 dimensional, so as to capture the full range of variation and meaning in those words.

TRUE

FALSE
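
In practice, embedding vectors are far lower-dimensional than the vocabulary (commonly 50-1000 dimensions). A minimal NumPy sketch of the typical shapes; the 300-dimension choice and the random initialization are illustrative assumptions, not part of the quiz:

```python
import numpy as np

# A typical embedding matrix: |V| = 10,000 words, but each word is a
# dense vector of only a few hundred dimensions (50-1000 in practice).
vocab_size = 10_000
embedding_dim = 300  # illustrative; chosen much smaller than vocab_size

rng = np.random.default_rng(0)
E = rng.standard_normal((vocab_size, embedding_dim)) * 0.01

print(E.shape)  # (10000, 300), not (10000, 10000)
```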

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is t-SNE?

A linear transformation that allows us to solve analogies on word vectors

A non-linear dimensionality reduction technique

A supervised learning algorithm for learning word embeddings

An open-source sequence modeling library
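
For reference, t-SNE is a non-linear dimensionality reduction technique; a minimal scikit-learn sketch of its typical use on word vectors follows (the random 300-d vectors stand in for learned embeddings):

```python
import numpy as np
from sklearn.manifold import TSNE

# t-SNE: non-linear dimensionality reduction, commonly used to project
# high-dimensional word vectors down to 2-D for visualization.
rng = np.random.default_rng(0)
word_vectors = rng.standard_normal((500, 300))  # stand-in for real embeddings

points_2d = TSNE(n_components=2, perplexity=30.0,
                 random_state=0).fit_transform(word_vectors)
print(points_2d.shape)  # (500, 2)
```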

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

(Statement shown as an image in the original quiz; the image is not recoverable from this copy.)

TRUE

FALSE

4.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

Which of these equations do you think should hold for a good word embedding? (Check all that apply)

e_boy - e_girl ≈ e_brother - e_sister

e_boy - e_girl ≈ e_sister - e_brother

e_boy - e_brother ≈ e_girl - e_sister

e_boy - e_brother ≈ e_sister - e_girl
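
One way to check such equations is to compare the difference vectors directly. A toy sketch with hand-set, hypothetical 3-d vectors; real embeddings are learned, and these values only illustrate the geometry:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Hypothetical toy vectors: axis 0 ~ "sibling-ness", axis 1 ~ gender.
e = {
    "boy":     np.array([ 1.0,  1.0, 0.2]),
    "girl":    np.array([ 1.0, -1.0, 0.2]),
    "brother": np.array([-1.0,  1.0, 0.2]),
    "sister":  np.array([-1.0, -1.0, 0.2]),
}

# Holds: both differences isolate the same gender direction.
print(cosine(e["boy"] - e["girl"], e["brother"] - e["sister"]))  #  1.0
# Does not hold: the right-hand side points the opposite way.
print(cosine(e["boy"] - e["girl"], e["sister"] - e["brother"]))  # -1.0
```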

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Let E be an embedding matrix, and let o_1234 be a one-hot vector corresponding to word 1234. Then to get the embedding of word 1234, why don't we call E * o_1234 in Python?

It is computationally wasteful.

The correct formula is E^T * o_1234.

This doesn't handle unknown words (<UNK>).

None of the above: calling the Python snippet as described above is fine.
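
The cost difference behind the first option is easy to demonstrate: multiplying by a one-hot vector does work proportional to the entire matrix just to select one column. A NumPy sketch, assuming the question's convention that E has shape (embedding_dim, vocab_size):

```python
import numpy as np

embedding_dim, vocab_size = 300, 10_000
E = np.random.default_rng(0).standard_normal((embedding_dim, vocab_size))

o_1234 = np.zeros(vocab_size)
o_1234[1234] = 1.0

via_matmul = E @ o_1234   # O(embedding_dim * vocab_size) multiply-adds
via_lookup = E[:, 1234]   # O(embedding_dim): just read column 1234

print(np.allclose(via_matmul, via_lookup))  # True: same vector, far less work
```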

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

When learning word embeddings, we create an artificial task of estimating P(target∣context). It is okay if we do poorly on this artificial prediction task; the more important by-product of this task is that we learn a useful set of word embeddings.

TRUE

FALSE
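
For context, the artificial task is usually a softmax classifier for P(target | context). A rough sketch of that probability; the shapes and variable names here are assumptions for illustration, not the quiz's own code:

```python
import numpy as np

def p_target_given_context(theta, e_c):
    """Softmax P(target | context) over the whole vocabulary.

    theta: (vocab_size, embedding_dim) output-word parameters.
    e_c:   (embedding_dim,) embedding of the context word.
    """
    logits = theta @ e_c
    logits -= logits.max()  # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum()

rng = np.random.default_rng(0)
probs = p_target_given_context(rng.standard_normal((10_000, 300)),
                               rng.standard_normal(300))
print(probs.sum())  # 1.0; the classifier can be weak, the embeddings still useful
```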

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In the word2vec algorithm, you estimate P(t∣c), where t is the target word and c is a context word. How are t and c chosen from the training set? Pick the best answer.

c is a sequence of several words immediately before t.

c is the one word that comes immediately before t.

c and t are chosen to be nearby words.

c is the sequence of all the words in the sentence before t.
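
A small sketch of the "nearby words" idea as used in skip-gram word2vec pair sampling; the window size and helper name are illustrative:

```python
# Skip-gram pair sampling: the target t is drawn from a window of
# +/- window_size words around the context word c.
def sample_context_target_pairs(tokens, window_size=5):
    pairs = []
    for i, c in enumerate(tokens):
        lo = max(0, i - window_size)
        hi = min(len(tokens), i + window_size + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((c, tokens[j]))  # (context, target)
    return pairs

sentence = "the quick brown fox jumps over the lazy dog".split()
print(sample_context_target_pairs(sentence, window_size=2)[:5])
```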

The remaining questions (8-10) are accessible only by creating a free account.
