NLP310c quiz4

University

20 Qs

Similar activities

- A1 - Quizs - Bloque 2 · University · 20 Qs
- Model Pembelajaran dalam Pendekatan Saintifik (Learning Models in the Scientific Approach) · University - Professional Development · 15 Qs
- FINAL ENGLISH SUBJECT · University · 18 Qs
- TECHNOP- BATTLE · University · 20 Qs
- NLP301c quiz6 · University · 20 Qs
- Why Were Light Novel Fans Furious at Classroom of the Elite? · University · 24 Qs
- Practicum Context · University · 15 Qs
- Translation Theory - Chapter 5 Quiz · University · 21 Qs

NLP310c quiz4

Quiz · World Languages · University · Easy

Created by Nguyen Duc Chinh - HE150974

20 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which is not a type of deixis?

Simple

Person

Spatial

Temporal

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Parsing determines parse trees (grammatical analysis) for a given sentence.

True

False
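To make the statement above concrete, here is a toy recursive-descent parser that builds a parse tree for a fixed sentence; the grammar, lexicon, and sentence are invented purely for illustration, not taken from the course.

```python
# Toy grammar-based parsing: builds a parse tree (grammatical analysis)
# for one sentence. Grammar and lexicon are illustrative only.

GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V"]],
}
LEXICON = {"the": "Det", "dog": "N", "barks": "V"}

def parse(symbol, tokens, pos):
    """Recursive-descent parse; returns (tree, next_pos) or None."""
    # Terminal: match a word whose lexical category equals `symbol`.
    if symbol not in GRAMMAR:
        if pos < len(tokens) and LEXICON.get(tokens[pos]) == symbol:
            return (symbol, tokens[pos]), pos + 1
        return None
    # Non-terminal: try each production left to right.
    for production in GRAMMAR[symbol]:
        children, cur = [], pos
        for child_sym in production:
            result = parse(child_sym, tokens, cur)
            if result is None:
                break
            subtree, cur = result
            children.append(subtree)
        else:
            return (symbol, children), cur
    return None

tree, end = parse("S", ["the", "dog", "barks"], 0)
print(tree)
# ('S', [('NP', [('Det', 'the'), ('N', 'dog')]), ('VP', [('V', 'barks')])])
```

The nested tuples mirror the parse tree: S dominates an NP and a VP, each dominating its lexical children.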

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following NLP tasks use the sequential labeling technique?

POS tagging

Named Entity Recognition

Speech recognition

All of the above
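Sequential labeling assigns one label per token while scoring whole label sequences. A minimal sketch is Viterbi decoding over a hand-written HMM for POS tagging; all probabilities below are made up for illustration.

```python
# Minimal Viterbi decoder for sequence labeling (here, POS tagging).
# Transition/emission probabilities are invented for illustration.
TAGS = ["DET", "NOUN", "VERB"]
start = {"DET": 0.6, "NOUN": 0.3, "VERB": 0.1}
trans = {
    "DET":  {"DET": 0.05, "NOUN": 0.9, "VERB": 0.05},
    "NOUN": {"DET": 0.1,  "NOUN": 0.2, "VERB": 0.7},
    "VERB": {"DET": 0.5,  "NOUN": 0.4, "VERB": 0.1},
}
emit = {
    "DET":  {"the": 0.9, "dog": 0.0, "barks": 0.0},
    "NOUN": {"the": 0.0, "dog": 0.8, "barks": 0.1},
    "VERB": {"the": 0.0, "dog": 0.1, "barks": 0.8},
}

def viterbi(words):
    """Return the most probable tag sequence for `words`."""
    # best[t] = (prob, path) of the best partial sequence ending in tag t
    best = {t: (start[t] * emit[t][words[0]], [t]) for t in TAGS}
    for w in words[1:]:
        best = {
            t: max(
                (best[p][0] * trans[p][t] * emit[t][w], best[p][1] + [t])
                for p in TAGS
            )
            for t in TAGS
        }
    return max(best.values())[1]

print(viterbi(["the", "dog", "barks"]))  # ['DET', 'NOUN', 'VERB']
```

The same decode-over-sequences idea underlies POS tagging, named entity recognition, and the acoustic decoding step of speech recognition, which is why "All of the above" fits.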

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In lexical semantics, we study _________

Multiple words

Groups of sentences

Individual words

All of the above

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What are the examples of Non-classical IR models?

Information logic model

Situation theory model

Interaction models

All of the above

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following is not a problem when using Maximum Likelihood Estimation to obtain the parameters in a language model?

Out-of-vocabulary items

Over-fitting

Smoothing

Unreliable estimates when there is little training data
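The contrast behind this question can be shown with a tiny bigram model: raw MLE assigns zero probability to unseen bigrams (the out-of-vocabulary/over-fitting problems), while smoothing is the *fix*, not a problem. The corpus below is invented for illustration.

```python
from collections import Counter

# Bigram language model: raw MLE vs. add-one (Laplace) smoothing.
# The tiny corpus is illustrative only.
corpus = "the cat sat on the mat the cat ran".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])   # counts of bigram left-hand contexts
vocab = set(corpus)

def p_mle(w_prev, w):
    # Unsmoothed MLE: zero for any unseen bigram.
    return bigrams[(w_prev, w)] / unigrams[w_prev]

def p_laplace(w_prev, w):
    # Add-one smoothing reserves probability mass for unseen bigrams.
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + len(vocab))

print(p_mle("cat", "sat"))      # 0.5   (seen bigram)
print(p_mle("cat", "mat"))      # 0.0   (unseen bigram)
print(p_laplace("cat", "mat"))  # 0.125 (unseen but nonzero)
```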

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following statements is (are) true for the Word2Vec model?

The architecture of word2vec consists of only two layers - continuous bag of words and skip-gram model

Continuous bag of words (CBOW) is a Recurrent Neural Network model

Both CBOW and Skip-gram are shallow neural network models

All of the above
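"Shallow" here means CBOW has no recurrence and no deep stack: it just averages context embeddings and applies one linear projection plus a softmax. The forward pass below is a sketch with arbitrary sizes and random weights, not Word2Vec's actual training code.

```python
import math
import random

# CBOW as a shallow network: average the context word embeddings
# (input layer), then one linear projection + softmax over the vocab.
# Sizes and weights are illustrative only.
random.seed(0)
V, D = 10, 4  # vocabulary size, embedding dimension
W_in = [[random.gauss(0, 1) for _ in range(D)] for _ in range(V)]   # context embeddings
W_out = [[random.gauss(0, 1) for _ in range(V)] for _ in range(D)]  # output projection

def cbow_forward(context_ids):
    """Predict a distribution over the vocabulary from context word ids."""
    hidden = [sum(W_in[i][d] for i in context_ids) / len(context_ids)
              for d in range(D)]                        # averaged context
    scores = [sum(hidden[d] * W_out[d][v] for d in range(D)) for v in range(V)]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]            # stable softmax
    total = sum(exps)
    return [e / total for e in exps]

probs = cbow_forward([1, 3, 5, 7])
print(len(probs), round(sum(probs), 6))  # 10 1.0
```

Skip-gram is the mirror image (predict context words from the center word), and it is equally shallow, which is why "Both CBOW and Skip-gram are shallow neural network models" is the true statement.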
