PMLE - 6

Professional Development

6 Qs

Assessment

Quiz

Science

Hard

Created by Supratim Bhattacharya

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

What are the major stages of an end-to-end workflow to build an NLP project with Vertex AI?

Data preparation, model training, and model serving

Model deployment, model monitoring, and model serving

Model training, model evaluation, and model deployment

Dataset upload, feature engineering, and model training

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

What is the difference between continuous bag-of-words (CBOW) and skip-gram, the two primary techniques of word2vec?

CBOW uses a center word to predict surrounding words, whereas skip-gram uses surrounding words to predict a center word.

CBOW uses surrounding words to predict a center word, whereas skip-gram uses a center word to predict surrounding words.

CBOW uses previous words to predict the next word, whereas skip-gram uses the next word to predict previous words.

CBOW uses the next word to predict previous words, whereas skip-gram uses previous words to predict the next word.
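The CBOW/skip-gram contrast above can be sketched with a toy training-pair generator. This is a hypothetical helper for illustration, not the word2vec implementation; the tokenization and window size are made up:

```python
def training_pairs(tokens, window=1, mode="skip-gram"):
    """Yield (input, target) pairs for a toy word2vec-style setup."""
    pairs = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if mode == "cbow":
            # CBOW: surrounding words predict the center word.
            pairs.append((tuple(context), center))
        else:
            # Skip-gram: the center word predicts each surrounding word.
            pairs.extend((center, c) for c in context)
    return pairs

tokens = "the cat sat".split()
print(training_pairs(tokens, mode="cbow"))
# [(('cat',), 'the'), (('the', 'sat'), 'cat'), (('cat',), 'sat')]
print(training_pairs(tokens, mode="skip-gram"))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

Note how the same window produces context→center pairs for CBOW and center→context pairs for skip-gram.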

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

What are the benefits of using word embedding (such as word2vec) compared to basic vectorization (such as one-hot encoding) when you convert text to vectors?

Compared to basic vectorization, which converts text to vectors without semantic meaning, word embeddings represent words in a vector space where the distance between them indicates semantic similarity and difference.

All of the above.

Compared to basic vectorization, which converts text to sparse vectors, word embedding converts text to dense vectors.

You can use pre-trained word embeddings to represent text.
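The sparse-versus-dense contrast above can be illustrated with a small NumPy sketch. The dense embedding values below are made up for illustration, not trained word2vec output:

```python
import numpy as np

vocab = ["king", "queen", "apple"]
one_hot = np.eye(len(vocab))  # sparse: a single 1 per row, zeros elsewhere

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Any two distinct one-hot vectors are orthogonal: similarity is always 0.
print(cosine(one_hot[0], one_hot[1]))   # 0.0

# Made-up dense embeddings: related words placed close together.
embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}
# Distance in the dense space now reflects (assumed) semantic similarity.
print(cosine(embeddings["king"], embeddings["queen"]) >
      cosine(embeddings["king"], embeddings["apple"]))   # True
```

One-hot vectors say nothing about meaning; dense vectors let "king" sit nearer "queen" than "apple".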

4.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

What are the major gates in a standard LSTM (long short-term memory) cell?

A standard LSTM cell includes three gates: the forget gate to forget irrelevant information, the input gate to remember relevant information, and the update gate to update new information.

A standard LSTM cell includes three gates: the input gate to input information, the hidden gate to remember information, and the output gate to output information.

A standard LSTM cell includes two gates: the remember gate to remember relevant information and the forget gate to forget irrelevant information.

A standard LSTM cell includes two gates: the input gate to input information and the output gate to output information.
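A single step of the widely used three-gate LSTM formulation (forget, input, output) can be sketched in NumPy. The weights here are random placeholders, not trained values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One weight matrix per gate plus the candidate cell state, acting on [h, x].
W_f, W_i, W_o, W_c = (rng.normal(size=(n_hid, n_hid + n_in)) for _ in range(4))

def lstm_step(x, h, c):
    z = np.concatenate([h, x])
    f = sigmoid(W_f @ z)               # forget gate: what to drop from c
    i = sigmoid(W_i @ z)               # input gate: what new info to store
    o = sigmoid(W_o @ z)               # output gate: what to expose as h
    c_new = f * c + i * np.tanh(W_c @ z)
    h_new = o * np.tanh(c_new)
    return h_new, c_new

h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c)
print(h.shape, c.shape)   # (3,) (3,)
```

Each gate is a sigmoid over the concatenated hidden state and input, so its values in (0, 1) act as soft masks on the cell state.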

5.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

What is the problem that an encoder-decoder mainly solves?

None of the above.

One-to-sequence problems such as image captioning, where you generate a few sentences based on one image

Sequence-to-sequence problems such as machine translation, where you translate sentences into another language

Sequence-to-one problems such as email spam detection, where you use a sequence of text to predict whether an email is spam
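The sequence-to-sequence framing can be sketched with a minimal RNN encoder-decoder: the encoder compresses a variable-length input into a fixed context vector, and the decoder unrolls that vector into an output sequence whose length need not match the input's. Weights are random placeholders and the "translation" is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # hidden/context size

W_enc = rng.normal(size=(d, d + d))   # simple RNN cell for the encoder
W_dec = rng.normal(size=(d, d + d))   # simple RNN cell for the decoder

def rnn_step(W, x, h):
    return np.tanh(W @ np.concatenate([x, h]))

def encode(inputs):
    h = np.zeros(d)
    for x in inputs:                   # variable-length input sequence
        h = rnn_step(W_enc, x, h)
    return h                           # fixed-size context vector

def decode(context, out_len):
    h, x = context, np.zeros(d)
    outputs = []
    for _ in range(out_len):           # output length is independent of input
        h = rnn_step(W_dec, x, h)
        outputs.append(h)
        x = h                          # feed the previous output back in
    return outputs

src = [rng.normal(size=d) for _ in range(5)]   # 5-token "source sentence"
out = decode(encode(src), out_len=3)           # 3-token "translation"
print(len(out), out[0].shape)   # 3 (4,)
```

The fixed context vector is the bottleneck that later attention mechanisms were introduced to relax.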

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

How does end-to-end MLOps help ML practitioners with the machine learning life cycle?

End-to-end MLOps helps ML practitioners efficiently and responsibly manage, monitor, govern, and explain ML projects throughout the entire development lifecycle.

End-to-end MLOps lets ML practitioners only train and tune ML models.

End-to-end MLOps lets ML practitioners only perform exploratory data analysis (EDA) and prototyping.

End-to-end MLOps lets ML practitioners only monitor ML models.