Data Science Model Deployments and Cloud Computing on GCP - Lab - Final Pipeline Visualization Using Vertex UI and Walkt

Assessment

Interactive Video

Information Technology (IT), Architecture, Social Studies

University

Hard

Created by

Quizizz Content


The video tutorial walks through the successful execution of a three-stage pipeline, which took about 15 minutes to run. It covers the deployment process and the importance of selecting the correct region. The tutorial then explores the pipeline root folder, detailing the contents of each step: data retrieval, training, and deployment, and notes the executor output files and other artifacts produced along the way. The video concludes by introducing model evaluation, emphasizing its role in deciding whether a trained model should be deployed, which will be explored in the next video.
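The pipeline root layout described above can be sketched with a small stdlib-only helper. This is an illustrative sketch, not the tutorial's actual code: the bucket name, step directory names, and artifact filenames below are assumptions standing in for whatever the lab's pipeline actually produced (Vertex Pipelines writes one executor output JSON per component execution under the pipeline root).

```python
# Illustrative sketch of a Vertex Pipelines root-folder layout.
# PIPELINE_ROOT and the step/artifact names are hypothetical examples.
PIPELINE_ROOT = "gs://my-bucket/pipeline_root"

# The three workflow steps mentioned in the video, plus the artifacts
# each one is described as leaving behind (names are illustrative).
STEP_ARTIFACTS = {
    "get-cc-fraud-data": ["train_df.csv", "test_df.csv", "executor_output.json"],
    "train-model": ["model", "executor_output.json"],
    "deploy-model": ["executor_output.json"],
}

def artifact_uris(root: str) -> list[str]:
    """Build the GCS URIs where each step's artifacts would live
    under the pipeline root."""
    return [
        f"{root}/{step}/{artifact}"
        for step, artifacts in STEP_ARTIFACTS.items()
        for artifact in artifacts
    ]

for uri in artifact_uris(PIPELINE_ROOT):
    print(uri)
```

Listing the URIs this way mirrors what the tutorial shows when browsing the pipeline root in the Cloud Storage UI: one directory per step, each containing its output artifacts and an executor output JSON file.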

5 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How long did the pipeline execution take?

10 minutes

15 minutes

20 minutes

25 minutes

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What does the status indicate after clicking on the endpoint?

Ready

Not Ready

In Progress

Failed

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What are the three steps or components in the ML workflow mentioned?

Data Collection, Training, Deployment

Data Cleaning, Training, Evaluation

Data Collection, Evaluation, Deployment

Training, Testing, Deployment

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What files can be found in the 'Get CC fraud data' directory?

Test dataframe, Train dataframe, Executor output JSON file

Model file, Output artifact, Executor output JSON file

Data file, Configuration file, Log file

None of the above

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the significance of model evaluation in the workflow?

It ensures the model is trained correctly

It determines if the model should be deployed

It is not necessary

It only checks the data quality