Vertex AI Pipelines V1

12th Grade · 10 Qs

Similar activities

Intro to Drones · 9th - 12th Grade · 14 Qs
Arduino · 12th Grade · 10 Qs
Tools y preguntas específicas · 12th Grade · 10 Qs
1.1.1c/d - CPU Performance & Pipelining · 12th Grade · 13 Qs
Pipelining in Processors · 9th - 12th Grade · 8 Qs
Artificial Intelligence · 8th - 12th Grade · 15 Qs
Instruction Level Parallelism · 12th Grade · 10 Qs
Software Design and Development Quick Quiz 3 · 11th - 12th Grade · 10 Qs

Vertex AI Pipelines V1

Assessment · Quiz · Computers · 12th Grade · Hard

Created by Academia Google · Used 9+ times

10 questions


1.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

What should you do?

Comment out the part of the pipeline that you are not currently updating.

Enable caching in all the steps of the Kubeflow pipeline.

Delegate feature engineering to BigQuery and remove it from the pipeline.

Add a GPU to the model training step.
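
A minimal sketch of the "enable caching" option, assuming the scenario is about iterating quickly on a Kubeflow pipeline run on Vertex AI Pipelines; the project, region, and template path below are placeholders, not values from the quiz:

    # Submit a compiled Kubeflow pipeline with execution caching enabled,
    # so steps whose inputs have not changed are skipped on reruns.
    from google.cloud import aiplatform

    aiplatform.init(project="PROJECT_ID", location="us-central1")  # placeholders

    job = aiplatform.PipelineJob(
        display_name="dev-iteration-run",
        template_path="gs://my-bucket/pipeline_spec.json",  # compiled spec
        enable_caching=True,  # reuse cached step results across runs
    )
    job.run()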

2.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

What should you do?

Create a pipeline in Vertex AI Pipelines. Configure the first step to compare the contents of the bucket to the last time the pipeline was run. Use the scheduler API to run the pipeline periodically.

Create a Cloud Function that uses a Cloud Storage trigger and deploys a Cloud Composer directed acyclic graph (DAG).

Create a pipeline in Vertex AI Pipelines. Create a Cloud Function that uses a Cloud Storage trigger and deploys the pipeline.

Deploy a Cloud Composer directed acyclic graph (DAG) with a GCSObjectUpdateSensor class that detects when a new file is added to the Cloud Storage bucket.
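
A minimal sketch of the "Cloud Function with a Cloud Storage trigger" option, assuming a 1st-gen Python Cloud Function and a pipeline spec already compiled to Cloud Storage; all names, paths, and the input_file pipeline parameter are assumptions:

    # Entry point fired by a Cloud Storage "finalize" event; it submits one
    # Vertex AI pipeline run for each new file that lands in the bucket.
    from google.cloud import aiplatform

    def trigger_pipeline(event, context):
        aiplatform.init(project="PROJECT_ID", location="us-central1")
        job = aiplatform.PipelineJob(
            display_name="gcs-triggered-run",
            template_path="gs://my-bucket/pipeline_spec.json",
            # "input_file" is a hypothetical parameter of the pipeline.
            parameter_values={
                "input_file": f"gs://{event['bucket']}/{event['name']}"
            },
        )
        job.submit()  # non-blocking, so the function returns quickly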

3.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

What should you do?

Load the images directly into the Vertex AI compute nodes by using Cloud Storage FUSE. Read the images by using the tf.data.Dataset.from_tensor_slices function.

Create a Vertex AI managed dataset from your image data. Access the AIP_TRAINING_DATA_URI environment variable to read the images by using the tf.data.Dataset.list_files function.

Convert the images to TFRecords and store them in a Cloud Storage bucket. Read the TFRecords by using the tf.data.TFRecordDataset function.

Store the URLs of the images in a CSV file. Read the file by using the tf.data.experimental.CsvDataset function.
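
A minimal sketch of the TFRecord option, assuming JPEG-encoded images serialized as tf.train.Example records in a Cloud Storage bucket; paths and feature names are illustrative:

    import tensorflow as tf

    # Stream TFRecord shards directly from Cloud Storage.
    files = tf.data.Dataset.list_files("gs://my-bucket/images/*.tfrecord")
    raw = tf.data.TFRecordDataset(files, num_parallel_reads=tf.data.AUTOTUNE)

    feature_spec = {
        "image": tf.io.FixedLenFeature([], tf.string),  # JPEG bytes
        "label": tf.io.FixedLenFeature([], tf.int64),
    }

    def parse(record):
        example = tf.io.parse_single_example(record, feature_spec)
        image = tf.io.decode_jpeg(example["image"], channels=3)
        return image, example["label"]

    dataset = raw.map(parse, num_parallel_calls=tf.data.AUTOTUNE).batch(32)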

4.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

What should you do?

[Answer options shown as images; text not recoverable]

5.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

What should you do?

Use the Apache Airflow SDK to create multiple operators that use Dataflow and Vertex AI services. Deploy the workflow on Cloud Composer.

Use the MLFlow SDK and deploy it on a Google Kubernetes Engine cluster. Create multiple components that use Dataflow and Vertex AI services.

Use the Kubeflow Pipelines (KFP) SDK to create multiple components that use Dataflow and Vertex AI services. Deploy the workflow on Vertex AI Pipelines.

Use the TensorFlow Extended (TFX) SDK to create multiple components that use Dataflow and Vertex AI services. Deploy the workflow on Vertex AI Pipelines.
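
A minimal sketch of the Kubeflow Pipelines option, assuming the KFP v2 SDK; the component bodies are stubs marking where real code would launch Dataflow and Vertex AI work:

    from kfp import compiler, dsl

    @dsl.component(base_image="python:3.10")
    def preprocess(src: str) -> str:
        # Stub: a real component would run a Dataflow preprocessing job here.
        return src

    @dsl.component(base_image="python:3.10")
    def train(data: str) -> str:
        # Stub: a real component would submit a Vertex AI training job here.
        return data

    @dsl.pipeline(name="ml-workflow")
    def pipeline(src: str):
        prep = preprocess(src=src)
        train(data=prep.output)

    # Compile to a spec that Vertex AI Pipelines can execute.
    compiler.Compiler().compile(
        pipeline_func=pipeline, package_path="pipeline_spec.json"
    )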

6.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

You have trained a model by using data that was preprocessed in a batch Dataflow pipeline. Your use case requires real-time inference. You want to ensure that the data preprocessing logic is applied consistently between training and serving. What should you do?

Perform data validation to ensure that the input data to the pipeline is the same format as the input data to the endpoint.

Refactor the transformation code in the batch data pipeline so that it can be used outside of the pipeline. Use the same code in the endpoint.

Refactor the transformation code in the batch data pipeline so that it can be used outside of the pipeline. Share this code with the end users of the endpoint.

Batch the real-time requests by using a time window and then use the Dataflow pipeline to preprocess the batched requests. Send the preprocessed requests to the endpoint.
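
A hedged illustration of the "refactor the transformation code" option: keep one copy of the preprocessing logic in a shared module and import it from both the batch Beam pipeline and the serving endpoint, so the two paths cannot drift apart. The module and feature names are invented for the example:

    # shared_transforms.py -- single source of truth for preprocessing.
    import math

    def preprocess(row: dict) -> dict:
        # Illustrative transformation; real features depend on the use case.
        return {"amount_log": math.log1p(row["amount"])}

    # Training path (inside the Dataflow/Beam pipeline):
    #     beam.Map(preprocess)
    # Serving path (inside the endpoint, before model.predict()):
    #     features = preprocess(request_json)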

7.

MULTIPLE CHOICE QUESTION

3 mins • 1 pt

[Question scenario shown as an image; text not recoverable]

How should you develop the training pipeline?

[Answer options shown as images; text not recoverable]
