Google Prof Cloud Archi - pt 9

Similar activities

Google Prof Cloud Archi - pt 4 · University · 30 Qs
CertyIQ - Google - Prof Data Eng - pt 7 · University · 30 Qs
Google Prof Cloud Archi - pt 5 · University · 30 Qs
Cloud Computing · University · 35 Qs
Examen Parcial Base de Datos II · University · 35 Qs
UAS POK · University · 30 Qs
GDSC FET Jain University Study Jam 106 · University · 25 Qs
CertyIQ - Google - Prof Data Eng - pt 4 · University · 30 Qs

Google Prof Cloud Archi - pt 9

Assessment · Quiz · Computers · University · Easy

Created by Katheryne Pierce · Used 27+ times · FREE Resource

30 questions


1.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

Operational parameters such as oil pressure are adjustable on each of TerramEarth's vehicles to increase their efficiency, depending on their environmental conditions. Your primary goal is to increase the operating efficiency of all 20 million cellular and unconnected vehicles in the field. How can you accomplish this goal?

Have your engineers inspect the data for patterns, and then create an algorithm with rules that make operational adjustments automatically

Capture all operating data, train machine learning models that identify ideal operations, and run locally to make operational adjustments automatically

Implement a Google Cloud Dataflow streaming job with a sliding window, and use Google Cloud Messaging (GCM) to make operational adjustments automatically

Capture all operating data, train machine learning models that identify ideal operations, and host in Google Cloud Machine Learning (ML) Platform to make operational adjustments automatically
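The second option describes central training with local (edge) inference. A minimal Python sketch of that split, purely illustrative since the case study prescribes no library; the features, targets, and file name are hypothetical:

```python
# Illustrative sketch: train a model centrally on captured operating data,
# then ship it to vehicles so adjustments can be made locally.
import joblib
from sklearn.ensemble import RandomForestRegressor

# Centrally: fit a model mapping conditions to ideal operating parameters.
X = [[20.0, 0.3], [35.0, 0.8], [5.0, 0.1]]   # e.g. ambient temp, load factor
y = [52.0, 61.0, 48.0]                        # e.g. ideal oil pressure (psi)
model = RandomForestRegressor(n_estimators=50).fit(X, y)
joblib.dump(model, "ideal_ops_model.joblib")  # distributed to vehicles

# On-vehicle: load the model and adjust parameters without connectivity.
local_model = joblib.load("ideal_ops_model.joblib")
print("Suggested oil pressure:", local_model.predict([[28.0, 0.5]])[0])
```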

2.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. To be compliant with European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do?

Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.

Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.

Create a BigQuery time-partitioned table for the European data, and set the partition expiration period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.

Create a BigQuery time-partitioned table for the European data, and set the partition expiration period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of 36 months.
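For reference, a minimal sketch of the third option (partition expiration plus a lifecycle DELETE rule), using the Python client libraries rather than gsutil; the project, dataset, table, and bucket names are hypothetical:

```python
from google.cloud import bigquery, storage

THIRTY_SIX_MONTHS_MS = 36 * 30 * 24 * 60 * 60 * 1000  # approx. 36 months in ms

# BigQuery: time-partitioned table whose partitions expire after ~36 months.
bq = bigquery.Client()
table = bigquery.Table("my-project.telemetry.eu_vehicle_data")
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    expiration_ms=THIRTY_SIX_MONTHS_MS,
)
bq.create_table(table)

# Cloud Storage: lifecycle DELETE rule with an Age condition of ~36 months.
gcs = storage.Client()
bucket = gcs.get_bucket("eu-vehicle-data")
bucket.add_lifecycle_delete_rule(age=36 * 30)  # age is expressed in days
bucket.patch()
```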

3.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. TerramEarth has decided to store data files in Cloud Storage. You need to configure a Cloud Storage lifecycle rule to store 1 year of data and minimize file storage cost. Which two actions should you take?

Create a Cloud Storage lifecycle rule with Age: 30, Storage Class: Standard, and Action: Set to Coldline, and create a second GCS lifecycle rule with Age: 365, Storage Class: Coldline, and Action: Delete.

Create a Cloud Storage lifecycle rule with Age: 30, Storage Class: Coldline, and Action: Set to Nearline, and create a second GCS lifecycle rule with Age: 91, Storage Class: Coldline, and Action: Set to Nearline.

Create a Cloud Storage lifecycle rule with Age: 90, Storage Class: Standard, and Action: Set to Nearline, and create a second GCS lifecycle rule with Age: 91, Storage Class: Nearline, and Action: Set to Coldline.

Create a Cloud Storage lifecycle rule with Age: 30, Storage Class: Standard, and Action: Set to Coldline, and create a second GCS lifecycle rule with Age: 365, Storage Class: Nearline, and Action: Delete.
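Each option pairs a storage-class transition rule with a second rule. As a mechanical illustration only (not an endorsement of any one answer), here is how the first option's pair of rules would look with the Python client; the bucket name is hypothetical:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("terramearth-telemetry")

# Rule 1: after 30 days, move Standard objects to Coldline.
bucket.add_lifecycle_set_storage_class_rule(
    "COLDLINE", age=30, matches_storage_class=["STANDARD"]
)
# Rule 2: after 365 days, delete the (now Coldline) objects.
bucket.add_lifecycle_delete_rule(age=365, matches_storage_class=["COLDLINE"])
bucket.patch()
```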

4.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. You need to implement a reliable, scalable GCP solution for the data warehouse for your company, TerramEarth. Considering the TerramEarth business and technical requirements, what should you do?

Replace the existing data warehouse with BigQuery. Use table partitioning.

Replace the existing data warehouse with a Compute Engine instance with 96 CPUs.

Replace the existing data warehouse with BigQuery. Use federated data sources.

Replace the existing data warehouse with a Compute Engine instance with 96 CPUs. Add an additional Compute Engine preemptible instance with 32 CPUs.
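The "BigQuery with table partitioning" choice boils down to DDL like the following sketch; the dataset, table, and column names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()
client.query(
    """
    CREATE TABLE telemetry.vehicle_metrics (
      vehicle_id STRING,
      reading_ts TIMESTAMP,
      oil_pressure FLOAT64
    )
    PARTITION BY DATE(reading_ts)  -- one partition per day of readings
    """
).result()
```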

5.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. A new architecture that writes all incoming data to BigQuery has been introduced. You notice that the data is dirty, and want to ensure data quality on an automated daily basis while managing cost. What should you do?

Set up a streaming Cloud Dataflow job that receives data from the ingestion process. Clean the data in a Cloud Dataflow pipeline.

Create a Cloud Function that reads data from BigQuery and cleans it. Trigger the Cloud Function from a Compute Engine instance.

Create a SQL statement on the data in BigQuery, and save it as a view. Run the view daily, and save the result to a new table.

Use Cloud Dataprep and configure the BigQuery tables as the source. Schedule a daily job to clean the data.
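Cloud Dataprep is configured through its UI rather than code, so as a point of comparison here is a hedged sketch of the view-based option (the third choice): a cleaning view materialized daily into a fresh table. The dataset, columns, and cleaning predicate are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()

# One-time: define the cleaning logic as a view.
client.query(
    """
    CREATE OR REPLACE VIEW telemetry.clean_readings AS
    SELECT vehicle_id, reading_ts, oil_pressure
    FROM telemetry.raw_readings
    WHERE vehicle_id IS NOT NULL AND oil_pressure BETWEEN 0 AND 1000
    """
).result()

# Daily (e.g. via Cloud Scheduler): snapshot the view into a result table.
job_config = bigquery.QueryJobConfig(
    destination="my-project.telemetry.clean_readings_daily",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.query(
    "SELECT * FROM telemetry.clean_readings", job_config=job_config
).result()
```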

6.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. Considering the technical requirements, how should you reduce the unplanned vehicle downtime in GCP?

Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.

Use BigQuery as the data warehouse. Connect all vehicles to the network and upload gzip files to a Multi-Regional Cloud Storage bucket using gcloud. Use Google Data Studio for analysis and reporting.

Use Cloud Dataproc Hive as the data warehouse. Upload gzip files to a Multi-Regional Cloud Storage bucket. Upload this data into BigQuery using gcloud. Use Google Data Studio for analysis and reporting.

Use Cloud Dataproc Hive as the data warehouse. Directly stream data into partitioned Hive tables. Use Pig scripts to analyze data.
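The publish side of the Pub/Sub-plus-Dataflow option can be sketched in a few lines; the project, topic, and message shape are hypothetical, and the Dataflow pipeline that would read this topic and stream rows into BigQuery is omitted:

```python
import json
from google.cloud import pubsub_v1

# A vehicle gateway publishing one telemetry reading to Cloud Pub/Sub.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")

reading = {
    "vehicle_id": "TE-00042",
    "oil_pressure": 87.5,
    "ts": "2024-01-01T00:00:00Z",
}
future = publisher.publish(topic_path, json.dumps(reading).encode("utf-8"))
print("Published message:", future.result())  # blocks until the server acks
```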

7.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For this question, refer to the TerramEarth case study. You are asked to design a new architecture for the ingestion of the data from the 200,000 vehicles that are connected to a cellular network. You want to follow Google-recommended practices. Considering the technical requirements, which components should you use for the ingestion of the data?

Google Kubernetes Engine with an SSL Ingress

Cloud IoT Core with public/private key pairs

Compute Engine with project-wide SSH keys

Compute Engine with specific SSH keys
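For context, device onboarding with Cloud IoT Core bound each device to a public key, while the matching private key stayed on the vehicle. A sketch with the historical google-cloud-iot client follows; note that Google has since retired IoT Core, and the project, registry, device, and certificate names here are hypothetical:

```python
from google.cloud import iot_v1

# Register a vehicle in a device registry with an RSA public-key credential.
client = iot_v1.DeviceManagerClient()
parent = client.registry_path("my-project", "us-central1", "vehicle-registry")

with open("rsa_cert.pem") as f:
    certificate = f.read()  # public half of the device's key pair

device = {
    "id": "vehicle-te-00042",
    "credentials": [
        {
            "public_key": {
                "format": iot_v1.PublicKeyFormat.RSA_X509_PEM,
                "key": certificate,
            }
        }
    ],
}
client.create_device(parent=parent, device=device)
```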
