GCP - DE 2

Quiz • Computers • Medium

Created by Luis Miguel Gonzalez Hernandez

25 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

You are building a model to predict whether or not it will rain on a given day. You have thousands of input features and want to see if you can improve training speed by removing some features while having a minimum effect on model accuracy. What can you do?

Eliminate features that are highly correlated to the output labels.

Combine highly co-dependent features into one representative feature.

Instead of feeding in each feature individually, average their values in batches of 3.

Remove the features that have null values for more than 50% of the training records.
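A minimal sketch of pruning co-dependent features (the second option's idea), assuming the inputs are already loaded into a pandas DataFrame named X, a hypothetical name:

import pandas as pd

def drop_correlated_features(X: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
    # Absolute pairwise Pearson correlations between feature columns.
    corr = X.corr().abs()
    cols = list(corr.columns)
    to_drop = set()
    for i in range(len(cols)):
        for j in range(i + 1, len(cols)):
            # Highly co-dependent features carry redundant signal;
            # keep the first of each pair and drop the second.
            if corr.iloc[i, j] > threshold:
                to_drop.add(cols[j])
    return X.drop(columns=sorted(to_drop))

Fewer input columns means a smaller feature vector per training example, which is where the speed gain comes from; the 0.9 threshold is purely illustrative.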

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

You are deploying a MySQL database workload onto Cloud SQL. The database must be able to scale up to support several readers from various geographic regions. The database must be highly available and meet low RTO and RPO requirements, even in the event of a regional outage. You need to ensure that interruptions to the readers are minimal during a database failover. What should you do?

Create a highly available Cloud SQL instance in region A. Create a highly available read replica in region B. Scale up read workloads by creating cascading read replicas in multiple regions. Back up the Cloud SQL instances to a multi-regional Cloud Storage bucket. Restore the Cloud SQL backup to a new instance in another region when region A is down.

Create a highly available Cloud SQL instance in region A. Scale up read workloads by creating read replicas in multiple regions. Promote one of the read replicas when region A is down.

Create a highly available Cloud SQL instance in region A. Create a highly available read replica in region B. Scale up read workloads by creating cascading read replicas in multiple regions. Promote the read replica in region B when region A is down.

Create a highly available Cloud SQL instance in region A. Scale up read workloads by creating read replicas in the same region. Fail over to the standby Cloud SQL instance when the primary instance fails.
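To ground the replica-promotion options, a sketch against the Cloud SQL Admin API via the Python googleapiclient; the project, instance, and region names are hypothetical, and authentication and error handling are omitted:

from googleapiclient import discovery

sqladmin = discovery.build("sqladmin", "v1beta4")
PROJECT = "my-project"  # hypothetical

# Highly available MySQL primary in region A.
sqladmin.instances().insert(project=PROJECT, body={
    "name": "mysql-primary-a",
    "region": "asia-southeast1",
    "databaseVersion": "MYSQL_8_0",
    "settings": {"tier": "db-n1-standard-4", "availabilityType": "REGIONAL"},
}).execute()

# Cross-region read replica in region B.
sqladmin.instances().insert(project=PROJECT, body={
    "name": "mysql-replica-b",
    "region": "asia-northeast1",
    "masterInstanceName": "mysql-primary-a",
    "databaseVersion": "MYSQL_8_0",
    "settings": {"tier": "db-n1-standard-4"},
}).execute()

# If region A suffers an outage, promote the replica in region B.
sqladmin.instances().promoteReplica(
    project=PROJECT, instance="mysql-replica-b"
).execute()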

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

You are working on a sensitive project involving private user data. You have set up a project on Google Cloud Platform to house your work internally. An external consultant is going to assist with coding a complex transformation in a Google Cloud Dataflow pipeline for your project. How should you maintain users' privacy?

Grant the consultant the Viewer role on the project.

Grant the consultant the Cloud Dataflow Developer role on the project.

Create a service account and allow the consultant to log on with it.

Create an anonymized sample of the data for the consultant to work with in a different project.
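A minimal sketch of the anonymized-sample approach from the last option, assuming the raw data sits in a pandas DataFrame df with hypothetical user_id and email columns:

import hashlib

import pandas as pd

SALT = b"replace-with-a-secret-salt"  # kept out of the consultant's project

def pseudonymize(value: str) -> str:
    # Salted SHA-256 so IDs cannot be reversed by guessing inputs.
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

def anonymized_sample(df: pd.DataFrame, n: int = 10_000) -> pd.DataFrame:
    sample = df.sample(n=min(n, len(df)), random_state=42).copy()
    sample["user_id"] = sample["user_id"].map(pseudonymize)
    return sample.drop(columns=["email"])  # strip direct identifiers

The resulting sample is then exported to a separate project the consultant can access, so the original data never leaves the internal project.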

4.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Your company uses a proprietary system to send inventory data every 6 hours to a data ingestion service in the cloud. Transmitted data includes a payload of several fields and the timestamp of the transmission. If there are any concerns about a transmission, the system re-transmits the data. How should you deduplicate the data most efficiently?

Assign global unique identifiers (GUID) to each data entry.

Compute the hash value of each data entry, and compare it with all historical data.

Store each data entry as the primary key in a separate database and apply an index.

Maintain a database table to store the hash value and other metadata for each data entry.
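To make the GUID option concrete, a minimal sketch in Python; the field names are hypothetical, and the in-memory set stands in for a durable store:

import uuid

def tag_entry(payload: dict) -> dict:
    # Sender side: attach the GUID once, at first transmission;
    # any re-transmission of the entry reuses the same GUID.
    return {"guid": str(uuid.uuid4()), **payload}

seen_guids = set()  # receiver side: stand-in for a durable store

def accept(entry: dict) -> bool:
    # O(1) membership test; no comparison against historical payloads.
    if entry["guid"] in seen_guids:
        return False  # duplicate re-transmission, drop it
    seen_guids.add(entry["guid"])
    return True

Because the GUID travels with every re-transmission, deduplication never has to hash or compare the payload fields themselves.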

5.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

You are deploying 10,000 new Internet of Things devices to collect temperature data in your warehouses globally. You need to process, store and analyze these very large datasets in real time. What should you do?

Send the data to Google Cloud Datastore and then export to BigQuery.

Send the data to Google Cloud Pub/Sub, stream Cloud Pub/Sub to Google Cloud Dataflow, and store the data in Google BigQuery.

Send the data to Cloud Storage and then spin up an Apache Hadoop cluster as needed in Google Cloud Dataproc whenever analysis is required.

Export logs in batch to Google Cloud Storage and then spin up a Google Cloud SQL instance, import the data from Cloud Storage, and run an analysis as needed.
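A minimal sketch of the ingestion end of the Pub/Sub option, using the google-cloud-pubsub Python client; the project and topic names are hypothetical:

import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "warehouse-temperature")

def publish_reading(device_id: str, celsius: float, ts: str) -> None:
    payload = json.dumps(
        {"device_id": device_id, "celsius": celsius, "ts": ts}
    ).encode("utf-8")
    # publish() is asynchronous; result() blocks until Pub/Sub acks.
    publisher.publish(topic_path, payload).result()

A streaming Dataflow job subscribed to the topic (for example, Google's Pub/Sub-to-BigQuery template) then writes the readings into BigQuery, where they can be queried as they arrive.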

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

Your startup has a web application that currently serves customers out of a single region in Asia. You are targeting funding that will allow your startup to serve customers globally. Your current goal is to optimize for cost, and your post-funding goal is to optimize for global presence and performance. You must use a native JDBC driver. What should you do?

Use Cloud Spanner to configure a single region instance initially, and then configure multi-region Cloud Spanner instances after securing funding.

Use a Cloud SQL for PostgreSQL highly available instance first, and Bigtable with US, Europe, and Asia replication after securing funding.

Use a Cloud SQL for PostgreSQL zonal instance first, and Bigtable with US, Europe, and Asia replication after securing funding.

Use a Cloud SQL for PostgreSQL zonal instance first, and Cloud SQL for PostgreSQL with highly available configuration after securing funding.
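To make the Cloud Spanner option concrete, a sketch of creating a cost-optimized single-region instance with the google-cloud-spanner Python client; the project, instance, and configuration names are hypothetical:

from google.cloud import spanner

client = spanner.Client(project="my-project")

# A single-region configuration near today's customers keeps costs down.
instance = client.instance(
    "app-db",
    configuration_name="projects/my-project/instanceConfigs/regional-asia-southeast1",
    display_name="app-db",
    node_count=1,
)
operation = instance.create()
operation.result(timeout=300)  # block until the instance is ready

After funding, the same database can be recreated on a multi-region configuration (for example, nam-eu-asia1) for global presence, and Cloud Spanner offers an open-source JDBC driver, which satisfies the native-JDBC constraint.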

7.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

You have several Spark jobs that run on a Cloud Dataproc cluster on a schedule. Some of the jobs run in sequence, and some of the jobs run concurrently. You need to automate this process. What should you do?

Create a Cloud Dataproc Workflow Template

Create an initialization action to execute the jobs

Create a Directed Acyclic Graph in Cloud Composer

Create a Bash script that uses the Cloud SDK to create a cluster, execute jobs, and then tear down the cluster.
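A minimal sketch of the Cloud Composer option: an Airflow DAG in which one Spark job runs first and two others run concurrently after it, via the Dataproc provider operator. The project, region, cluster, bucket, and class names are hypothetical:

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PROJECT, REGION, CLUSTER = "my-project", "us-central1", "etl-cluster"

def spark_job(main_class: str) -> dict:
    # Job payload for the Dataproc Jobs API.
    return {
        "reference": {"project_id": PROJECT},
        "placement": {"cluster_name": CLUSTER},
        "spark_job": {
            "main_class": main_class,
            "jar_file_uris": ["gs://my-bucket/jobs.jar"],
        },
    }

with DAG(
    "spark_schedule",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    prepare = DataprocSubmitJobOperator(
        task_id="prepare", project_id=PROJECT, region=REGION,
        job=spark_job("jobs.Prepare"),
    )
    enrich = DataprocSubmitJobOperator(
        task_id="enrich", project_id=PROJECT, region=REGION,
        job=spark_job("jobs.Enrich"),
    )
    score = DataprocSubmitJobOperator(
        task_id="score", project_id=PROJECT, region=REGION,
        job=spark_job("jobs.Score"),
    )
    # enrich and score run concurrently once prepare succeeds.
    prepare >> [enrich, score]

A Dataproc Workflow Template can express the same dependency graph natively when Cloud Composer is not already part of the stack.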
