GCP_Data_Engineer_Day 4

Professional Development

10 Qs


Assessment

Quiz

Professional Development

Medium

Created by CloudThat Technologies

Used 3+ times

10 questions


1.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

A large enterprise using GCP has recently acquired a startup that has an IoT platform. The acquiring company wants to migrate the IoT platform from an on-premises data center to GCP and wants to use Google Cloud managed services whenever possible. What GCP service would you recommend for ingesting IoT data?

Cloud Storage

Cloud SQL

Cloud Pub/Sub

BigQuery streaming inserts

2.

MULTIPLE CHOICE QUESTION

1 min • 5 pts

You are using Cloud Pub/Sub to buffer records from an application that generates a stream of data based on user interactions with a website. The messages are read by another service that transforms the data and sends it to a machine learning model that will use it for training. A developer has just released some new code, and you notice that messages are sent repeatedly at 10-minute intervals. What might be the cause of this problem?

The new code release changed the subscription ID

The new code release changed the topic ID

The new code disabled acknowledgments from the consumer

The new code changed the subscription from pull to push
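The key fact behind this question: Pub/Sub redelivers any message that is not acknowledged before its ack deadline, which is configurable up to 600 seconds (10 minutes). The toy model below is illustrative only (it is not the google-cloud-pubsub client API); it just shows how an unacknowledged message comes back after the deadline.

```python
class ToyBroker:
    """Toy model of at-least-once delivery with an ack deadline.

    Illustrative only -- not the google-cloud-pubsub API. Real Pub/Sub
    redelivers any message that is not acknowledged before its ack
    deadline (configurable up to 600 seconds, i.e. 10 minutes).
    """

    def __init__(self, ack_deadline):
        self.ack_deadline = ack_deadline
        self.pending = {}  # msg_id -> (payload, time of last delivery)

    def publish(self, msg_id, payload):
        self.pending[msg_id] = (payload, None)

    def pull(self, now):
        """Return messages never delivered or past their ack deadline."""
        due = []
        for msg_id, (payload, last) in self.pending.items():
            if last is None or now - last >= self.ack_deadline:
                self.pending[msg_id] = (payload, now)
                due.append((msg_id, payload))
        return due

    def ack(self, msg_id):
        self.pending.pop(msg_id, None)

broker = ToyBroker(ack_deadline=600)      # 600 s = the 10-minute interval
broker.publish("m1", "click-event")

first = broker.pull(now=0)                # delivered once
# Buggy consumer: it never calls broker.ack("m1") ...
redelivered = broker.pull(now=600)        # ... so it is redelivered
```

Once the consumer acknowledges, later pulls return nothing, which is why the fix is to restore the acknowledgment call.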

3.

MULTIPLE CHOICE QUESTION

1 min • 5 pts

It is considered a good practice to make your processing logic idempotent when consuming messages from a Cloud Pub/Sub topic. Why is that?

Messages may be delivered multiple times

Messages may be received out of order

Messages may be delivered out of order

A consumer service may need to wait extended periods of time between the delivery of messages
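At-least-once delivery means the same message can arrive more than once, so the handler should produce the same end state no matter how many times it runs. A minimal sketch of one common idempotency pattern, deduplicating on message ID (the handler and its doubling transform are hypothetical):

```python
processed_ids = set()   # in production this would live in durable storage
results = []

def handle(msg_id, value):
    """Idempotent handler: a duplicate delivery has no extra effect."""
    if msg_id in processed_ids:
        return                    # already applied; safe to ignore
    processed_ids.add(msg_id)
    results.append(value * 2)     # the (hypothetical) transform itself

handle("a", 1)
handle("a", 1)   # duplicate delivery from Pub/Sub: silently skipped
handle("b", 2)
```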

4.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

Your department is migrating some stream processing to GCP and keeping some on premises. You are tasked with designing a way to share data from on-premises pipelines that use Kafka with GCP data pipelines that use Cloud Pub/Sub. How would you do that?

Use CloudPubSubConnector and Kafka Connect

Stream data to a Cloud Storage bucket and read from there

Write a service to read from Kafka and write to Cloud Pub/Sub

Use Cloud Pub/Sub Import Service
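For reference, the correct option refers to the open-source Kafka Connect connector in Google's GoogleCloudPlatform/pubsub repository. A sketch of what a sink-connector properties file might look like; the class and property names follow the connector's documented conventions, but verify them against the current README before use:

```properties
# Hypothetical Kafka Connect sink config: a Kafka topic -> a Pub/Sub topic.
name=kafka-to-pubsub
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
# Kafka topic to read from
topics=on-prem-events
# Destination Pub/Sub project and topic
cps.project=my-gcp-project
cps.topic=migrated-events
```

A source connector exists as well for the opposite direction (Pub/Sub into Kafka).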

5.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

A team of data warehouse developers is migrating a set of legacy Python scripts that have been used to transform data as part of an ETL process. They would like to use a service that allows them to use Python and requires minimal administration and operations support. Which GCP service would you recommend?

Cloud Dataproc

Cloud Dataflow

Cloud Spanner

Cloud Dataprep

6.

MULTIPLE CHOICE QUESTION

1 min • 5 pts

A group of IoT sensors is sending streaming data to a Cloud Pub/Sub topic. A Cloud Dataflow service pulls messages from the topic and reorders the messages sorted by event time. A message is expected from each sensor every minute. If a message is not received from a sensor, the stream processing application should use the average of the values in the last four messages. What kind of window would you use to implement the missing data logic?

Sliding window

Tumbling window

Extrapolation window

Crossover window
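A sliding window overlaps with its neighbors (the window period is shorter than the window length), so at every step it can see the last four one-minute readings. The gap-filling logic can be sketched in plain Python (not Beam; the function name and its defaults are illustrative):

```python
from collections import deque

def fill_missing(readings, window_size=4):
    """Replace None (a missing sensor reading) with the average of the
    previous `window_size` observed values -- the sliding-window logic
    the question describes, sketched outside of Dataflow/Beam."""
    recent = deque(maxlen=window_size)
    out = []
    for r in readings:
        if r is None and recent:
            r = sum(recent) / len(recent)
        if r is not None:
            recent.append(r)
        out.append(r)
    return out

print(fill_missing([10, 12, 11, 13, None, 14]))
# The gap is filled with (10 + 12 + 11 + 13) / 4 = 11.5
```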

7.

MULTIPLE CHOICE QUESTION

45 sec • 5 pts

You are designing a data pipeline to populate a sales data mart. The sponsor of the project has had quality control problems in the past and has defined a set of rules for filtering out bad data before it gets into the data mart. At what stage of the data pipeline would you implement those rules?

Ingestion

Storage

Transformation

Analysis
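Filtering belongs in the transformation stage: raw data is ingested and stored as-is, and the quality rules run before anything is loaded into the mart. A minimal sketch with made-up field names and rules:

```python
# Hypothetical quality rules for a sales record; the field names and
# thresholds are illustrative, not taken from the question.
def is_valid(record):
    return (
        record.get("sale_id") is not None
        and record.get("amount", 0) > 0
        and record.get("region") in {"NA", "EMEA", "APAC"}
    )

raw = [
    {"sale_id": 1, "amount": 99.5, "region": "NA"},
    {"sale_id": None, "amount": 10.0, "region": "NA"},   # missing key
    {"sale_id": 2, "amount": -5.0, "region": "EMEA"},    # bad amount
]

# Transformation stage: drop bad rows before loading the data mart.
clean = [r for r in raw if is_valid(r)]
```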
