
Data – 50 Questions (311-261) v1
Authored by Academia Google
Computers
12th Grade
Used 7+ times

50 questions
1.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
You need to look at BigQuery data from a specific table multiple times a day. The underlying table you are querying is several petabytes in size, but you want to filter your data and provide simple aggregations to downstream users. You want to run queries faster and get up-to-date insights more quickly. What should you do?
Run a scheduled query to pull the necessary data at specific intervals daily.
Use a cached query to accelerate time to results.
Limit the query columns being pulled in the final result.
Create a materialized view based off of the query being run.
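For illustration, a materialized view that pre-filters and pre-aggregates a large table might be defined as follows (dataset, table, and column names are hypothetical):

```sql
-- Hypothetical dataset and columns; BigQuery maintains the view incrementally,
-- so repeated queries read precomputed results instead of rescanning petabytes.
CREATE MATERIALIZED VIEW mydataset.daily_orders_mv AS
SELECT
  order_date,
  COUNT(*) AS order_count,
  SUM(amount) AS total_amount
FROM mydataset.orders
WHERE status = 'COMPLETE'
GROUP BY order_date;
```

Downstream users can query `daily_orders_mv` directly, and BigQuery can also rewrite matching queries against the base table to use the view automatically.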
2.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
Your chemical company needs to manually check documentation for customer orders. You use a pull subscription in Pub/Sub so that sales agents get details from each order. You must ensure that orders are not processed twice by different sales agents and that you do not add more complexity to this workflow. What should you do?
Use a Deduplicate PTransform in Dataflow before sending the messages to the sales agents.
Create a transactional database that monitors the pending messages.
Use Pub/Sub exactly-once delivery in your pull subscription.
Create a new Pub/Sub push subscription to monitor the orders processed in the agent's system.
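As a sketch, exactly-once delivery is a per-subscription setting that can be enabled when creating a pull subscription (topic and subscription names are hypothetical):

```shell
# Hypothetical names; assumes the topic already exists and gcloud is authenticated.
gcloud pubsub subscriptions create order-processing-sub \
  --topic=customer-orders \
  --enable-exactly-once-delivery
```

With this setting, Pub/Sub avoids redelivering a message once its acknowledgment succeeds, without adding any extra components to the workflow.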
3.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
You work for an airline and you need to store weather data in a BigQuery table. Weather data will be used as input to a machine learning model. The model only uses the last 30 days of weather data. You want to avoid storing unnecessary data and minimize costs. What should you do?
Create a BigQuery table where each record has an ingestion timestamp. Run a scheduled query to delete all the rows with an ingestion timestamp older than 30 days.
Create a BigQuery table partitioned by datetime value of the weather date. Set up partition expiration to 30 days.
Create a BigQuery table partitioned by ingestion time. Set up partition expiration to 30 days.
Create a BigQuery table with a datetime column for the day the weather data refers to. Run a scheduled query to delete rows with a datetime value older than 30 days.
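For reference, a table partitioned on the weather-date column with a 30-day partition expiration could be declared like this (table and column names are hypothetical):

```sql
-- Hypothetical schema; partitions older than 30 days are dropped automatically,
-- so no scheduled deletion query is needed.
CREATE TABLE mydataset.weather (
  weather_date DATETIME,
  station_id STRING,
  temperature FLOAT64
)
PARTITION BY DATE(weather_date)
OPTIONS (partition_expiration_days = 30);
```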
4.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
What should you do?
5.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
You want to ensure that there is minimal latency when reading the data. What should you do?
6.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
You currently have transactional data stored on-premises in a PostgreSQL database. To modernize your data environment, you want to run transactional workloads and support analytics needs with a single database. You need to move to Google Cloud without changing database management systems, and minimize cost and complexity. What should you do?
Migrate and modernize your database with Cloud Spanner.
Migrate your workloads to AlloyDB for PostgreSQL.
Migrate to BigQuery to optimize analytics.
Migrate your PostgreSQL database to Cloud SQL for PostgreSQL.
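As an illustrative fragment, a managed PostgreSQL instance on Google Cloud can be provisioned with gcloud (instance name, version, and sizing are hypothetical):

```shell
# Hypothetical settings; assumes a configured Google Cloud project.
gcloud sql instances create txn-postgres \
  --database-version=POSTGRES_15 \
  --tier=db-custom-2-8192 \
  --region=us-central1
```

An existing on-premises database would then typically be moved with the Database Migration Service rather than a manual dump and restore.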
7.
MULTIPLE CHOICE QUESTION
2 mins • 1 pt
What data model should you use?