Cloud Project and Data Management Questions

1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Your organization has two Google Cloud projects, project A and project B. In project A, you have a Pub/Sub topic that receives data from confidential sources. Only the resources in project A should be able to access the data in that topic. You want to ensure that project B and any future project cannot access data in the project A topic. What should you do?
Add firewall rules in project A so only traffic from the VPC in project A is permitted.
Configure VPC Service Controls in the organization with a perimeter around project A.
Use Identity and Access Management conditions to ensure that only users and service accounts in project A can access resources in project A.
Configure VPC Service Controls in the organization with a perimeter around the VPC of project A.
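For reference, a minimal sketch of the perimeter-around-project-A option, using the Access Context Manager REST API through google-api-python-client; the access policy ID, project number, and perimeter name are hypothetical placeholders.
    from googleapiclient import discovery

    # Hypothetical access policy ID and project A number; substitute your own.
    POLICY = "accessPolicies/123456789"
    PROJECT_A = "projects/111111111111"  # project *number*, not project ID

    acm = discovery.build("accesscontextmanager", "v1")

    perimeter = {
        "name": f"{POLICY}/servicePerimeters/protect_project_a",
        "title": "protect_project_a",
        "status": {
            # Only resources inside the perimeter (project A) can reach the
            # restricted services; calls from project B or any future project
            # are blocked regardless of their IAM grants.
            "resources": [PROJECT_A],
            "restrictedServices": ["pubsub.googleapis.com"],
        },
    }

    acm.accessPolicies().servicePerimeters().create(
        parent=POLICY, body=perimeter
    ).execute()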
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You stream order data by using a Dataflow pipeline and write the aggregated result to Memorystore. You provisioned a Memorystore for Redis instance with Basic Tier and 4 GB capacity, which is used by 40 clients for read-only access. You expect the number of read-only clients to increase significantly to a few hundred, and you need to be able to support the demand. You want to ensure that read and write access availability is not impacted, and that any changes you make can be deployed quickly. What should you do?
Create a new Memorystore for Redis instance with Standard Tier. Set capacity to 4 GB and read replica to No read replicas (high availability only). Delete the old instance.
Create a new Memorystore for Redis instance with Standard Tier. Set capacity to 5 GB and create multiple read replicas. Delete the old instance.
Create a new Memorystore for Memcached instance. Set a minimum of three nodes, and memory per node to 4 GB. Modify the Dataflow pipeline and all clients to use the Memcached instance. Delete the old instance.
Create multiple new Memorystore for Redis instances with Basic Tier (4 GB capacity). Modify the Dataflow pipeline and new clients to use all instances.
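For reference, a minimal sketch of provisioning a Standard Tier instance with read replicas through the google-cloud-redis client library; the project, region, and instance names are hypothetical, and the field and enum names assume the v1 API.
    from google.cloud import redis_v1

    client = redis_v1.CloudRedisClient()

    name = "projects/my-project/locations/us-central1/instances/orders-cache"
    instance = redis_v1.Instance(
        name=name,
        tier=redis_v1.Instance.Tier.STANDARD_HA,  # Standard Tier: HA primary
        memory_size_gb=5,                         # read replicas need >= 5 GB
        read_replicas_mode=redis_v1.Instance.ReadReplicasMode.READ_REPLICAS_ENABLED,
        replica_count=3,                          # read-only clients hit the replicas
        redis_version="REDIS_6_X",
    )

    operation = client.create_instance(
        parent="projects/my-project/locations/us-central1",
        instance_id="orders-cache",
        instance=instance,
    )
    operation.result()  # wait for provisioning to finish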
3.
MULTIPLE SELECT QUESTION
30 sec • 1 pt
You have a streaming pipeline that ingests data from Pub/Sub in production. You need to update this streaming pipeline with improved business logic, and you need to ensure that the updated pipeline reprocesses the previous two days of delivered Pub/Sub messages. What should you do? (Choose two.)
A. Use the Pub/Sub subscription clean-retry-policy flag.
B. Capture a Pub/Sub snapshot two days before the deployment.
C. Create a new Pub/Sub subscription two days before the deployment.
D. Use the Pub/Sub subscription retain-acked-messages flag.
E. Use Pub/Sub Seek with a timestamp.
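For reference, a minimal sketch of seeking a subscription back two days with the Pub/Sub client library; it assumes retained acknowledged messages (retain-acked-messages with an adequate retention duration) are already enabled on the subscription, and the project and subscription names are hypothetical.
    import datetime

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path("my-project", "orders-sub")

    # Replay everything delivered in the last two days. Acknowledged messages
    # are only redelivered if the subscription retains them
    # (retain_acked_messages=True with a sufficient message_retention_duration).
    two_days_ago = datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=2)

    subscriber.seek(request={"subscription": subscription, "time": two_days_ago})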
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You currently use a SQL-based tool to visualize your data stored in BigQuery. The data visualizations require the use of outer joins and analytic functions. Visualizations must be based on data that is no less than 4 hours old. Business users are complaining that the visualizations are too slow to generate. You want to improve the performance of the visualization queries while minimizing the maintenance overhead of the data preparation pipeline. What should you do?
Create materialized views with the allow_non_incremental_definition option set to true for the visualization queries. Specify the max_staleness parameter to 4 hours and the enable_refresh parameter to true. Reference the materialized views in the data visualization tool.
Create views for the visualization queries. Reference the views in the data visualization tool.
Create a Cloud Function instance to export the visualization query results as parquet files to a Cloud Storage bucket. Use Cloud Scheduler to trigger the Cloud Function every 4 hours. Reference the parquet files in the data visualization tool.
Create materialized views for the visualization queries. Use the incremental updates capability of BigQuery materialized views to handle changed data automatically. Reference the materialized views in the data visualization tool.
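For reference, a minimal sketch of the materialized-view DDL described in the first option, issued through the BigQuery client library; the project, dataset, and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    # allow_non_incremental_definition permits outer joins and analytic
    # functions in the view; max_staleness bounds how stale served results may be.
    ddl = """
    CREATE MATERIALIZED VIEW `my-project.reporting.orders_summary_mv`
    OPTIONS (
      enable_refresh = true,
      refresh_interval_minutes = 60,
      max_staleness = INTERVAL "4:0:0" HOUR TO SECOND,
      allow_non_incremental_definition = true
    )
    AS
    SELECT
      o.region,
      SUM(o.amount) AS total_amount,
      RANK() OVER (ORDER BY SUM(o.amount) DESC) AS region_rank
    FROM `my-project.sales.orders` AS o
    LEFT OUTER JOIN `my-project.sales.returns` AS r USING (order_id)
    GROUP BY o.region
    """

    client.query(ddl).result()  # run the DDL and wait for completion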
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You need to modernize your existing on-premises data strategy. Your organization currently uses:
- Apache Hadoop clusters for processing multiple large data sets, including on-premises Hadoop Distributed File System (HDFS) for data replication.
- Apache Airflow to orchestrate hundreds of ETL pipelines with thousands of job steps.
You need to set up a new architecture in Google Cloud that can handle your Hadoop workloads and requires minimal changes to your existing orchestration processes. What should you do?
Use Bigtable for your large workloads, with connections to Cloud Storage to handle any HDFS use cases. Orchestrate your pipelines with Cloud Composer.
Use Dataproc to migrate Hadoop clusters to Google Cloud, and Cloud Storage to handle any HDFS use cases. Orchestrate your pipelines with Cloud Composer.
Use Dataproc to migrate Hadoop clusters to Google Cloud, and Cloud Storage to handle any HDFS use cases. Convert your ETL pipelines to Dataflow.
Use Dataproc to migrate your Hadoop clusters to Google Cloud, and Cloud Storage to handle any HDFS use cases. Use Cloud Data Fusion to visually design and deploy your ETL pipelines.
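For reference, a minimal sketch of an Airflow DAG that submits a Spark step to a migrated Dataproc cluster, runnable unchanged on Cloud Composer; the project, region, cluster, class, and bucket names are hypothetical.
    import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    # Hypothetical identifiers; substitute your own.
    PROJECT_ID = "my-project"
    REGION = "us-central1"
    CLUSTER = "migrated-hadoop-cluster"

    SPARK_JOB = {
        "reference": {"project_id": PROJECT_ID},
        "placement": {"cluster_name": CLUSTER},
        "spark_job": {
            "main_class": "com.example.etl.DailyAggregation",
            "jar_file_uris": ["gs://my-bucket/jars/etl.jar"],
        },
    }

    with DAG(
        dag_id="hadoop_etl_on_dataproc",
        start_date=datetime.datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # The rest of the existing orchestration stays as-is; only the
        # execution target changes from on-premises Hadoop to Dataproc.
        run_spark_step = DataprocSubmitJobOperator(
            task_id="run_spark_step",
            project_id=PROJECT_ID,
            region=REGION,
            job=SPARK_JOB,
        )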
6.
MULTIPLE SELECT QUESTION
30 sec • 1 pt
You recently deployed several data processing jobs into your Cloud Composer 2 environment. You notice that some tasks are failing in Apache Airflow. On the monitoring dashboard, you see an increase in total worker memory usage, and there were worker pod evictions. You need to resolve these errors. What should you do? (Choose two.)
A. Increase the directed acyclic graph (DAG) file parsing interval.
B. Increase the Cloud Composer 2 environment size from medium to large.
C. Increase the maximum number of workers and reduce worker concurrency.
D. Increase the memory available to the Airflow workers.
E. Increase the memory available to the Airflow triggerer.
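For reference, a minimal sketch of raising per-worker memory through the Cloud Composer API (environments.patch) via google-api-python-client; the environment name is hypothetical, and the field and update-mask names assume the Composer 2 workloads configuration in the v1 API.
    from googleapiclient import discovery

    # Hypothetical environment name; substitute your own.
    ENV = "projects/my-project/locations/us-central1/environments/etl-composer"

    composer = discovery.build("composer", "v1")

    # Give each Airflow worker more memory so tasks stop exceeding the pod
    # limit and causing worker pod evictions.
    body = {"config": {"workloadsConfig": {"worker": {"memoryGb": 8}}}}

    composer.projects().locations().environments().patch(
        name=ENV,
        updateMask="config.workloadsConfig.worker.memoryGb",
        body=body,
    ).execute()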
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
You are on the data governance team and are implementing security requirements to deploy resources. You need to ensure that resources are limited to only the europe-west3 region. You want to follow Google-recommended practices. What should you do?
Set the constraints/gcp.resourceLocations organization policy constraint to in:europe-west3-locations.
Deploy resources with Terraform and implement a variable validation rule to ensure that the region is set to the europe-west3 region for all resources.
Set the constraints/gcp.resourceLocations organization policy constraint to in:eu-locations.
Create a Cloud Function to monitor all resources created and automatically destroy the ones created outside the europe-west3 region.
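For reference, a minimal sketch of setting the location constraint at the organization level with the google-cloud-org-policy client library; the organization ID is a hypothetical placeholder.
    from google.cloud import orgpolicy_v2

    client = orgpolicy_v2.OrgPolicyClient()

    ORG = "organizations/123456789012"  # hypothetical organization ID

    # Allow resource creation only in europe-west3 by constraining
    # constraints/gcp.resourceLocations to the europe-west3 value group.
    policy = orgpolicy_v2.Policy(
        name=f"{ORG}/policies/gcp.resourceLocations",
        spec=orgpolicy_v2.PolicySpec(
            rules=[
                orgpolicy_v2.PolicySpec.PolicyRule(
                    values=orgpolicy_v2.PolicySpec.PolicyRule.StringValues(
                        allowed_values=["in:europe-west3-locations"]
                    )
                )
            ]
        ),
    )

    client.create_policy(parent=ORG, policy=policy)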