ACE 2

Assessment

Quiz

Instructional Technology

12th Grade

Hard

Created by David Valladares

12 questions

1.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Cloud Logging agent on all the instances. You want to minimize cost. What should you do? 

1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances. 2. Update your instances' metadata to add the following value: logs-destination: bq://platform-logs. 

1. In Cloud Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink. 2. Create a Cloud Function that is triggered by messages in the logs topic. 3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.

1. In Cloud Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset. 2. Configure this Cloud Function to create a BigQuery Job that executes this query: INSERT INTO dataset.platform-logs (timestamp, log) SELECT timestamp, log FROM compute.logs WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) 3. Use Cloud Scheduler to trigger this Cloud Function once a day.
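
For context on the log-export option above, a rough sketch of creating a Cloud Logging sink that routes only Compute Engine logs to a BigQuery dataset (the project ID is a placeholder, and the dataset name is written with an underscore because BigQuery dataset names cannot contain hyphens):

# Route Compute Engine instance logs to the BigQuery dataset
gcloud logging sinks create platform-logs-sink \
    bigquery.googleapis.com/projects/my-project/datasets/platform_logs \
    --log-filter='resource.type="gce_instance"'

# The command prints a sink writer identity (a service account); grant it the
# BigQuery Data Editor role on the dataset so exported log entries can be written.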

2.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

You are setting up a Windows VM on Compute Engine and want to make sure you can log in to the VM via RDP. What should you do?

After the VM has been created, use your Google Account credentials to log in to the VM.

After the VM has been created, use gcloud compute reset-windows-password to retrieve the login credentials for the VM.

When creating the VM, add metadata to the instance using 'windows-password' as the key and a password as the value.

After the VM has been created, download the JSON private key for the default Compute Engine service account. Use the credentials in the JSON file to log in to the VM.
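
One option references gcloud compute reset-windows-password; as a point of reference, a minimal sketch of that command (instance name, zone, and username are placeholders):

# Generate or reset a Windows username/password pair to use for RDP
gcloud compute reset-windows-password my-windows-vm \
    --zone=us-central1-a \
    --user=admin-user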

3.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

You want to configure an SSH connection to a single Compute Engine instance for users in the dev1 group. This instance is the only resource in this particular Google Cloud Platform project that the dev1 users should be able to connect to. What should you do?

Set metadata to enable-oslogin=true for the instance. Grant the dev1 group the compute.osLogin role. Direct them to use the Cloud Shell to ssh to that instance.

Set metadata to enable-oslogin=true for the instance. Set the service account to no service account for that instance. Direct them to use the Cloud Shell to ssh to that instance.

Enable 'Block project-wide SSH keys' for the instance. Generate an SSH key for each user in the dev1 group. Distribute the keys to dev1 users and direct them to use their third-party tools to connect.

Enable 'Block project-wide SSH keys' for the instance. Generate an SSH key and associate the key with that instance. Distribute the key to dev1 users and direct them to use their third-party tools to connect.
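
For reference on the OS Login approach mentioned above, a hedged sketch of enabling it on a single instance and granting the role to a group at the instance level (instance, zone, and group names are placeholders):

# Enable OS Login through instance metadata (instance-level, not project-wide)
gcloud compute instances add-metadata dev-instance \
    --zone=us-central1-a \
    --metadata enable-oslogin=TRUE

# Grant the dev1 group the OS Login role on this one instance only
gcloud compute instances add-iam-policy-binding dev-instance \
    --zone=us-central1-a \
    --member=group:dev1@example.com \
    --role=roles/compute.osLogin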

4.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

You want to configure a solution for archiving data in a Cloud Storage bucket. The solution must be cost-effective. Data with multiple versions should be archived after 30 days. Previous versions are accessed once a month for reporting. This archive data is also occasionally updated at month-end. What should you do?

Add a bucket lifecycle rule that archives data with newer versions after 30 days to Coldline Storage.

Add a bucket lifecycle rule that archives data with newer versions after 30 days to Nearline Storage.

Add a bucket lifecycle rule that archives data from regional storage after 30 days to Coldline Storage.

Add a bucket lifecycle rule that archives data from regional storage after 30 days to Nearline Storage.
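
As context for the lifecycle-rule options above, a rough sketch of what such a rule looks like with gsutil (the bucket name is a placeholder, and the target storage class depends on which option is chosen):

# lifecycle.json: move non-current (older) object versions to a colder storage
# class 30 days after creation
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30, "isLive": false}
    }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://my-archive-bucket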

5.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

Your company uses BigQuery for data warehousing. Over time, many different business units in your company have created 1000+ datasets across hundreds of projects. Your CIO wants you to examine all datasets to find tables that contain an employee_ssn column. You want to minimize effort in performing this task. What should you do?

Go to Data Catalog and search for employee_ssn in the search box.

Write a shell script that uses the bq command line tool to loop through all the projects in your organization.

Write a script that loops through all the projects in your organization and runs a query on INFORMATION_SCHEMA.COLUMNS view to find the employee_ssn column.

Write a Cloud Dataflow job that loops through all the projects in your organization and runs a query on INFORMATION_SCHEMA.COLUMNS view to find employee_ssn column.
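
As a reference for the INFORMATION_SCHEMA option, a minimal per-dataset query with the bq CLI (project and dataset names are placeholders; a script would loop this over every project and dataset in the organization):

# Find tables in one dataset that contain an employee_ssn column
bq query --use_legacy_sql=false \
  'SELECT table_name, column_name
   FROM `my-project.my_dataset.INFORMATION_SCHEMA.COLUMNS`
   WHERE column_name = "employee_ssn"'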

6.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

You create a Deployment with 2 replicas in a Google Kubernetes Engine cluster that has a single preemptible node pool. After a few minutes, you use kubectl to examine the status of your Pods and observe that one of them is still in Pending status, as shown in the attached kubectl output image. What is the most likely cause?

The pending Pod's resource requests are too large to fit on a single node of the cluster.

Too many Pods are already running in the cluster, and there are not enough resources left to schedule the pending Pod.

The node pool is configured with a service account that does not have permission to pull the container image used by the pending Pod.

The pending Pod was originally scheduled on a node that has been preempted between the creation of the Deployment and your verification of the Pods' status. It is currently being rescheduled on a new node.

Answer explanation

the "age" the same with the running pod and "restart" both are 0 , means, the containers in both pod never been restarted, the "pending" status pod is the created at the same time with the "running" status pod.

7.

MULTIPLE CHOICE QUESTION

10 mins • 1 pt

You are building a product on top of Google Kubernetes Engine (GKE). You have a single GKE cluster. For each of your customers, a Pod is running in that cluster, and your customers can run arbitrary code inside their Pod. You want to maximize the isolation between your customers' Pods. What should you do?

Use Binary Authorization and whitelist only the container images used by your customers' Pods.

Use the Container Analysis API to detect vulnerabilities in the containers used by your customers' Pods.

Create a GKE node pool with a sandbox type configured to gvisor. Add the parameter runtimeClassName: gvisor to the specification of your customers' Pods.

Use the cos_containerd image for your GKE nodes. Add a nodeSelector with the value cloud.google.com/gke-os-distribution: cos_containerd to the specification of your customers' Pods.
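
For reference on the GKE Sandbox option above, a rough sketch of creating a gVisor-sandboxed node pool and targeting it from a Pod spec (cluster, pool, project, and image names are placeholders):

# Create a node pool whose nodes run Pods inside the gVisor sandbox
gcloud container node-pools create sandbox-pool \
    --cluster=my-cluster \
    --zone=us-central1-a \
    --image-type=cos_containerd \
    --sandbox type=gvisor

# Pod spec that opts into the sandboxed runtime
cat <<'EOF' | kubectl apply -f -
apiVersion: v1
kind: Pod
metadata:
  name: customer-pod
spec:
  runtimeClassName: gvisor
  containers:
  - name: app
    image: us-docker.pkg.dev/my-project/my-repo/customer-app:latest
EOF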
