
Data Science Model Deployments and Cloud Computing on GCP - PySpark Serverless Autoscaling Properties
Interactive Video • Information Technology (IT), Architecture • University • Practice Problem • Hard • Wayground Content
5 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default behavior of Dataproc Serverless when handling Spark workloads?
It uses static resource allocation.
It requires manual scaling of resources.
It does not support scaling.
It dynamically scales resources using Spark's dynamic resource allocation.
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which property indicates whether dynamic resource allocation is enabled for a Spark job?
Initial number of executors
Executor allocation ratio
Dynamic allocation enabled
Maximum number of executors
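The option labels above correspond to standard Spark configuration keys. A minimal sketch of those keys, with default values as implied by the quiz content (the exact defaults are assumptions — verify against the current Dataproc Serverless documentation):

```python
# Spark dynamic-allocation properties relevant to Dataproc Serverless
# autoscaling. Default values here are assumptions taken from the quiz
# answers, not authoritative documentation.
SERVERLESS_AUTOSCALING_DEFAULTS = {
    "spark.dynamicAllocation.enabled": "true",                 # autoscaling on by default
    "spark.dynamicAllocation.initialExecutors": "2",           # starting executor count
    "spark.dynamicAllocation.maxExecutors": "1000",            # scale-up ceiling
    "spark.dynamicAllocation.executorAllocationRatio": "0.3",  # fraction of full parallelism
}

def dynamic_allocation_enabled(conf: dict) -> bool:
    """Return True if dynamic resource allocation is enabled in `conf`."""
    return conf.get("spark.dynamicAllocation.enabled", "false").lower() == "true"
```

The `spark.dynamicAllocation.enabled` key is the property the question asks about: it is the flag indicating whether dynamic resource allocation is active for the job.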
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default maximum number of executors for scaling a Spark workload?
500
1000
2
2000
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default value for the Executor Allocation Ratio property?
0.5
1
0.3
0
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
How does a value of 1 for the Executor Allocation Ratio affect scaling?
It provides maximum scale-up capability and parallelism.
It sets scaling to the minimum value.
It limits scaling to half the maximum value.
It disables scaling.
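Questions 4 and 5 concern `spark.dynamicAllocation.executorAllocationRatio`. A simplified model of its effect (an assumption for illustration, not Spark's exact internal calculation): the ratio multiplies the executor count needed for full task parallelism, so a ratio of 1 requests maximum scale-up while smaller ratios request proportionally fewer executors.

```python
import math

def target_executors(outstanding_tasks: int, tasks_per_executor: int,
                     allocation_ratio: float, max_executors: int) -> int:
    """Simplified model of how executorAllocationRatio damps scale-up:
    the ratio scales the executor count needed for full parallelism,
    and the result is capped at maxExecutors."""
    full_parallelism = math.ceil(outstanding_tasks / tasks_per_executor)
    scaled = math.ceil(allocation_ratio * full_parallelism)
    return min(max(1, scaled), max_executors)

# ratio = 1 requests full parallelism; a smaller ratio requests fewer executors
print(target_executors(4000, 4, 1.0, 1000))  # 1000 (capped at maxExecutors)
print(target_executors(4000, 4, 0.3, 1000))  # 300
```

Under this model, a ratio of 1 yields maximum scale-up capability and parallelism, matching the intent of question 5.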