
Data Science Model Deployments and Cloud Computing on GCP - PySpark Serverless Autoscaling Properties
Interactive Video • Information Technology (IT), Architecture • University • Practice Problem • Hard • Wayground Content
5 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default behavior of Dataproc Serverless when submitting a Spark workload?
It requires manual scaling of resources.
It automatically scales resources using Spark's dynamic resource allocation.
It does not support scaling.
It only scales down resources.
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which property indicates whether dynamic resource allocation is enabled for a Spark job?
Initial number of executors
Maximum number of executors
Dynamic allocation enabled
Executor allocation ratio
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default maximum number of executors for scaling a Spark workload?
500
1500
1000
2000
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the default value for the executor allocation ratio?
1.0
0.7
0.3
0.5
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
How does an executor allocation ratio of 1 affect a Spark workload?
It limits the scale-up capability.
It provides maximum scale-up capability and parallelism.
It sets the scale-up capability to half.
It disables scaling.
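The properties quizzed above map directly onto Spark configuration keys that can be passed when submitting a Dataproc Serverless batch. Below is a minimal sketch of such a submission; the bucket, script path, and region are placeholder assumptions, and the property values shown simply restate the defaults (dynamic allocation enabled, up to 1000 executors, allocation ratio 0.3) to make them explicit.

```shell
# Hypothetical PySpark batch submission to Dataproc Serverless.
# gs://my-bucket/jobs/etl.py and us-central1 are example values, not
# taken from the quiz. Dynamic allocation is on by default; these
# --properties overrides only make the default autoscaling behavior visible.
gcloud dataproc batches submit pyspark gs://my-bucket/jobs/etl.py \
    --region=us-central1 \
    --properties=spark.dynamicAllocation.enabled=true,\
spark.dynamicAllocation.maxExecutors=1000,\
spark.dynamicAllocation.executorAllocationRatio=0.3
```

Raising `spark.dynamicAllocation.executorAllocationRatio` toward 1.0 requests one executor slot per pending task, giving maximum scale-up and parallelism at the cost of more aggressive resource consumption, which is the trade-off question 5 is probing.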