
Untitled Quiz
Authored by โปร แกรมเมอร์
English
University
Used 4+ times

20 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the primary purpose of the Medallion architecture in Databricks?
To visualize data
To schedule jobs
To manage compute resources
To organize data into layers for improved quality and governance
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
In the Medallion Architecture data pipeline demonstrated, what is the core conceptual purpose of the Silver layer transformation step, distinct from the Bronze (raw ingestion) and Gold (final aggregation) layers?
To provide an exact, immutable copy of the raw source data files
To execute complex joins and machine learning model training prior to final consumption
To perform data cleansing, schema refinement, column selection, and the addition of processing metadata (e.g., timestamps)
To manage data governance and set restrictive access controls for business users
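The Bronze-to-Silver step described in the question above can be sketched in Databricks SQL. This is a minimal illustration only; the table and column names (`sales_bronze`, `sales_silver`, `order_id`, and so on) are hypothetical and not taken from the quiz:

```sql
-- Silver layer sketch: cleanse the raw Bronze data, refine the schema,
-- select only the columns of interest, and stamp each row with
-- processing metadata. All table/column names are illustrative.
CREATE OR REPLACE TABLE sales_silver AS
SELECT
  CAST(order_id AS BIGINT)    AS order_id,      -- schema refinement
  TRIM(customer_name)         AS customer_name, -- data cleansing
  CAST(order_ts AS TIMESTAMP) AS order_ts,
  current_timestamp()         AS processed_at   -- processing metadata
FROM sales_bronze
WHERE order_id IS NOT NULL;                     -- drop malformed rows
```

Note how this layer neither copies the raw files verbatim (that is Bronze) nor computes business aggregates (that is Gold): it standardizes and annotates the data in between.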
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What is the function of the rescued data column during ingestion into a bronze table?
Handles records that do not match the schema
Stores records that match the schema
Encrypts sensitive data
Deletes invalid records
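To make the rescued-data behavior concrete, here is a hedged sketch of a Bronze streaming ingestion in Databricks SQL. The path and table name are assumptions for illustration; the key point is that schema-mismatched records are captured in the `_rescued_data` column rather than dropped:

```sql
-- Hypothetical Bronze ingestion using read_files.
-- Records that do not match the inferred schema are not deleted:
-- their contents are captured as JSON in the _rescued_data column.
CREATE OR REFRESH STREAMING TABLE sales_bronze AS
SELECT *
FROM STREAM read_files(
  '/Volumes/demo/raw/sales/',  -- illustrative source path
  format => 'json'
);

-- Inspect the mismatched records afterwards:
-- SELECT _rescued_data FROM sales_bronze WHERE _rescued_data IS NOT NULL;
```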
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What must you select before creating streaming tables with Databricks SQL?
A notebook
A cluster
A shared SQL warehouse
A dashboard
5.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
Which product line focuses on ingest, ETL, and streaming?
AI/BI Business Intelligence
Mosaic AI
Lakeflow
Databricks SQL
6.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
What are the three core components of the Databricks Lakeflow offering?
ETL, Streaming, and Ingestion
Compute, Storage, and Governance
Connect, Spark Declarative Pipelines, and Jobs
Delta Lake, Unity Catalog, and Photon
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
The COPY INTO command is described as an idempotent operation, meaning it can be run multiple times against the same source location without duplication. When the command is executed a second time on the same set of files, which specific metric in the operation result confirms that the previous data insertion was successfully skipped?
num_inserted_rows being zero
num_skipped_corrupt_files being greater than zero
num_affected_rows being zero
num_output_bytes being zero
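The idempotent behavior the question describes can be sketched with a COPY INTO statement. The target table, source path, and options below are illustrative assumptions, not details from the quiz:

```sql
-- Hypothetical COPY INTO load. The command is idempotent: re-running
-- it against the same source location skips files that have already
-- been loaded, so no rows are duplicated.
COPY INTO sales_bronze
FROM '/Volumes/demo/raw/sales/'  -- illustrative source path
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');

-- First run:  num_affected_rows / num_inserted_rows report the loaded data.
-- Second run: both come back as 0, confirming the files were skipped.
```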