Azure Data Factory for Beginners - Build Data Ingestion - Introduction

Assessment

Interactive Video

Information Technology (IT), Architecture

University

Hard

Created by

Quizizz Content

The video tutorial explains event-driven ingestion, focusing on a hypothetical company, Bylot, which uses a finance system to deliver files to Azure Blob Storage. The system produces three files: two CSVs and one JSON. The tutorial outlines the requirements for a Data Factory pipeline to ingest these files, emphasizing the need to filter out non-CSV files and store the CSV files in separate directories within a data engineering data lake.
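The filtering and routing requirement described above can be sketched in plain Python. This is a minimal illustration of the routing rule only, not actual Data Factory code: in ADF the equivalent would be a storage-event trigger plus a pipeline with a file-type filter. The function name, directory layout, and `datalake/finance/` prefix are all hypothetical assumptions for the sketch.

```python
from typing import Optional

def route_uploaded_file(blob_name: str) -> Optional[str]:
    """Return a data-lake directory for an uploaded blob, or None to skip it.

    Mirrors the tutorial's rule: ingest only CSV files, ignore the JSON file,
    and give each CSV its own directory (hypothetical layout).
    """
    if not blob_name.lower().endswith(".csv"):
        return None  # requirement: only CSV files are ingested
    # Strip any path prefix and the extension to name the target directory.
    stem = blob_name.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    return f"datalake/finance/{stem}/"
```

For example, an uploaded `accounts.csv` would be routed to its own directory, while the finance system's JSON file would be skipped entirely.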

5 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the primary purpose of the new finance system in Bylot?

To manage employee records

To deliver files to Azure Blob Storage

To enhance customer service

To improve marketing strategies

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which file types are produced by the finance system?

One CSV file and two JSON files

Two JSON files and one CSV file

Three XML files

Two CSV files and one JSON file

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What triggers the data factory pipeline in the event-driven ingestion process?

System reboot

Scheduled time intervals

Manual user input

Uploading of files

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the main requirement for the data factory pipeline regarding file types?

Ingest all file types

Ingest only JSON files

Ingest only CSV files

Ingest only XML files

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

How should the data be stored in the data engineering data lake?

In a shared directory with other projects

In separate directories for each file type

In a single directory

In a cloud-based database