
DP203_09

Assessment • Quiz • 10 questions
Created by Edgar Martínez

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You need to implement the surrogate key for the retail store table. The solution must meet the sales transaction dataset requirements.
What should you create?

  • A. a table that has an IDENTITY property

  • B. a system-versioned temporal table

  • C. a user-defined SEQUENCE object

  • D. a table that has a FOREIGN KEY constraint

Answer explanation

Scenario: Implement a surrogate key to account for changes to the retail store addresses.
A surrogate key on a table is a column with a unique identifier for each row. The key is not generated from the table data. Data modelers like to create surrogate keys on their tables when they design data warehouse models. You can use the IDENTITY property to achieve this goal simply and effectively without affecting load performance.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-identity
Design and implement data storage
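
A minimal sketch of such a table in a Synapse dedicated SQL pool (table and column names are illustrative, not taken from the case study):

-- Dimension table with an IDENTITY-based surrogate key
CREATE TABLE dbo.DimRetailStore
(
    StoreSK      INT IDENTITY(1,1) NOT NULL,  -- surrogate key, generated at load time
    StoreID      VARCHAR(20)       NOT NULL,  -- business key from the source system
    AddressLine1 VARCHAR(100)      NULL       -- address changes tracked against the surrogate key
)
WITH
(
    DISTRIBUTION = REPLICATE,
    CLUSTERED COLUMNSTORE INDEX
);

Note that in a dedicated SQL pool the IDENTITY column cannot also be the hash distribution column, and the generated values are unique but not guaranteed to be contiguous.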

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt


HOTSPOT
You need to design an analytical storage solution for the transactional data. The solution must meet the sales transaction dataset requirements.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Answer explanation

Box 1: Round-robin
Round-robin tables are useful for improving loading speed.
Scenario: Partition data that contains sales transaction records. Partitions must be designed to provide efficient loads by month.

Box 2: Hash
Hash-distributed tables improve query performance on large fact tables.
Scenario:
• You plan to create a promotional table that will contain a promotion ID. The promotion ID will be associated with a specific product. The product will be identified by a product ID. The table will be approximately 5 GB.
• Ensure that queries joining and filtering sales transaction records based on product ID complete as quickly as possible.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-distribute
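
A hedged T-SQL sketch of the two distribution choices (table and column names are illustrative):

-- Staging table for loads: ROUND_ROBIN favors fast, even ingestion
CREATE TABLE stg.SalesTransactions
(
    TransactionID BIGINT        NOT NULL,
    ProductID     INT           NOT NULL,
    Amount        DECIMAL(18,2) NOT NULL
)
WITH ( DISTRIBUTION = ROUND_ROBIN, HEAP );

-- Fact/promotion table: HASH(ProductID) co-locates rows that join and
-- filter on product ID, so those queries avoid data movement
CREATE TABLE dbo.Promotions
(
    PromotionID INT NOT NULL,
    ProductID   INT NOT NULL
)
WITH ( DISTRIBUTION = HASH(ProductID), CLUSTERED COLUMNSTORE INDEX );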

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt


HOTSPOT
You need to implement an Azure Synapse Analytics database object for storing the sales transactions data. The solution must meet the sales transaction dataset requirements.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Answer explanation

Box 1: Create table
Scenario: Load the sales transaction dataset to Azure Synapse Analytics.

Box 2: RANGE RIGHT FOR VALUES
Scenario: Partition data that contains sales transaction records. Partitions must be designed to provide efficient loads by month. Boundary values must belong to the partition on the right.
RANGE RIGHT: Specifies that the boundary value belongs to the partition on the right (higher values).
FOR VALUES ( boundary_value [,...n] ): Specifies the boundary values for the partition.
Contoso identifies the following requirements for the sales transaction dataset:
• Partition data that contains sales transaction records. Partitions must be designed to provide efficient loads by month. Boundary values must belong to the partition on the right.
• Ensure that queries joining and filtering sales transaction records based on product ID complete as quickly as possible.
• Implement a surrogate key to account for changes to the retail store addresses.
• Ensure that data storage costs and performance are predictable.
• Minimize how long it takes to remove old records.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse
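
Putting the boxes together, a hedged CREATE TABLE sketch (names and boundary dates are illustrative):

CREATE TABLE dbo.SalesTransactions
(
    TransactionID   BIGINT        NOT NULL,
    ProductID       INT           NOT NULL,
    TransactionDate DATE          NOT NULL,
    Amount          DECIMAL(18,2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(ProductID),
    CLUSTERED COLUMNSTORE INDEX,
    -- RANGE RIGHT: each boundary date belongs to the partition on its right,
    -- giving clean monthly partitions for efficient loads
    PARTITION ( TransactionDate RANGE RIGHT FOR VALUES
                ('20240101', '20240201', '20240301') )
);

Monthly partitions also help with the requirement to minimize how long it takes to remove old records: an entire partition can be moved out with ALTER TABLE ... SWITCH, a metadata-only operation.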

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

You need to design a data retention solution for the Twitter feed data records. The solution must meet the customer sentiment analytics requirements.
Which Azure Storage functionality should you include in the solution?

  • A. change feed

  • B. soft delete

  • C. time-based retention

  • D. lifecycle management

Answer explanation

Scenario: Purge Twitter feed data records that are older than two years.
Data sets have unique lifecycles. Early in the lifecycle, people access some data often. But the need for access often drops drastically as the data ages. Some data remains idle in the cloud and is rarely accessed once stored. Some data sets expire days or months after creation, while other data sets are actively read and modified throughout their lifetimes. Azure Storage lifecycle management offers a rule-based policy that you can use to transition blob data to the appropriate access tiers or to expire data at the end of the data lifecycle.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview
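
A minimal sketch of a lifecycle management policy that implements the two-year purge (the container prefix is hypothetical; JSON does not allow comments, so all assumptions are noted here):

{
  "rules": [
    {
      "name": "purge-twitter-feed-after-two-years",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "twitter-feed/" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 730 }
          }
        }
      }
    }
  ]
}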

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt


HOTSPOT
Which Azure Data Factory components should you recommend using together to import the daily inventory data from the SQL server to Azure Data Lake Storage?
To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Answer explanation

Box 1: Self-hosted integration runtime
A self-hosted IR is capable of running a copy activity between cloud data stores and a data store in a private network.

Box 2: Schedule trigger
Schedule every 8 hours.

Box 3: Copy activity
Scenario:
• Customer data, including name, contact information, and loyalty number, comes from Salesforce and can be imported into Azure once every eight hours. Row modified dates are not trusted in the source table.
• Product data, including product ID, name, and category, comes from Salesforce and can be imported into Azure once every eight hours. Row modified dates are not trusted in the source table.
Design and develop data processing
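
A hedged sketch of what the schedule trigger definition might look like in Data Factory JSON (the trigger and pipeline names are hypothetical):

{
  "name": "Every8HoursTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Hour",
        "interval": 8,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "type": "PipelineReference",
          "referenceName": "CopyDailyInventoryFromSqlServer"
        }
      }
    ]
  }
}

The copy activity in the referenced pipeline would read through a linked service configured to use the self-hosted integration runtime, since the SQL server sits in a private network.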

6.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt


DRAG DROP
You need to implement versioned changes to the integration pipelines. The solution must meet the data integration requirements.
In which order should you perform the actions? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:


Answer explanation

Scenario: Identify a process to ensure that changes to the ingestion and transformation activities can be version-controlled and developed independently by multiple data engineers.
Step 1: Create a repository and a main branch.
You need a Git repository in Azure Pipelines, TFS, or GitHub with your app.

Step 2: Create a feature branch.

Step 3: Create a pull request.

Step 4: Merge changes.
Merge feature branches into the main branch using pull requests.

Step 5: Publish changes.
Reference:
https://docs.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git
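
The same flow as a hedged Git command sketch (branch names are illustrative; in Data Factory the final publish step is performed from the collaboration branch in the authoring UI):

# Step 1 assumes the repository and main branch already exist
git checkout main
git pull
git checkout -b feature/update-ingestion    # Step 2: create a feature branch
# ...edit the ingestion and transformation pipelines, then commit...
git add .
git commit -m "Revise ingestion and transformation activities"
git push -u origin feature/update-ingestion
# Steps 3-4: open a pull request in the Git provider and merge it into main
# Step 5: publish the merged changes from the collaboration branch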

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt


HOTSPOT

You need to design a data ingestion and storage solution for the Twitter feeds. The solution must meet the customer sentiment analytics requirements.

What should you include in the solution? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Hot Area:


Answer explanation

Box 1: Configure Event Hubs partitions

Scenario: Maximize the throughput of ingesting Twitter feeds from Event Hubs to Azure Storage without purchasing additional throughput or capacity units.

Event Hubs is designed to help with processing of large volumes of events. Event Hubs throughput is scaled by using partitions and throughput-unit allocations.

Incorrect Answers:

• Event Hubs Dedicated: Event Hubs clusters offer single-tenant deployments for customers with the most demanding streaming needs. This single-tenant offering has a guaranteed 99.99% SLA and is available only on the Dedicated pricing tier.

• Auto-Inflate: The Auto-inflate feature of Event Hubs automatically scales up by increasing the number of TUs to meet usage needs. Event Hubs traffic is controlled by TUs (standard tier). Auto-inflate enables you to start small with the minimum required TUs you choose. The feature then scales automatically to the maximum limit of TUs you need, depending on the increase in your traffic.

Box 2: An Azure Data Lake Storage Gen2 account

Scenario: Ensure that the data store supports Azure AD-based access control down to the object level.

Azure Data Lake Storage Gen2 implements an access control model that supports both Azure role-based access control (Azure RBAC) and POSIX-like access control lists (ACLs).

Incorrect Answers:

• Azure Databricks: An Azure administrator with the proper permissions can configure Azure Active Directory conditional access to control where and when users are permitted to sign in to Azure Databricks.

• Azure Storage supports using Azure Active Directory (Azure AD) to authorize requests to blob data. You can scope access to Azure blob resources at the following levels, beginning with the narrowest scope:
  - An individual container. At this scope, a role assignment applies to all of the blobs in the container, as well as container properties and metadata.
  - The storage account. At this scope, a role assignment applies to all containers and their blobs.
  - The resource group. At this scope, a role assignment applies to all of the containers in all of the storage accounts in the resource group.
  - The subscription. At this scope, a role assignment applies to all of the containers in all of the storage accounts in all of the resource groups in the subscription.
  - A management group.

Reference:

https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-features
https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
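
A hedged Azure CLI sketch of both boxes (resource names and the Azure AD object ID are hypothetical):

# Box 1: scale ingestion with partitions when creating the event hub;
# in the standard tier the partition count generally cannot be changed later
az eventhubs eventhub create \
    --resource-group contoso-rg \
    --namespace-name contoso-ehns \
    --name twitter-feed \
    --partition-count 32

# Box 2: ADLS Gen2 supports POSIX-like ACLs down to the object level.
# Setting the full ACL on a directory, including the required base entries,
# plus read/execute for one Azure AD object ID:
az storage fs access set \
    --account-name contosodatalake \
    --file-system twitter \
    --path feeds/2024 \
    --acl "user::rwx,group::r-x,other::---,user:00000000-0000-0000-0000-000000000000:r-x"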
