Question no Answer

Assessment • Quiz • Computers • Professional Development • Medium

Created by Linh Nguyễn • Used 5+ times

88 questions

1.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A data engineer is configuring an AWS Glue job to read data from an Amazon S3 bucket. The data engineer has set up the necessary AWS Glue connection details and an associated IAM role. However, when the data engineer attempts to run the AWS Glue job, the data engineer receives an error message that indicates that there are problems with the Amazon S3 VPC gateway endpoint.
The data engineer must resolve the error and connect the AWS Glue job to the S3 bucket.
Which solution will meet this requirement?

  • A. Update the AWS Glue security group to allow inbound traffic from the Amazon S3 VPC gateway endpoint.

  • B. Configure an S3 bucket policy to explicitly grant the AWS Glue job permissions to access the S3 bucket.

  • C. Review the AWS Glue job code to ensure that the AWS Glue connection details include a fully qualified domain name.

  • D. Verify that the VPC's route table includes inbound and outbound routes for the Amazon S3 VPC gateway endpoint.
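For reference, option D concerns the route tables that an S3 gateway endpoint is attached to. A minimal boto3 sketch (the region and VPC ID are placeholders) that lists the S3 gateway endpoints in a VPC and their associated route tables:

  import boto3

  # Hypothetical region and VPC ID -- replace with real values.
  ec2 = boto3.client("ec2", region_name="us-east-1")

  endpoints = ec2.describe_vpc_endpoints(
      Filters=[
          {"Name": "vpc-id", "Values": ["vpc-0123456789abcdef0"]},
          {"Name": "vpc-endpoint-type", "Values": ["Gateway"]},
      ]
  )["VpcEndpoints"]

  for ep in endpoints:
      if ep["ServiceName"].endswith(".s3"):
          # A gateway endpoint carries S3 traffic only through the route tables it is associated with.
          print(ep["VpcEndpointId"], "route tables:", ep["RouteTableIds"])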

2.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A retail company has a customer data hub in an Amazon S3 bucket. Employees from many countries use the data hub to support company-wide analytics. A governance team must ensure that the company's data analysts can access data only for customers who are within the same country as the analysts.
Which solution will meet these requirements with the LEAST operational effort?

  • A. Create a separate table for each country's customer data. Provide access to each analyst based on the country that the analyst serves.

  • B. Register the S3 bucket as a data lake location in AWS Lake Formation. Use the Lake Formation row-level security features to enforce the company's access policies.

  • C. Move the data to AWS Regions that are close to the countries where the customers are. Provide access to each analyst based on the country that the analyst serves.

  • D. Load the data into Amazon Redshift. Create a view for each country. Create separate IAM roles for each country to provide access to data from each country. Assign the appropriate roles to the analysts.
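For reference, Lake Formation row-level security (option B) is normally expressed as a data filter on a table registered in the data lake. A minimal boto3 sketch, with hypothetical account, database, table, and column names:

  import boto3

  lf = boto3.client("lakeformation", region_name="us-east-1")

  # Hypothetical catalog ID, database, table, and country column.
  lf.create_data_cells_filter(
      TableData={
          "TableCatalogId": "111122223333",
          "DatabaseName": "customer_hub",
          "TableName": "customers",
          "Name": "customers_de_only",
          "RowFilter": {"FilterExpression": "country = 'DE'"},
          "ColumnWildcard": {},  # all columns visible; rows restricted by the filter
      }
  )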

3.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A media company wants to improve a system that recommends media content to customers based on user behavior and preferences. To improve the recommendation system, the company needs to incorporate insights from third-party datasets into the company's existing analytics platform.
The company wants to minimize the effort and time required to incorporate third-party datasets.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Use API calls to access and integrate third-party datasets from AWS Data Exchange.

  • B. Use API calls to access and integrate third-party datasets from AWS DataSync.

  • C. Use Amazon Kinesis Data Streams to access and integrate third-party datasets from AWS CodeCommit repositories.

  • D. Use Amazon Kinesis Data Streams to access and integrate third-party datasets from Amazon Elastic Container Registry (Amazon ECR).
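For reference, AWS Data Exchange (option A) exposes subscribed third-party datasets through ordinary API calls. A minimal boto3 sketch that lists the data sets an account is entitled to (the region is a placeholder):

  import boto3

  dx = boto3.client("dataexchange", region_name="us-east-1")

  # List third-party data sets the account is entitled to through its subscriptions.
  for ds in dx.list_data_sets(Origin="ENTITLED")["DataSets"]:
      print(ds["Id"], ds["Name"])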

4.

MULTIPLE SELECT QUESTION

1 min • 1 pt

A financial company wants to implement a data mesh. The data mesh must support centralized data governance, data analysis, and data access control. The company has decided to use AWS Glue for data catalogs and extract, transform, and load (ETL) operations.
Which combination of AWS services will implement a data mesh? (Choose two.)

  • A. Use Amazon Aurora for data storage. Use an Amazon Redshift provisioned cluster for data analysis.

  • B. Use Amazon S3 for data storage. Use Amazon Athena for data analysis.

  • C. Use AWS Glue DataBrew for centralized data governance and access control.

  • D. Use Amazon RDS for data storage. Use Amazon EMR for data analysis.

  • E. Use AWS Lake Formation for centralized data governance and access control.
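For reference, the S3-plus-Athena side of such a mesh (option B) is queried with standard SQL against the Glue Data Catalog. A minimal boto3 sketch, with hypothetical database, table, and results-bucket names:

  import boto3

  athena = boto3.client("athena", region_name="us-east-1")

  # Hypothetical Glue database, table, and S3 output location for query results.
  resp = athena.start_query_execution(
      QueryString="SELECT product_domain, COUNT(*) FROM transactions GROUP BY product_domain",
      QueryExecutionContext={"Database": "finance_mesh"},
      ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
  )
  print("Query execution id:", resp["QueryExecutionId"])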

5.

MULTIPLE SELECT QUESTION

1 min • 1 pt

A data engineer maintains custom Python scripts that perform a data formatting process that many AWS Lambda functions use. When the data engineer needs to modify the Python scripts, the data engineer must manually update all the Lambda functions.
The data engineer requires a less manual way to update the Lambda functions.
Which solution will meet this requirement?

  • A. Store a pointer to the custom Python scripts in the execution context object in a shared Amazon S3 bucket.

  • B. Package the custom Python scripts into Lambda layers. Apply the Lambda layers to the Lambda functions.

  • C. Store a pointer to the custom Python scripts in environment variables in a shared Amazon S3 bucket.

  • D. Assign the same alias to each Lambda function. Call each Lambda function by specifying the function's alias.
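For reference, a Lambda layer (option B) lets shared scripts be published once and attached to every function that needs them. A minimal boto3 sketch with hypothetical layer, bucket, and function names:

  import boto3

  lam = boto3.client("lambda", region_name="us-east-1")

  # Hypothetical S3 bucket/key holding a zip of the shared formatting scripts.
  layer = lam.publish_layer_version(
      LayerName="shared-formatting",
      Content={"S3Bucket": "example-artifacts", "S3Key": "layers/formatting.zip"},
      CompatibleRuntimes=["python3.12"],
  )

  # Point an existing function at the new layer version.
  lam.update_function_configuration(
      FunctionName="format-orders",
      Layers=[layer["LayerVersionArn"]],
  )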

6.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A company created an extract, transform, and load (ETL) data pipeline in AWS Glue. A data engineer must crawl a table that is in Microsoft SQL Server. The data engineer needs to extract, transform, and load the output of the crawl to an Amazon S3 bucket. The data engineer also must orchestrate the data pipeline.
Which AWS service or feature will meet these requirements MOST cost-effectively?

  • A. AWS Step Functions

  • B. AWS Glue workflows

  • C. AWS Glue Studio

  • D. Amazon Managed Workflows for Apache Airflow (Amazon MWAA)
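For reference, an AWS Glue workflow (option B) can chain a JDBC crawler and downstream ETL steps without adding other services. A minimal boto3 sketch with hypothetical names; the Glue connection to SQL Server is assumed to already exist:

  import boto3

  glue = boto3.client("glue", region_name="us-east-1")

  # Hypothetical workflow, crawler, role, and connection names.
  glue.create_workflow(Name="sqlserver-to-s3")

  glue.create_crawler(
      Name="sqlserver-crawler",
      Role="GlueServiceRole",
      DatabaseName="sqlserver_catalog",
      Targets={"JdbcTargets": [{"ConnectionName": "sqlserver-conn", "Path": "sales/dbo/%"}]},
  )

  # An on-demand trigger that starts the crawler as the first step of the workflow.
  glue.create_trigger(
      Name="start-crawl",
      WorkflowName="sqlserver-to-s3",
      Type="ON_DEMAND",
      Actions=[{"CrawlerName": "sqlserver-crawler"}],
  )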

7.

MULTIPLE CHOICE QUESTION

1 min • 1 pt

A financial services company stores financial data in Amazon Redshift. A data engineer wants to run real-time queries on the financial data to support a web-based trading application. The data engineer wants to run the queries from within the trading application.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Establish WebSocket connections to Amazon Redshift.

  • B. Use the Amazon Redshift Data API.

  • C. Set up Java Database Connectivity (JDBC) connections to Amazon Redshift.

  • D. Store frequently accessed data in Amazon S3. Use Amazon S3 Select to run the queries.
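For reference, the Amazon Redshift Data API (option B) runs SQL over HTTPS without managing JDBC or WebSocket connections. A minimal boto3 sketch with hypothetical cluster, database, and secret identifiers:

  import boto3

  rsd = boto3.client("redshift-data", region_name="us-east-1")

  # Hypothetical cluster, database, and Secrets Manager credentials.
  resp = rsd.execute_statement(
      ClusterIdentifier="trading-cluster",
      Database="markets",
      SecretArn="arn:aws:secretsmanager:us-east-1:111122223333:secret:redshift-app",
      Sql="SELECT symbol, last_price FROM quotes WHERE symbol = :symbol",
      Parameters=[{"name": "symbol", "value": "AMZN"}],
  )

  # The call is asynchronous; poll for status, then fetch results when finished.
  print(rsd.describe_statement(Id=resp["Id"])["Status"])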
