SAA-C03 (1-21)


Quiz • Information Technology (IT) • University • Hard

Created by John Bui


21 questions


1.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company collects data for temperature, humidity, and atmospheric pressure in cities across multiple continents. The average volume of data that the company collects from each site daily is 500 GB. Each site has a high-speed Internet connection. The company wants to aggregate the data from all these global sites as quickly as possible in a single Amazon S3 bucket. The solution must minimize operational complexity. Which solution meets these requirements?

Turn on S3 Transfer Acceleration on the destination S3 bucket. Use multipart uploads to directly upload site data to the destination S3 bucket.

Upload the data from each site to an S3 bucket in the closest Region. Use S3 Cross-Region Replication to copy objects to the destination S3 bucket. Then remove the data from the origin S3 bucket.

Schedule AWS Snowball Edge Storage Optimized device jobs daily to transfer data from each site to the closest Region. Use S3 Cross-Region Replication to copy objects to the destination S3 bucket.

Upload the data from each site to an Amazon EC2 instance in the closest Region. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. At regular intervals, take an EBS snapshot and copy it to the Region that contains the destination S3 bucket. Restore the EBS volume in that Region.

Answer explanation

Turning on S3 Transfer Acceleration allows faster uploads from global sites to the S3 bucket. Using multipart uploads optimizes the transfer of large data volumes, minimizing operational complexity and ensuring quick aggregation.
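For illustration, here is a minimal boto3 sketch of that upload path. The Region, bucket, file, and key names are placeholder assumptions, and it presumes Transfer Acceleration is already enabled on the destination bucket:

    import boto3
    from botocore.config import Config
    from boto3.s3.transfer import TransferConfig

    # Route requests through the S3 Transfer Acceleration endpoint.
    s3 = boto3.client(
        "s3",
        region_name="us-east-1",  # placeholder Region
        config=Config(s3={"use_accelerate_endpoint": True}),
    )

    # Use multipart uploads for objects over 100 MB, with parallel parts.
    transfer = TransferConfig(
        multipart_threshold=100 * 1024 * 1024,
        multipart_chunksize=100 * 1024 * 1024,
        max_concurrency=10,
    )

    # upload_file handles the multipart mechanics (initiate, upload each
    # part, complete) automatically once the threshold is crossed.
    s3.upload_file(
        Filename="site-data.tar.gz",           # placeholder local file
        Bucket="example-weather-aggregate",    # placeholder bucket
        Key="site-eu-west/2024-01-01.tar.gz",  # placeholder key
        Config=transfer,
    )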

2.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on-demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture. What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

Use Amazon Redshift to load all the content into one place and run the SQL queries as needed.

Use Amazon CloudWatch Logs to store the logs. Run SQL queries as needed from the Amazon CloudWatch console.

Use Amazon Athena directly with Amazon S3 to run the queries as needed.

Use AWS Glue to catalog the logs. Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed.

Answer explanation

Using Amazon Athena allows direct querying of JSON logs in S3 with minimal setup and no need for data loading, making it the most efficient choice for on-demand analysis with low operational overhead.
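As a sketch of how little setup this needs, the boto3 calls below define an external table over the JSON logs and run an ad-hoc query. The database, table, column names, and bucket locations are illustrative assumptions:

    import boto3

    athena = boto3.client("athena", region_name="us-east-1")  # placeholder Region

    # One-time DDL: map a table onto the JSON log objects already in S3.
    ddl = """
    CREATE EXTERNAL TABLE IF NOT EXISTS app_logs (
        level string,
        message string,
        event_time string
    )
    ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
    LOCATION 's3://example-log-bucket/logs/'
    """

    for query in (ddl, "SELECT level, count(*) FROM app_logs GROUP BY level"):
        athena.start_query_execution(
            QueryString=query,
            QueryExecutionContext={"Database": "default"},
            ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
        )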

3.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company uses AWS Organizations to manage multiple AWS accounts for different departments. The management account has an Amazon S3 bucket that contains project reports. The company wants to limit access to this S3 bucket to only users of accounts within the organization in AWS Organizations. Which solution meets these requirements with the LEAST amount of operational overhead?

Add the aws:PrincipalOrgID global condition key with a reference to the organization ID to the S3 bucket policy.

Create an organizational unit (OU) for each department. Add the aws:PrincipalOrgPaths global condition key to the S3 bucket policy.

Use AWS CloudTrail to monitor the CreateAccount, InviteAccountToOrganization, LeaveOrganization, and RemoveAccountFromOrganization events. Update the S3 bucket policy accordingly.

Tag each user that needs access to the S3 bucket. Add the aws:PrincipalTag global condition key to the S3 bucket policy.

Answer explanation

Adding the aws:PrincipalOrgID global condition key to the S3 bucket policy allows access only to users from accounts within the AWS Organization, ensuring compliance with the requirement while minimizing operational overhead.
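A minimal sketch of such a bucket policy applied with boto3, assuming a placeholder bucket name and organization ID:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Grant read access only when the caller's account belongs to the
    # organization identified by aws:PrincipalOrgID.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowOrgMembersOnly",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-project-reports/*",  # placeholder
            "Condition": {"StringEquals": {"aws:PrincipalOrgID": "o-exampleorgid"}},
        }],
    }

    s3.put_bucket_policy(Bucket="example-project-reports", Policy=json.dumps(policy))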

4.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

An application runs on an Amazon EC2 instance in a VPC. The application processes logs that are stored in an Amazon S3 bucket. The EC2 instance needs to access the S3 bucket without connectivity to the internet. Which solution will provide private network connectivity to Amazon S3?

Create a gateway VPC endpoint to the S3 bucket.

Stream the logs to Amazon CloudWatch Logs. Export the logs to the S3 bucket.

Create an instance profile on Amazon EC2 to allow S3 access.

Create an Amazon API Gateway API with a private link to access the S3 endpoint.

Answer explanation

Creating a gateway VPC endpoint allows the EC2 instance to access the S3 bucket privately without needing internet connectivity, making it the correct solution for this scenario.
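A minimal sketch of creating that endpoint with boto3; the VPC ID, route table ID, and Region are placeholders:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder Region

    # A gateway endpoint adds a route to S3 through the AWS network, so
    # the instance needs no internet gateway or NAT device.
    ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0123456789abcdef0",            # placeholder VPC
        ServiceName="com.amazonaws.us-east-1.s3",
        RouteTableIds=["rtb-0123456789abcdef0"],  # placeholder route table
    )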

5.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company is hosting a web application on AWS using a single Amazon EC2 instance that stores user-uploaded documents in an Amazon EBS volume. For better scalability and availability, the company duplicated the architecture and created a second EC2 instance and EBS volume in another Availability Zone, placing both behind an Application Load Balancer. After completing this change, users reported that, each time they refreshed the website, they could see one subset of their documents or the other, but never all of the documents at the same time. What should a solutions architect propose to ensure users see all of their documents at once?

Copy the data so both EBS volumes contain all the documents.

Configure the Application Load Balancer to direct a user to the server with the documents.

Copy the data from both EBS volumes to Amazon EFS. Modify the application to save new documents to Amazon EFS.

Configure the Application Load Balancer to send the request to both servers. Return each document from the correct server.

Answer explanation

Using Amazon EFS allows both EC2 instances to access the same document storage, ensuring users see all documents simultaneously. This solution eliminates the inconsistency caused by separate EBS volumes.
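A rough boto3 sketch of that shared-storage setup; the subnet IDs, security group, and mount path are assumptions:

    import boto3

    efs = boto3.client("efs", region_name="us-east-1")  # placeholder Region

    # One shared file system for both instances.
    fs = efs.create_file_system(
        CreationToken="shared-docs",
        PerformanceMode="generalPurpose",
    )

    # One mount target per Availability Zone (in practice, wait for the
    # file system to reach the "available" state first).
    for subnet_id in ("subnet-aaaa1111", "subnet-bbbb2222"):  # placeholder subnets
        efs.create_mount_target(
            FileSystemId=fs["FileSystemId"],
            SubnetId=subnet_id,
            SecurityGroups=["sg-0123456789abcdef0"],  # placeholder security group
        )

    # Each instance then mounts the same file system, for example:
    #   sudo mount -t efs <FileSystemId>:/ /mnt/documents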

6.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company uses NFS to store large video files in on-premises network attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth. Which solution will meet these requirements?

Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLI to copy all files locally to the S3 bucket.

Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.

Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer explanation

Creating an AWS Snowball Edge job moves the 70 TB of video files offline, so the migration uses almost no network bandwidth and completes faster than transferring that volume over the Internet. The other options all push the full dataset over the network, which conflicts with the bandwidth requirement.
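A minimal sketch of ordering such a device through the boto3 Snowball API; the address ID, role ARN, capacity, and bucket ARN are placeholders:

    import boto3

    snowball = boto3.client("snowball", region_name="us-east-1")  # placeholder Region

    # Order one import job; AWS ships the device, and data loaded onto it
    # is imported into the named S3 bucket when the device is returned.
    job = snowball.create_job(
        JobType="IMPORT",
        SnowballType="EDGE_S",             # Storage Optimized device
        SnowballCapacityPreference="T98",  # placeholder capacity
        AddressId="ADID-example",          # placeholder shipping address
        RoleARN="arn:aws:iam::123456789012:role/snowball-import",  # placeholder
        Resources={"S3Resources": [
            {"BucketArn": "arn:aws:s3:::example-video-archive"}    # placeholder
        ]},
    )
    print(job["JobId"])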

7.

MULTIPLE CHOICE QUESTION

2 mins • 1 pt

A company has an application that ingests incoming messages. Dozens of other applications and microservices then quickly consume these messages. The number of messages varies drastically and sometimes increases suddenly to 100,000 each second. The company wants to decouple the solution and increase scalability. Which solution meets these requirements?

Persist the messages to Amazon Kinesis Data Analytics. Configure the consumer applications to read and process the messages.

Deploy the ingestion application on Amazon EC2 instances in an Auto Scaling group to scale the number of EC2 instances based on CPU metrics.

Write the messages to Amazon Kinesis Data Streams with a single shard. Use an AWS Lambda function to preprocess messages and store them in Amazon DynamoDB. Configure the consumer applications to read from DynamoDB to process the messages.

Publish the messages to an Amazon Simple Notification Service (Amazon SNS) topic with multiple Amazon Simple Queue Service (Amazon SQS) subscriptions. Configure the consumer applications to process the messages from the queues.

Answer explanation

Publishing to an Amazon SNS topic that fans out to multiple Amazon SQS queue subscriptions decouples the ingestion application from its consumers: each queue receives its own copy of every message, and both services scale automatically to absorb sudden spikes of 100,000 messages each second.
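A minimal boto3 sketch of that fan-out pattern, with hypothetical topic and queue names:

    import boto3

    sns = boto3.client("sns")
    sqs = boto3.client("sqs")

    # The ingestion application publishes to one topic.
    topic_arn = sns.create_topic(Name="ingest-messages")["TopicArn"]

    # Each consumer gets its own queue subscribed to the topic, so every
    # message is delivered to every consumer independently.
    for consumer in ("billing", "analytics"):  # hypothetical consumers
        queue_url = sqs.create_queue(QueueName=f"{consumer}-queue")["QueueUrl"]
        queue_arn = sqs.get_queue_attributes(
            QueueUrl=queue_url, AttributeNames=["QueueArn"]
        )["Attributes"]["QueueArn"]

        # A queue policy allowing SNS to send messages is also required;
        # omitted here for brevity.
        sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)

    # Publish once; every subscribed queue receives a copy.
    sns.publish(TopicArn=topic_arn, Message='{"temperature": 21.5}')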
