8/07/24

1st - 5th Grade

5 Qs



Assessment

Quiz

Science

1st - 5th Grade

Hard

Created by

Ben_ _Papuche


5 questions


1.

MULTIPLE CHOICE QUESTION

1 min • 10 pts

The engineering team at a leading e-commerce company is anticipating a surge in traffic because of a flash sale planned for the weekend. You have estimated the web traffic to be 10x the normal load. The content of your website is highly dynamic and changes very often.

As a Solutions Architect, which of the following options would you recommend to make sure your infrastructure scales for that day?

Use an Auto Scaling Group

Use a CloudFront distribution in front of your website

Deploy the website on S3

Use a Route53 Multi Value record

Answer explanation

General explanation

Correct option:

Use an Auto Scaling Group - An Auto Scaling group (ASG) contains a collection of Amazon EC2 instances that are treated as a logical grouping for automatic scaling and management. An Auto Scaling group also enables you to use Amazon EC2 Auto Scaling features such as health check replacements and scaling policies. Both maintaining the number of instances in an Auto Scaling group and automatic scaling are the core functionality of the Amazon EC2 Auto Scaling service.

The size of an Auto Scaling group depends on the number of instances that you set as the desired capacity. You can adjust its size to meet demand, either manually or by using automatic scaling.

An Auto Scaling group starts by launching enough instances to meet its desired capacity. It maintains this number of instances by performing periodic health checks on the instances in the group. The Auto Scaling group continues to maintain a fixed number of instances even if an instance becomes unhealthy. If an instance becomes unhealthy, the group terminates the unhealthy instance and launches another instance to replace it.
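The replace-and-maintain behavior described above can be sketched as a simple reconciliation loop. This is a minimal, purely illustrative simulation of the idea, not the actual Amazon EC2 Auto Scaling implementation; the instance IDs and health-check function are stand-ins.

```python
# Illustrative sketch of how an Auto Scaling group keeps a fleet at its
# desired capacity: each pass terminates unhealthy instances and launches
# replacements. Instance IDs here are hypothetical placeholders.
from itertools import count

_instance_ids = count(1)

def launch_instance():
    # Hypothetical IDs; real EC2 instance IDs look like i-0abc123...
    return f"i-{next(_instance_ids):04d}"

def reconcile(fleet, desired_capacity, is_healthy):
    """One reconciliation pass: terminate unhealthy instances, then
    launch replacements until the desired capacity is met again."""
    fleet = [i for i in fleet if is_healthy(i)]   # terminate unhealthy
    while len(fleet) < desired_capacity:          # launch replacements
        fleet.append(launch_instance())
    return fleet

fleet = reconcile([], desired_capacity=3, is_healthy=lambda i: True)
unhealthy = fleet[0]
# One instance fails its health check; the next pass replaces it.
fleet = reconcile(fleet, desired_capacity=3, is_healthy=lambda i: i != unhealthy)
print(len(fleet))  # back at the desired capacity of 3
```

In the real service, scaling policies additionally adjust the desired capacity itself (for example, ahead of a known traffic spike like this flash sale).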

ASG is the correct answer here.

Incorrect option:

Use a CloudFront distribution in front of your website - Amazon CloudFront is a fast content delivery network (CDN) service that securely delivers data, videos, applications, and APIs to customers globally with low latency, high transfer speeds, all within a developer-friendly environment. You can use Amazon CloudFront to improve the performance of your website. CloudFront makes your website files (such as HTML, images, and video) available from data centers around the world (called edge locations). When a visitor requests a file from your website, CloudFront automatically redirects the request to a copy of the file at the nearest edge location. This results in faster download times than if the visitor had requested the content from a data center that is located farther away.

CloudFront is not a good solution here as the content is highly dynamic, and CloudFront will cache things.

Deploy the website on S3 - You can use Amazon S3 to host a static website. On a static website, individual web pages include static content. They might also contain client-side scripts. To host a static website on Amazon S3, you configure an Amazon S3 bucket for website hosting and then upload your website content to the bucket. When you configure a bucket as a static website, you enable static website hosting, set permissions, and add an index document. Depending on your website requirements, you can also configure other options, including redirects, web traffic logging, and custom error documents.

Dynamic applications cannot be deployed to S3. This option has been added as a distractor.

Use a Route53 Multi Value record - Amazon Route 53 is a highly available and scalable cloud Domain Name System (DNS) web service. Use Multi Value answer routing policy when you want Route 53 to respond to DNS queries with up to eight healthy records selected at random. Route 53 does not help in scaling your application. This option has been added as a distractor.

References:

https://docs.aws.amazon.com/autoscaling/ec2/userguide/AutoScalingGroup.html

https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html

2.

MULTIPLE CHOICE QUESTION

1 min • 11 pts

An e-commerce company tracks user clicks on its flagship website and performs analytics to provide near-real-time product recommendations. An EC2 instance receives data from the website and sends the data to an Aurora DB instance. Another EC2 instance continuously checks the changes in the database and executes SQL queries to provide recommendations. Now, the company wants a redesign to decouple and scale the infrastructure. The solution must ensure that data can be analyzed in real-time without any data loss even when the company sees huge traffic spikes.

What would you recommend as an AWS Certified Solutions Architect Associate?

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Amazon Kinesis Data Analytics which can query the data in real time. Lastly, the analyzed feed is output into Kinesis Data Firehose to persist the data on Amazon S3

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Kinesis Data Firehose to persist the data on Amazon S3. Lastly, use Amazon Athena to analyze the data in real time

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Amazon QuickSight which can query the data in real time. Lastly, the analyzed feed is output into Kinesis Data Firehose to persist the data on Amazon S3

Leverage Amazon SQS to capture the data from the website. Configure a fleet of EC2 instances under an Auto scaling group to process messages from the SQS queue and trigger the scaling policy based on the number of pending messages in the queue. Perform real-time analytics using a third-party library on the EC2 instances

Answer explanation

General explanation

Correct option:

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Amazon Kinesis Data Analytics which can query the data in real time. Lastly, the analyzed feed is output into Kinesis Data Firehose to persist the data on Amazon S3

You can use Kinesis Data Streams to build custom applications that process or analyze streaming data for specialized needs. Kinesis Data Streams manages the infrastructure, storage, networking, and configuration needed to stream your data at the level of your data throughput. You don't have to worry about provisioning, deployment, or ongoing maintenance of hardware, software, or other services for your data streams.

For the given use case, you can use Kinesis Data Analytics to transform and analyze incoming streaming data from Kinesis Data Streams in real time. Kinesis Data Analytics takes care of everything required to run streaming applications continuously, and scales automatically to match the volume and throughput of your incoming data. With Kinesis Data Analytics, there are no servers to manage, no minimum fee or setup cost, and you only pay for the resources your streaming applications consume.

Amazon Kinesis Data Analytics (diagram): https://aws.amazon.com/kinesis/

Amazon Kinesis Data Firehose is an extract, transform, and load (ETL) service that reliably captures, transforms and delivers streaming data to data lakes, data stores, and analytics services.

For the given use case, post the real-time analysis, the output feed from Kinesis Data Analytics is output into Kinesis Data Firehose which dumps the data into Amazon S3 without any data loss.

Amazon Kinesis Data Firehose (diagram): https://aws.amazon.com/kinesis/
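The Streams → Analytics → Firehose → S3 pipeline described above can be sketched with in-memory stand-ins. This is a hypothetical simulation of the data flow only; the classes and functions here are not AWS APIs.

```python
# Minimal sketch of the decoupled pipeline: clickstream records are
# analyzed in real time (stand-in for Kinesis Data Analytics), and the
# analyzed feed is buffered and flushed to durable storage (stand-in for
# Kinesis Data Firehose delivering to S3).
from collections import Counter

def analytics(records):
    """Stand-in for Kinesis Data Analytics: count clicks per product."""
    return Counter(r["product_id"] for r in records)

class FirehoseBuffer:
    """Stand-in for Kinesis Data Firehose: buffer, then flush to a sink."""
    def __init__(self, sink):
        self.buffer, self.sink = [], sink

    def put(self, record):
        self.buffer.append(record)

    def flush(self):
        self.sink.extend(self.buffer)   # in AWS, this would be an S3 write
        self.buffer.clear()

stream = [{"product_id": "p1"}, {"product_id": "p2"}, {"product_id": "p1"}]
s3_objects = []
firehose = FirehoseBuffer(s3_objects)
for product, clicks in analytics(stream).items():
    firehose.put({"product_id": product, "clicks": clicks})
firehose.flush()
print(s3_objects)  # analyzed feed persisted without loss
```

The key architectural point the sketch mirrors is the decoupling: ingestion, analysis, and persistence are separate stages that scale independently.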

Incorrect options:

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Amazon QuickSight which can query the data in real time. Lastly, the analyzed feed is output into Kinesis Data Firehose to persist the data on Amazon S3 - QuickSight cannot use Kinesis Data Streams as a source. In addition, QuickSight cannot be used for real-time streaming data analysis from its source. Therefore this option is incorrect.

Leverage Amazon Kinesis Data Streams to capture the data from the website and feed it into Kinesis Data Firehose to persist the data on Amazon S3. Lastly, use Amazon Athena to analyze the data in real time - Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to manage, and you pay only for the queries that you run. Athena cannot be used to analyze data in real time. Therefore this option is incorrect.

Leverage Amazon SQS to capture the data from the website. Configure a fleet of EC2 instances under an Auto scaling group to process messages from the SQS queue and trigger the scaling policy based on the number of pending messages in the queue. Perform real-time analytics using a third-party library on the EC2 instances - Although using SQS with EC2 instances can decouple the architecture, performing real-time analytics with a third-party library on the EC2 instances is not the best-fit solution for the given use case. The Kinesis family of services is the better fit for the given scenario as these services allow streaming data ingestion, real-time analysis, and reliable data delivery to the data sink.
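For context on the SQS option's scaling mechanic: a common pattern (AWS documents it as "backlog per instance") sizes the fleet from the queue depth. A minimal sketch of that arithmetic, with illustrative numbers:

```python
# Sketch of backlog-per-instance scaling: size the fleet so that each
# instance has at most `msgs_per_instance` pending messages. The limits
# and throughput figures below are hypothetical.
import math

def desired_capacity(queue_depth, msgs_per_instance, min_size=1, max_size=20):
    needed = math.ceil(queue_depth / msgs_per_instance)
    return max(min_size, min(max_size, needed))

print(desired_capacity(0, 100))      # 1  (never below min_size)
print(desired_capacity(950, 100))    # 10
print(desired_capacity(10000, 100))  # 20 (capped at max_size)
```

This works well for batch-style workloads, but, as noted above, it does not give the streaming, real-time analysis that Kinesis provides.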

References:

https://aws.amazon.com/kinesis/

https://aws.amazon.com/quicksight/resources/faqs/

3.

MULTIPLE SELECT QUESTION

1 min • 11 pts

An e-commerce company has copied 1 PB of data from its on-premises data center to an Amazon S3 bucket in the us-west-1 Region using an AWS Direct Connect link. The company now wants to set up a one-time copy of the data to another S3 bucket in the us-east-1 Region. The on-premises data center does not allow the use of AWS Snowball.

As a Solutions Architect, which of the following options can be used to accomplish this goal? (Select two)

Set up S3 batch replication to copy objects across S3 buckets in another Region using S3 console and then delete the replication configuration

Set up S3 Transfer Acceleration to copy objects across S3 buckets in different Regions using S3 console

Copy data from the source bucket to the destination bucket using the aws s3 sync command

Copy data from the source S3 bucket to a target S3 bucket using the S3 console

Use Snowball Edge device to copy the data from one Region to another Region

Answer explanation

General explanation

Correct options:

Copy data from the source bucket to the destination bucket using the aws s3 sync command

The aws s3 sync command uses the CopyObject APIs to copy objects between S3 buckets. The sync command lists the source and target buckets to identify objects that are in the source bucket but not in the target bucket. The command also identifies objects in the source bucket that have different LastModified dates than the objects in the target bucket. On a versioned bucket, the sync command copies only the current version of each object; previous versions aren't copied. By default, this preserves object metadata, but the access control lists (ACLs) are set to FULL_CONTROL for your AWS account, which removes any additional ACLs. If the operation fails, you can run the sync command again without duplicating previously copied objects.

You can use the command like so:

aws s3 sync s3://DOC-EXAMPLE-BUCKET-SOURCE s3://DOC-EXAMPLE-BUCKET-TARGET
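The comparison that sync performs can be sketched in a few lines. This is an illustrative simulation of the diff logic only, not the AWS CLI implementation; the dicts are hypothetical stand-ins for bucket listings.

```python
# Sketch of the comparison `aws s3 sync` performs: copy objects that are
# missing from the target or whose LastModified timestamp differs.
def objects_to_sync(source, target):
    """source/target map object key -> LastModified timestamp."""
    return [
        key for key, last_modified in source.items()
        if key not in target or target[key] != last_modified
    ]

source = {"a.csv": 100, "b.csv": 200, "c.csv": 300}
target = {"a.csv": 100, "b.csv": 150}   # b.csv is stale, c.csv is missing
print(objects_to_sync(source, target))  # ['b.csv', 'c.csv']
```

Because only missing or changed objects are selected, re-running the command after a failure does not re-copy what already succeeded.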

Set up S3 batch replication to copy objects across S3 buckets in another Region using S3 console and then delete the replication configuration

S3 Batch Replication provides you a way to replicate objects that existed before a replication configuration was in place, objects that have previously been replicated, and objects that have failed replication. This is done through the use of a Batch Operations job.

You should note that batch replication differs from live replication, which continuously and automatically replicates new objects across Amazon S3 buckets. You cannot directly use the AWS S3 console to configure cross-Region replication for existing objects. By default, replication only supports copying new Amazon S3 objects after it is enabled using the AWS S3 console. Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts. Objects may be replicated to a single destination bucket or to multiple destination buckets. Destination buckets can be in different AWS Regions or within the same Region as the source bucket. Once the copy completes, you can delete the replication configuration so that batch replication is only used for this one-time data copy operation.

If you want to enable live replication for existing objects for your bucket, you must contact AWS Support and raise a support ticket. This is required to ensure that replication is configured correctly.

Incorrect options:

Use Snowball Edge device to copy the data from one Region to another Region - Since the requirement is to copy data from one AWS Region to another, Snowball Edge cannot be used here. Snowball Edge Storage Optimized is the optimal data transfer choice if you need to securely and quickly transfer terabytes to petabytes of data to AWS. You can use Snowball Edge Storage Optimized if you have a large backlog of data to transfer or if you frequently collect data that needs to be transferred to AWS and your storage is in an area where high-bandwidth internet connections are not available or cost-prohibitive. Snowball Edge can operate in remote locations or harsh operating environments, such as factory floors, oil and gas rigs, mining sites, hospitals, and on moving vehicles.

Copy data from the source S3 bucket to a target S3 bucket using the S3 console - Copying 1 PB of data from one bucket to another through the S3 console is not feasible. You should note that this option is different from using the replication options on the AWS console: here you would use the copy-and-paste feature of the console, which is suitable only for small or medium data volumes. You should use s3 sync for the requirement of a one-time copy of data at this scale.

Set up S3 Transfer Acceleration to copy objects across S3 buckets in different Regions using S3 console - S3 Transfer Acceleration is a bucket-level feature that enables fast, easy, and secure transfers of files over long distances between your client and an S3 bucket. You cannot use Transfer Acceleration to copy objects across S3 buckets in different Regions using S3 console.

References:

https://aws.amazon.com/premiumsupport/knowledge-center/move-objects-s3-bucket/

https://aws.amazon.com/snowball/faqs/

https://docs.aws.amazon.com/AmazonS3/latest/userguide/replication.html

4.

MULTIPLE CHOICE QUESTION

1 min • 10 pts

An Internet-of-Things (IoT) company would like to have a streaming system that performs real-time analytics on the ingested IoT data. Once the analytics is done, the company would like to send notifications back to the mobile applications of the IoT device owners.

As a solutions architect, which of the following AWS technologies would you recommend to send these notifications to the mobile applications?

Amazon Kinesis with Amazon Simple Notification Service (SNS)

Amazon Kinesis with Simple Email Service (Amazon SES)

Amazon Kinesis with Simple Queue Service (SQS)

Amazon Simple Queue Service (SQS) with Amazon Simple Notification Service (SNS)

Answer explanation

General explanation

Correct option:

Amazon Kinesis with Amazon Simple Notification Service (SNS) - Amazon Kinesis makes it easy to collect, process, and analyze real-time, streaming data so you can get timely insights and react quickly to new information. Amazon Kinesis offers key capabilities to cost-effectively process streaming data at any scale, along with the flexibility to choose the tools that best suit the requirements of your application.

With Amazon Kinesis, you can ingest real-time data such as video, audio, application logs, website clickstreams, and IoT telemetry data for machine learning, analytics, and other applications. Amazon Kinesis enables you to process and analyze data as it arrives and respond instantly instead of having to wait until all your data is collected before the processing can begin.

Kinesis will be great for event streaming from the IoT devices, but not for sending notifications as it doesn't have such a feature.

Amazon Simple Notification Service (SNS) is a highly available, durable, secure, fully managed pub/sub messaging service that enables you to decouple microservices, distributed systems, and serverless applications. Amazon SNS provides topics for high-throughput, push-based, many-to-many messaging. SNS is a notification service and will be perfect for this use case.

Streaming data with Kinesis and using SNS to send the response notifications is the optimal solution for the current scenario.
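The notification step can be sketched as follows. The `publish` function here is a hypothetical stand-in for the SNS Publish API (in boto3 this would be `sns_client.publish(TopicArn=..., Message=...)`), and the topic names and event fields are illustrative.

```python
# Sketch of the fan-out step: analyzed IoT events that need attention are
# pushed to per-owner notification topics (stand-in for Amazon SNS).
def publish(topic, message, outbox):
    """Stand-in for SNS publish: push-based delivery to subscribers."""
    outbox.append({"topic": topic, "message": message})

def notify_owners(analyzed_events, outbox):
    for event in analyzed_events:
        if event["anomaly"]:
            publish(
                topic=f"device-owner-{event['device_id']}",  # illustrative name
                message=f"Alert for device {event['device_id']}",
                outbox=outbox,
            )

events = [
    {"device_id": "d1", "anomaly": True},
    {"device_id": "d2", "anomaly": False},
]
sent = []
notify_owners(events, sent)
print(len(sent))  # one notification, only for the anomalous device
```

In the real architecture, SNS mobile push would deliver these messages to the device owners' mobile applications, while Kinesis handles the upstream ingestion and analysis.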

Incorrect options:

Amazon Simple Queue Service (SQS) with Amazon Simple Notification Service (SNS) - Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available. Kinesis is better for streaming data since queues aren't meant for real-time streaming of data.

Amazon Kinesis with Simple Email Service (Amazon SES) - Amazon Simple Email Service (Amazon SES) is a cloud-based email sending service designed to help digital marketers and application developers send marketing, notification, and transactional emails. It is a reliable, cost-effective service for businesses of all sizes that use email to keep in contact with their customers. It is an email service and not a notification service as is the requirement in the current use case.

Amazon Kinesis with Simple Queue Service (SQS) - As explained above, Kinesis works well for streaming real-time data. SQS is a queuing service that helps decouple system architecture by offering flexibility and ease of maintenance. It cannot send notifications. SQS is paired with SNS to provide this functionality.

References:

https://aws.amazon.com/sns/

https://aws.amazon.com/kinesis/

https://aws.amazon.com/ses/

https://aws.amazon.com/sqs/

5.

MULTIPLE CHOICE QUESTION

30 sec • 3 pts

Which of these plants does not lose its leaves in autumn?

Poplar

Maple

Holly

Oak

Answer explanation

Holly has evergreen leaves. A coating of wax protects the leaves and preserves them for several years. Holly can withstand temperatures as low as -18°C.