aws learning day

Quiz • Science, Computers • Professional Development • Hard
Emanuel Afanador
34 questions
1.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
What would cause an access denied error when attempting to download an archive file from Amazon S3 during a pipeline execution?
Insufficient user permissions for the user initiating the pipeline
Insufficient user permissions for the user uploading the Amazon S3 archive
Insufficient role permissions for the Amazon S3 service role
Insufficient role permissions for the AWS CodePipeline service role
Answer explanation
Option C is incorrect because Amazon S3 does not have a concept of service roles. A pipeline is initiated either in response to a change in a source or when a previous change is released by an authorized AWS IAM user or role; once the pipeline has been initiated, however, the AWS CodePipeline service role is used to perform pipeline actions. Thus, options A and B are incorrect. Option D is correct because the pipeline's service role requires permissions to download objects from Amazon S3.
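For illustration only, here is a minimal sketch (CloudFormation YAML, not part of the original question) of the kind of Amazon S3 read permissions the AWS CodePipeline service role needs; the role and bucket names are hypothetical placeholders.

CodePipelineServiceRole:                        # hypothetical resource name
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:                   # lets CodePipeline assume the role
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: codepipeline.amazonaws.com
          Action: sts:AssumeRole
    Policies:
      - PolicyName: ArtifactBucketRead
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action:                           # read access to pipeline artifacts in S3
                - s3:GetObject
                - s3:GetObjectVersion
                - s3:GetBucketVersioning
              Resource:
                - arn:aws:s3:::my-artifact-bucket      # hypothetical bucket
                - arn:aws:s3:::my-artifact-bucket/*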
2.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
What would be the most secure means of providing secrets to an AWS CodeBuild environment?
Create a custom build environment with the secrets included in configuration files.
Upload the secrets to Amazon S3 and download the object when the build job runs. Protect the bucket and object with an appropriate bucket policy.
Save the secrets in AWS Systems Manager Parameter Store and query them as needed. Encrypt the secrets with an AWS Key Management Service (AWS KMS) key. Include appropriate AWS KMS permissions to your build environment’s IAM role.
Include the secrets in the source repository or archive.
Answer explanation
Option A is incorrect because a custom build environment would expose the secrets to any user able to create new build jobs using the same environment. Option B is also incorrect: though uploading the secrets to Amazon S3 would provide some protection, administrators with Amazon S3 access may still be able to view them. Option D is incorrect because AWS does not recommend storing sensitive information in source control repositories, where it is easily viewed by anyone with access to the repository. Option C is correct. By encrypting the secrets with AWS KMS and storing them in AWS Systems Manager Parameter Store, you ensure that they are protected both at rest and in transit. Only AWS IAM users or roles with permissions to both the KMS key and the parameter would have access to the secrets.
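As a rough sketch of this approach (the parameter name and build command are hypothetical), a buildspec.yml can pull a SecureString parameter from AWS Systems Manager Parameter Store at build time; the build project's IAM role would also need ssm:GetParameters plus kms:Decrypt on the key used to encrypt the parameter.

version: 0.2
env:
  parameter-store:
    DB_PASSWORD: /my-app/prod/db-password   # hypothetical SecureString parameter
phases:
  build:
    commands:
      - ./run-build.sh                      # hypothetical; the secret is exposed only as $DB_PASSWORD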
3.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
If you want to implement a deployment pipeline that deploys both source files and large binary objects to instance(s), how would you best achieve this while taking cost into consideration?
Store both the source files and binary objects in AWS CodeCommit.
Build the binary objects into the AMI of the instance(s) being deployed. Store the source files in AWS CodeCommit.
Store the source files in AWS CodeCommit. Store the binary objects in an Amazon S3 archive.
Store the source files in AWS CodeCommit. Store the binary objects on an Amazon Elastic Block Store (Amazon EBS) volume, taking snapshots of the volume whenever a new one needs to be created.
Store the source files in AWS CodeCommit. Store the binary objects in Amazon S3 and access them from an Amazon CloudFront distribution.
Answer explanation
Option A is incorrect because storing large binary objects in a Git-based repository can incur massive storage requirements: any time a binary object is modified, a new copy is saved in the repository's history. Compared to Amazon S3 storage, this approach is more expensive. Option B is incorrect because building the binary objects into an Amazon Machine Image (AMI) requires creating a new AMI any time the objects change. Options D and E introduce unnecessary cost and complexity into the solution. Option C is correct: using an AWS CodeCommit repository for the source files and an Amazon S3 archive for the binary objects achieves the lowest cost and simplest management.
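For illustration, a pipeline's source stage can pull from both locations in parallel; the sketch below (CloudFormation YAML, not from the quiz) uses hypothetical repository, bucket, and object names.

Stages:
  - Name: Source
    Actions:
      - Name: AppSource                     # source files from CodeCommit
        RunOrder: 1
        ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: CodeCommit
          Version: "1"
        Configuration:
          RepositoryName: my-app            # hypothetical
          BranchName: main
        OutputArtifacts:
          - Name: SourceOutput
      - Name: BinaryAssets                  # large binaries from an S3 archive
        RunOrder: 1
        ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: S3
          Version: "1"
        Configuration:
          S3Bucket: my-binary-assets        # hypothetical
          S3ObjectKey: assets.zip           # hypothetical
        OutputArtifacts:
          - Name: BinaryOutput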
4.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
How do you output build artifacts from AWS CodeBuild to AWS CodePipeline?
Write the outputs to STDOUT from the build container.
Specify artifact files in the buildspec.yml configuration file.
Upload the files to Amazon S3 from the build environment.
Output artifacts are not supported with AWS CodeBuild.
Answer explanation
Option A is incorrect because that output is used only in the CodeBuild console. Option D is incorrect because CodeBuild natively supports this functionality. Option C would technically work, but it adds unnecessary manual steps, since CodeBuild already supports output artifacts through the buildspec.yml specification. The buildspec's artifacts section includes a files directive that lists the files from the build environment to pass along as output artifacts. Thus, option B is correct.
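A minimal buildspec.yml sketch showing the artifacts files directive; the build command and file paths are hypothetical.

version: 0.2
phases:
  build:
    commands:
      - ./build.sh                  # hypothetical build step
artifacts:
  files:
    - build/output/app.zip          # files listed here become the CodeBuild
    - appspec.yml                   # output artifact handed back to CodePipeline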
5.
MULTIPLE SELECT QUESTION
5 mins • 1 pt
In what ways can pipeline actions be ordered in a stage? (Select TWO.)
Series
Parallel
Stages support only one action each
First-in-first-out (FIFO)
Last-in-first-out (LIFO)
Answer explanation
Options D and E are incorrect because FIFO/LIFO are not valid pipeline action configurations. Option C is incorrect because pipeline stages support multiple actions. Pipeline actions can be specified to occur both in series and in parallel within the same stage. Thus, options A and B are correct.
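In CodePipeline this ordering is expressed with RunOrder: actions that share a RunOrder value run in parallel, and actions with a higher RunOrder run afterwards in series. The sketch below (CloudFormation YAML, not from the quiz) uses hypothetical action names and configurations.

Stages:
  - Name: Deploy
    Actions:
      - Name: DeployToStaging               # RunOrder 1 ...
        RunOrder: 1
        ActionTypeId:
          Category: Deploy
          Owner: AWS
          Provider: CodeDeploy
          Version: "1"
        Configuration:
          ApplicationName: my-app           # hypothetical
          DeploymentGroupName: staging      # hypothetical
        InputArtifacts:
          - Name: BuildOutput
      - Name: NotifyTeam                    # ... and RunOrder 1 run in parallel
        RunOrder: 1
        ActionTypeId:
          Category: Invoke
          Owner: AWS
          Provider: Lambda
          Version: "1"
        Configuration:
          FunctionName: notify-deploy       # hypothetical
      - Name: SmokeTests                    # RunOrder 2 runs in series, after both
        RunOrder: 2
        ActionTypeId:
          Category: Test
          Owner: AWS
          Provider: CodeBuild
          Version: "1"
        Configuration:
          ProjectName: smoke-tests          # hypothetical
        InputArtifacts:
          - Name: BuildOutput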
6.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
If you specify a hook script in the ApplicationStop lifecycle event of an AWS CodeDeploy appspec.yml, will it run on the first deployment to your instance(s)?
Yes
No
The ApplicationStop lifecycle event does not exist.
It will run only if your application is running.
Answer explanation
Option B is correct because the ApplicationStop lifecycle event runs scripts from the previously deployed revision, before the new revision's files are downloaded. On the first deployment to an instance there is no previous revision, so the hook does not run; option A is therefore incorrect. Option C is incorrect, as ApplicationStop is a valid lifecycle event. Option D is incorrect because lifecycle hooks are not aware of the current state of your application; hook scripts simply execute whatever commands they list.
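A minimal appspec.yml sketch for an EC2/on-premises deployment (install path and script names are hypothetical); on the first deployment the ApplicationStop hook below is skipped because it would be run from the previously deployed revision, which does not yet exist.

version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/my-app          # hypothetical install path
hooks:
  ApplicationStop:                        # skipped on the very first deployment
    - location: scripts/stop_server.sh    # hypothetical script
      timeout: 300
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh   # hypothetical script
      timeout: 300
      runas: root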
7.
MULTIPLE CHOICE QUESTION
5 mins • 1 pt
What is the only deployment type supported by on-premises instances?
In-place
Blue/green
Immutable
Progressive
Answer explanation
Because AWS does not have the ability to create or destroy infrastructure in customer data centers, options B, C, and D are incorrect. Option A is correct because on-premises instances support only in-place deployments.