Question # 1
A company is hosting a static website from an Amazon S3 bucket. The website is available to customers at example.com. The company uses an Amazon Route 53 weighted routing policy with a TTL of 1 day. The company has decided to replace the existing static website with a dynamic web application. The dynamic web application uses an Application Load Balancer (ALB) in front of a fleet of Amazon EC2 instances.
On the day of production launch to customers, the company creates an additional Route 53 weighted DNS record entry that points to the ALB with a weight of 255 and a TTL of 1 hour. Two days later, a DevOps engineer notices that the previous static website is displayed sometimes when customers navigate to example.com.
How can the DevOps engineer ensure that the company serves only dynamic content for example.com?
| A. Delete all objects, including previous versions, from the S3 bucket that contains the static website content.
| B. Update the weighted DNS record entry that points to the S3 bucket. Apply a weight of 0. Specify the domain reset option to propagate changes immediately.
| C. Configure webpage redirect requests on the S3 bucket with a hostname that redirects to the ALB.
| D. Remove the weighted DNS record entry that points to the S3 bucket from the example.com hosted zone. Wait for DNS propagation to become complete.
|
D. Remove the weighted DNS record entry that points to the S3 bucket from the example.com hosted zone. Wait for DNS propagation to become complete.
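The removal in option D can be scripted. Below is a hedged boto3-style sketch: the hosted zone IDs, set identifier, weight, and S3 website endpoint are all placeholder assumptions, and a Route 53 DELETE change must match the existing record's values exactly.

```python
def build_delete_change_batch():
    # ChangeBatch that removes the weighted alias record pointing at the
    # S3 website endpoint. All identifiers below are hypothetical and must
    # be replaced with the values of the actual record.
    return {
        "Comment": "Remove legacy static-site weighted record",
        "Changes": [{
            "Action": "DELETE",
            "ResourceRecordSet": {
                "Name": "example.com.",
                "Type": "A",
                "SetIdentifier": "static-site",  # hypothetical identifier
                "Weight": 100,                   # must match the existing record
                "AliasTarget": {
                    # S3 website endpoint hosted zone ID (us-east-1, assumed region)
                    "HostedZoneId": "Z3AQBSTGFYJSTF",
                    "DNSName": "s3-website-us-east-1.amazonaws.com.",
                    "EvaluateTargetHealth": False,
                },
            },
        }],
    }

# The call itself (requires credentials, shown for orientation only):
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="ZEXAMPLE12345",  # hypothetical hosted zone for example.com
#     ChangeBatch=build_delete_change_batch())
```

Note that resolvers may keep serving the cached S3 answer until the original 1-day TTL expires, which is exactly why the old site kept appearing intermittently.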
Question # 2
A company wants to set up a continuous delivery pipeline. The company stores application code in a private GitHub repository. The company needs to deploy the application components to Amazon Elastic Container Service (Amazon ECS), Amazon EC2, and AWS Lambda. The pipeline must support manual approval actions.
Which solution will meet these requirements?
| A. Use AWS CodePipeline with Amazon ECS, Amazon EC2, and Lambda as deploy providers.
| B. Use AWS CodePipeline with AWS CodeDeploy as the deploy provider.
| C. Use AWS CodePipeline with AWS Elastic Beanstalk as the deploy provider.
| D. Use AWS CodeDeploy with GitHub integration to deploy the application.
|
B. Use AWS CodePipeline with AWS CodeDeploy as the deploy provider.
Explanation:
https://docs.aws.amazon.com/codedeploy/latest/userguide/deployment-steps.html
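For illustration, the stage layout of such a pipeline can be sketched as plain data; the connection ARN, repository, and CodeDeploy application/deployment group names below are hypothetical placeholders.

```python
def build_pipeline_stages():
    # Sketch of CodePipeline stages: GitHub source (via a CodeStar/CodeConnections
    # connection), a manual approval gate, then CodeDeploy as the deploy provider.
    return [
        {"name": "Source", "actions": [{
            "name": "GitHub",
            "actionTypeId": {"category": "Source", "owner": "AWS",
                             "provider": "CodeStarSourceConnection", "version": "1"},
            "configuration": {
                "ConnectionArn": "arn:aws:codestar-connections:...",  # placeholder
                "FullRepositoryId": "my-org/my-repo",                 # placeholder
                "BranchName": "main"},
            "outputArtifacts": [{"name": "SourceOutput"}]}]},
        {"name": "Approval", "actions": [{
            "name": "ManualApproval",
            "actionTypeId": {"category": "Approval", "owner": "AWS",
                             "provider": "Manual", "version": "1"}}]},
        {"name": "Deploy", "actions": [{
            "name": "DeployApp",
            "actionTypeId": {"category": "Deploy", "owner": "AWS",
                             "provider": "CodeDeploy", "version": "1"},
            "configuration": {"ApplicationName": "my-app",              # placeholder
                              "DeploymentGroupName": "my-deploy-group"},  # placeholder
            "inputArtifacts": [{"name": "SourceOutput"}]}]},
    ]
```

CodeDeploy is the one deploy provider that natively covers ECS, EC2, and Lambda deployment targets, which is why option B satisfies all three requirements with a single pipeline.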
Question # 3
A company has developed a static website hosted on an Amazon S3 bucket. The website is deployed using AWS CloudFormation. The CloudFormation template defines an S3 bucket and a custom resource that copies content into the bucket from a source location.
The company has decided that it needs to move the website to a new location, so the existing CloudFormation stack must be deleted and re-created. However, CloudFormation reports that the stack could not be deleted cleanly.
What is the MOST likely cause and how can the DevOps engineer mitigate this problem for this and future versions of the website?
| A. Deletion has failed because the S3 bucket has an active website configuration. Modify the CloudFormation template to remove the WebsiteConfiguration property from the S3 bucket resource.
| B. Deletion has failed because the S3 bucket is not empty. Modify the custom resource's AWS Lambda function code to recursively empty the bucket when RequestType is Delete.
| C. Deletion has failed because the custom resource does not define a deletion policy. Add a DeletionPolicy property to the custom resource definition with a value of RemoveOnDeletion.
| D. Deletion has failed because the S3 bucket is not empty. Modify the S3 bucket resource in the CloudFormation template to add a DeletionPolicy property with a value of Empty.
|
B. Deletion has failed because the S3 bucket is not empty. Modify the custom resource's AWS Lambda function code to recursively empty the bucket when RequestType is Delete.
Explanation:
Step 1: Understanding the Deletion Failure. The most likely reason the CloudFormation stack failed to delete is that the S3 bucket was not empty. CloudFormation cannot delete an S3 bucket that contains objects, so if the website files are still in the bucket, the deletion will fail.
Issue: The S3 bucket is not empty during deletion, preventing the stack from being deleted.
Step 2: Modifying the Custom Resource to Handle Deletion. To mitigate this issue, modify the Lambda function associated with the custom resource to automatically empty the S3 bucket when the stack is being deleted. By adding logic to handle the RequestType: Delete event, the function can recursively delete all objects in the bucket before the stack deletion proceeds.
Action: Modify the Lambda function to recursively delete the objects in the S3 bucket when RequestType is set to Delete.
Why: This ensures that the S3 bucket is empty before CloudFormation tries to delete it, preventing the stack deletion failure.
Reference: AWS documentation on CloudFormation custom resources.
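A minimal sketch of that Delete handling, assuming the bucket name arrives in the event's ResourceProperties and the bucket may be versioned. A real custom resource handler must also signal success or failure back to CloudFormation (e.g. via the cfn-response module), which is omitted here.

```python
def empty_bucket(bucket_name, s3=None):
    """Delete every object version and delete marker so CloudFormation
    can subsequently remove the bucket itself."""
    if s3 is None:
        import boto3  # deferred so the sketch loads without the AWS SDK installed
        s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_object_versions")
    for page in paginator.paginate(Bucket=bucket_name):
        # Versions covers plain objects too; DeleteMarkers appear on versioned buckets.
        targets = [{"Key": v["Key"], "VersionId": v["VersionId"]}
                   for v in page.get("Versions", []) + page.get("DeleteMarkers", [])]
        if targets:
            s3.delete_objects(Bucket=bucket_name, Delete={"Objects": targets})

def handler(event, context):
    # Custom resource entry point: only stack deletion triggers the cleanup;
    # Create and Update requests pass through untouched.
    if event["RequestType"] == "Delete":
        empty_bucket(event["ResourceProperties"]["BucketName"])
    # Placeholder return; a real handler sends this status to the
    # pre-signed CloudFormation response URL instead.
    return {"Status": "SUCCESS"}
```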
Question # 4
A DevOps engineer is designing an application that integrates with a legacy REST API. The application has an AWS Lambda function that reads records from an Amazon Kinesis data stream. The Lambda function sends the records to the legacy REST API.
Approximately 10% of the records that the Lambda function sends from the Kinesis data stream have data errors and must be processed manually. The Lambda function event source configuration has an Amazon Simple Queue Service (Amazon SQS) dead-letter queue as an on-failure destination. The DevOps engineer has configured the Lambda function to process records in batches and has implemented retries in case of failure.
During testing, the DevOps engineer notices that the dead-letter queue contains many records that have no data errors and that have already been processed by the legacy REST API. The DevOps engineer needs to configure the Lambda function's event source options to reduce the number of error-free records that are sent to the dead-letter queue.
Which solution will meet these requirements?
| A. Increase the retry attempts
| B. Configure the setting to split the batch when an error occurs
| C. Increase the concurrent batches per shard
| D. Decrease the maximum age of record
|
B. Configure the setting to split the batch when an error occurs
Explanation:
This solution will meet the requirements because it will reduce the number of error-free records that are sent to the dead-letter queue. When you configure the setting to split the batch when an error occurs, Lambda retries only the half-batches that contain the failing records, instead of retrying the entire batch. This way, the records that have no data errors and have already been processed by the legacy REST API are not retried and sent to the dead-letter queue unnecessarily.
https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html
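The relevant event source setting is BisectBatchOnFunctionError. A hedged boto3-style sketch of the update payload follows; the mapping UUID and retry count are placeholders.

```python
def build_event_source_update(mapping_uuid):
    # Payload for lambda.update_event_source_mapping on a Kinesis mapping:
    # bisect the batch on error so only the failing half is retried, and
    # cap retries so good records are not replayed repeatedly before
    # landing in the on-failure SQS destination.
    return {
        "UUID": mapping_uuid,
        "BisectBatchOnFunctionError": True,
        "MaximumRetryAttempts": 2,  # illustrative value
    }

# The call itself (requires credentials, shown for orientation only):
# import boto3
# boto3.client("lambda").update_event_source_mapping(
#     **build_event_source_update("14e0db71-...-example-uuid"))
```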
Question # 5
A company is using AWS Organizations to centrally manage its AWS accounts. The company has turned on AWS Config in each member account by using AWS CloudFormation StackSets. The company has configured trusted access in Organizations for AWS Config and has configured a member account as a delegated administrator account for AWS Config.
A DevOps engineer needs to implement a new security policy. The policy must require all current and future AWS member accounts to use a common baseline of AWS Config rules that contain remediation actions that are managed from a central account. Non-administrator users who can access member accounts must not be able to modify this common baseline of AWS Config rules that are deployed into each member account.
Which solution will meet these requirements?
| A. Create a CloudFormation template that contains the AWS Config rules and remediation actions. Deploy the template from the Organizations management account by using CloudFormation StackSets.
| B. Create an AWS Config conformance pack that contains the AWS Config rules and remediation actions. Deploy the pack from the Organizations management account by using CloudFormation StackSets.
| C. Create a CloudFormation template that contains the AWS Config rules and remediation actions. Deploy the template from the delegated administrator account by using AWS Config.
| D. Create an AWS Config conformance pack that contains the AWS Config rules and remediation actions. Deploy the pack from the delegated administrator account by using AWS Config.
|
D. Create an AWS Config conformance pack that contains the AWS Config rules and remediation actions. Deploy the pack from the delegated administrator account by using AWS Config.
Explanation:
The correct answer is D. Creating an AWS Config conformance pack that contains the AWS Config rules and remediation actions and deploying it from the delegated administrator account by using AWS Config will meet the requirements. A conformance pack is a collection of AWS Config rules and remediation actions that can be deployed as a single entity in an account and a Region, or across an organization in AWS Organizations. By using the delegated administrator account, the DevOps engineer can centrally manage the conformance pack and prevent non-administrator users from modifying it in the member accounts.
Option A is incorrect because creating a CloudFormation template that contains the AWS Config rules and remediation actions and deploying it from the Organizations management account by using CloudFormation StackSets will not prevent non-administrator users from modifying the AWS Config rules in the member accounts. Option B is incorrect because deploying the conformance pack from the Organizations management account by using CloudFormation StackSets will not use the trusted access feature of AWS Config and will require additional permissions and resources.
Option C is incorrect because creating a CloudFormation template that contains the AWS Config rules and remediation actions and deploying it from the delegated administrator account by using AWS Config will not leverage the benefits of conformance packs, such as simplified deployment and management.
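As an illustration, an organization conformance pack can be deployed from the delegated administrator account with the Config API's put_organization_conformance_pack operation. The pack name and the single managed rule below are assumptions for the sketch, not the company's actual baseline.

```python
# Minimal conformance pack template (illustrative content): one AWS
# managed Config rule. A real baseline would list the company's full
# rule set plus remediation configurations.
CONFORMANCE_PACK_TEMPLATE = """\
Resources:
  S3EncryptionRule:
    Type: AWS::Config::ConfigRule
    Properties:
      ConfigRuleName: s3-bucket-server-side-encryption-enabled
      Source:
        Owner: AWS
        SourceIdentifier: S3_BUCKET_SERVER_SIDE_ENCRYPTION_ENABLED
"""

def build_deploy_request(pack_name="org-baseline-pack"):
    # Payload for config.put_organization_conformance_pack; excluded
    # accounts and the delivery S3 settings are omitted for brevity.
    return {
        "OrganizationConformancePackName": pack_name,
        "TemplateBody": CONFORMANCE_PACK_TEMPLATE,
    }

# The call itself (run from the delegated administrator account):
# import boto3
# boto3.client("config").put_organization_conformance_pack(**build_deploy_request())
```

Rules deployed this way are service-linked in the member accounts, so local non-administrator users cannot edit or delete them, which is the property the security policy demands.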
References:
Conformance Packs - AWS Config
Certified DevOps Engineer - Professional (DOP-C02) Study Guide (page 176)
Question # 6
A company uses Amazon RDS for all databases in its AWS accounts. The company uses AWS Control Tower to build a landing zone that has an audit and logging account. All databases must be encrypted at rest for compliance reasons. The company's security engineer needs to receive notification about any noncompliant databases that are in the company's accounts.
Which solution will meet these requirements with the MOST operational efficiency?
| A. Use AWS Control Tower to activate the optional detective control (guardrail) to determine whether the RDS storage is encrypted. Create an Amazon Simple Notification Service (Amazon SNS) topic in the company's audit account. Create an Amazon EventBridge rule to filter noncompliant events from the AWS Control Tower control (guardrail) to notify the SNS topic. Subscribe the security engineer's email address to the SNS topic.
| B. Use AWS CloudFormation StackSets to deploy AWS Lambda functions to every account. Write the Lambda function code to determine whether the RDS storage is encrypted in the account the function is deployed to. Send the findings as an Amazon CloudWatch metric to the management account. Create an Amazon Simple Notification Service (Amazon SNS) topic. Create a CloudWatch alarm that notifies the SNS topic when metric thresholds are met. Subscribe the security engineer's email address to the SNS topic.
| C. Create a custom AWS Config rule in every account to determine whether the RDS storage is encrypted. Create an Amazon Simple Notification Service (Amazon SNS) topic in the audit account. Create an Amazon EventBridge rule to filter noncompliant events from the AWS Control Tower control (guardrail) to notify the SNS topic. Subscribe the security engineer's email address to the SNS topic.
| D. Launch an Amazon EC2 instance. Run an hourly cron job by using the AWS CLI to determine whether the RDS storage is encrypted in each AWS account. Store the results in an RDS database. Notify the security engineer by sending email messages from the EC2 instance when noncompliance is detected.
|
A. Use AWS Control Tower to activate the optional detective control (guardrail) to determine whether the RDS storage is encrypted. Create an Amazon Simple Notification Service (Amazon SNS) topic in the company's audit account. Create an Amazon EventBridge rule to filter noncompliant events from the AWS Control Tower control (guardrail) to notify the SNS topic. Subscribe the security engineer's email address to the SNS topic.
Activate AWS Control Tower Guardrail:
Use AWS Control Tower to activate a detective guardrail that checks whether RDS storage is encrypted.
Create SNS Topic for Notifications:
Set up an Amazon Simple Notification Service (SNS) topic in the audit account to receive notifications about non-compliant databases.
Create EventBridge Rule to Filter Non-compliant Events:
Create an Amazon EventBridge rule that filters events related to the guardrail's findings on non-compliant RDS instances.
Configure the rule to send notifications to the SNS topic when non-compliant events are detected.
Subscribe Security Engineer's Email to SNS Topic:
Subscribe the security engineer's email address to the SNS topic to receive notifications when non-compliant databases are detected.
By using AWS Control Tower to activate a detective guardrail and setting up SNS notifications for non-compliant events, the company can efficiently monitor and ensure that all RDS databases are encrypted at rest.
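The steps above can be sketched with an EventBridge event pattern that matches noncompliant AWS Config evaluations (Control Tower detective controls are implemented as Config rules under the hood); the rule and topic names in the commented calls are placeholders.

```python
def build_noncompliance_event_pattern():
    # EventBridge pattern: fire only when a Config rule evaluation flips
    # to NON_COMPLIANT. put_rule expects this serialized as a JSON string.
    return {
        "source": ["aws.config"],
        "detail-type": ["Config Rules Compliance Change"],
        "detail": {
            "newEvaluationResult": {"complianceType": ["NON_COMPLIANT"]},
        },
    }

# Wiring it to the SNS topic in the audit account (orientation only):
# import boto3, json
# events = boto3.client("events")
# events.put_rule(Name="rds-encryption-noncompliance",  # hypothetical name
#                 EventPattern=json.dumps(build_noncompliance_event_pattern()))
# events.put_targets(Rule="rds-encryption-noncompliance",
#                    Targets=[{"Id": "sns",
#                              "Arn": "arn:aws:sns:...:audit-alerts"}])  # placeholder
```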
References:
AWS Control Tower Guardrails
Amazon SNS
Amazon EventBridge
Question # 7
A company's developers use Amazon EC2 instances as remote workstations. The company is concerned that users can create or modify EC2 security groups to allow unrestricted inbound access.
A DevOps engineer needs to develop a solution to detect when users create unrestricted security group rules. The solution must detect changes to security group rules in near real time, remove unrestricted rules, and send email notifications to the security team. The DevOps engineer has created an AWS Lambda function that takes a security group ID as input, removes rules that grant unrestricted access, and sends notifications through Amazon Simple Notification Service (Amazon SNS).
What should the DevOps engineer do next to meet the requirements?
| A. Configure the Lambda function to be invoked by the SNS topic. Create an AWS CloudTrail subscription for the SNS topic. Configure a subscription filter for security group modification events.
| B. Create an Amazon EventBridge scheduled rule to invoke the Lambda function. Define a schedule pattern that runs the Lambda function every hour.
| C. Create an Amazon EventBridge event rule that has the default event bus as the source. Define the rule’s event pattern to match EC2 security group creation and modification events. Configure the rule to invoke the Lambda function.
| D. Create an Amazon EventBridge custom event bus that subscribes to events from all AWS services. Configure the Lambda function to be invoked by the custom event bus.
|
C. Create an Amazon EventBridge event rule that has the default event bus as the source. Define the rule’s event pattern to match EC2 security group creation and modification events. Configure the rule to invoke the Lambda function.
Explanation:
To meet the requirements, the DevOps engineer should create an Amazon EventBridge event rule that has the default event bus as the source. The rule's event pattern should match EC2 security group creation and modification events, and it should be configured to invoke the Lambda function. This solution will allow for near real-time detection of security group rule changes and will trigger the Lambda function to remove any unrestricted rules and send email notifications to the security team.
https://repost.aws/knowledge-center/monitor-security-group-changes-ec2
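A sketch of the event pattern for option C, matching security group creation and modification API calls recorded by CloudTrail; the exact eventName list is an assumption that can be tuned (e.g. adding AuthorizeSecurityGroupEgress).

```python
def build_sg_change_event_pattern():
    # Matches CloudTrail-delivered EC2 API calls that create or modify
    # security groups; EventBridge evaluates these on the default event
    # bus in near real time, then invokes the remediation Lambda function.
    return {
        "source": ["aws.ec2"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["ec2.amazonaws.com"],
            "eventName": [
                "CreateSecurityGroup",
                "AuthorizeSecurityGroupIngress",
                "ModifySecurityGroupRules",
            ],
        },
    }

# Wiring the rule to the existing Lambda function (orientation only):
# import boto3, json
# events = boto3.client("events")
# events.put_rule(Name="sg-change",  # hypothetical rule name
#                 EventPattern=json.dumps(build_sg_change_event_pattern()))
# events.put_targets(Rule="sg-change", Targets=[{"Id": "fn", "Arn": lambda_arn}])
```

Note the CloudTrail-based pattern requires a trail (or the EC2 management events that EventBridge receives by default) to be delivering these API events in the Region where the rule runs.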