
Amazon Web Services SCS-C02 - AWS Certified Security - Specialty


A company has implemented AWS WAF and Amazon CloudFront for an application. The application runs on Amazon EC2 instances that are part of an Auto Scaling group. The Auto Scaling group is behind an Application Load Balancer (ALB).

The AWS WAF web ACL uses an AWS Managed Rules rule group and is associated with the CloudFront distribution. CloudFront receives the request from AWS WAF and then uses the ALB as the distribution's origin.

During a security review, a security engineer discovers that the infrastructure is susceptible to a large, layer 7 DDoS attack.

How can the security engineer improve the security at the edge of the solution to defend against this type of attack?

A.

Configure the CloudFront distribution to use the Lambda@Edge feature. Create an AWS Lambda function that imposes a rate limit on CloudFront viewer requests. Block the request if the rate limit is exceeded.

B.

Configure the AWS WAF web ACL so that the web ACL has more capacity units to process all AWS WAF rules faster.

C.

Configure AWS WAF with a rate-based rule that imposes a rate limit and automatically blocks requests when the rate limit is exceeded.

D.

Configure the CloudFront distribution to use AWS WAF as its origin instead of the ALB.
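
For reference, the rate-based rule described in option C can be added to the existing web ACL. The following boto3 sketch is illustrative only; the web ACL name, ID, and rate limit are placeholder values, and the rule is appended through the WAFV2 UpdateWebACL call.

```python
import boto3

# WAF for CloudFront must be managed in us-east-1 (CLOUDFRONT scope).
wafv2 = boto3.client("wafv2", region_name="us-east-1")

# Rate-based rule: block any source IP that exceeds the limit within a
# 5-minute window. The limit value is illustrative.
rate_rule = {
    "Name": "rate-limit-per-ip",
    "Priority": 0,
    "Statement": {
        "RateBasedStatement": {
            "Limit": 2000,          # requests per 5 minutes per IP
            "AggregateKeyType": "IP",
        }
    },
    "Action": {"Block": {}},
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "rate-limit-per-ip",
    },
}

# Fetch the current web ACL so the rule can be appended to its rule list.
# Name and Id are placeholders for the existing web ACL.
acl = wafv2.get_web_acl(Name="app-web-acl", Scope="CLOUDFRONT", Id="EXAMPLE-ID")
wafv2.update_web_acl(
    Name="app-web-acl",
    Scope="CLOUDFRONT",
    Id="EXAMPLE-ID",
    DefaultAction=acl["WebACL"]["DefaultAction"],
    Rules=acl["WebACL"]["Rules"] + [rate_rule],
    VisibilityConfig=acl["WebACL"]["VisibilityConfig"],
    LockToken=acl["LockToken"],
)
```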

A company uses AWS Organizations. The company has teams that use an AWS CloudHSM hardware security module (HSM) that is hosted in a central AWS account. One of the teams creates its own new dedicated AWS account and wants to use the HSM that is hosted in the central account.

How should a security engineer share the HSM that is hosted in the central account with the new dedicated account?

A.

Use AWS Resource Access Manager (AWS RAM) to share the VPC subnet ID of the HSM that is hosted in the central account with the new dedicated account. Configure the CloudHSM security group to accept inbound traffic from the private IP addresses of client instances in the new dedicated account.

B.

Use AWS Identity and Access Management (IAM) to create a cross-account role to access the CloudHSM cluster that is in the central account. Create a new IAM user in the new dedicated account. Assign the cross-account role to the new IAM user.

C.

Use AWS IAM Identity Center (AWS Single Sign-On) to create an AWS Security Token Service (AWS STS) token to authenticate from the new dedicated account to the central account. Use the cross-account permissions that are assigned to the STS token to invoke an operation on the HSM in the central account.

D.

Use AWS Resource Access Manager (AWS RAM) to share the ID of the HSM that is hosted in the central account with the new dedicated account. Configure the CloudHSM security group to accept inbound traffic from the private IP addresses of client instances in the new dedicated account.
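
As an illustration of the resource-sharing approach in option A, the sketch below shares the HSM's subnet through AWS RAM and opens the CloudHSM client ports on the cluster security group. The account IDs, ARNs, and CIDR range are placeholders.

```python
import boto3

ram = boto3.client("ram")
ec2 = boto3.client("ec2")

# Share the VPC subnet that hosts the CloudHSM ENIs with the new account.
# The subnet ARN and account ID are placeholders.
ram.create_resource_share(
    name="cloudhsm-subnet-share",
    resourceArns=["arn:aws:ec2:us-east-1:111111111111:subnet/subnet-0abc1234"],
    principals=["222222222222"],       # the new dedicated account
    allowExternalPrincipals=False,     # both accounts are in the organization
)

# Allow the CloudHSM client ports (TCP 2223-2225) from the client instances
# in the new account, identified here by their private CIDR range.
ec2.authorize_security_group_ingress(
    GroupId="sg-0hsm1234",             # the cluster's CloudHSM security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 2223,
        "ToPort": 2225,
        "IpRanges": [{"CidrIp": "10.20.0.0/24",
                      "Description": "CloudHSM clients in dedicated account"}],
    }],
)
```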

A company has AWS accounts in an organization in AWS Organizations. The company has enabled Amazon GuardDuty in its production, support, and test accounts. The company runs important workloads in the production account and centrally stores logs in an Amazon S3 bucket in the support account.

A security engineer must implement a solution to elevate security findings for the production account and the S3 bucket. The solution must automatically elevate findings of HIGH severity to CRITICAL severity.

Which solution will meet these requirements?

A.

Enable AWS Security Hub for all accounts. In the Security Hub administrator account, enable the GuardDuty integration. Create automation rules to elevate findings for the production account and the S3 bucket.

B.

Enable AWS Security Hub for all accounts. In the Security Hub administrator account, enable the GuardDuty integration. Use Amazon EventBridge to create a custom rule to elevate findings for the production account and the S3 bucket.

C.

Use the GuardDuty administrator account to configure a threat list that includes the production account and the S3 bucket. Use Amazon EventBridge and Amazon Simple Notification Service (Amazon SNS) to elevate findings from the threat list.

D.

Use the GuardDuty administrator account to enable S3 protection for the support account that contains the S3 bucket. Configure GuardDuty to elevate findings for the production account and the S3 bucket.
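
A rough sketch of the automation-rule approach in option A, run from the Security Hub administrator account. The request shape follows the CreateAutomationRule API; the account ID is a placeholder, and a second rule keyed on the S3 bucket's resource ID would cover the log bucket in the support account.

```python
import boto3

# Automation rules are created in the Security Hub administrator account.
securityhub = boto3.client("securityhub")

# Elevate HIGH findings for the production account to CRITICAL.
securityhub.create_automation_rule(
    RuleName="elevate-prod-high-to-critical",
    RuleOrder=1,
    RuleStatus="ENABLED",
    Description="Raise HIGH findings for the production account to CRITICAL",
    Criteria={
        "AwsAccountId": [{"Value": "111111111111", "Comparison": "EQUALS"}],
        "SeverityLabel": [{"Value": "HIGH", "Comparison": "EQUALS"}],
    },
    Actions=[{
        "Type": "FINDING_FIELDS_UPDATE",
        "FindingFieldsUpdate": {"Severity": {"Label": "CRITICAL"}},
    }],
)
```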

A company's Security Engineer has been tasked with restricting a contractor's IAM account access to the company's Amazon EC2 console without providing access to any other AWS services. The contractor's IAM account must not be able to gain access to any other AWS service, even if the IAM account is assigned additional permissions based on IAM group membership.

What should the Security Engineer do to meet these requirements?

A.

Create an inline IAM user policy that allows Amazon EC2 access for the contractor's IAM user.

B.

Create an IAM permissions boundary policy that allows Amazon EC2 access. Associate the contractor's IAM account with the IAM permissions boundary policy.

C.

Create an IAM group with an attached policy that allows for Amazon EC2 access. Associate the contractor's IAM account with the IAM group.

D.

Create an IAM role that allows for EC2 and explicitly denies all other services. Instruct the contractor to always assume this role.
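
The permissions boundary in option B can be sketched as follows; the policy and user names are placeholders. Because a boundary caps effective permissions, group memberships added later cannot grant access beyond Amazon EC2.

```python
import json

import boto3

iam = boto3.client("iam")

# Boundary policy: the contractor's effective permissions can never exceed
# Amazon EC2 access, regardless of any group policies attached later.
boundary = iam.create_policy(
    PolicyName="ec2-only-boundary",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow", "Action": "ec2:*", "Resource": "*"}],
    }),
)

# Attach the boundary to the contractor's IAM user (name is a placeholder).
iam.put_user_permissions_boundary(
    UserName="contractor",
    PermissionsBoundary=boundary["Policy"]["Arn"],
)
```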

A company uses AWS Organizations to run workloads in multiple AWS accounts. Currently, individual team members at the company access all Amazon EC2 instances remotely by using SSH or Remote Desktop Protocol (RDP). The company does not have any audit trails, and security groups are occasionally open. The company must secure access management and implement a centralized logging solution.

Which solution will meet these requirements MOST securely?

A.

Configure trusted access for AWS Systems Manager in Organizations. Configure a bastion host from the management account. Replace SSH and RDP by using Systems Manager Session Manager from the management account. Configure Session Manager logging to Amazon CloudWatch Logs.

B.

Replace SSH and RDP with AWS Systems Manager Session Manager. Install Systems Manager Agent (SSM Agent) on the instances. Attach the AmazonSSMManagedInstanceCore role to the instances. Configure session data streaming to Amazon CloudWatch Logs. Create a separate logging account that has appropriate cross-account permissions to audit the log data.

C.

Install a bastion host in the management account. Reconfigure all SSH and RDP to allow access only from the bastion host. Install AWS Systems Manager Agent (SSM Agent) on the bastion host. Attach the AmazonSSMManagedInstanceCore role to the bastion host. Configure session data streaming to Amazon CloudWatch Logs in a separate logging account to audit log data.

D.

Replace SSH and RDP with AWS Systems Manager State Manager. Install Systems Manager Agent (SSM Agent) on the instances. Attach the AmazonSSMManagedInstanceCore role to the instances. Configure session data streaming to Amazon CloudTrail. Use CloudTrail Insights to analyze the trail data.
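
A minimal sketch of the Session Manager approach in option B: attach the AmazonSSMManagedInstanceCore managed policy to the instance role and start sessions through the SSM API instead of SSH or RDP. The role name and instance ID are placeholders; streaming session data to CloudWatch Logs is configured separately in the Session Manager preferences.

```python
import boto3

iam = boto3.client("iam")
ssm = boto3.client("ssm")

# Grant the instances' role the managed policy that SSM Agent and Session
# Manager require. The role name is a placeholder for the instance role.
iam.attach_role_policy(
    RoleName="ec2-instance-role",
    PolicyArn="arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore",
)

# With the agent registered, interactive access no longer needs SSH/RDP or
# open security group ports. The CLI or console (with the Session Manager
# plugin) handles the interactive shell on top of this API call.
response = ssm.start_session(Target="i-0abc1234def567890")
print(response["SessionId"])
```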

A company uses AWS Organizations to manage its AWS accounts. The company needs to enforce server-side encryption with AWS KMS keys (SSE-KMS) on its Amazon S3 buckets. Which solution will meet this requirement?

A.

Edit the S3 bucket policies to require requests to include the s3:x-amz-server-side-encryption header.

B.

Edit the S3 bucket policies to require requests to include the s3:x-amz-server-side-encryption-aws-kms-key-id header.

C.

Create an SCP that requires requests to include the s3:x-amz-server-side-encryption header. Attach the SCP to the root OU.

D.

Create an SCP that requires requests to include the s3:x-amz-server-side-encryption-customer-algorithm header. Attach the SCP to the root OU.
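
The SCP in option C can be sketched as a Deny on uploads that do not request SSE-KMS, created and attached at the organization root. The policy name and root ID are placeholders.

```python
import json

import boto3

org = boto3.client("organizations")

# SCP: deny any S3 PutObject that does not request SSE-KMS encryption.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "RequireSSEKMS",
        "Effect": "Deny",
        "Action": "s3:PutObject",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"s3:x-amz-server-side-encryption": "aws:kms"}
        },
    }],
}

policy = org.create_policy(
    Name="require-sse-kms",
    Description="Deny S3 uploads without SSE-KMS",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)

# Attach at the root so the control applies to every account. The root ID
# is a placeholder; it can be read from organizations.list_roots().
org.attach_policy(PolicyId=policy["Policy"]["PolicySummary"]["Id"],
                  TargetId="r-exampleroot")
```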

A company wants to create a log analytics solution for logs generated from its on-premises devices. The logs are collected from the devices onto a server on premises. The company wants to use AWS services to perform near real-time log analysis. The company also wants to store these logs for 365 days for pattern matching and substring search capabilities later.

Which solution will meet these requirements with the LEAST development overhead?

A.

Install Amazon Kinesis Agent on the on-premises server to send the logs to Amazon DynamoDB. Configure an AWS Lambda trigger on DynamoDB streams to perform near real-time log analysis. Export the DynamoDB data to Amazon S3 periodically. Run Amazon Athena queries for pattern matching and substring search. Set up S3 Lifecycle policies to delete the log data after 365 days.

B.

Install Amazon Managed Streaming for Apache Kafka (Amazon MSK) on the on-premises server. Create an MSK cluster to collect the streaming data and analyze the data in real time. Set the data retention period to 365 days to store the logs persistently for pattern matching and substring search.

C.

Install Amazon Kinesis Agent on the on-premises server to send the logs to Amazon Data Firehose. Configure Amazon Managed Service for Apache Flink (previously known as Amazon Kinesis Data Analytics) as the destination for real-time processing. Store the logs in Amazon OpenSearch Service for pattern matching and substring search. Configure an OpenSearch Service Index State Management (ISM) policy to delete the data after 365 days.

D.

Use Amazon API Gateway and AWS Lambda to write the logs from the on-premises server to Amazon DynamoDB. Configure a Lambda trigger on DynamoDB streams to perform near real-time log analysis. Run Amazon Athena federated queries on DynamoDB data for pattern matching and substring search. Set up TTL to delete data after 365 days.
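
For the retention piece of option C, an OpenSearch Service ISM policy can delete log indices after 365 days. The sketch below only builds the policy body; the policy name and index pattern are placeholders, and it would be applied with a PUT to the domain's _plugins/_ism/policies endpoint.

```python
import json

# ISM policy body: indices start in "hot" and are deleted once they are
# 365 days old. State and policy names are placeholders.
ism_policy = {
    "policy": {
        "description": "Delete log indices after 365 days",
        "default_state": "hot",
        "states": [
            {
                "name": "hot",
                "actions": [],
                "transitions": [
                    {"state_name": "delete",
                     "conditions": {"min_index_age": "365d"}}
                ],
            },
            {
                "name": "delete",
                "actions": [{"delete": {}}],
                "transitions": [],
            },
        ],
        # Automatically attach the policy to new log indices.
        "ism_template": {"index_patterns": ["logs-*"], "priority": 100},
    }
}

print(json.dumps(ism_policy, indent=2))
```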

An AWS Lambda function was misused to alter data, and a security engineer must identify who invoked the function and what output was produced. The engineer cannot find any logs created by the Lambda function in Amazon CloudWatch Logs.

Which of the following explains why the logs are not available?

A.

The execution role for the Lambda function did not grant permissions to write log data to CloudWatch Logs.

B.

The Lambda function was invoked by using Amazon API Gateway, so the logs are not stored in CloudWatch Logs.

C.

The execution role for the Lambda function did not grant permissions to write to the Amazon S3 bucket where CloudWatch Logs stores the logs.

D.

The version of the Lambda function that was invoked was not current.
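
If the missing logs are due to the execution role, as described in option A, the quickest remediation is to grant the role CloudWatch Logs write access (logs:CreateLogGroup, logs:CreateLogStream, logs:PutLogEvents), for example via the AWSLambdaBasicExecutionRole managed policy. The role name below is a placeholder.

```python
import boto3

iam = boto3.client("iam")

# Grant the function's execution role permission to write its logs to
# CloudWatch Logs via the AWS managed policy for basic Lambda execution.
iam.attach_role_policy(
    RoleName="misused-function-execution-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)
```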

A company hosts a web-based application that captures and stores sensitive data in an Amazon DynamoDB table. The company needs to implement a solution that provides end-to-end data protection and the ability to detect unauthorized data changes.

Which solution will meet these requirements?

A.

Use an AWS Key Management Service (AWS KMS) customer managed key. Encrypt the data at rest.

B.

Use AWS Private Certificate Authority. Encrypt the data in transit.

C.

Use the DynamoDB Encryption Client. Use client-side encryption. Sign the table items.

D.

Use the AWS Encryption SDK. Use client-side encryption. Sign the table items.
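
Option C can be sketched with the DynamoDB Encryption Client for Python (the dynamodb-encryption-sdk package); the table name, KMS key ARN, and attribute names are placeholders. Items are encrypted and signed client side, so unauthorized changes are detected when the signature fails to verify on read.

```python
import boto3
from dynamodb_encryption_sdk.encrypted.table import EncryptedTable
from dynamodb_encryption_sdk.identifiers import CryptoAction
from dynamodb_encryption_sdk.material_providers.aws_kms import (
    AwsKmsCryptographicMaterialsProvider,
)
from dynamodb_encryption_sdk.structures import AttributeActions

table = boto3.resource("dynamodb").Table("sensitive-data")  # placeholder name

# KMS-backed cryptographic materials provider; the key ARN is a placeholder.
provider = AwsKmsCryptographicMaterialsProvider(
    key_id="arn:aws:kms:us-east-1:111111111111:key/example-key-id"
)

# Encrypt and sign every attribute by default; key attributes are signed
# only, so they remain usable as keys but are still tamper-evident.
actions = AttributeActions(
    default_action=CryptoAction.ENCRYPT_AND_SIGN,
    attribute_actions={"customer_id": CryptoAction.SIGN_ONLY},
)

encrypted_table = EncryptedTable(
    table=table,
    materials_provider=provider,
    attribute_actions=actions,
)

# Items written through the wrapper are encrypted client side and signed;
# reads verify the signature before returning the item.
encrypted_table.put_item(Item={"customer_id": "123", "ssn": "000-00-0000"})
```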

An online media company has an application that customers use to watch events around the world. The application is hosted on a fleet of Amazon EC2 instances that run Amazon Linux 2. The company uses AWS Systems Manager to manage the EC2 instances. The company applies patches and application updates by using the AWS-AmazonLinux2DefaultPatchBaseline patching baseline in Systems Manager Patch Manager.

The company is concerned about potential attacks on the application during the week of an upcoming event. The company needs a solution that can immediately deploy patches to all the EC2 instances in response to a security incident or vulnerability. The solution also must provide centralized evidence that the patches were applied successfully.

Which combination of steps will meet these requirements? (Select TWO.)

A.

Create a new patching baseline in Patch Manager. Specify Amazon Linux 2 as the product. Specify Security as the classification. Set the automatic approval for patches to 0 days. Ensure that the new patching baseline is the designated default for Amazon Linux 2.

B.

Use the Patch Now option with the scan and install operation in the Patch Manager console to apply patches against the baseline to all nodes. Specify an Amazon S3 bucket as the patching log storage option.

C.

Use the Clone function of Patch Manager to create a copy of the AWS-AmazonLinux2DefaultPatchBaseline built-in baseline. Set the automatic approval for patches to 1 day.

D.

Create a patch policy that patches all managed nodes and sends a patch operation log output to an Amazon S3 bucket. Use a custom scan schedule to set Patch Manager to check every hour for new patches. Assign the baseline to the patch policy.

E.

Use Systems Manager Application Manager to inspect the package versions that were installed on the EC2 instances. Additionally, use Application Manager to validate that the patches were correctly installed.
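
Options A and B together can be sketched through the SSM API: register a zero-day-approval Security baseline as the default for Amazon Linux 2, then run an on-demand install of AWS-RunPatchBaseline with command output written to S3 as evidence. The baseline name, target tag, and bucket name are placeholders.

```python
import boto3

ssm = boto3.client("ssm")

# Option A: a baseline that auto-approves Security patches immediately
# (0 days) for Amazon Linux 2, registered as the default for that OS.
baseline = ssm.create_patch_baseline(
    Name="al2-security-immediate",
    OperatingSystem="AMAZON_LINUX_2",
    ApprovalRules={"PatchRules": [{
        "PatchFilterGroup": {"PatchFilters": [
            {"Key": "CLASSIFICATION", "Values": ["Security"]},
        ]},
        "ApproveAfterDays": 0,
    }]},
)
ssm.register_default_patch_baseline(BaselineId=baseline["BaselineId"])

# Option B, expressed via the API: run AWS-RunPatchBaseline with the
# Install operation on the fleet and store the output in an S3 bucket
# as centralized evidence of the patch run.
ssm.send_command(
    Targets=[{"Key": "tag:App", "Values": ["event-app"]}],
    DocumentName="AWS-RunPatchBaseline",
    Parameters={"Operation": ["Install"]},
    OutputS3BucketName="patch-evidence-logs",
)
```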