Free Practice · No Signup Required
30 Free AWS SCS-C02 Practice Questions
Real practice questions for the AWS Certified Security – Specialty (SCS-C02) exam, with answers and detailed explanations. Updated 2026.
Free questions
30
Passing score
750 out of 1000
Exam time
170 minutes
Question pool
400+ Questions
Below are 30 real practice questions for the AWS Certified Security – Specialty (SCS-C02) exam. Each question shows the correct answer and a detailed explanation when you reveal it. Use these to benchmark your readiness — if you score below 70% on these 30 questions, plan for at least 4 more weeks of study before booking.
SCS-C02 Practice Questions
Question 1.A business requires a forensic logging solution for hundreds of Docker-based apps running on Amazon EC2. The solution must analyze logs in real time, provide message replay, and persist logs. Which Amazon Web Services (AWS) services should be employed to satisfy these requirements? (Select TWO)
- A.Amazon Athena.
- B.Amazon Kinesis.(correct answer)
- C.Amazon SQS.
- D.Amazon Elasticsearch.(correct answer)
- E.Amazon EMR.
Correct answer: B, D
Amazon Kinesis. / Amazon Elasticsearch.
Explanation
Kinesis Data Streams enables real-time processing and replay of streaming data (logs). Amazon Elasticsearch (OpenSearch) provides index and search capabilities for analytics. Together they form a common logging solution.
Question 2.A company developed an application by using AWS Lambda, Amazon S3, Amazon Simple Notification Service (Amazon SNS), and Amazon DynamoDB. An external application puts objects into the company's S3 bucket and tags the objects with date and time. A Lambda function periodically pulls data from the company's S3 bucket based on date and time tags and inserts specific values into a DynamoDB table for further processing. The data includes personally identifiable information (PII). The company must remove data that is older than 30 days from the S3 bucket and the DynamoDB table. Which solution will meet this requirement with the MOST operational efficiency?
- A.Update the Lambda function to add a TTL S3 flag to S3 objects. Create an S3 Lifecycle policy to expire objects that are older than 30 days by using the TTL S3 flag.
- B.Create an S3 Lifecycle policy to expire objects that are older than 30 days. Update the Lambda function to add the TTL attribute in the DynamoDB table. Enable TTL on the DynamoDB table to expire entries that are older than 30 days based on the TTL attribute.(correct answer)
- C.Create an S3 Lifecycle policy to expire objects that are older than 30 days and to add all prefixes to the S3 bucket. Update the Lambda function to delete entries that are older than 30 days.
- D.Create an S3 Lifecycle policy to expire objects that are older than 30 days by using object tags. Update the Lambda function to delete entries that are older than 30 days.
Correct answer: B
Create an S3 Lifecycle policy to expire objects that are older than 30 days. Update the Lambda function to add the TTL attribute in the DynamoDB table. Enable TTL on the DynamoDB table to expire entries that are older than 30 days based on the TTL attribute.
Explanation
S3 Lifecycle Policies can expire objects based on age (30 days). DynamoDB TTL (Time To Live) automatically deletes items when a specified timestamp attribute expires, without consuming write throughput.
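The TTL side of this answer can be sketched in a few lines: the Lambda function writes an epoch-seconds timestamp into a TTL attribute, and DynamoDB deletes the item once that time passes. A minimal sketch — the attribute name `expires_at` and the item shape are assumptions for illustration, not from the question:

```python
import time

SECONDS_PER_DAY = 86_400

def ttl_for(created_epoch: int, days: int = 30) -> int:
    """Epoch-seconds value at which DynamoDB TTL should expire the item."""
    return created_epoch + days * SECONDS_PER_DAY

# Example item: once TTL is enabled on the table with `expires_at` as the
# TTL attribute, DynamoDB deletes it roughly 30 days after creation.
now = int(time.time())
item = {"pk": "s3-object-key", "expires_at": ttl_for(now)}
```

DynamoDB TTL requires the attribute to be a Number holding epoch seconds; expired items are removed in the background without consuming write throughput.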
Question 3.A company is hosting a static website on Amazon S3. The company has configured an Amazon CloudFront distribution to serve the website contents. The company has associated an AWS WAF web ACL with the CloudFront distribution. The web ACL ensures that requests originate from the United States to address compliance restrictions. The company is worried that the S3 URL might still be accessible directly and that requests can bypass the CloudFront distribution. Which combination of steps should the company take to remove direct access to the S3 URL? (Select TWO)
- A.Select `Restrict Bucket Access` in the origin settings of the CloudFront distribution(correct answer)
- B.Create an origin access identity (OAI) for the S3 origin.(correct answer)
- C.Update the S3 bucket policy to allow s3:GetObject with a condition that the Referer header matches the secret value. Deny all other requests.
- D.Configure the S3 bucket policy so that only the origin access identity (OAI) has read permission for objects in the bucket.
- E.Add an origin custom header that has the name Referer to the CloudFront distribution. Give the header a secret value.
Correct answer: A, B
Select `Restrict Bucket Access` in the origin settings of the CloudFront distribution / Create an origin access identity (OAI) for the S3 origin.
Explanation
To prevent direct S3 access and force traffic through CloudFront: (1) Use an Origin Access Identity (OAI) or OAC. (2) Update the S3 Bucket Policy to allow access ONLY to that OAI/OAC.
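As a sketch, the bucket policy that pairs with the OAI allows reads only from the OAI principal; the bucket name and OAI ID below are hypothetical placeholders:

```python
import json

# Hypothetical values for illustration only.
BUCKET = "example-website-bucket"
OAI_ID = "E1EXAMPLEOAI"

# Bucket policy: only the CloudFront OAI may read objects, so direct
# requests to the S3 URL are denied by default (no public access).
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowCloudFrontOAIReadOnly",
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {OAI_ID}"
        },
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}
print(json.dumps(policy, indent=2))
```

Newer distributions would use Origin Access Control (OAC) with a `cloudfront.amazonaws.com` service principal instead, but the shape of the policy is the same idea: grant read to CloudFront only.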
Question 4.A company is testing its incident response plan for compromised credentials. The company runs a database on an Amazon EC2 instance and stores the sensitive database credentials as a secret in AWS Secrets Manager. The secret has rotation configured with an AWS Lambda function that uses the generic rotation function template. The EC2 instance and the Lambda function are deployed in the same private subnet. The VPC has a Secrets Manager VPC endpoint. A security engineer discovers that the secret cannot rotate. The security engineer determines that the VPC endpoint is working as intended. The Amazon CloudWatch logs contain the following error: `"setSecret: Unable to log into database"`. Which solution will resolve this error?
- A.Use the AWS Management Console to edit the JSON structure of the secret in Secrets Manager so that the secret automatically conforms with the structure that the database requires.
- B.Ensure that the security group that is attached to the Lambda function allows outbound connections to the EC2 instance. Ensure that the security group that is attached to the EC2 instance allows inbound connections from the security group that is attached to the Lambda function.(correct answer)
- C.Use the Secrets Manager list-secrets command in the AWS CLI to list the secret. Identify the database
- D.Add an internet gateway to the VPC. Create a NAT gateway in a public subnet. Update the VPC route tables so that traffic from the Lambda function and traffic from the EC2 instance can reach the Secrets Manager public endpoint.
Correct answer: B
Ensure that the security group that is attached to the Lambda function allows outbound connections to the EC2 instance. Ensure that the security group that is attached to the EC2 instance allows inbound connections from the security group that is attached to the Lambda function.
Explanation
For Lambda to access a resource (database) or service endpoint (Secrets Manager) inside a VPC, the Security Groups must allow the traffic. The Lambda SG needs outbound access, and the EC2/Endpoint SG needs inbound access from the Lambda SG.
Question 5.A company needs a forensic-logging solution for hundreds of applications running in Docker on Amazon EC2. The solution must perform real-time analytics on the logs, must support the replay of messages, and must persist the logs. Which AWS services should be used to meet these requirements? (Select TWO)
- A.Amazon Athena.
- B.Amazon Kinesis.(correct answer)
- C.Amazon SQS.
- D.Amazon Elasticsearch.(correct answer)
- E.Amazon EMR.
Correct answer: B, D
Amazon Kinesis. / Amazon Elasticsearch.
Explanation
Kinesis Data Streams (for ingestion/replay) and Amazon Elasticsearch/OpenSearch (for analytics) are the standard pattern for real-time forensic logging.
Question 6.A company is evaluating the use of AWS Systems Manager Session Manager to gain access to the company's Amazon EC2 instances. However, until the company implements the change, the company must protect the key file for the EC2 instances from read and write operations by any other users. When a security administrator tries to connect to a critical EC2 Linux instance during an emergency, the security administrator receives the following error: `"Error: Unprotected private key file – Permissions for '.ssh/my_private_key.pem' are too open"`. Which command should the security administrator use to modify the private key file permissions to resolve this error?
- A.chmod 0040 .ssh/my_private_key.pem.
- B.chmod 0400 .ssh/my_private_key.pem.(correct answer)
- C.chmod 0004 .ssh/my_private_key.pem.
- D.chmod 0777 .ssh/my_private_key.pem.
Correct answer: B
chmod 0400 .ssh/my_private_key.pem.
Explanation
SSH private keys must have strict permissions (read/write only by the owner). `chmod 400` sets read-only for the owner and no permissions for anyone else, which satisfies SSH requirements.
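The permission bits behind `chmod 0400` can be verified locally. This sketch locks down a stand-in key file (a temporary file, not a real key) and checks that only the owner retains read access:

```python
import os
import stat
import tempfile

# Create a stand-in "private key" file, then apply the equivalent of
# `chmod 0400`: read for owner, nothing for group or others.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o400)

mode = stat.S_IMODE(os.stat(path).st_mode)
assert mode == 0o400              # owner read-only
assert mode & stat.S_IRGRP == 0   # group cannot read
assert mode & stat.S_IROTH == 0   # others cannot read

os.remove(path)  # clean up (owner can still unlink; 0400 governs the file's data)
```

The octal digits map to owner/group/other: `4` is read, `2` is write, `1` is execute, so `0400` means read-only for the owner and no access for anyone else — exactly what the SSH client insists on for private keys.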
Question 7.A company deploys a set of standard IAM roles in AWS accounts. The IAM roles are based on job functions within the company. To balance operational efficiency and security, a security engineer implemented AWS Organizations SCPs to restrict access to critical security services in all company accounts. All of the company's accounts and OUs within AWS Organizations have a default FullAWSAccess SCP that is attached. The security engineer needs to ensure that no one can disable Amazon GuardDuty and AWS Security Hub. The security engineer also must not override other permissions that are granted by IAM policies that are defined in the accounts. Which SCP should the security engineer attach to the root of the organization to meet these requirements?
- A.Option A.(correct answer)
- B.Option B.
- C.Option C.
- D.Option D.
Correct answer: A
Option A.
Explanation
Service Control Policies (SCPs) can deny specific actions (like disabling GuardDuty) across the organization, even for the root user, ensuring security baselines are not tampered with.
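The actual answer options are not reproduced here, but an SCP of the kind the explanation describes might look like the following sketch; the action list is an illustrative subset, not the exam's exact policy:

```python
import json

# Illustrative SCP: denies actions that could disable GuardDuty or
# Security Hub. SCPs only filter permissions; they never grant them, so
# the default FullAWSAccess SCP continues to allow everything else and
# IAM policies in member accounts are otherwise unaffected.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "ProtectSecurityServices",
        "Effect": "Deny",
        "Action": [
            "guardduty:DeleteDetector",
            "guardduty:DisassociateFromMasterAccount",
            "securityhub:DisableSecurityHub",
            "securityhub:DeleteInvitations",
        ],
        "Resource": "*",
    }],
}
print(json.dumps(scp, indent=2))
```

Attaching this at the organization root applies the deny to every account, satisfying the requirement without overriding other IAM-granted permissions.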
Question 8.A company is building a data processing application that uses AWS Lambda functions. The application's Lambda functions need to communicate with an Amazon RDS DB instance that is deployed within a VPC in the same AWS account. Which solution meets these requirements in the MOST secure way?
- A.Configure the DB instance to allow public access. Update the DB instance security group to allow access from the Lambda public address space for the AWS Region.
- B.Deploy the Lambda functions inside the VPC. Attach a network ACL to the Lambda subnet. Provide outbound rule access to the VPC CIDR range only. Update the DB instance security group to allow traffic from 0.0.0.0/0.
- C.Deploy the Lambda functions inside the VPC. Attach a security group to the Lambda functions. Provide outbound rule access to the VPC CIDR range only. Update the DB instance security group to allow traffic from the Lambda security group.(correct answer)
- D.Peer the Lambda default VPC with the VPC that hosts the DB instance to allow direct network access without the need for security groups.
Correct answer: C
Deploy the Lambda functions inside the VPC. Attach a security group to the Lambda functions. Provide outbound rule access to the VPC CIDR range only. Update the DB instance security group to allow traffic from the Lambda security group.
Explanation
The most secure (principle of least privilege) way to allow connectivity between resources in a VPC is to reference the SOURCE security group ID in the DESTINATION security group's inbound rule.
Question 9.A company has an application that uses an Amazon RDS PostgreSQL database. The company is developing an application feature that will store sensitive information for an individual in the database. During a security review of the environment, the company discovers that the RDS DB instance is not encrypting data at rest. The company needs a solution that will provide encryption at rest for all the existing data and for any new data that is entered for an individual. Which combination of options can the company use to meet these requirements? (Select TWO)
- A.Create a snapshot of the DB instance. Copy the snapshot to a new snapshot, and enable encryption for the copy process. Use the new snapshot to restore the DB instance.(correct answer)
- B.Modify the configuration of the DB instance by enabling encryption. Create a snapshot of the DB instance. Use the snapshot to restore the DB instance.
- C.Use AWS Key Management Service (AWS KMS) to create a new default AWS managed aws/rds key. Select this key as the encryption key for operations with Amazon RDS.
- D.Use AWS Key Management Service (AWS KMS) to create a new CMK. Select this key as the encryption key for operations with Amazon RDS.(correct answer)
- E.Create a snapshot of the DB instance. Enable encryption on the snapshot. Use the snapshot to restore the DB instance.
Correct answer: A, D
Create a snapshot of the DB instance. Copy the snapshot to a new snapshot, and enable encryption for the copy process. Use the new snapshot to restore the DB instance. / Use AWS Key Management Service (AWS KMS) to create a new CMK. Select this key as the encryption key for operations with Amazon RDS.
Explanation
To encrypt an existing unencrypted RDS DB: (1) Take a snapshot. (2) Copy the snapshot and enable encryption during the copy. (3) Restore a new instance from the encrypted snapshot.
Question 10.Which of the following bucket policies will ensure that objects being uploaded to a bucket called 'demo' are encrypted?
- A.Option A.(correct answer)
- B.Option B.
- C.Option C.
- D.Option D.
Correct answer: A
Option A.
Explanation
A bucket policy can enforce encryption by Denying `PutObject` requests that do not include the `x-amz-server-side-encryption` header (e.g., AES256 or aws:kms).
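The answer options themselves are not shown here, but a deny-unencrypted-uploads policy for the `demo` bucket typically uses the `Null` condition operator to catch requests missing the encryption header; a sketch of that pattern:

```python
# Sketch of a bucket policy for the 'demo' bucket: deny any PutObject
# request that does not carry the x-amz-server-side-encryption header.
# With this in place, only uploads requesting SSE (AES256 or aws:kms)
# succeed; everything else is rejected with Access Denied.
demo_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnencryptedUploads",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::demo/*",
        "Condition": {
            # "Null": true matches requests where the key is absent.
            "Null": {"s3:x-amz-server-side-encryption": "true"}
        },
    }],
}
```

A stricter variant adds a second Deny with `StringNotEquals` to force a specific algorithm (e.g. `aws:kms`) rather than merely requiring the header to be present.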
Question 11.A company uses AWS Organizations to manage a multi-account AWS environment in a single AWS Region. The organization's management account is named management-01. The company has turned on AWS Config in all accounts in the organization. The company has designated an account named security-01 as the delegated administrator for AWS Config. All accounts report the compliance status of each account's rules to the AWS Config delegated administrator account by using an AWS Config aggregator. Each account administrator can configure and manage the account's own AWS Config rules to handle each account's unique compliance requirements. A security engineer needs to implement a solution to automatically deploy a set of 10 AWS Config rules to all existing and future AWS accounts in the organization. The solution must turn on AWS Config automatically during account creation. Which combination of steps will meet these requirements? (Select TWO)
- A.Create an AWS CloudFormation template that contains the 10 required AWS Config rules. Deploy the template by using CloudFormation StackSets in the security-01 account.
- B.Create a conformance pack that contains the 10 required AWS Config rules. Deploy the conformance pack from the security-01 account.(correct answer)
- C.Create a conformance pack that contains the 10 required AWS Config rules. Deploy the conformance pack from the management-01 account.
- D.Create an AWS CloudFormation template that will activate AWS Config. Deploy the template by using CloudFormation StackSets in the security-01 account.
- E.Create an AWS CloudFormation template that will activate AWS Config. Deploy the template by using CloudFormation StackSets in the management-01 account.(correct answer)
Correct answer: B, E
Create a conformance pack that contains the 10 required AWS Config rules. Deploy the conformance pack from the security-01 account. / Create an AWS CloudFormation template that will activate AWS Config. Deploy the template by using CloudFormation StackSets in the management-01 account.
Explanation
To deploy AWS Config rules to all accounts: (1) Use a conformance pack (a collection of rules), deployed from the delegated administrator account. (2) Turn on AWS Config itself by deploying a CloudFormation StackSet from the management account so that existing and future accounts are covered.
Question 12.A company has two AWS accounts within AWS Organizations. In Account-1, Amazon EC2 Auto Scaling is launched using a service-linked role. In Account-2, Amazon EBS volumes are encrypted with an AWS KMS key. A Security Engineer needs to ensure that the service-linked role can launch instances with these encrypted volumes. Which combination of steps should the Security Engineer take in both accounts? (Select TWO)
- A.Allow Account-1 to access the KMS key in Account-2 using a key policy(correct answer)
- B.Attach an IAM policy to the service-linked role in Account-1 that allows these actions: CreateGrant, DescribeKey, Encrypt, GenerateDataKey, Decrypt, and ReEncrypt.
- C.Create a KMS grant for the service-linked role with these actions: CreateGrant, DescribeKey, Encrypt, GenerateDataKey, Decrypt, and ReEncrypt.(correct answer)
- D.Attach an IAM policy to the role attached to the EC2 instances with KMS actions and then allow Account-1 in the KMS key policy.
- E.Attach an IAM policy to the user who is launching EC2 instances and allow the user to access the KMS key policy of Account-2.
Correct answer: A, C
Allow Account-1 to access the KMS key in Account-2 using a key policy / Create a KMS grant for the service-linked role with these actions: CreateGrant, DescribeKey, Encrypt, GenerateDataKey, Decrypt, and ReEncrypt.
Explanation
Cross-account KMS usage requires: (1) The key policy in Account-2 must allow Account-1. (2) The service-linked role in Account-1 must be authorized to use the key; because service-linked role policies cannot be modified, this is done with a KMS grant rather than an attached IAM policy.
Question 13.Which of the following are valid configurations for using SSL certificates with Amazon CloudFront? (Select THREE)
- A.Default AWS Certificate Manager certificate.
- B.Custom SSL certificate stored in AWS KMS.
- C.Default CloudFront certificate.(correct answer)
- D.Custom SSL certificate stored in AWS Certificate Manager.(correct answer)
- E.Default SSL certificate stored in AWS Secrets Manager.
- F.Custom SSL certificate stored in AWS IAM.(correct answer)
Correct answer: C, D, F
Default CloudFront certificate. / Custom SSL certificate stored in AWS Certificate Manager. / Custom SSL certificate stored in AWS IAM.
Explanation
CloudFront supports: (1) the default CloudFront certificate (*.cloudfront.net), (2) a custom certificate via ACM (must be in us-east-1), and (3) a custom certificate imported to IAM (legacy/special cases).
Question 14.A Security Engineer is troubleshooting an issue with a company's custom logging application. The application logs are written to an Amazon S3 bucket with event notifications enabled to send events to an Amazon SNS topic. All logs are encrypted at rest using an AWS KMS CMK. The SNS topic is subscribed to an encrypted Amazon SQS queue. The logging application polls the queue for new messages that contain metadata about the S3 object. The application then reads the content of the object from the S3 bucket for indexing. The Logging team reported that Amazon CloudWatch metrics for the number of messages sent or received is showing zero. No logs are being received. What should the Security Engineer do to troubleshoot this issue?
- A.Option A: Add the following statement to the IAM managed CMKs.
- B.Option B: Add the following statement to the CMK key policy.(correct answer)
- C.Option C: Add the following statement to the CMK key policy.
- D.Option D: Add the following statement to the CMK key policy.
Correct answer: B
Option B: Add the following statement to the CMK key policy.
Explanation
If S3 event notifications fail to reach an encrypted SNS topic, it's often because the SNS topic's KMS key policy does not allow S3 to call `kms:GenerateDataKey` and `kms:Decrypt` to publish the message.
Question 15.A security engineer needs to implement a write-once-read-many (WORM) model for data that a company will store in Amazon S3 buckets. The company uses the S3 Standard storage class for all of its S3 buckets. The security engineer must ensure that objects cannot be overwritten or deleted by any user, including the AWS account root user. Which solution will meet these requirements?
- A.Create new S3 buckets with S3 Object Lock enabled in compliance mode. Place objects in the S3 buckets.(correct answer)
- B.Use S3 Glacier Vault Lock to attach a Vault Lock policy to new S3 buckets. Wait 24 hours to complete the Vault Lock process. Place objects in the S3 buckets.
- C.Create new S3 buckets with S3 Object Lock enabled in governance mode. Place objects in the S3 buckets.
- D.Create new S3 buckets with S3 Object Lock enabled in governance mode. Add a legal hold to the S3 buckets. Place objects in the S3 buckets.
Correct answer: A
Create new S3 buckets with S3 Object Lock enabled in compliance mode. Place objects in the S3 buckets.
Explanation
S3 Object Lock in **Compliance Mode** prevents deletion or overwriting by *anyone*, including the root user, for the retention period. (Governance mode allows authorized users to bypass).
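For illustration, an Object Lock configuration in compliance mode can be sketched as the JSON document the S3 API expects; the 365-day retention period below is an assumed value, not from the question:

```python
# Sketch of the configuration passed to S3 PutObjectLockConfiguration
# (e.g. boto3's put_object_lock_configuration ObjectLockConfiguration
# parameter). COMPLIANCE mode means no principal -- including the root
# user -- can overwrite or delete a locked object version until the
# retention period expires. GOVERNANCE mode, by contrast, can be
# bypassed by users with s3:BypassGovernanceRetention.
object_lock_configuration = {
    "ObjectLockEnabled": "Enabled",
    "Rule": {
        "DefaultRetention": {
            "Mode": "COMPLIANCE",
            "Days": 365,  # assumed retention period for this sketch
        }
    },
}
```

Note that Object Lock must be enabled at bucket creation time (which also turns on versioning), which is why the answer calls for creating new buckets.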
Question 16.A development team is attempting to encrypt and decrypt a secure string parameter from the AWS Systems Manager Parameter Store using an AWS Key Management Service (AWS KMS) CMK. However, each attempt results in an error message being sent to the development team. Which CMK-related problems possibly account for the error? (Select TWO)
- A.The CMK that is used in the attempt does not exist.(correct answer)
- B.The CMK that is used in the attempt needs to be rotated.
- C.The attempt is using the CMK's key ID instead of the CMK ARN.
- D.The CMK that is used in the attempt is not enabled.(correct answer)
- E.The attempt is using an alias for the CMK.
Correct answer: A, D
The CMK that is used in the attempt does not exist. / The CMK that is used in the attempt is not enabled.
Explanation
KMS Errors: 'Valid key does not exist' usually means the specific Key ID/ARN is wrong or the key is disabled/pending deletion. Alias issues are also possible if the alias doesn't map correctly.
Question 17.A security engineer logs in to the AWS Lambda console with administrator permissions. The security engineer is trying to view logs in Amazon CloudWatch for a Lambda function that is named myFunction. When the security engineer chooses the option in the Lambda console to view logs in CloudWatch, an `error loading Log Streams` message appears. The IAM policy for the Lambda function's execution role contains the following. How should the security engineer correct the error?
- A.Move the `logs:CreateLogGroup` action to the second Allow statement.
- B.Add the `logs:PutDestination` action to the second Allow statement.
- C.Add the `logs:GetLogEvents` action to the second Allow statement.
- D.Add the `logs:CreateLogStream` action to the second Allow statement.(correct answer)
Correct answer: D
Add the `logs:CreateLogStream` action to the second Allow statement.
Explanation
The `logs:CreateLogStream` permission is required for a Lambda function (or any resource) to start writing logs to a specific Log Group.
Question 18.A company plans to create individual child accounts within an existing organization in AWS Organizations for each of its DevOps teams. AWS CloudTrail has been enabled and configured on all accounts to write audit logs to an Amazon S3 bucket in a centralized AWS account. A security engineer needs to ensure that DevOps team members are unable to modify or disable this configuration. How can the security engineer meet these requirements?
- A.Create an IAM policy that prohibits changes to the specific CloudTrail trail and apply the policy to the AWS account root user.
- B.Create an S3 bucket policy in the specified destination account for the CloudTrail trail that prohibits configuration changes from the AWS account root user in the source account.
- C.Create an SCP that prohibits changes to the specific CloudTrail trail and apply the SCP to the appropriate organizational unit or account in Organizations.(correct answer)
- D.Create an IAM policy that prohibits changes to the specific CloudTrail trail and apply the policy to a new IAM group. Have team members use individual IAM accounts that are members of the new IAM group.
Correct answer: C
Create an SCP that prohibits changes to the specific CloudTrail trail and apply the SCP to the appropriate organizational unit or account in Organizations.
Explanation
To protect a CloudTrail trail in a specific account from modification by even the root user of that account, you must apply a Service Control Policy (SCP) at the Organization level.
Question 19.A company uses Amazon RDS for MySQL as a database engine for its applications. A recent security audit revealed an RDS instance that is not compliant with company policy for encrypting data at rest. A security engineer at the company needs to ensure that all existing RDS databases are encrypted using server-side encryption and that any future deviations from the policy are detected. Which combination of steps should the security engineer take to accomplish this? (Select TWO)
- A.Create an AWS Config rule to detect the creation of encrypted RDS databases. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger on the AWS Config rules compliance state change and use Amazon Simple Notification Service (Amazon SNS) to notify the security operations team.(correct answer)
- B.Use AWS Systems Manager State Manager to detect RDS database encryption configuration drift. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to track state changes and use Amazon Simple Notification Service (Amazon SNS) to notify the security operations team.
- C.Create a read replica for the existing unencrypted RDS database and enable replica encryption in the process. Once the replica becomes active, promote it into a standalone database instance and terminate the unencrypted database instance.
- D.Take a snapshot of the unencrypted RDS database. Copy the snapshot and enable snapshot encryption in the process. Restore the database instance from the newly created encrypted snapshot. Terminate the unencrypted database instance.(correct answer)
- E.Enable encryption for the identified unencrypted RDS instance by changing the configurations of the existing database.
Correct answer: A, D
Create an AWS Config rule to detect the creation of encrypted RDS databases. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger on the AWS Config rules compliance state change and use Amazon Simple Notification Service (Amazon SNS) to notify the security operations team. / Take a snapshot of the unencrypted RDS database. Copy the snapshot and enable snapshot encryption in the process. Restore the database instance from the newly created encrypted snapshot. Terminate the unencrypted database instance.
Explanation
Detection: AWS Config Rule (`rds-storage-encrypted`) + EventBridge/SNS. Remediation: Snapshot -> Copy (Encrypt) -> Restore pattern.
Question 20.A company has a large fleet of Linux Amazon EC2 instances and Windows EC2 instances that run in private subnets. The company wants all remote administration to be performed as securely as possible in the AWS Cloud. Which solution will meet these requirements?
- A.Do not use SSH-RSA private keys during the launch of new instances. Implement AWS Systems Manager Session Manager.(correct answer)
- B.Generate new SSH-RSA private keys for existing instances. Implement AWS Systems Manager Session Manager.
- C.Do not use SSH-RSA private keys during the launch of new instances. Configure EC2 Instance Connect.
- D.Generate new SSH-RSA private keys for existing instances. Configure EC2 Instance Connect.
Correct answer: A
Do not use SSH-RSA private keys during the launch of new instances. Implement AWS Systems Manager Session Manager.
Explanation
AWS Systems Manager Session Manager provides secure, auditable remote shell access to instances without opening inbound ports (like 22/3389) or managing SSH keys.
Question 21.A company has an AWS Lambda function that creates image thumbnails from larger images. The Lambda function needs read and write access to an Amazon S3 bucket in the same AWS account. Which solutions will provide the Lambda function this access? (Select TWO)
- A.Create an IAM user that has only programmatic access. Create a new access key pair. Add environmental variables to the Lambda function with the access key ID and secret access key. Modify the Lambda function to use the environmental variables at run time during communication with Amazon S3.
- B.Generate an Amazon EC2 key pair. Store the private key in AWS Secrets Manager. Modify the Lambda function to retrieve the private key from Secrets Manager and to use the private key during communication with Amazon S3.
- C.Create an IAM role for the Lambda function. Attach an IAM policy that allows access to the S3 bucket.(correct answer)
- D.Create an IAM role for the Lambda function. Attach a bucket policy to the S3 bucket to allow access. Specify the function's IAM role as the principal.(correct answer)
- E.Create a security group. Attach the security group to the Lambda function. Attach a bucket policy that allows access to the S3 bucket through the security group ID.
Correct answer: C, D
Create an IAM role for the Lambda function. Attach an IAM policy that allows access to the S3 bucket. / Create an IAM role for the Lambda function. Attach a bucket policy to the S3 bucket to allow access. Specify the function's IAM role as the principal.
Explanation
To grant Lambda access to S3: (1) Create an IAM role for the function. (2) Attach an IAM policy to that role granting S3 permissions (or update the bucket policy). DO NOT put keys in code.
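The two policies behind option C can be sketched as plain JSON documents; the bucket name below is a hypothetical placeholder:

```python
# Hypothetical bucket name for illustration.
BUCKET = "example-thumbnail-bucket"

# Trust policy: lets the Lambda service assume the execution role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy attached to the role: read/write scoped to one bucket,
# rather than long-lived access keys in environment variables (option A).
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}
```

With the role in place, the Lambda runtime obtains temporary credentials automatically; no secret material ever lives in the function's code or configuration.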
Question 22.A security engineer is designing an IAM policy for a script that will use the AWS CLI. The script currently assumes an IAM role that is attached to three AWS managed IAM policies: AmazonEC2FullAccess, AmazonDynamoDBFullAccess, and AmazonVPCFullAccess. The security engineer needs to construct a least privilege IAM policy that will replace the AWS managed IAM policies that are attached to this role. Which solution will meet these requirements in the MOST operationally efficient way?
- A.In AWS CloudTrail, create a trail for management events. Run the script with the existing AWS managed IAM policies. Use IAM Access Analyzer to generate a new IAM policy that is based on access activity in the trail. Replace the existing AWS managed IAM policies with the generated IAM policy for the role.(correct answer)
- B.Remove the existing AWS managed IAM policies from the role. Attach the IAM Access Analyzer Role Policy Generator to the role. Run the script. Return to IAM Access Analyzer and generate a least privilege IAM policy. Attach the new IAM policy to the role.
- C.Create an account analyzer in IAM Access Analyzer. Create an archive rule that has a filter that checks whether the Principal Arn value matches the ARN of the role. Run the script. Remove the existing AWS managed IAM policies from the role.
- D.In AWS CloudTrail, create a trail for management events. Remove the existing AWS managed IAM policies from the role. Run the script. Find the authorization failure in the trail event that is associated with the script. Create a new IAM policy that includes the action and resource that caused the authorization failure. Repeat the process until the script succeeds. Attach the new IAM policy to the role.
Correct answer: A
In AWS CloudTrail, create a trail for management events. Run the script with the existing AWS managed IAM policies. Use IAM Access Analyzer to generate a new IAM policy that is based on access activity in the trail. Replace the existing AWS managed IAM policies with the generated IAM policy for the role.
Explanation
Least privilege: use CloudTrail to record which actions the script actually performed, then use IAM Access Analyzer policy generation to produce a policy scoped to that observed activity.
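One possible CLI flow, sketched under the assumption that a suitable trail and an Access Analyzer service role already exist (all ARNs and the job ID are placeholders; check the current IAM Access Analyzer documentation for exact parameter shapes):

```shell
# Start policy generation from CloudTrail activity for the script's role
aws accessanalyzer start-policy-generation \
  --policy-generation-details '{"principalArn":"arn:aws:iam::111122223333:role/ScriptRole"}' \
  --cloud-trail-details '{"trails":[{"cloudTrailArn":"arn:aws:cloudtrail:us-east-1:111122223333:trail/mgmt-trail","allRegions":true}],"accessRole":"arn:aws:iam::111122223333:role/AccessAnalyzerTrailRole","startTime":"2026-01-01T00:00:00Z"}'

# Retrieve the generated least-privilege policy once the job completes
aws accessanalyzer get-generated-policy --job-id <job-id>
```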
Question 23.A company that uses AWS Organizations wants to see AWS Security Hub findings for many AWS accounts and AWS Regions. Some of the accounts are in the company's organization, and some accounts are in organizations that the company manages for customers. Although the company can see findings in the Security Hub administrator account for accounts in the company's organization, there are no findings from accounts in other organizations. Which combination of steps should the company take to see findings from accounts that are outside the organization that includes the Security Hub administrator account? (Select TWO)
- A.Use a designated administration account to automatically set up member accounts.
- B.Create the AWSServiceRoleForSecurityHub service-linked role for Security Hub.
- C.Send an administration request from the member accounts.(correct answer)
- D.Enable Security Hub for all member accounts.
- E.Send invitations to accounts that are outside the company's organization from the Security Hub administrator account.(correct answer)
Show answer & explanation
Correct answer: C, E
Send an administration request from the member accounts. / Send invitations to accounts that are outside the company's organization from the Security Hub administrator account.
Explanation
Accounts outside the administrator's organization cannot be enrolled through the AWS Organizations integration. Instead, the Security Hub administrator account must send invitations to those external accounts (option E), and each external account must have Security Hub enabled and must accept by sending the association request back to the administrator (option C).
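For accounts outside the organization, the manual invitation flow can be sketched with the CLI as follows (account IDs, email, and invitation ID are placeholders):

```shell
# In the Security Hub administrator account: register and invite the external account
aws securityhub create-members \
  --account-details '[{"AccountId":"111122223333","Email":"security@example.com"}]'
aws securityhub invite-members --account-ids "111122223333"

# In the external member account: enable Security Hub, then accept the invitation
aws securityhub enable-security-hub
aws securityhub accept-administrator-invitation \
  --administrator-id "444455556666" --invitation-id <invitation-id>
```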
Question 24.A company uses identity federation to authenticate users into an identity account (987654321987) where the users assume an IAM role named IdentityRole. The users then assume an IAM role named JobFunctionRole in the target IAM account (123456789123) to perform their job functions. A user is unable to assume the IAM role in the target account. The policy attached to the role in the identity account is. What should be done to enable the user to assume the appropriate role in the target account? 
- A.Option A: Update the IAM policy attached to the role in the identity account to be.
- B.Option B: Update the trust policy on the role in the target account to be.(correct answer)
- C.Option C: Update the trust policy on the role in the identity account to be.
- D.Option D: Update the IAM policy attached to the role in the target account to be.
Show answer & explanation
Correct answer: B
Option B: Update the trust policy on the role in the target account to be.
Explanation
Role Assumption requires two-way permission: (1) The Identity Policy on the User must allow `sts:AssumeRole`. (2) The Trust Policy on the Target Role must allow the User (Principal) to assume it.
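The policy documents from the question are not reproduced here, but a working trust policy on JobFunctionRole in the target account (123456789123) would follow this general shape, naming the identity-account role as the trusted principal:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::987654321987:role/IdentityRole"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```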
Question 25.A company hosts a web application on an Apache Web server. The application runs on Amazon EC2 instances that are in an Auto Scaling group. The company configured the EC2 instances to send the Apache Web server logs to an Amazon CloudWatch Logs group that the company has configured to expire after 1 year. Recently, the company discovered in the Apache Web server logs that a specific IP address is sending suspicious requests to the Web application. A security engineer wants to analyze the past week of Apache Web server logs to determine how many requests that the IP address sent and the corresponding URLs that the IP address requested. What should the security engineer do to meet these requirements with the LEAST effort?
- A.Export the CloudWatch Logs group data to Amazon S3. Use Amazon Macie to query the logs for the specific IP address and the requested URLs.
- B.Configure a CloudWatch Logs subscription to stream the log group to an Amazon OpenSearch Service cluster. Use OpenSearch Service to analyze the logs for the specific IP address and the requested URLs.
- C.Use CloudWatch Logs Insights and a custom query syntax to analyze the CloudWatch logs for the specific IP address and the requested URLs.(correct answer)
- D.Export the CloudWatch Logs group data to Amazon S3. Use AWS Glue to crawl the S3 bucket for only the log entries that contain the specific IP address. Use AWS Glue to view the results.
Show answer & explanation
Correct answer: C
Use CloudWatch Logs Insights and a custom query syntax to analyze the CloudWatch logs for the specific IP address and the requested URLs.
Explanation
CloudWatch Logs Insights enables you to run interactive queries with aggregation and filtering directly on log data without exporting to S3/Athena (which takes more effort/time).
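Assuming the log group contains standard Apache access-log lines, a Logs Insights query along these lines would answer both questions (the IP address `203.0.113.10` is a placeholder, and the parse pattern would need adjusting to the actual log format):

```
fields @timestamp, @message
| parse @message '* - - [*] "* * *"' as client_ip, ts, method, request_url, protocol
| filter client_ip = "203.0.113.10"
| stats count(*) as request_count by request_url
| sort request_count desc
```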
Question 26.A company has multiple Amazon S3 buckets encrypted with customer-managed CMKs. Due to regulatory requirements, the keys must be rotated every year. The company's Security Engineer has enabled automatic key rotation for the CMKs; however, the company wants to verify that the rotation has occurred. What should the Security Engineer do to accomplish this?
- A.Filter AWS CloudTrail logs for KeyRotation events.(correct answer)
- B.Monitor Amazon CloudWatch Events for any AWS KMS CMK rotation events.
- C.Using the AWS CLI, run the `aws kms get-key-rotation-status` operation with the `--key-id` parameter to check the CMK rotation date.
- D.Use Amazon Athena to query AWS CloudTrail logs saved in an S3 bucket to filter Generate New Key events.
Show answer & explanation
Correct answer: A
Filter AWS CloudTrail logs for KeyRotation events.
Explanation
`kms:GetKeyRotationStatus` only reports whether automatic rotation is enabled, not whether a rotation has actually taken place. To verify that rotation occurred, filter CloudTrail for the key rotation events that KMS records when it rotates the key material.
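As an illustration (the key ID is a placeholder, and the exact event name should be confirmed against current KMS documentation), the configuration check and the CloudTrail verification might look like:

```shell
# Confirms only that automatic rotation is ENABLED, not that it happened
aws kms get-key-rotation-status --key-id 1234abcd-12ab-34cd-56ef-1234567890ab

# Search recent CloudTrail management events for KMS rotation activity
aws cloudtrail lookup-events \
  --lookup-attributes AttributeKey=EventName,AttributeValue=RotateKey
```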
Question 27.A company has implemented AWS WAF and Amazon CloudFront for an application. The application runs on Amazon EC2 instances that are part of an Auto Scaling group. The Auto Scaling group is behind an Application Load Balancer (ALB). The AWS WAF web ACL uses an AWS Managed Rules rule group and is associated with the CloudFront distribution. CloudFront receives the request from AWS WAF and then uses the ALB as the distribution's origin. During a security review, a security engineer discovers that the infrastructure is susceptible to a large, layer 7 DDoS attack. How can the security engineer improve the security at the edge of the solution to defend against this type of attack?
- A.Configure the CloudFront distribution to use the Lambda@Edge feature. Create an AWS Lambda function that imposes a rate limit on CloudFront viewer requests. Block the request if the rate limit is exceeded.
- B.Configure the AWS WAF web ACL so that the web ACL has more capacity units to process all AWS WAF rules faster.
- C.Configure AWS WAF with a rate-based rule that imposes a rate limit that automatically blocks requests when the rate limit is exceeded.(correct answer)
- D.Configure the CloudFront distribution to use AWS WAF as its origin instead of the ALB.
Show answer & explanation
Correct answer: C
Configure AWS WAF with a rate-based rule that imposes a rate limit that automatically blocks requests when the rate limit is exceeded.
Explanation
For layer 7 DDoS attacks (HTTP floods), AWS WAF rate-based rules are the most effective tool. They automatically block IPs that exceed a specified request threshold within a 5-minute window.
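A WAFv2 rate-based rule for the web ACL could take roughly this shape (the limit of 2,000 requests per 5-minute window is illustrative):

```json
{
  "Name": "rate-limit-per-ip",
  "Priority": 1,
  "Statement": {
    "RateBasedStatement": {
      "Limit": 2000,
      "AggregateKeyType": "IP"
    }
  },
  "Action": { "Block": {} },
  "VisibilityConfig": {
    "SampledRequestsEnabled": true,
    "CloudWatchMetricsEnabled": true,
    "MetricName": "RateLimitPerIP"
  }
}
```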
Question 28.A company has multiple accounts in the AWS Cloud. Users in the developer account need to have access to specific resources in the production account. What is the MOST secure way to provide this access?
- A.Create one IAM user in the production account. Grant the appropriate permissions to the resources that are needed. Share the password only with the users that need access.
- B.Create cross account access with an IAM role in the developer account. Grant the appropriate permissions to this role. Allow users in the developer account to assume this role to access the production resources.
- C.Create cross-account access with an IAM user account in the production account. Grant the appropriate permissions to this user account. Allow users in the developer account to use this user account to access the production resources.
- D.Create cross-account access with an IAM role in the production account. Grant the appropriate permissions to this role. Allow users in the developer account to assume this role to access the production resources.(correct answer)
Show answer & explanation
Correct answer: D
Create cross-account access with an IAM role in the production account. Grant the appropriate permissions to this role. Allow users in the developer account to assume this role to access the production resources.
Explanation
Cross-Account Access Best Practice: Create an IAM Role in the Resource (Prod) account. Grant permission for the Developer account to Assume it. Developers assume the role to access resources.
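Once the role and its trust policy are in place, a user in the developer account obtains temporary production credentials with a call like this (the account ID and role name are placeholders):

```shell
aws sts assume-role \
  --role-arn arn:aws:iam::999988887777:role/ProdResourceAccessRole \
  --role-session-name dev-to-prod
```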
Question 29.A System Administrator is unable to start an Amazon EC2 instance in the eu-west-1 Region using an IAM role. The same System Administrator is able to start an EC2 instance in the eu-west-2 and eu-west-3 Regions. The SystemAdministrator access policy attached to the System Administrator IAM role allows unconditional access to all AWS services and resources within the account. Which configuration caused this issue?
- A.Option A: An SCP is attached to the account with the following permission statement.
- B.Option B: A permission boundary policy is attached to the System Administrator role with the following permission statement.(correct answer)
- C.Option C: A permission boundary is attached to the System Administrator role with the following permission statement.
- D.Option D: An SCP is attached to the account with the following statement.
Show answer & explanation
Correct answer: B
Option B: A permission boundary policy is attached to the System Administrator role with the following permission statement.
Explanation
Permissions boundaries set the maximum permissions that an identity-based policy can grant; the effective permissions are the intersection of the boundary and the attached policies. If the boundary excludes or explicitly denies eu-west-1, the role cannot act in that Region regardless of what its other policies allow.
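Since the question's policy statements are not shown, here is one shape such a boundary could take: allow everything, but deny all actions when the requested Region is eu-west-1, matching the observed symptom:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowAll",
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    },
    {
      "Sid": "DenyEuWest1",
      "Effect": "Deny",
      "Action": "*",
      "Resource": "*",
      "Condition": {
        "StringEquals": { "aws:RequestedRegion": "eu-west-1" }
      }
    }
  ]
}
```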
Question 30.Amazon GuardDuty has detected communications to a known command and control endpoint from a company's Amazon EC2 instance. The instance was found to be running a vulnerable version of a common web framework. The company's security operations team wants to quickly identify other compute resources with the specific version of that framework installed. Which approach should the team take to accomplish this task?
- A.Scan all the EC2 instances for noncompliance with AWS Config. Use Amazon Athena to query AWS CloudTrail logs for the framework installation.
- B.Scan all the EC2 instances with the Amazon Inspector Network Reachability rules package to identify instances running a web server with RecognizedPortWithListener findings.
- C.Scan all the EC2 instances with AWS Systems Manager to identify the vulnerable version of the web framework.(correct answer)
- D.Scan all the EC2 instances with AWS Resource Access Manager to identify the vulnerable version of the web framework.
Show answer & explanation
Correct answer: C
Scan all the EC2 instances with AWS Systems Manager to identify the vulnerable version of the web framework.
Explanation
To identify installed software versions (inventory) across a fleet, use AWS Systems Manager Inventory (or Amazon Inspector). The Inspector Network Reachability rules package checks reachable ports, not installed software versions.
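With Systems Manager Inventory enabled, installed applications and their versions can be queried per instance; for example (the instance ID is a placeholder):

```shell
aws ssm list-inventory-entries \
  --instance-id i-0abcd1234efgh5678 \
  --type-name "AWS:Application"
```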
Ready for the full SCS-C02 exam?
Get all 400+ Questions, timed simulation, and weak-area analytics. Plans from $2.99 — credits never expire.
Frequently Asked Questions
Are these real SCS-C02 practice questions?
Is the SCS-C02 exam hard?
How many questions are on the real SCS-C02 exam?
Do I need to sign up to use these questions?
Keep studying
Pass SCS-C02 on your first try
Join candidates using DummyExams to practice with realistic timed exams, detailed explanations, and weak-area analytics.
Start full SCS-C02 practice exam