Python Lambda function to check if my S3 buckets are public and make them private
I wrote a Python Lambda function that checks whether the S3 buckets in my account are public and makes them private, but I keep getting this error:
({ "errorMessage": "'S3' object has no attribute 'put_bucket_access_block'", "errorType": "AttributeError", "stackTrace": [ " File \"/var/task/lambda_function.py\", line 66, in lambda_handler\n validate_instance(event)\n", " File \"/var/task/lambda_function.py\", line 32, in validate_instance\n addacl = s3_client.put_bucket_access_block(Bucket = s3_bucketName,\n", " File \"/var/runtime/botocore/client.py\", line 563, in getattr\n self.class.name, item)\n" ] })
Below is the Lambda function, written in Python with Boto3:
```python
import boto3
from botocore.exceptions import ClientError
import json
import logging
import sys

log = logging.getLogger()
log.setLevel(logging.DEBUG)

print('Loading function')

sts_client = boto3.client('sts')


def validate_instance(rec_event):
    # The SNS notification wraps the CloudTrail event for the bucket change.
    sns_msg = json.loads(rec_event['Records'][0]['Sns']['Message'])
    account_id = sns_msg['account']
    event_region = sns_msg['region']

    # Assume a role in the account that owns the bucket.
    assumedRoleObject = sts_client.assume_role(
        RoleArn="arn:aws:iam::{}:role/{}".format(account_id, 'VSC-Admin-Account-Lambda-Execution-Role'),
        RoleSessionName="AssumeRoleSession1"
    )
    credentials = assumedRoleObject['Credentials']
    print(credentials)

    s3_client = boto3.client(
        's3',
        event_region,
        aws_access_key_id=credentials['AccessKeyId'],
        aws_secret_access_key=credentials['SecretAccessKey'],
        aws_session_token=credentials['SessionToken'],
    )
    s3_bucketName = sns_msg['detail']['requestParameters']['bucketName']

    # Block public access on the bucket and force its ACL back to private.
    public_block = s3_client.put_public_access_block(
        Bucket=s3_bucketName,
        PublicAccessBlockConfiguration={
            'BlockPublicAcls': True,
            'IgnorePublicAcls': False,
            'BlockPublicPolicy': True,
            'RestrictPublicBuckets': True
        }
    )
    enableacl = s3_client.put_bucket_acl(
        Bucket=s3_bucketName,
        ACL='private'
    )

    # Enable default encryption if the bucket does not already have it.
    try:
        checkencryption = s3_client.get_bucket_encryption(Bucket=s3_bucketName)
        print("checking the encryption")
        rules = checkencryption['ServerSideEncryptionConfiguration']['Rules']
        print('Bucket: %s, Encryption: %s' % (s3_bucketName, rules))
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            response = s3_client.put_bucket_encryption(
                Bucket=s3_bucketName,
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        },
                    ]
                }
            )
        else:
            print("Bucket: %s, unexpected error: %s" % (s3_bucketName, e))


def lambda_handler(event, context):
    log.info("Here is the Received Event")
    log.info(json.dumps(event))
    validate_instance(event)
```
Solution 1:[1]
The problem, at the time of writing, is that the version of boto3 bundled with the Lambda Python runtime does not include this feature yet, so put_public_access_block is not a method on the S3 client and calling it raises the AttributeError shown above.
One workaround is to deploy your own Lambda layer that includes a later version of boto3 (that does support put_public_access_block). See related answer for options on including boto3 in a Lambda layer.
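To confirm that the bundled SDK is the culprit, a minimal sketch like the one below (a hypothetical check, not part of the original function) can be run in the Lambda to log the runtime's boto3/botocore versions and whether the S3 client actually exposes the method:

```python
import boto3
import botocore


def lambda_handler(event, context):
    # Versions of the SDK bundled with the Lambda Python runtime.
    print('boto3:', boto3.__version__, 'botocore:', botocore.__version__)

    # If the bundled botocore service model predates PutPublicAccessBlock,
    # the generated S3 client has no such method, which is exactly the
    # AttributeError in the question.
    s3 = boto3.client('s3')
    print('put_public_access_block available:',
          hasattr(s3, 'put_public_access_block'))
```

If that prints False, packaging a newer boto3 with the function, either in a Lambda layer or in the deployment package itself, makes the call available.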
Solution 2:[2]
Rather than having to write code, you could simply activate S3 Block Public Access (see Amazon S3 Block Public Access – Another Layer of Protection for Your Accounts and Buckets on the AWS News Blog).
This allows you to block public access on a bucket (or across the whole account), even if ACL settings or a bucket policy would otherwise make it public. It can even override object-level permissions if you wish.
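If you would rather enable it programmatically than through the console, a minimal sketch using the S3 Control API could look like this (the account ID is a placeholder you would replace with your own):

```python
import boto3

# Placeholder account ID; substitute your own, or look it up at runtime
# with sts.get_caller_identity()['Account'].
ACCOUNT_ID = '123456789012'

# Account-level Block Public Access overrides bucket ACLs and bucket
# policies for every bucket in the account.
s3control = boto3.client('s3control')
s3control.put_public_access_block(
    AccountId=ACCOUNT_ID,
    PublicAccessBlockConfiguration={
        'BlockPublicAcls': True,
        'IgnorePublicAcls': True,
        'BlockPublicPolicy': True,
        'RestrictPublicBuckets': True,
    },
)
```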
Solution 3:[3]
You have the wrong method name. Change put_bucket_access_block to put_public_access_block.
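With a sufficiently recent boto3, the call from the stack trace would then look roughly like this (same bucket name and settings as in the question's code):

```python
s3_client.put_public_access_block(
    Bucket=s3_bucketName,
    PublicAccessBlockConfiguration={
        'BlockPublicAcls': True,
        'IgnorePublicAcls': False,
        'BlockPublicPolicy': True,
        'RestrictPublicBuckets': True
    }
)
```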
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | John Rotenstein |
| Solution 3 | Skaperen |
