Mass delete CloudWatch log groups using Boto3 - delete_log_group
I have a pretty lengthy list of CloudWatch log groups I need to delete... close to a hundred. Since you have to delete them one at a time, I thought a little Python script could help me out, but now I'm stuck.
Here's my script so far:
import boto3
from botocore.exceptions import ClientError
import json

# Connect to AWS using the default AWS credentials in the awscli config
cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='/aws/lambda/staging-east1-'
)

# Writes JSON output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4,
              sort_keys=True)

# Opens the file and searches through it to find the given log group names
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print i['logGroupName']
    # cwlogs.delete_log_group(
    #     logGroupName='string'   <--- here is where I'm stuck
    # )
How do I take the value of 'logGroupName' in i and turn it into a string that the delete_log_group call can use, and then iterate through it to delete all of the log groups I need gone? I tried using json.loads and it errored out with the following:
Traceback (most recent call last):
  File "CWLogCleaner.py", line 18, in
    file_parsed = json.loads(f)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
Or am I totally going about this the wrong way?
TIA
Solution 1:[1]
Here's what I got working for me. I'm sure this is hacky and I'm no developer, but it worked for me:
import boto3
import json

cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='ENTER YOUR LOG GROUP NAME PREFIX HERE'
)

# Writes JSON output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4,
              sort_keys=True)

# Opens the file and prints each matching log group name
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print i['logGroupName']

# Deletes every log group found above
for i in file_parsed['logGroups']:
    cwlogs.delete_log_group(
        logGroupName=(i['logGroupName'])
    )
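One caveat with the script above: describe_log_groups returns at most 50 log groups per call, so with close to a hundred groups a single call can miss some of them. The response is also already a Python dict, so the JSON file round-trip isn't strictly needed. A minimal paginated sketch (reusing the /aws/lambda/staging-east1- prefix from the question) could look like this:
import boto3

cwlogs = boto3.client('logs')

# Paginate so every log group matching the prefix is visited,
# not just the first page of results.
paginator = cwlogs.get_paginator('describe_log_groups')
pages = paginator.paginate(logGroupNamePrefix='/aws/lambda/staging-east1-')

for page in pages:
    for group in page['logGroups']:
        name = group['logGroupName']
        print('Deleting ' + name)
        cwlogs.delete_log_group(logGroupName=name)
If a large cleanup runs into throttling errors, adding a short sleep between delete calls is usually enough.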
Solution 2:[2]
None of the solutions here worked the way I wanted (some due to pagination), so I built my own script. It deletes log streams older than 7 days. You can change the timedelta as you choose, set it to 0, or remove the deletion-date condition to remove all logs.
from datetime import datetime, timedelta
import boto3

app_name = 'your function name here'


def login():
    # Client plus a paginator so every log stream in the group is visited
    client = boto3.client('logs')
    paginator = client.get_paginator('describe_log_streams')
    response_iterator = paginator.paginate(
        logGroupName=f'/aws/lambda/{app_name}',
    )
    return client, response_iterator


def deletion_date():
    # Cutoff of 7 days ago, padded from seconds to the 13-digit
    # millisecond epoch format that CloudWatch uses for creationTime
    tod = datetime.today() - timedelta(days=7)
    epoch_date = str(int(tod.timestamp()))
    selected_date = int(epoch_date.ljust(13, '0'))
    return selected_date


def purger():
    # Deletes every log stream created before the cutoff and returns the count
    n = 0
    print('Deleting log files..')
    for item in response:
        collection = item['logStreams']
        for collected_value in collection:
            if collected_value['creationTime'] < req_date:
                resp = client_.delete_log_stream(
                    logGroupName=f'/aws/lambda/{app_name}',
                    logStreamName=f"{collected_value['logStreamName']}"
                )
                n = n + 1
                if resp['ResponseMetadata']['HTTPStatusCode'] != 200:
                    print(f"Unable to purge logStream: {collected_value['logStreamName']}")
    return n


if __name__ == '__main__':
    client_, response = login()
    req_date = deletion_date()
    print(f'{purger()} log streams were purged for the function {app_name}')
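The ljust(13, '0') trick in deletion_date works because CloudWatch's creationTime values are milliseconds since the epoch while datetime.timestamp() returns seconds, so padding three zeros is effectively multiplying by 1000. The same cutoff can be computed more directly; a small sketch of a hypothetical drop-in replacement for deletion_date (nothing else in the script changes):
from datetime import datetime, timedelta

def deletion_date(days=7):
    # CloudWatch creationTime is expressed in milliseconds since the epoch,
    # so convert the cutoff from seconds to milliseconds explicitly.
    cutoff = datetime.today() - timedelta(days=days)
    return int(cutoff.timestamp() * 1000)
Both versions yield effectively the same millisecond cutoff for present-day dates.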
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | TheRedSeth |
| Solution 2 | |
