'Speed up S3 files copy between buckets-Using python

All, I am trying to copy large number of files from one bucket to other, following is my code but looks like copy is not happening! Not sure does bot3.transfer has copy method.

what I am missing here

            import os
            import boto3
            import botocore
            import boto3.s3.transfer as s3transfer
            from boto3.session import Session
            
            
            clients3 = boto3.client('s3',
                                    aws_access_key_id='',
                                    aws_secret_access_key='')
            
            bucketname = 'source_buckets'
            PREFIX = "folder1/folder2/202204"
            destbucket = 'destination_bucket'
            
            output = clients3.list_objects(Bucket=bucketname, Prefix=PREFIX)
            
            files_src = [folder['Key'] for folder in output['Contents']]
            file_dest = [destbucket + '/' + f.split('/')[2] for f in files_src]
            
            
            botocore_config = botocore.config.Config(max_pool_connections=20)
            s3client = boto3.client('s3', config=botocore_config)
            transfer_config = s3transfer.TransferConfig(
                use_threads=True,
                max_concurrency=20,
            )
            s3t = s3transfer.create_transfer_manager(s3client, transfer_config)
            
            
            for src, dest in zip(files_src, file_dest):
                copy_source = {'Bucket': bucketname,
                               'key': src
                               }
                s3t.copy(copy_source=copy_source,
                         bucket=destbucket,
                         key=dest.split('/')[1]
                         )


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source