AWS Lambda: upload code from Amazon S3 location

I have configured an AWS pipeline to deploy my latest Node.js code as a zip file to S3. Instead of going to the AWS Lambda console and choosing the "upload from Amazon S3 location" option each time, is there a way to make my Lambda function pick up the latest code once it is written to S3, so that the function code is updated automatically whenever the zip file changes?



Solution 1:[1]

Using Lambda triggers, you can create a trigger that fires whenever a new object is written to the S3 location. This tutorial explains how to set this up.
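The trigger can also be wired up programmatically via the S3 API. A minimal sketch using boto3's put_bucket_notification_configuration; the bucket name, Lambda ARN, and the builds/ prefix are placeholders for illustration (the deployer Lambda must already grant S3 invoke permission):

```python
def build_notification_config(lambda_arn, prefix="builds/", suffix=".zip"):
    """Build the S3 bucket-notification payload that invokes a Lambda
    whenever a matching object is created."""
    return {
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": lambda_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {
                        "FilterRules": [
                            {"Name": "prefix", "Value": prefix},
                            {"Name": "suffix", "Value": suffix},
                        ]
                    }
                },
            }
        ]
    }


def apply_notification(bucket, lambda_arn):
    # boto3 is imported lazily so the builder above stays usable offline
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket,  # e.g. "my-artifact-bucket" (placeholder)
        NotificationConfiguration=build_notification_config(lambda_arn),
    )
```

The prefix/suffix filter keeps the trigger from firing on unrelated objects written to the same bucket.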

Then, using the AWS SDK, you can update your function's code programmatically.

Take a look at the aptly named Update Function Code documentation.

The Python SDK documentation is here. Below is the request-syntax skeleton from the documentation; note that in practice you supply only one code source per call (ZipFile, S3Bucket/S3Key, or ImageUri), not all of them:

response = client.update_function_code(
    FunctionName='string',
    ZipFile=b'bytes',
    S3Bucket='string',
    S3Key='string',
    S3ObjectVersion='string',
    ImageUri='string',
    Publish=True|False,
    DryRun=True|False,
    RevisionId='string',
    Architectures=[
        'x86_64'|'arm64',
    ]
)
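Putting the two pieces together, the triggered function just has to read the bucket and key out of the S3 event and pass them to update_function_code. A minimal sketch; the TARGET_FUNCTION environment variable naming the Lambda to update is a hypothetical convention, not part of the AWS API:

```python
import os
from urllib.parse import unquote_plus


def extract_s3_location(event):
    """Pull the bucket name and object key of the newly written zip
    out of an S3 put-event record. Keys arrive URL-encoded in the
    event, so they are decoded here."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = unquote_plus(record["object"]["key"])
    return bucket, key


def handler(event, context):
    bucket, key = extract_s3_location(event)
    # boto3 is available in the Lambda runtime; imported lazily here
    # so extract_s3_location stays testable offline
    import boto3

    client = boto3.client("lambda")
    return client.update_function_code(
        FunctionName=os.environ["TARGET_FUNCTION"],
        S3Bucket=bucket,
        S3Key=key,
        Publish=True,  # publish a new version on each deploy
    )
```

The function's execution role needs lambda:UpdateFunctionCode on the target function and s3:GetObject on the artifact bucket.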

After posting the above, I found that there appears to be an automated solution within CodePipeline itself:

https://docs.aws.amazon.com/lambda/latest/dg/services-codepipeline.html

I imagine this would be a cleaner solution to your problem.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Reegz