Pass object instance through Lambda invoke function in Python

I have multiple Lambda functions. I am calling a second Lambda function from the first one using Lambda invoke in Python. I have an object instance of a class that holds dictionary data. I want to additionally pass this object instance to the other Lambda function along with the JSON object. How can I do it?

    import json
    import logging

    import boto3

    logger = logging.getLogger()
    client = boto3.client('lambda')

    # objReferenceData_dict contains all of the dictionary data
    objReferenceData = ReferenceData()
    objReferenceData_dict = objReferenceData.__dict__

    # responsejson comes from earlier processing (not shown)

    ## First lambda
    inputForInvoker = responsejson
    logger.info(inputForInvoker)
    response = client.invoke(
        FunctionName='arn:aws:firstfun',
        InvocationType='RequestResponse',
        Payload=json.dumps(inputForInvoker)
    )
    responsejson = json.load(response['Payload'])

    ## Second lambda
    inputForInvoker = responsejson
    response = client.invoke(
        FunctionName='arn:aws:secondfun',
        InvocationType='RequestResponse',
        Payload=json.dumps(inputForInvoker)
    )
    responsejson = json.load(response['Payload'])

I want to pass the objReferenceData_dict along with the responsejson. I tried adding objReferenceData_dict to the responsejson, but the data is too large; the Lambda invocation payload has a limit of 6 MB.



Solution 1:[1]

You seem to understand the API for invoking another Lambda synchronously, so your only issue is the 6 MB limit. This is a hard limit according to the documentation.

Therefore, you must choose another method for data transfer. There are, of course, many options. Two that come to mind are writing the data as an object to S3 with put_object, or sending the message to an SQS queue, since you can now send payloads of up to 2 GB with Amazon SQS.

For the former, you would invoke the second Lambda with the key of the S3 object. For the latter, you would trigger the second Lambda via the queue, and then handle the response with a different workflow.
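A minimal sketch of the S3 approach might look like the following. It assumes your ReferenceData class and the secondfun ARN from your question; the bucket name, object key scheme, and handler names are hypothetical placeholders.

    import json
    import boto3

    s3 = boto3.client('s3')
    lambda_client = boto3.client('lambda')

    BUCKET = 'my-reference-data-bucket'  # hypothetical bucket name

    # --- In the first Lambda ---
    def first_handler(event, context):
        objReferenceData_dict = ReferenceData().__dict__

        # Write the large dictionary to S3 instead of putting it in the payload
        key = 'reference-data/{}.json'.format(context.aws_request_id)
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(objReferenceData_dict))

        # Invoke the second Lambda with only a small pointer (bucket/key)
        response = lambda_client.invoke(
            FunctionName='arn:aws:secondfun',
            InvocationType='RequestResponse',
            Payload=json.dumps({'responsejson': event, 'reference_data_key': key}),
        )
        return json.load(response['Payload'])

    # --- In the second Lambda ---
    def second_handler(event, context):
        # Fetch the large dictionary back out of S3 using the key from the payload
        obj = s3.get_object(Bucket=BUCKET, Key=event['reference_data_key'])
        objReferenceData_dict = json.loads(obj['Body'].read())
        responsejson = event['responsejson']
        # ... continue processing with responsejson and objReferenceData_dict ...
        return {'status': 'ok'}

The SQS route is similar in spirit: the large body is offloaded (the extended client library stores it in S3 under the hood), and the second Lambda is triggered by the queue rather than invoked directly.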

In any case, if you need to invoke with more than 6 MB, you cannot pass the payload directly.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: theherk