AWS Lambda layers exceeding memory - gensim, pandas, nltk, scipy
I am running into this issue while creating my Lambda function:
Layers consume more than the available size of 262144000 bytes
These are the requirements for my code:
gensim==4.1.2
nltk==3.6.7
numpy==1.22.1
pandas==1.4.0
scipy==1.7.3
Has anyone deployed a Lambda function using these libraries before?
The unzipped layers total more than the 250 MB limit (262144000 bytes). Is there any way to include all these libraries?
Thank you
Solution 1:[1]
You can now package the Lambda function as a container image instead of using layers. Container images can be up to 10 GB in size, which comfortably fits these libraries.
You can see more details in the documentation, but basically you build a Docker image as you normally would, publish it to a registry such as Amazon ECR, and then create the Lambda function from that image.
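As a minimal sketch of this approach, a Dockerfile based on the official AWS Lambda Python base image could look like the following. The Python version tag, the `app.py` filename, and the `handler` function name are assumptions; adjust them to match your project.

```dockerfile
# Official AWS Lambda base image for Python (tag assumed; pick your runtime version)
FROM public.ecr.aws/lambda/python:3.9

# Install the heavy dependencies (gensim, pandas, nltk, scipy, numpy)
# directly into the image instead of a layer
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the function code into the Lambda task root
# (app.py and its `handler` function are hypothetical names)
COPY app.py ${LAMBDA_TASK_ROOT}

# Tell the Lambda runtime which handler to invoke
CMD ["app.handler"]
```

You would then build the image with `docker build`, push it to an ECR repository, and point the Lambda function at that image when creating it. Because the 10 GB limit applies to the whole image, the 250 MB layer restriction no longer applies.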
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Caldazar |
