Heroku hosting uses 5 times more RAM than Google Cloud Run

My Django application has many lists, sets, and dictionaries that I load from pickle files into RAM for fast access at any moment. I am getting Error R14 (Memory quota exceeded) on Heroku. I have done lots of debugging and memory-usage analysis of the Python code. I know how to optimize code, and I have optimized everything that can be optimized.
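For reference, this is roughly how I measure what a loaded structure costs in RAM. It is a minimal sketch using only the standard library; the dictionary built here is just a stand-in for one of my pickle loads:

```python
import resource

def peak_rss_mb():
    """Peak resident set size of this process in MB.

    Linux (which both Heroku dynos and Cloud Run containers run)
    reports ru_maxrss in kilobytes; macOS reports bytes.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

before = peak_rss_mb()
# Stand-in for pickle.load() of a large lookup structure.
data = {i: str(i) * 10 for i in range(200_000)}
after = peak_rss_mb()
print(f"structure added roughly {after - before:.1f} MB of RSS")
```

Running the same measurement on both platforms is how I compare apples to apples, since the dashboard metrics of Heroku and Cloud Run are computed differently.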

Now this is the weird thing: the same application, literally the same deployment, takes 5 times less RAM on Google Cloud Run than on Heroku. I deploy both of them from the same Git repo. 1 GB of RAM on Google is enough; let's say it takes around 900 MB of RAM there, while on Heroku the same thing takes 4.5 GB of RAM, for which I need to "rent" 8 GB of RAM, which makes the costs insanely high.

Why is this happening?

So it's not about optimizing my own code. How can the same app on Heroku occupy around 5 times more RAM than on Google?

Here is how I got that measurement of 5 times. I run an e-commerce website with products, which have attributes, values, and keywords. I store all these values and keywords in RAM for fast lookup by the search engine. If I deploy the same products on Heroku as on Google Cloud Run, it overflows the RAM. To try to fit into Heroku's free tier of 1 GB, I tested and concluded that deploying only 20% of the products fits into memory; anything more than that crashes, while Google Cloud Run handles 100% of these products in RAM without a problem. (The Heroku free tier is officially 512 MB, but they actually allow up to 1 GB.)
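To give an idea of the shape of the data (the names and sample records below are illustrative, not my actual code): each product carries keywords, and I build reverse-lookup dictionaries over them that the search engine queries in memory:

```python
from collections import defaultdict

# Illustrative sample data; the real structures are loaded from pickle files.
products = {
    1: {"name": "Red shoe", "keywords": ["red", "shoe", "footwear"]},
    2: {"name": "Blue shoe", "keywords": ["blue", "shoe", "footwear"]},
    3: {"name": "Red hat", "keywords": ["red", "hat"]},
}

# Reverse index: keyword -> set of product IDs, kept entirely in RAM
# so searches never touch the database.
keyword_index = defaultdict(set)
for pid, product in products.items():
    for kw in product["keywords"]:
        keyword_index[kw].add(pid)

def search(*keywords):
    """Return the IDs of products matching all given keywords."""
    sets = [keyword_index[kw] for kw in keywords]
    return set.intersection(*sets) if sets else set()

print(search("red", "shoe"))  # -> {1}
```

Structures like this are exactly what blows past the quota on Heroku at 20% of the catalog, yet fits comfortably on Cloud Run at 100%.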



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
