How to scan a GitHub repository for all imports and generate a requirements.txt file covering ALL the dependencies in that repo?

When cloning a GitHub repository into a fresh virtual environment, the scripts in that repository often will not run because the environment is missing libraries that the repo's scripts import and use.

Ideally, if the repo ships a requirements.txt file, installing all the dependencies is trivial via pip install -r requirements.txt. But many repositories on GitHub lack a requirements.txt file, so there is no straightforward way to install everything the repo depends on.

So, for a given GitHub repository, what is the most efficient way to go through each .py file/module and get a comprehensive list of all the dependencies imported and used across the entire repository?

Basically, how to generate a requirements.txt file solely from the .py files within a given repo?

I'm unable to find such a tool, and I think it would be tremendously helpful for users who want to git clone a repo and immediately install all its dependencies. The alternative is the decidedly annoying cycle of running a script, watching it error out because some library isn't installed, installing that library, re-running the script, hitting an error for yet another missing library, installing that one too, and so on.
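For reference, here is a minimal sketch of the scanning step I have in mind, using the standard-library `ast` module to collect top-level imported module names from every .py file under a directory. It deliberately ignores the hard parts: it does not map import names to PyPI distribution names (e.g. `cv2` vs. `opencv-python`), filter out standard-library modules, or distinguish the repo's own local packages from third-party ones.

```python
import ast
import os


def collect_imports(repo_path):
    """Walk every .py file under repo_path and return the set of
    top-level module names that appear in import statements."""
    modules = set()
    for dirpath, _dirnames, filenames in os.walk(repo_path):
        for name in filenames:
            if not name.endswith(".py"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as fh:
                try:
                    tree = ast.parse(fh.read(), filename=path)
                except SyntaxError:
                    continue  # skip files that don't parse
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    for alias in node.names:
                        # "import a.b.c" -> keep only "a"
                        modules.add(alias.name.split(".")[0])
                elif isinstance(node, ast.ImportFrom):
                    # level > 0 means a relative import of a local module
                    if node.module and node.level == 0:
                        modules.add(node.module.split(".")[0])
    return modules
```

Something like `"\n".join(sorted(collect_imports(".")))` would then give a first draft of a requirements.txt, but as noted, the name-mapping and stdlib-filtering steps are what I'd hope an existing tool handles.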



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow