Poetry and PyTorch

I've recently started using Poetry to manage dependencies. In one project, we use PyTorch. How do I add it to Poetry?

We work both on machines with no access to a CUDA GPU (for simple on-the-road inference/testing) and on workstations that do have CUDA GPUs. Is it possible to use Poetry to ensure every developer is using the same PyTorch version?

There seems to be no obvious way to choose which PyTorch build gets installed. I thought about adding the different installation variants as extra dependencies, but I couldn't find an option that is equivalent to:

pip3 install torch==1.3.1+cpu torchvision==0.4.2+cpu -f https://download.pytorch.org/whl/torch_stable.html

I would be fine with specifying the full URL of the different online wheels, like: https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl

But I would rather not put them in git directly... The closest option I've seen in Poetry is downloading them manually and then referencing them with a file = X setting.



Solution 1:[1]

Currently, Poetry doesn't have an equivalent of pip's -f option (there's an open issue and an open PR), so you can't use the pip instructions as they are. You can, however, install the .whl files directly:

poetry add https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl

or add the dependency directly to your .toml file:

[tool.poetry.dependencies]
torch = { url = "https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl" }
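The pip command in the question also pulls in torchvision, which can be pinned the same way. The snippet below is only a sketch: the exact torchvision wheel filename is an assumption and should be checked against the PyTorch wheel index before use.

[tool.poetry.dependencies]
# exact filenames are assumptions; pick the wheels matching your Python version and OS from the index
torch = { url = "https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl" }
torchvision = { url = "https://download.pytorch.org/whl/torch_stable.html/cpu/torchvision-0.4.2%2Bcpu-cp36-cp36m-win_amd64.whl" }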

Solution 2:[2]

After spending a couple of hours on this issue, I found a "solution" by combining Poetry and pip just for PyTorch. This way you don't need to specify wheel URLs directly, and the setup remains cross-platform.

I'm using Poe the Poet, a nice task runner for Poetry that lets you run arbitrary commands.

[tool.poetry.dev-dependencies]
poethepoet = "^0.10.0"

[tool.poe.tasks]
force-cuda11 = "python -m pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html"

You can run:

poetry install

and then:

poe force-cuda11  # relies on pip and uses the PyTorch wheels repo
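For the GPU-less machines mentioned in the question, the same pattern works with a second task. This is a sketch; the exact +cpu version tags are assumptions and must match wheels that actually exist on the index:

[tool.poe.tasks]
force-cuda11 = "python -m pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html"
# hypothetical CPU counterpart; verify the version tags against the wheel index
force-cpu = "python -m pip install torch==1.8.0+cpu torchvision==0.9.0+cpu -f https://download.pytorch.org/whl/torch_stable.html"

Developers on laptops without a CUDA GPU would then run poe force-cpu instead of poe force-cuda11.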

Solution 3:[3]

An updated solution from this issue in the Poetry GitHub repository:

poetry add torch --platform linux --python "^3.7"
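For reference, this roughly corresponds to a restricted dependency in pyproject.toml using Poetry's platform and python keys; the version constraint below is an assumption, not something the original answer specifies:

[tool.poetry.dependencies]
# hypothetical constraint; adjust the version to the release you actually want
torch = { version = "^1.10", platform = "linux", python = "^3.7" }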

Solution 4:[4]

As of late 2021, using markers and multiple constraints should work.

$ poetry --version
Poetry version 1.1.11
# pyproject.toml
[tool.poetry.dependencies]
python = "~3.9"
torch = [
  {url = "https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-linux_x86_64.whl", markers = "sys_platform == 'linux'"},
  {url = "https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-win_amd64.whl", markers = "sys_platform == 'win32'", }
]
numpy = "^1.21.4"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
$ poetry install
The currently activated Python version 3.8.12 is not supported by the project (~3.9).
Trying to find and use a compatible version. 
Using python3.9 (3.9.9)
Creating virtualenv machine-learning in /home/redqueen/machine_learning/.venv
Updating dependencies
Resolving dependencies... (36.0s)

Writing lock file

Package operations: 3 installs, 0 updates, 0 removals

  • Installing typing-extensions (4.0.1)
  • Installing numpy (1.21.4)
  • Installing torch (1.10.0+cpu https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-linux_x86_64.whl)

NOTE: NumPy has to be listed explicitly; otherwise importing torch warns that NumPy is missing, as shown below.

Without numpy:

$ python
Python 3.9.9 (main, Nov 23 2021, 00:34:08) 
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
/home/redqueen/machine_learning/.venv/lib/python3.9/site-packages/torch/package/_directory_reader.py:17: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at  ../torch/csrc/utils/tensor_numpy.cpp:68.)
  _dtype_to_storage = {data_type(0).dtype: data_type for data_type in _storages}
>>> quit()

With numpy:

$ python
Python 3.9.9 (main, Nov 23 2021, 00:34:08) 
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.cuda.is_available()
False
>>> quit()
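If torchvision is also needed, the same multiple-constraints pattern should extend to it. The snippet below is only a sketch: the wheel filenames and versions are assumptions and must be verified against https://download.pytorch.org/whl/cpu/ (the torchvision 0.11.x series is the one paired with torch 1.10.0).

[tool.poetry.dependencies]
# hypothetical entries; check that these exact wheels exist for your Python version
torchvision = [
  {url = "https://download.pytorch.org/whl/cpu/torchvision-0.11.1%2Bcpu-cp39-cp39-linux_x86_64.whl", markers = "sys_platform == 'linux'"},
  {url = "https://download.pytorch.org/whl/cpu/torchvision-0.11.1%2Bcpu-cp39-cp39-win_amd64.whl", markers = "sys_platform == 'win32'"}
]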

Reference:

https://python-poetry.org/docs/dependency-specification/#python-restricted-dependencies

Disclaimer

I do not have a Windows (or Mac) machine to test this on.

Solution 5:[5]

There is a fork that I am maintaining called relaxed-poetry. It is a very young fork, but it supports what you want with the following configuration:


# pyproject.toml

[tool.poetry.dependencies]
python = "^3.8"
torch = { version = "=1.9.0+cu111", source = "pytorch" }

[[tool.poetry.source]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu111/"
secondary = true

Check it out if you like; it can be installed side by side with Poetry.
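For comparison, later versions of mainline Poetry added their own support for alternative package sources. The following is only a sketch of what a roughly equivalent setup might look like there; the priority keyword and exact version tag depend on your Poetry version and are assumptions:

# hypothetical mainline-Poetry equivalent; source syntax varies between Poetry releases
[[tool.poetry.source]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu111"
priority = "supplemental"

[tool.poetry.dependencies]
torch = { version = "1.9.0+cu111", source = "pytorch" }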

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: GilZ
Solution 2:
Solution 3: GilZ
Solution 4:
Solution 5: bennyl