PyTorch hard dependency blocks use of GPU when running main.py?

Hello,

LibreTranslate depends on PyTorch, and I'm trying to get LibreTranslate working with an AMD GPU via ROCm/OpenCL.

The intent is to use

ARGOS_DEVICE_TYPE=auto ./libretranslate --update-models --suggestions --host 0.0.0.0 --port 5000

To that end I followed the Start Locally | PyTorch page with Stable / Linux / Pip / Python / ROCm 5.6 selected, which gives

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm5.6

This in turn threw an error, because torch 2.0.1 was uninstalled and torch 2.1.2 was installed, which conflicts with the torch==2.0.1 pin hard-coded in pyproject.toml:

ERROR: pip’s dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
libretranslate 1.5.2 requires torch==2.0.1, but you have torch 2.1.2+rocm5.6 which is incompatible.

I've tried downloading the source for v1.5.2 and modifying pyproject.toml to read "torch ==2.1.2+rocm5.6", but the GPU still does not appear to be used when translating a file with

python3 main.py --host 0.0.0.0 --suggestions --update-models
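
For reference, the edit was along these lines (an illustrative excerpt only; the other entries in LibreTranslate's pyproject.toml were left as shipped):

[project]
dependencies = [
    # ...other pinned dependencies unchanged...
    "torch ==2.1.2+rocm5.6",
]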

Please advise.

The translation is done with CTranslate2, which doesn't use PyTorch, so you need to configure CTranslate2 to use the GPU. You can use the ARGOS_DEVICE_TYPE environment variable to set the device type for CTranslate2.
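
If you want to verify that CTranslate2 itself (rather than the torch wheel) can see a GPU, a minimal check, assuming the ctranslate2 Python package that LibreTranslate already installs, is:

import ctranslate2

# Number of GPUs CTranslate2 can use for translation; 0 means translation stays on CPU
# regardless of which torch build is installed.
print(ctranslate2.get_cuda_device_count())

Note that this counts CUDA devices; whether a ROCm GPU is visible depends on how your CTranslate2 build was compiled, so treat the ROCm path as unverified. If a device is reported, running with ARGOS_DEVICE_TYPE set to cuda (or auto) should route translation to it, for example:

ARGOS_DEVICE_TYPE=cuda ./libretranslate --host 0.0.0.0 --port 5000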