opened 09:38PM - 08 Jun 22 UTC
# Summary
I'm experimenting with Argos Translate, and while running some tests I noticed that running in cuda mode used a huge amount of system memory. I boiled it down to just the example code snippet given in the readme and ran some more tests. Essentially, my observation is that cuda mode uses an order of magnitude more system memory than cpu mode. Why is this, and is it possible to reduce the system RAM usage?
# System config
* machine AWS g4dn.xlarge
* ubuntu 18.04 LTS
* python 3.9
# Steps to reproduce
1. install python3.9
2. install argostranslate
3. make file "example.py"
4. paste in the example code below (with a `sleep` added so the process stays alive long enough to measure)
```python
import time

import argostranslate.package
import argostranslate.translate

from_code = "en"
to_code = "es"

# Download and install Argos Translate package
available_packages = argostranslate.package.get_available_packages()
available_package = list(
    filter(
        lambda x: x.from_code == from_code and x.to_code == to_code,
        available_packages,
    )
)[0]
download_path = available_package.download()
argostranslate.package.install_from_path(download_path)

# Translate
installed_languages = argostranslate.translate.get_installed_languages()
from_lang = list(filter(lambda x: x.code == from_code, installed_languages))[0]
to_lang = list(filter(lambda x: x.code == to_code, installed_languages))[0]
translation = from_lang.get_translation(to_lang)
translated_text = translation.translate("Hello World!")
print(translated_text)

# Keep the process alive so its memory usage can be inspected
time.sleep(50)
```
5. run `ARGOS_DEVICE_TYPE=cuda python example.py`
6. in separate terminal run `ps aux | grep example.py`
7. observe RAM usage
8. run `ARGOS_DEVICE_TYPE=cpu python example.py`
9. in separate terminal run `ps aux | grep example.py`
10. observe RAM usage
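As an alternative to eyeballing `ps aux`, the script itself can print its own peak resident set size right after the translate call. A minimal sketch using only the standard library (`peak_rss_mb` is a hypothetical helper, not part of argostranslate):

```python
import resource
import sys


def peak_rss_mb() -> float:
    """Return this process's peak resident set size in MB.

    ru_maxrss is reported in kilobytes on Linux and in bytes on macOS.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        rss //= 1024  # bytes -> kB
    return rss / 1024  # kB -> MB


# Call this after translation.translate(...) in example.py
print(f"peak RSS: {peak_rss_mb():.1f} MB")
```

Adding this line to `example.py` would make the cpu-vs-cuda comparison repeatable without a second terminal.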
# Observations
Combining `ps aux` with a before/after diff of `free -m`, I see cuda mode take up ~2700 MB of system RAM while cpu mode takes ~200 MB.
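For what it's worth, sampling the RSS of a single PID directly is a bit less noisy than grepping `ps aux`. A small sketch (`TARGET_PID` is a placeholder for the PID of `example.py`; it defaults to the current shell just for demonstration):

```shell
# Report the resident set size (RSS) of one process by PID.
pid=${TARGET_PID:-$$}
rss_kb=$(ps -o rss= -p "$pid" | tr -d ' ')
echo "PID ${pid} RSS: ${rss_kb} kB (~$((rss_kb / 1024)) MB)"
```

Running this once under `ARGOS_DEVICE_TYPE=cuda` and once under `cpu` gives directly comparable numbers.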
## Edits
1. Via debugging, I've narrowed the increased system RAM usage down to this line: https://github.com/argosopentech/argos-translate/blob/master/argostranslate/translate.py#L381