Multilingual translation works by prepending a token representing the target language to the source text. For example:
__de__ This English sentence will be translated to German.
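As a rough sketch of that step (the `__de__` token name follows the example above; the whitespace tokenizer is a simplification, since real models use SentencePiece subword tokens):

```python
def build_source_tokens(text, target_lang):
    """Prepend a target-language token to the tokenized source sentence.

    Simplified whitespace tokenization for illustration only; real
    models tokenize with SentencePiece subword units.
    """
    lang_token = f"__{target_lang}__"
    return [lang_token] + text.split()

tokens = build_source_tokens(
    "This English sentence will be translated to German.", "de"
)
print(tokens[0])  # __de__
```

The decoder then generates German output because the very first source token tells the model which target language was requested.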
This also works well for translating between languages other than English. For example, to translate from French to Spanish, Argos Translate currently translates from French to English, then "pivots" and translates from English to Spanish, which loses some information in the intermediate English representation. With the multilingual strategy, the model instead understands multiple languages and can translate directly between all of them.
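A minimal sketch of the pivot strategy described above (the toy phrase tables are invented placeholders for illustration; the real library translates with neural models, not lookup tables):

```python
# Invented toy "models" standing in for the fr->en and en->es neural models.
FR_EN = {"bonjour le monde": "hello world"}
EN_ES = {"hello world": "hola mundo"}

def translate(text, table):
    return table[text]

def pivot_translate(text):
    """French -> Spanish by pivoting through an intermediate English step."""
    english = translate(text, FR_EN)   # fr -> en
    return translate(english, EN_ES)   # en -> es (information can be lost here)

print(pivot_translate("bonjour le monde"))  # hola mundo
```

Anything the intermediate English string fails to capture (gender, formality, ambiguity) is unavailable to the second step, which is the information loss the pivot approach suffers from.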
It’s also possible to train Argos Translate models for language pairs that don’t include English; for example, you could train a direct Spanish to Portuguese model. However, it’s not practical to support direct translations between a large number of languages without multilingual translation: covering every ordered pair of 100 languages would require 100 × 99 = 9,900 models, roughly 100².
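The combinatorics can be checked directly (counting directed pairs, since a separate model is needed for each translation direction and a language never translates to itself):

```python
def models_needed(num_languages):
    """Single-direction models required to cover every ordered language pair."""
    return num_languages * (num_languages - 1)

print(models_needed(100))  # 9900, roughly the 100^2 figure above
```

A single multilingual model replaces all of these, since adding a language only requires training data and a new target-language token rather than a new model per pair.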
Very interesting! Is there a plan for Argos to focus more on multilingual model training, or will it continue to train classic single-pair models in the short/medium term?
Short to medium term I’m planning to continue the current strategy of training models for single language pairs, but multilingual translation is probably the best strategy long term.