Multilingual translation with CTranslate2 and pre-trained FairSeq models

This script demonstrates using a pre-trained FairSeq multilingual model with CTranslate2.

Multilingual translation works by prepending a token representing the target language to the source text. For example:

__de__ This English sentence will be translated to German.
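Below is a minimal sketch of this with CTranslate2's Python API. The model directory, the SentencePiece file, and the exact `__de__` token convention are assumptions here; they depend on the specific model and on how it was converted (for example with CTranslate2's ct2-fairseq-converter).

```python
import ctranslate2
import sentencepiece as spm

# Placeholder paths: a FairSeq multilingual model converted to the
# CTranslate2 format and the SentencePiece model used to tokenize
# its training data.
translator = ctranslate2.Translator("multilingual_ct2/", device="cpu")
sp = spm.SentencePieceProcessor(model_file="sentencepiece.model")

def translate(text, target_lang_token):
    # Prepend the target language token to the tokenized source text.
    tokens = [target_lang_token] + sp.encode(text, out_type=str)
    result = translator.translate_batch([tokens])
    hypothesis = result[0].hypotheses[0]
    # Drop any language tokens from the output before detokenizing.
    return sp.decode([t for t in hypothesis if not t.startswith("__")])

print(translate("This English sentence will be translated to German.", "__de__"))
```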

This is especially useful for translating between languages other than English. For example, to translate from French to Spanish, Argos Translate currently translates from French to English and then "pivots" from English to Spanish, which loses some information in the intermediate English representation. With the multilingual strategy, a single model understands multiple languages and can translate directly between any of them, as contrasted in the sketch below.
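For contrast, here is a hedged sketch of the two approaches. It assumes French-to-English and English-to-Spanish Argos Translate packages are installed, so the library pivots through English internally, and it reuses the multilingual `translate()` helper sketched above for the direct path.

```python
import argostranslate.translate

text = "Cette phrase sera traduite en espagnol."

# Pivot translation: with only fr->en and en->es models installed,
# Argos Translate goes through an intermediate English representation.
spanish_pivot = argostranslate.translate.translate(text, "fr", "es")

# Direct multilingual translation: one model, and the target language
# token selects Spanish (reusing the translate() helper from above).
spanish_direct = translate(text, "__es__")
```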

It’s also possible to train Argos Translate models for language pairs that don’t include English; for example, you could train a direct Spanish-to-Portuguese model. However, it’s not practical to support direct translation between a large number of languages this way. Translating directly between 100 languages would require a model for each of the 100 × 99 = 9,900 ordered language pairs, on the order of 100² = 10,000 models.