Explorations on Multi-lingual Neural Machine Translation

Deep (recurrent) neural networks have been shown to successfully learn complex mappings between input and output sequences of arbitrary length, within the effective framework of encoder-decoder networks. We investigate extensions of these sequence-to-sequence models that handle multiple languages at the same time, within a single model. This reduces to the problem of multi-lingual neural machine translation (MLNMT), as we explore the applicability and the benefits of MLNMT on (1) large-scale machine translation tasks,
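To make the encoder-decoder framework mentioned above concrete, the following is a minimal sketch in PyTorch: a recurrent encoder compresses a source sentence into a fixed-size hidden state, and a recurrent decoder maps that state to target-language token logits. All names, layer sizes, and the per-language-encoder / shared-decoder arrangement are illustrative assumptions for exposition, not the architecture actually studied in this work.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids.
        # Return the final hidden state as a summary of the source sequence.
        _, hidden = self.rnn(self.embed(src))
        return hidden  # (1, batch, hidden_dim)

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tgt, hidden):
        # tgt: (batch, tgt_len) shifted target token ids;
        # hidden: encoder summary used to initialise the decoder state.
        output, _ = self.rnn(self.embed(tgt), hidden)
        return self.out(output)  # (batch, tgt_len, vocab_size) logits

# Hypothetical multi-lingual arrangement: one encoder per source language,
# all feeding a single shared decoder within the same model.
encoders = {lang: Encoder(vocab_size=1000) for lang in ("en", "fr", "de")}
decoder = Decoder(vocab_size=1000)

src = torch.randint(0, 1000, (2, 7))  # toy batch of source sentences
tgt = torch.randint(0, 1000, (2, 5))  # toy batch of shifted target sentences
logits = decoder(tgt, encoders["en"](src))
print(logits.shape)  # torch.Size([2, 5, 1000])

Sharing a single decoder across all source encoders is one way such a model can handle multiple language pairs at once; the shared parameters are what allow information to transfer between translation directions.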