Wednesday, November 24, 2021

The first-ever multilingual model to win WMT, beating out bilingual models

Highly recommended! Good news!

"... To build a universal translator, we believe the MT field should shift away from bilingual models and advance toward multilingual translation — where a single model translates many language pairs at once, including both low-resource (e.g., Icelandic to English) and high-resource (e.g., English to German). Multilingual translation is an appealing approach — it’s simpler, more scalable, and better for low-resource languages. ...
Now we’ve achieved an exciting breakthrough: For the first time, a single multilingual model has outperformed the best specially trained bilingual models across 10 out of 14 language pairs to win WMT, a prestigious MT competition. Our single multilingual model provided the best translations for both low- and high-resource languages, showing that the multilingual approach is indeed the future of MT. ..."
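The one-model-many-directions idea is easy to see in code. As a minimal sketch (not the WMT-winning system itself, which is an ensemble), Meta's earlier open multilingual checkpoint M2M-100 on Hugging Face can serve both a low-resource pair (Icelandic to English) and a high-resource pair (English to German) with a single set of weights; the checkpoint name, the `translate` helper, and the sample sentences below are illustrative choices, not part of the announcement:

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# One multilingual checkpoint covering ~100 languages
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

def translate(text: str, src_lang: str, tgt_lang: str) -> str:
    """Translate with the same model for any supported direction."""
    tokenizer.src_lang = src_lang
    encoded = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **encoded,
        # Force the decoder to start in the target language
        forced_bos_token_id=tokenizer.get_lang_id(tgt_lang),
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

# Same weights handle a low-resource and a high-resource direction
print(translate("Halló heimur", "is", "en"))   # Icelandic -> English
print(translate("Hello world", "en", "de"))    # English -> German
```

A bilingual setup would need a separately trained model per direction; here the per-direction cost is just a different forced target-language token.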

From the abstract:
"... Our final submission is an ensemble of dense and sparse Mixture-of-Expert multilingual translation models, followed by finetuning on in-domain news data and noisy channel reranking. Compared to previous year's winning submissions, our multilingual system improved the translation quality on all language directions, with an average improvement of 2.0 BLEU. In the WMT2021 task, our system ranks first in 10 directions based on automatic evaluation. ..."
