Multilingual Training and Transfer (2/15/2022)
Lecture (by Patrick Fernandes):
- Multilingual Training Methods
- Cross-lingual Transfer Methods
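One core idea behind the multilingual training methods covered here (see Johnson et al. 2016 in the references) is that a single NMT model can serve many language pairs if the desired target language is signaled by a special token prepended to the source sentence. A minimal sketch of this preprocessing step, with illustrative tag and function names not taken from any specific codebase:

```python
# Sketch of the target-language-token trick (Johnson et al. 2016 style):
# one model is trained on a mixture of language pairs, and the desired
# output language is indicated by a tag such as "<2de>" on the source side.
# Tag format and function names here are assumptions for illustration.

def add_target_tag(source_sentence: str, target_lang: str) -> str:
    """Prepend a target-language tag so one model can translate many directions."""
    return f"<2{target_lang}> {source_sentence}"

def build_multilingual_corpus(parallel_data):
    """parallel_data: iterable of (src_sentence, tgt_sentence, tgt_lang) triples.
    Returns tagged (source, target) pairs ready for standard NMT training."""
    return [(add_target_tag(src, lang), tgt) for src, tgt, lang in parallel_data]
```

Because the architecture and training procedure are otherwise unchanged, zero-shot translation can emerge: at inference time the model can be prompted with a tag for a direction it never saw paired training data for.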
Language in 10: Manglish
Slides: Multilingual Training and Cross-lingual Transfer Slides
Discussion: Read the following paper
- Reference: Towards the Next 1000 Languages in Multilingual Machine Translation: Exploring the Synergy Between Supervised and Self-Supervised Learning (Siddhant et al. 2022)
References:
- Reference: Transfer Learning for Low-Resource Neural Machine Translation (Zoph et al. 2016)
- Reference: Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation (Johnson et al. 2016)
- Reference: Rapid Adaptation of Neural Machine Translation to New Languages (Neubig et al. 2018)
- Reference: Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation (Siddhant et al. 2020)
- Reference: Multilingual Translation from Denoising Pre-Training (Tang et al. 2021)
- Reference: The Missing Ingredient in Zero-Shot Neural Machine Translation (Arivazhagan et al. 2019)
- Reference: How multilingual is multilingual BERT? (Pires et al. 2019)
- Reference: Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges (Arivazhagan et al. 2019)
- Reference: Balancing Training for Multilingual Neural Machine Translation (Wang et al. 2020)
- Reference: Scaling Laws for Neural Machine Translation (Ghorbani et al. 2021)