Language Graph Distillation for Low-Resource Machine Translation
Description
- Research project at Microsoft Research Asia.
- Distillation-based approach to boost the accuracy of multilingual machine translation. Individual bilingual models are first trained and serve as teachers; the multilingual model is then trained via knowledge distillation to simultaneously fit the training data and match the output distributions of the individual teacher models.
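
The training objective described above can be sketched as a weighted sum of a cross-entropy term on the ground-truth labels and a KL term toward the teacher's softened output distribution. This is a minimal NumPy illustration of that generic distillation loss, not the exact formulation used in the project; the function name, `alpha` weight, and `temperature` parameter are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Numerically stable softmax with optional temperature scaling.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=2.0):
    """Hypothetical combined objective:
    (1 - alpha) * CE(student, labels) + alpha * T^2 * KL(teacher || student).
    """
    # Cross-entropy against the ground-truth token labels.
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(len(labels)), labels]).mean()
    # KL divergence from the teacher's softened distribution to the student's.
    p_teacher_t = softmax(teacher_logits, temperature)
    p_student_t = softmax(student_logits, temperature)
    kl = (p_teacher_t * (np.log(p_teacher_t) - np.log(p_student_t))).sum(axis=-1).mean()
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures.
    return (1 - alpha) * ce + alpha * (temperature ** 2) * kl
```

In a multi-teacher setting, the KL term would be computed against the teacher responsible for the current language pair; the student pays no distillation penalty when its distribution already matches that teacher.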