Given the limited availability of datasets and textual resources for low-resource languages such as isiZulu, there is a significant need to harness knowledge from pre-trained models to improve low-resource machine translation. Moreover, a lack of techniques for handling the complexities of morphologically rich languages has compounded the unequal development of translation models, leaving many widely spoken African languages behind. This study explores the potential benefits of transfer learning in an English-isiZulu translation framework. The results indicate the value of transfer learning from closely related languages for enhancing the performance of low-resource translation models, providing a key strategy for low-resource translation going forward. We gathered results from eight different language corpora, including one multilingual corpus, and found that the isiXhosa-isiZulu model outperformed all others, achieving a BLEU score of 8.56 on the test set, 2.73 points higher than the model pre-trained on the multilingual corpus. We also derived a new coefficient, Nasir's Geographical Distance Coefficient (NGDC), which provides a simple criterion for selecting languages for pre-training. The NGDC likewise indicated that isiXhosa should be selected as the pre-training language.
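For context, BLEU (the metric behind the 8.56 score reported above) measures n-gram overlap between a hypothesis translation and a reference, scaled by a brevity penalty. The sketch below is a minimal single-reference, sentence-level illustration in Python; the evaluation in the study would use a standard corpus-level implementation (e.g. sacrebleu), and the smoothing constant here is an assumption for the sake of a runnable example.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hypothesis, reference, max_n=4):
    """Simplified single-reference BLEU: uniform weights over 1..max_n-gram
    precisions, combined as a geometric mean and scaled by a brevity penalty.
    Returns a score in [0, 100]."""
    hyp, ref = hypothesis.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ng, ref_ng = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum((hyp_ng & ref_ng).values())  # clipped n-gram matches
        total = max(sum(hyp_ng.values()), 1)
        # Tiny floor avoids log(0) when an n-gram order has no matches
        # (an illustrative smoothing choice, not the paper's).
        precisions.append(overlap / total if overlap else 1e-9)
    # Brevity penalty: punish hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A perfect match scores 100, while partial overlap decays quickly as higher-order n-grams fail to match, which is why low-resource systems often sit in the single digits, as with the 8.56 reported here.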