Please use this identifier to cite or link to this item:
https://elib.vku.udn.vn/handle/123456789/6235

| Title: | Fine-Tuning Multilingual Khmer Neural Machine Translation |
| Authors: | Rina, Buoy; Sovisal, Chenda; Nguonly, Taing; Marry, Kong; Masakazu, Iwamura; Koichi, Kise |
| Keywords: | Khmer neural machine translation; No Language Left Behind (NLLB); Low-resource languages |
| Issue Date: | 2026 |
| Publisher: | Springer Nature |
| Abstract: | Google Translate remains a strong baseline machine translation (MT) tool for Khmer. However, as a proprietary tool, it does not allow flexible deployment, customization, or improvement. In contrast, "No Language Left Behind" (NLLB) is an open-source MT solution, but its translation performance for Khmer is significantly weaker than that of Google Translate. Given the low-resource nature of the Khmer language, this paper pragmatically presents a robust machine translation model for translating Khmer to and from English, Thai, Vietnamese, and Laotian. This model is developed by fine-tuning a base NLLB model on a high-quality multilingual parallel corpus. The fine-tuned model achieves performance competitive with Google Translate while significantly outperforming the base NLLB model and previous studies. |
| Description: | Lecture Notes in Networks and Systems (LNNS, volume 1581); The 14th Conference on Information Technology and Its Applications (CITA 2025); pp. 87-99 |
| Identifiers: | https://doi.org/10.1007/978-3-032-00972-2_7; https://elib.vku.udn.vn/handle/123456789/6235 |
| ISBN: | 978-3-032-00971-5 (print); 978-3-032-00972-2 (electronic) |
| Collections: | CITA 2025 (International) |
Use of materials in the Digital Library must comply with copyright law.