Please use this identifier to cite or link to this item:
https://elib.vku.udn.vn/handle/123456789/6186

| Title: | Fine-Tuning Mini Language Models for Legal Multiple-Choice Question Answering: A Comparative Study of Phi-3.5, Qwen 2.5 and Llama 3.2 |
| Authors: | Nguyen, Huu Khanh; Nguyen, Van Viet; Nguyen, Kim Son; Luong, Thi Minh Hue; Nguyen, T. Vinh; Vu, Duc Quang; Nguyen, Cong Huu |
| Keywords: | Mini language models; CaseHold; Phi-3.5; Qwen 2.5; Llama 3.2; Legal Question-Answering |
| Issue Date: | 2026 |
| Publisher: | Springer Nature |
| Abstract: | In this study, we explore the application of mini language models, specifically Phi-3.5 Mini, Qwen 2.5 3B and Llama 3.2 3B, to legal multiple-choice question answering. We fine-tuned these models on the CaseHOLD dataset to adapt them to the structural and semantic nuances of legal language and reasoning. The results show that fine-tuning significantly improves the performance of these models, with Phi-3.5 Mini achieving a Micro F1 score of 76.93%, exceeding previous bests in the field of miniaturised models. Qwen 2.5 3B and Llama 3.2 3B also achieved competitive scores of 74.27% and 75.40%, respectively, reinforcing their viability as resource-efficient alternatives to larger models. Mini language models offer performance competitive with specialised models such as Legal-BERT and CaseLaw-BERT, while requiring fewer computational resources and retaining strong natural language understanding. The results of this study illustrate the potential of mini language models as a way to broaden access to state-of-the-art legal natural language processing tools, and suggest directions for future work exploring their versatility across further legal tasks and datasets. |
| Description: | Lecture Notes in Networks and Systems (LNNS, volume 1581); The 14th Conference on Information Technology and Its Applications (CITA 2025); pp. 671-682 |
| URI: | https://doi.org/10.1007/978-3-032-00972-2_49; https://elib.vku.udn.vn/handle/123456789/6186 |
| ISBN: | 978-3-032-00971-5 (print); 978-3-032-00972-2 (electronic) |
| Appears in Collections: | CITA 2025 (International) |
Use of materials in the Digital Library must comply with the Copyright Law.
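
As an illustration of the evaluation setup described in the abstract above, the sketch below formats CaseHOLD items as five-way multiple-choice prompts and scores predictions with Micro F1, the metric reported in the record. This is a minimal sketch only: the dataset loading path (the LexGLUE `case_hold` configuration on Hugging Face), the prompt wording, and the placeholder `predict` function are assumptions for illustration, not the authors' published code, prompts, or fine-tuning hyperparameters.

```python
# Minimal sketch (not the authors' code): format CaseHOLD examples as
# 5-way multiple-choice prompts and compute Micro F1 over predictions.
from datasets import load_dataset
from sklearn.metrics import f1_score

# CaseHOLD as distributed in the LexGLUE benchmark: each example has a
# citing "context", five candidate "endings" (holdings), and "label",
# the index of the correct holding.
dataset = load_dataset("lex_glue", "case_hold", split="validation")

def to_prompt(example):
    """Render one CaseHOLD example as a five-way multiple-choice prompt."""
    choices = "\n".join(
        f"({chr(ord('A') + i)}) {ending}"
        for i, ending in enumerate(example["endings"])
    )
    return (
        "Citing context:\n"
        f"{example['context']}\n\n"
        "Which holding does the context support?\n"
        f"{choices}\nAnswer:"
    )

def predict(prompt: str) -> int:
    # Placeholder baseline: always picks option 0. Replace with the
    # fine-tuned Phi-3.5 Mini / Qwen 2.5 3B / Llama 3.2 3B model's
    # predicted choice index (0-4) for the prompt.
    return 0

labels = [ex["label"] for ex in dataset]
preds = [predict(to_prompt(ex)) for ex in dataset]

# Micro-averaged F1 over the five classes, matching the metric reported
# in the abstract (e.g. 76.93% for fine-tuned Phi-3.5 Mini).
print(f"Micro F1: {f1_score(labels, preds, average='micro'):.4f}")
```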