Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/6205

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Do, Thanh Huong | - |
| dc.contributor.author | Vuong, Viet Huy | - |
| dc.contributor.author | Nguyen, Xuan Nam | - |
| dc.contributor.author | Nguyen, D. M. Quang | - |
| dc.contributor.author | Pham, Vu Minh Tu | - |
| dc.date.accessioned | 2026-01-20T01:53:47Z | - |
| dc.date.available | 2026-01-20T01:53:47Z | - |
| dc.date.issued | 2026-01 | - |
| dc.identifier.isbn | 978-3-032-00971-5 (p) | - |
| dc.identifier.isbn | 978-3-032-00972-2 (e) | - |
| dc.identifier.uri | https://doi.org/10.1007/978-3-032-00972-2_30 | - |
| dc.identifier.uri | https://elib.vku.udn.vn/handle/123456789/6205 | - |
| dc.description | Lecture Notes in Networks and Systems (LNNS, volume 1581); The 14th Conference on Information Technology and Its Applications (CITA 2025); pp. 401-413 | vi_VN |
| dc.description.abstract | Distance learning currently offers flexibility and accessibility, but it lacks the effective interaction and personalization that are crucial for student engagement and learning performance. This paper proposes an Emotion-Adaptive Multimodal Framework that combines Virtual Reality (VR) and Natural Language Processing (NLP) to create an immersive and emotionally adaptive distance learning environment. The system introduces an interactive speech-based Virtual Tutor that recognizes and responds to the learner's emotional state through speech and facial expression analysis. Field experiments were conducted at the Vietnam Posts and Telecommunications Institute of Technology with the support of 250 volunteers, divided into experimental and control groups, making it possible to assess and evaluate the differences between the groups and demonstrate the effectiveness of the proposed approach. The experimental group achieved a 167% increase in interaction frequency and a 22.37% higher task completion rate. Additionally, the retry rate increased by 89.35%, demonstrating the pilot system's superiority. Overall, the results show that integrating VR and NLP enhances student engagement, motivation, and learning performance by providing real-time support and personalized feedback based on users' emotional cues. This confirms the potential of multimodal, emotion-aware AI systems to transform distance education while providing a scalable and practical solution for improving interactivity and learner-centered experiences in virtual environments. | vi_VN |
| dc.language.iso | en | vi_VN |
| dc.publisher | Springer Nature | vi_VN |
| dc.subject | Distance learning | vi_VN |
| dc.subject | VR | vi_VN |
| dc.subject | NLP | vi_VN |
| dc.subject | AI | vi_VN |
| dc.subject | Education technology | vi_VN |
| dc.title | Personalized Distance Learning Framework in VR Using Emotion-Adaptive AI and Multimodal NLP | vi_VN |
| dc.type | Working Paper | vi_VN |
| Appears in Collections: | CITA 2025 (International) | |
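
The abstract describes a Virtual Tutor that adapts its behaviour to an emotional state estimated from speech and facial expressions. The paper's actual models and fusion rule are not reproduced in this record; the following is only a minimal illustrative sketch of such an emotion-adaptive loop, in which the emotion label set, the late-fusion rule, the confidence threshold, and the response strategies are all assumptions made for illustration.

```python
# Illustrative sketch only (not the authors' implementation): a rule-based loop
# mapping a fused learner-emotion estimate to an adaptive tutor response.
from dataclasses import dataclass

@dataclass
class EmotionEstimate:
    label: str         # e.g. "frustrated", "bored", "engaged" (assumed label set)
    confidence: float  # confidence of the chosen modality

def fuse(speech_label: str, speech_conf: float,
         face_label: str, face_conf: float) -> EmotionEstimate:
    """Late fusion (assumption): keep the modality with the higher confidence."""
    if speech_conf >= face_conf:
        return EmotionEstimate(speech_label, speech_conf)
    return EmotionEstimate(face_label, face_conf)

# Hypothetical adaptation strategies for the Virtual Tutor.
RESPONSES = {
    "frustrated": "Offer a hint and break the task into smaller steps.",
    "bored":      "Raise difficulty or switch to an interactive VR activity.",
    "engaged":    "Continue the current task and give brief positive feedback.",
}

def tutor_response(estimate: EmotionEstimate, threshold: float = 0.6) -> str:
    """Pick an adaptation strategy; fall back to a neutral check-in when unsure."""
    if estimate.confidence < threshold:
        return "Ask a short check-in question to clarify the learner's state."
    return RESPONSES.get(estimate.label, "Continue with the default lesson flow.")

if __name__ == "__main__":
    est = fuse("frustrated", 0.72, "engaged", 0.55)
    print(tutor_response(est))  # -> hint and smaller steps
```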