Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/6205
Full metadata record
dc.contributor.author: Do, Thanh Huong
dc.contributor.author: Vuong, Viet Huy
dc.contributor.author: Nguyen, Xuan Nam
dc.contributor.author: Nguyen, D. M. Quang
dc.contributor.author: Pham, Vu Minh Tu
dc.date.accessioned: 2026-01-20T01:53:47Z
dc.date.available: 2026-01-20T01:53:47Z
dc.date.issued: 2026-01
dc.identifier.isbn: 978-3-032-00971-5 (print)
dc.identifier.isbn: 978-3-032-00972-2 (electronic)
dc.identifier.uri: https://doi.org/10.1007/978-3-032-00972-2_30
dc.identifier.uri: https://elib.vku.udn.vn/handle/123456789/6205
dc.description: Lecture Notes in Networks and Systems (LNNS, volume 1581); The 14th Conference on Information Technology and Its Applications (CITA 2025); pp. 401-413
dc.description.abstract: Distance learning offers flexibility and accessibility, but it lacks the effective interaction and personalization that are crucial for student engagement and learning performance. This paper proposes an Emotion-Adaptive Multimodal Framework that combines Virtual Reality (VR) and Natural Language Processing (NLP) to create an immersive, emotionally adaptive distance learning environment. The system introduces an interactive, speech-based Virtual Tutor that recognizes and responds to the learner's emotional state through speech and facial expression analysis. Field experiments conducted at the Vietnam Posts and Telecommunications Institute of Technology, with 250 volunteers divided into experimental and control groups, demonstrated the effectiveness of the proposed approach and allowed a clear evaluation of the differences between the groups. The experimental group achieved a 167% increase in interaction frequency and a 22.37% higher task completion rate; in addition, the retry rate increased by 89.35%, demonstrating the superiority of the pilot system. Overall, the results show that integrating VR and NLP enhances student engagement, motivation, and learning performance by providing real-time support and personalized feedback based on users' emotional cues. This confirms the potential of multimodal, emotion-aware AI systems to transform distance education by offering a scalable, practical solution for improving interactivity and learner-centered experiences in virtual environments.
dc.language.iso: en
dc.publisher: Springer Nature
dc.subject: Distance learning
dc.subject: VR
dc.subject: NLP
dc.subject: AI
dc.subject: Education technology
dc.title: Personalized Distance Learning Framework in VR Using Emotion-Adaptive AI and Multimodal NLP
dc.type: Working Paper
Appears in Collections:CITA 2025 (International)

