Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/4268
Full metadata record
dc.contributor.author: Vu, Minh An
dc.contributor.author: Nguyen, Duong Quy
dc.contributor.author: Nguyen, Khanh Huyen
dc.contributor.author: Nguyen, Tuan
dc.date.accessioned: 2024-12-04T02:24:08Z
dc.date.available: 2024-12-04T02:24:08Z
dc.date.issued: 2024-11
dc.identifier.isbn: 978-3-031-74126-5
dc.identifier.uri: https://elib.vku.udn.vn/handle/123456789/4268
dc.identifier.uri: https://doi.org/10.1007/978-3-031-74127-2_5
dc.description: Lecture Notes in Networks and Systems (LNNS, volume 882); The 13th Conference on Information Technology and Its Applications (CITA 2024); pp. 50-62.
dc.description.abstract: We present a practical approach to evaluating fitness performance that leverages 3D human pose estimation and dynamic time warping. User input videos are converted to 3D skeletons so that each execution of a movement can be compared against a reference database of the same movement performed by experts. Key joint angles are manually defined and matched frame by frame to those of the expert using dynamic time warping. Experimental results indicate that our skeletal pose comparison achieves accurate and intuitive results at a dependable computation speed, making it well suited to real-time movement analysis. This paper illustrates the human pose comparison and presents results of 3D skeleton motion analysis on actual user movements in sports, which we have made available along with the source code to the research community at Human Pose Feedback System on GitHub.
dc.language.iso: en
dc.publisher: Springer Nature
dc.subject: AI Fitness Trainer
dc.subject: 3D Human
dc.title: AI Fitness Trainer Using 3D Human Pose Estimation and Dynamic Time Warping
dc.type: Working Paper
Appears in Collections: CITA 2024 (International)
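The abstract describes aligning a user's joint-angle trajectory to an expert's frame by frame with dynamic time warping, so that differences in execution speed do not penalize the comparison. A minimal sketch of that alignment on a single joint-angle sequence (function and variable names are illustrative, not taken from the paper's released code):

```python
def dtw_distance(user_angles, expert_angles):
    """Dynamic time warping distance between two joint-angle
    sequences (in degrees), tolerating differences in tempo."""
    n, m = len(user_angles), len(expert_angles)
    INF = float("inf")
    # cost[i][j] = best alignment cost of the first i user frames
    # against the first j expert frames
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(user_angles[i - 1] - expert_angles[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # user lingers on a frame
                                 cost[i][j - 1],      # expert frame stretched
                                 cost[i - 1][j - 1])  # frames matched one-to-one
    return cost[n][m]

# Example: the same knee-bend performed more slowly by the user.
expert = [170, 150, 120, 90, 120, 150, 170]
user = [170, 165, 150, 130, 120, 95, 90, 115, 150, 168]
score = dtw_distance(user, expert)
```

A lower score indicates a closer match to the expert's form; the full system would compute such a distance per defined joint angle across the 3D skeleton.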

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.