Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/6170
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Hoang, Phuong My Dung
dc.contributor.author: Nguyen, Duc Hien
dc.contributor.author: Mai, Lam
dc.date.accessioned: 2026-01-19T07:53:24Z
dc.date.available: 2026-01-19T07:53:24Z
dc.date.issued: 2026-01
dc.identifier.isbn: 978-3-032-00971-5 (print)
dc.identifier.isbn: 978-3-032-00972-2 (electronic)
dc.identifier.uri: https://doi.org/10.1007/978-3-032-00972-2_65
dc.identifier.uri: https://elib.vku.udn.vn/handle/123456789/6170
dc.description: Lecture Notes in Networks and Systems (LNNS, volume 1581); The 14th Conference on Information Technology and Its Applications (CITA 2025); pp. 887-897 [vi_VN]
dc.description.abstract: In this paper, we propose a novel machine learning-based technique to enhance the efficiency and accuracy of data warehouse operations, specifically targeting revenue optimization by integrating various data sources and applying machine learning algorithms to predict customer behavior and demand patterns. Our approach leverages the Extract, Transform, and Load (ETL) process to systematically prepare and optimize data that allows for more informed and proactive revenue management in cinema chains. Furthermore, we evaluate the model’s efficiency by comparing the predicted revenue data with actual figures and calculating the percentage error. Simulation results demonstrate that applying the proposed solution to real-time cinema data achieves up to 90.21% accuracy in intelligent revenue forecasting. [vi_VN]
dc.language.iso: en [vi_VN]
dc.publisher: Springer Nature [vi_VN]
dc.subject: Revenue optimization [vi_VN]
dc.subject: Cinema chains [vi_VN]
dc.subject: Data warehousing [vi_VN]
dc.subject: Machine learning [vi_VN]
dc.subject: Predictive analytics [vi_VN]
dc.title: Enhanced Revenue Prediction Model Using a Machine Learning-Based Data Warehouse Approach [vi_VN]
dc.type: Working Paper [vi_VN]
Appears in Collections:CITA 2025 (International)
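The abstract above describes an ETL process that prepares cinema data in a warehouse, a machine learning model that forecasts revenue from it, and an evaluation that compares predicted with actual revenue via percentage error. The Python sketch below is illustrative only: the file names, the column schema (sale_date, price, showtime_id, cinema_id), the gradient-boosting regressor, and the MAPE-based accuracy figure are assumptions for demonstration, not the pipeline, data sources, or formula used in the paper.

# Minimal ETL + forecasting sketch. All names and model choices here are
# assumptions for illustration; they do not reproduce the paper's method.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split


def extract(ticket_csv: str, showtime_csv: str) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Extract: read raw ticket sales and showtime schedules from source files."""
    return pd.read_csv(ticket_csv, parse_dates=["sale_date"]), pd.read_csv(showtime_csv)


def transform(tickets: pd.DataFrame, showtimes: pd.DataFrame) -> pd.DataFrame:
    """Transform: join the sources, aggregate daily revenue, derive calendar features."""
    merged = tickets.merge(showtimes, on="showtime_id", how="left")
    daily = (
        merged.groupby(["cinema_id", "sale_date"])
        .agg(revenue=("price", "sum"))
        .reset_index()
    )
    daily["day_of_week"] = daily["sale_date"].dt.dayofweek
    daily["month"] = daily["sale_date"].dt.month
    daily["is_weekend"] = (daily["day_of_week"] >= 5).astype(int)
    return daily


def load(fact_table: pd.DataFrame, path: str = "dw_daily_revenue.parquet") -> None:
    """Load: persist the prepared fact table into warehouse storage."""
    fact_table.to_parquet(path, index=False)


def train_and_evaluate(fact_table: pd.DataFrame) -> float:
    """Fit a regressor on warehouse features and report a percentage-error-based accuracy."""
    features = ["cinema_id", "day_of_week", "month", "is_weekend"]
    X_train, X_test, y_train, y_test = train_test_split(
        fact_table[features], fact_table["revenue"], test_size=0.2, random_state=42
    )
    model = GradientBoostingRegressor(random_state=42).fit(X_train, y_train)
    mape = mean_absolute_percentage_error(y_test, model.predict(X_test))
    return (1.0 - mape) * 100.0  # accuracy reported as 100% minus the mean percentage error


if __name__ == "__main__":
    tickets, showtimes = extract("ticket_sales.csv", "showtimes.csv")
    fact_table = transform(tickets, showtimes)
    load(fact_table)
    print(f"Forecast accuracy: {train_and_evaluate(fact_table):.2f}%")

Reporting accuracy as 100% minus the mean absolute percentage error mirrors the abstract's evaluation of comparing predicted and actual revenue via percentage error, though the paper's exact metric and the 90.21% result are not reproduced by this sketch.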
