Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/3985
Full metadata record
DC Field | Value | Language
dc.contributor.author | Nguyen, Si Thin | -
dc.contributor.author | Van, Hung Trong | -
dc.contributor.author | Vo, Ngoc Dat | -
dc.contributor.author | Ngo, Le Quan | -
dc.date.accessioned | 2024-07-29T07:57:24Z | -
dc.date.available | 2024-07-29T07:57:24Z | -
dc.date.issued | 2024-05 | -
dc.identifier.isbn | 978-3-031-55174-1 | -
dc.identifier.uri | https://link.springer.com/chapter/10.1007/978-3-031-55174-1_5 | -
dc.identifier.uri | https://elib.vku.udn.vn/handle/123456789/3985 | -
dc.description | Software Engineering and Management: Theory and Application; pp. 55-68 | vi_VN
dc.description.abstract | Stochastic gradient descent (SGD) and alternating least squares (ALS) are two popular algorithms for matrix factorization, and recent research has focused on parallelizing them for ever-growing data. On large-scale datasets, however, SGD converges slowly and is sensitive to its parameters, while ALS does not scale because its time complexity is cubic in the target rank. A further issue is system operation: most parallel algorithms perform matrix factorization on a batch of training data, whereas system data arrives in real time. In this work, the authors propose the FSGD algorithm, which overcomes these large-scale drawbacks using coordinate descent, a novel optimization approach. The algorithm updates rank-one factors one by one, achieving faster and more stable convergence than SGD and ALS. In addition, FSGD is feasible to parallelize and operates on a stream of incoming data. Experimental results show that FSGD performs much better on the matrix factorization problem than existing state-of-the-art parallel models. | vi_VN
dc.language.iso | en | vi_VN
dc.publisher | Springer Nature | vi_VN
dc.subject | Parallel Recommender System | vi_VN
dc.subject | Stream Data | vi_VN
dc.subject | Stochastic Gradient Descent | vi_VN
dc.title | A Study on Parallel Recommender System with Stream Data Using Stochastic Gradient Descent | vi_VN
dc.type | Working Paper | vi_VN
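The abstract describes updating rank-one factors one by one via coordinate descent. A minimal NumPy sketch of that general idea follows (all names, the dense-matrix setup, and the closed-form updates are illustrative assumptions in the style of rank-one coordinate descent, not the paper's actual FSGD implementation):

```python
import numpy as np

def factorize_cd(R, mask, k=2, n_iters=50, lam=0.1):
    """Coordinate-descent matrix factorization: R ~ W @ H.

    Rank-one factors (columns of W, rows of H) are updated one at a
    time; each update solves an elementwise 1-D least-squares problem
    in closed form. `mask` marks observed entries of R, and `lam` is
    an L2 regularizer. Hypothetical sketch, not the paper's FSGD.
    """
    m, n = R.shape
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(m, k))
    H = rng.normal(scale=0.1, size=(k, n))
    for _ in range(n_iters):
        for t in range(k):
            # Residual with the t-th rank-one component removed.
            E = (R - W @ H + np.outer(W[:, t], H[t])) * mask
            # Closed-form update of W[:, t], then of H[t].
            W[:, t] = (E @ H[t]) / (mask @ (H[t] ** 2) + lam)
            H[t] = (W[:, t] @ E) / ((W[:, t] ** 2) @ mask + lam)
    return W, H
```

Because each rank-one update touches an independent slice of the factors, the inner loops over rows and columns are straightforward to parallelize, which is the property the abstract emphasizes.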
Appears in Collections: NĂM 2024

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.