Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/3242
Title: Using Stochastic Gradient Descent On Parallel Recommender System with Stream Data
Authors: Nguyen, Si Thin
Van, Hung Trong
Vo, Ngoc Dat
Ngo, Le Quan
Keywords: Computational modeling
Stochastic processes
Training data
Data science
Real-time systems
Complexity theory
Parallel algorithms
Issue Date: 2022
Publisher: IEEE
Abstract: Stochastic gradient descent (SGD) and alternating least squares (ALS) are two popular algorithms for matrix factorization, and recent research has focused on how to parallelize them over continuously growing data. On large-scale datasets, however, SGD still suffers from slow convergence that depends on its parameters, while ALS does not scale because its complexity is cubic in the target rank. A further issue concerns how the system operates: most parallel algorithms perform matrix factorization on a batch of training data, while system data arrives in real time. In this work, the authors propose the FSGD algorithm, which overcomes these drawbacks on large-scale problems based on coordinate descent, a novel optimization approach. Accordingly, the algorithm updates rank-one factors one by one to achieve faster and more stable convergence than SGD and ALS. In addition, FSGD is feasible to parallelize and operates on a stream of incoming data. The experimental results show that FSGD performs much better in solving the matrix factorization problem than existing state-of-the-art parallel models.
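For illustration only: a minimal NumPy sketch of the rank-one coordinate-descent update style the abstract refers to (in the spirit of CCD++-like methods). This is not the paper's FSGD implementation; the function name, parameters, and inner-loop counts are hypothetical.

    import numpy as np

    def rank_one_coordinate_descent(R, mask, k=10, sweeps=5, reg=0.1, seed=0):
        # Factor R ~= W @ H.T on the observed entries (mask == 1),
        # refining one rank-one factor at a time rather than all k
        # factors at once as SGD or ALS would.
        rng = np.random.default_rng(seed)
        m, n = R.shape
        W = rng.normal(scale=0.1, size=(m, k))
        H = rng.normal(scale=0.1, size=(n, k))
        E = (R - W @ H.T) * mask              # residual on observed entries
        for _ in range(sweeps):
            for t in range(k):
                w, h = W[:, t], H[:, t]
                E += np.outer(w, h) * mask    # remove factor t from the model
                for _ in range(3):            # a few inner coordinate updates
                    w = E @ h / (mask @ (h * h) + reg)
                    h = E.T @ w / (mask.T @ (w * w) + reg)
                W[:, t], H[:, t] = w, h
                E -= np.outer(w, h) * mask    # fold the updated factor t back
        return W, H

The rank-one structure is what makes streaming and parallel variants attractive: each one-dimensional update touches only the residual E, so newly arriving ratings can be folded into E without refactorizing from scratch.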
Description: 2022 IEEE/ACIS 7th International Conference on Big Data, Cloud Computing, and Data Science (BCD); pp. 88-93
URI: https://ieeexplore.ieee.org/document/9900664
http://elib.vku.udn.vn/handle/123456789/3242
ISBN: 978-1-6654-6582-3
Collections: NĂM 2022
