Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/2160
Title: Distilling Knowledge in Federated Learning
Other Titles: Huy Q. Le, J. H. Shin, Minh N. H. Nguyen and C. S. Hong*
Authors: Le, Huy Q.
Shin, Jong Hoon
Nguyen, Huu Nhat Minh
Hong, Choong Seon
Keywords: Performance evaluation
Training
Costs
Computational modeling
Collaborative work
Prediction algorithms
Classification algorithms
Issue Date: 2021
Publisher: IEEE
Citation: https://doi.org/10.23919/APNOMS52696.2021.9562670
Abstract: Nowadays, federated learning has emerged as a prominent collaborative learning approach among machine learning techniques. This framework enables a communication-efficient and privacy-preserving solution in which a group of users interacts with a server to collaboratively train a powerful global model without exchanging the users' raw data. However, federated learning may face a significant challenge: the high communication cost of exchanging large model parameters. Moreover, training such a large model on devices is an obstacle given the battery limitations of mobile devices. To address this hindrance, we propose federated learning with bi-level distillation, namely FedBD. The key idea of this proposal is to exchange soft targets instead of transferring model parameters between the server and clients. The exchanged knowledge is constructed from the prediction outcomes on a shared reference dataset. By interchanging the knowledge of the learning models, our algorithm reduces both communication and computation costs. The proposed mechanism also allows different model architectures between the server and the learning agents. Experiments show that our proposed method achieves comparable or even slightly higher accuracy than the FedAvg algorithm on an image classification task, while using fewer communication resources and less power.
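The exchange described in the abstract — clients sending soft targets computed on a shared reference dataset, and the server aggregating them for distillation — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random "logits", the simple averaging step, and the KL-based distillation loss are assumptions standing in for each client's actual model outputs and FedBD's bi-level procedure.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable)
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
num_clients, ref_size, num_classes = 3, 5, 4

# Each client runs its own (possibly different) model architecture on the
# shared reference dataset and sends only soft targets, never parameters.
# Random logits stand in for real model outputs here.
client_logits = [rng.normal(size=(ref_size, num_classes)) for _ in range(num_clients)]
client_soft = [softmax(z, T=2.0) for z in client_logits]

# Server aggregates the received soft targets (here: plain averaging)
# and broadcasts the result back for local distillation.
global_soft = np.mean(client_soft, axis=0)

def kl_divergence(p, q, eps=1e-12):
    # Mean per-sample KL(p || q) over the reference set
    return float(np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)))

# A client would minimize a distillation loss of this form, pulling its own
# predictions toward the aggregated soft targets.
loss = kl_divergence(global_soft, client_soft[0])
```

Because only `ref_size × num_classes` probabilities cross the network per round, the payload is independent of model size, which is the source of the communication savings the abstract claims.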
Description: 2021 22nd Asia-Pacific Network Operations and Management Symposium (APNOMS)
URI: http://elib.vku.udn.vn/handle/123456789/2160
ISBN: 978-1-6654-3174-3
ISSN: 2576-8565
Appears in Collections: NĂM 2021



