Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/4272
Title: Pre-trained Self-Attention Framework: An Efficient Mechanism for Source Separation
Authors: Ha, Minh Tan
Fhadli, Muhammad
Nguyen, Kim Quoc
Vu, Quang Duc
Keywords: The suggested method is tested and assessed on a conventional dataset; The model is relearned and enhanced
Issue Date: Nov-2024
Publisher: Springer Nature
Abstract: In this work, a pre-trained self-attention framework is proposed for single-channel speech separation. First, all layers in the pre-trained self-attention framework are frozen. The model is then retrained in three stages using a learning-rate scheduling mechanism, and the layers of the framework are unlocked according to the schedule. In this way, the model is relearned, enhanced, and updated from its previous knowledge. This is an effective way to improve the performance of an advanced model while significantly reducing the time and cost of training. The method is beneficial when applying existing models to a similar task or when enhancing model performance. With this strategy, the pre-trained system outperforms the non-pre-trained system, since the later phases of the model's training reuse features extracted by the previously trained early phases. The suggested method is tested and assessed on a conventional dataset. Experimental results show that the methodology achieves higher performance than the baseline framework and outperforms current methods on the monaural speech separation task.
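The staged retraining described in the abstract (freeze all layers, then unlock groups of layers over three stages while scheduling the learning rate) can be illustrated with a minimal PyTorch-style sketch. The model class, the layer groupings, the stage boundaries, and the learning rates below are illustrative assumptions for demonstration only, not the exact configuration used in the paper.

# Minimal sketch of staged fine-tuning for a pre-trained self-attention
# separation model. Layer groups, stage count per group, and learning
# rates are illustrative assumptions, not the paper's configuration.
import torch
from torch import nn, optim


def freeze_all(model: nn.Module) -> None:
    # Initially freeze every layer of the pre-trained framework.
    for p in model.parameters():
        p.requires_grad = False


def unfreeze(layers: nn.Module) -> None:
    # Unlock a group of layers so they can be updated in the next stage.
    for p in layers.parameters():
        p.requires_grad = True


def run_stage(model, train_loader, lr, epochs, loss_fn):
    # Retrain only the currently unlocked parameters at this stage's
    # learning rate, decaying it with a simple scheduler.
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer = optim.Adam(params, lr=lr)
    scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.98)
    for _ in range(epochs):
        for mixture, sources in train_loader:
            optimizer.zero_grad()
            estimate = model(mixture)
            loss = loss_fn(estimate, sources)
            loss.backward()
            optimizer.step()
        scheduler.step()


# Hypothetical usage: three stages, each unlocking more of the network
# and starting from a lower learning rate. The model, loader, and loss
# names are placeholders, not components defined by the paper.
# model = PretrainedSelfAttentionSeparator.from_checkpoint("pretrained.pt")
# freeze_all(model)
# stages = [
#     (model.decoder,   1e-3),  # stage 1: adapt the output layers only
#     (model.separator, 5e-4),  # stage 2: also unlock the attention blocks
#     (model.encoder,   1e-4),  # stage 3: unlock the remaining layers
# ]
# for layers, lr in stages:
#     unfreeze(layers)
#     run_stage(model, train_loader, lr=lr, epochs=10, loss_fn=si_snr_loss)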
Description: Lecture Notes in Networks and Systems (LNNS, volume 882); The 13th Conference on Information Technology and Its Applications (CITA 2024); pp. 99-110.
URI: https://elib.vku.udn.vn/handle/123456789/4272
https://doi.org/10.1007/978-3-031-74127-2_9
ISBN: 978-3-031-74126-5
Appears in Collections:CITA 2024 (International)
