Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/2736
Title: AMG-Mixer: A Multi-Axis Attention MLP-Mixer Architecture for Biomedical Image Segmentation
Authors: Le, Hoang Minh Quang
Le, Trung Kien
Pham, Van Truong
Tran, Thi Thao
Keywords: MLP-Mixer
image segmentation
Axial Attention
Multi-axis MLP
Issue Date: 2023
Publisher: Springer Nature
Abstract: Multi-Layer Perceptrons (MLPs) were previously used mainly for image classification; the MLP-Mixer architecture demonstrated that MLPs remain effective in other visual tasks. To obtain superior results, however, pre-trained weights from large datasets are required, and the cross-location (token-mixing) operation must be adapted to the task at hand. Motivated by this, we propose AMG-Mixer, an MLP-based architecture for image segmentation. In particular, recognizing the importance of positional information, we propose an AxialMBconv Token Mix built on Axial Attention. Additionally, to relax Axial Attention's receptive-field constraints, we propose the Multi-Scale Multi-Axis MLP Gated (MS-MAMG) block, which employs Multi-Axis MLP. Even without pre-training, AMG-Mixer outperforms state-of-the-art (SOTA) methods on benchmark datasets including GLaS, Data Science Bowl 2018, and ISIC 2018 skin lesion segmentation. The code is available at https://github.com/quanglets1fvr/amg_mixer
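The axial-attention token mixing the abstract refers to factorizes full 2-D self-attention into two cheaper 1-D passes: one along the height axis, then one along the width axis, which is how positional mixing stays tractable on feature maps. The following is a minimal numpy sketch of that idea only, not the authors' implementation; the function names and the identity query/key/value projections are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_1d(x):
    # x: (seq_len, dim). Single-head self-attention with identity
    # projections (a simplification; real models learn Q/K/V weights).
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ x

def axial_attention(x):
    # x: (H, W, C) feature map. Attend along height, then along width,
    # so cost scales with H*W*(H+W) rather than (H*W)**2.
    H, W, C = x.shape
    # height pass: each column is a sequence of length H
    x = np.stack([attention_1d(x[:, w, :]) for w in range(W)], axis=1)
    # width pass: each row is a sequence of length W
    x = np.stack([attention_1d(x[h, :, :]) for h in range(H)], axis=0)
    return x

feat = np.random.rand(8, 8, 16)
out = axial_attention(feat)
print(out.shape)  # (8, 8, 16)
```

The two 1-D passes keep axial attention's per-axis receptive field, which is exactly the limitation the paper's MS-MAMG block is said to address by mixing across multiple axes and scales.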
Description: Lecture Notes in Networks and Systems (LNNS, volume 734); CITA: Conference on Information Technology and its Applications; pp. 169-180.
URI: https://link.springer.com/chapter/10.1007/978-3-031-36886-8_14
http://elib.vku.udn.vn/handle/123456789/2736
ISBN: 978-3-031-36886-8
Appears in Collections: CITA 2023 (International)



