Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/2722
Full metadata record
DC Field | Value | Language
dc.contributor.author | Dang, Dai Tho | -
dc.contributor.author | Tran, Xuan Thang | -
dc.contributor.author | Huynh, Cong Phap | -
dc.contributor.author | Nguyen, Ngoc Thanh | -
dc.date.accessioned | 2023-09-26T01:31:34Z | -
dc.date.available | 2023-09-26T01:31:34Z | -
dc.date.issued | 2023-07 | -
dc.identifier.isbn | 978-3-031-36886-8 | -
dc.identifier.uri | https://link.springer.com/chapter/10.1007/978-3-031-36886-8_26 | -
dc.identifier.uri | http://elib.vku.udn.vn/handle/123456789/2722 | -
dc.description | Lecture Notes in Networks and Systems (LNNS, volume 734); CITA: Conference on Information Technology and its Applications; pp. 306-317. | vi_VN
dc.description.abstract | Nowadays, a vast volume of text data is generated daily by Vietnamese users on social media platforms. Alongside its enormous benefits, this situation creates many challenges; one of them is that a tremendous amount of this text contains obscene language, which negatively affects readers, especially young people. Detecting such text is an important problem. In this paper, we investigate it using Deep Learning (DL) models, namely Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), and Bidirectional Long Short-Term Memory (BiLSTM). In addition, we combine LSTM and CNN in both sequential (sequential LSTM-CNN) and parallel (parallel LSTM-CNN) forms, as well as a sequential BiLSTM-CNN, to solve this task. For the word embedding phase, we use Word2vec and PhoBERT. Experimental results show that the BiLSTM model with PhoBERT achieves the best results on the obscene language detection task, with 81.4% accuracy and an 81.5% F1-score. (A minimal illustrative model sketch follows this record.) | vi_VN
dc.language.iso | en | vi_VN
dc.publisher | Springer Nature | vi_VN
dc.subject | Obscene language | vi_VN
dc.subject | Deep Learning | vi_VN
dc.subject | Vietnamese Social Media | vi_VN
dc.title | Using Deep Learning for Obscene Language Detection in Vietnamese Social Media | vi_VN
dc.type | Working Paper | vi_VN
Appears in Collections: CITA 2023 (International)
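
As a rough illustration of the best-performing configuration named in the abstract (a BiLSTM over PhoBERT embeddings), the following is a minimal sketch using PyTorch and Hugging Face Transformers. The checkpoint name vinai/phobert-base, the hidden size, the pooling of the final forward/backward states, and the two-class head are assumptions for illustration, not the exact architecture or hyperparameters evaluated in the paper.

```python
# Minimal sketch: BiLSTM classifier over PhoBERT token embeddings.
# Illustrative only; checkpoint name, hidden size, and head design are
# assumptions, not the authors' exact implementation.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

PHOBERT = "vinai/phobert-base"  # assumed PhoBERT checkpoint on the Hugging Face Hub


class PhoBertBiLSTMClassifier(nn.Module):
    def __init__(self, hidden_size: int = 128, num_classes: int = 2):
        super().__init__()
        # PhoBERT provides contextual word embeddings for Vietnamese text.
        self.encoder = AutoModel.from_pretrained(PHOBERT)
        emb_dim = self.encoder.config.hidden_size  # 768 for phobert-base
        # A bidirectional LSTM reads the embedded sequence in both directions.
        self.bilstm = nn.LSTM(emb_dim, hidden_size, batch_first=True,
                              bidirectional=True)
        # Binary head: obscene vs. non-obscene.
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from PhoBERT.
        embeddings = self.encoder(input_ids=input_ids,
                                  attention_mask=attention_mask).last_hidden_state
        # Concatenate the final forward and backward hidden states as the
        # sentence representation.
        _, (h_n, _) = self.bilstm(embeddings)
        sentence_repr = torch.cat([h_n[0], h_n[1]], dim=-1)
        return self.classifier(sentence_repr)


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(PHOBERT)
    model = PhoBertBiLSTMClassifier()
    batch = tokenizer(["ví dụ câu tiếng Việt"], return_tensors="pt",
                      padding=True, truncation=True)
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([1, 2])
```

Note that PhoBERT is normally applied to word-segmented Vietnamese input (e.g., produced by a Vietnamese word segmenter), and the encoder may be frozen or fine-tuned; those choices are left open here.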
