Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.11889/6908
DC Field | Value | Language
dc.contributor.author | Al-Hajj, Moustafa | en_US
dc.contributor.author | Jarrar, Mustafa | en_US
dc.date.accessioned | 2021-12-14T06:24:21Z | -
dc.date.available | 2021-12-14T06:24:21Z | -
dc.date.issued | 2021-09 | -
dc.identifier.citation | Al-Hajj, M., Jarrar, M. (2021). ArabGlossBERT: Fine-Tuning BERT on Context-Gloss Pairs for WSD. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pp. 40-48. | en_US
dc.identifier.uri | http://hdl.handle.net/20.500.11889/6908 | -
dc.description.abstract | Using pre-trained transformer models such as BERT has proven to be effective in many NLP tasks. This paper presents our work to fine-tune BERT models for Arabic Word Sense Disambiguation (WSD). We treated the WSD task as a sentence-pair binary classification task. First, we constructed a dataset of labeled Arabic context-gloss pairs (∼167k pairs) extracted from the Arabic Ontology and the large lexicographic database available at Birzeit University; each pair was labeled as True or False, and the target word in each context was identified and annotated. Second, we used this dataset to fine-tune three pre-trained Arabic BERT models. Third, we experimented with different supervised signals used to emphasize target words in context. Our experiments achieved promising results (84% accuracy) even though a large set of senses was used. | en_US
dc.language.iso | en_US | en_US
dc.publisher | INCOMA Ltd. | en_US
dc.relation.ispartofseries | Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021) | -
dc.title | ArabGlossBERT: Fine-Tuning BERT on Context-Gloss Pairs for WSD | en_US
dc.type | Article | en_US
dcterms.identifier | https://aclanthology.org/2021.ranlp-1.5/ | en_US
newfileds.department | Engineering and Technology | en_US
newfileds.item-access-type | open_access | en_US
newfileds.thesis-prog | none | en_US
newfileds.general-subject | none | en_US
dc.identifier.doi | 10.26615/978-954-452-072-4_005 | -
item.languageiso639-1 | other | -
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
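
The abstract above describes treating WSD as sentence-pair binary classification over context-gloss pairs. The following is a minimal, hypothetical sketch of that setup using the Hugging Face transformers library; it is not the authors' code, and the checkpoint ID, placeholder texts, and single-step training loop are illustrative assumptions only.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Arabic BERT checkpoint; the paper fine-tunes three pre-trained Arabic BERT models.
model_name = "aubmindlab/bert-base-arabertv2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# One labeled context-gloss pair: does this gloss describe the sense of the
# target word as used in this context? (placeholder texts)
context = "..."            # sentence containing the target word
gloss = "..."              # candidate sense definition from the lexicon
label = torch.tensor([1])  # 1 = True (gloss matches the target word's sense), 0 = False

# BERT-style sentence-pair input: [CLS] context [SEP] gloss [SEP]
enc = tokenizer(context, gloss, truncation=True, return_tensors="pt")

# One fine-tuning step; a real run would iterate over batches and epochs.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
optimizer.zero_grad()
loss = model(**enc, labels=label).loss
loss.backward()
optimizer.step()
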
Appears in Collections: Fulltext Publications
Files in This Item:
File | Description | Size | Format
2021.ranlp-main.5.pdf | Full Text | 608.94 kB | Adobe PDF
Page view(s): 216 (checked on Apr 14, 2024)
Download(s): 64 (checked on Apr 14, 2024)
