
NN at CheckThat! 2023: Subjectivity in News Articles Classification with Transformer Based Models


dc.contributor.author Dey, Krishno
dc.contributor.author Tarannum, Prerona
dc.contributor.author Hasan, Md. Arid
dc.contributor.author Noori, Sheak Rashed Haider
dc.date.accessioned 2024-07-04T04:49:26Z
dc.date.available 2024-07-04T04:49:26Z
dc.date.issued 2023-08-31
dc.identifier.uri http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/12887
dc.description.abstract The CheckThat! Lab is a shared evaluation lab designed to address the problem of disinformation. We participated in CheckThat! Lab Task 2, which focuses on the classification of subjectivity in news articles. This shared task included datasets in six languages, as well as a multilingual dataset created by combining all six. We followed standard preprocessing steps for the Arabic, Dutch, English, German, Italian, Turkish, and multilingual text data. For our official submission to CLEF Task 2, we employed a transformer-based pretrained model, specifically XLM-RoBERTa large. We achieved the 1st, 1st, 2nd, 5th, 2nd, 2nd, and 3rd positions on the leaderboard for the multilingual, Arabic, Dutch, English, German, Italian, and Turkish text data, respectively. Furthermore, we also applied BERT and multilingual BERT (BERT-m) to assess the subjectivity of the text data. Our study revealed that XLM-RoBERTa large outperformed BERT and BERT-m in all performance measures on the datasets provided in the shared task. en_US
dc.language.iso en_US en_US
dc.publisher CEUR Workshop Proceedings en_US
dc.subject Classification en_US
dc.subject Transformer en_US
dc.subject Articles en_US
dc.subject News en_US
dc.title NN at CheckThat! 2023: Subjectivity in News Articles Classification with Transformer Based Models en_US
dc.type Article en_US
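The abstract describes fine-tuning XLM-RoBERTa large for binary subjectivity classification. As an illustration only, the sketch below shows the general shape of such a classifier with the Hugging Face `transformers` library; the paper's actual training setup, hyperparameters, and label names are not given in this record, so the tiny randomly initialised configuration (chosen so the example runs without downloading the multi-gigabyte pretrained checkpoint) and the SUBJ/OBJ label pair are assumptions.

```python
# Hypothetical sketch of an XLM-RoBERTa-style subjectivity classifier
# (SUBJ vs. OBJ). The dimensions below are deliberately tiny and the
# weights random, so this runs without downloading "xlm-roberta-large";
# a real experiment would load the pretrained checkpoint instead.
import torch
from transformers import XLMRobertaConfig, XLMRobertaForSequenceClassification

config = XLMRobertaConfig(
    vocab_size=250002,       # XLM-R SentencePiece vocabulary size
    hidden_size=64,          # demo size; the large model uses 1024
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=2,            # assumed labels: subjective vs. objective
)
model = XLMRobertaForSequenceClassification(config)
model.eval()

# Dummy token ids standing in for one tokenised news sentence.
input_ids = torch.randint(0, config.vocab_size, (1, 16))
with torch.no_grad():
    logits = model(input_ids).logits  # shape: (batch, num_labels)
print(tuple(logits.shape))
```

In practice one would replace the random configuration with `XLMRobertaForSequenceClassification.from_pretrained("xlm-roberta-large", num_labels=2)` and fine-tune on the per-language training splits.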

