Implementation of Large Language Model for Automatic Answer Script Evaluation

dc.contributor.author Mynul Ahasan Mim, Mir
dc.contributor.author Islam, Sadikul
dc.date.accessioned 2025-08-10T09:48:42Z
dc.date.available 2025-08-10T09:48:42Z
dc.date.issued 2024-07-10
dc.identifier.uri http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/13937
dc.description.abstract This paper explores the implementation of large language models (LLMs) for automatic answer script evaluation, examining their potential to transform educational assessment. LLMs offer significant advantages, including increased efficiency, consistency, and scalability in grading processes, as well as cost savings for educational institutions. However, their deployment also presents challenges, such as high energy consumption, potential biases, privacy concerns, and the socio-economic impact on employment within the educational sector. This study addresses these issues by proposing a comprehensive sustainability plan that includes optimizing energy efficiency, utilizing renewable energy sources, enhancing data protection measures, and ensuring transparency and accountability in automated evaluations. Additionally, strategies for balancing automation with human oversight and upskilling educators are discussed. The findings highlight the need for ongoing research to refine LLM applications, ensuring their ethical, sustainable, and effective integration into educational systems. The implications for further study emphasize the importance of continuous improvement in energy use, bias mitigation, data privacy, and the broader impact on educational practices and employment. en_US
dc.publisher Daffodil International University en_US
dc.subject Machine Learning en_US
dc.subject LLM Implementation en_US
dc.subject Education Technology en_US
dc.subject Natural Language Processing (NLP) en_US
dc.title Implementation of Large Language Model for Automatic Answer Script Evaluation en_US
dc.type Other en_US

