<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
<title>M.SC. in CSE</title>
<link href="http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16555" rel="alternate"/>
<subtitle/>
<id>http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16555</id>
<updated>2026-04-05T17:19:38Z</updated>
<dc:date>2026-04-05T17:19:38Z</dc:date>
<entry>
<title>Carbon Aware Protocol for fault tolerant network</title>
<link href="http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16550" rel="alternate"/>
<author>
<name>Biswas, Prosunjit</name>
</author>
<id>http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16550</id>
<updated>2026-04-05T04:01:53Z</updated>
<published>2025-12-22T00:00:00Z</published>
<summary type="text">Carbon Aware Protocol for fault tolerant network
Biswas, Prosunjit
The increasing energy consumption of information and communication technologies has raised significant environmental concerns, particularly regarding the carbon emissions associated with network infrastructure. This research proposes a novel Carbon Aware Routing Protocol for Delay Tolerant Networks (CARP-DTN) that routes data through paths with lower carbon intensity while maintaining acceptable performance levels. The protocol integrates real-time carbon intensity data from electricity grids with network performance metrics to make environmentally friendly routing decisions. By leveraging the inherent delay tolerance of certain types of networks, CARP-DTN achieves substantial carbon emission reductions without compromising network reliability. The proposed solution implements a modified A* algorithm that considers three parameters: carbon intensity, latency, and reliability. Experimental results demonstrate that CARP-DTN can reduce carbon emissions by 5-20% compared to traditional shortest-path routing algorithms while maintaining acceptable quality of service for delay-tolerant applications. This research contributes to green computing initiatives and provides a practical network routing framework for reducing the carbon footprint of digital infrastructure.
Masters Thesis
</summary>
<dc:date>2025-12-22T00:00:00Z</dc:date>
</entry>
<entry>
<title>Brain Tumor Detection Using Graph Neural Network</title>
<link href="http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16549" rel="alternate"/>
<author>
<name>Akter, Farjana</name>
</author>
<id>http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16549</id>
<updated>2026-04-05T04:05:18Z</updated>
<published>2025-11-24T00:00:00Z</published>
<summary type="text">Brain Tumor Detection Using Graph Neural Network
Akter, Farjana
Brain tumors constitute a significant global health challenge, and accurate, timely detection is essential for effective treatment and a better patient prognosis. Most conventional detection methods rely on medical image interpretation by radiologists, which is time- and effort-consuming, subjective, and prone to human error, especially in resource-constrained settings. This work presents a novel approach to brain tumour diagnosis using Graph Neural Networks (GNNs), based on the relational structure of information in brain MRI scans. The approach begins with thorough preprocessing of the medical images, comprising noise reduction, intensity normalization, and skull stripping. Graphs are then constructed to represent the tissue regions of interest. The GNN architecture is designed to capture both local characteristics and global structural relationships within the brain scans.
Masters Thesis
</summary>
<dc:date>2025-11-24T00:00:00Z</dc:date>
</entry>
<entry>
<title>Predicting Entrepreneurial Intention Among Computer Science Students Using Structural Equation Modeling and Machine Learning</title>
<link href="http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16548" rel="alternate"/>
<author>
<name>Sillah, Abu Bakarr</name>
</author>
<id>http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16548</id>
<updated>2026-04-05T04:04:30Z</updated>
<published>2025-10-12T00:00:00Z</published>
<summary type="text">Predicting Entrepreneurial Intention Among Computer Science Students Using Structural Equation Modeling and Machine Learning
Sillah, Abu Bakarr
Entrepreneurial intention (EI) among students is a critical driver of innovation, job creation, and economic growth in contemporary societies. This study examines the determinants of EI among computer science and engineering students in Bangladesh by integrating structural equation modeling (SEM) and machine learning (ML). Survey data were collected from 929 students. The reflective measurement model estimated in SmartPLS 4 demonstrated strong reliability and validity, while the structural model explained 57.2% of the variance in entrepreneurial intention. Complementary ML models, optimized through nested cross-validation, confirmed the robustness of the findings; XGBoost yielded the lowest error (RMSE ≈ 1.00; R² ≈ 0.58). The integration of SEM and ML advances both explanatory and predictive understanding of EI, suggesting that interventions should emphasize mastery-oriented training to enhance PBC, targeted knowledge development to strengthen EK, and orientation-building experiences to reinforce PA.
Masters Thesis
</summary>
<dc:date>2025-10-12T00:00:00Z</dc:date>
</entry>
<entry>
<title>Attention-Based Sequence-to-Sequence Neural Machine Translation from English to Bangla</title>
<link href="http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16547" rel="alternate"/>
<author>
<name>Borson, Prattoy Paul</name>
</author>
<id>http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/16547</id>
<updated>2026-04-05T04:05:49Z</updated>
<published>2025-10-09T00:00:00Z</published>
<summary type="text">Attention-Based Sequence-to-Sequence Neural Machine Translation from English to Bangla
Borson, Prattoy Paul
Neural Machine Translation (NMT) has emerged as the dominant approach to automatic translation, especially for low-resource languages such as Bengali. We developed an English-to-Bengali translation system utilizing the BanglaT5 Transformer model, which was pre-trained on Bengali data. The parallel English-Bengali dataset was cleaned of unwanted characters, to the best of our ability, and normalized to ensure uniformity and a sufficiently large training pool. The sentences in the dataset were tokenized to prepare the input sequences, attention masks, and associated target labels for supervised learning. A custom Dataset with PyTorch's DataLoader allowed the batch size to be tailored and facilitated efficient training on the GPU. The BanglaT5 model was fine-tuned using cross-entropy loss with the Adam optimizer, minimizing the loss function via backpropagation. Evaluation was performed using BLEU and chrF scores to assess the accuracy of the Bengali sentences generated by the model against provided reference translations. The results demonstrate that a pre-trained Transformer can be applied with great success to a low-resource NMT problem. As outlined in this paper, appropriate text normalization, tokenization, and fine-tuning are essential to eventually producing a superior-quality machine translation system. This strategy can be scaled to host other low-resource languages and serves as a practical guide for implementing NMT tasks on South-Asian language pairs.
Masters Thesis
</summary>
<dc:date>2025-10-09T00:00:00Z</dc:date>
</entry>
</feed>
