Abstract:
Chilli (Capsicum annuum) is an economically important crop cultivated extensively in countries such as Bangladesh, India, and Indonesia. However, the crop is highly susceptible to fungal, bacterial, and viral diseases that significantly reduce yield and farmer income. Traditional disease detection methods are manual, slow, and error-prone, particularly in rural areas where expert agronomists are scarce. To address this issue, this research proposes a real-time chilli leaf disease detection system based on object detection models from the YOLO (You Only Look Once) family. Three models, YOLOv8s, YOLOv9s, and YOLOv10s, were trained and evaluated on a custom-annotated chilli leaf dataset. The YOLOv10s model achieved the highest overall performance, with a mAP50 of 96.9%, a mAP50–95 of 91.2%, and the fastest inference speed at 7.3 milliseconds per image. Compared with YOLOv8s and YOLOv9s, YOLOv10s demonstrated superior accuracy at a lower computational cost (24.5 GFLOPs), making it the most suitable model for mobile deployment. The best-performing model was converted to TensorFlow Lite and integrated into a Flutter-based Android application, enabling offline disease detection directly on smartphones. This approach gives farmers immediate, on-field diagnostic capability without requiring network connectivity or high-end hardware. It also promotes sustainable agriculture by supporting targeted pesticide use, thereby reducing environmental impact. Through careful model selection, optimization, and mobile app development, the study bridges the gap between research and real-world application, providing a scalable, efficient, and farmer-friendly solution for chilli disease management. Future work could focus on multi-crop support, explainable AI integration, and multilingual user interfaces to broaden accessibility and impact.