An Efficient Deep Learning-Based Approach for Recognizing Agricultural Pests in the Wild

Show simple item record

dc.contributor.author Mahjabin, Mohammad Ratul
dc.contributor.author Rahman, Md Sabbir
dc.contributor.author Raf, Mohtasim Hadi
dc.date.accessioned 2024-01-18T05:35:56Z
dc.date.available 2024-01-18T05:35:56Z
dc.date.issued 2023-05-30
dc.identifier.citation [1] X. Wu, C. Zhan, Y.-K. Lai, M.-M. Cheng, and J. Yang, "IP102: A large-scale benchmark dataset for insect pest recognition," in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019, pp. 8787–8796.
[2] M. Tan and Q. Le, "EfficientNetV2: Smaller models and faster training," in International Conference on Machine Learning. PMLR, 2021, pp. 10096–10106.
[3] Y. Peng and Y. Wang, "CNN and transformer framework for insect pest classification," Ecological Informatics, vol. 72, p. 101846, 2022.
[4] H. T. Ung, H. Q. Ung, and B. T. Nguyen, "An efficient insect pest classification using multiple convolutional neural network based models," arXiv preprint arXiv:2107.12189, 2021.
[5] F. Wang, M. Jiang, C. Qian, S. Yang, C. Li, H. Zhang, X. Wang, and X. Tang, "Residual attention network for image classification," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 3156–3164.
[6] S. Woo, J. Park, J.-Y. Lee, and I. S. Kweon, "CBAM: Convolutional block attention module," in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 3–19.
[7] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: Unified, real-time object detection," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 779–788.
[8] M. T. Mason, Mechanics of Robotic Manipulation. MIT Press, 2001.
[9] M. Caron, H. Touvron, I. Misra, H. Jégou, J. Mairal, P. Bojanowski, and A. Joulin, "Emerging properties in self-supervised vision transformers," in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021, pp. 9650–9660.
[10] G. Huang, Z. Liu, L. Van Der Maaten, and K. Q. Weinberger, "Densely connected convolutional networks," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 4700–4708.
[11] G. Kandalkar, A. Deorankar, and P. Chatur, "Classification of agricultural pests using DWT and back propagation neural networks," International Journal of Computer Science and Information Technologies, vol. 5, no. 3, pp. 4034–4037, 2014.
[12] D. Xia, P. Chen, B. Wang, J. Zhang, and C. Xie, "Insect detection and classification based on an improved convolutional neural network," Sensors, vol. 18, no. 12, p. 4169, 2018.
[13] A. K. Reyes, J. C. Caicedo, and J. E. Camargo, "Fine-tuning deep convolutional networks for plant recognition," CLEF (Working Notes), vol. 1391, pp. 467–475, 2015.
[14] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 770–778.
[15] M. Dyrmann, H. Karstoft, and H. S. Midtiby, "Plant species classification using deep convolutional neural network," Biosystems Engineering, vol. 151, pp. 72–80, 2016.
[16] H. Zhang, G. He, J. Peng, Z. Kuang, and J. Fan, "Deep learning of path-based tree classifiers for large-scale plant species identification," in 2018 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR). IEEE, 2018, pp. 25–30.
[17] S. Ji, C. Zhang, A. Xu, Y. Shi, and Y. Duan, "3D convolutional neural networks for crop classification with multi-temporal remote sensing images," Remote Sensing, vol. 10, no. 1, p. 75, 2018.
[18] C.-W. Lin, Q. Ding, W.-H. Tu, J.-H. Huang, and J.-F. Liu, "Fourier dense network to conduct plant classification using UAV-based optical images," IEEE Access, vol. 7, pp. 17736–17749, 2019.
[19] S. W. Chen, S. S. Shivakumar, S. Dcunha, J. Das, E. Okon, C. Qu, C. J. Taylor, and V. Kumar, "Counting apples and oranges with deep learning: A data-driven approach," IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 781–788, 2017.
[20] S. P. Mohanty, D. P. Hughes, and M. Salathé, "Using deep learning for image-based plant disease detection," Frontiers in Plant Science, vol. 7, p. 1419, 2016.
[21] A. Kamilaris and F. X. Prenafeta-Boldú, "Deep learning in agriculture: A survey," Computers and Electronics in Agriculture, vol. 147, pp. 70–90, 2018.
[22] M. H. Saleem, J. Potgieter, and K. M. Arif, "Plant disease detection and classification by deep learning," Plants, vol. 8, no. 11, p. 468, 2019.
[23] K. G. Liakos, P. Busato, D. Moshou, S. Pearson, and D. Bochtis, "Machine learning in agriculture: A review," Sensors, vol. 18, no. 8, p. 2674, 2018.
[24] J. G. Barbedo, "Factors influencing the use of deep learning for plant disease recognition," Biosystems Engineering, vol. 172, pp. 84–91, 2018.
[25] R. Wang, J. Zhang, W. Dong, J. Yu, C. Xie, R. Li, T. Chen, and H. Chen, "A crop pests image classification algorithm based on deep convolutional neural network," TELKOMNIKA (Telecommunication Computing Electronics and Control), vol. 15, no. 3, pp. 1239–1246, 2017.
[26] F. Ren, W. Liu, and G. Wu, "Feature reuse residual networks for insect pest recognition," IEEE Access, vol. 7, pp. 122758–122768, 2019.
[27] Z. Liu, J. Gao, G. Yang, H. Zhang, and Y. He, "Localization and classification of paddy field pests using a saliency map and deep convolutional neural network," Scientific Reports, vol. 6, no. 1, pp. 1–12, 2016.
[28] L. Nanni, G. Maguolo, and F. Pancino, "Insect pest image detection and recognition based on bio-inspired methods," Ecological Informatics, vol. 57, p. 101089, 2020.
[29] E. Ayan, H. Erbay, and F. Varçın, "Crop pest classification with a genetic algorithm-based weighted ensemble of deep convolutional neural networks," Computers and Electronics in Agriculture, vol. 179, p. 105809, 2020.
[30] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, "Gradient-based learning applied to document recognition," Proceedings of the IEEE, vol. 86, no. 11, pp. 2278–2324, 1998.
[31] L. Alzubaidi, J. Zhang, A. J. Humaidi, A. Al-Dujaili, Y. Duan, O. Al-Shamma, J. Santamaría, M. A. Fadhel, M. Al-Amidie, and L. Farhan, "Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions," Journal of Big Data, vol. 8, pp. 1–74, 2021.
[32] J. Feng, L. Wang, M. Sugiyama, C. Yang, Z.-H. Zhou, and C. Zhang, "Boosting and margin theory," Frontiers of Electrical and Electronic Engineering, vol. 7, pp. 127–133, 2012.
[33] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," Communications of the ACM, vol. 60, no. 6, pp. 84–90, 2017.
[34] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning," Image Recognition, vol. 7, 2015.
[35] Z. Liu, Y. Lin, Y. Cao, H. Hu, Y. Wei, Z. Zhang, S. Lin, and B. Guo, "Swin Transformer: Hierarchical vision transformer using shifted windows," in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021, pp. 10012–10022.
[36] F. Zhang, M. Li, G. Zhai, and Y. Liu, "Multi-branch and multi-scale attention learning for fine-grained visual categorization," in International Conference on Multimedia Modeling. Springer, 2021, pp. 136–147.
[37] J. F. H. Santa Cruz, "An ensemble approach for multi-stage transfer learning models for COVID-19 detection from chest CT scans," Intelligence-Based Medicine, vol. 5, p. 100027, 2021.
[38] Z. Anwar and S. Masood, "Exploring deep ensemble model for insect and pest detection from images," Procedia Computer Science, vol. 218, pp. 2328–2337, 2023.
[39] M. Hamilton, Z. Zhang, B. Hariharan, N. Snavely, and W. T. Freeman, "Unsupervised semantic segmentation by distilling feature correspondences," in International Conference on Learning Representations, 2022. [Online]. Available: https://openreview.net/forum?id=SaKO6z6Hl0c
[40] S.-Y. Zhou and C.-Y. Su, "Efficient convolutional neural network for pest recognition: ExquisiteNet," in 2020 IEEE Eurasia Conference on IOT, Communication and Engineering (ECICE). IEEE, 2020, pp. 216–219.
[41] Z. Yang, X. Yang, M. Li, and W. Li, "Automated garden-insect recognition using improved lightweight convolution network," Information Processing in Agriculture, 2021.
[42] L. Nanni, A. Manfè, G. Maguolo, A. Lumini, and S. Brahnam, "High performing ensemble of convolutional neural networks for insect pest image detection," Ecological Informatics, vol. 67, p. 101515, 2022.
[43] M. K. Khan and M. O. Ullah, "Deep transfer learning inspired automatic insect pest recognition," in Proceedings of the 3rd International Conference on Computational Sciences and Technologies, Mehran University of Engineering and Technology, Jamshoro, Pakistan, 2022, pp. 17–19.
[44] C. Li, T. Zhen, and Z. Li, "Image classification of pests with residual neural network based on transfer learning," Applied Sciences, vol. 12, no. 9, p. 4356, 2022.
[45] L. Deng, Z. Mao, X. Li, Z. Hu, F. Duan, and Y. Yan, "UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 146, pp. 124–136, 2018.
[46] C. Xie, J. Zhang, R. Li, J. Li, P. Hong, J. Xia, and P. Chen, "Automatic classification for field crop insects via multiple-task sparse representation and multiple kernel learning," Computers and Electronics in Agriculture, vol. 119, pp. 123–132, 2015.
[47] K. Thenmozhi and U. S. Reddy, "Crop pest classification based on deep convolutional neural network and transfer learning," Computers and Electronics in Agriculture, vol. 164, p. 104906, 2019.
[48] W. Dawei, D. Limiao, N. Jiangong, G. Jiyue, Z. Hongfei, and H. Zhongzhi, "Recognition pest by image-based transfer learning," Journal of the Science of Food and Agriculture, vol. 99, no. 10, pp. 4524–4531, 2019.
[49] A. Hazafa, N. Jahan, M. A. Zia, K.-U. Rahman, M. Sagheer, and M. Naeem, "Evaluation and optimization of nanosuspensions of Chrysanthemum coronarium and Azadirachta indica using response surface methodology for pest management," Chemosphere, vol. 292, p. 133411, en_US
dc.identifier.uri http://hdl.handle.net/123456789/2051
dc.description Supervised by Prof. Dr. Md. Hasanul Kabir; co-supervised by Mr. Sabbir Ahmed, Assistant Professor, Department of Computer Science and Engineering (CSE), Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh en_US
dc.description.abstract One of the biggest challenges farmers face is fighting insect pests while raising agricultural crops. Timely preventive measures can avert the resulting economic losses, but they require identifying insect pests easily and accurately. Many insect species closely resemble one another, so without support from agricultural experts it is very difficult for farmers to identify crop pests correctly. To address this issue, we carried out extensive experiments with different methods to determine which performs best. This work presents a detailed overview of experiments conducted mainly on the large-scale IP102 benchmark dataset, covering transfer learning with fine-tuning, attention mechanisms, and a custom architecture. Results on a second dataset, D0, are also reported to demonstrate the robustness of the examined techniques. On the two datasets our proposed model performed well, achieving accuracies of 78% and 99.70%, respectively. en_US
dc.language.iso en en_US
dc.publisher Department of Computer Science and Engineering (CSE), Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh en_US
dc.title An Efficient Deep Learning-Based Approach for Recognizing Agricultural Pests in the Wild en_US
dc.type Thesis en_US
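
The abstract above describes a pipeline built around transfer learning with fine-tuning (alongside attention mechanisms and a custom architecture), evaluated on the IP102 benchmark of 102 pest classes. Below is a minimal, hypothetical sketch of that general transfer-learning-plus-fine-tuning strategy in PyTorch. It is not the thesis's actual implementation: the EfficientNetV2-S backbone, the two-stage freeze/unfreeze schedule, the hyperparameters, and the "ip102/train" directory layout are all assumptions made for illustration.

# Illustrative sketch only (not the thesis's exact pipeline): two-stage
# transfer learning + fine-tuning for 102-class pest recognition.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 102  # IP102 defines 102 insect pest categories

# Load an ImageNet-pretrained backbone and replace its classification head.
model = models.efficientnet_v2_s(weights=models.EfficientNet_V2_S_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, NUM_CLASSES)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("ip102/train", transform=preprocess)  # hypothetical path
train_loader = DataLoader(train_set, batch_size=32, shuffle=True, num_workers=4)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
criterion = nn.CrossEntropyLoss()

def run_epoch(optimizer):
    """Run one training epoch with the given optimizer."""
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Stage 1 (transfer learning): freeze the backbone, train only the new head.
for p in model.features.parameters():
    p.requires_grad = False
run_epoch(torch.optim.AdamW((p for p in model.parameters() if p.requires_grad), lr=1e-3))

# Stage 2 (fine-tuning): unfreeze the backbone and train end to end at a lower rate.
for p in model.features.parameters():
    p.requires_grad = True
run_epoch(torch.optim.AdamW(model.parameters(), lr=1e-4))

The freeze-then-unfreeze schedule mirrors the common transfer-learning recipe: the randomly initialized head is warmed up first so that large early gradients do not disturb the pretrained features, after which the whole network is fine-tuned at a smaller learning rate.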

