Author affiliations: Department of Software Engineering, University of Engineering and Technology, Taxila 47050, Pakistan; Department of Computer Science & Engineering, College of Applied Studies & Community Service, King Saud University, Riyadh 11362, Saudi Arabia; Department of Information Technology, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah 21589, Saudi Arabia; Department of Computer Science, University of Engineering and Technology, Taxila 47050, Pakistan
Publication: Computers, Materials & Continua
Year/Volume/Issue: 2025, Vol. 82, No. 2
Pages: 2373-2388
Subject classification: 0202 [Economics - Applied Economics]; 02 [Economics]; 020205 [Economics - Industrial Economics]
Funding: Supported by King Saud University, Riyadh, Saudi Arabia, through the Researchers Supporting Project under Grant RSPD2025R697
Keywords: deep learning; classification of pests; YOLOCSP-PEST; pest detection
Abstract: Crop preservation depends on early and accurate detection of pests, which cause diseases that reduce crop production and quality. Several deep-learning techniques have been applied to the problem of pest detection on crops. We developed the YOLOCSP-PEST model for pest localization and classification. The proposed model is a modified version of You Only Look Once Version 7 (YOLOv7) with a Cross Stage Partial Network (CSPNet) backbone, designed primarily for pest localization and classification. It performs exceptionally well under conditions that are challenging for comparable models, in particular images with difficult luminance and orientation. It enables farmers working on crops in remote areas to identify infestations quickly and accurately, which supports both the quality and quantity of the production yield. The model was trained and tested on two datasets, the IP102 dataset and a local crop dataset, and showed strong results on both. It achieved a mean average precision (mAP) of 88.40%, a precision of 85.55%, and a recall of 84.25% on the IP102 dataset, and a mAP of 97.18%, a recall of 94.88%, and a precision of 97.50% on the local dataset. These findings demonstrate that the proposed model is effective in real-life detection scenarios and can help improve crop yield quality and quantity.
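The abstract reports precision, recall, and mean average precision (mAP) on both datasets. As a minimal sketch of how these detection metrics are conventionally defined (the function names below are illustrative, not from the paper, and the AP routine uses the common all-point interpolation rather than the authors' exact evaluation code):

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP); Recall = TP/(TP+FN), from matched detections."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def average_precision(recalls, precisions):
    """AP for one class: area under the precision-recall curve,
    using all-point interpolation (precision made monotonically
    non-increasing from right to left). mAP is the mean of per-class APs."""
    # Add sentinel endpoints to the curve.
    r = [0.0] + list(recalls) + [1.0]
    p = [0.0] + list(precisions) + [0.0]
    # Interpolate: each precision becomes the max of itself and all to its right.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangular areas wherever recall increases.
    return sum((r[i] - r[i - 1]) * p[i] for i in range(1, len(r)))
```

For example, 80 true positives with 20 false positives and 15 false negatives gives a precision of 0.80 and a recall of about 0.84, in the same range as the IP102 figures reported above.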