Author affiliations: South China Agr Univ, Coll Engn, Key Lab Key Technol Agr Machine & Equipment, Minist Educ, Guangzhou 510642, Peoples R China; South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China; Univ Liverpool, Dept Comp Sci, Liverpool L69 3BX, England
Publication: COMPUTERS AND ELECTRONICS IN AGRICULTURE (Comput. Electron. Agric.)
Year/Volume/Issue: 2025, Vol. 231
Subject classification: 09 [Agriculture]; 0901 [Agriculture - Crop Science]; 0812 [Engineering - Computer Science and Technology (engineering or science degrees may be conferred)]
Funding: National Natural Science Foundation of China; The 2024 Basic and Applied Research Project of Guangzhou Science and Technology Plan [2024A04J4140]; State Key Laboratory of Robotics and Systems (HIT) [SKLRS-2024-KF-08]; State Key Laboratory of Agricultural Equipment Technology [NKLAET-202408]; Specific University Discipline Construction Project [2023B10564002]; Young Talent Support Project of Guangzhou Association for Science and Technology [QT2024-006]
Keywords: Agricultural robotics; Target detection; Image processing; Lightweight networks; Pineapple
Abstract: Automatic detection of pineapples in complex agricultural environments poses several challenges. During harvesting, pineapples that are suitable for collection exhibit intricate scaly surface textures and a wide range of colors, and occlusion by leaves and fluctuating lighting conditions further complicate detection. In this paper, we propose a high-precision lightweight detection network based on an improved You Only Look Once version 7-tiny (Pineapple-YOLO) for the robot vision system, enabling real-time and accurate detection of pineapples. The Convolutional Block Attention Module (CBAM) is embedded into the backbone network to enhance feature extraction, and Content-Aware Reassembly of Features (CARAFE) is introduced to perform up-sampling and expand the receptive field. The Scylla Intersection over Union (SIoU) loss function replaces the Complete Intersection over Union (CIoU) loss function to take the vector angle of the regression into account and redefine the penalty metric. Finally, the K-means++ clustering algorithm is used to re-cluster the bounding-box labels of the pineapple dataset and update the anchor sizes. The experimental results show that Pineapple-YOLO achieves a mAP@0.5 of 89.7%, a 6.15% improvement over the original YOLOv7-tiny, and outperforms other mainstream target detection models. Furthermore, in the diverse natural environments where the agricultural robot operates, the Pineapple-YOLO algorithm sustains a 92% fruit-picking success rate with an average picking time of 12 s, demonstrating the efficiency of the vision module in practical engineering applications.
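
The abstract's final step (re-clustering the dataset's box labels with K-means++ to update the anchor sizes) can be illustrated with a short sketch. This is only a minimal illustration under assumptions not stated in the record: YOLO-format txt labels, a 640x640 training resolution, 9 anchors (the YOLOv7-tiny default), and scikit-learn's k-means++ initialization. It is not the authors' implementation.

```python
# Sketch: derive detection anchors by clustering labeled box sizes with K-means++.
# Assumptions (not from the paper): labels are YOLO txt files
# ("class x_center y_center width height", all normalized), images are
# trained at 640x640, and 9 anchors are wanted (YOLOv7-tiny default).
from pathlib import Path

import numpy as np
from sklearn.cluster import KMeans


def load_box_sizes(label_dir: str, img_size: int = 640) -> np.ndarray:
    """Collect the (width, height) of every labeled box, scaled to pixels."""
    sizes = []
    for txt in Path(label_dir).glob("*.txt"):
        for line in txt.read_text().splitlines():
            parts = line.split()
            if len(parts) >= 5:
                w, h = float(parts[3]), float(parts[4])
                sizes.append((w * img_size, h * img_size))
    return np.array(sizes)


def cluster_anchors(box_sizes: np.ndarray, n_anchors: int = 9) -> np.ndarray:
    """Run K-means with k-means++ initialization; return anchors sorted by area."""
    km = KMeans(n_clusters=n_anchors, init="k-means++", n_init=10, random_state=0)
    km.fit(box_sizes)
    anchors = km.cluster_centers_
    return anchors[np.argsort(anchors.prod(axis=1))]


if __name__ == "__main__":
    wh = load_box_sizes("pineapple_dataset/labels")  # hypothetical path
    print(np.round(cluster_anchors(wh)).astype(int))
```

Note that many YOLO codebases cluster in an IoU-based metric space rather than plain Euclidean width-height space; the Euclidean form above is kept only for brevity.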