{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,10]],"date-time":"2026-04-10T16:48:46Z","timestamp":1775839726158,"version":"3.50.1"},"reference-count":54,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2023,1,16]],"date-time":"2023-01-16T00:00:00Z","timestamp":1673827200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"the Italian Ministry of University and Research (MUR) under the PON Agrifood Program","award":["ARS01_01136"],"award-info":[{"award-number":["ARS01_01136"]}]},{"name":"the Italian Ministry of University and Research (MUR) under the PON Agrifood Program","award":["DIT.AD022.180"],"award-info":[{"award-number":["DIT.AD022.180"]}]},{"name":"the Italian Ministry of University and Research (MUR) under the PON Agrifood Program","award":["FOE 2020"],"award-info":[{"award-number":["FOE 2020"]}]},{"name":"E-crops","award":["ARS01_01136"],"award-info":[{"award-number":["ARS01_01136"]}]},{"name":"E-crops","award":["DIT.AD022.180"],"award-info":[{"award-number":["DIT.AD022.180"]}]},{"name":"E-crops","award":["FOE 2020"],"award-info":[{"award-number":["FOE 2020"]}]},{"name":"Transizione industriale e resilienza delle Societ\u00e0 post-Covid19","award":["ARS01_01136"],"award-info":[{"award-number":["ARS01_01136"]}]},{"name":"Transizione industriale e resilienza delle Societ\u00e0 post-Covid19","award":["DIT.AD022.180"],"award-info":[{"award-number":["DIT.AD022.180"]}]},{"name":"Transizione industriale e resilienza delle Societ\u00e0 post-Covid19","award":["FOE 2020"],"award-info":[{"award-number":["FOE 2020"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Weeds are a crucial threat to agriculture, and in order to preserve crop productivity, spreading agrochemicals is a common practice with a potential negative impact on the environment. Methods that can support intelligent application are needed. Therefore, identification and mapping is a critical step in performing site-specific weed management. Unmanned aerial vehicle (UAV) data streams are considered the best for weed detection due to the high resolution and flexibility of data acquisition and the spatial explicit dimensions of imagery. However, with the existence of unstructured crop conditions and the high biological variation of weeds, it remains a difficult challenge to generate accurate weed recognition and detection models. Two critical barriers to tackling this challenge are related to (1) a lack of case-specific, large, and comprehensive weed UAV image datasets for the crop of interest, (2) defining the most appropriate computer vision (CV) weed detection models to assess the operationality of detection approaches in real case conditions. Deep Learning (DL) algorithms, appropriately trained to deal with the real case complexity of UAV data in agriculture, can provide valid alternative solutions with respect to standard CV approaches for an accurate weed recognition model. In this framework, this paper first introduces a new weed and crop dataset named Chicory Plant (CP) and then tests state-of-the-art DL algorithms for object detection. A total of 12,113 bounding box annotations were generated to identify weed targets (Mercurialis annua) from more than 3000 RGB images of chicory plantations, collected using a UAV system at various stages of crop and weed growth. 
Deep weed object detection was conducted by testing the most recent You Only Look Once version 7 (YOLOv7) on both the CP and publicly available datasets (Lincoln beet (LB)), for which a previous version of YOLO was used to map weeds and crops. The YOLOv7 results obtained for the CP dataset were encouraging, outperforming the other YOLO variants by producing value metrics of 56.6%, 62.1%, and 61.3% for the mAP@0.5 scores, recall, and precision, respectively. Furthermore, the YOLOv7 model applied to the LB dataset surpassed the existing published results by increasing the mAP@0.5 scores from 51% to 61%, 67.5% to 74.1%, and 34.6% to 48% for the total mAP, mAP for weeds, and mAP for sugar beets, respectively. This study illustrates the potential of the YOLOv7 model for weed detection but remarks on the fundamental needs of large-scale, annotated weed datasets to develop and evaluate models in real-case field circumstances.<\/jats:p>","DOI":"10.3390\/rs15020539","type":"journal-article","created":{"date-parts":[[2023,1,17]],"date-time":"2023-01-17T02:58:16Z","timestamp":1673924296000},"page":"539","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":189,"title":["Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images"],"prefix":"10.3390","volume":"15","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-7076-8328","authenticated-orcid":false,"given":"Ignazio","family":"Gallo","sequence":"first","affiliation":[{"name":"Department of Theoretical and Applied Science, University of Insubria, 20100 Varese, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9384-8988","authenticated-orcid":false,"given":"Anwar Ur","family":"Rehman","sequence":"additional","affiliation":[{"name":"Department of Theoretical and Applied Science, University of Insubria, 20100 Varese, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8636-4934","authenticated-orcid":false,"given":"Ramin Heidarian","family":"Dehkordi","sequence":"additional","affiliation":[{"name":"Institute for Electromagnetic Sensing of the Environment, National Research Council, 20133 Milan, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0565-7496","authenticated-orcid":false,"given":"Nicola","family":"Landro","sequence":"additional","affiliation":[{"name":"Department of Theoretical and Applied Science, University of Insubria, 20100 Varese, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4355-0366","authenticated-orcid":false,"given":"Riccardo","family":"La Grassa","sequence":"additional","affiliation":[{"name":"The Italian National Institute for Astrophysics, 00100 Rome, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2156-4166","authenticated-orcid":false,"given":"Mirco","family":"Boschetti","sequence":"additional","affiliation":[{"name":"Institute for Electromagnetic Sensing of the Environment, National Research Council, 20133 Milan, Italy"}]}],"member":"1968","published-online":{"date-parts":[[2023,1,16]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"7","DOI":"10.1017\/wet.2017.70","article-title":"Beyond precision weed control: A model for true integration","volume":"32","author":"Young","year":"2018","journal-title":"Weed Technol."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"339","DOI":"10.3390\/agriengineering3020023","article-title":"Opportunities for robotic systems and automation in cotton 
production","volume":"3","author":"Barnes","year":"2021","journal-title":"AgriEngineering"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"557","DOI":"10.13031\/trans.14085","article-title":"Frontier: Autonomy in Detection, Actuation, and Planning for Robotic Weeding Systems","volume":"64","author":"Pandey","year":"2021","journal-title":"Trans. ASABE"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"241","DOI":"10.1111\/wre.12418","article-title":"Thermal weed control technologies for conservation agriculture\u2014A review","volume":"60","author":"Bauer","year":"2020","journal-title":"Weed Res."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"342","DOI":"10.1017\/wet.2019.120","article-title":"Crop signal markers facilitate crop detection and weed removal from lettuce and tomato by an intelligent cultivator","volume":"34","author":"Kennedy","year":"2020","journal-title":"Weed Technol."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"215","DOI":"10.1111\/j.1365-3180.2008.00629.x","article-title":"Innovation in mechanical weed control in crop rows","volume":"48","author":"Bleeker","year":"2008","journal-title":"Weed Res."},{"key":"ref_7","first-page":"231","article-title":"Precision weed control system for cotton","volume":"45","author":"Lamm","year":"2002","journal-title":"Trans. ASAE"},{"key":"ref_8","first-page":"4","article-title":"See & Spray: The next generation of weed control","volume":"24","author":"Chostner","year":"2017","journal-title":"Resour. Mag."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"123","DOI":"10.1111\/wre.12526","article-title":"Advances in site-specific weed management in agriculture\u2014A review","volume":"62","author":"Gerhards","year":"2022","journal-title":"Weed Res."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"107091","DOI":"10.1016\/j.compag.2022.107091","article-title":"Performance evaluation of deep transfer learning on multi-class identification of common weed species in cotton production systems","volume":"198","author":"Chen","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1038\/s41598-018-38343-3","article-title":"DeepWeeds: A multiclass weed species image dataset for deep learning","volume":"9","author":"Olsen","year":"2019","journal-title":"Sci. Rep."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"50","DOI":"10.1016\/j.biosystemseng.2018.06.017","article-title":"Transfer learning for the classification of sugar beet and volunteer potato under field conditions","volume":"174","author":"Suh","year":"2018","journal-title":"Biosyst. Eng."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"105306","DOI":"10.1016\/j.compag.2020.105306","article-title":"Towards weeds identification assistance through transfer learning","volume":"171","author":"Mylonas","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"72","DOI":"10.1016\/j.biosystemseng.2016.08.024","article-title":"Plant species classification using deep convolutional neural network","volume":"151","author":"Dyrmann","year":"2016","journal-title":"Biosyst. Eng."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"142","DOI":"10.1109\/TPAMI.2015.2437384","article-title":"Region-based convolutional networks for accurate object detection and segmentation","volume":"38","author":"Girshick","year":"2015","journal-title":"IEEE Trans. Pattern Anal. Mach. 
Intell."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"128837","DOI":"10.1109\/ACCESS.2019.2939201","article-title":"A survey of deep learning-based object detection","volume":"7","author":"Jiao","year":"2019","journal-title":"IEEE Access"},{"key":"ref_17","unstructured":"Ren, S., He, K., Girshick, R., and Sun, J. (2015, January 7\u201312). Faster r-cnn: Towards real-time object detection with region proposal networks. Proceedings of the 28th International Conference on Neural Information Processing Systems, Montreal, QC, Canada."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"He, K., Gkioxari, G., Doll\u00e1r, P., and Girshick, R. (2017, January 22\u201329). Mask r-cnn. Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.322"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 27\u201330). You only look once: Unified, real-time object detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Redmon, J., and Farhadi, A. (2017, January 21\u201326). YOLO9000: Better, faster, stronger. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.690"},{"key":"ref_21","unstructured":"Redmon, J., and Farhadi, A. (2018). Yolov3: An incremental improvement. arXiv."},{"key":"ref_22","unstructured":"Bochkovskiy, A., Wang, C.Y., and Liao, H.Y.M. (2020). Yolov4: Optimal speed and accuracy of object detection. arXiv."},{"key":"ref_23","unstructured":"Glenn, J. (2022, December 01). What Is YOLOv5?. Available online: https:\/\/github.com\/ultralytics\/yolov5."},{"key":"ref_24","unstructured":"Li, C., Li, L., Jiang, H., Weng, K., Geng, Y., Li, L., Ke, Z., Li, Q., Cheng, M., and Nie, W. (2022). Yolov6: A single-stage object detection framework for industrial applications. arXiv."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Wang, C.Y., Bochkovskiy, A., and Liao, H.Y.M. (2022). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv.","DOI":"10.1109\/CVPR52729.2023.00721"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1186\/s13007-020-00570-z","article-title":"Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields","volume":"16","author":"Gao","year":"2020","journal-title":"Plant Methods"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"106081","DOI":"10.1016\/j.compag.2021.106081","article-title":"Performance of deep learning models for classifying and detecting common weeds in corn and soybean production systems","volume":"184","author":"Ahmad","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1038\/s41598-020-66505-9","article-title":"Goosegrass detection in strawberry and tomato using a convolutional neural network","volume":"10","author":"Sharpe","year":"2020","journal-title":"Sci. Rep."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Sun, C., Shrivastava, A., Singh, S., and Gupta, A. (2017, January 22\u201329). Revisiting unreasonable effectiveness of data in deep learning era. 
Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.97"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"105760","DOI":"10.1016\/j.compag.2020.105760","article-title":"A survey of public datasets for computer vision tasks in precision agriculture","volume":"178","author":"Lu","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"100028","DOI":"10.1016\/j.atech.2021.100028","article-title":"Eden library: A long-term database for storing agricultural multi-sensor datasets from uav and proximal platforms","volume":"2","author":"Mylonas","year":"2022","journal-title":"Smart Agric. Technol."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Wu, Z., Chen, Y., Zhao, B., Kang, X., and Ding, Y. (2021). Review of weed detection methods based on computer vision. Sensors, 21.","DOI":"10.3390\/s21113647"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Salazar-Gomez, A., Darbyshire, M., Gao, J., Sklar, E.I., and Parsons, S. (2021). Towards practical object detection for weed spraying in precision agriculture. arXiv.","DOI":"10.1109\/IROS47612.2022.9982139"},{"key":"ref_34","unstructured":"Gallo, I., Rehman, A.U., Dehkord, R.H., Landro, N., La Grassa, R., and Boschetti, M. (2022, December 01). Weed Detection by UAV 416a Image Dataset\u2014Chicory Crop Weed. Available online: https:\/\/universe.roboflow.com\/chicory-crop-weeds-5m7vo\/weed-detection-by-uav-416a\/dataset\/1."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Lottes, P., Khanna, R., Pfeifer, J., Siegwart, R., and Stachniss, C. (June, January 29). UAV-based crop and weed classification for smart farming. Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore.","DOI":"10.1109\/ICRA.2017.7989347"},{"key":"ref_36","first-page":"43","article-title":"Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery","volume":"67","author":"Gao","year":"2018","journal-title":"Int. J. Appl. Earth Obs. Geoinf."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"2870","DOI":"10.1109\/LRA.2018.2846289","article-title":"Fully convolutional networks with sequential information for robust crop and weed detection in precision farming","volume":"3","author":"Lottes","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_38","first-page":"116","article-title":"Effective plant discrimination based on the combination of local binary pattern operators and multiclass support vector machine methods","volume":"6","author":"Le","year":"2019","journal-title":"Inf. Process. Agric."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"588","DOI":"10.1109\/LRA.2017.2774979","article-title":"weednet: Dense semantic weed classification using multispectral images and mav for smart farming","volume":"3","author":"Sa","year":"2017","journal-title":"IEEE Robot. Autom. 
Lett."},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"10940","DOI":"10.1109\/ACCESS.2021.3050296","article-title":"Weed identification using deep learning and image processing in vegetable plantation","volume":"9","author":"Jin","year":"2021","journal-title":"IEEE Access"},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Milioto, A., Lottes, P., and Stachniss, C. (2018, January 21\u201325). Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging background knowledge in CNNs. Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia.","DOI":"10.1109\/ICRA.2018.8460962"},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Lottes, P., and Stachniss, C. (2017, January 24\u201328). Semi-supervised online visual crop and weed classification in precision farming exploiting plant arrangement. Proceedings of the 2017 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada.","DOI":"10.1109\/IROS.2017.8206403"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Etienne, A., Ahmad, A., Aggarwal, V., and Saraswat, D. (2021). Deep Learning-Based Object Detection System for Identifying Weeds Using UAS Imagery. Remote Sens., 13.","DOI":"10.3390\/rs13245182"},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"Peteinatos, G.G., Reichel, P., Karouta, J., And\u00fajar, D., and Gerhards, R. (2020). Weed identification in maize, sunflower, and potatoes with the aid of convolutional neural networks. Remote Sens., 12.","DOI":"10.3390\/rs12244185"},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"Bah, M.D., Hafiane, A., and Canals, R. (2018). Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens., 10.","DOI":"10.20944\/preprints201809.0088.v1"},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Di Cicco, M., Potena, C., Grisetti, G., and Pretto, A. (2017, January 24\u201328). Automatic model based dataset generation for fast and accurate crop and weeds detection. Proceedings of the 2017 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada.","DOI":"10.1109\/IROS.2017.8206408"},{"key":"ref_48","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1016\/j.biosystemseng.2021.01.014","article-title":"Combining generative adversarial networks and agricultural transfer learning for weeds identification","volume":"204","author":"Mylonas","year":"2021","journal-title":"Biosyst. Eng."},{"key":"ref_49","unstructured":"Dwyer, J. (2022, December 01). Quickly Label Training Data and Export To Any Format. Available online: https:\/\/roboflow.com\/annotate."},{"key":"ref_50","unstructured":"Chien, W. (2022, December 01). YOLOv7 Repositry with all Instruction. Available online: https:\/\/github.com\/WongKinYiu\/yolov7."},{"key":"ref_51","doi-asserted-by":"crossref","unstructured":"Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.Y., and Berg, A.C. (2016, January 8\u201316). Ssd: Single shot multibox detector. Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands.","DOI":"10.1007\/978-3-319-46448-0_2"},{"key":"ref_52","doi-asserted-by":"crossref","unstructured":"Chen, Q., Wang, Y., Yang, T., Zhang, X., Cheng, J., and Sun, J. (2021, January 20\u201325). You only look one-level feature. 
Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA.","DOI":"10.1109\/CVPR46437.2021.01284"},{"key":"ref_53","unstructured":"Jensen, P.K. (2011). Survey of Weeds in Maize Crops in Europe, Dept. of Integral Pest Management, Aarhus University."},{"key":"ref_54","unstructured":"(2022, August 01). Image Augmentation in Roboflow. Available online: https:\/\/docs.roboflow.com\/image-transformations\/image-augmentation."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/2\/539\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T18:07:45Z","timestamp":1760119665000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/2\/539"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,1,16]]},"references-count":54,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2023,1]]}},"alternative-id":["rs15020539"],"URL":"https:\/\/doi.org\/10.3390\/rs15020539","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,1,16]]}}}
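The record above is a Crossref REST API work response (note the "status" / "message-type" / "message" envelope). As a minimal sketch of how such a record can be retrieved and read programmatically, the snippet below queries the public https://api.crossref.org/works/{DOI} endpoint for the DOI contained in the record; the `requests` dependency, the script label, and the contact e-mail placed in the User-Agent header are illustrative assumptions, not part of the original record.

```python
# Minimal sketch: fetch this work's Crossref record and read a few fields.
# Assumes the third-party `requests` package; the response envelope
# ("status", "message-type", "message") matches the record shown above.
import requests

DOI = "10.3390/rs15020539"
url = f"https://api.crossref.org/works/{DOI}"

# The contact address is a placeholder; identifying yourself is a Crossref courtesy.
resp = requests.get(
    url,
    headers={"User-Agent": "example-metadata-script (mailto:you@example.org)"},
    timeout=30,
)
resp.raise_for_status()
work = resp.json()["message"]  # same structure as the "message" object above

print(work["title"][0])                                            # article title
print(work["container-title"][0], work["volume"], work["page"])    # Remote Sensing 15 539
print("cited by:", work["is-referenced-by-count"])

# Authors are a list of {"given", "family", "ORCID", ...} objects.
for a in work["author"]:
    print(a["family"], a["given"], a.get("ORCID", ""))

# The abstract is deposited as a JATS-tagged string; references as a list of dicts.
print(len(work.get("reference", [])), "references deposited")
```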
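The abstract reports mAP@0.5, recall, and precision for the YOLOv7 detector. As a generic illustration of what these detection metrics measure (not the authors' evaluation code), the sketch below greedily matches predicted boxes to ground-truth boxes at an IoU threshold of 0.5 and derives precision and recall from the resulting true and false positives; the box format and helper names are assumptions, and full mAP additionally averages precision over recall levels and over classes, which is omitted here.

```python
# Illustrative sketch: IoU matching at 0.5 for one image's detections.
# Boxes are (x1, y1, x2, y2); detections are assumed sorted by descending confidence.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_at_iou(detections, ground_truth, thr=0.5):
    """Greedily match each detection to the best unmatched ground-truth box with
    IoU >= thr, then compute precision and recall from TP/FP/FN counts."""
    unmatched = list(range(len(ground_truth)))
    tp = fp = 0
    for det in detections:
        best_j, best_iou = None, thr
        for j in unmatched:
            o = iou(det, ground_truth[j])
            if o >= best_iou:
                best_j, best_iou = j, o
        if best_j is None:
            fp += 1
        else:
            tp += 1
            unmatched.remove(best_j)
    fn = len(unmatched)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy example: two weed detections against two annotated weed boxes.
dets = [(10, 10, 50, 50), (60, 60, 90, 90)]
gts = [(12, 12, 48, 52), (200, 200, 240, 240)]
print(precision_recall_at_iou(dets, gts))  # (0.5, 0.5): one TP, one FP, one FN
```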