{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,25]],"date-time":"2026-03-25T16:02:20Z","timestamp":1774454540890,"version":"3.50.1"},"reference-count":32,"publisher":"MDPI AG","issue":"21","license":[{"start":{"date-parts":[[2024,10,22]],"date-time":"2024-10-22T00:00:00Z","timestamp":1729555200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Automating pruning tasks entails overcoming several challenges, encompassing not only robotic manipulation but also environment perception and detection. To achieve efficient pruning, robotic systems must accurately identify the correct cutting points. One possible way to define these points is to choose the cutting location based on the number of nodes present on the targeted cane. In grapevine pruning, this requires correctly identifying the nodes present on the primary canes of the grapevines. In this paper, a novel method for node detection in grapevines is proposed using four distinct state-of-the-art versions of the YOLO detection model: YOLOv7, YOLOv8, YOLOv9 and YOLOv10. These models were trained on a public dataset of images containing artificial backgrounds and afterwards validated on different grapevine cultivars from two distinct Portuguese viticulture regions with cluttered backgrounds. This allowed us to evaluate the robustness of the algorithms in detecting nodes in diverse environments, to compare the performance of the YOLO models used, and to create a publicly available dataset of grapevines obtained in Portuguese vineyards for node detection. Overall, all the models used were capable of correctly detecting nodes in grapevine images from the three distinct datasets. 
Considering the trade-off between accuracy and inference speed, the YOLOv7 model proved to be the most robust in detecting nodes in 2D images of grapevines, achieving F1-Score values between 70% and 86.5% with inference times of around 89 ms for an input size of 1280 \u00d7 1280 px. Given these results, this work contributes an efficient approach for real-time node detection for further implementation in an autonomous robotic pruning system.<\/jats:p>","DOI":"10.3390\/s24216774","type":"journal-article","created":{"date-parts":[[2024,10,22]],"date-time":"2024-10-22T06:11:25Z","timestamp":1729577485000},"page":"6774","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":5,"title":["Enhancing Grapevine Node Detection to Support Pruning Automation: Leveraging State-of-the-Art YOLO Detection Models for 2D Image Analysis"],"prefix":"10.3390","volume":"24","author":[{"ORCID":"https:\/\/orcid.org\/0009-0003-2400-3601","authenticated-orcid":false,"given":"Francisco","family":"Oliveira","sequence":"first","affiliation":[{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes and Alto Douro (UTAD), 5000-801 Vila Real, Portugal"},{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9999-1550","authenticated-orcid":false,"given":"Daniel Queir\u00f3s","family":"da Silva","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3747-6577","authenticated-orcid":false,"given":"V\u00edtor","family":"Filipe","sequence":"additional","affiliation":[{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes and Alto Douro (UTAD), 5000-801 Vila Real, Portugal"},{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, 
Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1403-7221","authenticated-orcid":false,"given":"Tatiana Martins","family":"Pinho","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8299-324X","authenticated-orcid":false,"given":"M\u00e1rio","family":"Cunha","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"Faculty of Sciences, University of Porto (FCUP), 4169-007 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8406-0064","authenticated-orcid":false,"given":"Jos\u00e9 Boaventura","family":"Cunha","sequence":"additional","affiliation":[{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes and Alto Douro (UTAD), 5000-801 Vila Real, Portugal"},{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe Neves","family":"dos Santos","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2024,10,22]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1100","DOI":"10.1002\/rob.21680","article-title":"A Robot System for Pruning Grape Vines","volume":"34","author":"Botterill","year":"2017","journal-title":"J. Field Robot."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"211","DOI":"10.5344\/ajev.2022.22011","article-title":"Facing Spring Frost Damage in Grapevine: Recent Developments and the Role of Delayed Winter Pruning\u2014A Review","volume":"73","author":"Poni","year":"2022","journal-title":"Am. J. Enol. Vitic."},{"key":"ref_3","unstructured":"Reich, L. (2010). 
The Pruning Book, Taunton Press."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Silwal, A., Yandun, F., Nellithimaru, A., Bates, T., and Kantor, G. (2021). Bumblebee: A Path Towards Fully Autonomous Robotic Vine Pruning. arXiv.","DOI":"10.55417\/fr.2022051"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"31","DOI":"10.1016\/j.biosystemseng.2023.09.006","article-title":"Modelling wine grapevines for autonomous robotic cane pruning","volume":"235","author":"Williams","year":"2023","journal-title":"Biosyst. Eng."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Oliveira, F., Tinoco, V., Magalh\u00e3es, S., Santos, F.N., and Silva, M.F. (2022, January 29\u201330). End-Effectors for Harvesting Manipulators\u2014State of the Art Review. Proceedings of the 2022 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Santa Maria da Feira, Portugal.","DOI":"10.1109\/ICARSC55462.2022.9784809"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"He, L., and Schupp, J. (2018). Sensing and Automation in Pruning of Apple Trees: A Review. Agronomy, 8.","DOI":"10.3390\/agronomy8100211"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"313","DOI":"10.20870\/oeno-one.2020.54.2.3016","article-title":"Effects of canopy management practices on grapevine bud fruitfulness","volume":"54","author":"Collins","year":"2020","journal-title":"OENO ONE"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Cuevas-Velasquez, H., Gallego, A.J., Tylecek, R., Hemming, J., Van Tuijl, B., Mencarelli, A., and Fisher, R.B. (August, January 31). Real-time Stereo Visual Servoing for Rose Pruning with Robotic Arm. 
Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France.","DOI":"10.1109\/ICRA40945.2020.9197272"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"105947","DOI":"10.1016\/j.compag.2020.105947","article-title":"Towards practical 2D grapevine bud detection with fully convolutional networks","volume":"182","author":"Bromberg","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"107736","DOI":"10.1016\/j.compag.2023.107736","article-title":"Towards smart pruning: ViNet, a deep-learning approach for grapevine structure estimation","volume":"207","author":"Gentilhomme","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 27\u201330). You Only Look Once: Unified, Real-Time Object Detection. Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Wang, C., Bochkovskiy, A., and Liao, H. (2023, January 17\u201324). YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. Proceedings of the 2023 IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada.","DOI":"10.1109\/CVPR52729.2023.00721"},{"key":"ref_14","unstructured":"Jocher, G., Chaurasia, A., and Qiu, J. (2024, October 17). Ultralytics YOLOv8. Available online: https:\/\/docs.ultralytics.com\/pt\/models\/yolov8\/."},{"key":"ref_15","unstructured":"Wang, C.Y., and Liao, H.Y.M. (2024). YOLOv9: Learning What You Want to Learn Using Programmable Gradient Information. arXiv."},{"key":"ref_16","unstructured":"Wang, A., Chen, H., Liu, L., Chen, K., Lin, Z., Han, J., and Ding, G. (2024). YOLOv10: Real-Time End-to-End Object Detection. 
arXiv."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"31","DOI":"10.1051\/ctv\/20183301031","article-title":"Portuguese vines and wines: Heritage, quality symbol, tourism asset","volume":"33","year":"2018","journal-title":"Ci\u00eancia T\u00e9c. Vitiv."},{"key":"ref_18","unstructured":"Oliveira, F.A., and Silva, D.Q. (2024). Douro & D\u00e3o Grapevines Dataset for Node Detection, CERN Data Centre."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Casas, G.G., Ismail, Z.H., Limeira, M.M.C., da Silva, A.A.L., and Leite, H.G. (2023). Automatic Detection and Counting of Stacked Eucalypt Timber Using the YOLOv8 Model. Forests, 14.","DOI":"10.3390\/f14122369"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Xie, S., and Sun, H. (2023). Tea-YOLOv8s: A Tea Bud Detection Model Based on Deep Learning and Computer Vision. Sensors, 23.","DOI":"10.3390\/s23146576"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Lin, T.Y., Maire, M., Belongie, S., Bourdev, L., Girshick, R., Hays, J., Perona, P., Ramanan, D., Zitnick, C.L., and Doll\u00e1r, P. (2015). Microsoft COCO: Common Objects in Context. arXiv.","DOI":"10.1007\/978-3-319-10602-1_48"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"1680","DOI":"10.3390\/make5040083","article-title":"A Comprehensive Review of YOLO Architectures in Computer Vision: From YOLOv1 to YOLOv8 and YOLO-NAS","volume":"5","author":"Terven","year":"2023","journal-title":"Mach. Learn. Knowl. Extr."},{"key":"ref_23","unstructured":"Jocher, G. (2020). YOLOv5 by Ultralytics, CERN Data Centre."},{"key":"ref_24","unstructured":"Li, X., Wang, W., Wu, L., Chen, S., Hu, X., Li, J., Tang, J., and Yang, J. (2020, January 6\u201312). Generalized focal loss: Learning qualified and distributed bounding boxes for dense object detection. Proceedings of the 34th International Conference on Neural Information Processing Systems, Red Hook, NY, USA. 
NIPS \u201920."},{"key":"ref_25","first-page":"12993","article-title":"Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression","volume":"34","author":"Zheng","year":"2020","journal-title":"Proc. AAAI Conf. Artif. Intell."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"365","DOI":"10.1146\/annurev-control-053018-023617","article-title":"Agricultural Robotics","volume":"2","author":"Vougioukas","year":"2019","journal-title":"Annu. Rev. Control. Robot. Auton. Syst."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"94","DOI":"10.1016\/j.biosystemseng.2016.06.014","article-title":"Agricultural robots for field operations: Concepts and components","volume":"149","author":"Bechar","year":"2016","journal-title":"Biosyst. Eng."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Hsueh, B.Y., Li, W., and Wu, I.C. (2019, January 7\u201311). Stochastic Gradient Descent With Hyperbolic-Tangent Decay on Classification. Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA.","DOI":"10.1109\/WACV.2019.00052"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Orr, G.B., and M\u00fcller, K.R. (1998). Early Stopping\u2014But When?. Neural Networks: Tricks of the Trade, Springer.","DOI":"10.1007\/3-540-49430-8"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Butko, N.J., and Movellan, J.R. (2009, January 20\u201325). Optimal scanning for faster object detection. Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA.","DOI":"10.1109\/CVPRW.2009.5206540"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Wenkel, S., Alhazmi, K., Liiv, T., Alrshoud, S., and Simon, M. (2021). Confidence Score: The Forgotten Dimension of Object Detection Performance Evaluation. 
Sensors, 21.","DOI":"10.3390\/s21134350"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"106383","DOI":"10.1016\/j.compag.2021.106383","article-title":"Technological advancements towards developing a robotic pruner for apple trees: A review","volume":"189","author":"Zahid","year":"2021","journal-title":"Comput. Electron. Agric."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/21\/6774\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T16:17:38Z","timestamp":1760113058000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/21\/6774"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,10,22]]},"references-count":32,"journal-issue":{"issue":"21","published-online":{"date-parts":[[2024,11]]}},"alternative-id":["s24216774"],"URL":"https:\/\/doi.org\/10.3390\/s24216774","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,10,22]]}}}