{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,7]],"date-time":"2026-05-07T15:49:21Z","timestamp":1778168961353,"version":"3.51.4"},"reference-count":40,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2023,4,14]],"date-time":"2023-04-14T00:00:00Z","timestamp":1681430400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"European Union\u2019s Horizon 2020 research and innovation programme","award":["857202"],"award-info":[{"award-number":["857202"]}]},{"name":"European Union\u2019s Horizon 2020 research and innovation programme","award":["PRR-C05-i03-I-000071"],"award-info":[{"award-number":["PRR-C05-i03-I-000071"]}]},{"name":"Component 5\u2014Capitalization and Business Innovation","award":["857202"],"award-info":[{"award-number":["857202"]}]},{"name":"Component 5\u2014Capitalization and Business Innovation","award":["PRR-C05-i03-I-000071"],"award-info":[{"award-number":["PRR-C05-i03-I-000071"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Agronomy"],"abstract":"<jats:p>The world wine sector is a multi-billion dollar industry with a wide range of economic activities. Therefore, it becomes crucial to monitor the grapevine because it allows a more accurate estimation of the yield and ensures a high-quality end product. The most common way of monitoring the grapevine is through the leaves (preventive way) since the leaves first manifest biophysical lesions. However, this does not exclude the possibility of biophysical lesions manifesting in the grape berries. Thus, this work presents three pre-trained YOLO models (YOLOv5x6, YOLOv7-E6E, and YOLOR-CSP-X) to detect and classify grape bunches as healthy or damaged by the number of berries with biophysical lesions. Two datasets were created and made publicly available with original images and manual annotations to identify the complexity between detection (bunches) and classification (healthy or damaged) tasks. The datasets use the same 10,010 images with different classes. The Grapevine Bunch Detection Dataset uses the Bunch class, and The Grapevine Bunch Condition Detection Dataset uses the OptimalBunch and DamagedBunch classes. Regarding the three models trained for grape bunches detection, they obtained promising results, highlighting YOLOv7 with 77% of mAP and 94% of the F1-score. 
In the case of the task of detection and identification of the state of grape bunches, the three models obtained similar results, with YOLOv5 achieving the best ones with an mAP of 72% and an F1-score of 92%.<\/jats:p>","DOI":"10.3390\/agronomy13041120","type":"journal-article","created":{"date-parts":[[2023,4,14]],"date-time":"2023-04-14T09:23:43Z","timestamp":1681464223000},"page":"1120","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":37,"title":["Deep Learning YOLO-Based Solution for Grape Bunch Detection and Assessment of Biophysical Lesions"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-5387-0980","authenticated-orcid":false,"given":"Isabel","family":"Pinheiro","sequence":"first","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1382-8267","authenticated-orcid":false,"given":"Germano","family":"Moreira","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"Faculty of Sciences, University of Porto, 4169-007 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9999-1550","authenticated-orcid":false,"given":"Daniel","family":"Queir\u00f3s da Silva","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3095-197X","authenticated-orcid":false,"given":"Sandro","family":"Magalh\u00e3es","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5798-1298","authenticated-orcid":false,"given":"Ant\u00f3nio","family":"Valente","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4283-1243","authenticated-orcid":false,"given":"Paulo","family":"Moura Oliveira","sequence":"additional","affiliation":[{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8299-324X","authenticated-orcid":false,"given":"M\u00e1rio","family":"Cunha","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"Faculty of Sciences, University of Porto, 4169-007 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe","family":"Santos","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2023,4,14]]},"reference":[{"key":"ref_1","unstructured":"Statistics Department of the International Organisation of Vine and Wine (OIV) (2021). 
Annual Assessment of the World Vine and Wine Sector in 2021."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"326","DOI":"10.1093\/yiel\/yvab061","article-title":"Food and Agriculture Organization of the United Nations (FAO)","volume":"31","author":"Mekouar","year":"2020","journal-title":"Yearb. Int. Environ. Law"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"779","DOI":"10.5424\/sjar\/2009074-1092","article-title":"Precision Viticulture. Research topics, challenges and opportunities in site-specific vineyard management","volume":"7","author":"Casasnovas","year":"2009","journal-title":"Span. J. Agric. Res."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Mohimont, L., Alin, F., Rondeau, M., Gaveau, N., and Steffenel, L.A. (2022). Computer Vision and Deep Learning for Precision Viticulture. Agronomy, 12.","DOI":"10.3390\/agronomy12102463"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"100005","DOI":"10.1016\/j.atech.2021.100005","article-title":"Smart applications and digital technologies in viticulture: A review","volume":"1","author":"Tardaguila","year":"2021","journal-title":"Smart Agric. Technol."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"42","DOI":"10.1016\/j.eja.2015.10.008","article-title":"Pollen-based predictive modelling of wine production: Application to an arid region","volume":"73","author":"Cunha","year":"2016","journal-title":"Eur. J. Agron."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"67494","DOI":"10.1109\/ACCESS.2018.2875862","article-title":"Computer Vision and Machine Learning for Viticulture Technology","volume":"6","author":"Seng","year":"2018","journal-title":"IEEE Access"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"B\u00f6mer, J., Zabawa, L., Sieren, P., Kicherer, A., Klingbeil, L., Rascher, U., Muller, O., Kuhlmann, H., and Roscher, R. (2020, January 23\u201328). Automatic differentiation of damaged and unharmed grapes using rgb images and convolutional neural networks. Proceedings of the Computer Vision\u2014ECCV 2020 Workshops, Glasgow, UK.","DOI":"10.1007\/978-3-030-65414-6_24"},{"key":"ref_9","first-page":"346","article-title":"A Survey of Computer Vision Methods for Counting Fruits and Yield Prediction","volume":"2","author":"Syal","year":"2013","journal-title":"Int. J. Comput. Sci. Eng."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Mavridou, E., Vrochidou, E., Papakostas, G., Pachidis, T., and Kaburlasos, V. (2019). Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging, 5.","DOI":"10.3390\/jimaging5120089"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"85","DOI":"10.1016\/j.neunet.2014.09.003","article-title":"Deep learning in neural networks: An overview","volume":"61","author":"Schmidhuber","year":"2015","journal-title":"Neural Netw."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Rodrigues, L., Magalh\u00e3es, S.A., da Silva, D.Q., dos Santos, F.N., and Cunha, M. (2023). Computer Vision and Deep Learning as Tools for Leveraging Dynamic Phenological Classification in Vegetable Crops. Agronomy, 13.","DOI":"10.3390\/agronomy13020463"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Gulzar, Y. (2023). Fruit Image Classification Model Based on MobileNetV2 with Deep Transfer Learning Technique. 
Sustainability, 15.","DOI":"10.3390\/su15031906"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"142","DOI":"10.1109\/TPAMI.2015.2437384","article-title":"Region-based convolutional networks for accurate object detection and segmentation","volume":"38","author":"Girshick","year":"2015","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Girshick, R. (2015, January 7\u201313). Fast r-cnn. Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile.","DOI":"10.1109\/ICCV.2015.169"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.Y., and Berg, A.C. (2016, January 11\u201314). SSD: Single Shot MultiBox Detector. Proceedings of the 14th European Conference of Computer Vision\u2014ECCV 2016, Amsterdam, The Netherlands.","DOI":"10.1007\/978-3-319-46448-0_2"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 27\u201330). You only look once: Unified, real-time object detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"833","DOI":"10.20870\/oeno-one.2020.54.4.3616","article-title":"Yield components detection and image-based indicators for non-invasive grapevine yield prediction at different phenological phases","volume":"54","author":"Victorino","year":"2020","journal-title":"Oeno One"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"626989","DOI":"10.3389\/frobt.2021.626989","article-title":"Fruit Detection and Pose Estimation for Grape Cluster\u2013Harvesting Robot Using Binocular Imagery Based on Deep Neural Networks","volume":"8","author":"Yin","year":"2021","journal-title":"Front. Robot. AI"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Ghiani, L., Sassu, A., Palumbo, F., Mercenaro, L., and Gambella, F. (2021). In-Field Automatic Detection of Grape Bunches under a Totally Uncontrolled Environment. Sensors, 21.","DOI":"10.3390\/s21113908"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"105247","DOI":"10.1016\/j.compag.2020.105247","article-title":"Grape detection, segmentation and tracking using deep neural networks and three-dimensional association","volume":"170","author":"Santos","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Deng, G., Geng, T., He, C., Wang, X., He, B., and Duan, L. (2020, January 18\u201322). TSGYE: Two-Stage Grape Yield Estimation. Proceedings of the 27th International Conference (ICONIP 2020), Bangkok, Thailand.","DOI":"10.1007\/978-3-030-63820-7_66"},{"key":"ref_24","unstructured":"Heinrich, K., Roth, A., Breithaupt, L., M\u00f6ller, B., and Maresch, J. (2023, February 14). Yield Prognosis for the Agrarian Management of Vineyards Using Deep Learning for Object Counting. Available online: https:\/\/aisel.aisnet.org\/wi2019\/track05\/papers\/3\/."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Aguiar, A.S., Magalh\u00e3es, S.A., dos Santos, F.N., Castro, L., Pinho, T., Valente, J., Martins, R., and Boaventura-Cunha, J. (2021). Grape Bunch Detection at Different Growth Stages Using Deep Learning Quantized Models. 
Agronomy, 11.","DOI":"10.3390\/agronomy11091890"},{"key":"ref_26","unstructured":"Sozzi, M., Cantalamessa, S., Cogato, A., Kayad, A., and Marinello, F. (2021). Precision Agriculture, Wageningen Academic Publisher."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"347","DOI":"10.1016\/j.biosystemseng.2021.11.011","article-title":"A real-time table grape detection method based on improved YOLOv4-tiny network in complex background","volume":"212","author":"Li","year":"2021","journal-title":"Biosyst. Eng."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Sozzi, M., Cantalamessa, S., Cogato, A., Kayad, A., and Marinello, F. (2022). Automatic Bunch Detection in White Grape Varieties Using YOLOv3, YOLOv4, and YOLOv5 Deep Learning Algorithms. Agronomy, 12.","DOI":"10.3390\/agronomy12020319"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Zhang, C., Ding, H., Shi, Q., and Wang, Y. (2022). Grape Cluster Real-Time Detection in Complex Natural Scenes Based on YOLOv5s Deep Learning Network. Agriculture, 12.","DOI":"10.3390\/agriculture12081242"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"729097","DOI":"10.3389\/fpls.2022.729097","article-title":"Detection of Anomalous Grapevine Berries Using Variational Autoencoders","volume":"13","author":"Miranda","year":"2022","journal-title":"Front. Plant Sci."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"14","DOI":"10.1007\/s10846-022-01595-3","article-title":"Active perception fruit harvesting robots\u2014A systematic review","volume":"105","author":"Magalhaes","year":"2022","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Rakhmatulin, I., Kamilaris, A., and Andreasen, C. (2021). Deep neural networks to detect weeds from crops in agricultural environments in real-time: A review. Remote. Sens., 13.","DOI":"10.2139\/ssrn.3959386"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Fountas, S., Mylonas, N., Malounas, I., Rodias, E., Hellmann Santos, C., and Pekkeriet, E. (2020). Agricultural robotics for field operations. Sensors, 20.","DOI":"10.3390\/s20092672"},{"key":"ref_34","unstructured":"Meier, U. (1997). Growth Stages of Mono- and Dicotyledonous Plants, Blackwell Wissenschafts."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Padilla, R., Passos, W.L., Dias, T.L., Netto, S.L., and Da Silva, E.A. (2021). A comparative analysis of object detection metrics with a companion open-source toolkit. Electronics, 10.","DOI":"10.3390\/electronics10030279"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"70","DOI":"10.1016\/j.compag.2018.02.016","article-title":"Deep learning in agriculture: A survey","volume":"147","author":"Kamilaris","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_37","unstructured":"Jocher, G., Chaurasia, A., Stoken, A., Borovec, J., NanoCode012, Kwon, Y., Michael, K., TaoXie, Fang, J., and imyhxy (2023, February 14). Ultralytics\/yolov5: V7.0\u2014YOLOv5 SOTA Realtime Instance Segmentation. Available online: https:\/\/github.com\/ultralytics\/yolov5\/discussions\/10258."},{"key":"ref_38","unstructured":"Wang, C.Y., Bochkovskiy, A., and Liao, H.Y.M. (2022). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv."},{"key":"ref_39","unstructured":"Wang, C.Y., Yeh, I.H., and Liao, H.Y.M. (2021). You only learn one representation: Unified network for multiple tasks. 
arXiv."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Lin, T.Y., Maire, M., Belongie, S., Bourdev, L., Girshick, R., Hays, J., Perona, P., Ramanan, D., Zitnick, C.L., and Doll\u00e1r, P. (2014). Microsoft COCO: Common Objects in Context. arXiv.","DOI":"10.1007\/978-3-319-10602-1_48"}],"container-title":["Agronomy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-4395\/13\/4\/1120\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T19:16:07Z","timestamp":1760123767000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-4395\/13\/4\/1120"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,4,14]]},"references-count":40,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2023,4]]}},"alternative-id":["agronomy13041120"],"URL":"https:\/\/doi.org\/10.3390\/agronomy13041120","relation":{},"ISSN":["2073-4395"],"issn-type":[{"value":"2073-4395","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,4,14]]}}}
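The record above is a single work object returned by the Crossref REST API; its "message" field carries the bibliographic payload (title, authors, abstract, DOI, reference list). As a minimal, self-contained sketch of how such a record can be retrieved and read, the following Python uses only the standard library. The endpoint pattern https://api.crossref.org/works/{DOI} is the public Crossref REST API, the field names ("message", "title", "author", "references-count") are taken from the data shown above, and the helper name fetch_crossref_work is illustrative, not part of any library.

# Minimal sketch: fetch the Crossref record for the DOI in the data above and
# print a few of its fields. Assumes only the Python standard library.
import json
import urllib.request

DOI = "10.3390/agronomy13041120"  # DOI of the Agronomy article described above

def fetch_crossref_work(doi: str) -> dict:
    """Fetch one work record from the Crossref REST API and return its 'message' payload."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)          # top-level object: {"status": ..., "message": {...}}
    return payload["message"]

if __name__ == "__main__":
    work = fetch_crossref_work(DOI)
    # "title" is a list of strings; "author" is a list of dicts with "given"/"family" keys.
    print(work["title"][0])
    print(", ".join(f'{a.get("given", "")} {a.get("family", "")}'.strip()
                    for a in work.get("author", [])))
    print("References:", work.get("references-count"))

Run as-is, this would print the article title, the eight author names, and the reference count (40) seen in the record above.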