{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,18]],"date-time":"2026-01-18T01:06:01Z","timestamp":1768698361472,"version":"3.49.0"},"reference-count":53,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2023,2,4]],"date-time":"2023-02-04T00:00:00Z","timestamp":1675468800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Funds through the FCT\u2014Funda\u00e7\u00e3o para a Ci\u00eancia e a Tecnologia, I.P. (Portuguese Foundation for Science and Technology)","award":["PTDC\/ASP-HOR\/1338\/2021"],"award-info":[{"award-number":["PTDC\/ASP-HOR\/1338\/2021"]}]},{"name":"National Funds through the FCT\u2014Funda\u00e7\u00e3o para a Ci\u00eancia e a Tecnologia, I.P. (Portuguese Foundation for Science and Technology)","award":["857202"],"award-info":[{"award-number":["857202"]}]},{"name":"European Union\u2019s Horizon 2020 research and innovation programme","award":["PTDC\/ASP-HOR\/1338\/2021"],"award-info":[{"award-number":["PTDC\/ASP-HOR\/1338\/2021"]}]},{"name":"European Union\u2019s Horizon 2020 research and innovation programme","award":["857202"],"award-info":[{"award-number":["857202"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Agronomy"],"abstract":"<jats:p>The efficiency of agricultural practices depends on the timing of their execution. Environmental conditions, such as rainfall, and crop-related traits, such as plant phenology, determine the success of practices such as irrigation. Moreover, plant phenology, the seasonal timing of biological events (e.g., cotyledon emergence), is strongly influenced by genetic, environmental, and management conditions. 
Therefore, assessing the timing of crops\u2019 phenological events and their spatiotemporal variability can improve decision making, allowing the thorough planning and timely execution of agricultural operations. Conventional techniques for crop phenology monitoring, such as field observations, can be prone to error, labour-intensive, and inefficient, particularly for crops with rapid growth and poorly defined phenophases, such as vegetable crops. Thus, developing an accurate phenology monitoring system for vegetable crops is an important step towards sustainable practices. This paper evaluates the ability of computer vision (CV) techniques coupled with deep learning (DL) (CV_DL) as tools for the dynamic phenological classification of multiple vegetable crops at the subfield level, i.e., within the plot. Three DL models from the Single Shot Multibox Detector (SSD) architecture (SSD Inception v2, SSD MobileNet v2, and SSD ResNet 50) and one from the You Only Look Once (YOLO) architecture (YOLO v4) were benchmarked through a custom dataset containing images of eight vegetable crops between emergence and harvest. The proposed benchmark includes the individual pairing of each model with the images of each crop. On average, YOLO v4 performed better than the SSD models, reaching an F1-Score of 85.5%, a mean average precision of 79.9%, and a balanced accuracy of 87.0%. In addition, YOLO v4 was tested with all available data, approaching a real mixed cropping system. Hence, the same model can classify multiple vegetable crops across the growing season, allowing the accurate mapping of phenological dynamics. 
This study is the first to evaluate the potential of CV_DL for vegetable crops\u2019 phenological research, a pivotal step towards automating decision support systems for precision horticulture.<\/jats:p>","DOI":"10.3390\/agronomy13020463","type":"journal-article","created":{"date-parts":[[2023,2,6]],"date-time":"2023-02-06T04:08:07Z","timestamp":1675656487000},"page":"463","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":22,"title":["Computer Vision and Deep Learning as Tools for Leveraging Dynamic Phenological Classification in Vegetable Crops"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-4507-2481","authenticated-orcid":false,"given":"Leandro","family":"Rodrigues","sequence":"first","affiliation":[{"name":"Faculty of Sciences, University of Porto, Rua do Campo Alegre s\/n, 4169-007 Porto, Portugal"},{"name":"INESC TEC-Institute for Systems and Computer Engineering, Technology and Science, Rua Dr. Roberto Frias s\/n, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3095-197X","authenticated-orcid":false,"given":"Sandro Augusto","family":"Magalh\u00e3es","sequence":"additional","affiliation":[{"name":"INESC TEC-Institute for Systems and Computer Engineering, Technology and Science, Rua Dr. Roberto Frias s\/n, 4200-465 Porto, Portugal"},{"name":"Faculty of Engineering, University of Porto, Rua Dr. Roberto Frias s\/n, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9999-1550","authenticated-orcid":false,"given":"Daniel Queir\u00f3s","family":"da Silva","sequence":"additional","affiliation":[{"name":"INESC TEC-Institute for Systems and Computer Engineering, Technology and Science, Rua Dr. 
Roberto Frias s\/n, 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro (UTAD), 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe Neves","family":"dos Santos","sequence":"additional","affiliation":[{"name":"INESC TEC-Institute for Systems and Computer Engineering, Technology and Science, Rua Dr. Roberto Frias s\/n, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8299-324X","authenticated-orcid":false,"given":"M\u00e1rio","family":"Cunha","sequence":"additional","affiliation":[{"name":"Faculty of Sciences, University of Porto, Rua do Campo Alegre s\/n, 4169-007 Porto, Portugal"},{"name":"INESC TEC-Institute for Systems and Computer Engineering, Technology and Science, Rua Dr. Roberto Frias s\/n, 4200-465 Porto, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2023,2,4]]},"reference":[{"key":"ref_1","unstructured":"Lieth, H. (1974). Ecological Studies, Springer."},{"key":"ref_2","unstructured":"Liang, L. (2019). Reference Module in Earth Systems and Environmental Sciences, Elsevier."},{"key":"ref_3","first-page":"217","article-title":"Importance of phenological observations and predictions in agriculture","volume":"50","author":"Ruml","year":"2005","journal-title":"J. Agric. Sci."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Schwartz, M.D. (2013). 
Phenology: An Integrative Environmental Science, Springer.","DOI":"10.1007\/978-94-007-6925-0"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"67","DOI":"10.3897\/neobiota.41.29965","article-title":"Variation in phenology and overall performance traits can help to explain the plant invasion process amongst Mediterranean ecosystems","volume":"41","author":"Casado","year":"2018","journal-title":"NeoBiota"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"108042","DOI":"10.1016\/j.ecolind.2021.108042","article-title":"Assessing the inter-annual variability of vegetation phenological events observed from satellite vegetation index time series in dryland sites","volume":"130","author":"Kato","year":"2021","journal-title":"Ecol. Indic."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Kasampalis, D.A., Alexandridis, T.K., Deva, C., Challinor, A., Moshou, D., and Zalidis, G. (2018). Contribution of Remote Sensing on Crop Models: A Review. J. Imaging, 4.","DOI":"10.3390\/jimaging4040052"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1237","DOI":"10.1007\/s11430-019-9622-2","article-title":"Progress in plant phenology modeling under global climate change","volume":"63","author":"Fu","year":"2020","journal-title":"Sci. China Earth Sci."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"10","DOI":"10.1186\/s40066-020-00283-5","article-title":"Challenges and opportunities in crop simulation modelling under seasonal and projected climate change scenarios for crop production in South Africa","volume":"10","author":"Kephe","year":"2021","journal-title":"Agric. Food Secur."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"327","DOI":"10.1016\/j.agrformet.2018.11.002","article-title":"Monitoring crop phenology using a smartphone based near-surface remote sensing approach","volume":"265","author":"Hufkens","year":"2019","journal-title":"Agric. For. 
Meteorol."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Guo, Y., Chen, S., Fu, Y.H., Xiao, Y., Wu, W., Wang, H., and Beurs, K.d. (2022). Comparison of Multi-Methods for Identifying Maize Phenology Using PhenoCams. Remote Sens., 14.","DOI":"10.3390\/rs14020244"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Chac\u00f3n-Maldonado, A.M., Molina-Cabanillas, M.A., Troncoso, A., Mart\u00ednez-\u00c1lvarez, F., and Asencio-Cort\u00e9s, G. (2022, January 5\u20137). Olive Phenology Forecasting Using Information Fusion-Based Imbalanced Preprocessing and Automated Deep Learning. Proceedings of the Hybrid Artificial Intelligent Systems Conference, Salamanca, Spain.","DOI":"10.1007\/978-3-031-15471-3_24"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Milicevic, M., Zubrinic, K., Grbavac, I., and Obradovic, I. (2020). Application of Deep Learning Architectures for Accurate Detection of Olive Tree Flowering Phenophase. Remote Sens., 12.","DOI":"10.3390\/rs12132120"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Jing, H., Xiujuan, W., Haoyu, W., Xingrong, F., and Mengzhen, K. (2017, January 20\u201322). Prediction of crop phenology\u2014A component of parallel agriculture management. Proceedings of the 2017 Chinese Automation Congress, Jinan, China.","DOI":"10.1109\/CAC.2017.8244172"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Giordano, M., Petropoulos, S.A., and Rouphael, Y. (2021). Response and Defence Mechanisms of Vegetable Crops against Drought, Heat and Salinity Stress. 
Agriculture, 11.","DOI":"10.3390\/agriculture11050463"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"428","DOI":"10.1016\/j.tplants.2013.04.008","article-title":"Cell to whole-plant phenotyping: The best is yet to come","volume":"18","author":"Dhondt","year":"2013","journal-title":"Trends Plant Sci."},{"key":"ref_17","first-page":"183","article-title":"A role of computer vision in fruits and vegetables among various horticulture products of agriculture fields: A survey","volume":"7","author":"Tripathi","year":"2020","journal-title":"Inf. Process. Agric."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"69","DOI":"10.1016\/j.compag.2018.08.001","article-title":"Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review","volume":"153","author":"Rieder","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"2428","DOI":"10.1109\/TMECH.2017.2760866","article-title":"A Survey of Ranging and Imaging Techniques for Precision Agriculture Phenotyping","volume":"22","author":"Narvaez","year":"2017","journal-title":"IEEE\/ASME Trans. Mechatron."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1007\/s00521-021-06651-x","article-title":"A fast accurate fine-grain object detection model based on YOLOv4 deep neural network","volume":"34","author":"Roy","year":"2022","journal-title":"Neural Comput. Appl."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"10940","DOI":"10.1109\/ACCESS.2021.3050296","article-title":"Weed Identification Using Deep Learning and Image Processing in Vegetable Plantation","volume":"9","author":"Jin","year":"2021","journal-title":"IEEE Access"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Aguiar, A.S., Magalh\u00e3es, S.A., dos Santos, F.N., Castro, L., Pinho, T., Valente, J., Martins, R., and Boaventura-Cunha, J. (2021). 
Grape bunch detection at different growth stages using deep learning quantized models. Agronomy, 11.","DOI":"10.3390\/agronomy11091890"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"70","DOI":"10.1016\/j.compag.2018.02.016","article-title":"Deep learning in agriculture: A survey","volume":"147","author":"Kamilaris","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"123","DOI":"10.1038\/s41438-021-00560-9","article-title":"Applications of deep-learning approaches in horticultural research: A review","volume":"8","author":"Yang","year":"2021","journal-title":"Hortic. Res."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"805738","DOI":"10.3389\/fpls.2022.805738","article-title":"Deep Learning in Plant Phenological Research: A Systematic Literature Review","volume":"13","author":"Katal","year":"2022","journal-title":"Front. Plant Sci."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"diab017","DOI":"10.1093\/insilicoplants\/diab017","article-title":"Evolution and application of digital technologies to predict crop type and crop phenology in agriculture","volume":"3","author":"Potgieter","year":"2021","journal-title":"Silico Plants"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"187","DOI":"10.1016\/j.molp.2020.01.008","article-title":"Crop Phenomics and High-Throughput Phenotyping: Past Decades, Current Challenges, and Future Perspectives","volume":"13","author":"Yang","year":"2020","journal-title":"Mol. Plant"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1007\/s10681-022-02992-3","article-title":"Deep learning: As the new frontier in high-throughput plant phenotyping","volume":"218","author":"Arya","year":"2022","journal-title":"Euphytica"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Yalcin, H. (2017, January 7\u201310). Plant phenology recognition using deep learning: Deep-Pheno. 
Proceedings of the 6th International Conference on Agro-Geoinformatics, Fairfax, VA, USA.","DOI":"10.1109\/Agro-Geoinformatics.2017.8046996"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"154","DOI":"10.1007\/s11119-020-09734-2","article-title":"Real-time detection of rice phenology through convolutional neural network using handheld camera images","volume":"22","author":"Han","year":"2021","journal-title":"Precis. Agric."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"106866","DOI":"10.1016\/j.compag.2022.106866","article-title":"Monitoring crop phenology with street-level imagery using computer vision","volume":"196","author":"Yordanov","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Taylor, S.D., and Browning, D.M. (2022). Classification of Daily Crop Phenology in PhenoCams Using Deep Learning and Hidden Markov Models. Remote Sens., 14.","DOI":"10.3390\/rs14020286"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"106123","DOI":"10.1016\/j.compag.2021.106123","article-title":"DeepPhenology: Estimation of apple flower phenology distributions based on deep learning","volume":"185","author":"Wang","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Molina, M.\u00c1., Jim\u00e9nez-Navarro, M.J., Mart\u00ednez-\u00c1lvarez, F., and Asencio-Cort\u00e9s, G. (2021, January 22\u201324). A Model-Based Deep Transfer Learning Algorithm for Phenology Forecasting Using Satellite Imagery. Proceedings of the Hybrid Artificial Intelligent Systems, Bilbao, Spain.","DOI":"10.1007\/978-3-030-86271-8_43"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Pearse, G., Watt, M.S., Soewarto, J., and Tan, A.Y. (2021). Deep Learning and Phenology Enhance Large-Scale Tree Species Classification in Aerial Imagery during a Biosecurity Response. 
Remote Sens., 13.","DOI":"10.3390\/rs13091789"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"13151","DOI":"10.1109\/ACCESS.2020.2965462","article-title":"Leveraging Artificial Intelligence for Large-Scale Plant Phenology Studies From Noisy Time-Lapse Images","volume":"8","author":"Correia","year":"2020","journal-title":"IEEE Access"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"765","DOI":"10.1002\/rse2.275","article-title":"Automatic flower detection and phenology monitoring using time-lapse cameras and deep learning","volume":"8","author":"Mann","year":"2022","journal-title":"Remote Sens. Ecol. Conserv."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"361","DOI":"10.1016\/j.compag.2018.09.021","article-title":"AgroAVNET for crops and weeds classification: A step forward in automatic farming","volume":"154","author":"Chavan","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_39","unstructured":"Ofori, M., and El-Gayar, O. (2020, January 10\u201314). Towards Deep Learning for Weed Detection: Deep Convolutional Neural Network Architectures for Plant Seedling Classification. Proceedings of the Americas Conference on Information Systems, Virtual."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"103","DOI":"10.1186\/s13007-020-00647-9","article-title":"Deep learning-based detection of seedling development","volume":"16","author":"Samiei","year":"2020","journal-title":"Plant Methods"},{"key":"ref_41","unstructured":"Meier, U. (2018). (Ed.) Growth Stages of Mono- and Dicotyledonous Plants, Julius K\u00fchn-Institut."},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1007\/s11263-009-0275-4","article-title":"The Pascal Visual Object Classes (VOC) Challenge","volume":"88","author":"Everingham","year":"2010","journal-title":"Int. J. Comput. 
Vis."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Padilla, R., Passos, W.L., Dias, T.L.B., Netto, S.L., and da Silva, E.A.B. (2021). A Comparative Analysis of Object Detection Metrics with a Companion Open-Source Toolkit. Electronics, 10.","DOI":"10.3390\/electronics10030279"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Terra, F., Rodrigues, L., Magalh\u00e3es, S., Santos, F., Moura, P., and Cunha, M. (2021, January 20\u201322). PixelCropRobot, a cartesian multitask platform for microfarms automation. Proceedings of the 2021 International Symposium of Asian Control Association on Intelligent Robotics and Industrial Automation (IRIA), Goa, India.","DOI":"10.1109\/IRIA53009.2021.9588786"},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"21","DOI":"10.1007\/978-3-319-46448-0_2","article-title":"SSD: Single Shot MultiBox Detector","volume":"9905","author":"Liu","year":"2016","journal-title":"Lect. Notes Comput. Sci."},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"Magalh\u00e3es, S.A., Castro, L., Moreira, G., dos Santos, F.N., Cunha, M., Dias, J., and Moreira, A.P. (2021). Evaluating the Single-Shot MultiBox Detector and YOLO Deep Learning Models for the Detection of Tomatoes in a Greenhouse. Sensors, 21.","DOI":"10.3390\/s21103569"},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Moreira, G., Magalh\u00e3es, S.A., Pinho, T., dos Santos, F.N., and Cunha, M. (2022). Benchmark of Deep Learning and a Proposed HSV Colour Space Models for the Detection and Classification of Greenhouse Tomato. Agronomy, 12.","DOI":"10.3390\/agronomy12020356"},{"key":"ref_48","unstructured":"Ioffe, S., and Szegedy, C. (2015, January 6\u201311). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. 
Proceedings of the 32nd International Conference on International Conference on Machine Learning, Lille, France."},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., and Rabinovich, A. (2015, January 7\u201312). Going deeper with convolutions. Proceedings of the Conference on Computer Vision and Pattern Recognition, Boston, MA, USA.","DOI":"10.1109\/CVPR.2015.7298594"},{"key":"ref_50","doi-asserted-by":"crossref","unstructured":"Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.C. (2018, January 18\u201322). MobileNetV2: Inverted Residuals and Linear Bottlenecks. Proceedings of the Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00474"},{"key":"ref_51","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (July, January 26). Deep Residual Learning for Image Recognition. Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_52","unstructured":"Bochkovskiy, A., Wang, C.Y., and Liao, H.Y.M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv."},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"Brodersen, K.H., Ong, C.S., Stephan, K.E., and Buhmann, J.M. (2010, January 23\u201326). The Balanced Accuracy and Its Posterior Distribution. 
Proceedings of the 20th International Conference on Pattern Recognition, Istanbul, Turkey.","DOI":"10.1109\/ICPR.2010.764"}],"container-title":["Agronomy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-4395\/13\/2\/463\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T18:24:09Z","timestamp":1760120649000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-4395\/13\/2\/463"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,2,4]]},"references-count":53,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2023,2]]}},"alternative-id":["agronomy13020463"],"URL":"https:\/\/doi.org\/10.3390\/agronomy13020463","relation":{},"ISSN":["2073-4395"],"issn-type":[{"value":"2073-4395","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,2,4]]}}}