{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,16]],"date-time":"2026-04-16T08:04:10Z","timestamp":1776326650982,"version":"3.50.1"},"reference-count":51,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2021,2,5]],"date-time":"2021-02-05T00:00:00Z","timestamp":1612483200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Agriculture"],"abstract":"<jats:p>The development of robotic solutions for unstructured environments poses several challenges, mainly in achieving safe and reliable navigation. Agricultural environments are particularly unstructured and, therefore, challenging for the implementation of robotics. Mountain vineyards, built on steep-slope hills, are an example: they are characterized by satellite signal blockage, terrain irregularities, harsh ground inclinations, and other factors. All of these demand precise and reliable navigation algorithms so that robots can operate safely. This work proposes the detection of semantic natural landmarks to be used in Simultaneous Localization and Mapping algorithms. To this end, Deep Learning models were trained and deployed to detect vine trunks. As significant contributions, we make available a novel vine trunk dataset, called VineSet, consisting of more than 9000 images and the respective annotations for each trunk. VineSet was used to train state-of-the-art Single Shot Multibox Detector models. Additionally, we deployed these models in an Edge-AI fashion and achieved high frame rate execution. Finally, an assisted annotation tool was proposed to ease the dataset-building process and to improve the models incrementally. The experiments show that our trained models can detect trunks with an Average Precision of up to 84.16% and that our assisted annotation tool facilitates the annotation process, even in other areas of agriculture, such as orchards and forests. Additional experiments evaluated the impact of the amount of training data and compared Transfer Learning with training from scratch, verifying some theoretical assumptions.<\/jats:p>","DOI":"10.3390\/agriculture11020131","type":"journal-article","created":{"date-parts":[[2021,2,5]],"date-time":"2021-02-05T08:33:48Z","timestamp":1612514028000},"page":"131","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":26,"title":["Bringing Semantics to the Vineyard: An Approach on Deep Learning-Based Vine Trunk Detection"],"prefix":"10.3390","volume":"11","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-6909-0209","authenticated-orcid":false,"given":"Andr\u00e9 Silva","family":"Aguiar","sequence":"first","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9601-5693","authenticated-orcid":false,"given":"Nuno Namora","family":"Monteiro","sequence":"additional","affiliation":[{"name":"Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe Neves dos","family":"Santos","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3224-4926","authenticated-orcid":false,"given":"Eduardo J.","family":"Solteiro Pires","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9999-1550","authenticated-orcid":false,"given":"Daniel","family":"Silva","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0317-4714","authenticated-orcid":false,"given":"Armando Jorge","family":"Sousa","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8406-0064","authenticated-orcid":false,"given":"Jos\u00e9","family":"Boaventura-Cunha","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2021,2,5]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"289","DOI":"10.1016\/S0169-2046(03)00156-7","article-title":"The Alto Douro Wine Region greenway","volume":"68","author":"Andresen","year":"2004","journal-title":"Landsc. Urban Plan."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"429","DOI":"10.1007\/s10846-016-0340-5","article-title":"Towards a reliable robot for steep slope vineyards monitoring","volume":"83","author":"Sobreira","year":"2016","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Rold\u00e1n, J.J., del Cerro, J., Garz\u00f3n-Ramos, D., Garcia-Aunon, P., Garz\u00f3n, M., de Le\u00f3n, J., and Barrientos, A. (2018). Robots in Agriculture: State of Art and Practical Experiences. Service Robots, InTech.","DOI":"10.5772\/intechopen.69874"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Duckett, T., Pearson, S., Blackmore, S., Grieve, B., Chen, W.H., Cielniak, G., Cleaversmith, J., Dai, J., Davis, S., and Fox, C. (2018). Agricultural Robotics: The Future of Robotic Agriculture. arXiv.","DOI":"10.31256\/WP2018.2"},{"key":"ref_5","unstructured":"Dos Santos, F.N., Sobreira, H.M.P., Campos, D.F.B., Morais, R., Moreira, A.P.G.M., and Contente, O.M.S. (2015, January 8\u201310). Towards a Reliable Monitoring Robot for Mountain Vineyards. Proceedings of the 2015 IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"195","DOI":"10.1016\/j.compag.2011.07.007","article-title":"Optimized EIF-SLAM algorithm for precision agriculture mapping based on stems detection","volume":"78","author":"Cheein","year":"2011","journal-title":"Comput. Electron. Agric."},{"key":"ref_7","unstructured":"Moura Oliveira, P., Novais, P., and Reis, L.P. (2019). Monocular Visual Odometry Using Fisheye Lens Cameras. Progress in Artificial Intelligence, Springer International Publishing."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Aguiar, A.S., dos Santos, F.N., Cunha, J.B., Sobreira, H., and Sousa, A.J. (2020). Localization and Mapping for Robots in Agriculture and Forestry: A Survey. Robotics, 9.","DOI":"10.3390\/robotics9040097"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"3212","DOI":"10.1109\/TNNLS.2018.2876865","article-title":"Object Detection With Deep Learning: A Review","volume":"30","author":"Zhao","year":"2019","journal-title":"IEEE Trans. Neural Netw. Learn. Syst."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep Learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_11","unstructured":"Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep Learning, MIT Press."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"85","DOI":"10.1016\/j.neunet.2014.09.003","article-title":"Deep learning in neural networks: An overview","volume":"61","author":"Schmidhuber","year":"2015","journal-title":"Neural Netw."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1345","DOI":"10.1109\/TKDE.2009.191","article-title":"A Survey on Transfer Learning","volume":"22","author":"Pan","year":"2010","journal-title":"IEEE Trans. Knowl. Data Eng."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"9","DOI":"10.1186\/s40537-016-0043-6","article-title":"A survey of transfer learning","volume":"3","author":"Weiss","year":"2016","journal-title":"J. Big Data"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"14","DOI":"10.1016\/j.knosys.2015.01.010","article-title":"Transfer learning using computational intelligence: A survey","volume":"80","author":"Lu","year":"2015","journal-title":"Knowl. Based Syst."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1019","DOI":"10.1109\/TNNLS.2014.2330900","article-title":"Transfer Learning for Visual Categorization: A Survey","volume":"26","author":"Shao","year":"2015","journal-title":"IEEE Trans. Neural Netw. Learn. Syst."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Torrey, L., and Shavlik, J. (2009). Transfer Learning. Handbook of Research on Machine Learning Applications, IGI Global.","DOI":"10.4018\/978-1-60566-766-9.ch011"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Silva, M.F., Lu\u00eds Lima, J., Reis, L.P., Sanfeliu, A., and Tardioli, D. (2020). Deep Learning Applications in Agriculture: A Short Review. Robot 2019: Fourth Iberian Robotics Conference, Springer International Publishing.","DOI":"10.1007\/978-3-030-35990-4"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"70","DOI":"10.1016\/j.compag.2018.02.016","article-title":"Deep learning in agriculture: A survey","volume":"147","author":"Kamilaris","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"312","DOI":"10.1017\/S0021859618000436","article-title":"A review of the use of convolutional neural networks in agriculture","volume":"156","author":"Kamilaris","year":"2018","journal-title":"J. Agric. Sci."},{"key":"ref_21","first-page":"394","article-title":"Real-time vineyard trunk detection for a grapes harvesting robot via deep learning","volume":"Volume 11605","author":"Osten","year":"2021","journal-title":"Proceedings of the Thirteenth International Conference on Machine Vision"},{"key":"ref_22","unstructured":"Ren, S., He, K., Girshick, R., and Sun, J. (2015). Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. arXiv."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 27\u201330). You Only Look Once: Unified, Real-Time Object Detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"9975","DOI":"10.3390\/rs70809975","article-title":"aTrunk\u2014An ALS-Based Trunk Detection Algorithm","volume":"7","author":"Lamprecht","year":"2015","journal-title":"Remote Sens."},{"key":"ref_25","unstructured":"Shalal, N., Low, T., McCarthy, C., and Hancock, N. (2013, January 2\u20134). A preliminary evaluation of vision and laser sensing for tree trunk detection and orchard mapping. Proceedings of the Australasian Conference on Robotics and Automation (ACRA 2013), Sydney, Australia."},{"key":"ref_26","first-page":"20","article-title":"Trunk detection based on laser radar and vision data fusion","volume":"11","author":"Xue","year":"2018","journal-title":"Int. J. Agric. Biol. Eng."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"172","DOI":"10.1016\/j.compag.2016.09.002","article-title":"A novel tree trunk detection method for oil-palm plantation navigation","volume":"128","author":"Juman","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1075","DOI":"10.1002\/rob.21583","article-title":"A Pipeline for Trunk Detection in Trellis Structured Apple Orchards","volume":"32","author":"Bargoti","year":"2015","journal-title":"J. Field Robot."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"92","DOI":"10.1016\/j.biosystemseng.2018.06.002","article-title":"An automatic trunk-detection system for intensive olive harvesting with trunk shaker","volume":"172","year":"2018","journal-title":"Biosyst. Eng."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"17","DOI":"10.1016\/j.compind.2018.03.010","article-title":"Apple flower detection using deep convolutional networks","volume":"99","author":"Dias","year":"2018","journal-title":"Comput. Ind."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Zheng, Y.Y., Kong, J.L., Jin, X.B., Wang, X.Y., Su, T.L., and Zuo, M. (2019). CropDeep: The Crop Vision Dataset for Deep-Learning-Based Classification and Detection in Precision Agriculture. Sensors, 19.","DOI":"10.3390\/s19051058"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1107","DOI":"10.1007\/s11119-019-09642-0","article-title":"Deep learning for real-time fruit detection and orchard fruit load estimation: Benchmarking of \u2018MangoYOLO\u2019","volume":"20","author":"Koirala","year":"2019","journal-title":"Precis. Agric."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"417","DOI":"10.1016\/j.compag.2019.01.012","article-title":"Apple detection during different growth stages in orchards using the improved YOLO-V3 model","volume":"157","author":"Tian","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Bargoti, S., and Underwood, J. (June, January 29). Deep fruit detection in orchards. Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore.","DOI":"10.1109\/ICRA.2017.7989417"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Sa, I., Ge, Z., Dayoub, F., Upcroft, B., Perez, T., and McCool, C. (2016). DeepFruits: A Fruit Detection System Using Deep Neural Networks. Sensors, 16.","DOI":"10.3390\/s16081222"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Kirk, R., Cielniak, G., and Mangan, M. (2020). L*a*b*Fruits: A Rapid and Robust Outdoor Fruit Detection System Combining Bio-Inspired Features with One-Stage Deep Learning Networks. Sensors, 20.","DOI":"10.3390\/s20010275"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Li, W., Fu, H., Yu, L., and Cracknell, A. (2017). Deep Learning Based Oil Palm Tree Detection and Counting for High-Resolution Remote Sensing Images. Remote Sens., 9.","DOI":"10.3390\/rs9010022"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"17","DOI":"10.1016\/j.compag.2016.02.003","article-title":"Automatic moth detection from trap images for pest management","volume":"123","author":"Ding","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Zhong, Y., Gao, J., Lei, Q., and Zhou, Y. (2018). A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture. Sensors, 18.","DOI":"10.3390\/s18051489"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Steen, K.A., Christiansen, P., Karstoft, H., and J\u00f8rgensen, R.N. (2016). Using Deep Learning to Challenge Safety Standard for Highly Autonomous Machines in Agriculture. J. Imaging, 2.","DOI":"10.3390\/jimaging2010006"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2016, January 27\u201330). Deep Residual Learning for Image Recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_42","unstructured":"Redmon, J., and Farhadi, A. (2018). YOLOv3: An Incremental Improvement. arXiv."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"105535","DOI":"10.1016\/j.compag.2020.105535","article-title":"Vineyard trunk detection using deep learning\u2014An experimental device benchmark","volume":"175","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"77308","DOI":"10.1109\/ACCESS.2020.2989052","article-title":"Visual Trunk Detection Using Transfer Learning and a Deep Learning-Based Coprocessor","volume":"8","author":"Aguiar","year":"2020","journal-title":"IEEE Access"},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S.E., Fu, C., and Berg, A.C. (2015, January 6\u201312). SSD: Single Shot MultiBox Detector. Proceedings of the European Conference on Computer Vision, Zurich, Switzerland.","DOI":"10.1007\/978-3-319-46448-0_2"},{"key":"ref_46","unstructured":"Howard, A.G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv."},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., and Wojna, Z. (2015). Rethinking the Inception Architecture for Computer Vision. arXiv.","DOI":"10.1109\/CVPR.2016.308"},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., and Rabinovich, A. (2015, January 7\u201312). Going Deeper With Convolutions. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA.","DOI":"10.1109\/CVPR.2015.7298594"},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"684","DOI":"10.1017\/S0263574719000961","article-title":"Path Planning Aware of Robot\u2019s Center of Mass for Steep Slope Vineyards","volume":"38","author":"Santos","year":"2020","journal-title":"Robotica"},{"key":"ref_50","unstructured":"Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., Devin, M., Ghemawat, S., Irving, G., and Isard, M. (2016). TensorFlow: A system for large-scale machine learning. arXiv."},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1007\/s11263-009-0275-4","article-title":"The pascal visual object classes (voc) challenge","volume":"88","author":"Everingham","year":"2010","journal-title":"Int. J. Comput. Vis."}],"container-title":["Agriculture"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2077-0472\/11\/2\/131\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T05:20:08Z","timestamp":1760160008000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2077-0472\/11\/2\/131"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,2,5]]},"references-count":51,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2021,2]]}},"alternative-id":["agriculture11020131"],"URL":"https:\/\/doi.org\/10.3390\/agriculture11020131","relation":{},"ISSN":["2077-0472"],"issn-type":[{"value":"2077-0472","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,2,5]]}}}