{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,11]],"date-time":"2026-05-11T12:57:13Z","timestamp":1778504233206,"version":"3.51.4"},"reference-count":44,"publisher":"MDPI AG","issue":"17","license":[{"start":{"date-parts":[[2022,8,26]],"date-time":"2022-08-26T00:00:00Z","timestamp":1661472000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"PrunusBot project","award":["PDR2020-101-031358"],"award-info":[{"award-number":["PDR2020-101-031358"]}]},{"name":"EAFRD","award":["PDR2020-101-031358"],"award-info":[{"award-number":["PDR2020-101-031358"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Portable devices play an essential role where edge computing is necessary and mobility is required (e.g., robots in agriculture within remote-sensing applications). With the increasing applications of deep neural networks (DNNs) and accelerators for edge devices, several methods and applications have been proposed for simultaneous crop and weed detection. Although preliminary studies have investigated the performance of inference time for semantic segmentation of crops and weeds in edge devices, performance degradation has not been evaluated in detail when the required optimization is applied to the model for operation in such edge devices. This paper investigates the relationship between model tuning hyperparameters to improve inference time and its effect on segmentation performance. The study was conducted using semantic segmentation model DeeplabV3 with a MobileNet backbone. Different datasets (Cityscapes, PASCAL and ADE20K) were analyzed for a transfer learning strategy. 
The results show that, with a model hyperparameter depth multiplier (DM) of 0.5 and the TensorRT framework, segmentation performance in mean intersection over union (mIOU) decreased by 14.7% compared to that of a DM of 1.0 without TensorRT, while inference time improved by a factor of 14.8. At an image resolution of 1296\u00d7966, a segmentation performance of 64% mIOU and an inference rate of 5.9 frames per second (FPS) were achieved on the Jetson Nano device. With an input image resolution of 513\u00d7513 and hyperparameters output stride OS = 32 and DM = 0.5, an inference time of 0.04 s was achieved, resulting in 25 FPS. The results presented in this paper provide deeper insight into how the performance of a semantic segmentation model for crops and weeds degrades when optimization is applied to adapt the model to run on edge devices. Lastly, an application is described for the semantic segmentation of weeds, embedded in the edge device (Jetson Nano) and integrated with the orchard robot. 
The results show good spraying accuracy and feasibility of the method.<\/jats:p>","DOI":"10.3390\/rs14174217","type":"journal-article","created":{"date-parts":[[2022,8,30]],"date-time":"2022-08-30T01:37:55Z","timestamp":1661823475000},"page":"4217","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":39,"title":["Real-Time Weed Control Application Using a Jetson Nano Edge Device and a Spray Mechanism"],"prefix":"10.3390","volume":"14","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-6027-7763","authenticated-orcid":false,"given":"Eduardo","family":"Assun\u00e7\u00e3o","sequence":"first","affiliation":[{"name":"C-MAST Center for Mechanical and Aerospace Science and Technologies, University of Beira Interior, 6201-001 Covilh\u00e3, Portugal"},{"name":"Department of Electromechanical Engineering, University of Beira Interior, Rua Marqu\u00eas d\u2019\u00c1vila e Bolama, 6201-001 Covilh\u00e3, Portugal"},{"name":"Instituto de Telecomunica\u00e7\u00f5es, Department of Computer Science, University of Beira Interior, 6201-001 Covilh\u00e3, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1691-1709","authenticated-orcid":false,"given":"Pedro D.","family":"Gaspar","sequence":"additional","affiliation":[{"name":"C-MAST Center for Mechanical and Aerospace Science and Technologies, University of Beira Interior, 6201-001 Covilh\u00e3, Portugal"},{"name":"Department of Electromechanical Engineering, University of Beira Interior, Rua Marqu\u00eas d\u2019\u00c1vila e Bolama, 6201-001 Covilh\u00e3, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8599-6737","authenticated-orcid":false,"given":"Ricardo","family":"Mesquita","sequence":"additional","affiliation":[{"name":"Department of Electromechanical Engineering, University of Beira Interior, Rua Marqu\u00eas d\u2019\u00c1vila e Bolama, 6201-001 Covilh\u00e3, 
Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-6599-0688","authenticated-orcid":false,"given":"Maria P.","family":"Sim\u00f5es","sequence":"additional","affiliation":[{"name":"School of Agriculture, Polytechnic Institute of Castelo Branco, 6000-084 Castelo Branco, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2319-8211","authenticated-orcid":false,"given":"Khadijeh","family":"Alibabaei","sequence":"additional","affiliation":[{"name":"C-MAST Center for Mechanical and Aerospace Science and Technologies, University of Beira Interior, 6201-001 Covilh\u00e3, Portugal"},{"name":"Department of Electromechanical Engineering, University of Beira Interior, Rua Marqu\u00eas d\u2019\u00c1vila e Bolama, 6201-001 Covilh\u00e3, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6901-795X","authenticated-orcid":false,"given":"Andr\u00e9","family":"Veiros","sequence":"additional","affiliation":[{"name":"Department of Electromechanical Engineering, University of Beira Interior, Rua Marqu\u00eas d\u2019\u00c1vila e Bolama, 6201-001 Covilh\u00e3, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2551-8570","authenticated-orcid":false,"given":"Hugo","family":"Proen\u00e7a","sequence":"additional","affiliation":[{"name":"Instituto de Telecomunica\u00e7\u00f5es, Department of Computer Science, University of Beira Interior, 6201-001 Covilh\u00e3, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2022,8,26]]},"reference":[{"key":"ref_1","unstructured":"Sim\u00f5es, M. (2017). +P\u00eassego \u2013 Resultados de Apoio \u00e0 Gest\u00e3o, Centro Operativo e Tecnol\u00f3gico Hortofrut\u00edcola Nacional. Technical Report."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Gaspar, P.D., and Lima, T.M. (2020, January 8\u20139). Modeling evapotranspiration using Encoder-Decoder Model. 
Proceedings of the 2020 International Conference on Decision Aid Sciences and Application (DASA), Sakheer, Bahrain.","DOI":"10.1109\/DASA51403.2020.9317100"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Assun\u00e7\u00e3o, E., Diniz, C., Gaspar, P.D., and Proen\u00e7a, H. (2020, January 8\u20139). Decision-making support system for fruit diseases classification using Deep Learning. Proceedings of the 2020 International Conference on Decision Aid Sciences and Application (DASA), Sakheer, Bahrain.","DOI":"10.1109\/DASA51403.2020.9317219"},{"key":"ref_4","unstructured":"Shanmugam, S., Assun\u00e7\u00e3o, E., Mesquita, R., Veiros, A., and Gaspar, P.D. (2020). Automated weed detection systems: A review. KnE Eng., 271\u2013284. Available online: http:\/\/3.65.204.3\/index.php\/KnE-Engineering\/article\/view\/7046."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Cunha, J., Gaspar, P.D., Assun\u00e7\u00e3o, E., and Mesquita, R. (2021, January 13\u201316). Prediction of the Vigor and Health of Peach Tree Orchard. Proceedings of the International Conference on Computational Science and Its Applications, Cagliari, Italy.","DOI":"10.1007\/978-3-030-86970-0_38"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Mesquita, R., and Gaspar, P.D. (2021). A Novel Path Planning Optimization Algorithm Based on Particle Swarm Optimization for UAVs for Bird Monitoring and Repelling. Processes, 10.","DOI":"10.3390\/pr10010062"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Gaspar, P.D., and Lima, T.M. (2021). Modeling soil water content and reference evapotranspiration from climate data using deep learning method. Appl. Sci., 11.","DOI":"10.3390\/app11115029"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Gaspar, P.D., and Lima, T.M. (2021). Crop yield estimation using deep learning based on climate big data and irrigation scheduling. 
Energies, 14.","DOI":"10.3390\/en14113004"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Gaspar, P.D., Lima, T.M., Campos, R.M., Gir\u00e3o, I., Monteiro, J., and Lopes, C.M. (2022). A Review of the Challenges of Using Deep Learning Algorithms to Support Decision-Making in Agricultural Activities. Remote Sens., 14.","DOI":"10.3390\/rs14030638"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"107480","DOI":"10.1016\/j.agwat.2022.107480","article-title":"Irrigation optimization with a deep reinforcement learning model: Case study on a site in Portugal","volume":"263","author":"Alibabaei","year":"2022","journal-title":"Agric. Water Manag."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Gaspar, P.D., Assun\u00e7\u00e3o, E., Alirezazadeh, S., Lima, T.M., Soares, V.N., and Caldeira, J.M. (2022). Comparison of On-Policy Deep Reinforcement Learning A2C with Off-Policy DQN in Irrigation Optimization: A Case Study at a Site in Portugal. Computers, 11.","DOI":"10.3390\/computers11070104"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Alibabaei, K., Assun\u00e7\u00e3o, E., Gaspar, P.D., Soares, V.N., and Caldeira, J.M. (2022). Real-Time Detection of Vine Trunk for Robot Localization Using Deep Learning Models Developed for Edge TPU Devices. Future Internet, 14.","DOI":"10.3390\/fi14070199"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"428","DOI":"10.1016\/j.sysarc.2019.01.011","article-title":"A Survey on optimized implementation of deep learning models on the NVIDIA Jetson platform","volume":"97","author":"Mittal","year":"2019","journal-title":"J. Syst. Archit."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Milioto, A., Lottes, P., and Stachniss, C. (2018, January 21\u201325). Real-Time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs. 
Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia.","DOI":"10.1109\/ICRA.2018.8460962"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"1344","DOI":"10.1109\/LRA.2017.2667039","article-title":"Mixtures of Lightweight Deep Convolutional Neural Networks: Applied to Agricultural Robotics","volume":"2","author":"McCool","year":"2017","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Khan, A., Ilyas, T., Umraiz, M., Mannan, Z.I., and Kim, H. (2020). CED-Net: Crops and Weeds Segmentation for Smart Farming Using a Small Cascaded Encoder-Decoder Architecture. Electronics, 9.","DOI":"10.3390\/electronics9101602"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"81724","DOI":"10.1109\/ACCESS.2020.2991354","article-title":"Semantic Segmentation of Crop and Weed using an Encoder-Decoder Network and Image Enhancement Method under Uncontrolled Outdoor Illumination","volume":"8","author":"Wang","year":"2020","journal-title":"IEEE Access"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Fawakherji, M., Youssef, A., Bloisi, D., Pretto, A., and Nardi, D. (2019, January 25\u201327). Crop and Weeds Classification for Precision Agriculture Using Context-Independent Pixel-Wise Segmentation. Proceedings of the 2019 Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy.","DOI":"10.1109\/IRC.2019.00029"},{"key":"ref_19","unstructured":"Olsen, A. (2020). Improving the Accuracy of Weed Species Detection for Robotic Weed Control in Complex Real-Time Environments. [Ph.D. Thesis, James Cook University]."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"339","DOI":"10.1016\/j.compag.2018.12.048","article-title":"Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence","volume":"157","author":"Partel","year":"2019","journal-title":"Comput. 
Electron. Agric."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"105091","DOI":"10.1016\/j.compag.2019.105091","article-title":"Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure","volume":"167","author":"Abdalla","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_22","first-page":"535","article-title":"Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network","volume":"7","author":"Asad","year":"2020","journal-title":"Inf. Process. Agric."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"e0215676","DOI":"10.1371\/journal.pone.0215676","article-title":"Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields","volume":"14","author":"Ma","year":"2019","journal-title":"PLoS ONE"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Lameski, P., Zdravevski, E., Trajkovik, V., and Kulakov, A. (2017, January 18\u201323). Weed detection dataset with RGB images taken under variable light conditions. Proceedings of the International Conference on ICT Innovations, Skopje, Macedonia.","DOI":"10.1007\/978-3-319-67597-8_11"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Naushad, R., Kaur, T., and Ghaderpour, E. (2021). Deep Transfer Learning for Land Use and Land Cover Classification: A Comparative Study. Sensors, 21.","DOI":"10.3390\/s21238083"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"341","DOI":"10.3390\/signals3020022","article-title":"An Empirical Study on Ensemble of Segmentation Approaches","volume":"3","author":"Nanni","year":"2022","journal-title":"Signals"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Hadidi, R., Cao, J., Xie, Y., Asgari, B., Krishna, T., and Kim, H. (2019, January 3\u20135). 
Characterizing the Deployment of Deep Neural Networks on Commercial Edge Devices. Proceedings of the 2019 IEEE International Symposium on Workload Characterization (IISWC), Orlando, FL, USA.","DOI":"10.1109\/IISWC47752.2019.9041955"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Haug, S., and Ostermann, J. (2014, January 6\u201312). A crop\/weed field image dataset for the evaluation of computer vision based precision agriculture tasks. Proceedings of the European Conference on Computer Vision, Zurich, Switzerland.","DOI":"10.1007\/978-3-319-16220-1_8"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Long, J., Shelhamer, E., and Darrell, T. (2015, January 7\u201312). Fully convolutional networks for semantic segmentation. Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA.","DOI":"10.1109\/CVPR.2015.7298965"},{"key":"ref_30","first-page":"3523","article-title":"Image Segmentation Using Deep Learning: A Survey","volume":"44","author":"Minaee","year":"2021","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_31","first-page":"234","article-title":"U-Net: Convolutional Networks for Biomedical Image Segmentation","volume":"Volume 9351","author":"Ronneberger","year":"2015","journal-title":"Medical Image Computing and Computer-Assisted Intervention (MICCAI)"},{"key":"ref_32","unstructured":"Chen, L.C., Papandreou, G., Kokkinos, I., Murphy, K., and Yuille, A.L. (2016). Semantic Image Segmentation with Deep Convolutional Nets and Fully Connected CRFs. arXiv."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L. (2018, January 18\u201322). MobileNetV2: Inverted Residuals and Linear Bottlenecks. 
Proceedings of the 2018 IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00474"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Chen, L.C., Zhu, Y., Papandreou, G., Schroff, F., and Adam, H. (2018, January 8\u201314). Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation. Proceedings of the ECCV 2018, Munich, Germany.","DOI":"10.1007\/978-3-030-01234-2_49"},{"key":"ref_35","unstructured":"Chen, C., Du, X., Hou, L., Kim, J., Li, J., Li, Y., Rashwan, A., Yang, F., and Yu, H. (2022, July 30). TensorFlow Official Model Garden. Available online: https:\/\/github.com\/tensorflow\/models\/tree\/master\/official."},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Chu, B., Madhavan, V., Beijbom, O., Hoffman, J., and Darrell, T. (2016, October 8\u201316). Best Practices for Fine-Tuning Visual Classifiers to New Domains. Proceedings of the ECCV Workshops 2016, Amsterdam, The Netherlands.","DOI":"10.1007\/978-3-319-49409-8_34"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Cordts, M., Omran, M., Ramos, S., Rehfeld, T., Enzweiler, M., Benenson, R., Franke, U., Roth, S., and Schiele, B. (2016, January 27\u201330). The Cityscapes Dataset for Semantic Urban Scene Understanding. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.350"},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Zhou, B., Zhao, H., Puig, X., Fidler, S., Barriuso, A., and Torralba, A. (2017, January 21\u201326). Scene Parsing through ADE20K Dataset. 
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.544"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1007\/s11263-009-0275-4","article-title":"The Pascal Visual Object Classes (VOC) Challenge","volume":"88","author":"Everingham","year":"2010","journal-title":"Int. J. Comput. Vis."},{"key":"ref_40","unstructured":"NVIDIA (2022, July 30). TensorRT Release Notes. Available online: https:\/\/docs.nvidia.com\/deeplearning\/tensorrt\/release-notes\/."},{"key":"ref_41","unstructured":"NVIDIA (2022, July 30). NVIDIA TensorRT. Available online: https:\/\/developer.nvidia.com\/tensorrt."},{"key":"ref_42","unstructured":"Tang, R., Adhikari, A., and Lin, J. (2018). FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks. arXiv."},{"key":"ref_43","unstructured":"Veiros, A., Mesquita, R., Gaspar, P.D., and Sim\u00f5es, M.P. (2022, May 30\u2013June 3). Multitask Robotic rover for agricultural activities (R2A2): A robotic platform for peach culture. Proceedings of the X International Peach Symposium, Naoussa, Greece."},{"key":"ref_44","unstructured":"Yu, H., Chen, C., Du, X., Li, Y., Rashwan, A., Hou, L., Jin, P., Yang, F., Liu, F., and Kim, J. (2022, July 30). TensorFlow DeepLab Model Zoo. 
Available online: https:\/\/github.com\/tensorflow\/models\/blob\/master\/research\/deeplab\/g3doc\/model_zoo.md\/."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/17\/4217\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T00:16:10Z","timestamp":1760141770000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/17\/4217"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,8,26]]},"references-count":44,"journal-issue":{"issue":"17","published-online":{"date-parts":[[2022,9]]}},"alternative-id":["rs14174217"],"URL":"https:\/\/doi.org\/10.3390\/rs14174217","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,8,26]]}}}