{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,24]],"date-time":"2026-02-24T16:22:49Z","timestamp":1771950169890,"version":"3.50.1"},"reference-count":62,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2025,2,25]],"date-time":"2025-02-25T00:00:00Z","timestamp":1740441600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001659","name":"Deutsche Forschungsgemeinschaft","doi-asserted-by":"publisher","award":["EXC-2070-390732324"],"award-info":[{"award-number":["EXC-2070-390732324"]}],"id":[{"id":"10.13039\/501100001659","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Robot. AI"],"abstract":"<jats:p>Robust perception systems allow farm robots to recognize weeds and vegetation, enabling the selective application of fertilizers and herbicides to mitigate the environmental impact of traditional agricultural practices. Today\u2019s perception systems typically rely on deep learning to interpret sensor data for tasks such as distinguishing soil, crops, and weeds. These approaches usually require substantial amounts of manually labeled training data, which is time-consuming to obtain and requires domain expertise. This paper aims to reduce this limitation and proposes an automated labeling pipeline for crop-weed semantic image segmentation in managed agricultural fields. It allows the training of deep learning models with no or only limited manual labeling of images. Our system uses RGB images recorded with unmanned aerial or ground robots operating in the field to produce semantic labels, exploiting the field row structure for spatially consistent labeling. We use the previously detected rows to identify multiple crop rows, reducing labeling errors and improving consistency. We further reduce labeling errors by assigning an \u201cunknown\u201d class to vegetation that is challenging to segment. We use evidential deep learning because it provides prediction uncertainty estimates that we use to refine and improve our predictions. In this way, evidential deep learning assigns high uncertainty to the weed class, as it is often underrepresented in the training data, allowing us to use the uncertainty to correct the semantic predictions. Experimental results suggest that our approach outperforms general-purpose labeling methods applied to crop fields by a large margin, as well as domain-specific approaches, on multiple fields and crop species. Using our generated labels to train deep learning models boosts prediction performance on previously unseen fields with unseen crop species, growth stages, or different lighting conditions. We obtain an IoU of 88.6% on crops and 22.7% on weeds for a managed field of sugar beets, where fully supervised methods achieve 83.4% on crops and 33.5% on weeds, and other unsupervised domain-specific methods achieve 54.6% on crops and 11.2% on weeds. Finally, our method allows fine-tuning models trained in a fully supervised fashion to improve their performance in unseen field conditions by up to +17.6% in mean IoU without additional manual labeling.<\/jats:p>","DOI":"10.3389\/frobt.2025.1548143","type":"journal-article","created":{"date-parts":[[2025,2,25]],"date-time":"2025-02-25T05:10:33Z","timestamp":1740460233000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":4,"title":["Unsupervised semantic label generation in agricultural fields"],"prefix":"10.3389","volume":"12","author":[{"given":"Gianmarco","family":"Roggiolani","sequence":"first","affiliation":[]},{"given":"Julius","family":"R\u00fcckin","sequence":"additional","affiliation":[]},{"given":"Marija","family":"Popovi\u0107","sequence":"additional","affiliation":[]},{"given":"Jens","family":"Behley","sequence":"additional","affiliation":[]},{"given":"Cyrill","family":"Stachniss","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2025,2,25]]},"reference":[{"key":"B1","doi-asserted-by":"publisher","first-page":"243","DOI":"10.1016\/j.inffus.2021.05.008","article-title":"A review of uncertainty quantification in deep learning: techniques, applications and challenges","volume":"76","author":"Abdar","year":"2021","journal-title":"Inf. fusion"},{"key":"B2","doi-asserted-by":"crossref","DOI":"10.1109\/ICRA40945.2020.9197114","article-title":"Visual servoing-based navigation for monitoring row-crop fields","volume-title":"Proc. Of the IEEE intl. Conf. On robotics and automation (ICRA)","author":"Ahmadi","year":"2020"},{"key":"B3","doi-asserted-by":"publisher","first-page":"1441371","DOI":"10.3389\/frobt.2024.1441371","article-title":"Targeted weed management of palmer amaranth using robotics and deep learning (yolov7)","volume":"11","author":"Balabantaray","year":"2024","journal-title":"Front. Robotics AI"},{"key":"B4","first-page":"9368","article-title":"The power of ensembles for active learning in image classification","volume-title":"Proc. Of the IEEE\/CVF conf. On computer vision and pattern recognition (CVPR)","author":"Beluch","year":""},{"key":"B5","doi-asserted-by":"crossref","DOI":"10.1109\/CVPR.2018.00976","article-title":"The power of ensembles for active learning in image classification","volume-title":"Proc. Of the IEEE\/CVF conf. On computer vision and pattern recognition (CVPR)","author":"Beluch","year":""},{"key":"B6","doi-asserted-by":"publisher","first-page":"627067","DOI":"10.3389\/frobt.2021.627067","article-title":"Towards a machine vision-based yield monitor for the counting and quality mapping of shallots","volume":"8","author":"Boatswain Jacques","year":"2021","journal-title":"Front. Robotics AI"},{"key":"B7","doi-asserted-by":"publisher","first-page":"7","DOI":"10.1002\/rob.21869","article-title":"Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture","volume":"37","author":"Bosilj","year":"2020","journal-title":"J. Field Robotics (JFR)"},{"key":"B8","doi-asserted-by":"publisher","first-page":"679","DOI":"10.1109\/TPAMI.1986.4767851","article-title":"A computational approach to edge detection","volume":"8","author":"Canny","year":"1986","journal-title":"IEEE Trans. Pattern Analysis Mach. Intell. (TPAMI)"},{"key":"B9","doi-asserted-by":"crossref","DOI":"10.1007\/978-3-031-25066-8_9","article-title":"Swin-unet: unet-like pure transformer for medical image segmentation","volume-title":"Proc. Of the europ. Conf. On computer vision (ECCV)","author":"Cao","year":"2023"},{"key":"B10","doi-asserted-by":"publisher","first-page":"1418201","DOI":"10.3389\/fpls.2024.1418201","article-title":"Weakly supervised localization model for plant disease based on siamese networks","volume":"15","author":"Chen","year":"2024","journal-title":"Front. Plant Sci."},{"key":"B11","volume-title":"Rethinking atrous convolution for semantic image segmentation","author":"Chen","year":"2017"},{"key":"B12","article-title":"A simple framework for contrastive learning of visual representations","volume-title":"Proc. Of the intl. Conf. On machine learning (ICML)","author":"Chen","year":"2020"},{"key":"B13","first-page":"15334","article-title":"Boundary iou: improving object-centric image segmentation evaluation","volume-title":"Proc. Of the IEEE\/CVF conf. On computer vision and pattern recognition (CVPR)","author":"Cheng","year":"2021"},{"key":"B14","doi-asserted-by":"publisher","first-page":"48","DOI":"10.3390\/machines11010048","article-title":"Recent advancements in agriculture robots: benefits and challenges","volume":"11","author":"Cheng","year":"2023","journal-title":"Machines"},{"key":"B15","doi-asserted-by":"publisher","first-page":"e0239591","DOI":"10.1371\/journal.pone.0239591","article-title":"A novel nir-image segmentation method for the precise estimation of above-ground biomass in rice crops","volume":"15","author":"Colorado","year":"2020","journal-title":"PLOS ONE"},{"key":"B16","doi-asserted-by":"publisher","first-page":"1344958","DOI":"10.3389\/fpls.2024.1344958","article-title":"Improving u-net network for semantic segmentation of corns and weeds during corn seedling stage in field","volume":"15","author":"Cui","year":"2024","journal-title":"Front. Plant Sci."},{"key":"B17","doi-asserted-by":"publisher","first-page":"1298791","DOI":"10.3389\/fpls.2024.1298791","article-title":"Granoscan: an ai-powered mobile app for in-field identification of biotic threats of wheat","volume":"15","author":"Dainelli","year":"2024","journal-title":"Front. Plant Sci."},{"key":"B18","doi-asserted-by":"crossref","DOI":"10.1109\/CVPR.2009.5206848","article-title":"Imagenet: a large-scale hierarchical image database","volume-title":"Proc. Of the IEEE conf. On computer vision and pattern recognition (CVPR)","author":"Deng","year":"2009"},{"key":"B19","article-title":"An image is worth 16x16 words: transformers for image recognition at scale","volume-title":"Proc. Of the intl. Conf. On learning representations (ICLR)","author":"Dosovitskiy","year":"2021"},{"key":"B20","doi-asserted-by":"publisher","first-page":"303","DOI":"10.1007\/s11263-009-0275-4","article-title":"The pascal visual object classes (VOC) challenge","volume":"88","author":"Everingham","year":"2010","journal-title":"Intl. J. Comput. Vis. (IJCV)"},{"key":"B21","doi-asserted-by":"publisher","first-page":"351","DOI":"10.1146\/annurev-resource-102422-090105","article-title":"Agroecology for a sustainable agriculture and food system: from local solutions to large-scale adoption","volume":"15","author":"Ewert","year":"2023","journal-title":"Annu. Rev. Resour. Econ."},{"key":"B22","doi-asserted-by":"publisher","first-page":"167","DOI":"10.1023\/b:visi.0000022288.19776.77","article-title":"Efficient graph-based image segmentation","volume":"59","author":"Felzenszwalb","year":"2004","journal-title":"Intl. J. Comput. Vis. (IJCV)"},{"key":"B23","article-title":"Dropout as a bayesian approximation: representing model uncertainty in deep learning","volume-title":"Proc. Of the intl. Conf. On machine learning (ICML)","author":"Gal","year":"2016"},{"key":"B24","doi-asserted-by":"publisher","first-page":"164123","DOI":"10.1016\/j.ijleo.2019.164123","article-title":"A wavelet transform-based image segmentation method","volume":"208","author":"Gao","year":"2020","journal-title":"Intl. J. Light Electron Opt."},{"key":"B25","article-title":"Unsupervised semantic segmentation by distilling feature correspondences","volume-title":"Proc. Of the intl. Conf. On learning representations (ICLR)","author":"Hamilton","year":"2022"},{"key":"B26","doi-asserted-by":"crossref","DOI":"10.1109\/ICCV.2017.322","article-title":"Mask R-CNN","volume-title":"Proc. Of the IEEE intl. Conf. On computer vision (ICCV)","author":"He","year":"2017"},{"key":"B27","doi-asserted-by":"publisher","first-page":"2112","DOI":"10.3390\/agriculture13112112","article-title":"Optimal coverage path planning for agricultural vehicles with curvature constraints","volume":"13","author":"H\u00f6ffmann","year":"2023","journal-title":"Agriculture"},{"key":"B28","doi-asserted-by":"publisher","first-page":"445","DOI":"10.1289\/ehp.02110445","article-title":"How sustainable agriculture can address the environmental and human health harms of industrial agriculture","volume":"110","author":"Horrigan","year":"2002","journal-title":"Environ. health Perspect."},{"key":"B29","article-title":"Machine analysis of bubble chamber pictures","volume-title":"Proc. Of the intl. Conf. On high-energy accelerators and instrumentation","author":"Hough","year":"1959"},{"key":"B30","article-title":"Adam: a method for stochastic optimization","volume-title":"Proc. Of the intl. Conf. On learning representations (ICLR)","author":"Kingma","year":"2015"},{"key":"B31","article-title":"Simple and scalable predictive uncertainty estimation using deep ensembles","volume-title":"Proc. Of the conf. Neural information processing systems (NIPS)","author":"Lakshminarayanan","year":"2017"},{"key":"B32","doi-asserted-by":"crossref","DOI":"10.1109\/CVPR52688.2022.01639","article-title":"Weakly supervised semantic segmentation using out-of-distribution data","volume-title":"Proc. Of the IEEE\/CVF conf. On computer vision and pattern recognition (CVPR)","author":"Lee","year":"2022"},{"key":"B33","doi-asserted-by":"crossref","DOI":"10.1007\/978-3-319-10602-1_48","article-title":"Microsoft COCO: common objects in context","volume-title":"Proc. Of the europ. Conf. On computer vision (ECCV)","author":"Lin","year":"2014"},{"key":"B34","doi-asserted-by":"crossref","DOI":"10.1109\/ICRA.2016.7487720","article-title":"An effective classification system for separating sugar beets and weeds for precision farming applications","volume-title":"Proc. Of the IEEE intl. Conf. On robotics and automation (ICRA)","author":"Lottes","year":"2016"},{"key":"B35","doi-asserted-by":"publisher","first-page":"1160","DOI":"10.1002\/rob.21675","article-title":"Effective vision-based classification for separating sugar beets and weeds for precision farming","volume":"34","author":"Lottes","year":"2017","journal-title":"J. Field Robotics (JFR)"},{"key":"B36","doi-asserted-by":"crossref","DOI":"10.1109\/IROS.2017.8206403","article-title":"Semi-supervised online visual crop and weed classification in precision farming exploiting plant arrangement","volume-title":"Proc. Of the IEEE\/RSJ intl. Conf. On intelligent robots and systems (IROS)","author":"Lottes","year":"2017"},{"key":"B37","doi-asserted-by":"publisher","first-page":"129","DOI":"10.1109\/TIT.1982.1056489","article-title":"Least squares quantization in PCM","volume":"28","author":"Lloyd","year":"1982","journal-title":"IEEE Trans. Inf. Theory"},{"key":"B38","doi-asserted-by":"publisher","first-page":"108114","DOI":"10.1016\/j.compag.2023.108114","article-title":"From one field to another \u2013 unsupervised domain adaptation for semantic segmentation in agricultural robotics","volume":"212","author":"Magistri","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"B39","doi-asserted-by":"crossref","DOI":"10.1109\/ICISS49785.2020.9316063","article-title":"Smart automated pesticide spraying bot","volume-title":"Proc. Of the intl. Conf. On intelligent sustainable systems (ICISS)","author":"Murugan","year":"2020"},{"key":"B40","doi-asserted-by":"publisher","first-page":"1163","DOI":"10.1109\/34.546254","article-title":"Geodesic saliency of watershed contours and hierarchical segmentation","volume":"18","author":"Najman","year":"1996","journal-title":"IEEE Trans. Pattern Analysis Mach. Intell. (TPAMI)"},{"key":"B41","doi-asserted-by":"crossref","DOI":"10.1109\/CVPR52733.2024.02146","article-title":"Unsupervised universal image segmentation","volume-title":"Proc. Of the IEEE\/CVF conf. On computer vision and pattern recognition (CVPR)","author":"Niu","year":"2024"},{"key":"B42","doi-asserted-by":"publisher","first-page":"62","DOI":"10.1109\/tsmc.1979.4310076","article-title":"A threshold selection method from gray-level histograms","volume":"9","author":"Otsu","year":"1979","journal-title":"IEEE Trans. Syst. Man, Cybern."},{"key":"B43","doi-asserted-by":"crossref","DOI":"10.1109\/IROS55552.2023.10342067","article-title":"Panoptic mapping with fruit completion and pose estimation for horticultural robots","volume-title":"Proc. Of the IEEE\/RSJ intl. Conf. On intelligent robots and systems (IROS)","author":"Pan","year":"2023"},{"key":"B44","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/0734-189x(84)90046-x","article-title":"Experiments in segmentation using a facet model region grower","volume":"25","author":"Pong","year":"1984","journal-title":"Comput. Vis. Graph. Image Process."},{"key":"B45","doi-asserted-by":"publisher","first-page":"105201","DOI":"10.1016\/j.compag.2019.105201","article-title":"Robust index-based semantic plant\/background segmentation for rgb-images","volume":"169","author":"Riehle","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"B46","doi-asserted-by":"publisher","first-page":"263","DOI":"10.1109\/tits.2017.2750080","article-title":"ERFNet: efficient residual factorized ConvNet for real-time semantic segmentation","volume":"19","author":"Romera","year":"2018","journal-title":"IEEE Trans. Intelligent Transp. Syst. (TITS)"},{"key":"B47","doi-asserted-by":"crossref","DOI":"10.1007\/978-3-319-24574-4_28","article-title":"U-net: convolutional networks for biomedical image segmentation","volume-title":"Proc. Of the medical image computing and computer-assisted intervention (MICCAI)","author":"Ronneberger","year":"2015"},{"key":"B48","doi-asserted-by":"publisher","first-page":"107956","DOI":"10.1016\/j.compag.2023.107956","article-title":"Segmentation of weeds and crops using multispectral imaging and crf-enhanced u-net","volume":"211","author":"Sahin","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"B49","doi-asserted-by":"publisher","first-page":"1211235","DOI":"10.3389\/fpls.2023.1211235","article-title":"Towards deep learning based smart farming for intelligent weeds management in crops","volume":"14","author":"Saqib","year":"2023","journal-title":"Front. Plant Sci."},{"key":"B50","article-title":"Evidential deep learning to quantify classification uncertainty","volume-title":"Proc. Of the conf. On neural information processing systems (NeurIPS)","author":"Sensoy","year":"2018"},{"key":"B51","doi-asserted-by":"publisher","first-page":"127178","DOI":"10.1016\/j.eja.2024.127178","article-title":"Research priorities to leverage smart digital technologies for sustainable crop production","volume":"156","author":"Storm","year":"2024","journal-title":"Eur. J. Agron."},{"key":"B52","doi-asserted-by":"crossref","DOI":"10.1109\/ICCV48922.2021.00717","article-title":"Segmenter: transformer for semantic segmentation","volume-title":"Proc. Of the IEEE\/CVF intl. Conf. On computer vision (ICCV)","author":"Strudel","year":"2021"},{"key":"B53","article-title":"Attention is all you need","volume-title":"Proc. Of the conf. On neural information processing systems (NeurIPS)","author":"Vaswani","year":"2017"},{"key":"B54","doi-asserted-by":"publisher","first-page":"6148","DOI":"10.1073\/pnas.1707462114","article-title":"Smart farming is key to developing sustainable agriculture","volume":"114","author":"Walter","year":"2017","journal-title":"Proc. Natl. Acad. Sci."},{"key":"B55","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1109\/TKDE.2022.3178128","article-title":"Generalizing to unseen domains: a survey on domain generalization","volume":"35","author":"Wang","year":"2022","journal-title":"IEEE Trans. Knowl. Data Eng."},{"key":"B56","doi-asserted-by":"publisher","first-page":"9583","DOI":"10.1109\/TPAMI.2024.3419548","article-title":"Phenobench: a large dataset and benchmarks for semantic image interpretation in the agricultural domain","volume":"46","author":"Weyler","year":"2024","journal-title":"IEEE Trans. Pattern Analysis Mach. Intell. (TPAMI)"},{"key":"B57","doi-asserted-by":"crossref","DOI":"10.1109\/WACV51458.2022.00302","article-title":"In-field phenotyping based on crop leaf and plant instance segmentation","volume-title":"Proc. Of the IEEE winter conf. On applications of computer vision (WACV)","author":"Weyler","year":""},{"key":"B58","doi-asserted-by":"publisher","first-page":"3787","DOI":"10.1109\/lra.2022.3147462","article-title":"Joint plant and leaf instance segmentation on field-scale uav imagery","volume":"7","author":"Weyler","year":"","journal-title":"IEEE Robotics Automation Lett. (RA-L)"},{"key":"B59","doi-asserted-by":"publisher","first-page":"3394","DOI":"10.1109\/lra.2018.2852841","article-title":"Crop row detection on tiny plants with the pattern Hough transform","volume":"3","author":"Winterhalter","year":"2018","journal-title":"IEEE Robotics Automation Lett. (RA-L)"},{"key":"B60","doi-asserted-by":"publisher","first-page":"322","DOI":"10.1002\/rob.21938","article-title":"Robotic weed control using automated weed and crop classification","volume":"37","author":"Wu","year":"2020","journal-title":"J. Field Robotics (JFR)"},{"key":"B61","doi-asserted-by":"publisher","first-page":"774068","DOI":"10.3389\/fpls.2021.774068","article-title":"Outdoor plant segmentation with deep learning for high-throughput field phenotyping on a diverse wheat dataset","volume":"12","author":"Zenkl","year":"2022","journal-title":"Front. Plant Sci."},{"key":"B62","doi-asserted-by":"publisher","first-page":"9846","DOI":"10.3390\/s23249846","article-title":"A weakly supervised semantic segmentation model of maize seedlings and weed images based on scrawl labels","volume":"23","author":"Zhao","year":"2023","journal-title":"Sensors"}],"container-title":["Frontiers in Robotics and AI"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2025.1548143\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,2,25]],"date-time":"2025-02-25T05:10:47Z","timestamp":1740460247000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2025.1548143\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,2,25]]},"references-count":62,"alternative-id":["10.3389\/frobt.2025.1548143"],"URL":"https:\/\/doi.org\/10.3389\/frobt.2025.1548143","relation":{},"ISSN":["2296-9144"],"issn-type":[{"value":"2296-9144","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,2,25]]},"article-number":"1548143"}}