{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,29]],"date-time":"2026-01-29T21:49:25Z","timestamp":1769723365920,"version":"3.49.0"},"reference-count":52,"publisher":"MDPI AG","issue":"1","license":[{"start":{"date-parts":[[2020,12,23]],"date-time":"2020-12-23T00:00:00Z","timestamp":1608681600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"United States Department of Agriculture National Institute of Food and Agriculture","award":["WIS02002"],"award-info":[{"award-number":["WIS02002"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>In recent years, precision agriculture has been researched to increase crop production with less inputs, as a promising means to meet the growing demand of agriculture products. Computer vision-based crop detection with unmanned aerial vehicle (UAV)-acquired images is a critical tool for precision agriculture. However, object detection using deep learning algorithms rely on a significant amount of manually prelabeled training datasets as ground truths. Field object detection, such as bales, is especially difficult because of (1) long-period image acquisitions under different illumination conditions and seasons; (2) limited existing prelabeled data; and (3) few pretrained models and research as references. This work increases the bale detection accuracy based on limited data collection and labeling, by building an innovative algorithms pipeline. First, an object detection model is trained using 243 images captured with good illimitation conditions in fall from the crop lands. In addition, domain adaptation (DA), a kind of transfer learning, is applied for synthesizing the training data under diverse environmental conditions with automatic labels. 
Finally, the object detection model is optimized with the synthesized datasets. The case study shows that the proposed method improves bale detection performance, including recall, mean average precision (mAP), and F measure (F1 score), from averages of 0.59, 0.7, and 0.7 (object detection alone) to averages of 0.93, 0.94, and 0.89 (object detection + DA), respectively. This approach could easily be scaled to many other crop field objects and will significantly contribute to precision agriculture.<\/jats:p>","DOI":"10.3390\/rs13010023","type":"journal-article","created":{"date-parts":[[2020,12,23]],"date-time":"2020-12-23T12:19:51Z","timestamp":1608725991000},"page":"23","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":57,"title":["Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning\u2014A Case Study of Bale Detection"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-6790-8628","authenticated-orcid":false,"given":"Wei","family":"Zhao","sequence":"first","affiliation":[{"name":"Department of Biological System Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA"}]},{"given":"William","family":"Yamada","sequence":"additional","affiliation":[{"name":"Department of Biological System Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA"}]},{"given":"Tianxin","family":"Li","sequence":"additional","affiliation":[{"name":"Cockrell School of Engineering, University of Texas at Austin, Austin, TX 78712, USA"}]},{"given":"Matthew","family":"Digman","sequence":"additional","affiliation":[{"name":"Department of Biological System Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4389-609X","authenticated-orcid":false,"given":"Troy","family":"Runge","sequence":"additional","affiliation":[{"name":"Department of Biological System Engineering, 
University of Wisconsin-Madison, Madison, WI 53706, USA"}]}],"member":"1968","published-online":{"date-parts":[[2020,12,23]]},"reference":[{"key":"ref_1","unstructured":"Census Bureau (2020, December 16). U.S. and World Population Clock, Available online: https:\/\/www.census.gov\/data-tools\/demo\/idb\/#\/country?YR_ANIM=2050&COUNTRY_YEAR=2050."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"51","DOI":"10.1111\/agec.12089","article-title":"The future of food demand: Understanding differences in global economic models","volume":"45","author":"Valin","year":"2014","journal-title":"Agric. Econ."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"116","DOI":"10.1016\/j.tifs.2015.01.001","article-title":"Image acquisition techniques for assessment of legume quality","volume":"42","author":"Mahajan","year":"2015","journal-title":"Trends Food Sci. Technol."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"52","DOI":"10.1016\/j.biosystemseng.2016.01.017","article-title":"A review on the main challenges in automatic plant disease identification based on visible range images","volume":"144","author":"Barbedo","year":"2016","journal-title":"Biosyst. Eng."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"495","DOI":"10.1007\/s00138-015-0670-5","article-title":"Design and implementation of a computer vision-guided greenhouse crop diagnostics system","volume":"26","author":"Story","year":"2015","journal-title":"Mach. Vis. Appl."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"69","DOI":"10.1016\/j.compag.2018.08.001","article-title":"Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review","volume":"153","author":"Rieder","year":"2018","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"171","DOI":"10.1016\/j.biosystemseng.2007.09.026","article-title":"Mechanical within-row weed control for transplanted crops using computer vision","volume":"99","author":"Tillett","year":"2008","journal-title":"Biosyst. Eng."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1419","DOI":"10.3389\/fpls.2016.01419","article-title":"Using Deep Learning for Image-Based Plant Disease Detection","volume":"7","author":"Mohanty","year":"2016","journal-title":"Front. Plant Sci."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Rastogi, A., Arora, R., and Sharma, S. (2015, January 19\u201320). Leaf disease detection and grading using computer vision technology & fuzzy logic. Proceedings of the 2015 2nd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India.","DOI":"10.1109\/SPIN.2015.7095350"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Choi, H., Geeves, M., Alsalam, B., and Gonzalez, F. (2016, January 5\u201312). Open source computer-vision based guidance system for UAVs on-board decision making. Proceedings of the 2016 IEEE Aerospace Conference, Big Sky, MT, USA.","DOI":"10.1109\/AERO.2016.7500600"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Ward, S.L., Hensler, J., Alsalam, B.H.Y., Duncan, C., and Felipe, G. (2016, January 4\u201311). Autonomous UAVs Wildlife Monitoring and Tracking Using Thermal Imaging and Computer vision. Proceedings of the IEEE Aerospace Conferece, Big Sky, MT, USA.","DOI":"10.1109\/AERO.2016.7500671"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"104","DOI":"10.1016\/j.biosystemseng.2010.11.003","article-title":"Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform","volume":"108","author":"Xiang","year":"2011","journal-title":"Biosyst. 
Eng."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"359","DOI":"10.1007\/s11119-005-2324-5","article-title":"Evaluation of Digital Photography from Model Aircraft for Remote Sensing of Crop Biomass and Nitrogen Status","volume":"6","author":"Hunt","year":"2005","journal-title":"Precis. Agric."},{"key":"ref_14","first-page":"119","article-title":"High-Precision Positioning and Real-Time Data Processing of UAV-Systems","volume":"38","author":"Rieke","year":"2012","journal-title":"ISPRS Int. Arch. Photogramm. Remote. Sens. Spat. Inf. Sci."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Zhao, W., Yin, J., Wang, X., Hu, J., Qi, B., and Runge, T. (2019). Real-Time Vehicle Motion Detection and Motion Altering for Connected Vehicle: Algorithm Design and Practical Applications. Sensors, 19.","DOI":"10.3390\/s19194108"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"794","DOI":"10.1109\/TBC.2011.2160106","article-title":"Illumination-Sensitive Background Modeling Approach for Accurate Moving Object Detection","volume":"57","author":"Cheng","year":"2011","journal-title":"IEEE Trans. Broadcast."},{"key":"ref_17","first-page":"1","article-title":"A Sensor-Based Visual Effect Evaluation of Chevron Alignment Signs\u2019 Colors on Drivers through the Curves in Snow and Ice Environment","volume":"2017","author":"Zhao","year":"2017","journal-title":"J. Sens."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"221975","DOI":"10.1109\/ACCESS.2020.3043662","article-title":"Ground-level Mapping and Navigating for Agriculture based on IoT and Computer Vision","volume":"8","author":"Zhao","year":"2020","journal-title":"IEEE Access"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Hornberg, A. (2017). Handbook of Machine and Computer Vision: The Guide for Developers and Users, Wiley-VCH.","DOI":"10.1002\/9783527413409"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Baweja, H.S., Parhar, T., and Nuske, S. (2017). 
Early-season Vineyard Shoot and Leaf Estimation Using Computer Vision Techniques. Proceedings of the 2017 ASABE Annual International Meeting, Spokane, WA, USA.","DOI":"10.13031\/aim.201700349"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"1090","DOI":"10.1109\/TCYB.2016.2538199","article-title":"Cross-Domain Recognition by Identifying Joint Subspaces of Source Domain and Target Domain","volume":"47","author":"Lin","year":"2017","journal-title":"IEEE Trans. Cybern."},{"key":"ref_22","first-page":"91","article-title":"R-CNN: Towards Real-Time Object Detection with Region Proposal Networks","volume":"39","author":"Ren","year":"2016","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"164","DOI":"10.1049\/iet-its.2012.0099","article-title":"Video-based traffic data collection system for multiple vehicle types","volume":"8","author":"Li","year":"2014","journal-title":"IET Intell. Transp. Syst."},{"key":"ref_24","first-page":"1","article-title":"Computer vision technology in agricultural automation\u2014A review","volume":"7","author":"Tian","year":"2020","journal-title":"Inf. Process. Agric."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"63","DOI":"10.1016\/j.compag.2014.03.009","article-title":"Use of artificial vision techniques for diagnostic of nitrogen nutritional status in maize plants","volume":"104","author":"Romualdo","year":"2014","journal-title":"Comput. Electron. Agric."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"136","DOI":"10.1016\/j.compag.2018.05.019","article-title":"A pattern recognition strategy for visual grape bunch detection in vineyards","volume":"151","author":"Cheein","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_27","first-page":"1","article-title":"Identifying crop water stress using deep learning models","volume":"1\u201315","author":"Chandel","year":"2020","journal-title":"Neural Comput. 
Appl."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"105684","DOI":"10.1016\/j.compag.2020.105684","article-title":"Edge detection for weed recognition in lawns","volume":"176","author":"Parra","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.compag.2014.03.001","article-title":"Automatic detection of powdery mildew on grapevine leaves by image analysis: Optimal view-angle range to increase the sensitivity","volume":"104","author":"Oberti","year":"2014","journal-title":"Comput. Electron. Agric."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"221","DOI":"10.1016\/j.compag.2014.11.021","article-title":"An optimum method for real-time in-field detection of Huanglongbing disease using a vision sensor","volume":"110","author":"Pourreza","year":"2015","journal-title":"Comput. Electron. Agric."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"63","DOI":"10.1016\/j.compag.2016.11.019","article-title":"Detection of soybean aphids in a greenhouse using an image processing technique","volume":"132","author":"Maharlooei","year":"2017","journal-title":"Comput. Electron. Agric."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.compag.2018.07.034","article-title":"An intelligent mobile application for diagnosis of crop diseases in Pakistan using fuzzy inference system","volume":"153","author":"Toseef","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"17","DOI":"10.1016\/j.aspen.2019.11.006","article-title":"Application of an image and environmental sensor network for automated greenhouse insect pest monitoring","volume":"23","author":"Rustia","year":"2020","journal-title":"J. 
Asia-Pacific Entomol."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"57","DOI":"10.1016\/j.biosystemseng.2016.01.013","article-title":"Colour-agnostic shape-based 3D fruit detection for crop harvesting robots","volume":"146","author":"Barnea","year":"2016","journal-title":"Biosyst. Eng."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"872","DOI":"10.1109\/LRA.2017.2655622","article-title":"Autonomous Sweet Pepper Harvesting for Protected Cropping Systems","volume":"2","author":"Lehnert","year":"2017","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"141","DOI":"10.1007\/s12155-010-9108-0","article-title":"The Economics of Biomass Collection and Transportation and Its Supply to Indiana Cellulosic and Electric Utility Facilities","volume":"4","author":"Brechbill","year":"2011","journal-title":"BioEnergy Res."},{"key":"ref_37","unstructured":"Krizhevsky, A., Sutskever, I., and Hinton, G.E. (2012, January 1). Imagenet classification with deep convolutional neural networks. Proceedings of the 25th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA."},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Richter, S.R., Vineet, V., Roth, S., and Koltun, V. (2016). Playing for Data: Ground Truth from Computer Games, Springer Science and Business Media LLC.","DOI":"10.1007\/978-3-319-46475-6_7"},{"key":"ref_39","unstructured":"Ganin, Y., and Lempitsky, V. (2015). Unsupervised domain adaptation by backpropagation. International Conference on Machine Learning, PMLR."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"4441","DOI":"10.1109\/TGRS.2017.2692281","article-title":"Domain Adaptation Network for Cross-Scene Classification","volume":"55","author":"Othman","year":"2017","journal-title":"IEEE Trans. Geosci. Remote. 
Sens."},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"1020","DOI":"10.1007\/s12555-014-0119-z","article-title":"Domain adaption of vehicle detector based on convolutional neural networks","volume":"13","author":"Li","year":"2015","journal-title":"Int. J. Control. Autom. Syst."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Qi, B.Z., Liu, P., Ji, T., Wei, Z., and Suman, B. (2018). Augmenting Driving Analytics with Multi-Modal Information, IEEE Vehicular Networking Conference (VNC).","DOI":"10.1109\/VNC.2018.8628415"},{"key":"ref_43","unstructured":"Ren, S., He, K., Girshick, R., and Sun, J. (2015). Faster r-cnn: Towards real-time object detection with region proposal networks. Advances in Neural Information Processing Systems, Neural Information Processing Systems Foundation Inc."},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 27\u201330). You Only Look Once: Unified, Real-Time Object Detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"He, K., Gkioxari, G., Doll\u00e1r, P., and Girshick, R. (2017, January 16). Mask r-cnn. Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.322"},{"key":"ref_46","unstructured":"Redmon, J., and Farhadi, A. (2018). Yolov3: An incremental improvement. arXiv."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"1324","DOI":"10.1109\/LGRS.2019.2896411","article-title":"Domain Adaptation for Convolutional Neural Networks-Based Remote Sensing Scene Classification","volume":"16","author":"Song","year":"2019","journal-title":"IEEE Geosci. Remote. Sens. Lett."},{"key":"ref_48","unstructured":"Ren, S., He, K., Girshick, R., and Jian, S. (2015, January 11\u201312). 
Faster R-CNN: Towards real-time object detection with region proposal networks. Proceedings of the 29th Annual Conference on Neural Information Processing Systems, Montreal, QC, Canada."},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Khodabandeh, M., Vahdat, A., Ranjbar, M., and Macready, W. (2019, January 27). A Robust Learning Approach to Domain Adaptive Object Detection. Proceedings of the 2019 IEEE\/CVF International Conference on Computer Vision (ICCV), Seoul, Korea.","DOI":"10.1109\/ICCV.2019.00057"},{"key":"ref_50","doi-asserted-by":"crossref","unstructured":"Zhu, J.-Y., Park, T., Isola, P., and Efros, A.A. (2017, January 22\u201329). Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks. Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy.","DOI":"10.1109\/ICCV.2017.244"},{"key":"ref_51","unstructured":"Darrenl, T. (2015, November 15). Labelimg. Available online: https:\/\/github.com\/tzutalin\/labelImg."},{"key":"ref_52","unstructured":"Kentaro Wada (2016, September 30). labelme: Image Polygonal Annotation with Python. 
Available online: https:\/\/github.com\/wkentaro\/labelme."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/1\/23\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T10:48:43Z","timestamp":1760179723000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/1\/23"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,12,23]]},"references-count":52,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2021,1]]}},"alternative-id":["rs13010023"],"URL":"https:\/\/doi.org\/10.3390\/rs13010023","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,12,23]]}}}