{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,15]],"date-time":"2026-04-15T18:18:01Z","timestamp":1776277081662,"version":"3.50.1"},"reference-count":68,"publisher":"MDPI AG","issue":"23","license":[{"start":{"date-parts":[[2023,12,3]],"date-time":"2023-12-03T00:00:00Z","timestamp":1701561600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>The rapid expansion of the world\u2019s population has resulted in an increased demand for agricultural products, which necessitates improved crop yields. To enhance crop yields, it is imperative to control weeds. Traditionally, weed control predominantly relied on herbicides; however, the indiscriminate application of herbicides presents potential hazards to both crop health and productivity. Fortunately, the advent of cutting-edge technologies such as unmanned aerial vehicles (UAVs) and computer vision has provided automated and efficient solutions for weed control. These approaches leverage drone images to detect and identify weeds with a certain level of accuracy. Nevertheless, identifying weeds in drone images poses significant challenges owing to factors such as occlusion, variations in color and texture, and disparities in scale. The traditional image processing techniques and deep learning approaches commonly employed in existing methods have difficulty extracting features and handling scale variations. To address these challenges, an innovative deep learning framework is introduced that classifies every pixel in a drone image into categories such as weed, crop, and others. Overall, our proposed network adopts an encoder\u2013decoder structure. 
The encoder component of the network combines a Dense-Inception network with an Atrous spatial pyramid pooling module, enabling the extraction of multi-scale features and the capture of both local and global contextual information. The decoder component incorporates deconvolution layers and channel and spatial attention units (CnSAUs), which restore spatial information and improve the precise localization of weeds and crops in the images. The performance of the proposed framework is assessed on a publicly available and challenging benchmark dataset, where comprehensive experiments demonstrate its effectiveness, achieving a 0.81 mean Intersection over Union (mIoU).<\/jats:p>","DOI":"10.3390\/rs15235615","type":"journal-article","created":{"date-parts":[[2023,12,4]],"date-time":"2023-12-04T03:40:58Z","timestamp":1701661258000},"page":"5615","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":19,"title":["Weed\u2013Crop Segmentation in Drone Images with a Novel Encoder\u2013Decoder Framework Enhanced via Attention Modules"],"prefix":"10.3390","volume":"15","author":[{"given":"Sultan Daud","family":"Khan","sequence":"first","affiliation":[{"name":"Department of Computer Science, National University of Technology, Islamabad 44000, Pakistan"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2276-8307","authenticated-orcid":false,"given":"Saleh","family":"Basalamah","sequence":"additional","affiliation":[{"name":"Department of Computer Engineering, Umm Al-Qura University, Mecca 24382, Saudi Arabia"}]},{"given":"Ahmed","family":"Lbath","sequence":"additional","affiliation":[{"name":"Department of Computer Science, Universit\u00e9 Grenoble Alpes, 38400 Grenoble, 
France"}]}],"member":"1968","published-online":{"date-parts":[[2023,12,3]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"4843","DOI":"10.1109\/ACCESS.2020.3048415","article-title":"Machine learning applications for precision agriculture: A comprehensive review","volume":"9","author":"Sharma","year":"2020","journal-title":"IEEE Access"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"106067","DOI":"10.1016\/j.compag.2021.106067","article-title":"A survey of deep learning techniques for weed detection from images","volume":"184","author":"Hasan","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"139263","DOI":"10.1016\/j.chemosphere.2023.139263","article-title":"Experimental investigation on the effect of soil solarization incorporating black, silver, and transparent polythene, and straw as mulch, on the microbial population and weed growth","volume":"336","author":"Shinde","year":"2023","journal-title":"Chemosphere"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Pe\u00f1a, J.M., Torres-S\u00e1nchez, J., de Castro, A.I., Kelly, M., and L\u00f3pez-Granados, F. (2013). Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE, 8.","DOI":"10.1371\/journal.pone.0077151"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"177","DOI":"10.1046\/j.1365-3180.2002.00277.x","article-title":"Weed management in organic agriculture: Are we addressing the right issues?","volume":"42","year":"2002","journal-title":"Weed Res."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"183","DOI":"10.1007\/s11119-015-9415-8","article-title":"Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds","volume":"17","year":"2016","journal-title":"Precis. 
Agric."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"170","DOI":"10.3846\/16487788.2013.861224","article-title":"Features of the use of unmanned aerial vehicles for agriculture applications","volume":"17","author":"Urbahs","year":"2013","journal-title":"Aviation"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"105760","DOI":"10.1016\/j.compag.2020.105760","article-title":"A survey of public datasets for computer vision tasks in precision agriculture","volume":"178","author":"Lu","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_9","first-page":"1","article-title":"Computer vision technology in agricultural automation\u2014A review","volume":"7","author":"Tian","year":"2020","journal-title":"Inf. Process. Agric."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"100083","DOI":"10.1016\/j.atech.2022.100083","article-title":"A survey on using deep learning techniques for plant disease diagnosis and recommendations for development of appropriate tools","volume":"3","author":"Ahmad","year":"2022","journal-title":"Smart Agric. Technol."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"110","DOI":"10.34218\/IJCET.10.3.2019.013","article-title":"An approach for prediction of crop yield using machine learning and big data techniques","volume":"10","author":"Palanivel","year":"2019","journal-title":"Int. J. Comput. Eng. Technol."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"124","DOI":"10.1016\/j.isprsjprs.2019.11.008","article-title":"Estimating wheat yields in Australia using climate records, satellite image time series and machine learning methods","volume":"160","author":"Kamir","year":"2020","journal-title":"Isprs J. Photogramm. Remote. Sens."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Maimaitijiang, M., Sagan, V., Sidike, P., Daloye, A.M., Erkbol, H., and Fritschi, F.B. (2020). Crop monitoring using satellite\/UAV data fusion and machine learning. 
Remote Sens., 12.","DOI":"10.3390\/rs12091357"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Fu, Z., Jiang, J., Gao, Y., Krienke, B., Wang, M., Zhong, K., Cao, Q., Tian, Y., Zhu, Y., and Cao, W. (2020). Wheat growth monitoring and yield estimation based on multi-rotor unmanned aerial vehicle. Remote Sens., 12.","DOI":"10.3390\/rs12030508"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"103","DOI":"10.1016\/j.compag.2015.12.016","article-title":"Weed detection using image processing under different illumination for site-specific areas spraying","volume":"122","author":"Tang","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Islam, N., Rashid, M.M., Wibowo, S., Xu, C.Y., Morshed, A., Wasimi, S.A., Moore, S., and Rahman, S.M. (2021). Early weed detection using image processing and machine learning techniques in an Australian chilli farm. Agriculture, 11.","DOI":"10.3390\/agriculture11050387"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"337","DOI":"10.1016\/j.compag.2010.12.011","article-title":"Real-time image processing for crop\/weed discrimination in maize fields","volume":"75","author":"Ribeiro","year":"2011","journal-title":"Comput. Electron. Agric."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Bah, M.D., Hafiane, A., and Canals, R. (2018). Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens., 10.","DOI":"10.20944\/preprints201809.0088.v1"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Huang, H., Deng, J., Lan, Y., Yang, A., Deng, X., and Zhang, L. (2018). A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE, 13.","DOI":"10.1371\/journal.pone.0196302"},{"key":"ref_20","unstructured":"Mads, D., Skov, M.H., and Krogh, M.A. (2016, January 26\u201329). 
Pixel-wise classification of weeds and crops in images by using a fully convolutional neural network. Proceedings of the International Conference on Agricultural Engineering, Aarhus, Denmark."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"107956","DOI":"10.1016\/j.compag.2023.107956","article-title":"Segmentation of weeds and crops using multispectral imaging and CRF-enhanced U-Net","volume":"211","author":"Sahin","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"106242","DOI":"10.1016\/j.compag.2021.106242","article-title":"A modified U-Net with a specific data argumentation method for semantic segmentation of weed images in the field","volume":"187","author":"Zou","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Huang, H., Deng, J., Lan, Y., Yang, A., Deng, X., Wen, S., Zhang, H., and Zhang, Y. (2018). Accurate weed mapping and prescription map generation based on fully convolutional networks using UAV imagery. Sensors, 18.","DOI":"10.3390\/s18103299"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"2870","DOI":"10.1109\/LRA.2018.2846289","article-title":"Fully convolutional networks with sequential information for robust crop and weed detection in precision farming","volume":"3","author":"Lottes","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Long, J., Shelhamer, E., and Darrell, T. (2015, January 7\u201312). Fully convolutional networks for semantic segmentation. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA.","DOI":"10.1109\/CVPR.2015.7298965"},{"key":"ref_26","unstructured":"Ronneberger, O., Fischer, P., and Brox, T. (2015). 
Proceedings, Part III 18, Proceedings of the Medical Image Computing and Computer-Assisted Intervention\u2013MICCAI 2015: 18th International Conference, Munich, Germany, 5\u20139 October 2015, Springer."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"215","DOI":"10.1016\/j.compag.2017.07.028","article-title":"Maize and weed classification using color indices with support vector data description in outdoor fields","volume":"141","author":"Zheng","year":"2017","journal-title":"Comput. Electron. Agric."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"441","DOI":"10.13031\/2013.2723","article-title":"Classification of weed species using color texture features and discriminant analysis","volume":"43","author":"Burks","year":"2000","journal-title":"Trans. Asae"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1016\/j.compag.2016.11.021","article-title":"Automatic crop detection under field conditions using the HSV colour space and morphological operations","volume":"133","author":"Hamuda","year":"2017","journal-title":"Comput. Electron. Agric."},{"key":"ref_30","first-page":"602","article-title":"Weed detection in sugar beet fields using machine vision","volume":"8","author":"Jafari","year":"2006","journal-title":"Int. J. Agric. Biol"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"271","DOI":"10.13031\/2013.27839","article-title":"Shape features for identifying young weeds using image analysis","volume":"38","author":"Woebbecke","year":"1995","journal-title":"Trans. ASAE"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"484","DOI":"10.1016\/j.biosystemseng.2008.05.003","article-title":"Classification of crops and weeds extracted by active shape models","volume":"100","author":"Persson","year":"2008","journal-title":"Biosyst. 
Eng."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"197","DOI":"10.1016\/S0168-1699(99)00068-X","article-title":"Colour and shape analysis techniques for weed detection in cereal fields","volume":"25","author":"Perez","year":"2000","journal-title":"Comput. Electron. Agric."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Lin, F., Zhang, D., Huang, Y., Wang, X., and Chen, X. (2017). Detection of corn and weed species by the combination of spectral, shape and textural features. Sustainability, 9.","DOI":"10.3390\/su9081335"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"121","DOI":"10.1016\/j.compag.2005.09.004","article-title":"Plant species identification using Elliptic Fourier leaf shape analysis","volume":"50","author":"Neto","year":"2006","journal-title":"Comput. Electron. Agric."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"965","DOI":"10.13031\/2013.27914","article-title":"Effective criteria for weed identification in wheat fields using machine vision","volume":"38","author":"Chaisattapagon","year":"1995","journal-title":"Trans. ASAE"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1189","DOI":"10.13031\/2013.17244","article-title":"Textural imaging and discriminant analysis for distinguishingweeds for spot spraying","volume":"41","author":"Meyer","year":"1998","journal-title":"Trans. ASAE"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.biosystemseng.2017.02.002","article-title":"Weed segmentation using texture features extracted from wavelet sub-images","volume":"157","author":"Bakhshipour","year":"2017","journal-title":"Biosyst. Eng."},{"key":"ref_39","first-page":"840","article-title":"Weed\/corn seedling recognition by support vector machine using texture features","volume":"4","author":"Wu","year":"2009","journal-title":"Afr. J. Agric. 
Res."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"53","DOI":"10.1016\/j.compag.2008.12.003","article-title":"Weed image classification using Gabor wavelet and gradient field distribution","volume":"66","author":"Ishak","year":"2009","journal-title":"Comput. Electron. Agric."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Ahmed, F., Bari, A.H., Shihavuddin, A., Al-Mamun, H.A., and Kwan, P. (2011, January 21\u201322). A study on local binary pattern for automated weed classification using template matching and support vector machine. Proceedings of the 2011 IEEE 12th International Symposium on Computational Intelligence and Informatics (CINTI), Budapest, Hungary.","DOI":"10.1109\/CINTI.2011.6108524"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"78","DOI":"10.1016\/j.eja.2019.01.004","article-title":"Deep learning for image-based weed detection in turfgrass","volume":"104","author":"Yu","year":"2019","journal-title":"Eur. J. Agron."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"105750","DOI":"10.1016\/j.compag.2020.105750","article-title":"A DNN-based semantic segmentation for detecting weed and crop","volume":"178","author":"You","year":"2020","journal-title":"Comput. Electron. Agric."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"10940","DOI":"10.1109\/ACCESS.2021.3050296","article-title":"Weed identification using deep learning and image processing in vegetable plantation","volume":"9","author":"Jin","year":"2021","journal-title":"IEEE Access"},{"key":"ref_45","unstructured":"Duan, K., Bai, S., Xie, L., Qi, H., Huang, Q., and Tian, Q. (November, January 27). Centernet: Keypoint triplets for object detection. 
Proceedings of the IEEE\/CVF International Conference on Computer Vision, Seoul, Republic of Korea."},{"key":"ref_46","first-page":"100308","article-title":"Weed detection in soybean crops using custom lightweight deep learning models","volume":"8","author":"Razfar","year":"2022","journal-title":"J. Agric. Food Res."},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.C. (2018, January 18\u201323). Mobilenetv2: Inverted residuals and linear bottlenecks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00474"},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2016, January 27\u201330). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"314","DOI":"10.1016\/j.compag.2017.10.027","article-title":"Weed detection in soybean crops using ConvNets","volume":"143","author":"Freitas","year":"2017","journal-title":"Comput. Electron. Agric."},{"key":"ref_50","first-page":"355","article-title":"A deep semantic segmentation-based algorithm to segment crops and weeds in agronomic color images","volume":"9","author":"Sodjinou","year":"2022","journal-title":"Inf. Process. Agric."},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"471","DOI":"10.3390\/agriengineering2030032","article-title":"A deep learning approach for weed detection in lettuce crops using multispectral images","volume":"2","author":"Osorio","year":"2020","journal-title":"AgriEngineering"},{"key":"ref_52","unstructured":"Redmon, J., and Farhadi, A. (2018). Yolov3: An incremental improvement. arXiv."},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"He, K., Gkioxari, G., Doll\u00e1r, P., and Girshick, R. 
(2017, January 22\u201329). Mask r-cnn. Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.322"},{"key":"ref_54","doi-asserted-by":"crossref","unstructured":"Pettorelli, N. (2013). The Normalized Difference Vegetation Index, Oxford University Press.","DOI":"10.1093\/acprof:osobl\/9780199693160.001.0001"},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"107146","DOI":"10.1016\/j.compag.2022.107146","article-title":"MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds","volume":"199","author":"Kim","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"Fawakherji, M., Youssef, A., Bloisi, D., Pretto, A., and Nardi, D. (2019, January 25\u201327). Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation. Proceedings of the 2019 Third IEEE International Conference on Robotic Computing (IRC), Naples, Italy.","DOI":"10.1109\/IRC.2019.00029"},{"key":"ref_57","first-page":"100759","article-title":"Deep learning-based precision agriculture through weed recognition in sugar beet fields","volume":"35","author":"Nasiri","year":"2022","journal-title":"Sustain. Comput. Inf. Syst."},{"key":"ref_58","doi-asserted-by":"crossref","unstructured":"Khan, A., Ilyas, T., Umraiz, M., Mannan, Z.I., and Kim, H. (2020). Ced-net: Crops and weeds segmentation for smart farming using a small cascaded encoder-decoder architecture. Electronics, 9.","DOI":"10.3390\/electronics9101602"},{"key":"ref_59","doi-asserted-by":"crossref","first-page":"107994","DOI":"10.1016\/j.compag.2023.107994","article-title":"Instance segmentation method for weed detection using UAV imagery in soybean fields","volume":"211","author":"Xu","year":"2023","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_60","first-page":"101545","article-title":"Multi-level feature re-weighted fusion for the semantic segmentation of crops and weeds","volume":"35","author":"Janneh","year":"2023","journal-title":"J. King Saud Univ.-Comput. Inf. Sci."},{"key":"ref_61","doi-asserted-by":"crossref","unstructured":"Zhang, J., Gong, J., Zhang, Y., Mostafa, K., and Yuan, G. (2023). Weed Identification in Maize Fields Based on Improved Swin-Unet. Agronomy, 13.","DOI":"10.3390\/agronomy13071846"},{"key":"ref_62","doi-asserted-by":"crossref","first-page":"3310","DOI":"10.1109\/LRA.2023.3262417","article-title":"Towards Domain Generalization in Crop and Weed Segmentation for Precision Farming Robots","volume":"8","author":"Weyler","year":"2023","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"100188","DOI":"10.1016\/j.atech.2023.100188","article-title":"A comparative study of Fourier transform and CycleGAN as domain adaptation techniques for weed segmentation","volume":"4","author":"Bertoglio","year":"2023","journal-title":"Smart Agric. Technol."},{"key":"ref_64","doi-asserted-by":"crossref","first-page":"226","DOI":"10.1016\/j.compag.2019.02.005","article-title":"A review on weed detection using ground-based machine vision and image processing techniques","volume":"158","author":"Wang","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_65","doi-asserted-by":"crossref","unstructured":"Wu, Z., Chen, Y., Zhao, B., Kang, X., and Ding, Y. (2021). Review of weed detection methods based on computer vision. Sensors, 21.","DOI":"10.3390\/s21113647"},{"key":"ref_66","doi-asserted-by":"crossref","unstructured":"Rakhmatulin, I., Kamilaris, A., and Andreasen, C. (2021). Deep neural networks to detect weeds from crops in agricultural environments in real-time: A review. 
Remote Sens., 13.","DOI":"10.2139\/ssrn.3959386"},{"key":"ref_67","doi-asserted-by":"crossref","first-page":"3911","DOI":"10.1109\/TCSVT.2019.2915238","article-title":"Channel-wise and spatial feature modulation network for single image super-resolution","volume":"30","author":"Hu","year":"2019","journal-title":"IEEE Trans. Circuits Syst. Video Technol."},{"key":"ref_68","doi-asserted-by":"crossref","unstructured":"Woo, S., Park, J., Lee, J.Y., and Kweon, I.S. (2018, January 8\u201314). Cbam: Convolutional block attention module. Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany.","DOI":"10.1007\/978-3-030-01234-2_1"}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/23\/5615\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T21:37:05Z","timestamp":1760132225000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/23\/5615"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,12,3]]},"references-count":68,"journal-issue":{"issue":"23","published-online":{"date-parts":[[2023,12]]}},"alternative-id":["rs15235615"],"URL":"https:\/\/doi.org\/10.3390\/rs15235615","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,12,3]]}}}