{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,21]],"date-time":"2026-03-21T19:27:51Z","timestamp":1774121271440,"version":"3.50.1"},"reference-count":76,"publisher":"MDPI AG","issue":"6","license":[{"start":{"date-parts":[[2022,3,14]],"date-time":"2022-03-14T00:00:00Z","timestamp":1647216000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"UKRI-ISCF-TFP","award":["134063"],"award-info":[{"award-number":["134063"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Nitrogen (N) fertilizer is routinely applied by farmers to increase crop yields. At present, farmers often over-apply N fertilizer in some locations or at certain times because they do not have high-resolution crop N status data. N-use efficiency can be low, with the remaining N lost to the environment, resulting in higher production costs and environmental pollution. Accurate and timely estimation of N status in crops is crucial to improving cropping systems\u2019 economic and environmental sustainability. Destructive approaches based on plant tissue analysis are time consuming and impractical over large fields. Recent advances in remote sensing and deep learning have shown promise in addressing the aforementioned challenges in a non-destructive way. In this work, we propose a novel deep learning framework: a self-supervised spectral\u2013spatial attention-based vision transformer (SSVT). The proposed SSVT introduces a Spectral Attention Block (SAB) and a Spatial Interaction Block (SIB), which allows for simultaneous learning of both spatial and spectral features from UAV digital aerial imagery, for accurate N status prediction in wheat fields. Moreover, the proposed framework introduces local-to-global self-supervised learning to help train the model from unlabelled data. 
The proposed SSVT has been compared with five state-of-the-art models, including ResNet, RegNet, EfficientNet, EfficientNetV2, and the original vision transformer, on both testing and independent datasets. The proposed approach achieved high accuracy (0.96) with good generalizability and reproducibility for wheat N status estimation.<\/jats:p>","DOI":"10.3390\/rs14061400","type":"journal-article","created":{"date-parts":[[2022,3,15]],"date-time":"2022-03-15T02:56:20Z","timestamp":1647312980000},"page":"1400","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":30,"title":["The Self-Supervised Spectral\u2013Spatial Vision Transformer Network for Accurate Prediction of Wheat Nitrogen Status from UAV Imagery"],"prefix":"10.3390","volume":"14","author":[{"given":"Xin","family":"Zhang","sequence":"first","affiliation":[{"name":"Department of Computing and Mathematics, Manchester Metropolitan University, Manchester M15GD, UK"}]},{"given":"Liangxiu","family":"Han","sequence":"additional","affiliation":[{"name":"Department of Computing and Mathematics, Manchester Metropolitan University, Manchester M15GD, UK"}]},{"given":"Tam","family":"Sobeih","sequence":"additional","affiliation":[{"name":"Department of Computing and Mathematics, Manchester Metropolitan University, Manchester M15GD, UK"}]},{"given":"Lewis","family":"Lappin","sequence":"additional","affiliation":[{"name":"GMV, Glasgow, Scotland G431QQ, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-8060-2762","authenticated-orcid":false,"given":"Mark A.","family":"Lee","sequence":"additional","affiliation":[{"name":"Department of Health Studies, Royal Holloway, University of London, Egham TW200EX, UK"}]},{"given":"Andew","family":"Howard","sequence":"additional","affiliation":[{"name":"Bockhanger Farms Ltd., Oaklands Farm, Ashford TN261ER, UK"}]},{"given":"Aron","family":"Kisdi","sequence":"additional","affiliation":[{"name":"GMV, Glasgow, Scotland G431QQ, 
UK"}]}],"member":"1968","published-online":{"date-parts":[[2022,3,14]]},"reference":[{"key":"ref_1","unstructured":"FAO (2017). World Fertilizer Trends and Outlook to 2020: Summary Report, FAO."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"869","DOI":"10.1126\/science.aas8737","article-title":"Toward nitrogen-fixing plants","volume":"359","author":"Good","year":"2018","journal-title":"Science"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"111758","DOI":"10.1016\/j.rse.2020.111758","article-title":"Crop nitrogen monitoring: Recent progress and principal developments in the context of imaging spectroscopy missions","volume":"242","author":"Berger","year":"2020","journal-title":"Remote Sens. Environ."},{"key":"ref_4","first-page":"681","article-title":"Excessive nitrogen application decreases grain yield and increases nitrogen loss in a wheat\u2013soil system","volume":"61","author":"Wang","year":"2011","journal-title":"Acta Agric. Scand. Sect. B-Soil Plant Sci."},{"key":"ref_5","unstructured":"Knoema (2021, August 08). Wheat Area Harvested. Available online: https:\/\/knoema.com\/\/atlas\/topics\/Agriculture\/Crops-Production-Area-Harvested\/Wheat-area-harvested."},{"key":"ref_6","unstructured":"Benitez Ramirez, M. (2010). Monitoring Nitrogen Levels in the Cotton Canopy Using Real-Time Active-Illumination Spectral Sensing. [Master\u2019s Thesis, University of Tennessee]."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Wang, J., Shen, C., Liu, N., Jin, X., Fan, X., Dong, C., and Xu, Y. (2017). Non-destructive evaluation of the leaf nitrogen concentration by in-field visible\/near-infrared spectroscopy in pear orchards. 
Sensors, 17.","DOI":"10.3390\/s17030538"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"59","DOI":"10.1016\/j.inffus.2020.01.007","article-title":"An overview on spectral and spatial information fusion for hyperspectral image classification: Current trends and challenges","volume":"59","author":"Imani","year":"2020","journal-title":"Inf. Fusion"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"116","DOI":"10.1016\/j.rse.2013.10.027","article-title":"An assessment of pre-and within-season remotely sensed variables for forecasting corn and soybean yields in the United States","volume":"141","author":"Johnson","year":"2014","journal-title":"Remote Sens. Environ."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"318","DOI":"10.1016\/j.fcr.2010.01.010","article-title":"Measuring and predicting canopy nitrogen nutrition in wheat using a spectral index\u2014The canopy chlorophyll content index (CCCI)","volume":"116","author":"Fitzgerald","year":"2010","journal-title":"Field Crops Res."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1016\/j.compag.2018.05.012","article-title":"Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review","volume":"151","author":"Chlingaryan","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"255","DOI":"10.1126\/science.aaa8415","article-title":"Machine learning: Trends, perspectives, and prospects","volume":"349","author":"Jordan","year":"2015","journal-title":"Science"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"105860","DOI":"10.1016\/j.compag.2020.105860","article-title":"Rice nitrogen nutrition estimation with RGB images and machine learning methods","volume":"180","author":"Shi","year":"2021","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"106421","DOI":"10.1016\/j.compag.2021.106421","article-title":"Estimation of nitrogen nutrition index in rice from UAV RGB images coupled with machine learning algorithms","volume":"189","author":"Qiu","year":"2021","journal-title":"Comput. Electron. Agric."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Zhang, X., Han, L., Han, L., and Zhu, L. (2020). How well do deep learning-based methods for land cover classification and object detection perform on high resolution remote sensing imagery?. Remote Sens., 12.","DOI":"10.3390\/rs12030417"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"93","DOI":"10.1007\/s11119-017-9501-1","article-title":"Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach","volume":"19","author":"Roth","year":"2018","journal-title":"Precis. Agric."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Alom, M.Z., Hasan, M., Yakopcic, C., Taha, T.M., and Asari, V.K. (2018). Improved inception-residual convolutional neural network for object recognition. Neural Comput. Appl.","DOI":"10.1109\/IJCNN.2018.8489635"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"158","DOI":"10.1016\/j.patcog.2017.05.025","article-title":"Handcrafted vs. non-handcrafted features for computer vision classification","volume":"71","author":"Nanni","year":"2017","journal-title":"Pattern Recognit."},{"key":"ref_19","first-page":"6","article-title":"Classification And Detection of Nutritional Deficiencies in Coffee Plants Using Image Processing And Convolutional Neural Network (CNN)","volume":"9","author":"Lewis","year":"2020","journal-title":"Int. J. Sci. Technol. 
Res."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"5703","DOI":"10.1007\/s12652-020-01938-8","article-title":"Nitrogen Deficiency Prediction of Rice Crop Based on Convolutional Neural Network","volume":"11","author":"Sethy","year":"2020","journal-title":"J. Ambient. Intell. Humaniz. Comput."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Tran, T.T., Choi, J.W., Le, T.T.H., and Kim, J.W. (2019). A Comparative Study of Deep CNN in Forecasting and Classifying the Macronutrient Deficiencies on Development of Tomato Plant. Appl. Sci., 9.","DOI":"10.3390\/app9081601"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Wolf, T., Debut, L., Sanh, V., Chaumond, J., Delangue, C., Moi, A., Cistac, P., Rault, T., Louf, R., and Funtowicz, M. (2020, January 16\u201320). Transformers: State-of-the-Art Natural Language Processing. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, Online. Available online: https:\/\/aclanthology.org\/2020.emnlp-demos.6\/.","DOI":"10.18653\/v1\/2020.emnlp-demos.6"},{"key":"ref_23","first-page":"518","article-title":"Remote sensing for nitrogen management","volume":"57","author":"Scharf","year":"2002","journal-title":"J. Soil Water Conserv."},{"key":"ref_24","first-page":"103","article-title":"A visible band index for remote sensing leaf chlorophyll content at the canopy scale","volume":"21","author":"Hunt","year":"2013","journal-title":"Int. J. Appl. Earth Obs. Geoinf."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"719","DOI":"10.1134\/S1021443708060010","article-title":"Screening of visible and UV radiation as a photoprotective mechanism in plants","volume":"55","author":"Solovchenko","year":"2008","journal-title":"Russ. J. 
Plant Physiol."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"212","DOI":"10.2134\/agronj2003.2120","article-title":"Using leaf color charts to estimate leaf nitrogen status of rice","volume":"95","author":"Yang","year":"2003","journal-title":"Agron. J."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"275","DOI":"10.1016\/j.rse.2007.02.018","article-title":"LAI, fAPAR and fCover CYCLOPES global products derived from VEGETATION: Part 1: Principles of the algorithm","volume":"110","author":"Baret","year":"2007","journal-title":"Remote Sens. Environ."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"515","DOI":"10.1007\/s10712-018-9492-0","article-title":"Spaceborne imaging spectroscopy for sustainable agriculture: Contributions and challenges","volume":"40","author":"Hank","year":"2019","journal-title":"Surv. Geophys."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Lu, B., and He, Y. (2019). Evaluating empirical regression, machine learning, and radiative transfer modelling for estimating vegetation chlorophyll content using bi-seasonal hyperspectral images. Remote Sens., 11.","DOI":"10.3390\/rs11171979"},{"key":"ref_30","first-page":"102219","article-title":"RTM-based dynamic absorption integrals for the retrieval of biochemical vegetation traits","volume":"93","author":"Wocher","year":"2020","journal-title":"Int. J. Appl. Earth Obs. Geoinf."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1016\/0034-4257(84)90057-9","article-title":"Light scattering by leaf layers with application to canopy reflectance modeling: The SAIL model","volume":"16","author":"Verhoef","year":"1984","journal-title":"Remote Sens. Environ."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"3172","DOI":"10.1109\/JSTARS.2015.2422734","article-title":"Leaf nitrogen content indirectly estimated by leaf traits derived from the PROSPECT model","volume":"8","author":"Wang","year":"2015","journal-title":"IEEE J. 
Sel. Top. Appl. Earth Obs. Remote Sens."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Padilla, F.M., Gallardo, M., Pe\u00f1a-Fleitas, M.T., De Souza, R., and Thompson, R.B. (2018). Proximal optical sensors for nitrogen management of vegetable crops: A review. Sensors, 18.","DOI":"10.3390\/s18072083"},{"key":"ref_34","first-page":"344","article-title":"Remote estimation of crop and grass chlorophyll and nitrogen content using red-edge bands on Sentinel-2 and-3","volume":"23","author":"Clevers","year":"2013","journal-title":"Int. J. Appl. Earth Obs. Geoinf."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"63","DOI":"10.1016\/j.proenv.2016.03.057","article-title":"Nitrogen content estimation of rice crop based on near infrared (NIR) reflectance using artificial neural network (ANN)","volume":"33","author":"Afandi","year":"2016","journal-title":"Procedia Environ. Sci."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"213","DOI":"10.1016\/j.compag.2008.10.003","article-title":"New method to assess barley nitrogen nutrition status based on image colour analysis: Comparison with SPAD-502","volume":"65","author":"Pagola","year":"2009","journal-title":"Comput. Electron. Agric."},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Zha, H., Miao, Y., Wang, T., Li, Y., Zhang, J., Sun, W., Feng, Z., and Kusnierek, K. (2020). Improving unmanned aerial vehicle remote sensing-based rice nitrogen nutrition index prediction with machine learning. Remote Sens., 12.","DOI":"10.3390\/rs12020215"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"390","DOI":"10.3389\/fpls.2016.00390","article-title":"Predicting pre-planting risk of Stagonospora nodorum blotch in winter wheat using machine learning models","volume":"7","author":"Mehra","year":"2016","journal-title":"Front. 
Plant Sci."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"151","DOI":"10.1007\/s12892-011-0029-z","article-title":"Estimating canopy cover from color digital camera image of rice field","volume":"14","author":"Lee","year":"2011","journal-title":"J. Crop Sci. Biotechnol."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"221","DOI":"10.1016\/j.fcr.2010.05.011","article-title":"Estimating the nitrogen status of crops using a digital camera","volume":"118","author":"Li","year":"2010","journal-title":"Field Crops Res."},{"key":"ref_41","first-page":"502","article-title":"Estimating the Growth Indices and Nitrogen Status Based on Color Digital Image Analysis During Early Growth Period of Winter Wheat","volume":"12","author":"Zhao","year":"2021","journal-title":"Front. Plant Sci."},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"108650","DOI":"10.1016\/j.measurement.2020.108650","article-title":"A deep learning approach to measure stress level in plants due to Nitrogen deficiency","volume":"173","author":"Azimi","year":"2021","journal-title":"Measurement"},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.patcog.2017.05.015","article-title":"How deep learning extracts and learns leaf features for plant classification","volume":"71","author":"Lee","year":"2017","journal-title":"Pattern Recognit."},{"key":"ref_44","unstructured":"Islam, M.A., Jia, S., and Bruce, N.D.B. (2020). How Much Position Information Do Convolutional Neural Networks Encode?. arXiv."},{"key":"ref_45","unstructured":"Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L., and Polosukhin, I. (2017). Attention Is All You Need. arXiv."},{"key":"ref_46","unstructured":"Dong, Y., Cordonnier, J.B., and Loukas, A. (2021). Attention is Not All You Need: Pure Attention Loses Rank Doubly Exponentially with Depth. 
arXiv."},{"key":"ref_47","unstructured":"Touvron, H., Cord, M., Douze, M., Massa, F., Sablayrolles, A., and J\u00e9gou, H. (2021). Training data-efficient image transformers & distillation through attention. arXiv."},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Liu, X., Zhang, F., Hou, Z., Mian, L., Wang, Z., Zhang, J., and Tang, J. (2021). Self-supervised learning: Generative or contrastive. IEEE Trans. Knowl. Data Eng., 1.","DOI":"10.1109\/TKDE.2021.3090866"},{"key":"ref_49","unstructured":"Goodfellow, I.J., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., and Bengio, Y. (2014). Generative Adversarial Networks. arXiv."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"1735","DOI":"10.1109\/CVPR.2006.100","article-title":"Dimensionality Reduction by Learning an Invariant Mapping","volume":"Volume 2","author":"Hadsell","year":"2006","journal-title":"Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR\u201906)"},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"669","DOI":"10.1142\/S0218001493000339","article-title":"Signature verification using a \u201csiamese\u201d time delay neural network","volume":"7","author":"Bromley","year":"1993","journal-title":"Int. J. Pattern Recognit. Artif. Intell."},{"key":"ref_52","unstructured":"Grill, J.B., Strub, F., Altch\u00e9, F., Tallec, C., Richemond, P.H., Buchatskaya, E., Doersch, C., Pires, B.A., Guo, Z.D., and Azar, M.G. (2020). Bootstrap your own latent: A new approach to self-supervised Learning. arXiv."},{"key":"ref_53","unstructured":"Caron, M., Misra, I., Mairal, J., Goyal, P., Bojanowski, P., and Joulin, A. (2021). Unsupervised Learning of Visual Features by Contrasting Cluster Assignments. arXiv."},{"key":"ref_54","doi-asserted-by":"crossref","unstructured":"He, K., Fan, H., Wu, Y., Xie, S., and Girshick, R. (2020, January 14\u201319). 
Momentum Contrast for Unsupervised Visual Representation Learning. Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2020), Seattle, WA, USA.","DOI":"10.1109\/CVPR42600.2020.00975"},{"key":"ref_55","unstructured":"Chen, T., Kornblith, S., Norouzi, M., and Hinton, G. (2020). A Simple Framework for Contrastive Learning of Visual Representations. arXiv."},{"key":"ref_56","unstructured":"OpenDroneMap\/ODM (2022, January 26). A Command Line Toolkit to Generate Maps, Point Clouds, 3D Models and DEMs from Drone, Balloon or Kite Images. Available online: https:\/\/github.com\/OpenDroneMap\/ODM."},{"key":"ref_57","unstructured":"El-Nouby, A., Touvron, H., Caron, M., Bojanowski, P., Douze, M., Joulin, A., Laptev, I., Neverova, N., Synnaeve, G., and Verbeek, J. (2021). XCiT: Cross-Covariance Image Transformers. arXiv."},{"key":"ref_58","unstructured":"Ba, J.L., Kiros, J.R., and Hinton, G.E. (2016). Layer Normalization. arXiv."},{"key":"ref_59","unstructured":"Howard, A.G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv."},{"key":"ref_60","doi-asserted-by":"crossref","unstructured":"Zhang, L., Song, J., Gao, A., Chen, J., Bao, C., and Ma, K. (2019, January 27\u201328). Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation. Proceedings of the 2019 IEEE\/CVF International Conference on Computer Vision (ICCV), Seoul, Korea.","DOI":"10.1109\/ICCV.2019.00381"},{"key":"ref_61","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (July, January 26). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA."},{"key":"ref_62","unstructured":"Tan, M., and Le, Q. (2019, January 9\u201315). Efficientnet: Rethinking model scaling for convolutional neural networks. 
Proceedings of the International Conference on Machine Learning (PMLR), Long Beach, CA, USA."},{"key":"ref_63","doi-asserted-by":"crossref","unstructured":"Radosavovic, I., Kosaraju, R.P., Girshick, R., He, K., and Doll\u00e1r, P. (2020). Designing Network Design Spaces. arXiv.","DOI":"10.1109\/CVPR42600.2020.01044"},{"key":"ref_64","unstructured":"Tan, M., and Le, Q.V. (2021). EfficientNetV2: Smaller Models and Faster Training. arXiv."},{"key":"ref_65","doi-asserted-by":"crossref","unstructured":"Zhang, H., Cisse, M., Dauphin, Y.N., and Lopez-Paz, D. (2017). mixup: Beyond empirical risk minimization. arXiv.","DOI":"10.1007\/978-1-4899-7687-1_79"},{"key":"ref_66","unstructured":"M\u00fcller, R., Kornblith, S., and Hinton, G. (2019). When does label smoothing help?. arXiv."},{"key":"ref_67","doi-asserted-by":"crossref","unstructured":"Buslaev, A., Iglovikov, V.I., Khvedchenya, E., Parinov, A., Druzhinin, M., and Kalinin, A.A. (2020). Albumentations: Fast and Flexible Image Augmentations. Information, 11.","DOI":"10.3390\/info11020125"},{"key":"ref_68","doi-asserted-by":"crossref","first-page":"147","DOI":"10.1007\/s11119-019-09656-8","article-title":"Improving nitrogen assessment with an RGB camera across uncertain natural light from above-canopy measurements","volume":"21","author":"Putra","year":"2020","journal-title":"Precis. Agric."},{"key":"ref_69","doi-asserted-by":"crossref","first-page":"60","DOI":"10.1186\/s40537-019-0197-0","article-title":"A survey on image data augmentation for deep learning","volume":"6","author":"Shorten","year":"2019","journal-title":"J. Big Data"},{"key":"ref_70","doi-asserted-by":"crossref","first-page":"124","DOI":"10.1016\/j.compag.2016.01.020","article-title":"Optimal color space selection method for plant\/soil segmentation in agriculture","volume":"122","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_71","doi-asserted-by":"crossref","unstructured":"Dyson, J., Mancini, A., Frontoni, E., and Zingaretti, P. 
(2019). Deep learning for soil and crop segmentation from remotely sensed data. Remote Sens., 11.","DOI":"10.3390\/rs11161859"},{"key":"ref_72","doi-asserted-by":"crossref","first-page":"90","DOI":"10.1016\/j.eja.2018.08.010","article-title":"Remotely assessing photosynthetic nitrogen use efficiency with in situ hyperspectral remote sensing in winter wheat","volume":"101","author":"Zhang","year":"2018","journal-title":"Eur. J. Agron."},{"key":"ref_73","doi-asserted-by":"crossref","first-page":"858","DOI":"10.1080\/01431161.2019.1650984","article-title":"Quantitative analysis and hyperspectral remote sensing of the nitrogen nutrition index in winter wheat","volume":"41","author":"Liu","year":"2020","journal-title":"Int. J. Remote Sens."},{"key":"ref_74","first-page":"1837","article-title":"Estimation of winter wheat leaf nitrogen accumulation using machine learning algorithm and visible spectral","volume":"36","author":"Cui","year":"2016","journal-title":"Guang Pu Xue Yu Guang Pu Fen Xi = Guang Pu"},{"key":"ref_75","doi-asserted-by":"crossref","unstructured":"Shah, S.H., Angel, Y., Houborg, R., Ali, S., and McCabe, M.F. (2019). A random forest machine learning approach for the retrieval of leaf chlorophyll content in wheat. Remote Sens., 11.","DOI":"10.3390\/rs11080920"},{"key":"ref_76","unstructured":"AHDB (2022, January 26). Nutrient Management Guide (RB209)|AHDB. 
Available online: https:\/\/ahdb.org.uk\/nutrient-management-guide-rb209."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/6\/1400\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T22:36:04Z","timestamp":1760135764000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/6\/1400"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,3,14]]},"references-count":76,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2022,3]]}},"alternative-id":["rs14061400"],"URL":"https:\/\/doi.org\/10.3390\/rs14061400","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,3,14]]}}}