{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,2]],"date-time":"2026-02-02T13:38:12Z","timestamp":1770039492197,"version":"3.49.0"},"reference-count":48,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2021,4,14]],"date-time":"2021-04-14T00:00:00Z","timestamp":1618358400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["41771104"],"award-info":[{"award-number":["41771104"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Sentinel-2 images have been widely used in studying land surface phenomena and processes, but they inevitably suffer from cloud contamination. To solve this critical optical data availability issue, it is ideal to fuse Sentinel-1 and Sentinel-2 images to create fused, cloud-free Sentinel-2-like images for facilitating land surface applications. In this paper, we propose a new data fusion model, the Multi-channels Conditional Generative Adversarial Network (MCcGAN), based on the conditional generative adversarial network, which is able to convert images from Domain A to Domain B. With the model, we were able to generate fused, cloud-free Sentinel-2-like images for a target date by using a pair of reference Sentinel-1\/Sentinel-2 images and target-date Sentinel-1 images as inputs. In order to demonstrate the superiority of our method, we also compared it with other state-of-the-art methods using the same data. 
To make the evaluation more objective and reliable, we calculated the root-mean-square error (RMSE), R2, Kling\u2013Gupta efficiency (KGE), structural similarity index (SSIM), spectral angle mapper (SAM), and peak signal-to-noise ratio (PSNR) of the simulated Sentinel-2 images generated by different methods. The results show that the simulated Sentinel-2 images generated by the MCcGAN have a higher quality and accuracy than those produced via the previous methods.<\/jats:p>","DOI":"10.3390\/rs13081512","type":"journal-article","created":{"date-parts":[[2021,4,14]],"date-time":"2021-04-14T15:30:39Z","timestamp":1618414239000},"page":"1512","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":8,"title":["Deriving Non-Cloud Contaminated Sentinel-2 Images with RGB and Near-Infrared Bands from Sentinel-1 Images Based on a Conditional Generative Adversarial Network"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-5109-3812","authenticated-orcid":false,"given":"Quan","family":"Xiong","sequence":"first","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"},{"name":"Center for Spatial Information Science and Systems, George Mason University, 4400 University Dr., Fairfax, VA 22030, USA"}]},{"given":"Liping","family":"Di","sequence":"additional","affiliation":[{"name":"Center for Spatial Information Science and Systems, George Mason University, 4400 University Dr., Fairfax, VA 22030, USA"}]},{"given":"Quanlong","family":"Feng","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"},{"name":"Key Laboratory of Remote Sensing for Agri-Hazards, Ministry of Agriculture, Beijing 100083, China"},{"name":"Key Laboratory of Agricultural Land Quality and Monitoring, Ministry of Natural Resources, Beijing 100083, 
China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7025-2137","authenticated-orcid":false,"given":"Diyou","family":"Liu","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8755-4115","authenticated-orcid":false,"given":"Wei","family":"Liu","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"}]},{"given":"Xuli","family":"Zan","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"}]},{"given":"Lin","family":"Zhang","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"}]},{"given":"Dehai","family":"Zhu","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"},{"name":"Key Laboratory of Remote Sensing for Agri-Hazards, Ministry of Agriculture, Beijing 100083, China"},{"name":"Key Laboratory of Agricultural Land Quality and Monitoring, Ministry of Natural Resources, Beijing 100083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8214-8345","authenticated-orcid":false,"given":"Zhe","family":"Liu","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"},{"name":"Key Laboratory of Remote Sensing for Agri-Hazards, Ministry of Agriculture, Beijing 100083, China"},{"name":"Key Laboratory of Agricultural Land Quality and Monitoring, Ministry of Natural Resources, Beijing 100083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-8068-9415","authenticated-orcid":false,"given":"Xiaochuang","family":"Yao","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural 
University, Beijing 100083, China"},{"name":"Key Laboratory of Remote Sensing for Agri-Hazards, Ministry of Agriculture, Beijing 100083, China"},{"name":"Key Laboratory of Agricultural Land Quality and Monitoring, Ministry of Natural Resources, Beijing 100083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6347-4973","authenticated-orcid":false,"given":"Xiaodong","family":"Zhang","sequence":"additional","affiliation":[{"name":"College of Land Science and Technology, China Agricultural University, Beijing 100083, China"},{"name":"Key Laboratory of Remote Sensing for Agri-Hazards, Ministry of Agriculture, Beijing 100083, China"},{"name":"Key Laboratory of Agricultural Land Quality and Monitoring, Ministry of Natural Resources, Beijing 100083, China"}]}],"member":"1968","published-online":{"date-parts":[[2021,4,14]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"37","DOI":"10.1109\/MGRS.2014.2319270","article-title":"The European Space Agency\u2019s Earth Observation Program","volume":"2","author":"Desnos","year":"2014","journal-title":"IEEE Geosci. Remote Sens. Mag."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Ren, T., Liu, Z., Zhang, L., Liu, D., Xi, X., Kang, Y., Zhao, Y., Zhang, C., Li, S., and Zhang, X. (2020). Early Identification of Seed Maize and Common Maize Production Fields Using Sentinel-2 Images. Remote Sens., 12.","DOI":"10.3390\/rs12132140"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"16062","DOI":"10.3390\/rs71215815","article-title":"Building a data set over 12 globally distributed sites to support the development of agriculture monitoring applications with Sentinel-2","volume":"7","author":"Bontemps","year":"2015","journal-title":"Remote Sens."},{"key":"ref_4","unstructured":"Jel\u00ednek, Z., Ma\u0161ek, J., Star\u1ef3, K., Luk\u00e1\u0161, J., and Kumh\u00e1lov\u00e1, J. (2020, August 10). 
Winter wheat, Winter Rape and Poppy Crop Growth Evaluation with the Help of Remote and Proximal Sensing Measurements. Available online: https:\/\/doi.org\/10.15159\/ar.20.176."},{"key":"ref_5","first-page":"379","article-title":"Estimating Grassland Parameters from Sentinel-2: A Model Comparison Study","volume":"88","author":"Schwieder","year":"2020","journal-title":"PFG J. Photogramm. Remote Sens. Geoinf. Sci."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1437","DOI":"10.3390\/w7041437","article-title":"Urban flood mapping based on unmanned aerial vehicle remote sensing and random forest classifier\u2014A case of Yuyao, China","volume":"7","author":"Feng","year":"2015","journal-title":"Water"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Yang, N., Liu, D., Feng, Q., Xiong, Q., Zhang, L., Ren, T., Zhao, Y., Zhu, D., and Huang, J. (2019). Large-scale crop mapping based on machine learning and parallel computation with grids. Remote Sens., 11.","DOI":"10.3390\/rs11121500"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"112001","DOI":"10.1016\/j.rse.2020.112001","article-title":"Thick cloud removal in Landsat images based on autoregression of Landsat time-series data","volume":"249","author":"Cao","year":"2020","journal-title":"Remote Sens. Environ."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Zhang, L., Liu, Z., Liu, D., Xiong, Q., Yang, N., Ren, T., Zhang, C., Zhang, X., and Li, S. (2019). Crop Mapping Based on Historical Samples and New Training Samples Generation in Heilongjiang Province, China. Sustainability, 11.","DOI":"10.3390\/su11185052"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Tan, Z., Yue, P., Di, L., and Tang, J. (2018). Deriving high spatiotemporal remote sensing images using deep convolutional network. 
Remote Sens., 10.","DOI":"10.3390\/rs10071066"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"527","DOI":"10.3390\/rs10040527","article-title":"Spatiotemporal fusion of multisource remote sensing data: Literature survey, taxonomy, principles, applications, and future directions","volume":"10","author":"Zhu","year":"2018","journal-title":"Remote Sens."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1988","DOI":"10.1016\/j.rse.2009.05.011","article-title":"Generation of dense time series synthetic Landsat data through data blending with MODIS using a spatial and temporal adaptive reflectance fusion model","volume":"113","author":"Hilker","year":"2009","journal-title":"Remote Sens. Environ."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"55","DOI":"10.1016\/j.rse.2014.02.003","article-title":"Generating daily land surface temperature at Landsat resolution by fusing Landsat and MODIS data","volume":"145","author":"Weng","year":"2014","journal-title":"Remote Sens. Environ."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Feng, Q., Yang, J., Zhu, D., Liu, J., Guo, H., Bayartungalag, B., and Li, B. (2019). Integrating multitemporal sentinel-1\/2 data for coastal land cover classification using a multibranch convolutional neural network: A case of the Yellow River Delta. Remote Sens., 11.","DOI":"10.3390\/rs11091006"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1016\/j.isprsjprs.2020.04.007","article-title":"Unsupervised change detection between SAR images based on hypergraphs","volume":"164","author":"Wang","year":"2020","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"9","DOI":"10.1016\/j.rse.2011.05.028","article-title":"GMES Sentinel-1 mission","volume":"120","author":"Torres","year":"2012","journal-title":"Remote Sens. 
Environ."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"60338","DOI":"10.1109\/ACCESS.2020.2977103","article-title":"A SAR-to-Optical Image Translation Method Based on Conditional Generation Adversarial Network (cGAN)","volume":"8","author":"Li","year":"2020","journal-title":"IEEE Access"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Fuentes Reyes, M., Auer, S., Merkle, N., Henry, C., and Schmitt, M. (2019). Sar-to-optical image translation based on conditional generative adversarial networks\u2014Optimization, opportunities and limits. Remote Sens., 11.","DOI":"10.3390\/rs11172067"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"504","DOI":"10.1126\/science.1127647","article-title":"Reducing the dimensionality of data with neural networks","volume":"313","author":"Hinton","year":"2006","journal-title":"Science"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Wang, P., and Patel, V.M. (2018, January 23\u201327). Generating high quality visible images from SAR images using CNNs. Proceedings of the 2018 IEEE Radar Conference (RadarConf18), Oklahoma City, OK, USA.","DOI":"10.1109\/RADAR.2018.8378622"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Feng, Q., Yang, J., Liu, Y., Ou, C., Zhu, D., Niu, B., Liu, J., and Li, B. (2020). Multi-Temporal Unmanned Aerial Vehicle Remote Sensing for Vegetable Mapping Using an Attention-Based Recurrent Convolutional Neural Network. Remote Sens., 12.","DOI":"10.3390\/rs12101668"},{"key":"ref_22","unstructured":"Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., and Bengio, Y. (2014). Generative adversarial nets. Advances in Neural Information Processing Systems, MIT Press."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Isola, P., Zhu, J.Y., Zhou, T., and Efros, A.A. (2017, January 21\u201326). Image-to-image translation with conditional adversarial networks. 
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.632"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Zhu, J.Y., Park, T., Isola, P., and Efros, A.A. (2017, January 22\u201329). Unpaired image-to-image translation using cycle-consistent adversarial networks. Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.244"},{"key":"ref_25","unstructured":"Fu, S., Xu, F., and Jin, Y.Q. (2019). Reciprocal translation between SAR and optical remote sensing images with cascaded-residual adversarial networks. arXiv."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Yi, Z., Zhang, H., Tan, P., and Gong, M. (2017, January 22\u201329). Dualgan: Unsupervised dual learning for image-to-image translation. Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.310"},{"key":"ref_27","unstructured":"Radford, A., Metz, L., and Chintala, S. (2015). Unsupervised representation learning with deep convolutional generative adversarial networks. arXiv."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Huang, B., Zhi, L., Yang, C., Sun, F., and Song, Y. (2020, January 1\u20135). Single Satellite Optical Imagery Dehazing using SAR Image Prior Based on conditional Generative Adversarial Networks. Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Snowmass Village, CO, USA.","DOI":"10.1109\/WACV45572.2020.9093471"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Liu, L., and Lei, B. (2018, January 22\u201327). Can SAR images and optical images transfer with each other?. Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain.","DOI":"10.1109\/IGARSS.2018.8518921"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Grohnfeldt, C., Schmitt, M., and Zhu, X. 
(2018, January 22\u201327). A conditional generative adversarial network to fuse sar and multispectral optical data for cloud removal from sentinel-2 images. Proceedings of the IGARSS 2018\u20142018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain.","DOI":"10.1109\/IGARSS.2018.8519215"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"129136","DOI":"10.1109\/ACCESS.2019.2939649","article-title":"SAR-to-optical image translation using supervised cycle-consistent adversarial networks","volume":"7","author":"Wang","year":"2019","journal-title":"IEEE Access"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1811","DOI":"10.1109\/JSTARS.2018.2803212","article-title":"Exploring the potential of conditional adversarial networks for optical and SAR image matching","volume":"11","author":"Merkle","year":"2018","journal-title":"IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Enomoto, K., Sakurada, K., Wang, W., Kawaguchi, N., Matsuoka, M., and Nakamura, R. (2018, January 22\u201327). Image translation between SAR and optical imagery with generative adversarial nets. Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain.","DOI":"10.1109\/IGARSS.2018.8518719"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"373","DOI":"10.1016\/j.isprsjprs.2020.06.021","article-title":"Thin cloud removal in optical remote sensing images based on generative adversarial networks and physical model of cloud distortion","volume":"166","author":"Li","year":"2020","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Wang, X., Xu, G., Wang, Y., Lin, D., Li, P., and Lin, X. (August, January 28). Thin and Thick Cloud Removal on Remote Sensing Image by Conditional Generative Adversarial Network. 
Proceedings of the IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan.","DOI":"10.1109\/IGARSS.2019.8897958"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Gao, J., Yuan, Q., Li, J., Zhang, H., and Su, X. (2020). Cloud Removal with Fusion of High Resolution Optical and SAR Images Using Generative Adversarial Networks. Remote Sens., 12.","DOI":"10.3390\/rs12010191"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"333","DOI":"10.1016\/j.isprsjprs.2020.05.013","article-title":"Cloud removal in Sentinel-2 imagery using a deep residual neural network and SAR-optical data fusion","volume":"166","author":"Meraner","year":"2020","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"4956","DOI":"10.1002\/mp.14427","article-title":"Deep Learning Segmentation of General Interventional Tools in Two-dimensional Ultrasound Images","volume":"47","author":"Gillies","year":"2020","journal-title":"Med. Phys."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"299","DOI":"10.1016\/j.compeleceng.2019.04.012","article-title":"HIC-net: A deep convolutional neural network model for classification of histopathological breast images","volume":"76","author":"Akdemir","year":"2019","journal-title":"Comput. Electr. Eng."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"5","DOI":"10.5194\/isprs-annals-IV-1-5-2018","article-title":"Sar to optical image synthesis for cloud removal with generative adversarial networks","volume":"4","author":"Bermudez","year":"2018","journal-title":"ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci."},{"key":"ref_41","unstructured":"Mirza, M., and Osindero, S. (2014). Conditional generative adversarial nets. arXiv."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Ronneberger, O., Fischer, P., and Brox, T. (2015). U-net: Convolutional networks for biomedical image segmentation. 
International Conference on Medical Image Computing and Computer-Assisted Intervention, Springer.","DOI":"10.1007\/978-3-319-24574-4_28"},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"He, W., and Yokoya, N. (2018). Multi-temporal sentinel-1 and-2 data fusion for optical image simulation. ISPRS Int. J. Geo-Inf., 7.","DOI":"10.3390\/ijgi7100389"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2016, January 27\u201330). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"80","DOI":"10.1016\/j.jhydrol.2009.08.003","article-title":"Decomposition of the mean squared error and NSE performance criteria: Implications for improving hydrological modelling","volume":"377","author":"Gupta","year":"2009","journal-title":"J. Hydrol."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"600","DOI":"10.1109\/TIP.2003.819861","article-title":"Image quality assessment: From error visibility to structural similarity","volume":"13","author":"Wang","year":"2004","journal-title":"IEEE Trans. Image Process."},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Hore, A., and Ziou, D. (2010, January 23\u201326). Image quality metrics: PSNR vs. SSIM. Proceedings of the 2010 20th International Conference on Pattern Recognition, Istanbul, Turkey.","DOI":"10.1109\/ICPR.2010.579"},{"key":"ref_48","unstructured":"Yuhas, R.H., Goetz, A.F., and Boardman, J.W. (2020, August 10). Discrimination among Semi-Arid Landscape Endmembers Using the Spectral Angle Mapper (SAM) Algorithm. 
Available online: https:\/\/core.ac.uk\/download\/pdf\/42789956.pdf."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/8\/1512\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T05:47:59Z","timestamp":1760161679000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/8\/1512"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,4,14]]},"references-count":48,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2021,4]]}},"alternative-id":["rs13081512"],"URL":"https:\/\/doi.org\/10.3390\/rs13081512","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,4,14]]}}}