{"status":"ok","message-type":"work","message-version":"1.0.0","message":{
"indexed":{"date-parts":[[2026,1,8]],"date-time":"2026-01-08T07:03:40Z","timestamp":1767855820711,"version":"3.49.0"},
"reference-count":39,"publisher":"MDPI AG","issue":"12",
"license":[{"start":{"date-parts":[[2023,6,7]],"date-time":"2023-06-07T00:00:00Z","timestamp":1686096000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],
"funder":[
{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["62201149"],"award-info":[{"award-number":["62201149"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},
{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["X200051UZ200"],"award-info":[{"award-number":["X200051UZ200"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},
{"DOI":"10.13039\/100017130","name":"Ji Hua Laboratory","doi-asserted-by":"publisher","award":["62201149"],"award-info":[{"award-number":["62201149"]}],"id":[{"id":"10.13039\/100017130","id-type":"DOI","asserted-by":"publisher"}]},
{"DOI":"10.13039\/100017130","name":"Ji Hua Laboratory","doi-asserted-by":"publisher","award":["X200051UZ200"],"award-info":[{"award-number":["X200051UZ200"]}],"id":[{"id":"10.13039\/100017130","id-type":"DOI","asserted-by":"publisher"}]}],
"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],
"abstract":"<jats:p>Infrared (IR) and visible image fusion is an important data fusion and image processing technique that can accurately and comprehensively integrate the thermal radiation and texture details of source images. However, existing methods neglect the high-contrast fusion problem, leading to suboptimal fusion performance when the thermal radiation target information in IR images is replaced by high-contrast information in visible images. To address this limitation, we propose a contrast-balanced framework for IR and visible image fusion. Specifically, a novel contrast balance strategy is proposed to process visible images, reducing their energy while allowing detail compensation of overexposed areas. Moreover, a contrast-preserving guided filter is proposed to decompose the image into energy and detail layers, reducing high contrast and filtering information. To effectively extract the active information in the detail layer and the brightness information in the energy layer, we propose a new weighted energy-of-Laplacian operator and a Gaussian distribution of the image entropy scheme to fuse the detail and energy layers, respectively. The fused result is obtained by adding the fused detail and energy layers. Extensive experimental results demonstrate that the proposed method can effectively reduce high contrast and highlight target information in an image while simultaneously preserving details. In addition, the proposed method exhibits superior performance compared with state-of-the-art methods in both qualitative and quantitative assessments.<\/jats:p>",
"DOI":"10.3390\/rs15122969","type":"journal-article",
"created":{"date-parts":[[2023,6,8]],"date-time":"2023-06-08T02:02:28Z","timestamp":1686189748000},
"page":"2969","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":13,
"title":["CBFM: Contrast Balance Infrared and Visible Image Fusion Based on Contrast-Preserving Guided Filter"],
"prefix":"10.3390","volume":"15",
"author":[
{"ORCID":"https:\/\/orcid.org\/0000-0001-9859-8191","authenticated-orcid":false,"given":"Xilai","family":"Li","sequence":"first","affiliation":[{"name":"Guangdong-Hong Kong-Macao Joint Laboratory for Intelligent Micro-Nano Optoelectronic Technology, School of Physics and Optoelectronic Engineering, Foshan University, Foshan 528225, China"}]},
{"ORCID":"https:\/\/orcid.org\/0000-0003-4672-1527","authenticated-orcid":false,"given":"Xiaosong","family":"Li","sequence":"additional","affiliation":[{"name":"Guangdong-Hong Kong-Macao Joint Laboratory for Intelligent Micro-Nano Optoelectronic Technology, School of Physics and Optoelectronic Engineering, Foshan University, Foshan 528225, China"}]},
{"given":"Wuyang","family":"Liu","sequence":"additional","affiliation":[{"name":"Guangdong-Hong Kong-Macao Joint Laboratory for Intelligent Micro-Nano Optoelectronic Technology, School of Physics and Optoelectronic Engineering, Foshan University, Foshan 528225, China"}]}],
"member":"1968","published-online":{"date-parts":[[2023,6,7]]},
"reference":[
{"key":"ref_1","doi-asserted-by":"crossref","first-page":"107087","DOI":"10.1016\/j.knosys.2021.107087","article-title":"Joint image fusion and denoising via three-layer decomposition and sparse representation","volume":"224","author":"Li","year":"2021","journal-title":"Knowl.-Based Syst."},
{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Li, X., Wang, X., Cheng, X., Tan, H., and Li, X. (2022). Multi-Focus Image Fusion Based on Hessian Matrix Decomposition and Salient Difference Focus Detection. Entropy, 24.","DOI":"10.3390\/e24111527"},
{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Liu, X., Gao, H., Miao, Q., Xi, Y., Ai, Y., and Gao, D. (2022). MFST: Multi-Modal Feature Self-Adaptive Transformer for Infrared and Visible Image Fusion. Remote Sens., 14.","DOI":"10.3390\/rs14133233"},
{"key":"ref_4","doi-asserted-by":"crossref","first-page":"4070","DOI":"10.1109\/TIP.2021.3069339","article-title":"Different Input Resolutions and Arbitrary Output Resolution: A Meta Learning-Based Deep Framework for Infrared and Visible Image Fusion","volume":"30","author":"Li","year":"2021","journal-title":"IEEE Trans. Image Process."},
{"key":"ref_5","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1016\/j.inffus.2023.02.011","article-title":"Feature dynamic alignment and refinement for infrared\u2013visible image fusion: Translation robust fusion","volume":"95","author":"Li","year":"2023","journal-title":"Inf. Fusion"},
{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Qi, B., Jin, L., Li, G., Zhang, Y., Li, Q., Bi, G., and Wang, W. (2022). Infrared and visible image fusion based on co-occurrence analysis shearlet transform. Remote Sens., 14.","DOI":"10.3390\/rs14020283"},
{"key":"ref_7","first-page":"374","article-title":"Nonrigid feature matching for remote sensing images via probabilistic inference with global and local regularizations","volume":"13","author":"Zhou","year":"2016","journal-title":"IEEE Geosci. Remote Sens. Lett."},
{"key":"ref_8","doi-asserted-by":"crossref","first-page":"9645","DOI":"10.1109\/TIM.2020.3005230","article-title":"NestFuse: An Infrared and Visible Image Fusion Architecture Based on Nest Connection and Spatial\/Channel Attention Models","volume":"69","author":"Li","year":"2020","journal-title":"IEEE Trans. Instrum. Meas."},
{"key":"ref_9","doi-asserted-by":"crossref","first-page":"502","DOI":"10.1109\/TPAMI.2020.3012548","article-title":"U2Fusion: A Unified Unsupervised Image Fusion Network","volume":"44","author":"Xu","year":"2022","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},
{"key":"ref_10","doi-asserted-by":"crossref","first-page":"99","DOI":"10.1016\/j.inffus.2019.07.011","article-title":"IFCNN: A general image fusion framework based on convolutional neural network","volume":"54","author":"Zhang","year":"2020","journal-title":"Inf. Fusion"},
{"key":"ref_11","doi-asserted-by":"crossref","first-page":"105","DOI":"10.1109\/TCSVT.2021.3056725","article-title":"Learning a Deep Multi-Scale Feature Ensemble and an Edge-Attention Guidance for Image Fusion","volume":"32","author":"Liu","year":"2022","journal-title":"IEEE Trans. Circuits Syst. Video Technol."},
{"key":"ref_12","doi-asserted-by":"crossref","first-page":"64","DOI":"10.1016\/j.ins.2019.08.066","article-title":"Infrared and visible image fusion based on target-enhanced multiscale transform decomposition","volume":"508","author":"Chen","year":"2020","journal-title":"Inf. Sci."},
{"key":"ref_13","doi-asserted-by":"crossref","first-page":"104112","DOI":"10.1016\/j.infrared.2022.104112","article-title":"Infrared and visible image fusion based on relative total variation decomposition","volume":"123","author":"Chen","year":"2022","journal-title":"Infrared Phys. Technol."},
{"key":"ref_14","doi-asserted-by":"crossref","first-page":"1460","DOI":"10.1109\/TMM.2021.3065496","article-title":"A Total Variation With Joint Norms For Infrared and Visible Image Fusion","volume":"24","author":"Nie","year":"2022","journal-title":"IEEE Trans. Multimed."},
{"key":"ref_15","doi-asserted-by":"crossref","first-page":"41","DOI":"10.1016\/j.inffus.2021.04.005","article-title":"Attribute filter based infrared and visible image fusion","volume":"75","author":"Mo","year":"2021","journal-title":"Inf. Fusion"},
{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Li, H., and Wu, X.-J. (2018). Infrared and visible image fusion using latent low-rank representation. arXiv.","DOI":"10.1109\/ICPR.2018.8546006"},
{"key":"ref_17","doi-asserted-by":"crossref","first-page":"104129","DOI":"10.1016\/j.infrared.2022.104129","article-title":"Infrared polarization and intensity image fusion method based on multi-decomposition LatLRR","volume":"123","author":"Liu","year":"2022","journal-title":"Infrared Phys. Technol."},
{"key":"ref_18","doi-asserted-by":"crossref","first-page":"3963010","DOI":"10.1109\/JPHOT.2022.3227159","article-title":"Underwater Image Enhancement Based on Color Balance and Multi-Scale Fusion","volume":"14","author":"Chen","year":"2022","journal-title":"IEEE Photonics J."},
{"key":"ref_19","doi-asserted-by":"crossref","first-page":"3934","DOI":"10.1109\/TMM.2021.3110483","article-title":"Joint Contrast Enhancement and Exposure Fusion for Real-World Image Dehazing","volume":"24","author":"Liu","year":"2022","journal-title":"IEEE Trans. Multimed."},
{"key":"ref_20","doi-asserted-by":"crossref","first-page":"4832","DOI":"10.1109\/TIP.2020.2975909","article-title":"Lower Bound on Transmission Using Non-Linear Bounding Function in Single Image Dehazing","volume":"29","author":"Raikwar","year":"2020","journal-title":"IEEE Trans. Image Process."},
{"key":"ref_21","doi-asserted-by":"crossref","first-page":"130","DOI":"10.1016\/j.patcog.2018.02.005","article-title":"Joint medical image fusion, denoising and enhancement via discriminative low-rank sparse dictionaries learning","volume":"79","author":"Li","year":"2018","journal-title":"Pattern Recognit."},
{"key":"ref_22","doi-asserted-by":"crossref","first-page":"3064","DOI":"10.1364\/AO.58.003064","article-title":"Infrared and visible image perceptive fusion through multi-level Gaussian curvature filtering image decomposition","volume":"58","author":"Tan","year":"2019","journal-title":"Appl. Opt."},
{"key":"ref_23","doi-asserted-by":"crossref","first-page":"2864","DOI":"10.1109\/TIP.2013.2244222","article-title":"Image Fusion with Guided Filtering","volume":"22","author":"Li","year":"2013","journal-title":"IEEE Trans. Image Process."},
{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1786","DOI":"10.1109\/TIP.2017.2658954","article-title":"Curvature Filters Efficiently Reduce Certain Variational Energies","volume":"26","author":"Gong","year":"2017","journal-title":"IEEE Trans. Image Process."},
{"key":"ref_25","doi-asserted-by":"crossref","first-page":"120","DOI":"10.1109\/TIP.2014.2371234","article-title":"Weighted Guided Image Filtering","volume":"24","author":"Li","year":"2015","journal-title":"IEEE Trans. Image Process."},
{"key":"ref_26","doi-asserted-by":"crossref","first-page":"109","DOI":"10.1016\/j.inffus.2021.02.008","article-title":"An infrared and visible image fusion method based on multi-scale transformation and norm optimization","volume":"71","author":"Li","year":"2021","journal-title":"Inf. Fusion"},
{"key":"ref_27","first-page":"838","article-title":"No-Reference Quality Assessment of Contrast-Distorted Images Based on Natural Scene Statistics","volume":"22","author":"Fang","year":"2015","journal-title":"IEEE Signal Process. Lett."},
{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Ou, F.-Z., Wang, Y.-G., and Zhu, G. (2019, January 22\u201325). A novel blind image quality assessment method based on refined natural scene statistics. Proceedings of the 2019 IEEE International Conference on Image Processing (ICIP), Taipei, Taiwan.","DOI":"10.1109\/ICIP.2019.8803047"},
{"key":"ref_29","doi-asserted-by":"crossref","first-page":"493","DOI":"10.1016\/j.patrec.2006.09.005","article-title":"Evaluation of focus measures in multi-focus image fusion","volume":"28","author":"Huang","year":"2007","journal-title":"Pattern Recognit. Lett."},
{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Fredembach, C., and S\u00fcsstrunk, S. (2008, January 10\u201315). Colouring the near infrared. Proceedings of the IS&T\/SID 16th Color Imaging Conference, Portland, OH, USA.","DOI":"10.2352\/CIC.2008.16.1.art00034"},
{"key":"ref_31","doi-asserted-by":"crossref","first-page":"477","DOI":"10.1016\/j.inffus.2022.10.034","article-title":"DIVFusion: Darkness-free infrared and visible image fusion","volume":"91","author":"Tang","year":"2023","journal-title":"Inf. Fusion"},
{"key":"ref_32","first-page":"5005014","article-title":"GANMcC: A Generative Adversarial Network With Multiclassification Constraints for Infrared and Visible Image Fusion","volume":"70","author":"Ma","year":"2021","journal-title":"IEEE Trans. Instrum. Meas."},
{"key":"ref_33","doi-asserted-by":"crossref","first-page":"635","DOI":"10.1109\/TMM.2021.3129609","article-title":"Semantic-supervised Infrared and Visible Image Fusion via a Dual-discriminator Generative Adversarial Network","volume":"25","author":"Zhou","year":"2021","journal-title":"IEEE Trans. Multimed."},
{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Wang, D., Liu, J., Fan, X., and Liu, R. (2022). Unsupervised misaligned infrared and visible image fusion via cross-modality image generation and registration. arXiv.","DOI":"10.24963\/ijcai.2022\/487"},
{"key":"ref_35","doi-asserted-by":"crossref","first-page":"308","DOI":"10.1049\/el:20000267","article-title":"Objective image fusion performance measure","volume":"36","author":"Xydeas","year":"2000","journal-title":"Electron. Lett."},
{"key":"ref_36","doi-asserted-by":"crossref","first-page":"94","DOI":"10.1109\/TPAMI.2011.109","article-title":"Objective Assessment of Multiresolution Image Fusion Algorithms for Context Enhancement in Night Vision: A Comparative Study","volume":"34","author":"Liu","year":"2012","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},
{"key":"ref_37","doi-asserted-by":"crossref","first-page":"189","DOI":"10.1109\/TNN.2008.2005601","article-title":"Normalized Mutual Information Feature Selection","volume":"20","author":"Estevez","year":"2009","journal-title":"IEEE Trans. Neural Netw."},
{"key":"ref_38","doi-asserted-by":"crossref","first-page":"199","DOI":"10.1016\/j.optcom.2014.12.032","article-title":"Detail preserved fusion of visible and infrared images using regional saliency extraction and multi-scale image decomposition","volume":"341","author":"Cui","year":"2015","journal-title":"Opt. Commun."},
{"key":"ref_39","doi-asserted-by":"crossref","first-page":"177","DOI":"10.1016\/j.inffus.2005.04.003","article-title":"A new metric based on extended spatial frequency and its application to DWT based fusion algorithms","volume":"8","author":"Zheng","year":"2007","journal-title":"Inf. Fusion"}],
"container-title":["Remote Sensing"],"original-title":[],"language":"en",
"link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/12\/2969\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],
"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T19:50:00Z","timestamp":1760125800000},
"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/12\/2969"}},
"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,6,7]]},"references-count":39,
"journal-issue":{"issue":"12","published-online":{"date-parts":[[2023,6]]}},
"alternative-id":["rs15122969"],"URL":"https:\/\/doi.org\/10.3390\/rs15122969","relation":{},
"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],
"published":{"date-parts":[[2023,6,7]]}}}