{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,5,16]],"date-time":"2025-05-16T04:08:06Z","timestamp":1747368486118,"version":"3.40.5"},"reference-count":0,"publisher":"University of Florida George A Smathers Libraries","license":[{"start":{"date-parts":[[2025,5,14]],"date-time":"2025-05-14T00:00:00Z","timestamp":1747180800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["FLAIRS"],"abstract":"<jats:p>Variational Autoencoders (VAEs) are popular Bayesian inference models that excel at approximating complex data distributions in a lower-dimensional latent space. Despite their widespread use, VAEs frequently face challenges in image generation, often resulting in blurry outputs. This outcome is primarily attributed to two factors: the inherent probabilistic nature of the VAE framework and the oversmoothing effect induced by the Kullback-Leibler (KL) divergence term in the loss function. This paper explores the integration of Wasserstein Distance into the VAE framework, resulting in a Wasserstein Autoencoder (WAE) designed to mitigate the oversmoothing issue and enhance the quality of generated images. We evaluated the proposed WAEs using the Fr\u00e9chet Inception Distance (FID), Inception Score (IS) and Structural Similarity Index Measure (SSIM). The experimental results on the CelebA dataset demonstrate that WAEs significantly outperform VAEs by 25% in FID, 13.6% in IS and 15.3% in SSIM. Additionally, the evaluation considers the issue of class imbalance in the ODIR dataset, where WAEs demonstrate superior accuracy and precision in classification tasks. Our findings highlight WAEs as a practical and efficient alternative to VAEs for image generation and reconstruction, particularly in resource-limited settings.<\/jats:p>","DOI":"10.32473\/flairs.38.1.139006","type":"journal-article","created":{"date-parts":[[2025,5,15]],"date-time":"2025-05-15T15:20:44Z","timestamp":1747322444000},"source":"Crossref","is-referenced-by-count":0,"title":["From KL Divergence to Wasserstein Distance: Enhancing Autoencoders with FID Analysis"],"prefix":"10.32473","volume":"38","author":[{"family":"Laxmi Kanta Poudel","sequence":"first","affiliation":[]},{"family":"Kshtiz Aryal","sequence":"additional","affiliation":[]},{"family":"Rajendra Bahadur Thapa","sequence":"additional","affiliation":[]},{"given":"Sushil","family":"Poudel","sequence":"additional","affiliation":[]}],"member":"17357","published-online":{"date-parts":[[2025,5,14]]},"container-title":["The International FLAIRS Conference Proceedings"],"original-title":[],"link":[{"URL":"https:\/\/journals.flvc.org\/FLAIRS\/article\/download\/139006\/144088","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.flvc.org\/FLAIRS\/article\/download\/139006\/144088","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,5,15]],"date-time":"2025-05-15T15:20:44Z","timestamp":1747322444000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.flvc.org\/FLAIRS\/article\/view\/139006"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,5,14]]},"references-count":0,"URL":"https:\/\/doi.org\/10.32473\/flairs.38.1.139006","relation":{},"ISSN":["2334-0762","2334-0754"],"issn-type":[{"value":"2334-0762","type":"electronic"},{"value":"2334-0754","type":"print"}],"subject":[],"published":{"date-parts":[[2025,5,14]]}}}