{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T13:46:10Z","timestamp":1776087970414,"version":"3.50.1"},"reference-count":26,"publisher":"SAGE Publications","issue":"5","license":[{"start":{"date-parts":[[2025,5,5]],"date-time":"2025-05-05T00:00:00Z","timestamp":1746403200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Computational Methods in Sciences and Engineering"],"published-print":{"date-parts":[[2025,9]]},"abstract":"<jats:p>Emotion analysis in literary texts is a complex task due to the intricate nature of language, rich contextual dependencies, and subtle emotional manifestations present in narratives, dialogues, and poetic structures. Traditional lexicon-based models frequently fail to capture the complexity of emotions in literature, leading to incorrect classifications. This study examines the ability of BERT to enhance emotion analysis for literary texts. An innovative Dynamic Honey Badger-tuned BERT (DHB-BERT) model is applied to effectively detect emotional changes in literary texts. Training and emotion analysis are ensured through a diverse dataset of Chinese literary works, with accurate samples to evaluate the model. Lemmatization and stemming serve as preprocessing techniques that reduce words to their most basic forms to preserve continuity. Then, feature extraction is performed using Word2Vec, which maintains relevant linkages and captures emotional fluctuations. Based on the sentiment analysis, texts are classified as positive, negative, neutral, or mixed. The Dynamic Honey Badger algorithm is employed for feature selection and adaptation of the dataset for improved emotion classification. The BERT model is then utilized to capture meaningful emotional variations across various literary forms. This simulation is executed using a Python platform. To evaluate the effectiveness of the simulation, various performance metrics are employed, including precision (94.76%), recall (95%), accuracy (95%), and F1-score (94.13%). Experimental results indicate that the DHB-BERT model significantly outperforms traditional algorithms, particularly in handling rhetorical language, irony, and nuanced emotional manifestations. This research highlights the potential of transformer-based architectures in advancing sentiment analysis for literary texts, providing valuable insights for computational linguistics, literary studies, and sentiment-aware systems.<\/jats:p>","DOI":"10.1177\/14727978251338004","type":"journal-article","created":{"date-parts":[[2025,5,5]],"date-time":"2025-05-05T19:57:17Z","timestamp":1746475037000},"page":"4650-4664","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":4,"title":["Improving sentiment analysis in literary texts through bidirectional encoder representations: A BERT-based approach"],"prefix":"10.1177","volume":"25","author":[{"ORCID":"https:\/\/orcid.org\/0009-0002-0559-0372","authenticated-orcid":false,"given":"Jiawen","family":"Feng","sequence":"first","affiliation":[{"name":"Xi\u2019an Kedagaoxin University, Xi\u2019an, China"}]}],"member":"179","published-online":{"date-parts":[[2025,5,5]]},"reference":[{"key":"e_1_3_4_2_2","doi-asserted-by":"publisher","DOI":"10.1515\/jisys-2022-0001"},{"key":"e_1_3_4_3_2","doi-asserted-by":"publisher","DOI":"10.1007\/s13198-025-02713-8"},{"key":"e_1_3_4_4_2","doi-asserted-by":"publisher","DOI":"10.1186\/s40537-025-01064-2"},{"key":"e_1_3_4_5_2","doi-asserted-by":"publisher","DOI":"10.3390\/info15110698"},{"key":"e_1_3_4_6_2","doi-asserted-by":"publisher","DOI":"10.17239\/jowr-2024.15.03.02"},{"key":"e_1_3_4_7_2","doi-asserted-by":"publisher","DOI":"10.1111\/ecin.13264"},{"key":"e_1_3_4_8_2","doi-asserted-by":"publisher","DOI":"10.1145\/3649451"},{"key":"e_1_3_4_9_2","doi-asserted-by":"publisher","DOI":"10.1038\/s41598-025-91622-8"},{"key":"e_1_3_4_10_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10579-023-09661-4"},{"key":"e_1_3_4_11_2","doi-asserted-by":"publisher","DOI":"10.1145\/3687304"},{"key":"e_1_3_4_12_2","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0307186"},{"key":"e_1_3_4_13_2","doi-asserted-by":"publisher","DOI":"10.3390\/s23115232"},{"key":"e_1_3_4_14_2","doi-asserted-by":"publisher","DOI":"10.1007\/s13278-022-00910-y"},{"key":"e_1_3_4_15_2","doi-asserted-by":"publisher","DOI":"10.3390\/s23010506"},{"key":"e_1_3_4_16_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2024.3422268"},{"key":"e_1_3_4_17_2","doi-asserted-by":"publisher","DOI":"10.3390\/data8030046"},{"key":"e_1_3_4_18_2","doi-asserted-by":"publisher","DOI":"10.1007\/s13278-023-01102-y"},{"key":"e_1_3_4_19_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.knosys.2022.108668"},{"key":"e_1_3_4_20_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.array.2022.100157"},{"key":"e_1_3_4_21_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.eswa.2024.125533"},{"key":"e_1_3_4_22_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.ipm.2022.102872"},{"key":"e_1_3_4_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2024.3426604"},{"key":"e_1_3_4_24_2"
,"doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2024.3478835"},{"key":"e_1_3_4_25_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10614-025-10901-8"},{"key":"e_1_3_4_26_2","doi-asserted-by":"publisher","DOI":"10.14569\/IJACSA.2022.01312112"},{"key":"e_1_3_4_27_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2024.3381515"}],"container-title":["Journal of Computational Methods in Sciences and Engineering"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/14727978251338004","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.1177\/14727978251338004","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/14727978251338004","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,13]],"date-time":"2026-02-13T16:31:13Z","timestamp":1771000273000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.1177\/14727978251338004"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,5,5]]},"references-count":26,"journal-issue":{"issue":"5","published-print":{"date-parts":[[2025,9]]}},"alternative-id":["10.1177\/14727978251338004"],"URL":"https:\/\/doi.org\/10.1177\/14727978251338004","relation":{},"ISSN":["1472-7978","1875-8983"],"issn-type":[{"value":"1472-7978","type":"print"},{"value":"1875-8983","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,5,5]]}}}