{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,8]],"date-time":"2026-03-08T20:29:37Z","timestamp":1773001777107,"version":"3.50.1"},"reference-count":43,"publisher":"Wiley","issue":"1","license":[{"start":{"date-parts":[[2025,10,16]],"date-time":"2025-10-16T00:00:00Z","timestamp":1760572800000},"content-version":"vor","delay-in-days":288,"URL":"http:\/\/creativecommons.org\/licenses\/by\/4.0\/"},{"start":{"date-parts":[[2025,1,1]],"date-time":"2025-01-01T00:00:00Z","timestamp":1735689600000},"content-version":"tdm","delay-in-days":0,"URL":"http:\/\/doi.wiley.com\/10.1002\/tdm_license_1.1"}],"content-domain":{"domain":["onlinelibrary.wiley.com"],"crossmark-restriction":true},"short-container-title":["International Journal of Intelligent Systems"],"published-print":{"date-parts":[[2025,1]]},"abstract":"<jats:p>With the exponential growth of multimedia content, visual sentiment classification has emerged as a significant research area. However, it poses unique challenges due to the complexity and subjective nature of the visual information. This can be attributed to the significant presence of semantically ambiguous images within the current benchmark datasets, which enhances the performance of sentiment analysis but ignores the differences between various annotators. Moreover, most current methods concentrate on improving local emotional representations that focus on object extraction procedures rather than utilizing robust features that can effectively indicate the relevance of objects within an image through color information. Motivated by these observations, this paper addresses the need for efficient algorithms for labeling and classifying sentiment from visual images by introducing a novel hybrid model, which combines content\u2010based image retrieval (CBIR) and a multi\u2010input convolutional neural network (CNN). 
The CBIR model extracts color features from all dataset images, creating a numerical representation of each, and compares a query image\u2019s features against them to retrieve similar images. This process continues until the images are grouped by color similarity, enabling accurate sentiment categories based on shared features and feelings. A multi\u2010input CNN model is then used to extract and efficiently incorporate high\u2010level contextual visual information. This model comprises 70 layers, with six branches of 11 layers each. It facilitates the fusion of complementary information by accepting multiple input categories that differ according to the color features extracted by the CBIR technique, enabling the model to fully understand the target and generate more precise predictions. The proposed model demonstrates significant improvements over existing algorithms, as evidenced by evaluations on six benchmark datasets of varying sizes. It outperforms the state of the art in sentiment classification accuracy, achieving 87.88%, 84.62%, 84.1%, 83.7%, 80.7%, and 91.2% on the EmotionROI, ArtPhoto, Twitter I, Twitter II, Abstract, and FI datasets, respectively. Furthermore, evaluation on two newly collected large datasets confirms its scalability and robustness in large\u2010scale sentiment classification, with accuracies of 85.21% and 83.72% on the BGETTY and Twitter datasets, respectively. 
This paper contributes to the advancement of visual sentiment classification by offering a comprehensive solution for analyzing sentiment from images and laying the foundation for further research.<\/jats:p>","DOI":"10.1155\/int\/5581601","type":"journal-article","created":{"date-parts":[[2025,10,17]],"date-time":"2025-10-17T06:33:46Z","timestamp":1760682826000},"update-policy":"https:\/\/doi.org\/10.1002\/crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Hybrid Model for Visual Sentiment Classification Using Content\u2010Based Image Retrieval and Multi\u2010Input Convolutional Neural Network"],"prefix":"10.1155","volume":"2025","author":[{"ORCID":"https:\/\/orcid.org\/0009-0008-3563-2266","authenticated-orcid":false,"given":"Israa K.","family":"Salman Al-Tameemi","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8548-976X","authenticated-orcid":false,"given":"Mohammad-Reza","family":"Feizi-Derakhshi","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4411-6761","authenticated-orcid":false,"given":"Zari","family":"Farhadi","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2445-0033","authenticated-orcid":false,"given":"Amir-Reza","family":"Feizi-Derakhshi","sequence":"additional","affiliation":[]}],"member":"311","published-online":{"date-parts":[[2025,10,16]]},"reference":[{"key":"e_1_2_12_1_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.array.2022.100157"},{"key":"e_1_2_12_2_2","doi-asserted-by":"publisher","DOI":"10.32604\/CMC.2023.030262"},{"key":"e_1_2_12_3_2","doi-asserted-by":"publisher","DOI":"10.32604\/CMC.2023.031867"},{"key":"e_1_2_12_4_2","doi-asserted-by":"publisher","DOI":"10.3390\/SU15065003"},{"key":"e_1_2_12_5_2","doi-asserted-by":"publisher","DOI":"10.37398\/jsr.2022.660213"},{"key":"e_1_2_12_6_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.3009482"},{"key":"e_1_2_12_7_2","doi-asserted-by":"publisher","DOI":"10.110
9\/TIP.2016.2612829"},{"key":"e_1_2_12_8_2","doi-asserted-by":"publisher","DOI":"10.1016\/J.KNOSYS.2019.105245"},{"key":"e_1_2_12_9_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11063-019-10027-7"},{"key":"e_1_2_12_10_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICIP.2016.7532430"},{"key":"e_1_2_12_11_2","first-page":"477","article-title":"A Deep Study of Content Based Image Retrieval System Using Sentiment Analysis","volume":"7","author":"Munjal M. N.","year":"2018","journal-title":"Int. J. Eng. Sci. Math"},{"key":"e_1_2_12_12_2","doi-asserted-by":"publisher","DOI":"10.1007\/S13735-023-00292-7"},{"key":"e_1_2_12_13_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.inffus.2018.11.004"},{"key":"e_1_2_12_14_2","doi-asserted-by":"publisher","DOI":"10.4218\/etrij.2022-0071"},{"key":"e_1_2_12_15_2","doi-asserted-by":"publisher","DOI":"10.1007\/s42979-021-00529-4"},{"key":"e_1_2_12_16_2","doi-asserted-by":"publisher","DOI":"10.1007\/s13369-020-04384-y"},{"key":"e_1_2_12_17_2","doi-asserted-by":"publisher","DOI":"10.1002\/CPE.6851"},{"key":"e_1_2_12_18_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-981-16-6289-8_37"},{"key":"e_1_2_12_19_2","doi-asserted-by":"publisher","DOI":"10.1007\/s41870-021-00806-8"},{"key":"e_1_2_12_20_2","doi-asserted-by":"publisher","DOI":"10.1016\/J.INS.2023.118938"},{"key":"e_1_2_12_21_2","doi-asserted-by":"publisher","DOI":"10.32604\/cmc.2023.040997"},{"key":"e_1_2_12_22_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11042-022-11982-5"},{"key":"e_1_2_12_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICSPC51351.2021.9451669"},{"key":"e_1_2_12_24_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10462-022-10212-6"},{"key":"e_1_2_12_25_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICRITO48877.2020.9197899"},{"key":"e_1_2_12_26_2","doi-asserted-by":"publisher","DOI":"10.1145\/3484274.3484296"},{"key":"e_1_2_12_27_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.3024948"},{"key":"e_1_2_12_28_2","doi-asserted-by":
"publisher","DOI":"10.1007\/s11042-016-4310-5"},{"key":"e_1_2_12_29_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.2999128"},{"key":"e_1_2_12_30_2","doi-asserted-by":"publisher","DOI":"10.1007\/s00530-020-00656-7"},{"key":"e_1_2_12_31_2","doi-asserted-by":"publisher","DOI":"10.1109\/TMM.2018.2803520"},{"key":"e_1_2_12_32_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2018.05.104"},{"key":"e_1_2_12_33_2","doi-asserted-by":"publisher","DOI":"10.3390\/s21062136"},{"key":"e_1_2_12_34_2","doi-asserted-by":"publisher","DOI":"10.1016\/J.NEUCOM.2018.12.053"},{"key":"e_1_2_12_35_2","doi-asserted-by":"publisher","DOI":"10.13195\/j.kzyjc.2021.0622"},{"key":"e_1_2_12_36_2","doi-asserted-by":"publisher","DOI":"10.1109\/JBHI.2022.3187215"},{"key":"e_1_2_12_37_2","doi-asserted-by":"publisher","DOI":"10.55730\/1300-0632.4031"},{"key":"e_1_2_12_38_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v29i1.9179"},{"key":"e_1_2_12_39_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11265-019-01508-y"},{"key":"e_1_2_12_40_2","doi-asserted-by":"publisher","DOI":"10.3390\/su141610357"},{"key":"e_1_2_12_41_2","doi-asserted-by":"publisher","DOI":"10.1145\/1873951.1873965"},{"key":"e_1_2_12_42_2","doi-asserted-by":"publisher","DOI":"10.1145\/2502081.2502282"},{"key":"e_1_2_12_43_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v30i1.9987"}],"container-title":["International Journal of Intelligent 
Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/pdf\/10.1155\/int\/5581601","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/full-xml\/10.1155\/int\/5581601","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/pdf\/10.1155\/int\/5581601","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,3,8]],"date-time":"2026-03-08T18:03:19Z","timestamp":1772992999000},"score":1,"resource":{"primary":{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/10.1155\/int\/5581601"}},"subtitle":[],"editor":[{"given":"Yingjie","family":"Yang","sequence":"additional","affiliation":[]}],"short-title":[],"issued":{"date-parts":[[2025,1]]},"references-count":43,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2025,1]]}},"alternative-id":["10.1155\/int\/5581601"],"URL":"https:\/\/doi.org\/10.1155\/int\/5581601","archive":["Portico"],"relation":{},"ISSN":["0884-8173","1098-111X"],"issn-type":[{"value":"0884-8173","type":"print"},{"value":"1098-111X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,1]]},"assertion":[{"value":"2024-11-04","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-09-11","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-10-16","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}],"article-number":"5581601"}}