{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,8,2]],"date-time":"2025-08-02T17:47:39Z","timestamp":1754156859417,"version":"3.41.2"},"reference-count":50,"publisher":"Emerald","issue":"3","license":[{"start":{"date-parts":[[2023,12,29]],"date-time":"2023-12-29T00:00:00Z","timestamp":1703808000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/www.emerald.com\/insight\/site-policies"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["DTA"],"published-print":{"date-parts":[[2024,7,19]]},"abstract":"<jats:sec><jats:title content-type=\"abstract-subheading\">Purpose<\/jats:title><jats:p>Aspect-based sentiment analysis (ASA) is a task of sentiment analysis that requires predicting aspect sentiment polarity for a given sentence. Many traditional techniques use graph-based mechanisms, which reduce prediction accuracy and introduce large amounts of noise. The other problem with graph-based mechanisms is that for some context words, the feelings change depending on the aspect, and therefore it is impossible to draw conclusions on their own. ASA is challenging because a given sentence can reveal complicated feelings about multiple aspects.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Design\/methodology\/approach<\/jats:title><jats:p>This research proposed an optimized attention-based DL model known as optimized aspect and self-attention aware long short-term memory for target-based semantic analysis (OAS-LSTM-TSA). The proposed model goes through three phases: preprocessing, aspect extraction and classification. Aspect extraction is done using a double-layered convolutional neural network (DL-CNN). 
The optimized aspect and self-attention embedded LSTM (OAS-LSTM) is used to classify aspect sentiment into three classes: positive, neutral and negative.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Findings<\/jats:title><jats:p>The optimized aspect and self-attention embedded LSTM (OAS-LSTM) model detects and classifies the sentiment polarity of each aspect. The proposed method achieves a high accuracy of 95.3 per cent for the restaurant dataset and 96.7 per cent for the laptop dataset.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Originality\/value<\/jats:title><jats:p>The novelty of the research work lies in the addition of two effective attention layers to the network model and in the reduction of the loss function and enhancement of accuracy using a recent, efficient optimization algorithm. The loss function in OAS-LSTM is minimized using the adaptive pelican optimization algorithm, thus increasing the accuracy rate. 
The performance of the proposed method is validated on four real-time datasets, Rest14, Lap14, Rest15 and Rest16, for various performance metrics.<\/jats:p><\/jats:sec>","DOI":"10.1108\/dta-10-2022-0408","type":"journal-article","created":{"date-parts":[[2023,12,29]],"date-time":"2023-12-29T10:48:57Z","timestamp":1703846937000},"page":"447-471","source":"Crossref","is-referenced-by-count":0,"title":["Optimized aspect and self-attention aware LSTM for target-based semantic analysis (OAS-LSTM-TSA)"],"prefix":"10.1108","volume":"58","author":[{"given":"B.","family":"Vasavi","sequence":"first","affiliation":[]},{"given":"P.","family":"Dileep","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6759-2416","authenticated-orcid":false,"given":"Ulligaddala","family":"Srinivasarao","sequence":"additional","affiliation":[]}],"member":"140","published-online":{"date-parts":[[2023,12,29]]},"reference":[{"journal-title":"Applied Computing and Informatics","article-title":"Aspect-based sentiment analysis using smart government review data","year":"2020","key":"key2024072308212843700_ref001"},{"key":"key2024072308212843700_ref002","doi-asserted-by":"crossref","first-page":"77820","DOI":"10.1109\/ACCESS.2020.2990306","article-title":"Combination of recursive and recurrent neural networks for aspect-based sentiment analysis using inter-aspect relations","volume":"8","year":"2020","journal-title":"IEEE Access"},{"issue":"3","key":"key2024072308212843700_ref003","first-page":"e137271","article-title":"Aspect-based sentiment classification model employing whale-optimized adaptive neural network","volume":"69","year":"2021","journal-title":"Bulletin of the Polish Academy of Sciences Technical Sciences"},{"issue":"3","key":"key2024072308212843700_ref004","doi-asserted-by":"crossref","first-page":"195","DOI":"10.26599\/BDMA.2021.9020003","article-title":"A multi-task multi-view neural network for end-to-end aspect-based sentiment 
analysis","volume":"4","year":"2021","journal-title":"Big Data Mining and Analytics"},{"first-page":"452","article-title":"Recurrent attention network on memory for aspect sentiment analysis","year":"2017","key":"key2024072308212843700_ref005"},{"first-page":"547","article-title":"Transfer capsule network for aspect level sentiment classification","year":"2019","key":"key2024072308212843700_ref006"},{"first-page":"3685","article-title":"Relation-aware collaborative learning for unified aspect-based sentiment analysis","year":"2020","key":"key2024072308212843700_ref007"},{"key":"key2024072308212843700_ref008","doi-asserted-by":"crossref","first-page":"272","DOI":"10.1016\/j.eswa.2018.10.003","article-title":"Deep learning for aspect-based sentiment analysis: a comparative review","volume":"118","year":"2019","journal-title":"Expert Systems with Applications"},{"first-page":"5489","article-title":"Capsule network with interactive attention for aspect-level sentiment classification","year":"2019","key":"key2024072308212843700_ref009"},{"first-page":"3433","article-title":"Multi-grained attention network for aspect-level sentiment classification","year":"2018","key":"key2024072308212843700_ref010"},{"journal-title":"International Journal of Research in Marketing","article-title":"More than a feeling: accuracy and application of sentiment analysis","year":"2022","key":"key2024072308212843700_ref011"},{"year":"2018","key":"key2024072308212843700_ref012","first-page":"1"},{"year":"2019","key":"key2024072308212843700_ref013","article-title":"An interactive multi-task learning network for end-to-end aspect-based sentiment analysis"},{"first-page":"187","article-title":"Aspect-based sentiment analysis using bert","year":"2019","key":"key2024072308212843700_ref014"},{"first-page":"6280","article-title":"A challenge dataset and effective models for aspect-based sentiment 
analysis","year":"2019","key":"key2024072308212843700_ref015"},{"first-page":"8797","article-title":"Adversarial training for aspect-based sentiment analysis with Bert","year":"2021","key":"key2024072308212843700_ref016"},{"key":"key2024072308212843700_ref017","doi-asserted-by":"crossref","first-page":"106799","DOI":"10.1109\/ACCESS.2020.3000739","article-title":"Weakly supervised framework for aspect-based sentiment analysis on students' reviews of MOOCs","volume":"8","year":"2020","journal-title":"IEEE Access"},{"issue":"8","key":"key2024072308212843700_ref018","doi-asserted-by":"crossref","first-page":"3221","DOI":"10.1007\/s00521-019-04105-z","article-title":"Aspect-based sentiment analysis using deep networks and stochastic optimization","volume":"32","year":"2020","journal-title":"Neural Computing & Applications"},{"key":"key2024072308212843700_ref019","doi-asserted-by":"crossref","first-page":"106435","DOI":"10.1016\/j.asoc.2020.106435","article-title":"User reviews: sentiment analysis using lexicon integrated two-channel CNN-LSTM family models","volume":"94","year":"2020","journal-title":"Applied Soft Computing"},{"key":"key2024072308212843700_ref020","doi-asserted-by":"crossref","first-page":"46868","DOI":"10.1109\/ACCESS.2020.2978511","article-title":"Enhancing BERT representation with context-aware embedding for aspect-based sentiment analysis","volume":"8","year":"2020","journal-title":"IEEE Access"},{"year":"2021","key":"key2024072308212843700_ref021","article-title":"Learning implicit sentiment in aspect-based sentiment analysis with supervised contrastive pre-training"},{"key":"key2024072308212843700_ref022","doi-asserted-by":"crossref","first-page":"107643","DOI":"10.1016\/j.knosys.2021.107643","article-title":"Aspect-based sentiment analysis via affective knowledge enhanced graph convolutional networks","volume":"235","year":"2022","journal-title":"Knowledge-Based 
Systems"},{"issue":"4","key":"key2024072308212843700_ref023","doi-asserted-by":"crossref","first-page":"1215","DOI":"10.1007\/s11280-021-00898-z","article-title":"Aspect-based sentiment analysis for online reviews with hybrid attention networks","volume":"24","year":"2021","journal-title":"World Wide Web"},{"key":"key2024072308212843700_ref024","doi-asserted-by":"crossref","first-page":"105010","DOI":"10.1016\/j.knosys.2019.105010","article-title":"Aspect-based sentiment analysis with gated alternate neural network","volume":"188","year":"2020","journal-title":"Knowledge-Based Systems"},{"issue":"2","key":"key2024072308212843700_ref025","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1145\/3432049","article-title":"Multilingual review-aware deep recommender system via aspect-based sentiment analysis","volume":"39","year":"2021","journal-title":"ACM Transactions on Information Systems (TOIS)"},{"first-page":"1023","article-title":"Content attention model for aspect based sentiment analysis","year":"2018","key":"key2024072308212843700_ref026"},{"first-page":"3402","article-title":"IARM: inter-aspect relation modeling with memory networks in aspect-based sentiment analysis","year":"2018","key":"key2024072308212843700_ref027"},{"issue":"3","key":"key2024072308212843700_ref028","first-page":"1743","article-title":"The biases of pre-trained language models: an empirical study on prompt-based sentiment analysis and emotion detection","volume":"14","year":"2022","journal-title":"IEEE Transactions on Affective Computing"},{"issue":"15","key":"key2024072308212843700_ref029","doi-asserted-by":"crossref","first-page":"13543","DOI":"10.1609\/aaai.v35i15.17597","article-title":"A joint training dual-mrc framework for aspect based sentiment analysis","volume":"35","year":"2021","journal-title":"Proceedings of the AAAI Conference on Artificial 
Intelligence"},{"key":"key2024072308212843700_ref030","doi-asserted-by":"crossref","first-page":"113234","DOI":"10.1016\/j.eswa.2020.113234","article-title":"Aspect-based sentiment analysis using adaptive aspect-based lexicons","volume":"148","year":"2020","journal-title":"Expert Systems with Applications"},{"year":"2019","key":"key2024072308212843700_ref031","article-title":"Market trend prediction using sentiment analysis: lessons learned and paths forward"},{"first-page":"92","article-title":"Reproducibility, replicability and beyond: assessing production readiness of aspect based sentiment analysis in the wild","year":"2021","key":"key2024072308212843700_ref032"},{"key":"key2024072308212843700_ref033","doi-asserted-by":"crossref","first-page":"114231","DOI":"10.1016\/j.eswa.2020.114231","article-title":"A new topic modeling based approach for aspect extraction in aspect based sentiment analysis: SS-LDA","volume":"168","year":"2021","journal-title":"Expert Systems with Applications"},{"first-page":"3211","article-title":"Modelling context and syntactical features for aspect-based sentiment analysis","year":"2020","key":"key2024072308212843700_ref034"},{"journal-title":"Expert Systems","article-title":"Sentiment analysis from email pattern using feature selection algorithm","year":"2021","key":"key2024072308212843700_ref035"},{"key":"key2024072308212843700_ref036","doi-asserted-by":"crossref","first-page":"103477","DOI":"10.1016\/j.artint.2021.103477","article-title":"Enhanced aspect-based sentiment analysis models with progressive self-supervised attention learning","volume":"296","year":"2021","journal-title":"Artificial Intelligence"},{"year":"2015","key":"key2024072308212843700_ref037","article-title":"Effective LSTMs for target-dependent sentiment classification"},{"issue":"6","key":"key2024072308212843700_ref038","doi-asserted-by":"crossref","first-page":"1423","DOI":"10.1007\/s12559-021-09948-0","article-title":"A convolutional stacked bidirectional LSTM 
with a multiplicative attention mechanism for aspect category and sentiment detection","volume":"13","year":"2021","journal-title":"Cognitive Computation"},{"issue":"05","key":"key2024072308212843700_ref039","doi-asserted-by":"crossref","first-page":"9122","DOI":"10.1609\/aaai.v34i05.6447","article-title":"Target-aspect-sentiment joint detection for aspect-based sentiment analysis","volume":"34","year":"2020","journal-title":"Proceedings of the AAAI Conference on Artificial Intelligence"},{"year":"2022","key":"key2024072308212843700_ref040","article-title":"A contrastive cross-channel data augmentation framework for aspect-based sentiment analysis"},{"year":"2020","key":"key2024072308212843700_ref041","article-title":"Relational graph attention network for aspect-based sentiment analysis"},{"year":"2020","key":"key2024072308212843700_ref042","article-title":"Grid tagging scheme for aspect-oriented fine-grained opinion extraction"},{"key":"key2024072308212843700_ref043","doi-asserted-by":"crossref","first-page":"48","DOI":"10.1016\/j.neucom.2021.10.091","article-title":"Exploring fine-grained syntactic information for aspect-based sentiment classification with dual graph neural networks","volume":"471","year":"2022","journal-title":"Neurocomputing"},{"year":"2019","key":"key2024072308212843700_ref044","article-title":"BERT post-training for review reading comprehension and aspect-based sentiment analysis"},{"year":"2018","key":"key2024072308212843700_ref045","article-title":"Aspect based sentiment analysis with gated convolutional networks"},{"issue":"3","key":"key2024072308212843700_ref046","doi-asserted-by":"crossref","first-page":"463","DOI":"10.1016\/j.ipm.2018.12.004","article-title":"Aspect-based sentiment analysis with alternating coattention networks","volume":"56","year":"2019","journal-title":"Information Processing and 
Management"},{"issue":"01","key":"key2024072308212843700_ref047","doi-asserted-by":"crossref","first-page":"7370","DOI":"10.1609\/aaai.v33i01.33017370","article-title":"Graph convolutional networks for text classification","volume":"33","year":"2019","journal-title":"Proceedings of the AAAI Conference on Artificial Intelligence"},{"key":"key2024072308212843700_ref048","doi-asserted-by":"crossref","first-page":"15561","DOI":"10.1109\/ACCESS.2021.3052937","article-title":"Combination of convolutional neural network and gated recurrent unit for aspect-based sentiment analysis","volume":"9","year":"2021","journal-title":"IEEE Access"},{"year":"2019","key":"key2024072308212843700_ref049","article-title":"Towards scalable and reliable capsule networks for challenging NLP applications"},{"year":"2022","key":"key2024072308212843700_ref050","article-title":"Knowledge graph augmented network towards multi-view representation learning for aspect-based sentiment analysis"}],"container-title":["Data Technologies and 
Applications"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/DTA-10-2022-0408\/full\/xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/DTA-10-2022-0408\/full\/html","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,7,24]],"date-time":"2025-07-24T23:15:26Z","timestamp":1753398926000},"score":1,"resource":{"primary":{"URL":"http:\/\/www.emerald.com\/dta\/article\/58\/3\/447-471\/1226330"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,12,29]]},"references-count":50,"journal-issue":{"issue":"3","published-online":{"date-parts":[[2023,12,29]]},"published-print":{"date-parts":[[2024,7,19]]}},"alternative-id":["10.1108\/DTA-10-2022-0408"],"URL":"https:\/\/doi.org\/10.1108\/dta-10-2022-0408","relation":{},"ISSN":["2514-9288","2514-9288"],"issn-type":[{"type":"print","value":"2514-9288"},{"type":"electronic","value":"2514-9288"}],"subject":[],"published":{"date-parts":[[2023,12,29]]}}}