{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,4]],"date-time":"2026-02-04T02:40:37Z","timestamp":1770172837307,"version":"3.49.0"},"reference-count":28,"publisher":"SAGE Publications","issue":"2","license":[{"start":{"date-parts":[[2020,6,6]],"date-time":"2020-06-06T00:00:00Z","timestamp":1591401600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2020,8,31]]},"abstract":"<jats:p>In this paper, we present an extractive approach to document summarization, the Siamese Hierarchical Transformer Encoders system, which is based on the use of siamese neural networks and transformer encoders extended in a hierarchical way. The system, trained for binary classification, is able to assign attention scores to each sentence in the document. These scores are used to select the most relevant sentences to build the summary. The main novelty of our proposal is the use of self-attention mechanisms at the sentence level for document summarization, instead of using attention only at the word level. 
The experiments carried out on the CNN\/DailyMail summarization corpus show promising results, in line with the state of the art.<\/jats:p>","DOI":"10.3233\/jifs-179901","type":"journal-article","created":{"date-parts":[[2020,6,9]],"date-time":"2020-06-09T12:56:56Z","timestamp":1591707416000},"page":"2409-2419","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":2,"title":["Extractive summarization using siamese hierarchical transformer encoders"],"prefix":"10.1177","volume":"39","author":[{"given":"Jos\u00e9 \u00c1ngel","family":"Gonz\u00e1lez","sequence":"first","affiliation":[{"name":"VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Polit\u00e8cnica de Val\u00e8ncia, Cam\u00ed de Vera sn, Val\u00e8ncia, Spain"}]},{"given":"Encarna","family":"Segarra","sequence":"additional","affiliation":[{"name":"VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Polit\u00e8cnica de Val\u00e8ncia, Cam\u00ed de Vera sn, Val\u00e8ncia, Spain"}]},{"given":"Fernando","family":"Garc\u00eda-Granada","sequence":"additional","affiliation":[{"name":"VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Polit\u00e8cnica de Val\u00e8ncia, Cam\u00ed de Vera sn, Val\u00e8ncia, Spain"}]},{"given":"Emilio","family":"Sanchis","sequence":"additional","affiliation":[{"name":"VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Polit\u00e8cnica de Val\u00e8ncia, Cam\u00ed de Vera sn, Val\u00e8ncia, Spain"}]},{"given":"Llu\u00eds-F.","family":"Hurtado","sequence":"additional","affiliation":[{"name":"VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Polit\u00e8cnica de Val\u00e8ncia, Cam\u00ed de Vera sn, Val\u00e8ncia, Spain"}]}],"member":"179","published-online":{"date-parts":[[2020,6,6]]},"reference":[{"key":"e_1_3_2_2_2","doi-asserted-by":"crossref","unstructured":"AmbartsoumianA. and PopowichF. 
Self-attention: A better building block for sentiment analysis neural network classifiers. In Proceedings of the 9th Workshop on Computational Approaches to Subjectivity Sentiment and Social Media Analysis pp. 130\u2013139 Brussels Belgium Oct. 2018. Association for Computational Linguistics.","DOI":"10.18653\/v1\/W18-6219"},{"key":"e_1_3_2_3_2","unstructured":"BaJ. KirosR. and HintonG.E. Layer normalization. CoRR abs\/1607.06450 2016."},{"key":"e_1_3_2_4_2","unstructured":"BegumN. FattahM. and RenF. Automatic text summarization using support vector machine 5 (2009) 1987\u20131996."},{"key":"e_1_3_2_5_2","doi-asserted-by":"crossref","unstructured":"CarbonellJ. and GoldsteinJ. The use of MMR diversity-based reranking for reordering documents and producing summaries. In Proceedings of the 21st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval SIGIR \u201998 (1998) pp. 335\u2013336 New York NY USA. ACM.","DOI":"10.1145\/290941.291025"},{"key":"e_1_3_2_6_2","doi-asserted-by":"crossref","unstructured":"ChengJ. and LapataM. Neural summarization by extracting sentences and words. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics ACL 2016 August 7-12 2016 Berlin Germany Volume 1: Long Papers 2016.","DOI":"10.18653\/v1\/P16-1046"},{"key":"e_1_3_2_7_2","doi-asserted-by":"crossref","unstructured":"ConneauA. KielaD. SchwenkH. BarraultL. and BordesA. Supervised learning of universal sentence representations from natural language inference data. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing EMNLP 2017 Copenhagen Denmark September 9-11 (2017) pp. 670\u2013680.","DOI":"10.18653\/v1\/D17-1070"},{"key":"e_1_3_2_8_2","unstructured":"DevlinJ. ChangM. LeeK. and ToutanovaK. BERT: pre-training of deep bidirectional transformers for language understanding. 
CoRR abs\/1810.04805 2018."},{"issue":"1","key":"e_1_3_2_9_2","first-page":"457","article-title":"Lexrank: Graph-based lexical centrality as salience in text summarization","volume":"22","author":"Erkan G.","year":"2004","unstructured":"ErkanG. and RadevD.R., Lexrank: Graph-based lexical centrality as salience in text summarization, J Artif Int Res22(1) (2004), 457\u2013479.","journal-title":"J Artif Int Res"},{"key":"e_1_3_2_10_2","doi-asserted-by":"publisher","DOI":"10.3233\/JIFS-179011"},{"key":"e_1_3_2_11_2","unstructured":"HermannK.M. Ko\u010disk\u00fdT. GrefenstetteE. EspeholtL. KayW. SuleymanM. and BlunsomP. Teaching machines to read and comprehend. In Proceedings of the 28th International Conference on Neural Information Processing Systems -Volume 1 NIPS\u201915 (2015) pp. 1693\u20131701 Cambridge MA USA 2015. MIT Press."},{"key":"e_1_3_2_12_2","unstructured":"LinC.-Y. Rouge: A package for automatic evaluation of summaries. In S. S. Marie-Francine Moens editor Text Summarization Branches Out: Proceedings of the ACL-04 Workshop (2004) pp. 74\u201381 Barcelona Spain July 2004. Association for Computational Linguistics."},{"key":"e_1_3_2_13_2","unstructured":"LiuP.J. SalehM.A. PotE. GoodrichB. SepassiR. KaiserL. and ShazeerN. Generating wikipedia by summarizing long sequences. In International Conference on Learning Representations 2018."},{"key":"e_1_3_2_14_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10462-011-9216-z"},{"key":"e_1_3_2_15_2","doi-asserted-by":"publisher","DOI":"10.1162\/COLI_a_00123"},{"key":"e_1_3_2_16_2","unstructured":"MihalceaR. and TarauP. Textrank: Bringing order into text. In Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing 2004."},{"key":"e_1_3_2_17_2","unstructured":"NallapatiR. ZhaiF. and ZhouB. Summarunner: A recurrent neural network based sequence model for extractive summarization of documents. 
In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence February 4-9 2017 San Francisco California USA (2017) pp. 3075\u20133081."},{"key":"e_1_3_2_18_2","doi-asserted-by":"crossref","unstructured":"NallapatiR. ZhouB. dos SantosC.N. G\u00fcl\u00e7ehre\u00c7. and XiangB. Abstractive text summarization using sequence-to-sequence RNNs and beyond. In CoNLL (2016) pp. 280\u2013290. ACL 2016.","DOI":"10.18653\/v1\/K16-1028"},{"key":"e_1_3_2_19_2","doi-asserted-by":"crossref","unstructured":"NarayanS. CohenS.B. and LapataM. Ranking sentences for extractive summarization with reinforcement learning. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies Volume 1 (Long Papers) (2018) pp. 1747\u20131759 New Orleans Louisiana June 2018. Association for Computational Linguistics.","DOI":"10.18653\/v1\/N18-1158"},{"key":"e_1_3_2_20_2","unstructured":"OzsoyM.G. CicekliI. and AlpaslanF.N. Text summarization of Turkish texts using latent semantic analysis. In Proceedings of the 23rd International Conference on Computational Linguistics COLING \u201910 (2010) pp. 869\u2013876 Stroudsburg PA USA. Association for Computational Linguistics."},{"key":"e_1_3_2_21_2","unstructured":"PaulusR. XiongC. and SocherR. A deep reinforced model for abstractive summarization. CoRR abs\/1705.04304 2017."},{"key":"e_1_3_2_22_2","unstructured":"SaggionH. Torres-MorenoJ.-M. CunhaI.d. and SanJuanE. Multilingual summarization evaluation without human models. In Proceedings of the 23rd International Conference on Computational Linguistics: Posters COLING \u201910 (2010) pp. 1059\u20131067 Stroudsburg PA USA. Association for Computational Linguistics."},{"key":"e_1_3_2_23_2","doi-asserted-by":"crossref","unstructured":"SeeA. LiuP.J. and ManningC.D. Get to the point: Summarization with pointer-generator networks. 
In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (2017) pp. 1073\u20131083. Association for Computational Linguistics 2017.","DOI":"10.18653\/v1\/P17-1099"},{"key":"e_1_3_2_24_2","unstructured":"ShenD. SunJ.-T. LiH. YangQ. and ChenZ. Document summarization using conditional random fields. In Proceedings of the 20th International Joint Conference on Artificial Intelligence IJCAI\u201907 (2007) pp. 2862\u20132867 San Francisco CA USA. Morgan Kaufmann Publishers Inc."},{"key":"e_1_3_2_25_2","doi-asserted-by":"crossref","unstructured":"StojanovskiD. and FraserA. Coreference and coherence in neural machine translation: A study using oracle experiments. In Proceedings of the Third Conference on Machine Translation: Research Papers (2018) pp. 49\u201360 Brussels Belgium Oct. 2018. Association for Computational Linguistics.","DOI":"10.18653\/v1\/W18-6306"},{"key":"e_1_3_2_26_2","doi-asserted-by":"crossref","unstructured":"TurG. and De MoriR. Spoken language understanding: Systems for extracting semantic information from speech. John Wiley & Sons 2011.","DOI":"10.1002\/9781119992691"},{"key":"e_1_3_2_27_2","first-page":"5998","article-title":"Attention is all you need","volume":"30","author":"Vaswani A.","year":"2017","unstructured":"VaswaniA., ShazeerN., ParmarN., UszkoreitJ., JonesL., GomezA.N., KaiserL.u. and PolosukhinI., Attention is all you need. In GuyonI., LuxburgU. V., BengioS., WallachH., FergusR., VishwanathanS., GarnettR., editors, Advances in Neural Information Processing Systems 30 (2017), pp. 5998\u20136008. Curran Associates, Inc.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_28_2","doi-asserted-by":"crossref","unstructured":"YangZ. YangD. DyerC. HeX. SmolaA. and HovyE. Hierarchical attention networks for document classification. 
In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2016) pp. 1480\u20131489. Association for Computational Linguistics.","DOI":"10.18653\/v1\/N16-1174"},{"key":"e_1_3_2_29_2","doi-asserted-by":"crossref","unstructured":"ZhangJ. LuanH. SunM. ZhaiF. XuJ. ZhangM. and LiuY. Improving the transformer translation model with document-level context. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (2018) pp. 533\u2013542 Brussels Belgium. Association for Computational Linguistics.","DOI":"10.18653\/v1\/D18-1049"}],"container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-179901","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/JIFS-179901","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-179901","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,3]],"date-time":"2026-02-03T12:56:46Z","timestamp":1770123406000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/JIFS-179901"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,6,6]]},"references-count":28,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2020,8,31]]}},"alternative-id":["10.3233\/JIFS-179901"],"URL":"https:\/\/doi.org\/10.3233\/jifs-179901","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,6,6]]}}}