{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,18]],"date-time":"2026-02-18T00:15:00Z","timestamp":1771373700505,"version":"3.50.1"},"publisher-location":"New York, NY, USA","reference-count":14,"publisher":"ACM","license":[{"start":{"date-parts":[[2021,12,28]],"date-time":"2021-12-28T00:00:00Z","timestamp":1640649600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"name":"the Major Science and Technology Innovation Projects of Key R&D Programs of Shandong Province in 2019","award":["No.2019JZZY010113"],"award-info":[{"award-number":["No.2019JZZY010113"]}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2021,12,28]]},"DOI":"10.1145\/3491396.3506511","type":"proceedings-article","created":{"date-parts":[[2022,1,7]],"date-time":"2022-01-07T23:54:52Z","timestamp":1641599692000},"page":"83-88","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":5,"title":["Real-Time Prediction of Ocean Observation Data Based on Transformer Model"],"prefix":"10.1145","author":[{"given":"Wenqing","family":"Chang","sequence":"first","affiliation":[{"name":"Qilu University of Technology, (Shandong Academy of Sciences), Shandong Computer Science Center, (National Supercomputer Center in Jinan), Jinan, China"}]},{"given":"Xiang","family":"Li","sequence":"additional","affiliation":[{"name":"Qilu University of Technology, (Shandong Academy of Sciences), Shandong Computer Science Center, (National Supercomputer Center in Jinan), Jinan, China"}]},{"given":"Huomin","family":"Dong","sequence":"additional","affiliation":[{"name":"Shandong Scicom Holdings Group Co., Ltd, Jinan, China"}]},{"given":"Chunxiao","family":"Wang","sequence":"additional","affiliation":[{"name":"Qilu University of Technology, 
(Shandong Academy of Sciences), Shandong Computer Science Center, (National Supercomputer Center in Jinan), Jinan, China"}]},{"given":"Zhigang","family":"Zhao","sequence":"additional","affiliation":[{"name":"Qilu University of Technology, (Shandong Academy of Sciences), Shandong Computer Science Center, (National Supercomputer Center in Jinan), Jinan, China"}]},{"given":"Yinglong","family":"Wang","sequence":"additional","affiliation":[{"name":"Qilu University of Technology, (Shandong Academy of Sciences), Shandong Computer Science Center, (National Supercomputer Center in Jinan), Jinan, China"}]}],"member":"320","published-online":{"date-parts":[[2022,1,7]]},"reference":[{"key":"e_1_3_2_1_1_1","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR46437.2021.01212"},{"key":"e_1_3_2_1_2_1","doi-asserted-by":"publisher","DOI":"10.1109\/ITSC.2002.1041308"},{"key":"e_1_3_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICASSP.2018.8462506"},{"key":"e_1_3_2_1_4_1","unstructured":"Kai Han, Yunhe Wang, Hanting Chen, Xinghao Chen, Jianyuan Guo, Zhenhua Liu, Yehui Tang, An Xiao, Chunjing Xu, Yixing Xu, et al. 2020. A survey on visual transformer. arXiv preprint arXiv:2012.12556 (2020)."},{"key":"e_1_3_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.3390\/info10030103"},{"key":"e_1_3_2_1_6_1","first-page":"5243","article-title":"Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting","volume":"32","author":"Li Shiyang","year":"2019","unstructured":"Shiyang Li, Xiaoyong Jin, Yao Xuan, Xiyou Zhou, Wenhu Chen, Yu-Xiang Wang, and Xifeng Yan. 2019. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting. Advances in Neural Information Processing Systems 32 (2019), 5243--5253.
","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_1_7_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICTIS.2015.7232194"},{"key":"e_1_3_2_1_8_1","first-page":"1","article-title":"A transformer self-attention model for time series forecasting","volume":"9","author":"Mohammdi Farsani R","year":"2021","unstructured":"R. Mohammdi Farsani and E. Pazouki. 2021. A transformer self-attention model for time series forecasting. Journal of Electrical and Computer Engineering Innovations (JECEI) 9, 1 (2021), 1--10.","journal-title":"Journal of Electrical and Computer Engineering Innovations (JECEI)"},{"key":"e_1_3_2_1_9_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.petrol.2019.106682"},{"key":"e_1_3_2_1_10_1","unstructured":"Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, \u0141ukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In Advances in Neural Information Processing Systems. 5998--6008."},{"key":"e_1_3_2_1_11_1","doi-asserted-by":"publisher","DOI":"10.3390\/w12092521"},{"key":"e_1_3_2_1_12_1","doi-asserted-by":"crossref","unstructured":"Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, R\u00e9mi Louf, Morgan Funtowicz, et al. 2019. 
Huggingface's transformers: State-of-the-art natural language processing. arXiv preprint arXiv:1910.03771 (2019).","DOI":"10.18653\/v1\/2020.emnlp-demos.6"},{"key":"e_1_3_2_1_13_1","volume-title":"Deep transformer models for time series forecasting: The influenza prevalence case. arXiv preprint arXiv:2001.08317","author":"Wu Neo","year":"2020","unstructured":"Neo Wu, Bradley Green, Xue Ben, and Shawn O'Banion. 2020. Deep transformer models for time series forecasting: The influenza prevalence case. arXiv preprint arXiv:2001.08317 (2020)."},{"key":"e_1_3_2_1_14_1","doi-asserted-by":"publisher","DOI":"10.1007\/s00521-016-2763-0"}],"event":{"name":"ACM ICEA '21: 2021 ACM International Conference on Intelligent Computing and its Emerging Applications","location":"Jinan, China","acronym":"ACM ICEA '21","sponsor":["SIGAPP ACM Special Interest Group on Applied Computing"]},"container-title":["Proceedings of the 2021 ACM International Conference on Intelligent Computing and its Emerging 
Applications"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3491396.3506511","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3491396.3506511","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T19:30:47Z","timestamp":1750188647000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3491396.3506511"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,12,28]]},"references-count":14,"alternative-id":["10.1145\/3491396.3506511","10.1145\/3491396"],"URL":"https:\/\/doi.org\/10.1145\/3491396.3506511","relation":{},"subject":[],"published":{"date-parts":[[2021,12,28]]},"assertion":[{"value":"2022-01-07","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}