{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,25]],"date-time":"2026-02-25T19:59:00Z","timestamp":1772049540928,"version":"3.50.1"},"reference-count":39,"publisher":"Institution of Engineering and Technology (IET)","issue":"1","license":[{"start":{"date-parts":[[2024,7,8]],"date-time":"2024-07-08T00:00:00Z","timestamp":1720396800000},"content-version":"vor","delay-in-days":189,"URL":"http:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100016984","name":"Xijing University","doi-asserted-by":"publisher","id":[{"id":"10.13039\/501100016984","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["ietresearch.onlinelibrary.wiley.com"],"crossmark-restriction":true},"short-container-title":["IET Software"],"published-print":{"date-parts":[[2024,1]]},"abstract":"<jats:p>Long\u2010term time series forecasting has received significant attention from researchers in recent years. Transformer model\u2010based approaches have emerged as promising solutions in this domain. Nevertheless, most existing methods rely on point\u2010by\u2010point self\u2010attention mechanisms or employ transformations, decompositions, and reconstructions of the entire sequence to capture dependencies. The point\u2010by\u2010point self\u2010attention mechanism becomes impractical for long\u2010term time series forecasting due to its quadratic complexity with respect to the time series length. Decomposition and reconstruction methods may introduce information loss, leading to performance bottlenecks in the models. In this paper, we propose a Transformer\u2010based forecasting model called NPformer. Our method introduces a novel multiscale segmented Fourier attention mechanism. By segmenting the long\u2010term time series and performing discrete Fourier transforms on different segments, we aim to identify frequency\u2010domain correlations between these segments. This allows us to capture dependencies more effectively. In addition, we incorporate a normalization module and a desmoothing factor into the model. These components address the problem of oversmoothing that arises in sequence decomposition methods. Furthermore, we introduce an isometry convolution method to enhance the prediction accuracy of the model. The experimental results demonstrate that NPformer outperforms other Transformer\u2010based methods in long\u2010term time series forecasting.<\/jats:p>","DOI":"10.1049\/2024\/2920167","type":"journal-article","created":{"date-parts":[[2024,7,8]],"date-time":"2024-07-08T08:08:10Z","timestamp":1720426090000},"update-policy":"https:\/\/doi.org\/10.1002\/crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["Segmented Frequency\u2010Domain Correlation Prediction Model for Long\u2010Term Time Series Forecasting Using Transformer"],"prefix":"10.1049","volume":"2024","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-8174-3655","authenticated-orcid":false,"given":"Haozhuo","family":"Tong","sequence":"first","affiliation":[]},{"given":"Lingyun","family":"Kong","sequence":"additional","affiliation":[]},{"given":"Jie","family":"Liu","sequence":"additional","affiliation":[]},{"given":"Shiyan","family":"Gao","sequence":"additional","affiliation":[]},{"given":"Yilu","family":"Xu","sequence":"additional","affiliation":[]},{"given":"Yuezhe","family":"Chen","sequence":"additional","affiliation":[]}],"member":"265","published-online":{"date-parts":[[2024,7,8]]},"reference":[{"key":"e_1_2_9_1_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-981-16-4807-6_39"},{"key":"e_1_2_9_2_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2019.2923006"},{"key":"e_1_2_9_3_2","doi-asserted-by":"crossref","unstructured":"LiangY. XiaY. KeS. WangY. WenQ. ZhangJ. ZhengY. andZimmermannR. AirFormer: predicting nationwide air quality in China with transformers Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence and Thirty-Fifth Conference on Innovative Applications of Artificial Intelligence and Thirteenth Symposium on Educational Advances in Artificial Intelligence February 2023 AAAI Press 14329\u201314337 https:\/\/doi.org\/10.1609\/aaai.v37i12.26676.","DOI":"10.1609\/aaai.v37i12.26676"},{"key":"e_1_2_9_4_2","doi-asserted-by":"publisher","DOI":"10.1111\/tgis.12644"},{"key":"e_1_2_9_5_2","volume-title":"31st Conference on Neural Information Processing Systems (NIPS 2017)","author":"Vaswani A.","year":"2017"},{"key":"e_1_2_9_6_2","first-page":"6778","volume-title":"Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence","author":"Wen Q.","year":"2022"},{"key":"e_1_2_9_7_2","unstructured":"WangW. ChenW. QiuQ. ChenL. WuB. LinB. HeX. andLiuW. CrossFormer++: a versatile vision transformer hinging on cross-scale attention 2023 https:\/\/doi.org\/10.48550\/arXiv.2303.06908."},{"key":"e_1_2_9_8_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v35i12.17325"},{"key":"e_1_2_9_9_2","first-page":"22419","article-title":"Autoformer: decomposition transformers with auto-correlation for long-term series forecasting","volume":"34","author":"Wu H.","year":"2021","journal-title":"Advances in Neural Information Processing Systems (NeurIPS 2021)"},{"key":"e_1_2_9_10_2","doi-asserted-by":"crossref","unstructured":"DuD. SuB. andWeiZ. Preformer: predictive transformer with multi-scale segment-wise correlations for long-term time series forecasting ICASSP 2023\u20142023 IEEE International Conference on Acoustics Speech and Signal Processing (ICASSP) June 2023 Rhodes Island Greece IEEE 1\u20135 https:\/\/doi.org\/10.1109\/ICASSP49357.2023.10096881.","DOI":"10.1109\/ICASSP49357.2023.10096881"},{"key":"e_1_2_9_11_2","unstructured":"ZhouT. MaZ. WenQ. WangX. SunL. andJinR. FEDformer: frequency enhanced decomposed transformer for long-term series forecasting Proceedings of the 39th International Conference on Machine Learning 2022 Baltimore Maryland USA PMLR."},{"key":"e_1_2_9_12_2","unstructured":"LiuY. WuH. WangJ. andLongM. Non-stationary transformers: exploring the stationarity in time series forecasting 35 Proceedings of the 36th International Conference on Neural Information Processing Systems November 2022 Curran Associates Inc. 9881\u20139893."},{"key":"e_1_2_9_13_2","unstructured":"ZhangX. JinX. GopalswamyK. GuptaG. ParkY. ShiX. WangH. MaddixD. C. andWangY. First de-trend then attend: rethinking attention for time-series forecasting 2022 https:\/\/doi.org\/10.48550\/arXiv.2212.08151."},{"key":"e_1_2_9_14_2","doi-asserted-by":"publisher","DOI":"10.1093\/biomet\/46.3-4.306"},{"key":"e_1_2_9_15_2","doi-asserted-by":"publisher","DOI":"10.1186\/s12874-021-01235-8"},{"key":"e_1_2_9_16_2","doi-asserted-by":"crossref","unstructured":"YamakP. T. YujianL. andGadoseyP. K. A Comparison between ARIMA LSTM and GRU for time series forecasting Proceedings of the 2019 2nd International Conference on Algorithms Computing and Artificial Intelligence December 2019 Sanya China Association for Computing Machinery 49\u201355 https:\/\/doi.org\/10.1145\/3377713.3377722.","DOI":"10.1145\/3377713.3377722"},{"key":"e_1_2_9_17_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.eswa.2020.113609"},{"key":"e_1_2_9_18_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.chaos.2020.109864"},{"key":"e_1_2_9_19_2","doi-asserted-by":"publisher","DOI":"10.1089\/big.2020.0159"},{"key":"e_1_2_9_20_2","doi-asserted-by":"crossref","unstructured":"Bon\u00e9R. AssaadM. andCrucianuM. Boosting recurrent neural networks for time series prediction Artificial Neural Nets and Genetic Algorithms: Proceedings of the International Conference in Roanne 2003 Springer Vienna 11\u201322 https:\/\/doi.org\/10.1007\/978-3-7091-0646-4_4.","DOI":"10.1007\/978-3-7091-0646-4_4"},{"key":"e_1_2_9_21_2","doi-asserted-by":"publisher","DOI":"10.1049\/iet-its.2016.0208"},{"key":"e_1_2_9_22_2","doi-asserted-by":"crossref","unstructured":"Siami-NaminiS. TavakoliN. andNaminA. S. The performance of LSTM and BiLSTM in forecasting time series 2019 IEEE International Conference on Big Data (Big Data) December 2019 Los Angeles CA USA IEEE 3285\u20133292 https:\/\/doi.org\/10.1109\/BigData47090.2019.9005997.","DOI":"10.1109\/BigData47090.2019.9005997"},{"key":"e_1_2_9_23_2","unstructured":"RangapuramS. S. SeegerM. GasthausJ. StellaL. WangY. andJanuschowskiT. Deep state space models for time series forecasting Proceedings of the 32nd International Conference on Neural Information Processing Systems December 2018 Curran Associates Inc. 7796\u20137805."},{"key":"e_1_2_9_24_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.eswa.2019.113082"},{"key":"e_1_2_9_25_2","doi-asserted-by":"publisher","DOI":"10.3390\/app10031144"},{"key":"e_1_2_9_26_2","doi-asserted-by":"crossref","unstructured":"PanapongpakornT.andBanjerdpongchaiD. Short-term load forecast for energy management systems using time series analysis and neural network method with average true range 2019 First International Symposium on Instrumentation Control Artificial Intelligence and Robotics (ICA-SYMP) January 2019 Bangkok Thailand IEEE 86\u201389 https:\/\/doi.org\/10.1109\/ICA-SYMP.2019.8646068 2-s2.0-85063250597.","DOI":"10.1109\/ICA-SYMP.2019.8646068"},{"key":"e_1_2_9_27_2","doi-asserted-by":"publisher","DOI":"10.3390\/en13246623"},{"key":"e_1_2_9_28_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11063-020-10319-3"},{"key":"e_1_2_9_29_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.jhydrol.2018.04.065"},{"key":"e_1_2_9_30_2","doi-asserted-by":"publisher","DOI":"10.3390\/app10072322"},{"key":"e_1_2_9_31_2","unstructured":"SenR. YuH.-F. andDhillonI. S. Think globally act locally: a deep neural network approach to high-dimensional time series forecasting Proceedings of the 33rd International Conference on Neural Information Processing Systems December 2019 Curran Associates Inc. 4837\u20134846."},{"key":"e_1_2_9_32_2","unstructured":"WuS. XiaoX. DingQ. ZhaoP. WeiY. andHuangJ. Adversarial sparse transformer for time series forecasting Proceedings of the 34th International Conference on Neural Information Processing Systems December 2020 Vancouver BC Canada Curran Associates Inc. 17105\u201317115."},{"key":"e_1_2_9_33_2","doi-asserted-by":"crossref","unstructured":"NieX. ZhouX. LiZ. WangL. LinX. andTongT. LogTrans: providing efficient local-global fusion with transformer and CNN parallel network for biomedical image segmentation 2022 IEEE 24th Int Conf on High Performance Computing & Communications; 8th Int Conf on Data Science & Systems; 20th Int Conf on Smart City; 8th Int Conf on Dependability in Sensor Cloud & Big Data Systems & Application (HPCC\/DSS\/SmartCity\/DependSys) December 2022 Hainan China IEEE 769\u2013776 https:\/\/doi.org\/10.1109\/HPCC-DSS-SmartCity-DependSys57074.2022.00128.","DOI":"10.1109\/HPCC-DSS-SmartCity-DependSys57074.2022.00128"},{"key":"e_1_2_9_34_2","unstructured":"KitaevN. Kaiser\u0141. andLevskayaA. Reformer: the efficient transformer 2020 https:\/\/doi.org\/10.48550\/arXiv.2001.04451."},{"key":"e_1_2_9_35_2","unstructured":"WooG. LiuC. SahooD. KumarA. andHoiS. ETSformer: exponential smoothing transformers for time-series forecasting 2022 https:\/\/doi.org\/10.48550\/arXiv.2202.01381."},{"key":"e_1_2_9_36_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v37i9.26317"},{"key":"e_1_2_9_37_2","unstructured":"WangH. PengJ. HuangF. WangJ. ChenJ. andXiaoY. MICN: multi-scale local and global context modeling for long-term series forecasting The Eleventh International Conference on Learning Representations 2023 ICLR."},{"key":"e_1_2_9_38_2","doi-asserted-by":"crossref","unstructured":"LaiG. ChangW.-C. YangY. andLiuH. Modeling long- and short-term temporal patterns with deep neural networks The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval June 2018 Ann Arbor MI USA Association for Computing Machinery 95\u2013104 https:\/\/doi.org\/10.1145\/3209978.3210006 2-s2.0-85051553862.","DOI":"10.1145\/3209978.3210006"},{"key":"e_1_2_9_39_2","unstructured":"KingmaD. P.andBaJ. Adam: a method for stochastic optimization 2014 https:\/\/doi.org\/10.48550\/arXiv.1412.6980."}],"container-title":["IET Software"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/ietresearch.onlinelibrary.wiley.com\/doi\/pdf\/10.1049\/2024\/2920167","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,11,5]],"date-time":"2025-11-05T09:06:31Z","timestamp":1762333591000},"score":1,"resource":{"primary":{"URL":"https:\/\/ietresearch.onlinelibrary.wiley.com\/doi\/10.1049\/2024\/2920167"}},"subtitle":[],"editor":[{"given":"Vincenzo","family":"Moscato","sequence":"additional","affiliation":[]}],"short-title":[],"issued":{"date-parts":[[2024,1]]},"references-count":39,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2024,1]]}},"alternative-id":["10.1049\/2024\/2920167"],"URL":"https:\/\/doi.org\/10.1049\/2024\/2920167","archive":["Portico"],"relation":{},"ISSN":["1751-8806","1751-8814"],"issn-type":[{"value":"1751-8806","type":"print"},{"value":"1751-8814","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,1]]},"assertion":[{"value":"2023-07-14","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2024-06-08","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2024-07-08","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}],"article-number":"2920167"}}