{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,28]],"date-time":"2026-01-28T01:26:20Z","timestamp":1769563580292,"version":"3.49.0"},"reference-count":0,"publisher":"IOS Press","isbn-type":[{"value":"9781643686448","type":"electronic"}],"license":[{"start":{"date-parts":[[2026,1,27]],"date-time":"2026-01-27T00:00:00Z","timestamp":1769472000000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2026,1,27]]},"abstract":"<jats:p>In contemporary network traffic engineering and anomaly detection, the measurement of network traffic is critically important. A comprehensive dataset of network traffic provides rapid and valuable assistance for traffic scheduling, fault detection, and network monitoring. However, inadequate monitoring facilities at certain observation points, along with high measurement costs, result in incomplete traffic data and significant sparsity within network monitoring systems. Recent studies have demonstrated that tensor completion can effectively recover missing traffic data from partial measurements. Despite its promising potential, existing tensor completion methods often lack the ability to retain historical information, making it challenging to efficiently capture long-term dependencies among tasks. To address this challenge, an online approach based on probabilistic sparse self-attention (PSSA) for transfer learning is proposed. This method effectively leverages transfer learning to transmit historical information to new tasks. Experiments conducted on two real-world datasets demonstrate that integrating the PSSA module accelerates model training and improves the accuracy of tensor recovery.<\/jats:p>",
"DOI":"10.3233\/faia251659","type":"book-chapter","created":{"date-parts":[[2026,1,27]],"date-time":"2026-01-27T13:19:06Z","timestamp":1769519946000},"source":"Crossref","is-referenced-by-count":0,"title":["Online Network Traffic Recovery Based on Probabilistic Sparse Self-Attention for Transfer Learning: A Lightweight and Efficient Tensor Completion Method"],"prefix":"10.3233","author":[{"ORCID":"https:\/\/orcid.org\/0009-0000-4083-3873","authenticated-orcid":false,"given":"Hongkai","family":"Chen","sequence":"first","affiliation":[{"name":"School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7128-8943","authenticated-orcid":false,"given":"Miao","family":"Jiang","sequence":"additional","affiliation":[{"name":"School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7021-0992","authenticated-orcid":false,"given":"Yiqing","family":"Li","sequence":"additional","affiliation":[{"name":"School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China"}]}],"member":"7437","container-title":["Frontiers in Artificial Intelligence and Applications","Fuzzy Systems and Data Mining XI"],
"original-title":[],"link":[{"URL":"https:\/\/ebooks.iospress.nl\/pdf\/doi\/10.3233\/FAIA251659","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,1,27]],"date-time":"2026-01-27T13:19:07Z","timestamp":1769519947000},"score":1,"resource":{"primary":{"URL":"https:\/\/ebooks.iospress.nl\/doi\/10.3233\/FAIA251659"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2026,1,27]]},"ISBN":["9781643686448"],"references-count":0,"URL":"https:\/\/doi.org\/10.3233\/faia251659","relation":{},"ISSN":["0922-6389","1879-8314"],"issn-type":[{"value":"0922-6389","type":"print"},{"value":"1879-8314","type":"electronic"}],"subject":[],"published":{"date-parts":[[2026,1,27]]}}}