{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,22]],"date-time":"2026-04-22T19:52:47Z","timestamp":1776887567145,"version":"3.51.2"},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2023,8]]},"abstract":"<jats:p>For many real-world time series tasks, the computational complexity of prevalent deep learning models often hinders deployment in resource-limited environments (e.g., smartphones). Moreover, due to the inevitable domain shift between the model training (source) and deployment (target) stages, compressing those deep models under cross-domain scenarios becomes even more challenging. Although some existing works have explored cross-domain knowledge distillation for model compression, they are either biased toward source data or heavily entangled between source and target data. To this end, we design a novel end-to-end framework called UNiversal and joInt Knowledge Distillation (UNI-KD) for cross-domain model compression. In particular, we propose to transfer both the universal feature-level knowledge across source and target domains and the joint logit-level knowledge shared by both domains from the teacher to the student model via an adversarial learning scheme. More specifically, a feature-domain discriminator is employed to align the teacher\u2019s and student\u2019s representations for universal knowledge transfer. A data-domain discriminator is utilized to prioritize the domain-shared samples for joint knowledge transfer. Extensive experimental results on four time series datasets demonstrate the superiority of our proposed method over state-of-the-art (SOTA) benchmarks. The source code is available at https:\/\/github.com\/ijcai2023\/UNI-KD.<\/jats:p>","DOI":"10.24963\/ijcai.2023\/496","type":"proceedings-article","created":{"date-parts":[[2023,8,11]],"date-time":"2023-08-11T08:31:30Z","timestamp":1691742690000},"page":"4460-4468","source":"Crossref","is-referenced-by-count":3,"title":["Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data"],"prefix":"10.24963","author":[{"given":"Qing","family":"Xu","sequence":"first","affiliation":[{"name":"Institute for Infocomm Research, A*STAR, Singapore"},{"name":"Nanyang Technological University"}]},{"given":"Min","family":"Wu","sequence":"additional","affiliation":[{"name":"Institute for Infocomm Research, A*STAR, Singapore"}]},{"given":"Xiaoli","family":"Li","sequence":"additional","affiliation":[{"name":"Institute for Infocomm Research, A*STAR, Singapore"},{"name":"Centre for Frontier AI Research, A*STAR, Singapore"}]},{"given":"Kezhi","family":"Mao","sequence":"additional","affiliation":[{"name":"Nanyang Technological University"}]},{"given":"Zhenghua","family":"Chen","sequence":"additional","affiliation":[{"name":"Institute for Infocomm Research, A*STAR, Singapore"},{"name":"Centre for Frontier AI Research, A*STAR, Singapore"}]}],"member":"10584","event":{"name":"Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}","theme":"Artificial Intelligence","location":"Macau, SAR China","acronym":"IJCAI-2023","number":"32","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"start":{"date-parts":[[2023,8,19]]},"end":{"date-parts":[[2023,8,25]]}},"container-title":["Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence"],"original-title":[],"deposited":{"date-parts":[[2023,8,11]],"date-time":"2023-08-11T08:49:59Z","timestamp":1691743799000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2023\/496"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research Articles","short-title":[],"issued":{"date-parts":[[2023,8]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2023\/496","relation":{},"subject":[],"published":{"date-parts":[[2023,8]]}}}