{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,9]],"date-time":"2026-04-09T14:28:06Z","timestamp":1775744886625,"version":"3.50.1"},"reference-count":50,"publisher":"Association for Computing Machinery (ACM)","issue":"4","license":[{"start":{"date-parts":[[2024,2,13]],"date-time":"2024-02-13T00:00:00Z","timestamp":1707782400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"crossref","award":["61972161, and P0040473"],"award-info":[{"award-number":["61972161, and P0040473"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100021171","name":"Guangdong Basic and Applied Basic Research Foundation","doi-asserted-by":"crossref","award":["2022A1515010374"],"award-info":[{"award-number":["2022A1515010374"]}],"id":[{"id":"10.13039\/501100021171","id-type":"DOI","asserted-by":"crossref"}]},{"name":"Guangzhou Basic and Applied Basic Research Foundation","award":["202201010715"],"award-info":[{"award-number":["202201010715"]}]},{"name":"Theme-based Research Scheme","award":["T43-513\/23-N"],"award-info":[{"award-number":["T43-513\/23-N"]}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM Trans. Knowl. Discov. Data"],"published-print":{"date-parts":[[2024,5,31]]},"abstract":"<jats:p>\n            Federated learning enables multiple clients to collaboratively learn machine learning models in a privacy-preserving manner. However, in real-world scenarios, a key challenge encountered in federated learning is the statistical heterogeneity among clients. 
Existing work mainly focused on a single global model shared across the clients, making it hard to generalize well to all clients due to the large discrepancy in the data distributions. To address this challenge, we propose\n            <jats:italic>pFedLT<\/jats:italic>\n            , a novel approach that can adapt the single global model to different data distributions. Specifically, we propose to perform a pluggable layer-wise transformation during the local update phase based on scaling and shifting operations. In particular, these operations are learned with a meta-learning strategy. By doing so,\n            <jats:italic>pFedLT<\/jats:italic>\n            can capture the diversity of data distribution among clients, therefore, can generalize well even when the data distributions among clients exhibit high statistical heterogeneity. We conduct extensive experiments on synthetic and real-world datasets (MNIST, Fashion-MNIST, CIFAR-10, and Office+Caltech10) under different Non-IID settings. 
Experimental results demonstrate that\n            <jats:italic>pFedLT<\/jats:italic>\n            significantly improves the model accuracy by up to 11.67% and reduces the communication costs compared with state-of-the-art approaches.\n          <\/jats:p>","DOI":"10.1145\/3638252","type":"journal-article","created":{"date-parts":[[2023,12,28]],"date-time":"2023-12-28T21:57:51Z","timestamp":1703800671000},"page":"1-21","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":17,"title":["Personalized Federated Learning with Layer-Wise Feature Transformation via Meta-Learning"],"prefix":"10.1145","volume":"18","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-0730-8994","authenticated-orcid":false,"given":"Jingke","family":"Tu","sequence":"first","affiliation":[{"name":"South China University of Technology, CHINA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9752-8607","authenticated-orcid":false,"given":"Jiaming","family":"Huang","sequence":"additional","affiliation":[{"name":"South China University of Technology, CHINA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8732-3675","authenticated-orcid":false,"given":"Lei","family":"Yang","sequence":"additional","affiliation":[{"name":"South China University of Technology, CHINA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7328-8039","authenticated-orcid":false,"given":"Wanyu","family":"Lin","sequence":"additional","affiliation":[{"name":"The Hong Kong Polytechnic University, CHINA"}]}],"member":"320","published-online":{"date-parts":[[2024,2,13]]},"reference":[{"key":"e_1_3_1_2_2","doi-asserted-by":"publisher","DOI":"10.1109\/IJCNN48605.2020.9207469"},{"key":"e_1_3_1_3_2","unstructured":"Sebastian Caldas Sai Meher Karthik Duddu Peter Wu Tian Li Jakub Kone\u010dn\u1ef3 H Brendan McMahan Virginia Smith and Ameet Talwalkar. 2018. Leaf: A benchmark for federated settings. arXiv:1812.01097. 
Retrieved from https:\/\/arxiv.org\/abs\/cs\/1812.01097"},{"key":"e_1_3_1_4_2","unstructured":"Xingjian Cao Gang Sun Hongfang Yu and Mohsen Guizani. 2022. PerFED-GAN: Personalized federated learning via generative adversarial networks. arXiv:2202.09155. Retrieved from https:\/\/arxiv.org\/abs\/cs\/2202.09155"},{"key":"e_1_3_1_5_2","unstructured":"Fei Chen Mi Luo Zhenhua Dong Zhenguo Li and Xiuqiang He. 2018. Federated meta-learning with fast convergence and efficient communication. arXiv:1802.07876. Retrieved from https:\/\/arxiv.org\/abs\/cs\/1802.07876"},{"key":"e_1_3_1_6_2","doi-asserted-by":"publisher","DOI":"10.1109\/TWC.2020.3024629"},{"key":"e_1_3_1_7_2","first-page":"2089","volume-title":"International Conference on Machine Learning","author":"Collins Liam","year":"2021","unstructured":"Liam Collins, Hamed Hassani, Aryan Mokhtari, and Sanjay Shakkottai. 2021. Exploiting shared representations for personalized federated learning. In International Conference on Machine Learning. PMLR, 2089\u20132099."},{"key":"e_1_3_1_8_2","unstructured":"Xinyan Dai Xiao Yan Kaiwen Zhou Han Yang Kelvin KW Ng James Cheng and Yu Fan. 2019. Hyper-sphere quantization: Communication-efficient sgd for federated learning. arXiv:1911.04655. Retrieved from https:\/\/arxiv.org\/abs\/cs\/1911.04655"},{"key":"e_1_3_1_9_2","first-page":"3557","article-title":"Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach","volume":"33","author":"Fallah Alireza","year":"2020","unstructured":"Alireza Fallah, Aryan Mokhtari, and Asuman Ozdaglar. 2020. Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach. 
Advances in Neural Information Processing Systems 33 (2020), 3557\u20133568.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_1_10_2","doi-asserted-by":"publisher","DOI":"10.1145\/3378679.3394528"},{"key":"e_1_3_1_11_2","first-page":"1126","volume-title":"Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6-11 August 2017 (Proceedings of Machine Learning Research)","volume":"70","author":"Finn Chelsea","year":"2017","unstructured":"Chelsea Finn, Pieter Abbeel, and Sergey Levine. 2017. Model-agnostic meta-learning for fast adaptation of deep networks. In Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6-11 August 2017 (Proceedings of Machine Learning Research). Doina Precup and Yee Whye Teh (Eds.), Vol. 70. PMLR, 1126\u20131135."},{"key":"e_1_3_1_12_2","first-page":"19586","article-title":"An efficient framework for clustered federated learning","volume":"33","author":"Ghosh Avishek","year":"2020","unstructured":"Avishek Ghosh, Jichan Chung, Dong Yin, and Kannan Ramchandran. 2020. An efficient framework for clustered federated learning. Advances in Neural Information Processing Systems 33 (2020), 19586\u201319597.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_1_13_2","doi-asserted-by":"publisher","DOI":"10.5555\/2354409.2355024"},{"key":"e_1_3_1_14_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v35i9.16960"},{"key":"e_1_3_1_15_2","unstructured":"Yihan Jiang Jakub Kone\u010dn\u1ef3 Keith Rush and Sreeram Kannan. 2019. Improving federated learning personalization via model agnostic meta learning. arXiv:1909.12488. 
Retrieved from https:\/\/arxiv.org\/abs\/cs\/1909.12488"},{"key":"e_1_3_1_16_2","doi-asserted-by":"publisher","DOI":"10.1561\/2200000083"},{"key":"e_1_3_1_17_2","first-page":"5132","volume-title":"Proceedings of the 37th International Conference on Machine Learning","volume":"119","author":"Karimireddy Sai Praneeth","year":"2020","unstructured":"Sai Praneeth Karimireddy, Satyen Kale, Mehryar Mohri, Sashank Reddi, Sebastian Stich, and Ananda Theertha Suresh. 2020. SCAFFOLD: Stochastic controlled averaging for federated learning. In Proceedings of the 37th International Conference on Machine Learning, Vol. 119. PMLR, 5132\u20135143."},{"key":"e_1_3_1_18_2","unstructured":"Mikhail Khodak Maria-Florina F. Balcan and Ameet S. Talwalkar. 2019. Adaptive gradient-based meta-learning methods. Advances in Neural Information Processing Systems 32 (2019) 5917\u20135928."},{"key":"e_1_3_1_19_2","first-page":"3478","volume-title":"International Conference on Machine Learning","author":"Koloskova Anastasia","year":"2019","unstructured":"Anastasia Koloskova, Sebastian Stich, and Martin Jaggi. 2019. Decentralized stochastic optimization and gossip algorithms with compressed communication. In International Conference on Machine Learning. PMLR, 3478\u20133487."},{"key":"e_1_3_1_20_2","unstructured":"Alex Krizhevsky and Geoffrey Hinton. 2009. Learning multiple layers of features from tiny images. Handbook of Systemic Autoimmune Diseases 1 4 (2009)."},{"key":"e_1_3_1_21_2","doi-asserted-by":"crossref","unstructured":"Alex Krizhevsky Ilya Sutskever and Geoffrey E. Hinton. 2017. ImageNet classification with deep convolutional neural networks. Communications of the ACM 60 6 (2017) 84\u201390.","DOI":"10.1145\/3065386"},{"key":"e_1_3_1_22_2","doi-asserted-by":"publisher","DOI":"10.1109\/WorldS450073.2020.9210355"},{"key":"e_1_3_1_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/5.726791"},{"key":"e_1_3_1_24_2","unstructured":"Daliang Li and Junpu Wang. 2019. 
Fedmd: Heterogenous federated learning via model distillation. arXiv:1910.03581. Retrieved from https:\/\/arxiv.org\/abs\/cs\/1910.03581"},{"key":"e_1_3_1_25_2","doi-asserted-by":"publisher","DOI":"10.1109\/IJCNN52387.2021.9533876"},{"key":"e_1_3_1_26_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR46437.2021.01057"},{"key":"e_1_3_1_27_2","doi-asserted-by":"publisher","DOI":"10.1109\/MSP.2020.2975749"},{"key":"e_1_3_1_28_2","first-page":"429","article-title":"Federated optimization in heterogeneous networks","volume":"2","author":"Li Tian","year":"2020","unstructured":"Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, and Virginia Smith. 2020. Federated optimization in heterogeneous networks. Proceedings of Machine Learning and Systems 2 (2020), 429\u2013450.","journal-title":"Proceedings of Machine Learning and Systems"},{"key":"e_1_3_1_29_2","volume-title":"8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, April 26-30, 2020","author":"Li Tian","year":"2020","unstructured":"Tian Li, Maziar Sanjabi, Ahmad Beirami, and Virginia Smith. 2020. Fair resource allocation in federated learning. In 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, April 26-30, 2020. OpenReview.net."},{"key":"e_1_3_1_30_2","volume-title":"International Conference on Learning Representations","author":"Li Xiang","year":"2019","unstructured":"Xiang Li, Kaixuan Huang, Wenhao Yang, Shusen Wang, and Zhihua Zhang. 2019. On the convergence of FedAvg on non-IID data. In International Conference on Learning Representations."},{"key":"e_1_3_1_31_2","unstructured":"Paul Pu Liang Terrance Liu Liu Ziyin Nicholas B. Allen Randy P. Auerbach David Brent Ruslan Salakhutdinov and Louis-Philippe Morency. 2020. Think locally act globally: Federated learning with local and global representations. arXiv:2001.01523. 
Retrieved from https:\/\/arxiv.org\/abs\/cs\/2001.01523"},{"key":"e_1_3_1_32_2","first-page":"2351","article-title":"Ensemble distillation for robust model fusion in federated learning","volume":"33","author":"Lin Tao","year":"2020","unstructured":"Tao Lin, Lingjing Kong, Sebastian U Stich, and Martin Jaggi. 2020. Ensemble distillation for robust model fusion in federated learning. Advances in Neural Information Processing Systems 33 (2020), 2351\u20132363.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_1_33_2","first-page":"1273","volume-title":"Artificial Intelligence and Statistics","author":"McMahan Brendan","year":"2017","unstructured":"Brendan McMahan, Eider Moore, Daniel Ramage, Seth Hampson, and Blaise Aguera y Arcas. 2017. Communication-efficient learning of deep networks from decentralized data. In Artificial Intelligence and Statistics. PMLR, 1273\u20131282."},{"key":"e_1_3_1_34_2","first-page":"8024","volume-title":"Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, December 8-14, 2019, Vancouver, BC, Canada","author":"Wallach Hanna M.","year":"2019","unstructured":"Adam Paszke, Sam Gross, Francisco Massa, Adam Lerer, James Bradbury, Gregory Chanan, Trevor Killeen, Zeming Lin, Natalia Gimelshein, Luca Antiga and others. 2019. PyTorch: An imperative style, high-performance deep learning library. In Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, December 8-14, 2019, Vancouver, BC, Canada. Hanna M. Wallach, Hugo Larochelle, Alina Beygelzimer, Florence d\u2019Alch\u00e9-Buc, Emily B. Fox, and Roman Garnett (Eds.). Vol. 32. 8024\u20138035."},{"key":"e_1_3_1_35_2","unstructured":"Mohammad Rasouli Tao Sun and Ram Rajagopal. 2020. FedGAN: Federated generative adversarial networks for distributed data. arXiv:2006.07228. 
Retrieved from https:\/\/arxiv.org\/abs\/cs\/2006.07228"},{"key":"e_1_3_1_36_2","volume-title":"International Conference on Learning Representations","author":"Reddi Sashank","year":"2021","unstructured":"Sashank Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Kone\u010dn\u1ef3, Sanjiv Kumar, and H Brendan McMahan. 2021. Adaptive federated optimization. In International Conference on Learning Representations."},{"key":"e_1_3_1_37_2","first-page":"2021","volume-title":"International Conference on Artificial Intelligence and Statistics","author":"Reisizadeh Amirhossein","year":"2020","unstructured":"Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, and Ramtin Pedarsani. 2020. Fedpaq: A communication-efficient federated learning method with periodic averaging and quantization. In International Conference on Artificial Intelligence and Statistics. PMLR, 2021\u20132031."},{"key":"e_1_3_1_38_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNNLS.2020.3015958"},{"key":"e_1_3_1_39_2","first-page":"21394","article-title":"Personalized federated learning with moreau envelopes","volume":"33","author":"Dinh Canh T","year":"2020","unstructured":"Canh T Dinh, Nguyen Tran, and Josh Nguyen. 2020. Personalized federated learning with moreau envelopes. Advances in Neural Information Processing Systems 33 (2020), 21394\u201321405.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_1_40_2","doi-asserted-by":"publisher","unstructured":"Alysa Ziying Tan Han Yu Lizhen Cui and Qiang Yang. 2023. Towards personalized federated learning. IEEE Transactions on Neural Networks and Learning Systems 34 12 (2023) 9587\u20139603. 
DOI:10.1109\/TNNLS.2022.3160699","DOI":"10.1109\/TNNLS.2022.3160699"},{"key":"e_1_3_1_41_2","volume-title":"International Conference on Learning Representations","author":"Wang Hongyi","year":"2020","unstructured":"Hongyi Wang, Mikhail Yurochkin, Yuekai Sun, Dimitris Papailiopoulos, and Yasaman Khazaeni. 2020. Federated learning with matched averaging. In International Conference on Learning Representations."},{"key":"e_1_3_1_42_2","doi-asserted-by":"publisher","DOI":"10.1109\/JSAC.2019.2904348"},{"key":"e_1_3_1_43_2","doi-asserted-by":"publisher","DOI":"10.1109\/MNET.2019.1800286"},{"key":"e_1_3_1_44_2","unstructured":"Shanshan Wu Tian Li Zachary Charles Yu Xiao Ziyu Liu Zheng Xu and Virginia Smith. 2022. Motley: Benchmarking heterogeneity and personalization in federated learning. arXiv:2206.09262. Retrieved from https:\/\/arxiv.org\/abs\/cs\/2206.09262"},{"key":"e_1_3_1_45_2","unstructured":"Han Xiao Kashif Rasul and Roland Vollgraf. 2017. Fashion-mnist: A novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747. Retrieved from https:\/\/arxiv.org\/abs\/cs\/1708.07747"},{"key":"e_1_3_1_46_2","unstructured":"Jian Xu Xinyi Tong and Shao-Lun Huang. 2023. Personalized federated learning with feature alignment and classifier collaboration. arXiv:2306.11867. Retrieved from https:\/\/arxiv.org\/abs\/cs\/2306.11867"},{"key":"e_1_3_1_47_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNSE.2020.2996612"},{"key":"e_1_3_1_48_2","doi-asserted-by":"publisher","DOI":"10.23919\/EUSIPCO54536.2021.9616052"},{"key":"e_1_3_1_49_2","unstructured":"Jie Zhang Song Guo Xiaosong Ma Haozhao Wang Wenchao Xu and Feijie Wu. 2021. Parameterized knowledge transfer for personalized federated learning. Advances in Neural Information Processing Systems M. Ranzato A. Beygelzimer Y. Dauphin P. S. Liang and J. Wortman Vaughan (Eds.). Vol. 34 Curran Associates Inc. 10092\u201310104. 
https:\/\/proceedings.neurips.cc\/paper_files\/paper\/2021\/file\/5383c7318a3158b9bc261d0b6996f7c2-Paper.pdf"},{"key":"e_1_3_1_50_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.ins.2020.05.137"},{"key":"e_1_3_1_51_2","unstructured":"Yue Zhao Meng Li Liangzhen Lai Naveen Suda Damon Civin and Vikas Chandra. 2018. Federated learning with Non-IID data. arXiv:1806.00582. Retrieved from https:\/\/arxiv.org\/abs\/cs\/1806.00582"}],"container-title":["ACM Transactions on Knowledge Discovery from Data"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3638252","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3638252","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T22:53:35Z","timestamp":1750287215000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3638252"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,2,13]]},"references-count":50,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2024,5,31]]}},"alternative-id":["10.1145\/3638252"],"URL":"https:\/\/doi.org\/10.1145\/3638252","relation":{},"ISSN":["1556-4681","1556-472X"],"issn-type":[{"value":"1556-4681","type":"print"},{"value":"1556-472X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,2,13]]},"assertion":[{"value":"2023-06-05","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2023-12-11","order":1,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2024-02-13","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}