{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,2]],"date-time":"2026-05-02T04:40:22Z","timestamp":1777696822693,"version":"3.51.4"},"reference-count":55,"publisher":"SAGE Publications","issue":"6","license":[{"start":{"date-parts":[[2025,2,26]],"date-time":"2025-02-26T00:00:00Z","timestamp":1740528000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"funder":[{"name":"Beijing Natural Science Foundation","award":["4222037, L181010"],"award-info":[{"award-number":["4222037, L181010"]}]},{"name":"BIT Research and Innovation Promoting Project","award":["2023YCXY036"],"award-info":[{"award-number":["2023YCXY036"]}]}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Intelligent Data Analysis: An International Journal"],"published-print":{"date-parts":[[2025,11]]},"abstract":"<jats:p>Federated learning enables multiple participants to train models without sharing their raw data. However, long-tailed data with imbalanced sample sizes among clients degrades the model\u2019s performance in federated learning. Additionally, existing studies on class prototypes are less effective for federated long-tailed issues, as the difference in class prototype representations between head classes and tail classes destabilizes global model updates across clients. Therefore, we propose a Federated Ensemble Prototypes Learning (FedEP) approach that employs ensemble class prototypes instead of local class prototypes to alleviate class representation bias. Specifically, each client partitions its local dataset into multiple subsets to derive subset class prototypes and filters biased subset class prototypes using a threshold to obtain ensemble class prototypes. 
The server then aggregates these ensemble prototypes to construct new global prototypes, which guide local training without extra data. Concurrently, we track category probability differences to assess the degree of deviation among class prototypes during the iterative process. Experiments under various settings show that our method is effective and outperforms baseline approaches on long-tailed data.<\/jats:p>","DOI":"10.1177\/1088467x251317420","type":"journal-article","created":{"date-parts":[[2025,2,27]],"date-time":"2025-02-27T02:04:09Z","timestamp":1740621849000},"page":"1459-1477","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":0,"title":["Federated ensemble learning on long-tailed data with prototypes"],"prefix":"10.1177","volume":"29","author":[{"ORCID":"https:\/\/orcid.org\/0009-0002-1933-3852","authenticated-orcid":false,"given":"Yang","family":"Li","sequence":"first","affiliation":[{"name":"School of Computer Science &amp; Technology, Beijing Institute of Technology, Beijing, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5010-6335","authenticated-orcid":false,"given":"Xin","family":"Liu","sequence":"additional","affiliation":[{"name":"School of Computer Science &amp; Technology, Beijing Institute of Technology, Beijing, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3528-4739","authenticated-orcid":false,"given":"Kan","family":"Li","sequence":"additional","affiliation":[{"name":"School of Computer Science &amp; Technology, Beijing Institute of Technology, Beijing, China"}]}],"member":"179","published-online":{"date-parts":[[2025,2,26]]},"reference":[{"key":"e_1_3_3_2_2","unstructured":"McMahan B Moore E Ramage D et\u00a0al. Communication-efficient learning of deep networks from decentralized data. In: Artificial intelligence and statistics. PMLR 2017 pp.1273\u20131282."},{"key":"e_1_3_3_3_2","unstructured":"Oh J Kim S Yun SY. 
FedBABU: towards enhanced representation for federated image classification. ArXiv: abs\/2106.06042 2021."},{"key":"e_1_3_3_4_2","doi-asserted-by":"publisher","DOI":"10.1109\/TBDATA.2022.3180117"},{"key":"e_1_3_3_5_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.knosys.2024.111633"},{"key":"e_1_3_3_6_2","unstructured":"Fan T Kang Y Ma G et\u00a0al. FATE-LLM: a industrial grade federated learning framework for large language models. arXiv e-prints 2023."},{"key":"e_1_3_3_7_2","doi-asserted-by":"crossref","unstructured":"Zhang J Vahidian S Kuo M et\u00a0al. Towards building the federated GPT: federated instruction tuning. In: IEEE international conference on acoustics speech and signal processing (ICASSP) 2024 pp.6915\u20136919.","DOI":"10.1109\/ICASSP48485.2024.10447454"},{"key":"e_1_3_3_8_2","unstructured":"Krizhevsky A. Learning multiple layers of features from tiny images. In: Technical report 2009."},{"key":"e_1_3_3_9_2","doi-asserted-by":"publisher","DOI":"10.1145\/3550302"},{"key":"e_1_3_3_10_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2024.127906"},{"key":"e_1_3_3_11_2","doi-asserted-by":"publisher","DOI":"10.1109\/TPAMI.2023.3268118"},{"key":"e_1_3_3_12_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.future.2023.01.019"},{"key":"e_1_3_3_13_2","doi-asserted-by":"crossref","unstructured":"Zhang B Li X Ye Y et\u00a0al. Prototype completion with primitive knowledge for few-shot learning. In: IEEE \/ CVF computer vision and pattern recognition conference (CVPR) 2021 pp.3754\u20133762.","DOI":"10.1109\/CVPR46437.2021.00375"},{"key":"e_1_3_3_14_2","doi-asserted-by":"crossref","unstructured":"Dai Y Chen Z Li J et\u00a0al. Tackling data heterogeneity in federated learning with class prototypes. In: Proceedings of AAAI conference on artificial intelligence (AAAI) 2023 pp.7314\u20137322.","DOI":"10.1609\/aaai.v37i6.25891"},{"key":"e_1_3_3_15_2","unstructured":"Chen Z Liu S Wang H et\u00a0al. Towards federated long-tailed learning. 
In: International workshop on trustworthy federated learning in conjunction with IJCAI 2022 2022."},{"key":"e_1_3_3_16_2","doi-asserted-by":"crossref","unstructured":"Sharma S Xian Y Yu N et\u00a0al. Learning prototype classifiers for long-tailed recognition. In: Proceedings of the thirty-second international joint conference on artificial intelligence (IJCAI) 2023 pp.1360\u20131368.","DOI":"10.24963\/ijcai.2023\/151"},{"key":"e_1_3_3_17_2","doi-asserted-by":"crossref","unstructured":"Shang X Lu Y Huang G et\u00a0al. Federated learning on heterogeneous and long-tailed data via classifier re-training with federated features. In: Proceedings of the thirty-first international joint conference on artificial intelligence IJCAI 2022 pp.2218\u20132224.","DOI":"10.24963\/ijcai.2022\/308"},{"key":"e_1_3_3_18_2","doi-asserted-by":"crossref","unstructured":"Liu J Song L Qin Y. Prototype rectification for few-shot learning. In: European conference on computer vision (ECCV) 2020 pp.741\u2013756.","DOI":"10.1007\/978-3-030-58452-8_43"},{"key":"e_1_3_3_19_2","doi-asserted-by":"crossref","unstructured":"Tan Y Long G Liu L et\u00a0al. FedProto: federated prototype learning across heterogeneous clients. In: Proceedings of AAAI conference on artificial intelligence (AAAI) 2022 pp.8432\u20138440.","DOI":"10.1609\/aaai.v36i8.20819"},{"key":"e_1_3_3_20_2","doi-asserted-by":"crossref","unstructured":"Zhang J Liu Y Hua Y et\u00a0al. FedTGP: trainable global prototypes with adaptive-margin-enhanced contrastive learning for data and model heterogeneity in federated learning. In: Thirty-eighth AAAI conference on artificial intelligence AAAI 2024 pp.16768\u201316776. AAAI Press.","DOI":"10.1609\/aaai.v38i15.29617"},{"key":"e_1_3_3_21_2","doi-asserted-by":"crossref","unstructured":"Kim H Kwak Y Jung M et\u00a0al. ProtoFL: unsupervised federated learning via prototypical distillation. 
In: International conference on computer vision (ICCV) 2023 pp.6470\u20136479.","DOI":"10.1109\/ICCV51070.2023.00595"},{"key":"e_1_3_3_22_2","doi-asserted-by":"crossref","unstructured":"He Y Wu J Wei X. Distilling virtual examples for long-tailed recognition. In: IEEE\/CVF international conference on computer vision (ICCV) 2021 pp.235\u2013244. IEEE.","DOI":"10.1109\/ICCV48922.2021.00030"},{"key":"e_1_3_3_23_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11263-022-01622-8"},{"key":"e_1_3_3_24_2","unstructured":"Yue Y Kang B Ma X et\u00a0al. Boosting offline reinforcement learning via data rebalancing. ArXiv :2210.09241 2022."},{"key":"e_1_3_3_25_2","doi-asserted-by":"publisher","DOI":"10.1109\/TC.2023.3315066"},{"key":"e_1_3_3_26_2","doi-asserted-by":"crossref","unstructured":"Cui Y Jia M Lin T et\u00a0al. Class-balanced loss based on effective number of samples. In: IEEE \/ CVF computer vision and pattern recognition conference (CVPR) 2019 pp.9268\u20139277.","DOI":"10.1109\/CVPR.2019.00949"},{"key":"e_1_3_3_27_2","unstructured":"Makhija D Han X Ho N et\u00a0al. Architecture agnostic federated learning for neural networks. In: International conference on machine learning (ICML) 2022 pp.14860\u201314870."},{"key":"e_1_3_3_28_2","article-title":"Prototype guided federated learning of visual feature representations","author":"Michieli U","year":"2021","unstructured":"Michieli U, Ozay M. Prototype guided federated learning of visual feature representations. ArXiv 2021, abs\/2105.08982.","journal-title":"ArXiv"},{"key":"e_1_3_3_29_2","doi-asserted-by":"crossref","unstructured":"Chou Y Hong S Sun C et\u00a0al. GRP-FED: addressing client imbalance in federated learning via global-regularized personalization. In: SIAM international conference on data mining (SDM) 2022 pp.451\u2013458.","DOI":"10.1137\/1.9781611977172.51"},{"key":"e_1_3_3_30_2","doi-asserted-by":"crossref","unstructured":"Li Q He B Song D. Model-contrastive federated learning. 
In: IEEE \/ CVF computer vision and pattern recognition conference (CVPR) 2021 pp.10713\u201310722.","DOI":"10.1109\/CVPR46437.2021.01057"},{"key":"e_1_3_3_31_2","unstructured":"Collins L Hassani H Mokhtari A et\u00a0al. Exploiting shared representations for personalized federated learning. In: International conference on machine learning (ICML) vol. 139 2021 pp.2089\u20132099. PMLR."},{"key":"e_1_3_3_32_2","unstructured":"Qiao Y Park S Kang SM et\u00a0al. Prototype helps federated learning: towards faster convergence. ArXiv:2303.12296 2023."},{"key":"e_1_3_3_33_2","doi-asserted-by":"publisher","DOI":"10.1109\/TPDS.2023.3324426"},{"key":"e_1_3_3_34_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2024.110542"},{"key":"e_1_3_3_35_2","first-page":"3732","article-title":"Prototype learning in machine learning: A literature review","volume":"33","author":"Zhang X","year":"2022","unstructured":"Zhang X, Zhu Z, Zhao Y, et\u00a0al. Prototype learning in machine learning: A literature review. J Softw 2022; 33: 3732\u20133753.","journal-title":"J Softw"},{"key":"e_1_3_3_36_2","doi-asserted-by":"crossref","unstructured":"Xu J Yang M Ding W et\u00a0al. Stabilizing and improving federated learning with Non-IID data and client dropout. arXiv e-prints 2023.","DOI":"10.1007\/s10489-024-05956-3"},{"key":"e_1_3_3_37_2","doi-asserted-by":"crossref","unstructured":"Qiao Y Munir MS Adhikary A et\u00a0al. A framework for multi-prototype based federated learning: towards the edge intelligence. In: International conference on information networking (ICOIN) 2023 pp.134\u2013139.","DOI":"10.1109\/ICOIN56518.2023.10048999"},{"key":"e_1_3_3_38_2","unstructured":"Snell J Swersky K Zemel R. Prototypical networks for few-shot learning. In: Neural information processing systems (NeurIPS) 2017 pp.4080\u20134090."},{"key":"e_1_3_3_39_2","unstructured":"Li J Zhou P Xiong C et\u00a0al. Prototypical contrastive learning of unsupervised representations. 
In: International conference on learning representations (ICLR) 2021."},{"key":"e_1_3_3_40_2","doi-asserted-by":"crossref","unstructured":"Du Y Shen J Zhen X et\u00a0al. SuperDisco: super-class discovery improves visual recognition for the long-tail. In: Proceeding of the IEEE\/CVF conference on computer vision and pattern recognition(CVPR) 2023 pp.19944\u201319954.","DOI":"10.1109\/CVPR52729.2023.01910"},{"key":"e_1_3_3_41_2","unstructured":"Yang Z Zhang Y Zheng Y et\u00a0al. FedFed: feature distillation against data heterogeneity in federated learning. In: Advances in neural information processing systems 36: annual conference on neural information processing systems (NeurIPS) 2023."},{"key":"e_1_3_3_42_2","first-page":"1","article-title":"Communication-efficient federated learning via knowledge distillation","volume":"13","author":"Wu C","year":"2022","unstructured":"Wu C, Wu F, Lyu L, et al. Communication-efficient federated learning via knowledge distillation. Nat Commun 2022; 13: 1\u20138.","journal-title":"Nat Commun"},{"key":"e_1_3_3_43_2","unstructured":"Li D Wang J. FedMD: heterogenous federated learning via model distillation. In: NeurIPS\u201919 workshop 2019."},{"key":"e_1_3_3_44_2","doi-asserted-by":"crossref","unstructured":"Qiao P Zhao K Bi B et\u00a0al. Feed: towards personalization-effective federated learning. In: IEEE 40th international conference on data engineering (ICDE) 2024 pp.1779\u20131791. IEEE.","DOI":"10.1109\/ICDE60146.2024.00144"},{"key":"e_1_3_3_45_2","doi-asserted-by":"crossref","unstructured":"Wang L Wang W Li B. CMFL: mitigating communication overhead for federated learning. In: 39th IEEE international conference on distributed computing systems (ICDCS) 2019 pp.954\u2013964. IEEE.","DOI":"10.1109\/ICDCS.2019.00099"},{"key":"e_1_3_3_46_2","unstructured":"Yang W Chen D Zhou H et\u00a0al. Integrating local real data with global gradient prototypes for classifier re-balancing in federated long-tailed learning. 
ArXiv 2023."},{"key":"e_1_3_3_47_2","doi-asserted-by":"publisher","DOI":"10.1631\/FITEE.2300181"},{"key":"e_1_3_3_48_2","doi-asserted-by":"publisher","DOI":"10.1109\/LCOMM.2024.3501956"},{"key":"e_1_3_3_49_2","first-page":"429","article-title":"Federated optimization in heterogeneous networks","volume":"2","author":"Li T","year":"2020","unstructured":"Li T, Sahu A, Zaheer M, et\u00a0al. Federated optimization in heterogeneous networks. Proc Mach Learn Syst 2020; 2: 429\u2013450.","journal-title":"Proc Mach Learn Syst"},{"key":"e_1_3_3_50_2","first-page":"2661","article-title":"Flexible clustered federated learning for client-level data distribution shift","volume":"33","author":"Duan M","year":"2022","unstructured":"Duan M, Liu D, Ji X, et\u00a0al. Flexible clustered federated learning for client-level data distribution shift. IEEE Trans Parallel Distrib Syst 2022; 33: 2661\u20132674.","journal-title":"IEEE Trans Parallel Distrib Syst"},{"key":"e_1_3_3_51_2","unstructured":"Li X Huang K Yang W et\u00a0al. On the convergence of FedAvg on non-IID data. In: 8th International conference on learning representations (ICLR) 2020."},{"key":"e_1_3_3_52_2","first-page":"1565","article-title":"Learning imbalanced datasets with label-distribution-aware margin loss","volume":"32","author":"Cao K","year":"2019","unstructured":"Cao K, Wei C, Gaidon A, et al. Learning imbalanced datasets with label-distribution-aware margin loss. Neural Information Processing Systems (NeurIPS) 2019; 32: 1565\u20131576.","journal-title":"Neural Information Processing Systems (NeurIPS)"},{"key":"e_1_3_3_53_2","unstructured":"Karimireddy S Kale S Mohri M et\u00a0al. Scaffold: stochastic controlled averaging for federated learning. In: International conference on machine learning (ICML) 2020 pp.5132\u20135143. PMLR."},{"key":"e_1_3_3_54_2","unstructured":"Li X Jiang M Zhang X et\u00a0al. FedBN: federated learning on non-IID features via local batch normalization. 
In: International conference on learning representations (ICLR) 2021."},{"key":"e_1_3_3_55_2","unstructured":"Sarkar D Narang A Rai S. Fed-focal loss for imbalanced data classification in federated learning. In: FL-IJCAI\u201920 2020."},{"key":"e_1_3_3_56_2","unstructured":"Shen Z Cervino J Hassani H Ribeiro A. An agnostic approach to federated learning with class imbalance. In: International conference on learning representations (ICLR) 2022."}],"container-title":["Intelligent Data Analysis: An International Journal"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/1088467X251317420","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.1177\/1088467X251317420","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/1088467X251317420","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,4,29]],"date-time":"2026-04-29T09:21:03Z","timestamp":1777454463000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.1177\/1088467X251317420"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,2,26]]},"references-count":55,"journal-issue":{"issue":"6","published-print":{"date-parts":[[2025,11]]}},"alternative-id":["10.1177\/1088467X251317420"],"URL":"https:\/\/doi.org\/10.1177\/1088467x251317420","relation":{},"ISSN":["1088-467X","1571-4128"],"issn-type":[{"value":"1088-467X","type":"print"},{"value":"1571-4128","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,2,26]]}}}