{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,6]],"date-time":"2026-02-06T05:34:26Z","timestamp":1770356066177,"version":"3.49.0"},"reference-count":0,"publisher":"IOS Press","isbn-type":[{"value":"9781643686318","type":"electronic"}],"license":[{"start":{"date-parts":[[2025,10,21]],"date-time":"2025-10-21T00:00:00Z","timestamp":1761004800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2025,10,21]]},"abstract":"<jats:p>The success of current Large Language Models (LLMs) hinges on extensive training data collected and stored centrally, a paradigm known as Centralized Learning (CL). However, centralized collection poses a privacy threat, and one potential solution is Federated Learning (FL), which transfers gradients, not raw data, among clients. Unlike traditional networks, FL for LLMs incurs significant communication costs because of their enormous parameter counts. In this study, we introduce an approach that compresses gradients to improve communication efficiency during LLM FL, forming a new FL pipeline named CG-FedLLM. This approach integrates an encoder on the client side to extract compressed gradient features and a decoder on the server side to reconstruct the gradients. We also develop a novel training strategy comprising Temporal-ensemble Gradient-Aware Pre-training (TGAP) to identify characteristic gradients of the target model and Federated AutoEncoder-Involved Fine-tuning (FAF) to compress gradients adaptively. Extensive experiments confirm that our approach reduces communication costs and improves performance (e.g., an average 3-point improvement over traditional CL- and FL-based fine-tuning with several foundation models on the well-recognized MMLU and C-Eval benchmarks).
This is because our encoder-decoder, trained via TGAP and FAF, can filter gradients while selectively preserving critical features. Furthermore, we present a series of experimental analyses focusing on communication efficiency, accuracy, and generalization ability within this privacy-centric framework, providing insights into the development of more efficient and private LLM fine-tuning.<\/jats:p>","DOI":"10.3233\/faia251320","type":"book-chapter","created":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:58:10Z","timestamp":1761127090000},"source":"Crossref","is-referenced-by-count":1,"title":["CG-FedLLM: How to Compress Gradients in Federated Fine-Tuning for Large Language Models"],"prefix":"10.3233","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-8471-4219","authenticated-orcid":false,"given":"Huiwen","family":"Wu","sequence":"first","affiliation":[{"name":"Zhejiang Laboratory"}]},{"given":"Xiaogang","family":"Xu","sequence":"additional","affiliation":[{"name":"The Chinese University of Hong Kong"}]},{"given":"Deyi","family":"Zhang","sequence":"additional","affiliation":[{"name":"Zhejiang Laboratory"}]},{"given":"Xiaohan","family":"Li","sequence":"additional","affiliation":[{"name":"Zhejiang Laboratory"}]},{"given":"Jiafei","family":"Wu","sequence":"additional","affiliation":[{"name":"Zhejiang Laboratory"}]},{"given":"Zhe","family":"Liu","sequence":"additional","affiliation":[{"name":"Zhejiang Laboratory"}]}],"member":"7437","container-title":["Frontiers in Artificial Intelligence and Applications","ECAI 2025"],
"original-title":[],"link":[{"URL":"https:\/\/ebooks.iospress.nl\/pdf\/doi\/10.3233\/FAIA251320","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:58:11Z","timestamp":1761127091000},"score":1,"resource":{"primary":{"URL":"https:\/\/ebooks.iospress.nl\/doi\/10.3233\/FAIA251320"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,10,21]]},"ISBN":["9781643686318"],"references-count":0,"URL":"https:\/\/doi.org\/10.3233\/faia251320","relation":{},"ISSN":["0922-6389","1879-8314"],"issn-type":[{"value":"0922-6389","type":"print"},{"value":"1879-8314","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,10,21]]}}}