{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,9,24]],"date-time":"2025-09-24T00:14:55Z","timestamp":1758672895185,"version":"3.44.0"},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2025,9]]},"abstract":"<jats:p>Classical Chinese, as the core carrier of Chinese culture, plays a crucial role in the inheritance and study of ancient literature. However, existing natural language processing models are optimized primarily for Modern Chinese, resulting in inadequate performance on Classical Chinese. This paper presents a comprehensive solution for Classical Chinese language processing. Through continued pre-training and instruction fine-tuning of the LLaMA3-8B-Chinese model, we construct WenyanGPT, a large language model specifically designed for Classical Chinese tasks. Additionally, we develop an evaluation benchmark dataset, WenyanBENCH. Experimental results on WenyanBENCH demonstrate that WenyanGPT significantly outperforms current advanced LLMs on various Classical Chinese tasks. We make the model's training data, instruction fine-tuning data, and evaluation benchmark dataset publicly available to promote further research and development in the field of Classical Chinese processing.<\/jats:p>","DOI":"10.24963\/ijcai.2025\/927","type":"proceedings-article","created":{"date-parts":[[2025,9,19]],"date-time":"2025-09-19T08:10:40Z","timestamp":1758269440000},"page":"8339-8347","source":"Crossref","is-referenced-by-count":0,"title":["WenyanGPT: A Large Language Model for Classical Chinese Tasks"],"prefix":"10.24963","author":[{"given":"Xinyu","family":"Yao","sequence":"first","affiliation":[{"name":"School of Information Engineering, Minzu University of China"}]},{"given":"Mengdi","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Information Engineering, Minzu University of China"}]},{"given":"Bo","family":"Chen","sequence":"additional","affiliation":[{"name":"School of Information Engineering, Minzu University of China"},{"name":"National Language Resource Monitoring and Research Center of Minority Languages"}]},{"given":"Xiaobing","family":"Zhao","sequence":"additional","affiliation":[{"name":"School of Information Engineering, Minzu University of China"},{"name":"National Language Resource Monitoring and Research Center of Minority Languages"}]}],"member":"10584","event":{"number":"34","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"acronym":"IJCAI-2025","name":"Thirty-Fourth International Joint Conference on Artificial Intelligence {IJCAI-25}","start":{"date-parts":[[2025,8,16]]},"theme":"Artificial Intelligence","location":"Montreal, Canada","end":{"date-parts":[[2025,8,22]]}},"container-title":["Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence"],"original-title":[],"deposited":{"date-parts":[[2025,9,23]],"date-time":"2025-09-23T11:35:30Z","timestamp":1758627330000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2025\/927"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research Articles","short-title":[],"issued":{"date-parts":[[2025,9]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2025\/927","relation":{},"subject":[],"published":{"date-parts":[[2025,9]]}}}