{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,13]],"date-time":"2026-01-13T16:11:58Z","timestamp":1768320718056,"version":"3.49.0"},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2023,8]]},"abstract":"<jats:p>In real teaching scenarios, an excellent teacher always teaches what he (or she) is good at but the student is not. This gives the student the best assistance in making up for his (or her) weaknesses and becoming a good one overall. Enlightened by this, we introduce the \"Teaching what you Should Teach\" strategy into a knowledge distillation framework, and propose a data-based distillation method named \"TST\" that searches for desirable augmented samples to assist in distilling more efficiently and rationally. To be specific, we design a neural network-based data augmentation module with a priori bias to find out what matches the teacher's strengths but the student's weaknesses, by learning magnitudes and probabilities to generate suitable data samples. By training the data augmentation module and the generalized distillation paradigm alternately, a student model is learned with excellent generalization ability. To verify the effectiveness of our method, we conducted extensive comparative experiments on object recognition, detection, and segmentation tasks. The results on the CIFAR-100, ImageNet-1k, MS-COCO, and Cityscapes datasets demonstrate that our method achieves state-of-the-art performance on almost all teacher-student pairs. Furthermore, we conduct visualization studies to explore what magnitudes and probabilities are needed for the distillation process.<\/jats:p>",
"DOI":"10.24963\/ijcai.2023\/150","type":"proceedings-article","created":{"date-parts":[[2023,8,11]],"date-time":"2023-08-11T04:31:30Z","timestamp":1691728290000},"page":"1351-1359","source":"Crossref","is-referenced-by-count":1,"title":["Teaching What You Should Teach: A Data-Based Distillation Method"],"prefix":"10.24963","author":[{"given":"Shitong","family":"Shao","sequence":"first","affiliation":[{"name":"Beijing Key Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, China"}]},{"given":"Huanran","family":"Chen","sequence":"additional","affiliation":[{"name":"Beijing Key Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, China"}]},{"given":"Zhen","family":"Huang","sequence":"additional","affiliation":[{"name":"University of Science and Technology of China, Hefei, China"}]},{"given":"Linrui","family":"Gong","sequence":"additional","affiliation":[{"name":"Hunan University, Hunan, China"}]},{"given":"Shuai","family":"Wang","sequence":"additional","affiliation":[{"name":"Tsinghua University, Beijing, China"}]},{"given":"Xinxiao","family":"Wu","sequence":"additional","affiliation":[{"name":"Beijing Key Laboratory of Intelligent Information Technology, School of Computer Science and Technology, Beijing Institute of Technology, China"}]}],"member":"10584","event":{"name":"Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}","theme":"Artificial Intelligence","location":"Macau, SAR China","acronym":"IJCAI-2023","number":"32","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"start":{"date-parts":[[2023,8,19]]},"end":{"date-parts":[[2023,8,25]]}},"container-title":["Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence"],
"original-title":[],"deposited":{"date-parts":[[2023,8,11]],"date-time":"2023-08-11T04:37:57Z","timestamp":1691728677000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2023\/150"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research Articles","short-title":[],"issued":{"date-parts":[[2023,8]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2023\/150","relation":{},"subject":[],"published":{"date-parts":[[2023,8]]}}}