{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,8,7]],"date-time":"2024-08-07T07:32:59Z","timestamp":1723015979214},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2019,8]]},"abstract":"<jats:p>Network pruning is widely applied to deep CNN models due to their heavy computation costs, and achieves high performance by keeping important weights while removing redundancy. Pruning redundant weights directly may hurt global information flow, which suggests that an efficient sparse network should take graph properties into account. Thus, instead of focusing on preserving important weights, we focus on the pruned architecture itself. We propose to use graph entropy as the measurement, which exhibits useful properties for crafting high-quality neural graphs and enables us to propose an efficient algorithm to construct them as the initial network architecture. Our algorithm can be easily implemented and deployed to different popular CNN models and achieves better trade-offs.<\/jats:p>","DOI":"10.24963\/ijcai.2019\/311","type":"proceedings-article","created":{"date-parts":[[2019,7,28]],"date-time":"2019-07-28T03:46:05Z","timestamp":1564285565000},"page":"2244-2250","source":"Crossref","is-referenced-by-count":2,"title":["Crafting Efficient Neural Graph of Large Entropy"],"prefix":"10.24963","author":[{"given":"Minjing","family":"Dong","sequence":"first","affiliation":[{"name":"School of Computer Science, Faculty of Engineering, University of Sydney, Australia"}]},{"given":"Hanting","family":"Chen","sequence":"additional","affiliation":[{"name":"Key Laboratory of Machine Perception (Ministry of Education), Peking University, China"},{"name":"Huawei Noah\u2019s Ark Lab"}]},{"given":"Yunhe","family":"Wang","sequence":"additional","affiliation":[{"name":"Huawei Noah's Ark Lab"}]},{"given":"Chang","family":"Xu","sequence":"additional","affiliation":[{"name":"School of Computer Science, Faculty of Engineering, University of Sydney, Australia"}]}],"member":"10584","event":{"number":"28","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"acronym":"IJCAI-2019","name":"Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}","start":{"date-parts":[[2019,8,10]]},"theme":"Artificial Intelligence","location":"Macao, China","end":{"date-parts":[[2019,8,16]]}},"container-title":["Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence"],"original-title":[],"deposited":{"date-parts":[[2019,7,28]],"date-time":"2019-07-28T03:48:27Z","timestamp":1564285707000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2019\/311"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research Articles","short-title":[],"issued":{"date-parts":[[2019,8]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2019\/311","relation":{},"subject":[],"published":{"date-parts":[[2019,8]]}}}