{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T23:36:43Z","timestamp":1761176203414,"version":"build-2065373602"},"reference-count":0,"publisher":"IOS Press","isbn-type":[{"value":"9781643686318","type":"electronic"}],"license":[{"start":{"date-parts":[[2025,10,21]],"date-time":"2025-10-21T00:00:00Z","timestamp":1761004800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2025,10,21]]},"abstract":"<jats:p>Convolutional neural networks (CNNs) excel at learning spatial patterns through local receptive fields, while channel attention mechanisms enhance feature discrimination by emphasizing informative channels. Traditional approaches like SENet generate channel weights through global pooling and bottlenecked MLPs, but fundamentally ignore cross-channel dependencies\u2014a critical limitation that discards contextual relationships between features. To address this, we propose the Cross-Channel Graph Attention mechanism (CCA), which explicitly models channel interactions via graph neural networks. Specifically, CCA represents channels as graph nodes, constructs sparse KNN graphs to capture inter-channel dependencies, and performs attention-based feature aggregation. Theoretically, we prove that CCA strictly expands the expressive power of conventional channel attention through full-rank attention matrices, while maintaining superior generalization via sparsity-induced Rademacher complexity bounds. Furthermore, we establish that CCA can universally approximate any continuous channel mixing function with controlled neighbor size (Theorem 3). Extensive experiments on image classification (CIFAR-10\/100, mini-ImageNet) and super-resolution (Set5\/14) validate that CCA-equipped networks achieve state-of-the-art performance\u2014improving ResNet34 by 2.16% TOP-1 accuracy on CIFAR-10 and boosting VDSR\u2019s PSNR by 0.43dB on \u00d73 scaling. Both theoretical and empirical analyses demonstrate that CCA achieves an optimal balance between feature interaction capacity and computational efficiency through its graph-structured attention paradigm. The code is available at https:\/\/github.com\/openreview-pro\/CCA<\/jats:p>","DOI":"10.3233\/faia251088","type":"book-chapter","created":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:51:06Z","timestamp":1761126666000},"source":"Crossref","is-referenced-by-count":0,"title":["Cross-Channel Graph Attention Mechanism for Deep Convolutional Neural Networks"],"prefix":"10.3233","author":[{"given":"Kaixuan","family":"Yao","sequence":"first","affiliation":[{"name":"Shanxi Taihang Laboratory, Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University"}]},{"given":"Mingxu","family":"Zhang","sequence":"additional","affiliation":[{"name":"Shanxi Taihang Laboratory, Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University"}]},{"given":"Jiao","family":"Zhao","sequence":"additional","affiliation":[{"name":"Shanxi Taihang Laboratory, Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University"}]},{"given":"Junbiao","family":"Cui","sequence":"additional","affiliation":[{"name":"Shanxi Taihang Laboratory, Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University"}]},{"given":"Jiye","family":"Liang","sequence":"additional","affiliation":[{"name":"Shanxi Taihang Laboratory, Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, School of Computer and Information Technology, Shanxi University"}]}],"member":"7437","container-title":["Frontiers in Artificial Intelligence and Applications","ECAI 2025"],"original-title":[],"link":[{"URL":"https:\/\/ebooks.iospress.nl\/pdf\/doi\/10.3233\/FAIA251088","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:51:06Z","timestamp":1761126666000},"score":1,"resource":{"primary":{"URL":"https:\/\/ebooks.iospress.nl\/doi\/10.3233\/FAIA251088"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,10,21]]},"ISBN":["9781643686318"],"references-count":0,"URL":"https:\/\/doi.org\/10.3233\/faia251088","relation":{},"ISSN":["0922-6389","1879-8314"],"issn-type":[{"value":"0922-6389","type":"print"},{"value":"1879-8314","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,10,21]]}}}