{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,19]],"date-time":"2026-01-19T10:09:12Z","timestamp":1768817352400,"version":"3.49.0"},"reference-count":32,"publisher":"MDPI AG","issue":"13","license":[{"start":{"date-parts":[[2021,6,23]],"date-time":"2021-06-23T00:00:00Z","timestamp":1624406400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"GDAS' Project of Science and Technology Development","award":["2021GDASYL-20210103001"],"award-info":[{"award-number":["2021GDASYL-20210103001"]}]},{"name":"GDAS' Project of Science and Technology Development","award":["2019GDASYL-0301001"],"award-info":[{"award-number":["2019GDASYL-0301001"]}]},{"name":"the Key Special Project for Introduced Talents Team of Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou)","award":["GML2019ZD0301"],"award-info":[{"award-number":["GML2019ZD0301"]}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["41976189"],"award-info":[{"award-number":["41976189"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002858","name":"China Postdoctoral Science Foundation","doi-asserted-by":"publisher","award":["BX20200100"],"award-info":[{"award-number":["BX20200100"]}],"id":[{"id":"10.13039\/501100002858","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>The capsule network (Caps) is a novel type of neural network that has great potential for the classification of hyperspectral remote sensing. However, the Caps suffers from the issue of gradient vanishing. 
To solve this problem, a powered activation regularization based adaptive capsule network (PAR-ACaps) was proposed for hyperspectral remote sensing classification, in which an adaptive routing algorithm without iteration was applied to amplify the gradient, and the powered activation regularization method was used to learn the sparser and more discriminative representation. The classification performance of PAR-ACaps was evaluated using two public hyperspectral remote sensing datasets, i.e., the Pavia University (PU) and Salinas (SA) datasets. The average overall classification accuracy (OA) of PAR-ACaps with shallower architecture was measured and compared with those of the benchmarks, including random forest (RF), support vector machine (SVM), 1-dimensional convolutional neural network (1DCNN), two-dimensional convolutional neural network (CNN), three-dimensional convolutional neural network (3DCNN), Caps, and the original adaptive capsule network (ACaps) with comparable network architectures. The OA of PAR-ACaps for PU and SA datasets was 99.51% and 94.52%, respectively, which was higher than those of benchmarks. Moreover, the classification performance of PAR-ACaps with relatively deeper neural architecture (four and six convolutional layers in the feature extraction stage) was also evaluated to demonstrate the effectiveness of gradient amplification. As shown in the experimental results, the classification performance of PAR-ACaps with relatively deeper neural architecture for PU and SA datasets was also superior to 1DCNN, CNN, 3DCNN, Caps, and ACaps with comparable neural architectures. Additionally, the training time consumed by PAR-ACaps was significantly lower than that of Caps. 
The proposed PAR-ACaps is, therefore, recommended as an effective alternative for hyperspectral remote sensing classification.<\/jats:p>","DOI":"10.3390\/rs13132445","type":"journal-article","created":{"date-parts":[[2021,6,23]],"date-time":"2021-06-23T03:22:00Z","timestamp":1624418520000},"page":"2445","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":19,"title":["An Adaptive Capsule Network for Hyperspectral Remote Sensing Classification"],"prefix":"10.3390","volume":"13","author":[{"given":"Xiaohui","family":"Ding","sequence":"first","affiliation":[{"name":"Guangzhou Institute of Geography, Guangdong Academy of Sciences, Guangzhou 510070, China"},{"name":"Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou), Guangzhou 511458, China"},{"name":"Guangdong Open Laboratory of Geospatial Information Technology and Application, Guangzhou 510070, China"},{"name":"Key Laboratory of Guangdong for Utilization of Remote Sensing and Geographical Information System, Guangzhou 510070, China"}]},{"given":"Yong","family":"Li","sequence":"additional","affiliation":[{"name":"Guangzhou Institute of Geography, Guangdong Academy of Sciences, Guangzhou 510070, China"},{"name":"Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou), Guangzhou 511458, China"},{"name":"Guangdong Open Laboratory of Geospatial Information Technology and Application, Guangzhou 510070, China"},{"name":"Key Laboratory of Guangdong for Utilization of Remote Sensing and Geographical Information System, Guangzhou 510070, China"}]},{"given":"Ji","family":"Yang","sequence":"additional","affiliation":[{"name":"Guangzhou Institute of Geography, Guangdong Academy of Sciences, Guangzhou 510070, China"},{"name":"Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou), Guangzhou 511458, China"},{"name":"Guangdong Open Laboratory of Geospatial Information Technology and Application, Guangzhou 510070, 
China"},{"name":"Key Laboratory of Guangdong for Utilization of Remote Sensing and Geographical Information System, Guangzhou 510070, China"}]},{"given":"Huapeng","family":"Li","sequence":"additional","affiliation":[{"name":"Northeast Institute of Geography and Agroecology, Chinese Academy of Sciences, Changchun 130102, China"}]},{"given":"Lingjia","family":"Liu","sequence":"additional","affiliation":[{"name":"School of Geography and Environment, Jiangxi Normal University, Nanchang 330027, China"}]},{"given":"Yangxiaoyue","family":"Liu","sequence":"additional","affiliation":[{"name":"Guangzhou Institute of Geography, Guangdong Academy of Sciences, Guangzhou 510070, China"},{"name":"Southern Marine Science and Engineering Guangdong Laboratory (Guangzhou), Guangzhou 511458, China"},{"name":"Guangdong Open Laboratory of Geospatial Information Technology and Application, Guangzhou 510070, China"},{"name":"Key Laboratory of Guangdong for Utilization of Remote Sensing and Geographical Information System, Guangzhou 510070, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5100-3584","authenticated-orcid":false,"given":"Ce","family":"Zhang","sequence":"additional","affiliation":[{"name":"Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YQ, UK"},{"name":"UK Centre for Ecology & Hydrology, Library Avenue, Lancaster LA1 4AP, UK"}]}],"member":"1968","published-online":{"date-parts":[[2021,6,23]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1674","DOI":"10.1126\/science.1118160","article-title":"The importance of land-cover change in simulating future climates","volume":"310","author":"Feddema","year":"2005","journal-title":"Science"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"1093","DOI":"10.1080\/01431161.2019.1655810","article-title":"A restrictive polymorphic ant colony algorithm for the optimal band selection of hyperspectral remote sensing images","volume":"41","author":"Ding","year":"2020","journal-title":"Int. 
J. Remote Sens."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1110","DOI":"10.1109\/LGRS.2018.2890421","article-title":"Hyperspectral coastal wetland classification based on a multiobject convolutional neural network model and decision fusion","volume":"16","author":"Hu","year":"2019","journal-title":"IEEE Geosci. Remote Sens. Lett."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"25789","DOI":"10.1109\/ACCESS.2020.2971327","article-title":"An improved ant colony algorithm for optimized band selection of hyperspectral remotely sensed imagery","volume":"8","author":"Ding","year":"2020","journal-title":"IEEE Access"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"504","DOI":"10.1126\/science.1127647","article-title":"Reducing the dimensionality of data with neural networks","volume":"313","author":"Hinton","year":"2006","journal-title":"Science"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"2664","DOI":"10.1080\/01431161.2019.1694725","article-title":"Analysis of various optimizers on deep convolutional neural network model in the application of hyperspectral remote sensing image classification","volume":"41","author":"Bera","year":"2020","journal-title":"Int. J. Remote Sens."},{"key":"ref_7","unstructured":"Liang, M., and Hu, X. (2015, January 7\u201312). Recurrent convolutional neural network for object recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Li, H., Zhang, C., Zhang, S., and Atkinson, P.M. (2019). A hybrid OSVM-OCNN method for crop classification from fine spatial resolution remotely sensed imagery. 
Remote Sens., 11.","DOI":"10.3390\/rs11202370"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"133","DOI":"10.1016\/j.isprsjprs.2017.07.014","article-title":"A hybrid MLP-CNN classifier for very fine resolution remotely sensed image classification","volume":"140","author":"Zhang","year":"2018","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"2149","DOI":"10.1080\/01431161.2016.1171928","article-title":"Using convolutional features and a sparse autoencoder for land-use scene classification","volume":"37","author":"Othman","year":"2016","journal-title":"Int. J. Remote Sens."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"19","DOI":"10.1016\/j.neunet.2017.07.017","article-title":"A patch-based convolutional neural network for remote sensing image classification","volume":"95","author":"Sharma","year":"2017","journal-title":"Neural Netw."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"6232","DOI":"10.1109\/TGRS.2016.2584107","article-title":"Deep Feature Extraction and Classification of Hyperspectral Images Based on Convolutional Neural Networks","volume":"54","author":"Chen","year":"2016","journal-title":"IEEE Trans. Geosci. Remote. Sens."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Alam, F.I., Zhou, J., Liew, A.W.C., and Jia, X. (2016, January 10\u201315). CRF learning with CNN features for hyperspectral image segmentation. Proceedings of the 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China.","DOI":"10.1109\/IGARSS.2016.7730798"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"3599","DOI":"10.1109\/TGRS.2018.2886022","article-title":"A CNN with multiscale convolution and diversified metric for hyperspectral image classification","volume":"57","author":"Gong","year":"2019","journal-title":"IEEE Trans. Geosci. 
Remote Sens."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Khotimah, W.N., Bennamoun, M., Boussaid, F., Sohel, F., and Edwards, D. (2020). A High-Performance Spectral-Spatial Residual Network for Hyperspectral Image Classification with Small Training Data. Remote Sens., 12.","DOI":"10.3390\/rs12193137"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1850","DOI":"10.1109\/LSP.2018.2873892","article-title":"ACaps: A novel multi-scale capsule network","volume":"25","author":"Xiang","year":"2018","journal-title":"IEEE Signal Process. Lett."},{"key":"ref_17","unstructured":"Sabour, S., Frosst, N., and Hinton, G.E. (2017). Dynamic routing between capsules. Advances in Neural Information Processing Systems, MIT Press."},{"key":"ref_18","unstructured":"Ren, Q., Shang, S., and He, L. (2019). Adaptive Routing Between Capsules. arXiv."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Nguyen, C.D.T., Dao, H.H., Huynh, M.T., and Phu Ward, T. (2019, January 13). ResCap: Residual Capsules Network for Medical Image Segmentation. Proceedings of the 2019 Kidney Tumor Segmentation Challenge: KiTS19, Shenzhen, China.","DOI":"10.24926\/548719.058"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Chen, R., Jalal, M.A., and Mihaylova, L. (2018, January 10\u201313). Learning capsules for vehicle logo recognition. Proceedings of the 2018 21st International Conference on Information Fusion, Cambridge, UK.","DOI":"10.23919\/ICIF.2018.8455227"},{"key":"ref_21","unstructured":"Duarte, K., Rawat, Y., and Shah, M. (2018). VideoCapsuleNet: A simplified network for action detection. Advances in Neural Information Processing Systems, Available online: https:\/\/dl.acm.org\/doi\/10.5555\/3327757.3327860."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Be\u015fer, F., Kizrak, M.A., Bolat, B., and Yildirim, T. (2018, January 2\u20135). Recognition of sign language using capsule networks. 
Proceedings of the 2018 26th Signal Processing and Communications Applications Conference (SIU), Izmir, Turkey.","DOI":"10.1109\/SIU.2018.8404385"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"2145","DOI":"10.1109\/TGRS.2018.2871782","article-title":"Capsule networks for hyperspectral image classification","volume":"57","author":"Paoletti","year":"2018","journal-title":"IEEE Trans. Geosci. Remote Sens."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Wang, W.Y., Li, H.C., Pan, L., Yang, G., and Du, Q. (2018, January 22\u201327). Hyperspectral Image Classification Based on Capsule Network. Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain.","DOI":"10.1109\/IGARSS.2018.8518951"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Deng, F., Pu, S., Chen, X., Shi, Y., Yuan, T., and Pu, S. (2018). Hyperspectral image classification with capsule network using limited training samples. Sensors, 18.","DOI":"10.3390\/s18093153"},{"key":"ref_26","unstructured":"Zhao, Z., Kleinhans, A., Sandhu, G., Patel, I., and Unnikrishnan, K.P. (2019). Capsule networks with max-min normalization. arXiv."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Jia, B., and Huang, Q. (2020). DE-CapsNet: A diverse enhanced capsule network with disperse dynamic routing. Appl. Sci., 10.","DOI":"10.3390\/app10030884"},{"key":"ref_28","unstructured":"Kwabena, P.M., Weyori, B.A., and Mighty, A.A. (2020). Exploring the performance of LBP-capsule networks with K-Means routing on complex images. J. King Saud Univ. Comput. Inf. Sci."},{"key":"ref_29","unstructured":"Yang, Z., and Wang, X. (2019). Reducing the dilution: An analysis of the information sensitiveness of capsule network with a practical solution. arXiv."},{"key":"ref_30","unstructured":"Hinton, G.E., Sabour, S., and Frosst, N. (2018). Matrix capsules with EM routing. 
Proceedings of the International Conference on Learning Representations, Vancouver, BC, Canada."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"223","DOI":"10.1080\/01431160500275762","article-title":"Comparing accuracy assessments to infer superiority of image classification methods","volume":"27","author":"Jia","year":"2006","journal-title":"Int. J. Remote Sens."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"57","DOI":"10.1016\/j.rse.2018.06.034","article-title":"An object-based convolutional neural network (OCNN) for urban land use classification","volume":"216","author":"Zhang","year":"2018","journal-title":"Remote Sens. Environ."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/13\/2445\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T06:21:35Z","timestamp":1760163695000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/13\/2445"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,6,23]]},"references-count":32,"journal-issue":{"issue":"13","published-online":{"date-parts":[[2021,7]]}},"alternative-id":["rs13132445"],"URL":"https:\/\/doi.org\/10.3390\/rs13132445","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,6,23]]}}}