{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,11]],"date-time":"2026-04-11T09:28:52Z","timestamp":1775899732592,"version":"3.50.1"},"reference-count":40,"publisher":"MDPI AG","issue":"20","license":[{"start":{"date-parts":[[2022,10,20]],"date-time":"2022-10-20T00:00:00Z","timestamp":1666224000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["42001198"],"award-info":[{"award-number":["42001198"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["42071195"],"award-info":[{"award-number":["42071195"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["2022B1212100006"],"award-info":[{"award-number":["2022B1212100006"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["2022GDASZH-2022010202"],"award-info":[{"award-number":["2022GDASZH-2022010202"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["2020GDASYL-20200104004"],"award-info":[{"award-number":["2020GDASYL-20200104004"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["CTCZ19K04"],"award-info":[{"award-number":["CTCZ19K04"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Science and Technology Program of Guangdong","award":["42001198"],"award-info":[{"award-number":["42001198"]}]},{"name":"Science and Technology Program of Guangdong","award":["42071195"],"award-info":[{"award-number":["42071195"]}]},{"name":"Science and Technology Program of Guangdong","award":["2022B1212100006"],"award-info":[{"award-number":["2022B1212100006"]}]},{"name":"Science and Technology Program of Guangdong","award":["2022GDASZH-2022010202"],"award-info":[{"award-number":["2022GDASZH-2022010202"]}]},{"name":"Science and Technology Program of Guangdong","award":["2020GDASYL-20200104004"],"award-info":[{"award-number":["2020GDASYL-20200104004"]}]},{"name":"Science and Technology Program of Guangdong","award":["CTCZ19K04"],"award-info":[{"award-number":["CTCZ19K04"]}]},{"name":"GDAS\u2019 Project of Science and Technology Development","award":["42001198"],"award-info":[{"award-number":["42001198"]}]},{"name":"GDAS\u2019 Project of Science and Technology Development","award":["42071195"],"award-info":[{"award-number":["42071195"]}]},{"name":"GDAS\u2019 Project of Science and Technology Development","award":["2022B1212100006"],"award-info":[{"award-number":["2022B1212100006"]}]},{"name":"GDAS\u2019 Project of Science and Technology Development","award":["2022GDASZH-2022010202"],"award-info":[{"award-number":["2022GDASZH-2022010202"]}]},{"name":"GDAS\u2019 Project of Science and Technology 
Development","award":["2020GDASYL-20200104004"],"award-info":[{"award-number":["2020GDASYL-20200104004"]}]},{"name":"GDAS\u2019 Project of Science and Technology Development","award":["CTCZ19K04"],"award-info":[{"award-number":["CTCZ19K04"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["42001198"],"award-info":[{"award-number":["42001198"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["42071195"],"award-info":[{"award-number":["42071195"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["2022B1212100006"],"award-info":[{"award-number":["2022B1212100006"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["2022GDASZH-2022010202"],"award-info":[{"award-number":["2022GDASZH-2022010202"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["2020GDASYL-20200104004"],"award-info":[{"award-number":["2020GDASYL-20200104004"]}]},{"name":"Open Fund of National-Local Joint Engineering Laboratory","award":["CTCZ19K04"],"award-info":[{"award-number":["CTCZ19K04"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>The classification of architectural style for Chinese traditional settlements (CTSs) has become a crucial task for developing and preserving settlements. Traditionally, the classification of CTSs primarily relies on manual work, which is inefficient and time consuming. Inspired by the tremendous success of deep learning (DL), some recent studies attempted to apply DL networks such as convolution neural networks (CNNs) to achieve automated classification of the architecture styles. However, these studies suffer overfitting problems of the CNNs, leading to inferior classification performance. Moreover, most of the studies apply the CNNs as a black box providing limited interpretability. To address these limitations, a new DL classification framework is proposed in this study to overcome the overfitting problem by transfer learning and learning-based data augmentation technique (i.e., AutoAugment). Furthermore, we also employ class activation map (CAM) visualization technique to help understand how the CNN classifiers work to abstract patterns from the input. Specifically, due to a lack of architectural style datasets for the CTSs, a new annotated dataset is first established with six representative classes. Second, several representative CNNs are leveraged to benchmark the new dataset. Third, to address the overfitting problem of the CNNs, a new DL framework is proposed which combines transfer learning and AutoAugment to improve the classification performance. Extensive experiments are conducted on the new dataset to demonstrate the effectiveness of our framework. The proposed framework achieves much better performance than baselines, greatly mitigating the overfitting problem. 
Additionally, the CAM visualization technique is harnessed to explain what the CNN classifiers implicitly learn, and how, when recognizing a specified architectural style.<\/jats:p>",
"DOI":"10.3390\/rs14205250","type":"journal-article","created":{"date-parts":[[2022,10,21]],"date-time":"2022-10-21T00:34:30Z","timestamp":1666312470000},"page":"5250","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":30,"title":["Towards Classification of Architectural Styles of Chinese Traditional Settlements Using Deep Learning: A Dataset, a New Framework, and Its Interpretability"],"prefix":"10.3390","volume":"14",
"author":[{"given":"Qing","family":"Han","sequence":"first","affiliation":[{"name":"College of Geography and Tourism, Hengyang Normal University, Hengyang 421002, China"},{"name":"Cooperative Innovation Centre for Digitalization of Cultural Heritage in Traditional Villages and Towns, Hengyang Normal University, Hengyang 421002, China"},{"name":"National-Local Joint Engineering Laboratory on Digital Preservation and Innovative Technologies for the Culture of Traditional Villages and Towns, Hengyang Normal University, Hengyang 421002, China"}]},
{"ORCID":"https:\/\/orcid.org\/0000-0002-8656-1874","authenticated-orcid":false,"given":"Chao","family":"Yin","sequence":"additional","affiliation":[{"name":"Guangdong Province Engineering Laboratory for Geographic Spatio-temporal Big Data, Key Laboratory of Guangdong for Utilization of Remote Sensing and Geographical Information System, Guangdong Open Laboratory of Geospatial Information Technology and Application, Guangzhou Institute of Geography, Guangdong Academy of Sciences, Guangzhou 510070, China"},{"name":"Department of Civil and Environmental Engineering, The Hong Kong University of Science and Technology, Hong Kong 999077, China"}]},
{"given":"Yunyuan","family":"Deng","sequence":"additional","affiliation":[{"name":"College of Geography and Tourism, Hengyang Normal University, Hengyang 421002, China"},{"name":"Cooperative Innovation Centre for Digitalization of Cultural Heritage in Traditional Villages and Towns, Hengyang Normal University, Hengyang 421002, China"},{"name":"National-Local Joint Engineering Laboratory on Digital Preservation and Innovative Technologies for the Culture of Traditional Villages and Towns, Hengyang Normal University, Hengyang 421002, China"}]},
{"given":"Peilin","family":"Liu","sequence":"additional","affiliation":[{"name":"College of Geography and Tourism, Hengyang Normal University, Hengyang 421002, China"},{"name":"Institute of Rural Revitalization Research, Changsha University, Changsha 410022, China"}]}],
"member":"1968","published-online":{"date-parts":[[2022,10,20]]},
"reference":[{"key":"ref_1","unstructured":"(2020, December 18). China Intangible Cultural Heritage, China-Ich. (n.d.). Available online: https:\/\/www.culturalheritagechina.org."},{"key":"ref_2","unstructured":"UNESCO\u2014China (2020, December 18). (n.d.). Available online: https:\/\/ich.unesco.org\/en\/state."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"(2005). Convention for the Safeguarding of the Intangible Cultural Heritage 2003. Int. J. Cult. Prop., 12, 447\u2013458.","DOI":"10.1017\/S0940739105050277"},{"key":"ref_4","unstructured":"(2020, December 18). Preservation of China\u2019s Intangible Cultural Heritage, EESD: The Encyclopedia of Education for Sustainable Development. (n.d.).
Available online: http:\/\/www.encyclopediaesd.com\/blog-1\/2016\/5\/25\/preservation-of-chinas-intangible-cultural-heritage."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"292","DOI":"10.1080\/13527250600604639","article-title":"The Scope and Definitions of Heritage: From Tangible to Intangible","volume":"12","author":"Ahmad","year":"2006","journal-title":"Int. J. Herit. Stud."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"102","DOI":"10.1016\/j.patrec.2020.02.017","article-title":"Machine Learning for Cultural Heritage: A Survey","volume":"133","author":"Fiorucci","year":"2020","journal-title":"Pattern Recognit. Lett."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"011016","DOI":"10.1117\/1.JEI.26.1.011016","article-title":"Architectural style classification of Mexican historical buildings using deep convolutional neural networks and sparse features","volume":"26","author":"Obeso","year":"2016","journal-title":"J. Electron. Imaging"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"121","DOI":"10.1186\/s40494-020-00464-2","article-title":"Mural classification model based on high- and low-level vision fusion","volume":"8","author":"Cao","year":"2020","journal-title":"Herit Sci."},{"key":"ref_10","unstructured":"Simonyan, K., and Zisserman, A. (2014). Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv, Available online: http:\/\/arxiv.org\/abs\/1409.1556."},{"key":"ref_11","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2018, October 24). Deep Residual Learning for Image Recognition. Available online: https:\/\/arxiv.org\/abs\/1512.03385."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Huang, G., Liu, Z., van der Maaten, L., and Weinberger, K.Q. (2018). Densely Connected Convolutional Networks. arXiv, Available online: http:\/\/arxiv.org\/abs\/1608.06993.","DOI":"10.1109\/CVPR.2017.243"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Tan, C., Sun, F., Kong, T., Zhang, W., Yang, C., and Liu, C. (2018). A Survey on Deep Transfer Learning, Springer.","DOI":"10.1007\/978-3-030-01424-7_27"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Cubuk, E.D., Zoph, B., Mane, D., Vasudevan, V., and Le, Q.V. (2019, January 16\u201320). AutoAugment: Learning Augmentation Strategies from Data. Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA.","DOI":"10.1109\/CVPR.2019.00020"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"107473","DOI":"10.1016\/j.buildenv.2020.107473","article-title":"Heritage values of ancient vernacular residences in traditional villages in Western Hunan, China: Spatial patterns and influencing factors","volume":"188","author":"Fu","year":"2021","journal-title":"Build. Environ."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"386","DOI":"10.1016\/j.buildenv.2005.02.009","article-title":"An architectural evaluation method for conservation of traditional dwellings","volume":"41","year":"2006","journal-title":"Build. 
Environ."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"107959","DOI":"10.1016\/j.buildenv.2021.107959","article-title":"Analysis of climate change impact on the preservation of heritage elements in historic buildings with a deficient indoor microclimate in warm regions","volume":"200","year":"2021","journal-title":"Build. Environ."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.habitatint.2018.06.002","article-title":"Differentiation of spatial morphology of rural settlements from an ethnic cultural perspective on the Northeast Tibetan Plateau, China","volume":"79","author":"Li","year":"2018","journal-title":"Habitat Int."},{"key":"ref_19","first-page":"32","article-title":"Geographical features and development regularities of rural areas and settlements distribution in mountain countries","volume":"52","author":"Potosyan","year":"2017","journal-title":"Ann. Agrar. Sci."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"873","DOI":"10.1007\/s11442-022-1976-7","article-title":"Classification and detection of dominant factors in geospatial patterns of traditional settlements in China","volume":"32","author":"Wu","year":"2022","journal-title":"J. Geogr. Sci."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"679","DOI":"10.1007\/s11442-013-1037-3","article-title":"Settlement distribution and its relationship with environmental changes from the Neolithic to Shang-Zhou dynasties in northern Shandong, China","volume":"23","author":"Guo","year":"2013","journal-title":"J. Geogr. Sci."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"20","DOI":"10.1016\/j.culher.2017.03.004","article-title":"Multiple linear regression and fuzzy logic models applied to the functional service life prediction of cultural heritage","volume":"27","author":"Prieto","year":"2017","journal-title":"J. Cult. Herit."},{"key":"ref_23","first-page":"1496","article-title":"Landscape division of traditional settlement and effect elements of landscape gene in China","volume":"65","author":"Liu","year":"2010","journal-title":"Acta Geogr. Sin."},{"key":"ref_24","unstructured":"Battiato, S., Gallo, G., Schettini, R., and Stanco, F. (2017). Deep Multibranch Neural Network for Painting Categorization. Image Analysis and Processing\u2014ICIAP 2017, Springer International Publishing."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"43","DOI":"10.1016\/j.habitatint.2018.12.006","article-title":"Abandoned rural residential land: Using machine learning techniques to identify rural residential land vulnerable to be abandoned in mountainous areas","volume":"84","author":"Xu","year":"2019","journal-title":"Habitat Int."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Leal-Taix\u00e9, L., and Roth, S. (2019). Weakly Supervised Object Detection in Artworks. Computer Vision\u2014ECCV 2018 Workshops, Springer International Publishing.","DOI":"10.1007\/978-3-030-11018-5"},{"key":"ref_27","unstructured":"Pereira, F., Burges, C.J.C., Bottou, L., and Weinberger, K.Q. (2012). ImageNet Classification with Deep Convolutional Neural Networks. Advances in Neural Information Processing Systems 25, Curran Associates, Inc.. 
Available online: http:\/\/papers.nips.cc\/paper\/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1345","DOI":"10.1109\/TKDE.2009.191","article-title":"A Survey on Transfer Learning","volume":"22","author":"Pan","year":"2010","journal-title":"IEEE Trans. Knowl. Data Eng."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"43","DOI":"10.1109\/JPROC.2020.3004555","article-title":"A Comprehensive Survey on Transfer Learning","volume":"109","author":"Zhuang","year":"2020","journal-title":"Proc. IEEE"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Deng, J., Dong, W., Socher, R., Li, L., Li, K., and Li, F.-F. (2009, January 20\u201325). ImageNet: A large-scale hierarchical image database. Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA.","DOI":"10.1109\/CVPR.2009.5206848"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"60","DOI":"10.1186\/s40537-019-0197-0","article-title":"A survey on Image Data Augmentation for Deep Learning","volume":"6","author":"Shorten","year":"2019","journal-title":"J. Big Data"},{"key":"ref_32","unstructured":"Yang, S., Xiao, W., Zhang, M., Guo, S., Zhao, J., and Shen, F. (2022). Image Data Augmentation for Deep Learning: A Survey. arXiv."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Li, R., Li, X., Heng, P.-A., and Fu, C.-W. (2020). PointAugment: An Auto-Augmentation Framework for Point Cloud Classification. arXiv, Available online: http:\/\/arxiv.org\/abs\/2002.10876.","DOI":"10.1109\/CVPR42600.2020.00641"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Cubuk, E.D., Zoph, B., Shlens, J., and Le, Q.V. (2019). RandAugment: Practical automated data augmentation with a reduced search space. arXiv, Available online: http:\/\/arxiv.org\/abs\/1909.13719.","DOI":"10.1109\/CVPRW50498.2020.00359"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"144","DOI":"10.3974\/geodp.2018.02.03","article-title":"The Spatial Distribution Dataset of 2555 Chinese Traditional Villages","volume":"2","author":"Yu","year":"2018","journal-title":"J. Glob. Chang. Data Discov."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"336","DOI":"10.1007\/s11263-019-01228-7","article-title":"Grad-CAM: Visual Explanations from Deep Networks via Gradient-based Localization","volume":"128","author":"Selvaraju","year":"2020","journal-title":"Int. J. Comput. Vis."},{"key":"ref_37","unstructured":"Zoph, B., Ghiasi, G., Lin, T.-Y., Cui, Y., Liu, H., Cubuk, E.D., and Le, Q.V. (2020). Rethinking Pre-training and Self-training. arXiv, Available online: http:\/\/arxiv.org\/abs\/2006.06882."},{"key":"ref_38","unstructured":"Chen, T., Kornblith, S., Norouzi, M., and Hinton, G. (2020). A Simple Framework for Contrastive Learning of Visual Representations. arXiv, Available online: http:\/\/arxiv.org\/abs\/2002.05709."},{"key":"ref_39","unstructured":"Zhang, H., Wu, C., Zhang, Z., Zhu, Y., Zhang, Z., Lin, H., Sun, Y., He, T., Mueller, J., and Manmatha, R. (2020). ResNeSt: Split-Attention Networks. arXiv, Available online: http:\/\/arxiv.org\/abs\/2004.08955."},{"key":"ref_40","unstructured":"Tan, M., and Le, Q.V. (2020). EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks. 
arXiv, Available online: http:\/\/arxiv.org\/abs\/1905.11946."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/20\/5250\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T00:58:22Z","timestamp":1760144302000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/20\/5250"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,10,20]]},"references-count":40,"journal-issue":{"issue":"20","published-online":{"date-parts":[[2022,10]]}},"alternative-id":["rs14205250"],"URL":"https:\/\/doi.org\/10.3390\/rs14205250","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,10,20]]}}}
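The record above is a Crossref REST API work message. As a quick illustration of the data structure, here is a minimal Python sketch that fetches the same record from the public /works/{doi} endpoint and pulls out a few of the fields shown above; the `requests` dependency is an assumption, and any HTTP client would do:

```python
# Minimal sketch: fetch the Crossref work record shown above and inspect
# a few fields. The public endpoint needs no authentication.
import requests

DOI = "10.3390/rs14205250"
resp = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=30)
resp.raise_for_status()
work = resp.json()["message"]  # same structure as the "message" object above

title = work["title"][0]
journal = work["container-title"][0]
authors = [f'{a.get("given", "")} {a["family"]}'.strip() for a in work["author"]]
# Some deposits repeat funder names once per award; a set collapses them.
funders = {f["name"] for f in work.get("funder", [])}

print(title)
print(journal, work["issued"]["date-parts"][0])
print(", ".join(authors))
print(sorted(funders))
```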
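The abstract names two ingredients for fighting overfitting: transfer learning from pretrained CNNs and the AutoAugment data augmentation technique. Below is a hedged sketch of how those two pieces are commonly wired together in PyTorch/torchvision; it is not the authors' code, and the ResNet-50 backbone and torchvision's IMAGENET policy (standing in for a policy searched on the CTS dataset) are illustrative assumptions:

```python
# Hedged sketch of the two ingredients named in the abstract. NOT the
# authors' implementation; backbone and policy choices are assumptions.
import torch.nn as nn
from torchvision import models, transforms

NUM_STYLES = 6  # six representative architectural-style classes (per the abstract)

# Transfer learning: start from ImageNet-pretrained weights, swap the head
# for a 6-way classifier, then fine-tune on the settlement-style dataset.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_STYLES)

# AutoAugment: a learned augmentation policy applied at train time.
# torchvision ships policies searched on ImageNet/CIFAR/SVHN; the paper
# applies AutoAugment in the same spirit to its own dataset.
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.AutoAugment(transforms.AutoAugmentPolicy.IMAGENET),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
```

Freezing the early convolutional blocks and fine-tuning only the later ones is a common variant of the same idea when the target dataset is small.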
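For the interpretability step, the paper uses CAM visualization (Grad-CAM, ref_36): the last convolutional feature maps are weighted by the gradient of the predicted class score and summed into a heatmap. A minimal sketch of that computation follows; the choice of `model.layer4` as the target layer is a ResNet-specific assumption, and `x` is a preprocessed input batch of one image:

```python
# Hedged Grad-CAM sketch (ref_36): weight the target layer's feature maps
# by the global-average-pooled gradient of the class score.
import torch
import torch.nn.functional as F

def grad_cam(model, x, target_layer, class_idx=None):
    feats, grads = [], []
    h1 = target_layer.register_forward_hook(lambda m, i, o: feats.append(o))
    h2 = target_layer.register_full_backward_hook(lambda m, gi, go: grads.append(go[0]))
    try:
        scores = model(x)                              # x: (1, 3, H, W)
        if class_idx is None:
            class_idx = scores.argmax(dim=1).item()    # explain the top prediction
        model.zero_grad()
        scores[0, class_idx].backward()
        w = grads[0].mean(dim=(2, 3), keepdim=True)    # pooled gradients: (1, C, 1, 1)
        cam = F.relu((w * feats[0]).sum(dim=1))        # weighted feature-map sum
        cam = F.interpolate(cam.unsqueeze(1), size=x.shape[2:],
                            mode="bilinear", align_corners=False)[0, 0]
        return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalize to [0, 1]
    finally:
        h1.remove()
        h2.remove()

# e.g.: heat = grad_cam(model, image_tensor, model.layer4)
```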