{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,5]],"date-time":"2026-02-05T09:21:16Z","timestamp":1770283276562,"version":"3.49.0"},"reference-count":34,"publisher":"Emerald","issue":"3","license":[{"start":{"date-parts":[[2022,2,28]],"date-time":"2022-02-28T00:00:00Z","timestamp":1646006400000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/www.emerald.com\/insight\/site-policies"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["IMDS"],"published-print":{"date-parts":[[2022,3,15]]},"abstract":"<jats:sec><jats:title content-type=\"abstract-subheading\">Purpose<\/jats:title><jats:p>Recently, more and more attention has been put forth on the application and deep learning, due to the widespread practicability of neural network computation. The purpose of this paper is developing an effective algorithm to automatically discover the optimal neural network architecture for several real applications.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Design\/methodology\/approach<\/jats:title><jats:p>The author proposes a novel algorithm, namely, progressive genetic-based neural architecture search (PG-NAS), as a solution to efficiently find the optimal neural network structure for given data. PG-NAS also employs several operations to effectively shrink the search space to reduce the computation cost and improve the accuracy validation.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Findings<\/jats:title><jats:p>The proposed PG-NAS could be utilized on several tasks for discovering the optimal network structure. The author reduces the demand of manual settings when implementing artificial intelligence (AI) models; hence, PG-NAS requires less human intervention than traditional machine learning. The average and top-1 metrics, such as error, loss and accuracy, are used to measure the discovered neural architectures of the proposed model over all baselines. The experimental results show that, with several real datasets, the proposed PG-NAS model consistently outperforms the state-of-the-art models in all metrics.<\/jats:p><\/jats:sec><jats:sec><jats:title content-type=\"abstract-subheading\">Originality\/value<\/jats:title><jats:p>Generally, the size and the complexity of the search space for the neural network dominates the performance of computation time and resources. In this study, PG-NAS utilizes genetic operations to effectively generate the compact candidate set, i.e. fewer combinations need to be generated when constructing the candidate set. Moreover, by the proposed selector in PG-NAS, the non-promising network structure could be significantly pruned off. In addition, the accuracy derivation of each combination in the candidate set is also a performance bottleneck. The author develops a predictor network to efficiently estimate the accuracy to avoid the time-consuming derivation. The learning of the prediction process is also adjusted dynamically; this adaptive learning of the predictor could capture the pattern of training data effectively and efficiently. 
Furthermore, the proposed PG-NAS algorithm is applied on several real datasets to show its practicability and scalability.<\/jats:p><\/jats:sec>","DOI":"10.1108\/imds-05-2021-0323","type":"journal-article","created":{"date-parts":[[2022,2,24]],"date-time":"2022-02-24T02:45:50Z","timestamp":1645670750000},"page":"645-665","source":"Crossref","is-referenced-by-count":3,"title":["A progressive genetic-based neural architecture search"],"prefix":"10.1108","volume":"122","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-7153-0410","authenticated-orcid":false,"given":"Yi-Cheng","family":"Chen","sequence":"first","affiliation":[]}],"member":"140","published-online":{"date-parts":[[2022,2,28]]},"reference":[{"key":"key2022031812495462300_ref001","article-title":"Designing neural network architectures using reinforcement learning","year":"2017"},{"key":"key2022031812495462300_ref002","article-title":"Accelerating neural architecture search using performance prediction","year":"2018"},{"key":"key2022031812495462300_ref003","unstructured":"BangKok Bank (2021), available at: https:\/\/www.bangkokbank.com."},{"issue":"10","key":"key2022031812495462300_ref004","first-page":"281","article-title":"Random search for hyper-parameter optimization","volume":"13","year":"2012","journal-title":"Journal of Machine Learning Research"},{"key":"key2022031812495462300_ref005","article-title":"Smash: one-shot model architecture search through hypernetworks","year":"2018"},{"key":"key2022031812495462300_ref006","article-title":"Efficient architecture search by network transformation","year":"2018"},{"key":"key2022031812495462300_ref007","article-title":"Xception: deep learning with depthwise separable convolutions","year":"2017"},{"key":"key2022031812495462300_ref008","article-title":"Imagenet: a large-scale hierarchical image database","year":"2009"},{"key":"key2022031812495462300_ref009","article-title":"Speeding up automatic hyperparameter optimization of deep neural networks by extrapolation of learning curves","year":"2015"},{"key":"key2022031812495462300_ref010","article-title":"Simple and efficient architecture search for convolutional neural networks","year":"2018"},{"key":"key2022031812495462300_ref011","first-page":"1","article-title":"Neural architecture search: a survey","volume":"20","year":"2019","journal-title":"Journal of Machine Learning Research"},{"issue":"1","key":"key2022031812495462300_ref012","doi-asserted-by":"crossref","first-page":"47","DOI":"10.1007\/s12065-007-0002-4","article-title":"Neuroevolution: from architectures to learning","volume":"1","year":"2008","journal-title":"Evolutionary Intelligence"},{"issue":"4","key":"key2022031812495462300_ref013","doi-asserted-by":"crossref","first-page":"1026","DOI":"10.1109\/JAS.2020.1003114","article-title":"AI-based modeling and data-driven evaluation for smart manufacturing processes","volume":"7","year":"2020","journal-title":"IEEE\/CAA Journal of Automatica Sinica"},{"key":"key2022031812495462300_ref014","article-title":"IRLAS: inverse reinforcement learning for architecture search"},{"key":"key2022031812495462300_ref015","article-title":"Squeeze-and-excitation networks"},{"key":"key2022031812495462300_ref016","volume-title":"Automatic Machine Learning: Methods, Systems, Challenges"},{"key":"key2022031812495462300_ref017","unstructured":"Kaggle datasets (2021), available at: 
https:\/\/www.kaggle.com\/datasets."},{"issue":"14","key":"key2022031812495462300_ref018","doi-asserted-by":"crossref","first-page":"11016","DOI":"10.1109\/JIOT.2021.3051414","article-title":"Deep learning in the industrial Internet of Things: potentials, challenges, and emerging applications","volume":"8","year":"2021","journal-title":"IEEE Internet of Things Journal"},{"issue":"6","key":"key2022031812495462300_ref019","doi-asserted-by":"crossref","first-page":"84","DOI":"10.1145\/3065386","article-title":"Imagenet classification with deep convolutional neural networks","volume":"60","year":"2017","journal-title":"Communications of ACM"},{"key":"key2022031812495462300_ref020","article-title":"Progressive neural architecture search","year":"2018"},{"key":"key2022031812495462300_ref021","article-title":"Differentiable architecture search","year":"2019"},{"key":"key2022031812495462300_ref022","article-title":"Neural architecture optimization","year":"2018"},{"key":"key2022031812495462300_ref023","article-title":"Towards automatically-tuned neural networks","year":"2016"},{"key":"key2022031812495462300_ref024","article-title":"Efficient neural architecture search via parameter sharing","year":"2018"},{"key":"key2022031812495462300_ref025","article-title":"Regularized evolution for image classifier architecture search","year":"2019"},{"key":"key2022031812495462300_ref026","first-page":"1","article-title":"Evolution strategies as a scalable alternative to reinforcement learning","year":"2017","journal-title":"OpenAI 2017"},{"issue":"2","key":"key2022031812495462300_ref027","doi-asserted-by":"crossref","first-page":"99","DOI":"10.1162\/106365602320169811","article-title":"Evolving neural networks through augmenting topologies","volume":"10","year":"2002","journal-title":"Evolutionary Computation"},{"issue":"6","key":"key2022031812495462300_ref028","doi-asserted-by":"crossref","first-page":"653","DOI":"10.1109\/TEVC.2005.856210","article-title":"Real-time neuroevolution in the NERO video game","volume":"9","year":"2005","journal-title":"IEEE Transactions on Evolutionary Computation"},{"key":"key2022031812495462300_ref029","article-title":"A genetic programming approach to designing convolutional neural network architectures","year":"2017"},{"key":"key2022031812495462300_ref030","article-title":"Going deeper with convolutions","year":"2015"},{"key":"key2022031812495462300_ref031","article-title":"Evolving artificial neural networks","year":"1999"},{"key":"key2022031812495462300_ref032","article-title":"Practical block-wise neural network architecture generation","year":"2018"},{"key":"key2022031812495462300_ref033","article-title":"Neural architecture search with reinforcement learning","year":"2017"},{"key":"key2022031812495462300_ref034","article-title":"Learning transferable architectures for scalable image recognition","year":"2018"}],"container-title":["Industrial Management &amp; Data 
Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/IMDS-05-2021-0323\/full\/xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/IMDS-05-2021-0323\/full\/html","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,7,24]],"date-time":"2025-07-24T21:52:13Z","timestamp":1753393933000},"score":1,"resource":{"primary":{"URL":"http:\/\/www.emerald.com\/imds\/article\/122\/3\/645-665\/186795"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,2,28]]},"references-count":34,"journal-issue":{"issue":"3","published-online":{"date-parts":[[2022,2,28]]},"published-print":{"date-parts":[[2022,3,15]]}},"alternative-id":["10.1108\/IMDS-05-2021-0323"],"URL":"https:\/\/doi.org\/10.1108\/imds-05-2021-0323","relation":{},"ISSN":["0263-5577"],"issn-type":[{"value":"0263-5577","type":"print"}],"subject":[],"published":{"date-parts":[[2022,2,28]]}}}