{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,11]],"date-time":"2026-03-11T16:36:19Z","timestamp":1773246979629,"version":"3.50.1"},"reference-count":70,"publisher":"MDPI AG","issue":"9","license":[{"start":{"date-parts":[[2022,8,25]],"date-time":"2022-08-25T00:00:00Z","timestamp":1661385600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Future Internet"],"abstract":"<jats:p>Forecasting the risk factor of the financial frontier markets has always been a very challenging task. Unlike an emerging market, a frontier market has a missing parameter named \u201cvolatility\u201d, which indicates the market\u2019s risk and as a result of the absence of this missing parameter and the lack of proper prediction, it has almost become difficult for direct customers to invest money in frontier markets. However, the noises, seasonality, random spikes and trends of the time-series datasets make it even more complicated to predict stock prices with high accuracy. In this work, we have developed a novel stacking ensemble of the neural network model that performs best on multiple data patterns. We have compared our model\u2019s performance with the performance results obtained by using some traditional machine learning ensemble models such as Random Forest, AdaBoost, Gradient Boosting Machine and Stacking Ensemble, along with some traditional deep learning models such as Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term (BiLSTM). We have calculated the missing parameter named \u201cvolatility\u201d using stock price (Close price) for 20 different companies of the frontier market and then made predictions using the aforementioned machine learning ensemble models, deep learning models and our proposed stacking ensemble of the neural network model. The statistical evaluation metrics RMSE and MAE have been used to evaluate the performance of the models. It has been found that our proposed stacking ensemble neural network model outperforms all other traditional machine learning and deep learning models which have been used for comparison in this paper. The lowest RMSE and MAE values we have received using our proposed model are 0.3626 and 0.3682 percent, respectively, and the highest RMSE and MAE values are 2.5696 and 2.444 percent, respectively. The traditional ensemble learning models give the highest RMSE and MAE error rate of 20.4852 and 20.4260 percent, while the deep learning models give 15.2332 and 15.1668 percent, respectively, which clearly states that our proposed model provides a very low error value compared with the traditional models.<\/jats:p>","DOI":"10.3390\/fi14090252","type":"journal-article","created":{"date-parts":[[2022,8,25]],"date-time":"2022-08-25T21:28:12Z","timestamp":1661462892000},"page":"252","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":17,"title":["Forecasting the Risk Factor of Frontier Markets: A Novel Stacking Ensemble of Neural Network Approach"],"prefix":"10.3390","volume":"14","author":[{"given":"Mst. Shapna","family":"Akter","sequence":"first","affiliation":[{"name":"Department of Computer Science, Kennesaw State University, 370 Paulding Ave., Kennesaw, GA 30144, USA"}]},{"given":"Hossain","family":"Shahriar","sequence":"additional","affiliation":[{"name":"Department of Information Technology, Kennesaw State University, 370 Paulding Ave., Kennesaw, GA 30144, USA"}]},{"given":"Reaz","family":"Chowdhury","sequence":"additional","affiliation":[{"name":"Department of Electrical and Engineering, University of Alberta, Edmonton, AB T6G 2P5, Canada"}]},{"given":"M. R. C.","family":"Mahdy","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, North South University, Dhaka 1229, Bangladesh"}]}],"member":"1968","published-online":{"date-parts":[[2022,8,25]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"18","DOI":"10.19030\/jabr.v30i2.8421","article-title":"Volatility spillovers between oil prices and stock returns: A focus on frontier markets","volume":"30","author":"Gomes","year":"2014","journal-title":"J. Appl. Bus. Res."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"124444","DOI":"10.1016\/j.physa.2020.124444","article-title":"Predicting the stock price of frontier markets using machine learning and modified Black\u2013Scholes Option pricing model","volume":"555","author":"Chowdhury","year":"2020","journal-title":"Phys. A Stat. Mech. Appl."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"170","DOI":"10.32861\/ijefr.67.170.179","article-title":"Predicting Intraday Prices in the Frontier Stock Market of Romania Using Machine Learning Algorithms","volume":"6","author":"Anghel","year":"2020","journal-title":"Int. J. Econ. Financ. Res."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"688","DOI":"10.1109\/ICNC.2007.780","article-title":"Time series prediction based on linear regression and SVR","volume":"Volume 1","author":"Lin","year":"2007","journal-title":"Proceedings of the Third International Conference on Natural Computation (ICNC 2007)"},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Kavitha, S., Varuna, S., and Ramya, R. (2016, January 19). A comparative analysis on linear regression and support vector regression. Proceedings of the 2016 Online International Conference on Green Engineering and Technologies (IC-GET), Virtual.","DOI":"10.1109\/GET.2016.7916627"},{"key":"ref_6","unstructured":"Johnsson, O. (2018). Predicting Stock Index Volatility Using Artificial Neural Networks: An Empirical Study of the OMXS30, FTSE100 & S&P\/ASX200. [Master\u2019s Thesis, Lund University]."},{"key":"ref_7","unstructured":"Madge, S., and Bhatt, S. (2015). Predicting stock price direction using support vector machines. Independent Work Report Spring, Princeton University."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"156","DOI":"10.1109\/HICSS.1991.184055","article-title":"Predicting stock price performance: A neural network approach","volume":"Volume 4","author":"Yoon","year":"1991","journal-title":"Proceedings of the Twenty-Fourth Annual Hawaii International Conference on System Sciences"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"9","DOI":"10.1016\/j.eneco.2017.05.023","article-title":"A deep learning ensemble approach for crude oil price forecasting","volume":"66","author":"Zhao","year":"2017","journal-title":"Energy Econ."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"112395","DOI":"10.1016\/j.cam.2019.112395","article-title":"Bitcoin price prediction using machine learning: An approach to sample dimension engineering","volume":"365","author":"Chen","year":"2020","journal-title":"J. Comput. Appl. Math."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Andriopoulos, N., Magklaras, A., Birbas, A., Papalexopoulos, A., Valouxis, C., Daskalaki, S., Birbas, M., Housos, E., and Papaioannou, G.P. (2021). Short Term Electric Load Forecasting Based on Data Transformation and Statistical Machine Learning. Appl. Sci., 11.","DOI":"10.3390\/app11010158"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Selvin, S., Vinayakumar, R., Gopalakrishnan, E., Menon, V.K., and Soman, K. (2017, January 13\u201316). Stock price prediction using LSTM, RNN and CNN-sliding window model. Proceedings of the 2017 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Udupi, India.","DOI":"10.1109\/ICACCI.2017.8126078"},{"key":"ref_13","first-page":"13755","article-title":"Stock price prediction using artificial neural network","volume":"3","author":"Patel","year":"2014","journal-title":"Int. J. Innov. Res. Sci. Eng. Technol."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Liu, S., Liao, G., and Ding, Y. (June, January 31). Stock transaction prediction modeling and analysis based on LSTM. Proceedings of the 2018 13th IEEE Conference on Industrial Electronics and Applications (ICIEA), Wuhan, China.","DOI":"10.1109\/ICIEA.2018.8398183"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Siami-Namini, S., Tavakoli, N., and Namin, A.S. (2019, January 9\u201312). The performance of LSTM and BiLSTM in forecasting time series. Proceedings of the 2019 IEEE International Conference on Big Data (Big Data), Los Angeles, CA, USA.","DOI":"10.1109\/BigData47090.2019.9005997"},{"key":"ref_16","unstructured":"Elliot, A., and Hsu, C.H. (2017). Time Series Prediction: Predicting Stock Price. arXiv."},{"key":"ref_17","unstructured":"Elsayed, S., Thyssens, D., Rashed, A., Schmidt-Thieme, L., and Jomaa, H.S. (2021). Do We Really Need Deep Learning Models for Time Series Forecasting?. arXiv."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Luong, C., and Dokuchaev, N. (2018). Forecasting of realised volatility with the random forests algorithm. J. Risk Financ. Manag., 11.","DOI":"10.3390\/jrfm11040061"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Qiu, X., Zhang, L., Ren, Y., Suganthan, P.N., and Amaratunga, G. (2014, January 9\u201312). Ensemble deep learning for regression and time series forecasting. Proceedings of the 2014 IEEE Symposium on Computational Intelligence in Ensemble Learning (CIEL), Orlando, FL, USA.","DOI":"10.1109\/CIEL.2014.7015739"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"889","DOI":"10.1007\/s10489-020-01839-5","article-title":"A multi-layer and multi-ensemble stock trader using deep learning and deep reinforcement learning","volume":"51","author":"Carta","year":"2021","journal-title":"Appl. Intell."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Livieris, I.E., Pintelas, E., Stavroyiannis, S., and Pintelas, P. (2020). Ensemble deep learning models for forecasting cryptocurrency time-series. Algorithms, 13.","DOI":"10.3390\/a13050121"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Li, S., Yao, Y., Hu, J., Liu, G., Yao, X., and Hu, J. (2018). An ensemble stacked convolutional neural network model for environmental event sound recognition. Appl. Sci., 8.","DOI":"10.3390\/app8071152"},{"key":"ref_23","unstructured":"Dey, S., Kumar, Y., Saha, S., and Basak, S. (2016). Forecasting to Classification: Predicting the Direction of Stock Market Price Using Xtreme Gradient Boosting, PESIT South Campus."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Albaity, M.S. (2011). Impact of the monetary policy instruments on Islamic stock market index return. Econ. Discuss. Pap.","DOI":"10.2139\/ssrn.1973469"},{"key":"ref_25","first-page":"229","article-title":"Analysing Volatility during Extreme Market Events Using the Mid Cap Share Index","volume":"17","author":"Selemela","year":"2021","journal-title":"Economica"},{"key":"ref_26","first-page":"10","article-title":"Measuring historical volatility","volume":"16","author":"Ederington","year":"2006","journal-title":"J. Appl. Financ."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"45","DOI":"10.2469\/faj.v61.n1.2683","article-title":"Practical issues in forecasting volatility","volume":"61","author":"Poon","year":"2005","journal-title":"Financ. Anal. J."},{"key":"ref_28","unstructured":"Botchkarev, A. (2018). Performance metrics (error measures) in machine learning regression, forecasting and prognostics: Properties and typology. arXiv."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"1117","DOI":"10.1016\/j.scitotenv.2019.02.093","article-title":"Assessing the performance of GIS-based machine learning models with different accuracy measures for determining susceptibility to gully erosion","volume":"664","author":"Garosi","year":"2019","journal-title":"Sci. Total. Environ."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Bouktif, S., Fiaz, A., Ouni, A., and Serhani, M.A. (2018). Optimal deep learning lstm model for electric load forecasting using feature selection and genetic algorithm: Comparison with machine learning approaches. Energies, 11.","DOI":"10.3390\/en11071636"},{"key":"ref_31","first-page":"17","article-title":"The effect of kernel values in support vector machine to forecasting performance of financial time series","volume":"4","author":"Altan","year":"2019","journal-title":"J. Cogn. Syst."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Song, H., Dai, J., Luo, L., Sheng, G., and Jiang, X. (2018). Power transformer operating state prediction method based on an LSTM network. Energies, 11.","DOI":"10.3390\/en11040914"},{"key":"ref_33","unstructured":"Botchkarev, A. (2022, July 10). Evaluating Performance of Regression Machine Learning Models Using Multiple Error Metrics in Azure Machine Learning Studio. Available online: https:\/\/ssrn.com\/abstract=3177507."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Xu, W., Zhang, J., Zhang, Q., and Wei, X. (2017, January 27\u201328). Risk prediction of type II diabetes based on random forest model. Proceedings of the 2017 Third International Conference on Advances in Electrical, Electronics, Information, Communication and Bio-Informatics (AEEICB), Chennai, India.","DOI":"10.1109\/AEEICB.2017.7972337"},{"key":"ref_35","unstructured":"Shaik, A.B., and Srinivasan, S. (2019, January 21\u201322). A brief survey on random forest ensembles in classification model. Proceedings of the International Conference on Innovative Computing and Communications, V\u0160B-Technical University of Ostrava, Ostrava, Czech Republic."},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Schapire, R.E. (2013). Explaining adaboost. Empirical Inference, Springer.","DOI":"10.1007\/978-3-642-41136-6_5"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"577","DOI":"10.1109\/TSMCB.2007.914695","article-title":"Adaboost-based algorithm for network intrusion detection","volume":"38","author":"Hu","year":"2008","journal-title":"IEEE Trans. Syst. Man Cybern. Part B (Cybern.)"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"308","DOI":"10.1016\/j.trc.2015.02.019","article-title":"A gradient boosting method to improve travel time prediction","volume":"58","author":"Zhang","year":"2015","journal-title":"Transp. Res. Part C Emerg. Technol."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"1937","DOI":"10.1007\/s10462-020-09896-5","article-title":"A comparative analysis of gradient boosting algorithms","volume":"54","year":"2021","journal-title":"Artif. Intell. Rev."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Polikar, R. (2012). Ensemble learning. Ensemble Machine Learning, Springer.","DOI":"10.1007\/978-1-4419-9326-7_1"},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"e1249","DOI":"10.1002\/widm.1249","article-title":"Ensemble learning: A survey","volume":"8","author":"Sagi","year":"2018","journal-title":"Wiley Interdiscip. Rev. Data Min. Knowl. Discov."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Zhang, C., and Ma, Y. (2012). Ensemble Machine Learning: Methods and Applications, Springer.","DOI":"10.1007\/978-1-4419-9326-7"},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Sun, X. (2002, January 16\u201320). Pitch accent prediction using ensemble machine learning. Proceedings of the Seventh International Conference on Spoken Language Processing, Denver, CO, USA.","DOI":"10.21437\/ICSLP.2002-316"},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"2278","DOI":"10.1109\/5.726791","article-title":"Gradient-based learning applied to document recognition","volume":"86","author":"LeCun","year":"1998","journal-title":"Proc. IEEE"},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"Kiranyaz, S., Ince, T., Hamila, R., and Gabbouj, M. (2015, January 25\u201329). Convolutional neural networks for patient-specific ECG classification. Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy.","DOI":"10.1109\/EMBC.2015.7318926"},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"664","DOI":"10.1109\/TBME.2015.2468589","article-title":"Real-time patient-specific ECG classification by 1-D convolutional neural networks","volume":"63","author":"Kiranyaz","year":"2015","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Avci, O., Abdeljaber, O., Kiranyaz, S., and Inman, D. (2017). Structural damage detection in real time: Implementation of 1D convolutional neural networks for SHM applications. Structural Health Monitoring & Damage Detection, Volume 7, Springer.","DOI":"10.1007\/978-3-319-54109-9_6"},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"8760","DOI":"10.1109\/TIE.2018.2833045","article-title":"Real-time fault detection and identification for MMC using 1-D convolutional neural networks","volume":"66","author":"Kiranyaz","year":"2018","journal-title":"IEEE Trans. Ind. Electron."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"7067","DOI":"10.1109\/TIE.2016.2582729","article-title":"Real-time motor fault detection by 1-D convolutional neural networks","volume":"63","author":"Ince","year":"2016","journal-title":"IEEE Trans. Ind. Electron."},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"1308","DOI":"10.1016\/j.neucom.2017.09.069","article-title":"1-D CNNs for structural damage detection: Verification on a structural health monitoring benchmark data","volume":"275","author":"Abdeljaber","year":"2018","journal-title":"Neurocomputing"},{"key":"ref_52","unstructured":"Avci, O., Abdeljaber, O., Kiranyaz, S., Boashash, B., Sodano, H., and Inman, D.J. (2018, January 8\u201312). Efficiency validation of one dimensional convolutional neural networks for structural damage detection using a SHM benchmark data. Proceedings of the 25th International Congress on Sound and Vibration 2018, (ICSV 25), Hiroshima, Japan."},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"107398","DOI":"10.1016\/j.ymssp.2020.107398","article-title":"1D convolutional neural networks and applications: A survey","volume":"151","author":"Kiranyaz","year":"2021","journal-title":"Mech. Syst. Signal Process."},{"key":"ref_54","doi-asserted-by":"crossref","unstructured":"Ragab, M.G., Abdulkadir, S.J., Aziz, N., Al-Tashi, Q., Alyousifi, Y., Alhussian, H., and Alqushaibi, A. (2020). A Novel One-Dimensional CNN with Exponential Adaptive Gradients for Air Pollution Index Prediction. Sustainability, 12.","DOI":"10.3390\/su122310090"},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"69053","DOI":"10.1109\/ACCESS.2018.2880044","article-title":"Monthly rainfall forecasting using one-dimensional deep convolutional neural network","volume":"6","author":"Haidar","year":"2018","journal-title":"IEEE Access"},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"Huang, S., Tang, J., Dai, J., and Wang, Y. (2019). Signal status recognition based on 1DCNN and its feature extraction mechanism analysis. Sensors, 19.","DOI":"10.3390\/s19092018"},{"key":"ref_57","doi-asserted-by":"crossref","first-page":"5735","DOI":"10.1109\/TII.2019.2955540","article-title":"Understanding and learning discriminant features based on multiattention 1DCNN for wheelset bearing fault diagnosis","volume":"16","author":"Wang","year":"2019","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_58","doi-asserted-by":"crossref","unstructured":"Zhao, X., Sol\u00e9-Casals, J., Li, B., Huang, Z., Wang, A., Cao, J., Tanaka, T., and Zhao, Q. (2020, January 4\u20138). Classification of Epileptic IEEG Signals by CNN and Data Augmentation. Proceedings of the ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Barcelona, Spain.","DOI":"10.1109\/ICASSP40776.2020.9052948"},{"key":"ref_59","doi-asserted-by":"crossref","unstructured":"Mandic, D., and Chambers, J. (2001). Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability, John and Wiley and Sons.","DOI":"10.1002\/047084535X"},{"key":"ref_60","doi-asserted-by":"crossref","first-page":"132306","DOI":"10.1016\/j.physd.2019.132306","article-title":"Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network","volume":"404","author":"Sherstinsky","year":"2020","journal-title":"Phys. D Nonlinear Phenom."},{"key":"ref_61","doi-asserted-by":"crossref","first-page":"1735","DOI":"10.1162\/neco.1997.9.8.1735","article-title":"Long short-term memory","volume":"9","author":"Hochreiter","year":"1997","journal-title":"Neural Comput."},{"key":"ref_62","doi-asserted-by":"crossref","first-page":"243","DOI":"10.1162\/neco.1992.4.2.243","article-title":"A fixed size storage O (n 3) time complexity learning algorithm for fully recurrent continually running networks","volume":"4","author":"Schmidhuber","year":"1992","journal-title":"Neural Comput."},{"key":"ref_63","doi-asserted-by":"crossref","unstructured":"Graves, A. (2012). Long short-term memory. Supervised Sequence Labelling with Recurrent Neural Networks, Springer.","DOI":"10.1007\/978-3-642-24797-2"},{"key":"ref_64","doi-asserted-by":"crossref","first-page":"654","DOI":"10.1016\/j.ejor.2017.11.054","article-title":"Deep learning with long short-term memory networks for financial market predictions","volume":"270","author":"Fischer","year":"2018","journal-title":"Eur. J. Oper. Res."},{"key":"ref_65","doi-asserted-by":"crossref","unstructured":"Wang, Y., Huang, M., Zhu, X., and Zhao, L. (2016, January 1\u20134). Attention-based LSTM for aspect-level sentiment classification. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA.","DOI":"10.18653\/v1\/D16-1058"},{"key":"ref_66","doi-asserted-by":"crossref","first-page":"032115","DOI":"10.1088\/1755-1315\/440\/3\/032115","article-title":"Power load forecasting using BiLSTM-attention","volume":"440","author":"Du","year":"2020","journal-title":"Proc. Iop Conf. Ser. Earth Environ. Sci."},{"key":"ref_67","unstructured":"Vasquez, S., and Lewis, M. (2019). Melnet: A generative model for audio in the frequency domain. arXiv."},{"key":"ref_68","doi-asserted-by":"crossref","unstructured":"Jung, J.w., Heo, H.S., Kim, J.h., Shim, H.j., and Yu, H.J. (2019). Rawnet: Advanced end-to-end deep neural network using raw waveforms for text-independent speaker verification. arXiv.","DOI":"10.21437\/Interspeech.2019-1982"},{"key":"ref_69","doi-asserted-by":"crossref","unstructured":"Piczak, K.J. (2015, January 17\u201320). Environmental sound classification with convolutional neural networks. Proceedings of the 2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP), Boston, MA, USA.","DOI":"10.1109\/MLSP.2015.7324337"},{"key":"ref_70","doi-asserted-by":"crossref","first-page":"279","DOI":"10.1109\/LSP.2017.2657381","article-title":"Deep convolutional neural networks and data augmentation for environmental sound classification","volume":"24","author":"Salamon","year":"2017","journal-title":"IEEE Signal Process. Lett."}],"container-title":["Future Internet"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-5903\/14\/9\/252\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T00:15:09Z","timestamp":1760141709000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-5903\/14\/9\/252"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,8,25]]},"references-count":70,"journal-issue":{"issue":"9","published-online":{"date-parts":[[2022,9]]}},"alternative-id":["fi14090252"],"URL":"https:\/\/doi.org\/10.3390\/fi14090252","relation":{},"ISSN":["1999-5903"],"issn-type":[{"value":"1999-5903","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,8,25]]}}}