{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,2]],"date-time":"2026-05-02T06:48:47Z","timestamp":1777704527273,"version":"3.51.4"},"reference-count":18,"publisher":"SAGE Publications","issue":"5","license":[{"start":{"date-parts":[[2018,7,27]],"date-time":"2018-07-27T00:00:00Z","timestamp":1532649600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2018,11,20]]},"abstract":"<jats:p>Weight initialization is the most important component which affects the performance of artificial neural network during training the network using Back-propagation algorithm. The initial starting weights have significant effect on the training. If the weights are too large then the sigmoid will saturate, that makes learning slow. If weights are too small then gradients are also too small. In this paper a new weight initialization method has been proposed. The results for the proposed weight initialization technique are compared against the random weight initialization method. In this paper the proposed weight initialization method is statistically analyzed. Ten different data sets out of which five sets of data are taken from UCI machine learning repository and five sets of data are generated using function approximation problems that are used. Resilient Back Propagation training algorithm is used for training the feed forward artificial neural network. The proposed weight initialization method gives better results when compared with random weight initialization technique.<\/jats:p>","DOI":"10.3233\/jifs-169803","type":"journal-article","created":{"date-parts":[[2018,7,31]],"date-time":"2018-07-31T18:08:30Z","timestamp":1533060510000},"page":"5193-5201","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":7,"title":["A new weight initialization method for sigmoidal FFANN"],"prefix":"10.1177","volume":"35","author":[{"given":"M.P.S.","family":"Bhatia","sequence":"first","affiliation":[{"name":"Division of Computer Engineering, Netaji Subhas Institute of Technology (NSIT), Dwarka, New Delhi, India"}]},{"family":"Veenu","sequence":"additional","affiliation":[{"name":"Division of Computer Engineering, Netaji Subhas Institute of Technology (NSIT), Dwarka, New Delhi, India"}]},{"given":"Pravin","family":"Chandra","sequence":"additional","affiliation":[{"name":"University School of Information, Communication and Technology (USICT), Guru Gobind Singh Indraprastha University (GGSIPU), Dwarka, New Delhi, India"}]}],"member":"179","published-online":{"date-parts":[[2018,7,27]]},"reference":[{"key":"e_1_3_1_2_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNN.2004.831198"},{"key":"e_1_3_1_3_2","doi-asserted-by":"publisher","DOI":"10.1007\/3-540-49430-8_2"},{"key":"e_1_3_1_4_2","first-page":"1375","volume-title":"IEEE International Conference on Neural networks","volume":"3","author":"Kolen J.F.","year":"1994","unstructured":"KolenJ.F. 
and PollackJ.B., Back-Propagation Without Weight Transport, IEEE International Conference on Neural networks3 (1994), 1375\u20131380."},{"key":"e_1_3_1_5_2","first-page":"185","volume-title":"Neural Processing Letters 18","author":"Chandra P.","year":"2003","unstructured":"ChandraP., Sigmoidal Function class for Feedforward Artificial neural networks, Neural Processing Letters 18, printed in Netherland, 2003, pp. 185\u2013195."},{"key":"e_1_3_1_6_2","volume-title":"Neural Networks and learning Machines","author":"Haykin S.","year":"2011","unstructured":"HaykinS., PHI learning Private limited, Neural Networks and learning Machines2011, 3rd edition.","edition":"3"},{"key":"e_1_3_1_7_2","volume-title":"Information Systems Laboratory","author":"Nyugen D.","unstructured":"NyugenD. and WidrowB., Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights, Information Systems Laboratory, Stanford University."},{"key":"e_1_3_1_8_2","doi-asserted-by":"publisher","DOI":"10.1007\/3-540-59497-3_220"},{"key":"e_1_3_1_9_2","doi-asserted-by":"publisher","DOI":"10.1109\/72.143378"},{"key":"e_1_3_1_10_2","first-page":"2396","volume-title":"IEEE International Joint Conference on Neural Networks","author":"Kim Y.K.","year":"1991","unstructured":"KimY.K. and RaJ.B., Weight Value initialization for improving training Speed in the Back propagation networks,(IJCNNA), IEEE International Joint Conference on Neural Networks, 1991, pp. 2396\u20132401."},{"key":"e_1_3_1_11_2","doi-asserted-by":"publisher","DOI":"10.1109\/72.508939"},{"issue":"1","key":"e_1_3_1_12_2","article-title":"Feasibility of Training and Investigation into Training of Sigmoidal FFANN with Gaussian Learning Rate and Zero Weight initializations","volume":"11","author":"Bhatia M.P.S","year":"2014","unstructured":"Bhatia, M.P.S, Veenu and ChandraP., Feasibility of Training and Investigation into Training of Sigmoidal FFANN with Gaussian Learning Rate and Zero Weight initializations, International Journal on Recent trend in Engineering and Technology11(1), 2014.","journal-title":"International Journal on Recent trend in Engineering and Technology"},{"key":"e_1_3_1_13_2","first-page":"325","volume-title":"International Conference on Computing for sustainable global development","author":"Bhatia M.P.S","year":"2015","unstructured":"Bhatia, M.P.S, Veenu and ChandraP., Comon of sigmoidal FFANN Training Algorithms for Function Approximation Problems, INDIACom-2015, 2015, International Conference on Computing for sustainable global development, pp. 325\u2013329."},{"key":"e_1_3_1_14_2","unstructured":"Online available UCI machine learning repository maintained by centre for machine learning and intelligent systems. http:\/\/archive.ics.uci.edu\/ml\/datasets"},{"key":"e_1_3_1_15_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.aasri.2014.05.004"},{"key":"e_1_3_1_16_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICNN.1993.298623"},{"key":"e_1_3_1_17_2","unstructured":"The MathWorks Inc. 
,{"key":"e_1_3_1_18_2","doi-asserted-by":"publisher","DOI":"10.1109\/AdCC.2014.6779511"},{"key":"e_1_3_1_19_2","doi-asserted-by":"publisher","DOI":"10.1109\/IJCNN.2016.7727180"}],"container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-169803","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/JIFS-169803","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-169803","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,4,29]],"date-time":"2026-04-29T09:41:24Z","timestamp":1777455684000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/JIFS-169803"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,7,27]]},"references-count":18,"journal-issue":{"issue":"5","published-print":{"date-parts":[[2018,11,20]]}},"alternative-id":["10.3233\/JIFS-169803"],"URL":"https:\/\/doi.org\/10.3233\/jifs-169803","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,7,27]]}}}
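
The mechanism stated in the abstract (too-large initial weights saturate the sigmoid and slow learning, while too-small initial weights also shrink the gradients) can be checked numerically. The NumPy sketch below is only an illustration of that effect under assumed conditions: uniform initialization on [-scale, scale], a single sigmoidal hidden layer collapsed to one unit, and a unit output error. It is not the paper's proposed initialization method, which this metadata record does not describe.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # clip to avoid overflow warnings at extreme weight scales
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60.0, 60.0)))

def mean_input_grad(scale, n_in=30, n_samples=1000):
    """Mean |dE/dW1| for the chain net1 = x.W1, h = sigmoid(net1), y = h*w2,
    with every weight drawn uniformly from [-scale, scale]."""
    W1 = rng.uniform(-scale, scale, size=n_in)
    w2 = rng.uniform(-scale, scale)
    X = rng.uniform(-1.0, 1.0, size=(n_samples, n_in))
    h = sigmoid(X @ W1)
    slope = h * (1.0 - h)              # sigmoid'(net1): near 0 when saturated
    # unit output error (delta = 1) assumed purely for illustration
    grad = np.abs(w2) * slope[:, None] * np.abs(X)
    return grad.mean()

for scale in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"init scale {scale:7.2f} -> mean |dE/dW1| = {mean_input_grad(scale):.5f}")

Running this shows the mean gradient magnitude collapsing at both extremes of the initialization scale, which is consistent with the abstract's motivation for choosing initial weights more carefully than plain random initialization.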