{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,24]],"date-time":"2026-02-24T13:52:52Z","timestamp":1771941172296,"version":"3.50.1"},"reference-count":19,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2024,3,12]],"date-time":"2024-03-12T00:00:00Z","timestamp":1710201600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,3,12]],"date-time":"2024-03-12T00:00:00Z","timestamp":1710201600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"FCT","award":["UIDB\/04033\/2020"],"award-info":[{"award-number":["UIDB\/04033\/2020"]}]},{"name":"FCT","award":["UIDB\/00048\/2020"],"award-info":[{"award-number":["UIDB\/00048\/2020"]}]},{"DOI":"10.13039\/501100015321","name":"Universidade de Tr\u00e1s-os-Montes e Alto Douro","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100015321","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Neural Process Lett"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>In this paper, a novel back-propagation error technique is presented. This neural network structure allows for two fundamental basic modes: (1) To decompose the neurones by transforming their variables, weights, and scalar functions into vectors. This allows for the decomposition of the transfer function of every neurone (where the output variables are the components of the decomposition) and, consequently, for it to be written as the invariant sum of orthogonal functions, with the safeguard of preserving information. This orthogonality is proven using Fourier theory. 
(2) In a second mode, a tuned neural network that occupies one of the channels of the neural network can see the weights of its supplementary channels adjusted to retain additional information. Only the decomposition algorithm of the network is presented here\u2014the Multi-back-propagation algorithm. The adopted methodology is validated step-by-step with some representative examples. Namely, to assess the performance of the splitting method, two different examples have been constructed from scratch: (1) a 2D classification problem and (2) a 3D surface. In both problems, the signal and transfer functions of the neural network are successfully decomposed without information losses. Therefore, since the main contribution of this work is to allow for the organisation of the information stored in the neural network structure, through a split process, this promising method shows potential use in various areas\u2014e.g. classification and\/or pattern recognition problems, data analysis, modelling and so on. In the future, we expect to work further on the method's computational aspects to render it more efficient, versatile and robust.<\/jats:p>","DOI":"10.1007\/s11063-024-11518-y","type":"journal-article","created":{"date-parts":[[2024,3,12]],"date-time":"2024-03-12T21:22:41Z","timestamp":1710278561000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["Multi-back-propagation Algorithm for Signal Neural Network Decomposition"],"prefix":"10.1007","volume":"56","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-0041-0256","authenticated-orcid":false,"given":"Paulo","family":"Salgado","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3281-5357","authenticated-orcid":false,"given":"T.-P. 
Azevedo","family":"Perdico\u00falis","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,3,12]]},"reference":[{"key":"11518_CR1","first-page":"21","volume-title":"Neural networks in bioprocessing and chemical engineering","author":"DR Baughman","year":"1995","unstructured":"Baughman DR, Liu YA (1995) 2\u2014Fundamental and practical aspects of neural computing. In: Baughman DR, Liu YA (eds) Neural networks in bioprocessing and chemical engineering. Academic Press, Boston, pp 21\u2013109"},{"key":"11518_CR2","doi-asserted-by":"publisher","DOI":"10.1093\/oso\/9780198538493.001.0001","volume-title":"Neural networks for pattern recognition","author":"CM Bishop","year":"1995","unstructured":"Bishop CM (1995) Neural networks for pattern recognition. Oxford University Press Inc, USA"},{"key":"11518_CR3","volume-title":"Artificial neural networks: an introduction","author":"LP Kevin","year":"2005","unstructured":"Kevin LP, Keller PE (2005) Artificial neural networks: an introduction. SPIE Press - Technology & Engineering, Bellingham, Washington, USA"},{"key":"11518_CR4","unstructured":"Urban S, van\u00a0der Smagt P (2016) A neural transfer function for a smooth and differentiable transition between additive and multiplicative interactions. arXiv: 1503.05724"},{"key":"11518_CR5","unstructured":"Kim J, Park Y, Kim G, Hwang SJ (2017) SplitNet: semantically split deep networks for parameter reduction and model parallelization. In: Precup D, Teh YW (eds.) Proceedings of the 34th international conference on machine learning, vol 70, pp 1866\u20131874. PMLR, Sydney NSW Australia"},{"key":"11518_CR6","unstructured":"Wo\u0142czyk M, Tabor J, \u015amieja M, Maszke S (2019) Biologically-inspired spatial neural networks. arXiv:1910.02776"},{"key":"11518_CR7","unstructured":"Hosseini E Malek, Hajabdollahi M, Karimi N, Samavi S, Shirani S (2020) Splitting convolutional neural network structures for efficient inference. 
arXiv: 2002.03302"},{"key":"11518_CR8","doi-asserted-by":"crossref","unstructured":"Henriksen P, Lomuscio A (2021) Deepsplit: an efficient splitting method for neural network verification via indirect effect analysis. In: Zhou Z-H (ed.) Proceedings of the thirtieth international joint conference on artificial intelligence, IJCAI-21, Montreal, Canada, pp 2549\u20132555","DOI":"10.24963\/ijcai.2021\/351"},{"issue":"1","key":"11518_CR9","doi-asserted-by":"publisher","first-page":"17","DOI":"10.1007\/BF01411371","volume":"1","author":"M Wynne-Jones","year":"1993","unstructured":"Wynne-Jones M (1993) Node splitting: a constructive algorithm for feedforward neural networks. Neural Comput Appl 1(1):17\u201322","journal-title":"Neural Comput Appl"},{"key":"11518_CR10","unstructured":"Adamu A, Maul T, Bargiela A (2013) On training neural networks with transfer function diversity. In: Proceedings of the international conference on computational intelligence and information technology, CIIT \u201913, Mumbai, India"},{"key":"11518_CR11","unstructured":"Duch W, Jankowski N (2001) Transfer functions: hidden possibilities for better neural networks. In: Brown D, Green S (eds.) Proceedings of the ESANN 2001, 9th European symposium on artificial neural networks, Bruges, Belgium, pp 25\u201327 (2001). ACM"},{"key":"11518_CR12","unstructured":"Himanshu S (2019) Activation functions : sigmoid, tanh, ReLU, Leaky ReLU, PReLU, ELU, Threshold ReLU and Softmax basics for neural networks and deep learning. https:\/\/himanshuxd.medium.com\/activation-functions-sigmoid-relu-leaky-relu-and-softmax-basics-for-neural-networks-and-deep-8d9c70eed91e"},{"issue":"11","key":"11518_CR13","doi-asserted-by":"publisher","first-page":"1947","DOI":"10.1016\/j.aml.2012.03.007","volume":"25","author":"MA Cohen","year":"2012","unstructured":"Cohen MA, Tan CO (2012) A polynomial approximation for arbitrary functions. Appl Math. Lett. 25(11):1947\u20131952","journal-title":"Appl Math. 
Lett."},{"issue":"1","key":"11518_CR14","doi-asserted-by":"publisher","first-page":"1","DOI":"10.55630\/sjc.2023.17.1-16","volume":"17","author":"B Nebioglu","year":"2023","unstructured":"Nebioglu B, Iliev AI (2023) Higher order orthogonal polynomials as activation functions in artificial neural networks. Serdica J. Comput. 17(1):1\u201316","journal-title":"Serdica J. Comput."},{"key":"11518_CR15","doi-asserted-by":"publisher","DOI":"10.1017\/CBO9781139165372","volume-title":"An introduction to harmonic analysis","author":"Y Katznelson","year":"2004","unstructured":"Katznelson Y (2004) An introduction to harmonic analysis, 3rd edn. Cambridge University Press, Cambridge, UK","edition":"3"},{"key":"11518_CR16","unstructured":"Osgood PB (2014) Lecture notes for EE 261 the Fourier transform and its applications. Create Space Independent Publishing Platform, Stanford University, CA, USA"},{"key":"11518_CR17","volume-title":"Pattern recognition and machine learning","author":"CM Bishop","year":"2006","unstructured":"Bishop CM (2006) Pattern recognition and machine learning. Springer, Cambridge, UK"},{"key":"11518_CR18","unstructured":"Rumelhart DE, Hinton GE, Williams RJ (1986) Learning internal representations by error propagation. In: Rumelhart DE, McClelland JL, PDP Research Group (eds) Parallel distributed processing: explorations in the microstructure of cognition, volume 1: foundations, pp 318\u2013362. MIT Press. Reprinted in Anderson and Rosenfeld (1988), Cambridge, MA (1986)"},{"key":"11518_CR19","unstructured":"Salgado PAC (2023) Multi-Back-Propagation algorithm\u2014the code. 
https:\/\/meocloud.pt\/link\/3c3b7720-8108-4ea9-9e8d-6a9db05d0df8\/MATLAB\/ (Accessed 21 July 2023)"}],"container-title":["Neural Processing Letters"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11063-024-11518-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s11063-024-11518-y\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11063-024-11518-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,5,16]],"date-time":"2024-05-16T20:35:12Z","timestamp":1715891712000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s11063-024-11518-y"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,3,12]]},"references-count":19,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2024,4]]}},"alternative-id":["11518"],"URL":"https:\/\/doi.org\/10.1007\/s11063-024-11518-y","relation":{},"ISSN":["1573-773X"],"issn-type":[{"value":"1573-773X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,3,12]]},"assertion":[{"value":"24 November 2023","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"12 March 2024","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"Both authors certify that they have no affiliations with or involvement in any organisation or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this 
manuscript.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}}],"article-number":"100"}}