{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T01:46:33Z","timestamp":1760060793001,"version":"build-2065373602"},"reference-count":20,"publisher":"MDPI AG","issue":"10","license":[{"start":{"date-parts":[[2025,9,23]],"date-time":"2025-09-23T00:00:00Z","timestamp":1758585600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Algorithms"],"abstract":"<jats:p>While backpropagation (BP) has long served as the cornerstone of training deep neural networks, it relies heavily on strict differentiation logic and global gradient information, lacking biological plausibility. In this paper, we systematically present a novel neural network training paradigm that depends solely on signal propagation, which we term Backward Signal Propagation (BSP). The core idea of this framework is to reinterpret network training as a symmetry-driven process of discovering inverse causal relationships. Starting from symmetry principles, we define symmetric differential equations and leverage their inherent properties to implement a learning mechanism analogous to differentiation. Furthermore, we introduce the concept of causal distance, a core invariant that bridges the forward propagation and inverse learning processes. It quantifies the influence strength between any two elements in the network, leading to a generalized form of the chain rule. With these innovations, we achieve precise, pointwise adjustment of model parameters. Unlike traditional BP, the BSP method enables parameter updates based solely on local signal features. This work offers a new direction toward efficient and biologically plausible learning algorithms.<\/jats:p>",
"DOI":"10.3390\/a18100594","type":"journal-article","created":{"date-parts":[[2025,9,23]],"date-time":"2025-09-23T10:27:53Z","timestamp":1758623273000},"page":"594","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Backward Signal Propagation: A Symmetry-Based Training Method for Neural Networks"],"prefix":"10.3390","volume":"18","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-5017-9139","authenticated-orcid":false,"given":"Kun","family":"Jiang","sequence":"first","affiliation":[{"name":"School of Electrical Engineering, Chongqing University, Chongqing 400044, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7593-9886","authenticated-orcid":false,"given":"Zhihong","family":"Fu","sequence":"additional","affiliation":[{"name":"School of Electrical Engineering, Chongqing University, Chongqing 400044, China"}]}],"member":"1968","published-online":{"date-parts":[[2025,9,23]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"386","DOI":"10.1037\/h0042519","article-title":"The perceptron: A probabilistic model for information storage and organization in the brain","volume":"65","author":"Rosenblatt","year":"1958","journal-title":"Psychol. Rev."},{"key":"ref_2","unstructured":"Minsky, M., and Papert, S.A. (1969). Perceptrons, MIT Press."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"533","DOI":"10.1038\/323533a0","article-title":"Learning representations by back-propagating errors","volume":"323","author":"Rumelhart","year":"1986","journal-title":"Nature"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"107","DOI":"10.1142\/S0218488598000094","article-title":"The vanishing gradient problem during learning recurrent neural nets and problem solutions","volume":"6","author":"Hochreiter","year":"1998","journal-title":"Int. J. Uncertain. Fuzziness Knowl.-Based Syst."},
{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Deng, J., Dong, W., Socher, R., Li, L.J., Li, K., and Li, F.-F. (2009, January 20\u201325). ImageNet: A large-scale hierarchical image database. Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA.","DOI":"10.1109\/CVPR.2009.5206848"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Raina, R., Madhavan, A., and Ng, A.Y. (2009, January 14\u201318). Large-scale deep unsupervised learning using graphics processors. Proceedings of ICML \u201809: The 26th Annual International Conference on Machine Learning, Montreal, QC, Canada.","DOI":"10.1145\/1553374.1553486"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_8","unstructured":"Bottou, L. (2010, January 22\u201327). Large-scale machine learning with stochastic gradient descent. Proceedings of the COMPSTAT\u20192010: 19th International Conference on Computational Statistics, Paris, France. Keynote, Invited and Contributed Papers."},{"key":"ref_9","unstructured":"Kingma, D.P., and Ba, J. (2014). Adam: A method for stochastic optimization. arXiv."},{"key":"ref_10","unstructured":"Hinton, G. (2022). The forward-forward algorithm: Some preliminary investigations. arXiv."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"335","DOI":"10.1038\/s41583-020-0277-3","article-title":"Backpropagation and the brain","volume":"21","author":"Lillicrap","year":"2020","journal-title":"Nat. Rev. Neurosci."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"129","DOI":"10.1038\/337129a0","article-title":"The recent excitement about neural networks","volume":"337","author":"Crick","year":"1989","journal-title":"Nature"},{"key":"ref_13","unstructured":"(1989, January 18\u201322). Is backpropagation biologically plausible? Proceedings of the International 1989 Joint Conference on Neural Networks, Washington, DC, USA."},
{"key":"ref_14","unstructured":"Jiang, K. (2024). A Neural Network Framework Based on Symmetric Differential Equations. ChinaXiv, ChinaXiv:202410.00055."},{"key":"ref_15","unstructured":"Jiang, K. (2025). From Propagator to Oscillator: The Dual Role of Symmetric Differential Equations in Neural Systems. arXiv."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Izhikevich, E.M. (2007). Dynamical Systems in Neuroscience, MIT Press.","DOI":"10.7551\/mitpress\/2526.001.0001"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Olver, P.J. (1995). Equivalence, Invariants and Symmetry, Cambridge University Press.","DOI":"10.1017\/CBO9780511609565"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"433","DOI":"10.1002\/wics.101","article-title":"Principal Component Analysis","volume":"2","author":"Abdi","year":"2010","journal-title":"Wiley Interdiscip. Rev. Comput. Stat."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Jiang, K. (2025). A Neural Network Training Method Based on Distributed PID Control. Symmetry, 17.","DOI":"10.3390\/sym17071129"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Jiang, K. (2025). A Neural Network Training Method Based on Neuron Connection Coefficient Adjustments. arXiv.","DOI":"10.3390\/sym17071129"}],
"container-title":["Algorithms"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-4893\/18\/10\/594\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,9]],"date-time":"2025-10-09T18:47:37Z","timestamp":1760035657000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-4893\/18\/10\/594"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,9,23]]},"references-count":20,"journal-issue":{"issue":"10","published-online":{"date-parts":[[2025,10]]}},"alternative-id":["a18100594"],"URL":"https:\/\/doi.org\/10.3390\/a18100594","relation":{},"ISSN":["1999-4893"],"issn-type":[{"type":"electronic","value":"1999-4893"}],"subject":[],"published":{"date-parts":[[2025,9,23]]}}}