{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,31]],"date-time":"2026-03-31T11:12:21Z","timestamp":1774955541515,"version":"3.50.1"},"reference-count":30,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2026,2,1]],"date-time":"2026-02-01T00:00:00Z","timestamp":1769904000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2026,2,9]],"date-time":"2026-02-09T00:00:00Z","timestamp":1770595200000},"content-version":"vor","delay-in-days":8,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100001659","name":"Deutsche Forschungsgemeinschaft","doi-asserted-by":"publisher","award":["SPP 2298"],"award-info":[{"award-number":["SPP 2298"]}],"id":[{"id":"10.13039\/501100001659","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001659","name":"Deutsche Forschungsgemeinschaft","doi-asserted-by":"publisher","award":["SPP 2298"],"award-info":[{"award-number":["SPP 2298"]}],"id":[{"id":"10.13039\/501100001659","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Munich Data Science Institute"},{"DOI":"10.13039\/501100001663","name":"Volkswagen Foundation","doi-asserted-by":"publisher","id":[{"id":"10.13039\/501100001663","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Adv Comput Math"],"published-print":{"date-parts":[[2026,2]]},"abstract":"<jats:title>Abstract<\/jats:title>\n                  <jats:p>\n                    Besides classical feed-forward neural networks such as multilayer perceptrons, also neural ordinary differential equations (neural ODEs) have gained particular interest in recent years. Neural ODEs can be interpreted as an infinite depth limit of feed-forward or residual neural networks. 
We study the input\u2013output dynamics of finite and infinite depth neural networks with scalar output. In the finite-depth case, the input is a state associated with a finite number of nodes, which maps under multiple non-linear transformations to the state of one output node. Analogously, a neural ODE maps an affine linear transformation of the input to an affine linear transformation of its time-\n                    <jats:italic>T<\/jats:italic>\n                    map. We show that, depending on the specific structure of the network, the input\u2013output map has different properties regarding the existence and regularity of critical points. These properties can be characterized via Morse functions, which are scalar functions whose critical points are all non-degenerate. We prove that critical points cannot exist if the dimension of the hidden layer is monotonically decreasing or the dimension of the phase space is smaller than or equal to the input dimension. In the case that critical points exist, we classify their regularity depending on the specific architecture of the network. We show that, except for a Lebesgue measure zero set in the weight space, each critical point is non-degenerate if, for finite-depth neural networks, the underlying graph has no bottleneck, and, for neural ODEs, the affine linear transformations used have full rank. For each type of architecture, the proven properties are comparable in the finite and infinite depth cases. The established theorems allow us to formulate results on universal embedding and universal approximation, i.e., on the exact and approximate representation of maps by neural networks and neural ODEs. 
Our dynamical systems viewpoint on the geometric structure of the input\u2013output map provides a fundamental understanding of why certain architectures perform better than others.\n                  <\/jats:p>","DOI":"10.1007\/s10444-025-10273-5","type":"journal-article","created":{"date-parts":[[2026,2,9]],"date-time":"2026-02-09T12:20:02Z","timestamp":1770639602000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Analysis of the geometric structure of neural networks and neural ODEs via Morse functions"],"prefix":"10.1007","volume":"52","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-7063-6173","authenticated-orcid":false,"given":"Christian","family":"Kuehn","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0009-0000-4611-9742","authenticated-orcid":false,"given":"Sara-Viola","family":"Kuntz","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2026,2,9]]},"reference":[{"key":"10273_CR1","doi-asserted-by":"publisher","unstructured":"Aggarwal, C.C.: Neural Networks and Deep Learning. Springer, 1 edition (2018). https:\/\/doi.org\/10.1007\/978-3-319-94463-0","DOI":"10.1007\/978-3-319-94463-0"},{"key":"10273_CR2","doi-asserted-by":"publisher","unstructured":"Chen, R.T.Q.,\u00a0Rubanova, Y.,\u00a0Bettencourt, J.,\u00a0Duvenaud, D.: Neural ordinary differential equations. NeurIPS (2018). https:\/\/doi.org\/10.48550\/ARXIV.1806.07366","DOI":"10.48550\/ARXIV.1806.07366"},{"key":"10273_CR3","doi-asserted-by":"publisher","unstructured":"Chicone, C.: Ordinary Differential Equations with Applications, volume\u00a034 of Texts in Applied Mathematics. Springer New York, 2 edition (2006). 
https:\/\/doi.org\/10.1007\/0-387-35794-7","DOI":"10.1007\/0-387-35794-7"},{"key":"10273_CR4","doi-asserted-by":"publisher","unstructured":"Cook, B.J., Peterson, A.D.H.,\u00a0Woldman, W., Terry, J.R.: Neural field models: a mathematical overview and unifying framework. Math. Neurosci. Appl. 2 (2022). https:\/\/doi.org\/10.46298\/mna.7284","DOI":"10.46298\/mna.7284"},{"key":"10273_CR5","doi-asserted-by":"publisher","unstructured":"Esteve, C.,\u00a0Geshkovski, B.,\u00a0Pighin, D.,\u00a0Zuazua, E.: Large-time asymptotics in deep learning. (2020). https:\/\/doi.org\/10.48550\/ARXIV.2008.02491","DOI":"10.48550\/ARXIV.2008.02491"},{"key":"10273_CR6","doi-asserted-by":"publisher","unstructured":"Forster, O.: Analysis 2, Differentialrechnung im $$\\mathbb{R}^{n}$$, gew\u00f6hnliche Differentialgleichungen. Grundkurs Mathematik. Springer Spektrum, 11 edition (2017). https:\/\/doi.org\/10.1007\/978-3-658-19411-6","DOI":"10.1007\/978-3-658-19411-6"},{"key":"10273_CR7","unstructured":"Goodfellow, I.,\u00a0Bengio, Y.,\u00a0Courville, A.: Deep Learning. MIT Press, (2016). http:\/\/www.deeplearningbook.org"},{"key":"10273_CR8","doi-asserted-by":"publisher","unstructured":"Guckenheimer, J.,\u00a0Holmes, P.: Nonlinear Oscillations, Dynamical Systems, and Bifurcations of Vector Fields, volume\u00a042 of Applied Mathematical Sciences. Springer New York, 7 edition (2002). https:\/\/doi.org\/10.1007\/978-1-4612-1140-2","DOI":"10.1007\/978-1-4612-1140-2"},{"key":"10273_CR9","unstructured":"Hale, J.K.: Ordinary Differential Equations. Krieger Publishing Company, 2 edition (1980)"},{"key":"10273_CR10","doi-asserted-by":"publisher","unstructured":"He, K.,\u00a0Zhang, X.,\u00a0Ren, S.,\u00a0Sun, J.: Deep residual learning for image recognition. IEEE Conference on Computer Vision and Pattern Recognition, pages 770\u2013778 (2016). 
https:\/\/doi.org\/10.1109\/cvpr.2016.90","DOI":"10.1109\/cvpr.2016.90"},{"key":"10273_CR11","doi-asserted-by":"publisher","unstructured":"Hirsch, M.W.: Differential Topology, volume\u00a033 of Graduate Texts in Mathematics. Springer New York, (1976). https:\/\/doi.org\/10.1007\/978-1-4684-9449-5","DOI":"10.1007\/978-1-4684-9449-5"},{"issue":"5","key":"10273_CR12","doi-asserted-by":"publisher","first-page":"359","DOI":"10.1016\/0893-6080(89)90020-8","volume":"2","author":"K Hornik","year":"1989","unstructured":"Hornik, K., Stinchcombe, M., White, H.: Multilayer feedforward networks are universal approximators. Neural Netw. 2(5), 359\u2013366 (1989). https:\/\/doi.org\/10.1016\/0893-6080(89)90020-8","journal-title":"Neural Netw."},{"key":"10273_CR13","doi-asserted-by":"publisher","unstructured":"Johnson, J.: Deep, skinny neural networks are not universal approximators. (2018). https:\/\/doi.org\/10.48550\/arXiv.1810.00393","DOI":"10.48550\/arXiv.1810.00393"},{"key":"10273_CR14","doi-asserted-by":"publisher","unstructured":"Kidger, P.: On Neural Differential Equations. PhD thesis, Mathematical Institute, University of Oxford, (2022). https:\/\/doi.org\/10.48550\/ARXIV.2202.02435","DOI":"10.48550\/ARXIV.2202.02435"},{"issue":"5\u20136","key":"10273_CR15","doi-asserted-by":"publisher","first-page":"435","DOI":"10.1007\/s10472-020-09723-1","volume":"89","author":"A Kratsios","year":"2021","unstructured":"Kratsios, A.: The universal approximation property - characterization, construction, representation, and existence. Ann. Math. Artif. Intell. 89(5\u20136), 435\u2013469 (2021). https:\/\/doi.org\/10.1007\/s10472-020-09723-1","journal-title":"Ann. Math. Artif. Intell."},{"key":"10273_CR16","doi-asserted-by":"publisher","unstructured":"Kuehn, C., Kuntz, S.-V.: Embedding capabilities of neural ODEs. Preprint, (2023). 
https:\/\/doi.org\/10.48550\/ARXIV.2308.01213","DOI":"10.48550\/ARXIV.2308.01213"},{"key":"10273_CR17","unstructured":"Kuehn, C., Kuntz, S.-V.: The influence of the memory capacity of neural DDEs on the universal approximation property. (2025). arXiv:2505.07244"},{"issue":"7","key":"10273_CR18","doi-asserted-by":"publisher","first-page":"1162","DOI":"10.1134\/s0965542521070101","volume":"61","author":"SV Kurochkin","year":"2021","unstructured":"Kurochkin, S.V.: Neural network with smooth activation functions and without bottlenecks is almost surely a Morse function. Comput. Math. Math. Phys. 61(7), 1162\u20131168 (2021). https:\/\/doi.org\/10.1134\/s0965542521070101","journal-title":"Comput. Math. Math. Phys."},{"key":"10273_CR19","doi-asserted-by":"publisher","unstructured":"Lin, H.,\u00a0Jegelka, S.: ResNet with one-neuron hidden layers is a universal approximator. Adv. Neural Inf. Process. Syst. 31, 6169\u20136178 (2018). https:\/\/doi.org\/10.48550\/ARXIV.1806.10909","DOI":"10.48550\/ARXIV.1806.10909"},{"key":"10273_CR20","doi-asserted-by":"publisher","unstructured":"Magnus, J.R.,\u00a0Neudecker, H.: Matrix Differential Calculus with Applications in Statistics and Econometrics. Wiley, 3 edition, (2019). https:\/\/doi.org\/10.1002\/9781119541219","DOI":"10.1002\/9781119541219"},{"key":"10273_CR21","doi-asserted-by":"crossref","unstructured":"Morse, M.: The Calculus of Variations in the Large, volume\u00a018 of Colloquium Publications. Am Math Soc (1934)","DOI":"10.1090\/coll\/018"},{"key":"10273_CR22","doi-asserted-by":"publisher","unstructured":"Nicolaescu, L.: An Invitation to Morse Theory. Springer New York (2011). https:\/\/doi.org\/10.1007\/978-1-4614-1105-5","DOI":"10.1007\/978-1-4614-1105-5"},{"key":"10273_CR23","doi-asserted-by":"publisher","first-page":"143","DOI":"10.1017\/s0962492900002919","volume":"8","author":"A Pinkus","year":"1999","unstructured":"Pinkus, A.: Approximation theory of the MLP model in neural networks. 
Acta Numer 8, 143\u2013195 (1999). https:\/\/doi.org\/10.1017\/s0962492900002919","journal-title":"Acta Numer"},{"key":"10273_CR24","doi-asserted-by":"publisher","unstructured":"Prasolov, V.: Elements of Combinatorial and Differential Topology. Am. Math. Soc. (2006). https:\/\/doi.org\/10.1090\/gsm\/074","DOI":"10.1090\/gsm\/074"},{"key":"10273_CR25","unstructured":"Rosenblatt, F.: The perceptron - a perceiving and recognizing automaton. Cornell Aeronautical Laboratory, INC., Buffalo, New York, 85(460-1) (1957)"},{"key":"10273_CR26","doi-asserted-by":"publisher","unstructured":"Sch\u00e4fer, A.M., Zimmermann, H.G.: Recurrent Neural Networks Are Universal Approximators, pages 632\u2013640. Springer Berlin Heidelberg, (2006). https:\/\/doi.org\/10.1007\/11840817_66","DOI":"10.1007\/11840817_66"},{"key":"10273_CR27","doi-asserted-by":"publisher","unstructured":"Shafarevich, I.R., Remizov, A.O.: Linear Algebra and Geometry. Springer Berlin Heidelberg, (2013). https:\/\/doi.org\/10.1007\/978-3-642-30994-6","DOI":"10.1007\/978-3-642-30994-6"},{"issue":"5","key":"10273_CR28","doi-asserted-by":"publisher","first-page":"951","DOI":"10.1007\/s00010-016-0412-4","volume":"90","author":"N Thome","year":"2016","unstructured":"Thome, N.: Inequalities and equalities for $$l=2$$ (Sylvester), $$l=3$$ (Frobenius), and $$l>3$$ matrices. Aequationes Math. 90(5), 951\u2013960 (2016). https:\/\/doi.org\/10.1007\/s00010-016-0412-4","journal-title":"Aequationes Math."},{"key":"10273_CR29","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/s40304-017-0103-z","volume":"5","author":"E Weinan","year":"2017","unstructured":"Weinan, E.: A proposal on machine learning via dynamical systems. Commun. Math. Stat 5, 1\u201311 (2017). https:\/\/doi.org\/10.1007\/s40304-017-0103-z","journal-title":"Commun. Math. 
Stat"},{"key":"10273_CR30","doi-asserted-by":"publisher","unstructured":"Zhang, H.,\u00a0Gao, X.,\u00a0Unterman, J.,\u00a0Arodz, T.: Approximation capabilities of neural ODEs and invertible residual networks. Proceedings of the 37th International Conference on Machine Learning 119, 11086\u201311095 (2020). https:\/\/doi.org\/10.48550\/ARXIV.1907.12998","DOI":"10.48550\/ARXIV.1907.12998"}],"container-title":["Advances in Computational Mathematics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10444-025-10273-5.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s10444-025-10273-5","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10444-025-10273-5.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,3,31]],"date-time":"2026-03-31T10:27:05Z","timestamp":1774952825000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s10444-025-10273-5"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2026,2]]},"references-count":30,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2026,2]]}},"alternative-id":["10273"],"URL":"https:\/\/doi.org\/10.1007\/s10444-025-10273-5","relation":{},"ISSN":["1019-7168","1572-9044"],"issn-type":[{"value":"1019-7168","type":"print"},{"value":"1572-9044","type":"electronic"}],"subject":[],"published":{"date-parts":[[2026,2]]},"assertion":[{"value":"27 June 2024","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"14 November 2025","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"9 February 
2026","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare no competing interests.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}}],"article-number":"9"}}