{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,7,30]],"date-time":"2025-07-30T16:31:09Z","timestamp":1753893069491,"version":"3.41.2"},"reference-count":22,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2023,11,24]],"date-time":"2023-11-24T00:00:00Z","timestamp":1700784000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Artif. Intell."],"abstract":"<jats:p>We propose the geometric framework of the Schubert variety as a tool for representing a collection of subspaces of a fixed vector space. Specifically, given a collection of <jats:italic>l<\/jats:italic>-dimensional subspaces <jats:italic>V<\/jats:italic><jats:sub>1<\/jats:sub>, \u2026, <jats:italic>V<\/jats:italic><jats:sub><jats:italic>r<\/jats:italic><\/jats:sub> of \u211d<jats:sup><jats:italic>n<\/jats:italic><\/jats:sup>, represented as the column spaces of matrices <jats:italic>X<\/jats:italic><jats:sub>1<\/jats:sub>, \u2026, <jats:italic>X<\/jats:italic><jats:sub><jats:italic>r<\/jats:italic><\/jats:sub>, we seek to determine a <jats:italic>representative<\/jats:italic> matrix <jats:italic>K<\/jats:italic>\u2208\u211d<jats:sup><jats:italic>n<\/jats:italic>\u00d7<jats:italic>k<\/jats:italic><\/jats:sup> such that each subspace <jats:italic>V<\/jats:italic><jats:sub><jats:italic>i<\/jats:italic><\/jats:sub> intersects (or comes close to intersecting) the span of the columns of <jats:italic>K<\/jats:italic> in at least <jats:italic>c<\/jats:italic> dimensions. We formulate a non-convex optimization problem to determine such a <jats:italic>K<\/jats:italic> along with associated sets of vectors {<jats:italic>a<\/jats:italic><jats:sub><jats:italic>i<\/jats:italic><\/jats:sub>} and {<jats:italic>b<\/jats:italic><jats:sub><jats:italic>i<\/jats:italic><\/jats:sub>} used to express linear combinations of the columns of the <jats:italic>X<\/jats:italic><jats:sub><jats:italic>i<\/jats:italic><\/jats:sub> that are close to linear combinations of the columns of <jats:italic>K<\/jats:italic>. Further, we present a mechanism for integrating this representation into an artificial neural network architecture as a computational unit (which we refer to as an abstract node). The representative matrix <jats:italic>K<\/jats:italic> can be learned <jats:italic>in situ<\/jats:italic>, or sequentially, as part of a learning problem. Additionally, the matrix <jats:italic>K<\/jats:italic> can be employed as a change of coordinates in the learning problem. The set of all <jats:italic>l<\/jats:italic>-dimensional subspaces of \u211d<jats:sup><jats:italic>n<\/jats:italic><\/jats:sup> that intersect the span of the columns of <jats:italic>K<\/jats:italic> in at least <jats:italic>c<\/jats:italic> dimensions is an example of a Schubert subvariety of the Grassmannian <jats:italic>GR<\/jats:italic>(<jats:italic>l, n<\/jats:italic>). When it is not possible to find a Schubert variety passing through a collection of points on <jats:italic>GR<\/jats:italic>(<jats:italic>l, n<\/jats:italic>), the goal of the non-convex optimization problem is to find the Schubert variety of best fit, i.e., the Schubert variety that comes as close as possible to the points. This may be viewed as an analog of finding a subspace of best fit to data in a vector space. The approach we take is well-suited to the modeling of collections of sets of data either as a stand-alone Schubert variety of best fit (SVBF), or in the processing workflow of a deep neural network. We present applications to some classification problems on sets of data to illustrate the behavior of the method.<\/jats:p>","DOI":"10.3389\/frai.2023.1274830","type":"journal-article","created":{"date-parts":[[2023,11,24]],"date-time":"2023-11-24T08:26:21Z","timestamp":1700814381000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":0,"title":["An algorithm for computing Schubert varieties of best fit with applications"],"prefix":"10.3389","volume":"6","author":[{"given":"Karim","family":"Karimov","sequence":"first","affiliation":[]},{"given":"Michael","family":"Kirby","sequence":"additional","affiliation":[]},{"given":"Chris","family":"Peterson","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2023,11,24]]},"reference":[{"key":"B1","doi-asserted-by":"publisher","first-page":"85","DOI":"10.3233\/IDA-2002-6106","article-title":"The bilipschitz criterion for mapping design in data analysis","volume":"6","author":"Anderle","year":"2002","journal-title":"Intell. Data Anal"},{"key":"B2","doi-asserted-by":"publisher","first-page":"351","DOI":"10.1109\/TPAMI.2008.200","article-title":"Principal angles separate subject illumination spaces in YDB and CMU-pie","volume":"31","author":"Beveridge","year":"2008","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell"},{"key":"B3","doi-asserted-by":"publisher","first-page":"2114","DOI":"10.1137\/S0036139998338583","article-title":"A new approach for dimensionality reduction: theory and algorithms","volume":"60","author":"Broomhead","year":"2000","journal-title":"SIAM J. Appl. Math"},{"key":"B4","doi-asserted-by":"publisher","first-page":"2595","DOI":"10.1162\/089976601753196049","article-title":"The Whitney reduction network: a method for computing autoassociative graphs","volume":"13","author":"Broomhead","year":"2001","journal-title":"Neural Comput"},{"key":"B5","first-page":"20","article-title":"Supervised dimensionality reduction and visualization using centroid-encoder","volume":"23","author":"Ghosh","year":"2022","journal-title":"J. Mach. Learn. Res"},{"key":"B6","first-page":"78","article-title":"The first stage of perception: growth of the assembly","volume":"4","author":"Hebb","year":"1949","journal-title":"Organ. Behav"},{"key":"B7","doi-asserted-by":"publisher","first-page":"70","DOI":"10.1063\/1.2810360","article-title":"Introduction to the theory of neural computation","volume":"44","author":"Hertz","year":"1991","journal-title":"Am. Instit. Phys"},{"key":"B8","first-page":"27","article-title":"\u201cSpherical nodes in neural networks with applications,\u201d","volume-title":"Intelligent Engineering Through Artificial Neural Networks, Vol. 5","author":"Hundley","year":"1995"},{"key":"B9","doi-asserted-by":"publisher","first-page":"307","DOI":"10.1561\/2200000056","article-title":"An introduction to variational autoencoders","volume":"12","author":"Kingma","year":"2019","journal-title":"Found. Trends Mach. Learn"},{"key":"B10","doi-asserted-by":"publisher","first-page":"390","DOI":"10.1162\/neco.1996.8.2.390","article-title":"Circular nodes in neural networks","volume":"8","author":"Kirby","year":"1996","journal-title":"Neural Comput"},{"key":"B11","doi-asserted-by":"publisher","first-page":"233","DOI":"10.1002\/aic.690370209","article-title":"Nonlinear principal component analysis using autoassociative neural networks","volume":"37","author":"Kramer","year":"1991","journal-title":"AIChE J"},{"key":"B12","first-page":"69","article-title":"\u201cA GPU-oriented algorithm design for secant-based dimensionality reduction,\u201d","volume-title":"2018 17th International Symposium on Parallel and Distributed Computing (ISPDC)","author":"Kvinge","year":""},{"key":"B13","first-page":"1","article-title":"\u201cToo many secants: a hierarchical approach to secant-based dimensionality reduction on large data sets,\u201d","volume-title":"2018 IEEE High Performance Extreme Computing Conference (HPEC)","author":"Kvinge","year":""},{"key":"B14","first-page":"4185","article-title":"\u201cThe flag manifold as a tool for analyzing and comparing sets of data sets,\u201d","volume-title":"Proceedings of the IEEE\/CVF International Conference on Computer Vision (ICCV) Workshops","author":"Ma","year":"2021"},{"key":"B15","first-page":"10339","article-title":"\u201cThe flag median and flagirls,\u201d","volume-title":"Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition","author":"Mankovich","year":"2022"},{"key":"B16","doi-asserted-by":"crossref","first-page":"457","DOI":"10.1007\/978-3-319-10705-9_45","article-title":"\u201cFlag manifolds for the characterization of geometric structure in large data sets,\u201d","volume-title":"Numerical Mathematics and Advanced Applications-ENUMATH 2013","author":"Marrinan","year":"2015"},{"key":"B17","doi-asserted-by":"crossref","first-page":"1082","DOI":"10.1109\/CVPR.2014.142","article-title":"\u201cFinding the subspace mean or median to fit your need,\u201d","volume-title":"2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)","author":"Marrinan","year":"2014"},{"key":"B18","doi-asserted-by":"publisher","first-page":"115","DOI":"10.1007\/BF02478259","article-title":"A logical calculus of the ideas immanent in nervous activity","volume":"5","author":"McCulloch","year":"1943","journal-title":"Bull. Math. Biophys"},{"key":"B19","doi-asserted-by":"publisher","first-page":"927","DOI":"10.1016\/S0893-6080(05)80089-9","article-title":"Principal components, minor components and linear neural networks","volume":"5","author":"Oja","year":"1992","journal-title":"Neural Netw"},{"key":"B20","first-page":"252","article-title":"\u201cLearning in a single pass: a neural model for principal component analysis and linear regression,\u201d","volume-title":"Proceedings of the IEE International Conference on Artificial Neural Networks","author":"Rosenblatt","year":"2002"},{"key":"B21","article-title":"\u201cAttention is all you need,\u201d","volume-title":"Advances in Neural Information Processing Systems, Vol. 30","author":"Vaswani","year":"2017"},{"key":"B22","doi-asserted-by":"publisher","first-page":"1945","DOI":"10.1109\/TPAMI.2005.244","article-title":"Generalized principal component analysis (GPCA)","volume":"27","author":"Vidal","year":"2005","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell"}],"container-title":["Frontiers in Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2023.1274830\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,11,24]],"date-time":"2023-11-24T08:26:30Z","timestamp":1700814390000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2023.1274830\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,11,24]]},"references-count":22,"alternative-id":["10.3389\/frai.2023.1274830"],"URL":"https:\/\/doi.org\/10.3389\/frai.2023.1274830","relation":{},"ISSN":["2624-8212"],"issn-type":[{"type":"electronic","value":"2624-8212"}],"subject":[],"published":{"date-parts":[[2023,11,24]]},"article-number":"1274830"}}