{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,24]],"date-time":"2026-04-24T08:26:34Z","timestamp":1777019194149,"version":"3.51.4"},"reference-count":48,"publisher":"IOP Publishing","issue":"2","license":[{"start":{"date-parts":[[2025,4,22]],"date-time":"2025-04-22T00:00:00Z","timestamp":1745280000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"},{"start":{"date-parts":[[2025,4,22]],"date-time":"2025-04-22T00:00:00Z","timestamp":1745280000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/iopscience.iop.org\/info\/page\/text-and-data-mining"}],"funder":[{"DOI":"10.13039\/100015711","name":"Michigan Institute for Data Science, University of Michigan","doi-asserted-by":"crossref","id":[{"id":"10.13039\/100015711","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["iopscience.iop.org"],"crossmark-restriction":false},"short-container-title":["Mach. Learn.: Sci. Technol."],"published-print":{"date-parts":[[2025,6,30]]},"abstract":"<jats:title>Abstract<\/jats:title>\n               <jats:p>Neural-network quantum states (NQS) have emerged as a powerful application of quantum-inspired deep learning for variational Monte Carlo methods, offering a competitive alternative to existing techniques for identifying ground states of quantum problems. A significant advancement toward improving the practical scalability of NQS has been the incorporation of autoregressive models, most recently transformers, as variational ans\u00e4tze. Transformers learn sequence information with greater expressiveness than recurrent models, but at the cost of increased time complexity with respect to sequence length. We explore the use of the retentive network (RetNet), a recurrent alternative to transformers, as an ansatz for solving electronic ground state problems in <jats:italic>ab initio<\/jats:italic> quantum chemistry. Unlike transformers, RetNets overcome this time complexity bottleneck by processing data in parallel during training, and recurrently during inference. We give a simple computational cost estimate of the RetNet and directly compare it with similar estimates for transformers, establishing a clear threshold ratio of problem-to-model size past which the RetNet\u2019s time complexity outperforms that of the transformer. Though this efficiency comes at the expense of decreased expressiveness relative to the transformer, we overcome this gap through training strategies that leverage the autoregressive structure of the model\u2014namely, variational neural annealing. Our findings support the RetNet as a means of improving the time complexity of NQS without sacrificing accuracy. We provide further evidence that the ablative improvements of neural annealing extend beyond the RetNet architecture, suggesting it would serve as an effective general training strategy for autoregressive NQS.<\/jats:p>","DOI":"10.1088\/2632-2153\/adcb88","type":"journal-article","created":{"date-parts":[[2025,4,10]],"date-time":"2025-04-10T22:53:17Z","timestamp":1744325597000},"page":"025022","update-policy":"https:\/\/doi.org\/10.1088\/crossmark-policy","source":"Crossref","is-referenced-by-count":4,"title":["Retentive neural quantum states: efficient ans\u00e4tze for <i>ab initio<\/i> quantum chemistry"],"prefix":"10.1088","volume":"6","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-9163-943X","authenticated-orcid":true,"given":"Oliver","family":"Knitter","sequence":"first","affiliation":[]},{"given":"Dan","family":"Zhao","sequence":"additional","affiliation":[]},{"given":"James","family":"Stokes","sequence":"additional","affiliation":[]},{"given":"Martin","family":"Ganahl","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0009-0004-9701-1968","authenticated-orcid":true,"given":"Stefan","family":"Leichenauer","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2294-7233","authenticated-orcid":true,"given":"Shravan","family":"Veerapaneni","sequence":"additional","affiliation":[]}],"member":"266","published-online":{"date-parts":[[2025,4,22]]},"reference":[{"key":"mlstadcb88bib1","doi-asserted-by":"publisher","first-page":"351","DOI":"10.1038\/s42256-022-00461-z","article-title":"Autoregressive neural-network wavefunctions for ab initio quantum chemistry","volume":"4","author":"Barrett","year":"2022","journal-title":"Nat. Mach. Intell."},{"key":"mlstadcb88bib2","doi-asserted-by":"publisher","first-page":"291","DOI":"10.1103\/RevModPhys.79.291","article-title":"Coupled-cluster theory in quantum chemistry","volume":"79","author":"Bartlett","year":"2007","journal-title":"Rev. Mod. Phys."},{"key":"mlstadcb88bib3","doi-asserted-by":"publisher","first-page":"618","DOI":"10.1038\/s42256-022-00509-0","article-title":"Neural error mitigation of near-term quantum simulations","volume":"4","author":"Bennewitz","year":"2022","journal-title":"Nat. Mach. Intell."},{"key":"mlstadcb88bib4","doi-asserted-by":"publisher","first-page":"457","DOI":"10.1002\/andp.19273892002","article-title":"Zur quantentheorie der molekeln","volume":"389","author":"Born","year":"1927","journal-title":"Ann. Phys., Lpz."},{"key":"mlstadcb88bib5","doi-asserted-by":"publisher","first-page":"602","DOI":"10.1126\/science.aag2302","article-title":"Solving the quantum many-body problem with artificial neural networks","volume":"355","author":"Carleo","year":"2017","journal-title":"Science"},{"key":"mlstadcb88bib6","doi-asserted-by":"publisher","first-page":"2368","DOI":"10.1038\/s41467-020-15724-9","article-title":"Fermionic neural-network states for ab-initio electronic structure","volume":"11","author":"Choo","year":"2020","journal-title":"Nat. Commun."},{"key":"mlstadcb88bib7","article-title":"FlashAttention-2: faster attention with better parallelism and work partitioning","author":"Dao","year":"2024"},{"key":"mlstadcb88bib8","first-page":"pp 16344","article-title":"Flashattention: fast and memory-efficient exact attention with io-awareness","volume":"vol 35","author":"Dao","year":"2022"},{"key":"mlstadcb88bib9","first-page":"pp 881","article-title":"Made: masked autoencoder for distribution estimation","author":"Germain","year":"2015"},{"key":"mlstadcb88bib10","article-title":"Classical quantum optimization with neural network quantum states","author":"Gomes","year":"2019"},{"key":"mlstadcb88bib11","article-title":"Mamba: linear-time sequence modeling with selective state spaces","author":"Gu","year":"2023"},{"key":"mlstadcb88bib12","doi-asserted-by":"publisher","DOI":"10.1016\/j.jcp.2019.108929","article-title":"Solving many-electron schr\u00f6dinger equation using deep neural networks","volume":"399","author":"Han","year":"2019","journal-title":"J. Comput. Phys."},{"key":"mlstadcb88bib13","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevResearch.2.023358","article-title":"Recurrent neural network wave functions","volume":"2","author":"Hibat-Allah","year":"2020","journal-title":"Phys. Rev. Res."},{"key":"mlstadcb88bib14","doi-asserted-by":"publisher","first-page":"952","DOI":"10.1038\/s42256-021-00401-3","article-title":"Variational neural annealing","volume":"3","author":"Hibat-Allah","year":"2021","journal-title":"Nat. Mach. Intell."},{"key":"mlstadcb88bib15","article-title":"Scaling laws for neural language models","author":"Kaplan","year":"2020"},{"key":"mlstadcb88bib16","first-page":"pp 5156","article-title":"Transformers are RNNs: fast autoregressive transformers with linear attention","author":"Katharopoulos","year":"2020"},{"key":"mlstadcb88bib17","article-title":"Toward neural network simulation of variational quantum algorithms","author":"Knitter","year":"2022"},{"key":"mlstadcb88bib18","doi-asserted-by":"publisher","DOI":"10.1063\/5.0214150","article-title":"Improved optimization for the neural-network quantum states and tests on the chromium dimer","volume":"160","author":"Li","year":"2024","journal-title":"J. Chem. Phys."},{"key":"mlstadcb88bib19","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevB.110.115137","article-title":"Neural network backflow for ab initio quantum chemistry","volume":"110","author":"Liu","year":"2024","journal-title":"Phys. Rev. B"},{"key":"mlstadcb88bib20","first-page":"pp 1","article-title":"Stochastic gradient descent with warm restarts","author":"Loshchilov","year":"2017"},{"key":"mlstadcb88bib21","article-title":"Autoregressive neural quantum states with quantum number symmetries","author":"Malyshev","year":"2023"},{"key":"mlstadcb88bib22","doi-asserted-by":"publisher","DOI":"10.1103\/RevModPhys.92.015003","article-title":"Quantum computational chemistry","volume":"92","author":"McArdle","year":"2020","journal-title":"Rev. Mod. Phys."},{"key":"mlstadcb88bib23","doi-asserted-by":"publisher","DOI":"10.1088\/2058-9565\/ab8ebc","article-title":"Openfermion: the electronic structure package for quantum computers","volume":"5","author":"McClean","year":"2020","journal-title":"Quantum Sci. Technol."},{"key":"mlstadcb88bib24","first-page":"pp 1928","article-title":"Asynchronous methods for deep reinforcement learning","author":"Mnih","year":"2016"},{"key":"mlstadcb88bib25","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1214\/10-BA521","article-title":"Monte Carlo gradient estimation in machine learning","volume":"21","author":"Guha","year":"2020","journal-title":"J. Mach. Learn. Res."},{"key":"mlstadcb88bib26","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1038\/s41534-020-0259-3","article-title":"Ground-state energy estimation of the water molecule on a trapped-ion quantum computer","volume":"6","author":"Nam","year":"2020","journal-title":"npj Quantum Inf."},{"key":"mlstadcb88bib27","article-title":"PyTorch: an imperative style, high-performance deep learning library","volume":"vol 32","author":"Paszke","year":"2019"},{"key":"mlstadcb88bib28","doi-asserted-by":"crossref","DOI":"10.18653\/v1\/2023.findings-emnlp.936","article-title":"RWKV: reinventing rnns for the transformer era","author":"Peng","year":"2023"},{"key":"mlstadcb88bib29","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevResearch.2.033429","article-title":"Ab initio solution of the many-electron schr\u00f6dinger equation with deep neural networks","volume":"2","author":"Pfau","year":"2020","journal-title":"Phys. Rev. Res."},{"key":"mlstadcb88bib30","first-page":"pp 606","article-title":"Efficiently scaling transformer inference","volume":"vol 5","author":"Pope","year":"2023"},{"key":"mlstadcb88bib31","doi-asserted-by":"publisher","first-page":"259","DOI":"10.1103\/RevModPhys.77.259","article-title":"The density-matrix renormalization group","volume":"77","author":"Schollw\u00f6ck","year":"2005","journal-title":"Rev. Mod. Phys."},{"key":"mlstadcb88bib32","article-title":"Tangelo: an open-source python package for end-to-end chemistry workflows on quantum computers","author":"Senicourt","year":"2022"},{"key":"mlstadcb88bib33","article-title":"Flashattention-3: fast and accurate attention with asynchrony and low-precision","author":"Shah","year":"2024"},{"key":"mlstadcb88bib34","article-title":"Solving Schr\u00f6dinger equation with a language model","author":"Shang","year":"2023"},{"key":"mlstadcb88bib35","doi-asserted-by":"publisher","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"mlstadcb88bib36","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevLett.124.020503","article-title":"Deep autoregressive models for the efficient variational simulation of many-body quantum systems","volume":"124","author":"Sharir","year":"2020","journal-title":"Phys. Rev. Lett."},{"key":"mlstadcb88bib37","doi-asserted-by":"publisher","DOI":"10.1063\/5.0006002","article-title":"PSI4 1.4: open-source software for high-throughput quantum chemistry","volume":"152","author":"Smith","year":"2020","journal-title":"J. Chem. Phys."},{"key":"mlstadcb88bib38","doi-asserted-by":"publisher","first-page":"e1340","DOI":"10.1002\/wcms.1340","article-title":"PySCF: the python-based simulations of chemistry framework","volume":"8","author":"Sun","year":"2018","journal-title":"Wiley Interdiscip. Rev.-Comput. Mol. Sci."},{"key":"mlstadcb88bib39","article-title":"Retentive network: a successor to transformer for large language models","author":"Sun","year":"2023"},{"key":"mlstadcb88bib40","article-title":"A length-extrapolatable transformer","author":"Sun","year":"2022"},{"key":"mlstadcb88bib41","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.physrep.2022.08.003","article-title":"The variational quantum eigensolver: a review of methods and best practices","volume":"986","author":"Tilly","year":"2022","journal-title":"Phys. Rep."},{"key":"mlstadcb88bib42","doi-asserted-by":"publisher","first-page":"784","DOI":"10.26421\/QIC11.9-10-5","article-title":"Simulating quantum computers with probabilistic methods","volume":"11","author":"Van Den Nest","year":"2011","journal-title":"Quantum Inf. Comput."},{"key":"mlstadcb88bib43","article-title":"Attention is all you need","volume":"vol 30","author":"Vaswani","year":"2017"},{"key":"mlstadcb88bib44","first-page":"pp 1","article-title":"Nnqs-transformer: an efficient and scalable neural network quantum states approach for ab initio quantum chemistry","author":"Wu","year":"2023"},{"key":"mlstadcb88bib45","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevResearch.2.012039","article-title":"Deep learning-enhanced variational monte carlo method for quantum many-body physics","volume":"2","author":"Yang","year":"2020","journal-title":"Phys. Rev. Res."},{"key":"mlstadcb88bib46","first-page":"pp 1","article-title":"Overcoming barriers to scalability in variational quantum monte carlo","author":"Zhao","year":"2021"},{"key":"mlstadcb88bib47","doi-asserted-by":"publisher","DOI":"10.1088\/2632-2153\/acdb2f","article-title":"Scalable neural quantum states architecture for quantum chemistry","volume":"4","author":"Zhao","year":"2023","journal-title":"Mach. Learn.: Sci. Technol."},{"key":"mlstadcb88bib48","article-title":"A survey of large language models","author":"Zhao","year":"2023"}],"container-title":["Machine Learning: Science and Technology"],"original-title":[],"link":[{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88","content-type":"text\/html","content-version":"am","intended-application":"text-mining"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"am","intended-application":"text-mining"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"am","intended-application":"syndication"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"am","intended-application":"similarity-checking"},{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88\/pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,4,22]],"date-time":"2025-04-22T07:16:18Z","timestamp":1745306178000},"score":1,"resource":{"primary":{"URL":"https:\/\/iopscience.iop.org\/article\/10.1088\/2632-2153\/adcb88"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,4,22]]},"references-count":48,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2025,4,22]]},"published-print":{"date-parts":[[2025,6,30]]}},"URL":"https:\/\/doi.org\/10.1088\/2632-2153\/adcb88","relation":{},"ISSN":["2632-2153"],"issn-type":[{"value":"2632-2153","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,4,22]]},"assertion":[{"value":"Retentive neural quantum states: efficient ans\u00e4tze for ab initio quantum chemistry","name":"article_title","label":"Article Title"},{"value":"Machine Learning: Science and Technology","name":"journal_title","label":"Journal Title"},{"value":"paper","name":"article_type","label":"Article Type"},{"value":"\u00a9 2025 The Author(s). Published by IOP Publishing Ltd","name":"copyright_information","label":"Copyright Information"},{"value":"2024-11-06","name":"date_received","label":"Date Received","group":{"name":"publication_dates","label":"Publication dates"}},{"value":"2025-04-10","name":"date_accepted","label":"Date Accepted","group":{"name":"publication_dates","label":"Publication dates"}},{"value":"2025-04-22","name":"date_epub","label":"Online publication date","group":{"name":"publication_dates","label":"Publication dates"}}]}}