{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,11,26]],"date-time":"2025-11-26T16:48:04Z","timestamp":1764175684166,"version":"3.41.2"},"reference-count":25,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2024,8,23]],"date-time":"2024-08-23T00:00:00Z","timestamp":1724371200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Artif. Intell."],"abstract":"<jats:p>The world urgently needs new sources of clean energy due to a growing global population, rising energy use, and the effects of climate change. Nuclear energy is one of the most promising solutions for meeting the world\u2019s energy needs now and in the future. One type of nuclear energy, Low Energy Nuclear Reactions (LENR), has gained interest as a potential clean energy source. Recent AI advancements create new ways to help research LENR and to comprehensively analyze the relationships between experimental parameters, materials, and outcomes across diverse LENR research endeavors worldwide. This study investigates the effectiveness of modern AI capabilities, leveraging embedding models and topic modeling techniques, including Latent Dirichlet Allocation (LDA), BERTopic, and Top2Vec, in elucidating the underlying structure and prevalent themes within a large LENR research corpus. These methodologies offer unique perspectives on understanding relationships and trends within the LENR research landscape, thereby facilitating advancements in this crucial energy research area. Furthermore, the study presents LENRsim, an experimental machine learning tool to identify similar LENR studies, along with a user-friendly web interface for widespread adoption and utilization. The findings contribute to the understanding and progression of LENR research through data-driven analysis and tool development, enabling more informed decision-making and strategic planning for future research in this field. The insights derived from this study, along with the experimental tools we developed and deployed, hold the potential to significantly aid researchers in advancing their studies of LENR.<\/jats:p>","DOI":"10.3389\/frai.2024.1401782","type":"journal-article","created":{"date-parts":[[2024,8,23]],"date-time":"2024-08-23T04:37:25Z","timestamp":1724387845000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["Exploring artificial intelligence techniques to research low energy nuclear reactions"],"prefix":"10.3389","volume":"7","author":[{"given":"Anasse","family":"Bari","sequence":"first","affiliation":[]},{"given":"Tanya Pushkin","family":"Garg","sequence":"additional","affiliation":[]},{"given":"Yvonne","family":"Wu","sequence":"additional","affiliation":[]},{"given":"Sneha","family":"Singh","sequence":"additional","affiliation":[]},{"given":"David","family":"Nagel","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2024,8,23]]},"reference":[{"key":"ref1","first-page":"13","article-title":"Evaluating topic coherence using distributional semantics","author":"Aletras","year":"2013"},{"key":"ref2","doi-asserted-by":"publisher","DOI":"10.48550\/arXiv.2008.09470","article-title":"Top2vec: distributed representations of topics","author":"Angelov","year":"2020","journal-title":"arXiv"},{"volume-title":"Chapter 3: processing raw text. In natural language processing with python","year":"2009","author":"Bird","key":"ref3"},{"key":"ref4","doi-asserted-by":"publisher","first-page":"993","DOI":"10.5555\/944919.944937","article-title":"Latent Dirichlet allocation","volume":"3","author":"Blei","year":"2003","journal-title":"J. Mach. Learn. Res."},{"key":"ref5","first-page":"31","article-title":"Normalized (pointwise) mutual information in collocation extraction","author":"Bouma","year":"2009"},{"key":"ref6","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1080\/03610927408827101","article-title":"A dendrite method for cluster analysis","volume":"3","author":"Cali\u0144ski","year":"1974","journal-title":"Commun. Statis."},{"key":"ref7","doi-asserted-by":"publisher","first-page":"102034","DOI":"10.1016\/j.ipm.2019.04.002","article-title":"An evaluation of document clustering and topic modelling in two online social networks: twitter and Reddit","volume":"57","author":"Curiskis","year":"2020","journal-title":"Infor. Process. Manage."},{"key":"ref8","doi-asserted-by":"publisher","first-page":"224","DOI":"10.1109\/TPAMI.1979.4766909","article-title":"A cluster separation measure","author":"Davies","year":"1979","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref9","doi-asserted-by":"publisher","first-page":"439","DOI":"10.1162\/tacl_a_00325","article-title":"Topic modeling in embedding spaces","volume":"8","author":"Dieng","year":"2020","journal-title":"Trans. Asso. Comp. Linguis."},{"key":"ref10","doi-asserted-by":"publisher","first-page":"886498","DOI":"10.3389\/fsoc.2022.886498","article-title":"A topic modeling comparison between LDA, NMF, Top2Vec, and BERTopic to demystify twitter posts","volume":"7","author":"Egger","year":"2022","journal-title":"Front. Sociol."},{"key":"ref11","doi-asserted-by":"publisher","DOI":"10.48550\/arXiv.2304.02020","article-title":"A bibliometric review of large language models research from 2017 to 2023","author":"Fan","year":"2023","journal-title":"arXiv"},{"key":"ref12","doi-asserted-by":"publisher","DOI":"10.48550\/arXiv.2203.05794","article-title":"BERTopic: neural topic modeling with a class-based TF-IDF procedure","author":"Grootendorst","year":"2022","journal-title":"arXiv"},{"key":"ref13","first-page":"1188","article-title":"Distributed representations of sentences and documents","author":"Le","year":"2014"},{"key":"ref14","doi-asserted-by":"publisher","first-page":"93","DOI":"10.1080\/19312458.2018.1430754","article-title":"Applying LDA topic modeling in communication research: toward a valid and reliable methodology","volume":"12","author":"Maier","year":"2018","journal-title":"Commun. Methods Meas."},{"year":"2011","author":"Mimno","article-title":"Optimizing semantic coherence in topic models","key":"ref15"},{"key":"ref16","first-page":"11","article-title":"Potential advantages and impacts of LENR generators of thermal and electrical power and energy","volume":"103","author":"Nagel","year":"2012","journal-title":"Infin. Energy"},{"key":"ref17","doi-asserted-by":"publisher","first-page":"2825","DOI":"10.48550\/arXiv.1201.0490","article-title":"Scikit-learn: machine learning in python","volume":"12","author":"Pedregosa","year":"2011","journal-title":"J. Mach. Learn. Res."},{"key":"ref18","doi-asserted-by":"publisher","first-page":"1427","DOI":"10.1109\/TKDE.2020.2992485","article-title":"Short text topic modeling techniques, applications, and performance: a survey","volume":"34","author":"Qiang","year":"2020","journal-title":"IEEE Trans. Knowl. Data Eng."},{"key":"ref19","first-page":"45","article-title":"Software framework for topic modelling with large corpora","author":"\u0158eh\u016f\u0159ek","year":"2010"},{"key":"ref20","article-title":"Sentence-bert: sentence embeddings using siamese bert-networks","author":"Reimers","year":"2019","journal-title":"arXiv"},{"key":"ref21","first-page":"399","article-title":"Exploring the space of topic coherence measures","author":"R\u00f6der","year":"2015"},{"unstructured":"Rothwell, J. (2002). LENR-CANR.org \u2013 a library of papers about cold fusion.","key":"ref22"},{"key":"ref23","doi-asserted-by":"publisher","first-page":"53","DOI":"10.1016\/0377-0427(87)90125-7","article-title":"Silhouettes: a graphical aid to the interpretation and validation of cluster analysis","volume":"20","author":"Rousseeuw","year":"1987","journal-title":"J. Comput. Appl. Math."},{"key":"ref24","first-page":"314","article-title":"Could LENR change the world?","volume":"33","author":"Ruer","year":"2020","journal-title":"J. Condensed Matter Nucl. Sci."},{"key":"ref25","doi-asserted-by":"publisher","DOI":"10.48550\/arXiv.2307.07164","article-title":"Learning to retrieve in-context examples for large language models","author":"Wang","year":"2023","journal-title":"arXiv"}],"container-title":["Frontiers in Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2024.1401782\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,8,23]],"date-time":"2024-08-23T04:37:39Z","timestamp":1724387859000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frai.2024.1401782\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,8,23]]},"references-count":25,"alternative-id":["10.3389\/frai.2024.1401782"],"URL":"https:\/\/doi.org\/10.3389\/frai.2024.1401782","relation":{},"ISSN":["2624-8212"],"issn-type":[{"type":"electronic","value":"2624-8212"}],"subject":[],"published":{"date-parts":[[2024,8,23]]},"article-number":"1401782"}}