{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,30]],"date-time":"2026-04-30T07:16:09Z","timestamp":1777533369042,"version":"3.51.4"},"reference-count":54,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2024,1,3]],"date-time":"2024-01-03T00:00:00Z","timestamp":1704240000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,1,3]],"date-time":"2024-01-03T00:00:00Z","timestamp":1704240000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100004281","name":"Narodowe Centrum Nauki","doi-asserted-by":"publisher","award":["2019\/35\/N\/ST6\/02125"],"award-info":[{"award-number":["2019\/35\/N\/ST6\/02125"]}],"id":[{"id":"10.13039\/501100004281","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100004281","name":"Narodowe Centrum Nauki","doi-asserted-by":"publisher","award":["2020\/37\/N\/ST6\/02728"],"award-info":[{"award-number":["2020\/37\/N\/ST6\/02728"]}],"id":[{"id":"10.13039\/501100004281","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100007088","name":"Uniwersytet Jagiello\u0144ski w Krakowie","doi-asserted-by":"publisher","id":[{"id":"10.13039\/501100007088","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["J Cheminform"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>The prediction of molecular properties is a crucial aspect in drug discovery that can save a lot of money and time during the drug design process. The use of machine learning methods to predict molecular properties has become increasingly popular in recent years. 
Despite advancements in the field, several challenges remain to be addressed, such as finding an optimal pre-training procedure to improve performance on small datasets, which are common in drug discovery. In our paper, we tackle these problems by introducing the Relative Molecule Self-Attention Transformer for molecular representation learning. This novel architecture uses relative self-attention and a 3D molecular representation to capture interactions between atoms and bonds, enriching the backbone model with domain-specific inductive biases. Furthermore, our two-step pretraining procedure allows us to tune only a few hyperparameter values to achieve good performance comparable with state-of-the-art models on a wide selection of downstream tasks.<\/jats:p>","DOI":"10.1186\/s13321-023-00789-7","type":"journal-article","created":{"date-parts":[[2024,1,3]],"date-time":"2024-01-03T14:02:53Z","timestamp":1704290573000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":28,"title":["Relative molecule self-attention transformer"],"prefix":"10.1186","volume":"16","author":[{"given":"\u0141ukasz","family":"Maziarka","sequence":"first","affiliation":[]},{"given":"Dawid","family":"Majchrowski","sequence":"additional","affiliation":[]},{"given":"Tomasz","family":"Danel","sequence":"additional","affiliation":[]},{"given":"Piotr","family":"Gai\u0144ski","sequence":"additional","affiliation":[]},{"given":"Jacek","family":"Tabor","sequence":"additional","affiliation":[]},{"given":"Igor","family":"Podolak","sequence":"additional","affiliation":[]},{"given":"Pawe\u0142","family":"Morkisz","sequence":"additional","affiliation":[]},{"given":"Stanis\u0142aw","family":"Jastrz\u0119bski","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,1,3]]},"reference":[{"key":"789_CR1","unstructured":"Rommel JB (2021) From prescriptive to predictive: An interdisciplinary 
perspective on the future of computational chemistry. arXiv preprint arXiv:2103.02933"},{"issue":"8","key":"789_CR2","doi-asserted-by":"publisher","first-page":"592","DOI":"10.1016\/j.tips.2019.06.004","volume":"40","author":"HS Chan","year":"2019","unstructured":"Chan HS, Shan H, Dahoun T, Vogel H, Yuan S (2019) Advancing drug discovery via artificial intelligence. Trends Pharmacol Sci 40(8):592\u2013604","journal-title":"Trends Pharmacol Sci"},{"issue":"2","key":"789_CR3","doi-asserted-by":"publisher","first-page":"511","DOI":"10.1016\/j.drudis.2020.12.009","volume":"26","author":"A Bender","year":"2021","unstructured":"Bender A, Cort\u00e9s-Ciriano I (2021) Artificial intelligence in drug discovery: what is realistic, what are illusions? part 1: Ways to make an impact, and why we are not there yet. Drug Discovery Today 26(2):511\u2013524","journal-title":"Drug Discovery Today"},{"issue":"12","key":"789_CR4","doi-asserted-by":"publisher","first-page":"4462","DOI":"10.1021\/acs.molpharmaceut.7b00578","volume":"14","author":"A Korotcov","year":"2017","unstructured":"Korotcov A, Tkachenko V, Russo DP, Ekins S (2017) Comparison of deep learning with multiple machine learning methods and metrics using diverse drug discovery data sets. Mol Pharm 14(12):4462\u20134475","journal-title":"Mol Pharm"},{"key":"789_CR5","unstructured":"Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE (2017) Neural message passing for quantum chemistry. In: International Conference on Machine Learning. PMLR, pp 1263\u20131272"},{"key":"789_CR6","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.ddtec.2020.11.009","volume":"37","author":"O Wieder","year":"2020","unstructured":"Wieder O, Kohlbacher S, Kuenemann M, Garon A, Ducrot P, Seidel T, Langer T (2020) A compact review of molecular property prediction with graph neural networks. 
Drug Disc Today: Technol 37:1\u201312","journal-title":"Drug Disc Today: Technol"},{"key":"789_CR7","unstructured":"Devlin J, Chang M, Lee K, Toutanova K (2019) BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2019, Minneapolis, MN, USA, 2\u20137 June, 2019, Volume 1 (Long and Short Papers), pp 4171\u20134186"},{"key":"789_CR8","doi-asserted-by":"crossref","unstructured":"Howard J, Ruder S (2018) Universal language model fine-tuning for text classification. In: Gurevych, I., Miyao, Y. (eds.) Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, ACL 2018, Melbourne, Australia, 15\u201320 July, 2018, Volume 1: Long Papers, pp 328\u2013339","DOI":"10.18653\/v1\/P18-1031"},{"key":"789_CR9","unstructured":"Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I (2017) Attention is all you need. In: Guyon I, von Luxburg U, Bengio S, Wallach HM, Fergus R, Vishwanathan SVN, Garnett R (eds) Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4\u20139 Dec, 2017, Long Beach, CA, USA, pp 5998\u20136008"},{"key":"789_CR10","unstructured":"Wang A, Pruksachatkun Y, Nangia N, Singh A, Michael J, Hill F, Levy O, Bowman SR (2019) Superglue: A stickier benchmark for general-purpose language understanding systems. In: Wallach HM, Larochelle H, Beygelzimer A, d\u2019Alch\u00e9-Buc F, Fox EB, Garnett R (eds.) Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 Dec, 2019, Vancouver, BC, Canada, pp 3261\u20133275"},{"key":"789_CR11","unstructured":"Maziarka \u0141, Danel T, Mucha S, Rataj K, Tabor J, Jastrz\u0119bski S (2020) Molecule attention transformer. 
arXiv preprint arXiv:2002.08264"},{"key":"789_CR12","unstructured":"Maziarka \u0141, Danel T, Mucha S, Rataj K, Tabor J, Jastrzebski S (2019) Molecule-augmented attention transformer. NeurIPS 2020 Workshop on Graph Representation Learning"},{"key":"789_CR13","unstructured":"Hu W, Liu B, Gomes J, Zitnik M, Liang P, Pande VS, Leskovec J (2020) Strategies for pre-training graph neural networks. In: 8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, 26\u201330 Apr, 2020"},{"key":"789_CR14","unstructured":"Chithrananda S, Grand G, Ramsundar B (2020) Chemberta: large-scale self-supervised pretraining for molecular property prediction. arXiv preprint arXiv:2010.09885"},{"key":"789_CR15","unstructured":"Fabian B, Edlich T, Gaspar H, Segler M, Meyers J, Fiscato M, Ahmed M (2020) Molecular representation learning with language models and domain-relevant auxiliary tasks. arXiv preprint arXiv:2011.13230"},{"key":"789_CR16","unstructured":"Rong Y, Bian Y, Xu T, Xie W, Wei Y, Huang W, Huang J (2020) Self-supervised graph transformer on large-scale molecular data. In: Larochelle H, Ranzato M, Hadsell R, Balcan M, Lin H (eds.) Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, 6\u201312 Dec 2020, Virtual"},{"issue":"8","key":"789_CR17","doi-asserted-by":"publisher","first-page":"3370","DOI":"10.1021\/acs.jcim.9b00237","volume":"59","author":"K Yang","year":"2019","unstructured":"Yang K, Swanson K, Jin W, Coley C, Eiden P, Gao H, Guzman-Perez A, Hopper T, Kelley B, Mathea M et al (2019) Analyzing learned molecular representations for property prediction. 
J Chem Inform Model 59(8):3370\u20133388","journal-title":"J Chem Inform Model"},{"key":"789_CR18","unstructured":"Dosovitskiy A, Beyer L, Kolesnikov A, Weissenborn D, Zhai X, Unterthiner T, Dehghani M, Minderer M, Heigold G, Gelly S, Uszkoreit J, Houlsby N (2021) An image is worth 16x16 words: Transformers for image recognition at scale. In: 9th International Conference on Learning Representations, ICLR 2021, Virtual Event, Austria, 3\u20137 May 2021"},{"key":"789_CR19","doi-asserted-by":"crossref","unstructured":"Shaw P, Uszkoreit J, Vaswani A (2018) Self-attention with relative position representations. In: Walker MA, Ji H, Stent A (eds) Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT, New Orleans, Louisiana, USA, 1\u20136 June 2018, Volume 2 (Short Papers), pp 464\u2013468","DOI":"10.18653\/v1\/N18-2074"},{"key":"789_CR20","doi-asserted-by":"crossref","unstructured":"Dai Z, Yang Z, Yang Y, Carbonell JG, Le QV, Salakhutdinov R (2019) Transformer-XL: Attentive language models beyond a fixed-length context. In: Korhonen A, Traum DR, M\u00e0rquez L (eds.) Proceedings of the 57th Conference of the Association for Computational Linguistics, ACL 2019, Florence, Italy, July 28- 2 Aug, 2019, Volume 1: Long Papers, pp 2978\u20132988","DOI":"10.18653\/v1\/P19-1285"},{"key":"789_CR21","unstructured":"Ingraham J, Garg VK, Barzilay R, Jaakkola TS (2019) Generative models for graph-based protein design. In: Wallach HM, Larochelle H, Beygelzimer A, d\u2019Alch\u00e9-Buc F, Fox EB, Garnett R (eds) Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8\u201314 Dec 2019, Vancouver, BC, Canada, pp 15794\u201315805"},{"key":"789_CR22","doi-asserted-by":"crossref","unstructured":"Huang Z, Liang D, Xu P, Xiang B (2020) Improve transformer models with better relative position embeddings. 
In: Cohn T, He Y, Liu Y (eds) Findings of the Association for Computational Linguistics: EMNLP 2020, Online Event, 16-20 Nov 2020, vol EMNLP 2020, pp 3327\u20133335","DOI":"10.18653\/v1\/2020.findings-emnlp.298"},{"key":"789_CR23","unstructured":"Romero DW, Cordonnier J (2021) Group equivariant stand-alone self-attention for vision. In: 9th International Conference on Learning Representations, ICLR 2021, Virtual Event, Austria, 3\u20137 May 2021"},{"issue":"10s","key":"789_CR24","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3505244","volume":"54","author":"S Khan","year":"2022","unstructured":"Khan S, Naseer M, Hayat M, Zamir SW, Khan FS, Shah M (2022) Transformers in vision: a survey. ACM computing Surveys (CSUR) 54(10s):1\u201341","journal-title":"ACM computing Surveys (CSUR)"},{"key":"789_CR25","unstructured":"Ke G, He D, Liu T-Y (2021) Rethinking positional encoding in language pre-training. In: International Conference on Learning Representations"},{"key":"789_CR26","first-page":"15084","volume":"34","author":"L Chen","year":"2021","unstructured":"Chen L, Lu K, Rajeswaran A, Lee K, Grover A, Laskin M, Abbeel P, Srinivas A, Mordatch I (2021) Decision transformer: reinforcement learning via sequence modeling. Adv Neural Inform Process Syst 34:15084\u201315097","journal-title":"Adv Neural Inform Process Syst"},{"issue":"4","key":"789_CR27","doi-asserted-by":"publisher","first-page":"432","DOI":"10.1038\/s42256-023-00639-z","volume":"5","author":"J Born","year":"2023","unstructured":"Born J, Manica M (2023) Regression transformer enables concurrent sequence regression and generation for molecular language modelling. 
Nature Machine Intell 5(4):432\u2013444","journal-title":"Nature Machine Intell"},{"key":"789_CR28","unstructured":"Radford A, Narasimhan K, Salimans T, Sutskever I, et al (2018) Improving language understanding by generative pre-training"},{"key":"789_CR29","doi-asserted-by":"crossref","unstructured":"Wang S, Guo Y, Wang Y, Sun H, Huang J (2019) SMILES-BERT: Large scale unsupervised pre-training for molecular property prediction. In: Proceedings of the 10th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics. BCB \u201919","DOI":"10.1145\/3307339.3342186"},{"key":"789_CR30","unstructured":"Honda S, Shi S, Ueda HR (2019) Smiles transformer: Pre-trained molecular fingerprint for low data drug discovery. arXiv preprint arXiv:1911.04738"},{"issue":"2","key":"789_CR31","doi-asserted-by":"publisher","first-page":"513","DOI":"10.1039\/C7SC02664A","volume":"9","author":"Z Wu","year":"2018","unstructured":"Wu Z, Ramsundar B, Feinberg EN, Gomes J, Geniesse C, Pappu AS, Leswing K, Pande V (2018) Moleculenet: a benchmark for molecular machine learning. Chem Sci 9(2):513\u2013530","journal-title":"Chem Sci"},{"issue":"1","key":"789_CR32","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1186\/s13321-020-00477-w","volume":"13","author":"D Jiang","year":"2021","unstructured":"Jiang D, Wu Z, Hsieh C-Y, Chen G, Liao B, Wang Z, Shen C, Cao D, Wu J, Hou T (2021) Could graph neural networks learn better molecular representation for drug discovery? a comparison study of descriptor-based and graph-based models. J Cheminform 13(1):1\u201323","journal-title":"J Cheminform"},{"key":"789_CR33","doi-asserted-by":"publisher","first-page":"717","DOI":"10.1007\/s10822-019-00274-0","volume":"34","author":"M Robinson","year":"2020","unstructured":"Robinson M, Glen R, Lee A (2020) Validating the validation: reanalyzing a large-scale comparison of deep learning and machine learning models for bioactivity prediction. 
J Computer-Aided Mol Design 34:717\u2013730","journal-title":"J Computer-Aided Mol Design"},{"issue":"24","key":"789_CR34","doi-asserted-by":"publisher","first-page":"5441","DOI":"10.1039\/C8SC00148K","volume":"9","author":"A Mayr","year":"2018","unstructured":"Mayr A, Klambauer G, Unterthiner T, Steijaert M, Wegner JK, Ceulemans H, Clevert D-A, Hochreiter S (2018) Large-scale comparison of machine learning methods for drug target prediction on chembl. Chem Sci 9(24):5441\u20135451","journal-title":"Chem Sci"},{"key":"789_CR35","unstructured":"Klicpera J, Gro\u00df J, G\u00fcnnemann S (2020) Directional message passing for molecular graphs. In: 8th International Conference on Learning Representations"},{"key":"789_CR36","unstructured":"Shang C, Liu Q, Chen K-S, Sun J, Lu J, Yi J, Bi J (2018) Edge attention-based multi-relational graph convolutional networks. arXiv preprint arXiv: 1802.04944"},{"key":"789_CR37","doi-asserted-by":"crossref","unstructured":"Veli\u010dkovi\u0107 P (2023) Everything is connected: Graph neural networks. arXiv preprint arXiv:2301.08210","DOI":"10.1016\/j.sbi.2023.102538"},{"key":"789_CR38","doi-asserted-by":"crossref","unstructured":"Schwaller P, Laino T, Gaudin T, Bolgar P, Hunter CA, Bekas C, Lee AA (2019) Molecular transformer: A model for uncertainty-calibrated chemical reaction prediction. ACS central science","DOI":"10.26434\/chemrxiv.7297379"},{"issue":"1","key":"789_CR39","doi-asserted-by":"publisher","first-page":"31","DOI":"10.1021\/ci00057a005","volume":"28","author":"D Weininger","year":"1988","unstructured":"Weininger D (1988) Smiles, a chemical language and information system. 1. introduction to methodology and encoding rules. J Chem Inf Comput Sci 28(1):31\u201336","journal-title":"J Chem Inf Comput Sci"},{"key":"789_CR40","unstructured":"Jastrz\u0119bski S, Le\u015bniak D, Czarnecki WM (2016) Learning to smile (s). 
arXiv preprint arXiv:1602.06289"},{"key":"789_CR41","unstructured":"Nguyen DQ, Nguyen TD, Phung D (2019) Unsupervised universal self-attention network for graph classification. CoRR abs\/1909.11855"},{"key":"#cr-split#-789_CR42.1","doi-asserted-by":"crossref","unstructured":"Choukroun Y, Wolf L (2022) Geometric transformer for end-to-end molecule properties prediction. In: Raedt LD","DOI":"10.24963\/ijcai.2022\/401"},{"key":"#cr-split#-789_CR42.2","unstructured":"(ed) Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, IJCAI 2022, Vienna, Austria, 23-29 July 2022, pp 2895-2901"},{"key":"789_CR43","doi-asserted-by":"crossref","unstructured":"Wu F, Radev D, Li SZ (2023) Molformer: Motif-based transformer on 3d heterogeneous molecular graphs. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 37, pp 5312\u20135320","DOI":"10.1609\/aaai.v37i4.25662"},{"issue":"8","key":"789_CR44","doi-asserted-by":"publisher","first-page":"1757","DOI":"10.1021\/acs.jcim.6b00601","volume":"57","author":"CW Coley","year":"2017","unstructured":"Coley CW, Barzilay R, Green WH, Jaakkola TS, Jensen KF (2017) Convolutional embedding of attributed molecular graphs for physical property prediction. J Chem Inform Model 57(8):1757\u20131772","journal-title":"J Chem Inform Model"},{"key":"789_CR45","doi-asserted-by":"crossref","unstructured":"Pocha A, Danel T, Podlewska S, Tabor J, Maziarka \u0141 (2021) Comparison of atom representations in graph neural networks for molecular property prediction. In: 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, pp 1\u20138","DOI":"10.1109\/IJCNN52387.2021.9533698"},{"key":"789_CR46","unstructured":"Lin Z, Feng M, dos Santos CN, Yu M, Xiang B, Zhou B, Bengio Y (2016) A structured self-attentive sentence embedding. 
In: International Conference on Learning Representations"},{"key":"789_CR47","unstructured":"Landrum G (2016) Rdkit: Open-source cheminformatics software"},{"issue":"7","key":"789_CR48","doi-asserted-by":"publisher","first-page":"645","DOI":"10.1038\/s42256-022-00501-8","volume":"4","author":"Y Li","year":"2022","unstructured":"Li Y, Hsieh C-Y, Lu R, Gong X, Wang X, Li P, Liu S, Tian Y, Jiang D, Yan J et al (2022) An adaptive graph learning method for automated molecular interactions and properties predictions. Nature Machine Intell 4(7):645\u2013651","journal-title":"Nature Machine Intell"},{"key":"789_CR49","unstructured":"Duvenaud D, Maclaurin D, Aguilera-Iparraguirre J, G\u00f3mez-Bombarelli R, Hirzel T, Aspuru-Guzik A, Adams RP (2015) Convolutional networks on graphs for learning molecular fingerprints. In: Cortes C, Lawrence ND, Lee DD, Sugiyama M, Garnett R (eds) Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015, 7\u201312 Dec 2015, Montreal, Quebec, Canada, pp 2224\u20132232"},{"key":"789_CR50","unstructured":"Kipf TN, Welling M (2017) Semi-supervised classification with graph convolutional networks. In: International Conference on Learning Representations"},{"issue":"8","key":"789_CR51","doi-asserted-by":"publisher","first-page":"595","DOI":"10.1007\/s10822-016-9938-8","volume":"30","author":"S Kearnes","year":"2016","unstructured":"Kearnes S, McCloskey K, Berndl M, Pande V, Riley P (2016) Molecular graph convolutions: moving beyond fingerprints. J Computer-aided Mol Design 30(8):595\u2013608","journal-title":"J Computer-aided Mol Design"},{"issue":"1","key":"789_CR52","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1038\/sdata.2014.22","volume":"1","author":"R Ramakrishnan","year":"2014","unstructured":"Ramakrishnan R, Dral PO, Rupp M, Von Lilienfeld OA (2014) Quantum chemistry structures and properties of 134 kilo molecules. 
Sci Data 1(1):1\u20137","journal-title":"Sci Data"},{"key":"789_CR53","doi-asserted-by":"crossref","unstructured":"Gai\u0144ski P, Maziarka \u0141, Danel T, Jastrzebski S (2022) Huggingmolecules: An open-source library for transformer-based molecular property prediction (student abstract). In: Proceedings of the AAAI Conference on Artificial Intelligence, vol 36, pp 12949\u201312950","DOI":"10.1609\/aaai.v36i11.21611"}],"container-title":["Journal of Cheminformatics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s13321-023-00789-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1186\/s13321-023-00789-7\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s13321-023-00789-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,1,3]],"date-time":"2024-01-03T14:10:39Z","timestamp":1704291039000},"score":1,"resource":{"primary":{"URL":"https:\/\/jcheminf.biomedcentral.com\/articles\/10.1186\/s13321-023-00789-7"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,1,3]]},"references-count":54,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2024,12]]}},"alternative-id":["789"],"URL":"https:\/\/doi.org\/10.1186\/s13321-023-00789-7","relation":{},"ISSN":["1758-2946"],"issn-type":[{"value":"1758-2946","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,1,3]]},"assertion":[{"value":"24 May 2023","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"28 November 2023","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"3 January 
2024","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no competing interests.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"3"}}