{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,23]],"date-time":"2026-01-23T17:38:09Z","timestamp":1769189889462,"version":"3.49.0"},"reference-count":78,"publisher":"SAGE Publications","issue":"1","license":[{"start":{"date-parts":[[2025,10,17]],"date-time":"2025-10-17T00:00:00Z","timestamp":1760659200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"funder":[{"DOI":"10.13039\/100006192","name":"Advanced Scientific Computing Research","doi-asserted-by":"publisher","award":["DE-SC0023452"],"award-info":[{"award-number":["DE-SC0023452"]}],"id":[{"id":"10.13039\/100006192","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["The International Journal of High Performance Computing Applications"],"published-print":{"date-parts":[[2026,1]]},"abstract":"<jats:p>There has been active investigation into deep learning approaches for time series analysis, including foundation models. However, most studies do not address significant scientific applications. This paper aims to identify key features in time series by examining complex hydrology data. Our work advances computer science by emphasizing critical application features and contributes to hydrology and other scientific fields by identifying modeling approaches that effectively capture these features. Scientific time series data are inherently complex, involving observations from multiple locations, each with various time-dependent data streams and exogenous factors that may be static or time-varying and either application-dependent or purely mathematical. 
This research analyzes hydrology time series from the CAMELS and Caravan global datasets, which encompass rainfall and runoff data across catchments, featuring up to six observed streams and 209 static parameters across approximately 8000 locations. Our investigation assesses the impact of exogenous data through eight different model configurations for key hydrology tasks. Results demonstrate that integrating exogenous information enhances data representation, reducing root mean squared error by up to 40% in the largest dataset. Additionally, we present a detailed performance comparison of over 20 state-of-the-art pattern and foundation models. The analysis is fully open-source, facilitated by Jupyter Notebooks on Google Colab for LSTM-based modeling, data preprocessing, and model comparisons. Preliminary findings using alternative deep learning architectures reveal that models incorporating comprehensive observed and exogenous data outperform more limited approaches, including foundation models. Notably, natural annual periodic exogenous time series contribute the most significant improvements, though static and other periodic factors are also valuable. 
This research serves as both an educational tool and a benchmark resource.<\/jats:p>","DOI":"10.1177\/10943420251380008","type":"journal-article","created":{"date-parts":[[2025,10,17]],"date-time":"2025-10-17T12:14:21Z","timestamp":1760703261000},"page":"22-41","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":0,"title":["Deep learning foundation and pattern models: Challenges in hydrological time series"],"prefix":"10.1177","volume":"40","author":[{"ORCID":"https:\/\/orcid.org\/0009-0009-7422-7041","authenticated-orcid":false,"given":"Junyang","family":"He","sequence":"first","affiliation":[{"name":"University of Virginia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5946-7831","authenticated-orcid":false,"given":"Ying-Jung","family":"Chen","sequence":"additional","affiliation":[{"name":"University of Virginia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9079-1971","authenticated-orcid":false,"given":"Alireza","family":"Jafari","sequence":"additional","affiliation":[{"name":"University of Virginia"}]},{"ORCID":"https:\/\/orcid.org\/0009-0005-1730-390X","authenticated-orcid":false,"given":"Anushka","family":"Idamekorala","sequence":"additional","affiliation":[{"name":"University of Virginia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1017-1391","authenticated-orcid":false,"given":"Geoffrey","family":"Fox","sequence":"additional","affiliation":[{"name":"University of Virginia"},{"name":"Georgia Institute of Technology"}]}],"member":"179","published-online":{"date-parts":[[2025,10,17]]},"reference":[{"key":"e_1_3_4_2_1","doi-asserted-by":"publisher","DOI":"10.1016\/0022-1694(86)90114-9"},{"key":"e_1_3_4_3_1","doi-asserted-by":"publisher","DOI":"10.5194\/hess-21-5293-2017"},{"key":"e_1_3_4_4_1","doi-asserted-by":"publisher","DOI":"10.5194\/hess-22-5817-2018"},{"key":"e_1_3_4_5_1","unstructured":"Ansari AF Stella L Turkmen C et al. 
(2024) Chronos: learning the language of time series. https:\/\/arxiv.org\/abs\/2403.07815"},{"key":"e_1_3_4_6_1","doi-asserted-by":"publisher","DOI":"10.1038\/s41597-020-00583-2"},{"key":"e_1_3_4_7_1","unstructured":"Bai S Kolter JZ Koltun V (2018) An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271."},{"key":"e_1_3_4_8_1","doi-asserted-by":"publisher","DOI":"10.1002\/9781119951001"},{"key":"e_1_3_4_9_1","doi-asserted-by":"publisher","unstructured":"Boisier JP (2023) CR2MET: a high-resolution precipitation and temperature dataset for the period 1960-2021, continental Chile. DOI: 10.5281\/zenodo.7529682.","DOI":"10.5281\/zenodo.7529682"},{"key":"e_1_3_4_10_1","unstructured":"CDC (n.d.) National Center for Health Statistics \u2014 cdc.gov. https:\/\/www.cdc.gov\/nchs\/index.html (accessed 04-10-2024)."},{"key":"e_1_3_4_11_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-12-2075-2020"},{"key":"e_1_3_4_12_1","first-page":"76","article-title":"Dilated recurrent neural networks","volume":"30","author":"Chang S","year":"2017","unstructured":"Chang S, Zhang Y, Han W, et al. (2017) Dilated recurrent neural networks. Advances in Neural Information Processing Systems 30: 76\u201386.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_4_13_1","unstructured":"Chen SA Li CL Yoder N et al. (2023) TSMixer: an all-MLP architecture for time series forecasting. https:\/\/arxiv.org\/abs\/2303.06053"},{"key":"e_1_3_4_14_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-12-2459-2020"},{"key":"e_1_3_4_15_1","unstructured":"CR2 (2016) Explorador Clim\u00e1tico. 
https:\/\/explorador.cr2.cl\/"},{"key":"e_1_3_4_16_1","volume-title":"Digital Simulation in Hydrology: Stanford Watershed Model IV by NH Crawford and RK Linsley","author":"Crawford NH","year":"1966","unstructured":"Crawford NH (1966) Digital Simulation in Hydrology: Stanford Watershed Model IV by NH Crawford and RK Linsley. Technical report. Stanford, CA: Stanford University, Department of Civil Engineering."},{"key":"e_1_3_4_17_1","unstructured":"Das A Kong W Leach A et al. (2023) Long-term forecasting with TiDE: time-series dense encoder. arXiv preprint arXiv:2304.08424."},{"key":"e_1_3_4_18_1","unstructured":"ddz (n.d.) Awesome time series forecasting\/prediction papers: a reading list of papers on time series forecasting\/prediction (TSF) and spatio-temporal forecasting\/prediction (STF). https:\/\/github.com\/ddz16\/TSFpaper (accessed 16 5 2024)."},{"key":"e_1_3_4_19_1","doi-asserted-by":"publisher","DOI":"10.5194\/iahs2022-521"},{"key":"e_1_3_4_20_1","volume-title":"Physical Hydrology","author":"Dingman SL","year":"2015","unstructured":"Dingman SL (2015) Physical Hydrology. Waveland Press."},{"key":"e_1_3_4_21_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-10-765-2018"},{"key":"e_1_3_4_22_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-13-3847-2021"},{"key":"e_1_3_4_23_1","doi-asserted-by":"publisher","DOI":"10.6339\/21-JDS1007"},{"key":"e_1_3_4_24_1","doi-asserted-by":"publisher","DOI":"10.3390\/geohazards3020011"},{"key":"e_1_3_4_25_1","volume-title":"Artificial Neural Networks in Hydrology","author":"Govindaraju RS","year":"2013","unstructured":"Govindaraju RS, Rao AR (2013) Artificial Neural Networks in Hydrology. Springer Science & Business Media."},{"key":"e_1_3_4_26_1","first-page":"19622","article-title":"Large language models are zero-shot time series forecasters","volume":"36","author":"Gruver N","year":"2024","unstructured":"Gruver N, Finzi M, Qiu S, et al. 
(2024) Large language models are zero-shot time series forecasters. Advances in Neural Information Processing Systems 36: 19622\u201319635.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_4_27_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-10-787-2018"},{"key":"e_1_3_4_28_1","doi-asserted-by":"publisher","DOI":"10.1142\/9789812702838_0078"},{"key":"e_1_3_4_29_1","doi-asserted-by":"publisher","unstructured":"He J (2024) Data for science time series: deep learning in hydrology. DOI: 10.5281\/zenodo.13975174.","DOI":"10.5281\/zenodo.13975174"},{"key":"e_1_3_4_30_1","doi-asserted-by":"publisher","DOI":"10.1162\/neco.1997.9.8.1735"},{"key":"e_1_3_4_31_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-15-5755-2023"},{"key":"e_1_3_4_32_1","doi-asserted-by":"publisher","DOI":"10.1029\/95WR01955"},{"key":"e_1_3_4_33_1","doi-asserted-by":"publisher","DOI":"10.3390\/w10111543"},{"key":"e_1_3_4_34_1","doi-asserted-by":"publisher","DOI":"10.1109\/BigData47090.2019.9005496"},{"key":"e_1_3_4_35_1","first-page":"19","article-title":"On the use of self-registering rain and flood gauges in making observations of the relations of rainfall and flood discharges in a given catchment","volume":"4","author":"Mulvany TJ","year":"1851","unstructured":"Mulvany TJ (1851) On the use of self-registering rain and flood gauges in making observations of the relations of rainfall and flood discharges in a given catchment. Proceedings of the Institution of Civil Engineers of Ireland 4: 19\u201331.","journal-title":"Proceedings of the Institution of Civil Engineers of Ireland"},{"key":"e_1_3_4_36_1","doi-asserted-by":"crossref","unstructured":"Jafari A Fox G Rundle JB et al. (2024) Time series foundation models and deep learning architectures for earthquake temporal and spatial nowcasting. arXiv [cs.LG physics.geo-ph]. 
https:\/\/arxiv.org\/abs\/2408.11990","DOI":"10.3390\/geohazards5040059"},{"key":"e_1_3_4_37_1","doi-asserted-by":"publisher","unstructured":"Jin M Wen Q Liang Y et al. (2023) Large models for time series and spatio-temporal data: a survey and outlook. arXiv [cs.LG]. DOI: 10.48550\/arXiv.2310.10196.","DOI":"10.48550\/arXiv.2310.10196"},{"key":"e_1_3_4_38_1","doi-asserted-by":"publisher","unstructured":"Jin M Wang S Ma L et al. (2024) Time-LLM: time series forecasting by reprogramming large language models. DOI: 10.48550\/arXiv.2310.01728.","DOI":"10.48550\/arXiv.2310.01728"},{"key":"e_1_3_4_39_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-7-143-2015"},{"key":"e_1_3_4_40_1","unstructured":"Kingma DP Ba J (2017) Adam: a method for stochastic optimization. https:\/\/arxiv.org\/abs\/1412.6980"},{"key":"e_1_3_4_41_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-13-4529-2021"},{"key":"e_1_3_4_42_1","doi-asserted-by":"publisher","DOI":"10.5194\/hess-22-6005-2018"},{"key":"e_1_3_4_43_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-030-28954-6_19"},{"key":"e_1_3_4_44_1","doi-asserted-by":"publisher","DOI":"10.21105\/joss.04050"},{"key":"e_1_3_4_45_1","doi-asserted-by":"publisher","DOI":"10.1038\/s41597-023-01975-w"},{"key":"e_1_3_4_46_1","doi-asserted-by":"crossref","unstructured":"Lai G Chang WC Yang Y et al. (2018) Modeling long- and short-term temporal patterns with deep neural networks. In: The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, MI, July 8 2018, 95\u2013104.","DOI":"10.1145\/3209978.3210006"},{"key":"e_1_3_4_47_1","unstructured":"Liang Y Wen H Nie Y et al. (2024) Foundation models for time series analysis: a tutorial and survey. arXiv [cs.LG]. 
https:\/\/arxiv.org\/abs\/2403.14735"},{"key":"e_1_3_4_48_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijforecast.2021.03.012"},{"key":"e_1_3_4_49_1","doi-asserted-by":"publisher","DOI":"10.1038\/s41597-019-0300-6"},{"key":"e_1_3_4_50_1","doi-asserted-by":"publisher","DOI":"10.3133\/fs20123047"},{"key":"e_1_3_4_51_1","unstructured":"Liu Y Hu T Zhang H et al. (2023) iTransformer: inverted transformers are effective for time series forecasting. arXiv preprint arXiv:2310.06625."},{"key":"e_1_3_4_52_1","doi-asserted-by":"publisher","DOI":"10.1175\/JCLI-D-12-00508.1"},{"key":"e_1_3_4_53_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijforecast.2019.04.014"},{"key":"e_1_3_4_54_1","unstructured":"Mirchandani S Xia F Florence P et al. (2023) Large language models as general pattern machines. https:\/\/arxiv.org\/abs\/2307.04721"},{"key":"e_1_3_4_55_1","doi-asserted-by":"publisher","DOI":"10.1029\/WR017i005p01367"},{"key":"e_1_3_4_56_1","doi-asserted-by":"publisher","DOI":"10.5194\/essd-13-4349-2021"},{"key":"e_1_3_4_57_1","unstructured":"Nie Y Nguyen NH Sinthong P et al. (2022) A time series is worth 64 words: long-term forecasting with transformers. arXiv preprint arXiv:2211.14730."},{"key":"e_1_3_4_58_1","unstructured":"Nixtla (2024) Nixtla: a future for everybody. Time series research and deployment. https:\/\/www.nixtla.io\/ (accessed: 2024-3-9)."},{"key":"e_1_3_4_59_1","first-page":"237","volume-title":"EGU General Assembly Conference Abstracts","author":"Nossent J","year":"2012","unstructured":"Nossent J, Bauwens W (2012) Application of a normalized Nash-Sutcliffe efficiency to improve the accuracy of the Sobol\u2019 sensitivity analysis of a hydrological model. In: EGU General Assembly Conference Abstracts. EGU General Assembly Conference Abstracts, 237."},{"key":"e_1_3_4_60_1","unstructured":"Olivares KG Chall\u00fa C Garza F et al. (2022) NeuralForecast: user-friendly state-of-the-art neural forecasting models. PyCon, Salt Lake City, Utah, US, 2022. 
https:\/\/github.com\/Nixtla\/neuralforecast"},{"key":"e_1_3_4_61_1","unstructured":"Pidwirny M (2006) The hydrologic cycle. https:\/\/www.physicalgeography.net\/fundamentals\/8b.html"},{"key":"e_1_3_4_62_1","unstructured":"Radford A Wu J Child R et al. (2019) Language models are unsupervised multitask learners."},{"issue":"140","key":"e_1_3_4_63_1","first-page":"1","article-title":"Exploring the limits of transfer learning with a unified text-to-text transformer","volume":"21","author":"Raffel C","year":"2020","unstructured":"Raffel C, Shazeer N, Roberts A, et al. (2020) Exploring the limits of transfer learning with a unified text-to-text transformer. Journal of Machine Learning Research 21(140): 1\u201367.","journal-title":"Journal of Machine Learning Research"},{"key":"e_1_3_4_64_1","doi-asserted-by":"publisher","DOI":"10.1080\/02626669509491401"},{"key":"e_1_3_4_65_1","doi-asserted-by":"publisher","unstructured":"Robinson E Blyth E Clark D et al. (2017) Climate hydrology and ecology research support system meteorology dataset for Great Britain (1961-2015). [CHESS-met] v1.2. DOI: 10.5285\/b745e7b1-626c-4ccc-ac27-56582e77b900.","DOI":"10.5285\/b745e7b1-626c-4ccc-ac27-56582e77b900"},{"key":"e_1_3_4_66_1","unstructured":"ScienceFMHub (2023) ScienceFMHub portal for science foundation model community. https:\/\/sciencefmhub.org\/ (accessed 3 11 2023)."},{"key":"e_1_3_4_67_1","doi-asserted-by":"publisher","DOI":"10.1061\/(ASCE)1084-0699(1999)4:3(232)"},{"key":"e_1_3_4_68_1","doi-asserted-by":"publisher","DOI":"10.1029\/2018WR022643"},{"key":"e_1_3_4_69_1","volume-title":"An Overview of Rainfall-Runoff Model Types","author":"Sitterson J","year":"2017","unstructured":"Sitterson J, Knightes C, Parmar R, Wolfe K, et al. (2017) An Overview of Rainfall-Runoff Model Types. Washington, DC: U.S. 
Environmental Protection Agency."},{"key":"e_1_3_4_70_1","doi-asserted-by":"publisher","DOI":"10.1002\/gdj3.239"},{"key":"e_1_3_4_71_1","unstructured":"Thornton P Thornton M Mayer B et al. (2014) Daymet: daily surface weather data on a 1-km grid for North America, version 2. https:\/\/daac.ornl.gov\/cgi-bin\/dsviewer.pl?ds_id=1219"},{"key":"e_1_3_4_72_1","unstructured":"Tolstikhin I Houlsby N Kolesnikov A et al. (2021) MLP-Mixer: an all-MLP architecture for vision. https:\/\/arxiv.org\/abs\/2105.01601"},{"key":"e_1_3_4_73_1","unstructured":"USGS (2024) Earthquake hazards program of United States Geological Survey. https:\/\/earthquake.usgs.gov\/earthquakes\/search\/ (accessed 04-10-2024)."},{"key":"e_1_3_4_74_1","unstructured":"Vaswani A Shazeer N Parmar N et al. (2023) Attention is all you need. https:\/\/arxiv.org\/abs\/1706.03762"},{"key":"e_1_3_4_75_1","doi-asserted-by":"publisher","DOI":"10.1016\/0169-7439(87)80084-9"},{"key":"e_1_3_4_76_1","first-page":"22419","article-title":"Autoformer: decomposition transformers with auto-correlation for long-term series forecasting","volume":"34","author":"Wu H","year":"2021","unstructured":"Wu H, Xu J, Wang J, et al. (2021) Autoformer: decomposition transformers with auto-correlation for long-term series forecasting. Advances in Neural Information Processing Systems 34: 22419\u201322430.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_4_77_1","unstructured":"Wu H Hu T Liu Y et al. (2022) TimesNet: temporal 2D-variation modeling for general time series analysis. arXiv preprint arXiv:2210.02186."},{"key":"e_1_3_4_78_1","doi-asserted-by":"publisher","DOI":"10.1029\/2011JD016048"},{"key":"e_1_3_4_79_1","unstructured":"Ye J Zhang W Yi K et al. (2024) A survey of time series foundation models: generalizing time series representation with large language model. 
https:\/\/arxiv.org\/abs\/2405.02358"}],"container-title":["The International Journal of High Performance Computing Applications"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/10943420251380008","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.1177\/10943420251380008","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/10943420251380008","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,1,23]],"date-time":"2026-01-23T15:36:50Z","timestamp":1769182610000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.1177\/10943420251380008"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,10,17]]},"references-count":78,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2026,1]]}},"alternative-id":["10.1177\/10943420251380008"],"URL":"https:\/\/doi.org\/10.1177\/10943420251380008","relation":{},"ISSN":["1094-3420","1741-2846"],"issn-type":[{"value":"1094-3420","type":"print"},{"value":"1741-2846","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,10,17]]}}}